WorldWideScience

Sample records for causal component mixtures

  1. Iterative Mixture Component Pruning Algorithm for Gaussian Mixture PHD Filter

    Directory of Open Access Journals (Sweden)

    Xiaoxi Yan

    2014-01-01

Full Text Available To address the increasing number of mixture components in the Gaussian mixture PHD filter, an iterative mixture component pruning algorithm is proposed. The pruning algorithm is based on maximizing the posterior probability density of the mixture weights. The entropy distribution of the mixture weights is adopted as the prior distribution of the mixture component parameters. The iterative update formulas for the mixture weights are derived using a Lagrange multiplier and the Lambert W function. Mixture components whose weights become negative during the iterative procedure are pruned by setting the corresponding mixture weights to zero. In addition, multiple mixture components with similar parameters describing the same PHD peak can be merged into one mixture component in the algorithm. Simulation results show that the proposed iterative mixture component pruning algorithm is superior to the typical threshold-based pruning algorithm.
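
The pruning idea can be sketched in a few lines: a minimal stand-in for the paper's Lagrange/Lambert-W update in which a small per-pass penalty drives weak weights negative, negative weights are zeroed out (pruned), and the survivors are renormalized. The `shrink` penalty and the update rule are illustrative assumptions, not the paper's exact formulas.

```python
import numpy as np

def prune_mixture_weights(weights, shrink=0.05, n_iter=50):
    """Iteratively shrink mixture weights and prune those driven negative.

    A simplified stand-in for the paper's update: each pass subtracts a
    small penalty from every weight, zeroes (prunes) any weight that has
    gone negative, and renormalizes the survivors to a valid mixture.
    """
    w = np.asarray(weights, dtype=float).copy()
    for _ in range(n_iter):
        w = w - shrink / w.size         # penalty pushes small weights down
        w[w < 0.0] = 0.0                # prune: negative weight -> component removed
        total = w.sum()
        if total == 0.0:
            break
        w /= total                      # renormalize to sum to one
    return w
```

Because renormalization follows each pruning step, the surviving weights always form a valid mixture, and components with small initial weight are eliminated first.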

  2. Causality

    Science.gov (United States)

    Pearl, Judea

    2000-03-01

Written by one of the pre-eminent researchers in the field, this book provides a comprehensive exposition of modern analysis of causation. It shows how causality has grown from a nebulous concept into a mathematical theory with significant applications in the fields of statistics, artificial intelligence, philosophy, cognitive science, and the health and social sciences. Pearl presents a unified account of the probabilistic, manipulative, counterfactual and structural approaches to causation, and devises simple mathematical tools for analyzing the relationships between causal connections, statistical associations, actions and observations. The book will open the way for including causal analysis in the standard curriculum of statistics, artificial intelligence, business, epidemiology, social science and economics. Students in these areas will find natural models, simple identification procedures, and precise mathematical definitions of causal concepts that traditional texts have tended to evade or make unduly complicated. This book will be of interest to professionals and students in a wide variety of fields. Anyone who wishes to elucidate meaningful relationships from data, predict effects of actions and policies, assess explanations of reported events, or form theories of causal understanding and causal speech will find this book stimulating and invaluable.

  3. Analysis of Minor Component Segregation in Ternary Powder Mixtures

    Directory of Open Access Journals (Sweden)

    Asachi Maryam

    2017-01-01

Full Text Available In many powder handling operations, inhomogeneity in powder mixtures caused by segregation can have a significant adverse impact on the quality as well as the economics of production. Segregation of a minor component consisting of a highly active substance can have serious deleterious effects; an example is the segregation of enzyme granules in detergent powders. In this study, the effects of particle properties and bulk cohesion on the segregation tendency of the minor component are analysed. The minor component is made sticky in a way that does not adversely affect the flowability of the samples. The extent of segregation is evaluated by image processing of photographic records taken of the front face of the heap after the pouring process. The optimum average sieve cut size of the components for which segregation can be reduced is reported. It is also shown that the extent of segregation is significantly reduced by applying a thin layer of liquid to the surfaces of the minor component, promoting an ordered mixture.

  4. Overfitting Bayesian Mixture Models with an Unknown Number of Components.

    Directory of Open Access Journals (Sweden)

    Zoé van Havre

    Full Text Available This paper proposes solutions to three issues pertaining to the estimation of finite mixture models with an unknown number of components: the non-identifiability induced by overfitting the number of components, the mixing limitations of standard Markov Chain Monte Carlo (MCMC sampling techniques, and the related label switching problem. An overfitting approach is used to estimate the number of components in a finite mixture model via a Zmix algorithm. Zmix provides a bridge between multidimensional samplers and test based estimation methods, whereby priors are chosen to encourage extra groups to have weights approaching zero. MCMC sampling is made possible by the implementation of prior parallel tempering, an extension of parallel tempering. Zmix can accurately estimate the number of components, posterior parameter estimates and allocation probabilities given a sufficiently large sample size. The results will reflect uncertainty in the final model and will report the range of possible candidate models and their respective estimated probabilities from a single run. Label switching is resolved with a computationally light-weight method, Zswitch, developed for overfitted mixtures by exploiting the intuitiveness of allocation-based relabelling algorithms and the precision of label-invariant loss functions. Four simulation studies are included to illustrate Zmix and Zswitch, as well as three case studies from the literature. All methods are available as part of the R package Zmix, which can currently be applied to univariate Gaussian mixture models.

  5. Overfitting Bayesian Mixture Models with an Unknown Number of Components.

    Science.gov (United States)

    van Havre, Zoé; White, Nicole; Rousseau, Judith; Mengersen, Kerrie

    2015-01-01

    This paper proposes solutions to three issues pertaining to the estimation of finite mixture models with an unknown number of components: the non-identifiability induced by overfitting the number of components, the mixing limitations of standard Markov Chain Monte Carlo (MCMC) sampling techniques, and the related label switching problem. An overfitting approach is used to estimate the number of components in a finite mixture model via a Zmix algorithm. Zmix provides a bridge between multidimensional samplers and test based estimation methods, whereby priors are chosen to encourage extra groups to have weights approaching zero. MCMC sampling is made possible by the implementation of prior parallel tempering, an extension of parallel tempering. Zmix can accurately estimate the number of components, posterior parameter estimates and allocation probabilities given a sufficiently large sample size. The results will reflect uncertainty in the final model and will report the range of possible candidate models and their respective estimated probabilities from a single run. Label switching is resolved with a computationally light-weight method, Zswitch, developed for overfitted mixtures by exploiting the intuitiveness of allocation-based relabelling algorithms and the precision of label-invariant loss functions. Four simulation studies are included to illustrate Zmix and Zswitch, as well as three case studies from the literature. All methods are available as part of the R package Zmix, which can currently be applied to univariate Gaussian mixture models.
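
A toy illustration of the overfitting idea: deliberately fit too many univariate Gaussian components and let a sparsity-inducing weight update empty the superfluous ones, then count the components whose weights stay away from zero. The pseudo-count `penalty`, the quantile initialization, and the plain EM loop are simplifying assumptions standing in for Zmix's sparse Dirichlet prior and prior parallel tempering.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two well-separated true components; we deliberately overfit with K = 6.
x = np.concatenate([rng.normal(-5.0, 1.0, 200), rng.normal(5.0, 1.0, 200)])

K = 6
penalty = 5.0                          # pseudo-count standing in for a sparse Dirichlet prior
mu = np.quantile(x, np.linspace(0.05, 0.95, K))   # spread initial means over the data
sigma = np.full(K, x.std())
w = np.full(K, 1.0 / K)

for _ in range(300):
    # E-step: responsibilities of each component for each point
    dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))
    r = w * dens
    r /= r.sum(axis=1, keepdims=True)
    n = r.sum(axis=0)                  # effective counts per component
    # M-step with a sparsity-inducing weight update: weak groups are emptied
    w = np.maximum(n - penalty, 0.0)
    w /= w.sum()
    safe = np.maximum(n, 1e-8)
    mu = np.where(n > 1e-6, (r * x[:, None]).sum(axis=0) / safe, mu)
    var = np.where(n > 1e-6, (r * (x[:, None] - mu) ** 2).sum(axis=0) / safe, 1.0)
    sigma = np.maximum(np.sqrt(var), 1e-3)

k_hat = int((w > 0.01).sum())          # number of non-empty components
```

Components whose effective count falls below the penalty are driven to zero weight and never recover, so `k_hat` estimates the number of occupied groups in the overfitted model.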

  6. Component spectra extraction from terahertz measurements of unknown mixtures.

    Science.gov (United States)

    Li, Xian; Hou, D B; Huang, P J; Cai, J H; Zhang, G X

    2015-10-20

The aim of this work is to extract component spectra from unknown mixtures in the terahertz region. To that end, a method, hard modeling factor analysis (HMFA), was applied to resolve terahertz spectral matrices collected from the unknown mixtures. This method does not require any expertise of the user and allows the consideration of nonlinear effects such as peak variations or peak shifts. It describes the spectra using a peak-based nonlinear mathematic model and builds the component spectra automatically by recombination of the resolved peaks through correlation analysis. Meanwhile, modifications on the method were made to take the features of terahertz spectra into account and to deal with the artificial baseline problem that troubles the extraction process of some terahertz spectra. In order to validate the proposed method, simulated wideband terahertz spectra of binary and ternary systems and experimental terahertz absorption spectra of amino acid mixtures were tested. In each test, not only the number of pure components could be correctly predicted but also the identified pure spectra had a good similarity with the true spectra. Moreover, the proposed method associated the molecular motions with the component extraction, making the identification process more physically meaningful and interpretable compared to other methods. The results indicate that the HMFA method with the modifications can be a practical tool for identifying component terahertz spectra in completely unknown mixtures. This work reports the solution to this kind of problem in the terahertz region for the first time, to the best of the authors' knowledge, and represents a significant advance toward exploring physical or chemical mechanisms of unknown complex systems by terahertz spectroscopy.

  7. Efficient speaker verification using Gaussian mixture model component clustering.

    Energy Technology Data Exchange (ETDEWEB)

    De Leon, Phillip L. (New Mexico State University, Las Cruces, NM); McClanahan, Richard D.

    2012-04-01

In speaker verification (SV) systems that employ a support vector machine (SVM) classifier to make decisions on a supervector derived from Gaussian mixture model (GMM) component mean vectors, a significant portion of the computational load is involved in the calculation of the a posteriori probability of the feature vectors of the speaker under test with respect to the individual component densities of the universal background model (UBM). Further, the calculation of the sufficient statistics for the weight, mean, and covariance parameters derived from these same feature vectors also contributes a substantial amount of processing load to the SV system. In this paper, we propose a method that utilizes clusters of GMM-UBM mixture component densities in order to reduce the computational load required. In the adaptation step we score the feature vectors against the clusters, and calculate the a posteriori probabilities and update the statistics exclusively for mixture components belonging to the appropriate clusters. Each cluster is a grouping of multivariate normal distributions and is modeled by a single multivariate distribution. As such, the set of multivariate normal distributions representing the different clusters also forms a GMM. This GMM is referred to as a hash GMM, which can be considered a lower-resolution representation of the GMM-UBM. The mapping that associates the components of the hash GMM with components of the original GMM-UBM is referred to as a shortlist. This research investigates various methods of clustering the components of the GMM-UBM and forming hash GMMs. Of the five different methods presented, one, the Gaussian mixture reduction proposed by Runnalls, easily outperformed the others. This method of Gaussian reduction iteratively reduces the size of a GMM by successively merging pairs of component densities, with pairs selected for merger using a Kullback-Leibler-based metric. Using Runnalls' method of reduction, we …
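
The hash-GMM/shortlist mechanism described above can be sketched as follows: a frame is first scored against a small, lower-resolution GMM, and the full a posteriori computation is then restricted to the UBM components mapped to the winning hash component. The 1-D toy UBM, the hand-built shortlist, and the hard top-1 cluster selection are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def gauss_logpdf(x, mu, var):
    """Log-density of N(mu, var) evaluated at scalar x (vectorized over mu)."""
    return -0.5 * (np.log(2.0 * np.pi * var) + (x - mu) ** 2 / var)

# Toy 1-D "UBM" with 8 components, summarized by a 2-component hash GMM.
ubm_mu = np.array([-6.0, -5.0, -4.0, -3.0, 3.0, 4.0, 5.0, 6.0])
ubm_var = np.ones(8)

# Shortlist: hash component 0 maps to UBM components 0-3, hash 1 to 4-7.
shortlist = {0: [0, 1, 2, 3], 1: [4, 5, 6, 7]}
hash_mu = np.array([-4.5, 4.5])        # lower-resolution summary of the UBM
hash_var = np.array([2.25, 2.25])

def shortlisted_scores(x):
    """Score x against the hash GMM first, then only against the shortlist."""
    best_cluster = int(np.argmax(gauss_logpdf(x, hash_mu, hash_var)))
    idx = shortlist[best_cluster]
    return idx, gauss_logpdf(x, ubm_mu[idx], ubm_var[idx])

idx, scores = shortlisted_scores(-3.2)   # only 4 of the 8 components are evaluated
```

Here half the UBM component evaluations are skipped for every frame; in a realistic system with hundreds of UBM components and a short shortlist, the savings dominate the scoring cost.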

  8. Merging Mixture Components for Cell Population Identification in Flow Cytometry

    Directory of Open Access Journals (Sweden)

    Greg Finak

    2009-01-01

    Full Text Available We present a framework for the identification of cell subpopulations in flow cytometry data based on merging mixture components using the flowClust methodology. We show that the cluster merging algorithm under our framework improves model fit and provides a better estimate of the number of distinct cell subpopulations than either Gaussian mixture models or flowClust, especially for complicated flow cytometry data distributions. Our framework allows the automated selection of the number of distinct cell subpopulations and we are able to identify cases where the algorithm fails, thus making it suitable for application in a high throughput FCM analysis pipeline. Furthermore, we demonstrate a method for summarizing complex merged cell subpopulations in a simple manner that integrates with the existing flowClust framework and enables downstream data analysis. We demonstrate the performance of our framework on simulated and real FCM data. The software is available in the flowMerge package through the Bioconductor project.
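
A common building block of such merging schemes is the moment-preserving merge of two weighted Gaussian components, which keeps the pair's total weight, mean, and variance. The sketch below shows the 1-D case; the actual flowClust/flowMerge criterion for choosing which pair to merge is more involved and not shown.

```python
import numpy as np

def merge_components(w1, mu1, var1, w2, mu2, var2):
    """Moment-preserving merge of two weighted 1-D Gaussian components.

    The merged component preserves the pair's total weight, mean, and
    variance (a generic building block, not the flowMerge criterion).
    """
    w = w1 + w2
    a1, a2 = w1 / w, w2 / w
    mu = a1 * mu1 + a2 * mu2
    var = a1 * (var1 + (mu1 - mu) ** 2) + a2 * (var2 + (mu2 - mu) ** 2)
    return w, mu, var
```

For two equal-weight unit-variance components at -1 and +1, the merge yields a single component at 0 with variance 2: the between-component spread is absorbed into the merged variance.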

  9. Construction of a 21-Component Layered Mixture Experiment Design

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Cooley, Scott K.; Jones, Bradley

    2004-09-22

This paper describes the solution to a unique and challenging mixture experiment design problem involving: (1) 19 and 21 components for two different parts of the design, (2) many single-component and multi-component constraints, (3) augmentation of existing data, (4) a layered design developed in stages, and (5) a no-candidate-point optimal design approach. The problem involved studying the liquidus temperature of spinel crystals as a function of nuclear waste glass composition. The statistical objective was to develop an experimental design by augmenting existing glasses with new nonradioactive and radioactive glasses chosen to cover the designated nonradioactive and radioactive experimental regions. The existing 144 glasses were expressed as 19-component nonradioactive compositions and then augmented with 40 new nonradioactive glasses. These included 8 glasses on the outer layer of the region, 27 glasses on an inner layer, 2 replicate glasses at the centroid, and one replicate each of three existing glasses. Then, the 144 + 40 = 184 glasses were expressed as 21-component radioactive compositions and augmented with 5 radioactive glasses. A D-optimal design algorithm was used to select the new outer layer, inner layer, and radioactive glasses. Several statistical software packages can generate D-optimal experimental designs, but nearly all require a set of candidate points (e.g., vertices) from which to select design points. The large number of components (19 or 21) and many constraints made it impossible to generate the huge number of vertices and other typical candidate points. JMP® was used to select design points without candidate points. JMP uses a coordinate-exchange algorithm modified for mixture experiments, which is discussed in the paper.

  10. An equiratio mixture model for non-additive components : a case study for aspartame/acesulfame-K mixtures

    NARCIS (Netherlands)

    Schifferstein, H.N.J.

    1996-01-01

The Equiratio Mixture Model predicts the psychophysical function for an equiratio mixture type on the basis of the psychophysical functions for the unmixed components. The model reliably estimates the sweetness of mixtures of sugars and sugar-alcohols, but is unable to predict intensity for aspartame/sucrose mixtures.

  11. An Equiratio Mixture Model for non-additive components: a case study for aspartame/acesulfame-K mixtures.

    Science.gov (United States)

    Schifferstein, H N

    1996-02-01

    The Equiratio Mixture Model predicts the psychophysical function for an equiratio mixture type on the basis of the psychophysical functions for the unmixed components. The model reliably estimates the sweetness of mixtures of sugars and sugar-alcohols, but is unable to predict intensity for aspartame/sucrose mixtures. In this paper, the sweetness of aspartame/acesulfame-K mixtures in aqueous and acidic solutions is investigated. These two intensive sweeteners probably do not comply with the model's original assumption of sensory dependency among components. However, they reveal how the Equiratio Mixture Model could be modified to describe and predict mixture functions for non-additive substances. To predict equiratio functions for all similar tasting substances, a new Equiratio Mixture Model should yield accurate predictions for components eliciting similar intensities at widely differing concentration levels, and for substances exhibiting hypo- or hyperadditivity. In addition, it should be able to correct violations of Stevens's power law. These three problems are resolved in a model that uses equi-intense units as the measure of physical concentration. An interaction index in the formula for the constant accounts for the degree of interaction between mixture components. Deviations from the power law are corrected by a nonlinear response output transformation, assuming a two-stage model of psychophysical judgment.
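
The role of Stevens'-power-law psychophysical functions in such models can be illustrated with a deliberately simplified additive sketch. The assumption that component intensities simply add, and the 50/50 `ratio` default, are illustrative only and are not the model's actual equi-intense-unit construction or interaction index.

```python
import numpy as np

def power_psi(c, a, b):
    """Stevens' power law: perceived intensity of concentration c."""
    return a * np.power(c, b)

def equiratio_prediction(c_total, a1, b1, a2, b2, ratio=0.5):
    """Additive sketch of an equiratio mixture prediction: each component
    contributes the intensity of its own share of the total concentration.
    (Illustrative; the published model's constants are more involved.)"""
    return power_psi(ratio * c_total, a1, b1) + power_psi((1.0 - ratio) * c_total, a2, b2)
```

Hypo- or hyperadditivity, as discussed in the abstract, would appear as deviations of measured mixture intensity from this additive baseline; the paper's interaction index quantifies exactly that deviation.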

  12. Bonding and structure in dense multi-component molecular mixtures.

    Science.gov (United States)

    Meyer, Edmund R; Ticknor, Christopher; Bethkenhagen, Mandy; Hamel, Sebastien; Redmer, Ronald; Kress, Joel D; Collins, Lee A

    2015-10-28

    We have performed finite-temperature density functional theory molecular dynamics simulations on dense methane, ammonia, and water mixtures (CH4:NH3:H2O) for various compositions and temperatures (2000 K ≤ T ≤ 10,000 K) that span a set of possible conditions in the interiors of ice-giant exoplanets. The equation-of-state, pair distribution functions, and bond autocorrelation functions (BACF) were used to probe the structure and dynamics of these complex fluids. In particular, an improvement to the choice of the cutoff in the BACF was developed that allowed analysis refinements for density and temperature effects. We note the relative changes in the nature of these systems engendered by variations in the concentration ratios. A basic tenet emerges from all these comparisons that varying the relative amounts of the three heavy components (C,N,O) can effect considerable changes in the nature of the fluid and may in turn have ramifications for the structure and composition of various planetary layers.
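
The bond autocorrelation function used above can be sketched for a precomputed bond-indicator array. The (frames, bonds) boolean layout and the averaging over all time origins are assumptions of this toy version, not the paper's exact cutoff-improved definition.

```python
import numpy as np

def bond_autocorrelation(bonded):
    """Normalized bond autocorrelation function (BACF) sketch.

    `bonded` is a (frames, bonds) boolean array, True when a given pair
    is within the chosen bond cutoff at that frame.  C(t) is averaged
    over all bonds and all time origins, then normalized by C(0).
    Assumes at least one bond exists at some frame (so C(0) > 0).
    """
    frames = bonded.shape[0]
    b = bonded.astype(float)
    c = np.array([(b[:frames - t] * b[t:]).mean() for t in range(frames)])
    return c / c[0]
```

A permanently bonded pair gives a flat BACF of 1, while transient bonds make the function decay toward zero, which is how the decay rate probes the fluid's bonding dynamics.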

  13. Disentangling the developmental and neurobehavioural effects of perinatal exposure to a chemical mixture found in blood of Arctic populations: differential toxicity of mixture components

    Energy Technology Data Exchange (ETDEWEB)

    Bowers, W.; Nakai, J.; Yagminas, A.; Chu, I.; Moir, D. [Health Canada (Canada)

    2004-09-15

The current study was designed to evaluate the neurobehavioral effects of perinatal exposure to a chemical mixture that is based on the relative concentrations of persistent organic pollutants found in the blood of Canadian Arctic populations and contains 14 PCB congeners, 12 organochlorine pesticides and methyl mercury. This study compared the effects of the complete mixture with the effects of its three major components (the PCB component, the organochlorine pesticide component, and the methyl mercury component). By examining a range of neurobehavioural functions over development, we also determined whether specific neurobehavioural disturbances produced by the mixture can be attributed to individual components and whether neurobehavioural effects produced by components are altered by concurrent exposure to the other components in the mixture. Ninety-two nulliparous female Sprague-Dawley rats served as subjects.

  14. Two-component mixture cure rate model with spline estimated nonparametric components.

    Science.gov (United States)

    Wang, Lu; Du, Pang; Liang, Hua

    2012-09-01

    In some survival analysis of medical studies, there are often long-term survivors who can be considered as permanently cured. The goals in these studies are to estimate the noncured probability of the whole population and the hazard rate of the susceptible subpopulation. When covariates are present as often happens in practice, to understand covariate effects on the noncured probability and hazard rate is of equal importance. The existing methods are limited to parametric and semiparametric models. We propose a two-component mixture cure rate model with nonparametric forms for both the cure probability and the hazard rate function. Identifiability of the model is guaranteed by an additive assumption that allows no time-covariate interactions in the logarithm of hazard rate. Estimation is carried out by an expectation-maximization algorithm on maximizing a penalized likelihood. For inferential purpose, we apply the Louis formula to obtain point-wise confidence intervals for noncured probability and hazard rate. Asymptotic convergence rates of our function estimates are established. We then evaluate the proposed method by extensive simulations. We analyze the survival data from a melanoma study and find interesting patterns for this study. © 2011, The International Biometric Society.
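
The two-component mixture cure rate structure, S(t) = π + (1 − π)·S_u(t), is easy to state in code. The constant (exponential) hazard used here for the susceptible subpopulation is a simplifying assumption; the paper estimates both the cure probability and the hazard rate nonparametrically with splines.

```python
import numpy as np

def population_survival(t, cure_prob, hazard_rate):
    """Mixture cure model: S_pop(t) = pi + (1 - pi) * S_u(t).

    Exponential susceptible survival S_u(t) = exp(-hazard_rate * t) is a
    simplifying assumption; long-term survivors level the curve off at pi.
    """
    return cure_prob + (1.0 - cure_prob) * np.exp(-hazard_rate * t)
```

The plateau of the population survival curve at π is what identifies the cured fraction: unlike an ordinary survival function, it does not decay to zero.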

  15. Effect of the addition of mixture of plant components on the mechanical properties of wheat bread

    Science.gov (United States)

    Wójcik, Monika; Dziki, Dariusz; Biernacka, Beata; Różyło, Renata; Miś, Antoni; Hassoon, Waleed H.

    2017-10-01

Instrumental methods of measuring the mechanical properties of bread can be used to determine changes in those properties during storage, as well as to determine the effect of various additives on the bread texture. The aim of this study was to investigate the effect of a mixture of plant components on the physical properties of wheat bread. In particular, the mechanical properties of the crumb and crust were studied. A sensory evaluation of the end product was also performed. The mixture of plant components (carob fiber, milled red quinoa grain, and black oat, 1:2:2) was added to wheat flour at 0, 5, 10, 15, 20 and 25%. The results showed that increasing the proportion of the proposed additive significantly increased the water absorption of the flour mixtures. Moreover, use of the mixture of plant components above 5% increased bread volume and decreased crumb density. Furthermore, the addition of the mixture of plant components significantly affected the mechanical properties of the bread crumb: crumb hardness decreased as the share of the additive increased. The highest cohesiveness was obtained for bread with 10% of the additive and the lowest for bread with 25%. Most importantly, enriching the wheat flour with the mixture of plant components significantly reduced the crust failure force and crust failure work. The sensory evaluation showed that additions of up to 10% had little effect on bread quality.

  16. Component separation in harmonically trapped boson-fermion mixtures

    DEFF Research Database (Denmark)

    Nygaard, Nicolai; Mølmer, Klaus

    1999-01-01

We present a numerical study of mixed boson-fermion systems at zero temperature in isotropic and anisotropic harmonic traps. We investigate the phenomenon of component separation as a function of the strength of the interparticle interaction. While solving a Gross-Pitaevskii mean-field equation for the boson distribution in the trap, we utilize two different methods to extract the density profile of the fermion component: a semiclassical Thomas-Fermi approximation and a quantum-mechanical Slater determinant Schrodinger equation.

  17. Component Identification in Multi-Chemical Mixtures With Swept-Wavelength Resonant-Raman Spectroscopy

    Science.gov (United States)

    2011-03-18

Journal article, March 2011. Robert Lunsford, David Gillis, Jacob Grun, Pratima … Abstract fragment: "… fractional molecular abundances. Introduction: The utilization of Raman spectroscopy, specifically Ultraviolet Resonance Raman spectroscopy, for …"

  18. Phase Equilibrium Calculations for Multi-Component Polar Fluid Mixtures with tPC-PSAFT

    DEFF Research Database (Denmark)

    Karakatsani, Eirini; Economou, Ioannis

    2007-01-01

    The truncated Perturbed-Chain Polar Statistical Associating Fluid Theory (tPC-PSAFT) is applied to a number of different mixtures, including binary, ternary and quaternary mixtures of components that differ substantially in terms of intermolecular interactions and molecular size. In contrast to m...

  19. XeCl Excimer Laser with Three- and Four-Component Mixture of Active Gases

    International Nuclear Information System (INIS)

    Iwanejko, L.; Pokora, L.

    1998-01-01

Selected results of investigations of a XeCl excimer laser employing a new type of (four-component) active gas mixture, He-Kr:Xe-HCl, are presented. Instead of pure Xe, the mixture includes an unseparated blend of Kr and Xe gases, which is much less expensive than pure xenon. A comparison of the durations and energies of pulses generated in the XeCl excimer laser using the three- or four-component gaseous active medium (He-Xe-HCl or He-Kr:Xe-HCl) is made. The investigations were carried out using a laser system with UV preionization and a self-sustained pumping discharge. (author)

  20. Mixture component effects on the in vitro dermal absorption of pentachlorophenol

    Energy Technology Data Exchange (ETDEWEB)

    Riviere, J.E.; Qiao, G.; Baynes, R.E.; Brooks, J.D. [Coll. of Veterinary Medicine, North Carolina State Univ., Raleigh, NC (United States); Mumtaz, M. [Agency for Toxic Substances and Disease Registry (ATSDR), Atlanta, GA (United States)

    2001-08-01

Interactions between chemicals in a mixture and interactions of mixture components with the skin can significantly alter the rate and extent of percutaneous absorption, as well as the cutaneous disposition of a topically applied chemical. The predictive ability of dermal absorption models, and consequently the dermal risk assessment process, would be greatly improved by the elucidation and characterization of these interactions. Pentachlorophenol (PCP), a compound known to penetrate the skin readily, was used as a marker compound to examine mixture component effects using in vitro porcine skin models. PCP was administered in ethanol or in a 40% ethanol/60% water mixture or a 40% ethanol/60% water mixture containing either the rubefacient methyl nicotinate (MNA) or the surfactant sodium lauryl sulfate (SLS), or both MNA and SLS. Experiments were also conducted with ¹⁴C-labelled 3,3',4,4'-tetrachlorobiphenyl (TCB) and 3,3',4,4',5-pentachlorobiphenyl (PCB). Maximal PCP absorption was 14.12% of the applied dose from the mixture containing SLS, MNA, ethanol and water. However, when PCP was administered in ethanol only, absorption was only 1.12% of the applied dose. There were also qualitative differences among the absorption profiles for the different PCP mixtures. In contrast with the PCP results, absorption of TCB or PCB was negligible in perfused porcine skin, with only 0.14% of the applied TCB dose and 0.05% of the applied PCB dose being maximally absorbed. The low absorption levels for the PCB congeners precluded the identification of mixture component effects. These results suggest that dermal absorption estimates from a single chemical exposure may not reflect absorption seen after exposure as a chemical mixture and that absorption of both TCB and PCB are minimal in this model system. (orig.)

  1. Construction of a 21-Component Layered Mixture Experiment Design Using a New Mixture Coordinate-Exchange Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Cooley, Scott K.; Jones, Bradley

    2005-11-01

    This paper describes the solution to a unique and challenging mixture experiment design problem involving: 1) 19 and 21 components for two different parts of the design, 2) many single-component and multi-component constraints, 3) augmentation of existing data, 4) a layered design developed in stages, and 5) a no-candidate-point optimal design approach. The problem involved studying the liquidus temperature of spinel crystals as a function of nuclear waste glass composition. The statistical objective was to develop an experimental design by augmenting existing glasses with new nonradioactive and radioactive glasses chosen to cover the designated nonradioactive and radioactive experimental regions. The existing 144 glasses were expressed as 19-component nonradioactive compositions and then augmented with 40 new nonradioactive glasses. These included 8 glasses on the outer layer of the region, 27 glasses on an inner layer, 2 replicate glasses at the centroid, and one replicate each of three existing glasses. Then, the 144 + 40 = 184 glasses were expressed as 21-component radioactive compositions, and augmented with 5 radioactive glasses. A D-optimal design algorithm was used to select the new outer layer, inner layer, and radioactive glasses. Several statistical software packages can generate D-optimal experimental designs, but nearly all of them require a set of candidate points (e.g., vertices) from which to select design points. The large number of components (19 or 21) and many constraints made it impossible to generate the huge number of vertices and other typical candidate points. JMP was used to select design points without candidate points. JMP uses a coordinate-exchange algorithm modified for mixture experiments, which is discussed and illustrated in the paper.
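
A candidate-free coordinate-exchange pass can be sketched for a tiny first-order (Scheffé) mixture model. The grid of trial values and the proportional rescaling of the remaining components to keep each blend summing to one are simplifying stand-ins for JMP's actual mixture-modified exchange; only improving moves are accepted, so the D-criterion never decreases.

```python
import numpy as np

def d_criterion(X):
    """D-optimality criterion |X'X| for a first-order Scheffe mixture model."""
    return np.linalg.det(X.T @ X)

def coordinate_exchange(n_points=6, q=3, n_pass=20, seed=0):
    """Tiny candidate-free coordinate exchange on the q-component simplex."""
    rng = np.random.default_rng(seed)
    X = rng.dirichlet(np.ones(q), size=n_points)   # random start on the simplex
    grid = np.linspace(0.0, 1.0, 21)               # trial values for one proportion
    for _ in range(n_pass):
        for i in range(n_points):
            for j in range(q):
                base = X[i].copy()                 # current blend, sums to one
                best_val, best_row = d_criterion(X), base.copy()
                mask = np.arange(q) != j
                for v in grid:
                    row = base.copy()
                    rest = 1.0 - base[j]
                    if rest > 0.0:
                        # rescale the other components so the blend sums to one
                        row[mask] = base[mask] * (1.0 - v) / rest
                    else:
                        row[mask] = (1.0 - v) / (q - 1)
                    row[j] = v
                    X[i] = row
                    val = d_criterion(X)
                    if val > best_val:             # accept only improvements
                        best_val, best_row = val, row.copy()
                X[i] = best_row
    return X

X = coordinate_exchange()
```

For a first-order mixture model the optimizer is drawn toward the simplex vertices, the classical D-optimal support points; the appeal of this scheme, as in the paper, is that no candidate list of vertices is ever enumerated.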

  2. Mixture estimation with state-space components and Markov model of switching

    Czech Academy of Sciences Publication Activity Database

    Nagy, Ivan; Suzdaleva, Evgenia

    2013-01-01

Vol. 37, No. 24 (2013), pp. 9970-9984. ISSN 0307-904X. R&D Projects: GA TA ČR TA01030123. Institutional support: RVO:67985556. Keywords: probabilistic dynamic mixtures * probability density function * state-space models * recursive mixture estimation * Bayesian dynamic decision making under uncertainty * Kerridge inaccuracy. Subject RIV: BC - Control Systems Theory. Impact factor: 2.158, year: 2013. http://library.utia.cas.cz/separaty/2013/AS/nagy-mixture estimation with state-space components and markov model of switching.pdf

  3. Minimum Hellinger distance estimation for k-component poisson mixture with random effects.

    Science.gov (United States)

    Xiang, Liming; Yau, Kelvin K W; Van Hui, Yer; Lee, Andy H

    2008-06-01

The k-component Poisson regression mixture with random effects is an effective model for describing the heterogeneity of clustered count data arising from several latent subpopulations. However, the residual maximum likelihood (REML) estimates of the regression coefficients and variance component parameters tend to be unstable and may result in misleading inferences in the presence of outliers or extreme contamination. In the literature, minimum Hellinger distance (MHD) estimation has been investigated to obtain robust estimation for finite Poisson mixtures. This article aims to develop a robust MHD estimation approach for k-component Poisson mixtures with normally distributed random effects. By applying the Gaussian quadrature technique to approximate the integrals involved in the marginal distribution, the marginal probability function of the k-component Poisson mixture with random effects can be approximated by the summation of a set of finite Poisson mixtures. A simulation study shows that the MHD estimates perform satisfactorily for data without outlying observations, and outperform the REML estimates when the data are contaminated. Application to a data set of recurrent urinary tract infections (UTI) with random institution effects demonstrates the practical use of the robust MHD estimation method.
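
The minimum Hellinger distance idea can be sketched with a coarse grid search for a plain two-component Poisson mixture (no random effects or Gaussian quadrature). The grids, the toy count data, and the truncation of the support at the largest observed count are illustrative assumptions.

```python
import numpy as np
from math import exp, factorial

def poisson_pmf(k, lam):
    """Poisson pmf evaluated at the integer points in k."""
    return np.array([exp(-lam) * lam ** x / factorial(x) for x in k])

def hellinger(p, q):
    """Hellinger distance between two pmfs on the same (truncated) support."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def mhd_two_poisson(counts, lam_grid, w_grid):
    """Coarse grid-search minimum-Hellinger-distance fit of a
    two-component Poisson mixture to a vector of event-count frequencies."""
    k = np.arange(counts.size)
    emp = counts / counts.sum()          # empirical pmf on the truncated support
    best_d, best_fit = np.inf, None
    for l1 in lam_grid:
        for l2 in lam_grid:
            if l2 <= l1:                 # enforce l1 < l2 to avoid relabelling
                continue
            for u in w_grid:
                model = u * poisson_pmf(k, l1) + (1.0 - u) * poisson_pmf(k, l2)
                d = hellinger(emp, model)
                if d < best_d:
                    best_d, best_fit = d, (u, l1, l2)
    return best_fit

# Toy frequencies of 0,1,2,... events, close to a 50/50 mix of Poisson(1) and Poisson(6)
counts = np.array([19, 19, 10, 7, 8, 10, 9, 8, 5, 3, 2], dtype=float)
w_hat, lam1, lam2 = mhd_two_poisson(counts,
                                    np.arange(0.5, 8.5, 0.5),
                                    np.arange(0.1, 1.0, 0.1))
```

Because the Hellinger distance downweights cells where the empirical and model pmfs disagree wildly, a handful of contaminated counts perturbs this fit far less than it would a likelihood-based one, which is the robustness property the abstract highlights.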

  4. Behavioral Evidence for Enhanced Processing of the Minor Component of Binary Odor Mixtures in Larval Drosophila

    Directory of Open Access Journals (Sweden)

    Yi-chun Chen

    2017-11-01

    Full Text Available A fundamental problem in deciding between mutually exclusive options is that the decision needs to be categorical even though the properties of the options often differ only in degree. We developed an experimental handle to study this aspect of behavioral organization. Larval Drosophila were trained such that in one set of animals odor A was rewarded but odor B was not (A+/B), whereas a second set of animals was trained reciprocally (A/B+). We then measured the preference of the larvae for A, for B, or for “morphed” mixtures of A and B, that is, for mixtures differing in the ratio of the two components. As expected, the larvae showed higher preference when only the previously rewarded odor was presented than when only the previously unrewarded odor was presented. For mixtures of A and B that differed in the ratio of the two components, the major component dominated preference behavior, but it dominated less than expected from a linear relationship between mixture ratio and preference behavior. This suggests that a minor component can have an enhanced impact in a mixture, relative to such a linear expectation. The current paradigm may prove useful in understanding how nervous systems generate discrete outputs in the face of inputs that differ only gradually.

  5. Algorithms and programs of dynamic mixture estimation unified approach to different types of components

    CERN Document Server

    Nagy, Ivan

    2017-01-01

    This book provides a general theoretical background for constructing recursive Bayesian estimation algorithms for mixture models. It collects the recursive algorithms for estimating dynamic mixtures of various distributions and brings them into a unified form, providing a scheme for constructing the estimation algorithm for a mixture of components modeled by distributions with reproducible statistics. It offers recursive estimation of dynamic mixtures that is free of iterative processes and as close to an analytical solution as possible. In addition, these methods can be used online and perform learning simultaneously, which improves their efficiency during estimation. The book includes detailed program codes for solving the presented theoretical tasks. The codes are implemented in an open-source platform for engineering computations and serve to illustrate the theory and demonstrate the work of the included algorithms.
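
    The recursive, iteration-free flavor of mixture estimation described above can be illustrated with a minimal quasi-Bayes sketch: each observation fractionally updates conjugate statistics of a static Gaussian mixture in a single pass. The class and its update rule are assumptions for illustration, not the book's code, and the component variance is taken as known.

```python
import numpy as np

class RecursiveGaussianMixture:
    """Single-pass (quasi-Bayes) recursive estimation of a Gaussian mixture:
    every observation fractionally updates Dirichlet statistics for the
    weights and conjugate normal statistics for each component, so no
    EM-style iteration over the data is needed. Illustrative sketch with
    known component variance."""

    def __init__(self, means0, sigma=1.0):
        k = len(means0)
        self.kappa = np.ones(k)                    # Dirichlet counts (weights)
        self.n = np.ones(k)                        # per-component pseudo-counts
        self.s = np.asarray(means0, float).copy()  # per-component sums (n = 1)
        self.sigma = sigma

    @property
    def weights(self):
        return self.kappa / self.kappa.sum()

    @property
    def means(self):
        return self.s / self.n

    def update(self, x):
        # Posterior probability that x came from each component.
        lik = np.exp(-0.5 * ((x - self.means) / self.sigma) ** 2)
        post = self.weights * lik
        post /= post.sum()
        # Fractional update of the sufficient statistics.
        self.kappa += post
        self.n += post
        self.s += post * x

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-3.0, 1.0, 500), rng.normal(4.0, 1.0, 500)])
rng.shuffle(data)
est = RecursiveGaussianMixture(means0=[-1.0, 1.0], sigma=1.0)
for x in data:
    est.update(x)
print(np.round(np.sort(est.means), 1), np.round(est.weights, 2))
```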

  6. Measuring two-phase and two-component mixtures by radiometric technique

    International Nuclear Information System (INIS)

    Mackuliak, D.; Rajniak, I.

    1984-01-01

    The applicability of the radiometric method to measuring the water content of steam was tested. The experiments were carried out under model conditions in which steam was replaced with a two-component mixture of water and air. The beta radiation source was the isotope ²⁰⁴Tl (E_max = 0.765 MeV) with an activity of 19.35 MBq. Measurements were carried out within the range of surface density of the mixture from 0.119 kg·m⁻² to 0.130 kg·m⁻². The mixture velocity was 5.1 m·s⁻¹ to 7.1 m·s⁻¹. The observed dependence of the relative pulse frequency on the specific water content of the mixture was approximated by a linear regression. (B.S.)

  7. Equilibrium properties of a multi-component ionic mixture I. Sum rules for correlation functions

    NARCIS (Netherlands)

    van Wonderen, A.J.; Suttorp, L.G.

    1987-01-01

    Equilibrium statistical methods are used to derive sum rules for two- and three-particle correlation functions of a multi-component ionic mixture. Some of these rules are general consequences of the electrostatic character of the interaction, whereas others depend on specific thermodynamic

  8. Transport of a two-component mixture in one-dimensional channels

    NARCIS (Netherlands)

    Borman, V.D.; Tronin, V.N.; Tronin; Troyan

    2004-01-01

    The transport of a two-component gas mixture in subnanometer channels is investigated theoretically for an arbitrary filling of channels. Special attention is paid to consistent inclusion of density effects, which are associated both with the interaction and with a finite size of particles. The

  9. Comparing numerical and analytical approaches to strongly interacting two-component mixtures in one dimensional traps

    DEFF Research Database (Denmark)

    Bellotti, Filipe Furlan; Salami Dehkharghani, Amin; Zinner, Nikolaj Thomas

    2017-01-01

    We investigate one-dimensional harmonically trapped two-component systems for repulsive interaction strengths ranging from the non-interacting to the strongly interacting regime for Fermi-Fermi mixtures. A new and powerful mapping between the interaction strength parameters from a continuous...

  10. Enantiomer-specific analysis of multi-component mixtures by correlated electron imaging-ion mass spectrometry

    NARCIS (Netherlands)

    Rafiee Fanood, M.M.; Ram, N.B.; Lehmann, C.S.; Powis, I.; Janssen, M.H.M.

    2015-01-01

    Simultaneous, enantiomer-specific identification of chiral molecules in multi-component mixtures is extremely challenging. Many established techniques for single-component analysis fail to provide selectivity in multi-component mixtures and lack sensitivity for dilute samples. Here we show how

  11. Mixture

    Directory of Open Access Journals (Sweden)

    Silva-Aguilar Martín

    2011-01-01

    Full Text Available Metals are ubiquitous pollutants present as mixtures. In particular, the mixture of arsenic, cadmium and lead is among the leading toxic agents detected in the environment. These metals have carcinogenic and cell-transforming potential. In this study, we used a two-step cell transformation model to determine the role of oxidative stress in the transformation induced by a mixture of arsenic, cadmium and lead. Oxidative damage and the antioxidant response were determined. Treatment with the metal mixture induced an increase in damage markers and in the antioxidant response. Loss of cell viability and increased transforming potential were observed during the promotion phase. This finding correlated significantly with the generation of reactive oxygen species. Cotreatment with N-acetyl-cysteine affected the transforming capacity: a diminution was found in the initiation phase, while a total block of the transforming capacity was observed in the promotion phase. Our results suggest that the oxidative stress generated by the metal mixture plays an important role only in the promotion phase, where it promotes transforming capacity.

  12. Bayesian estimation of mixtures with dynamic transitions and known component parameters

    Czech Academy of Sciences Publication Activity Database

    Nagy, I.; Suzdaleva, Evgenia; Kárný, Miroslav

    2011-01-01

    Roč. 47, č. 4 (2011), s. 572-594 ISSN 0023-5954 R&D Projects: GA MŠk 1M0572; GA TA ČR TA01030123; GA ČR GA102/08/0567 Grant - others:Skoda Auto(CZ) ENS/2009/UTIA Institutional research plan: CEZ:AV0Z10750506 Keywords : mixture model * Bayesian estimation * approximation * clustering * classification Subject RIV: BC - Control Systems Theory Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/AS/nagy-bayesian estimation of mixtures with dynamic transitions and known component parameters.pdf

  13. Mixture Statistical Distribution Based Multiple Component Model for Target Detection in High Resolution SAR Imagery

    Directory of Open Access Journals (Sweden)

    Chu He

    2017-11-01

    Full Text Available This paper proposes an innovative Mixture Statistical Distribution Based Multiple Component (MSDMC) model for target detection in high-spatial-resolution Synthetic Aperture Radar (SAR) images. Traditional detection algorithms usually ignore the spatial relationship among a target's components. In the presented method, however, both the structural information and the statistical distribution are considered to better recognize the target. Firstly, a method based on compressed-sensing reconstruction is used to recover the SAR image. Then, a multiple component model composed of a root filter and corresponding part filters is applied to describe the structural information of the target. In the following step, mixture statistical distributions are utilized to discriminate the target from the background, and a Method of Logarithmic Cumulants (MoLC) based Expectation Maximization (EM) approach is adopted to estimate the parameters of the mixture statistical distribution model, which is finally merged into the proposed MSDMC framework together with the multiple component model. In the experiments, aeroplanes and electrical power towers in TerraSAR-X SAR images are detected at three spatial resolutions. The results indicate that the presented MSDMC model has the potential to improve detection performance compared with state-of-the-art SAR target detection methods.
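
    The EM step at the heart of the parameter estimation can be illustrated in a simplified setting: a standard two-component Gaussian-mixture EM separating bright "target" pixels from background clutter. Real SAR amplitude statistics are heavy-tailed and the paper uses MoLC-based updates, so the Gaussian choice and the synthetic data here are purely illustrative.

```python
import numpy as np

def em_two_gaussian(x, iters=200):
    """Plain EM for a two-component Gaussian mixture, used here as a
    simplified stand-in for the paper's MoLC-based EM."""
    mu = np.array([x.min(), x.max()])          # well-separated initial means
    var = np.full(2, x.var())
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each pixel.
        pdf = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = w * pdf
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and variances.
        nk = r.sum(axis=0)
        w = nk / x.size
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

rng = np.random.default_rng(2)
# Synthetic amplitude image: dim background clutter plus bright target pixels.
pixels = np.concatenate([rng.normal(0.2, 0.05, 9000), rng.normal(0.9, 0.1, 1000)])
w, mu, var = em_two_gaussian(pixels)
print(np.round(np.sort(mu), 2), np.round(np.sort(w), 2))
```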

  14. Efficient testing of the homogeneity, scale parameters and number of components in the Rayleigh mixture

    International Nuclear Information System (INIS)

    Stehlik, M.; Ososkov, G.A.

    2003-01-01

    The statistical problem of expanding the experimental distribution of transverse momenta into Rayleigh distributions is considered. A highly efficient procedure for testing the hypothesis of homogeneity of the observed measurements, optimal in the sense of Bahadur, is constructed. The exact likelihood ratio (LR) test of the scale parameter of the Rayleigh distribution is proposed for cases in which the hypothesis of homogeneity holds. Otherwise, an efficient procedure for testing the number of components in the mixture is proposed.
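
    For a scale-parameter test of the kind mentioned above, one convenient exact construction (a sketch in the spirit of, not necessarily identical to, the paper's LR test) uses the fact that for Rayleigh data the scaled sum of squares has an exact chi-square distribution. Function names and data below are invented for the example.

```python
import numpy as np
from scipy.stats import chi2

def rayleigh_scale_test(x, sigma0, alpha=0.05):
    """Exact two-sided test of H0: sigma = sigma0 for a Rayleigh sample.
    Under H0, T = sum(x_i**2) / sigma0**2 is exactly chi-square with 2n
    degrees of freedom (each x_i**2 / sigma0**2 is chi-square with 2 df),
    so no asymptotic approximation is needed."""
    n = x.size
    t = np.sum(x ** 2) / sigma0 ** 2
    lo, hi = chi2.ppf([alpha / 2.0, 1.0 - alpha / 2.0], df=2 * n)
    p = 2.0 * min(chi2.cdf(t, 2 * n), chi2.sf(t, 2 * n))
    return bool(t < lo or t > hi), p

rng = np.random.default_rng(3)
x = rng.rayleigh(scale=1.0, size=200)                  # true sigma = 1
reject_true, p_true = rayleigh_scale_test(x, sigma0=1.0)
reject_false, p_false = rayleigh_scale_test(x, sigma0=1.5)
print(reject_true, reject_false)
```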

  15. Radiation-energy partition among mixture components: current ideas on an old question

    International Nuclear Information System (INIS)

    Swallow, A.J.

    1988-01-01

    We review the basis of the familiar idea that the energy partition among mixture components in the initial stage is governed by the total electron fraction. For many problems in radiation chemistry, it is better to use the valence-electron fraction. We also point out recent developments in more detailed treatments, which indicate limitations of the very concept of energy partition for determining the yields of the initial molecular species that appear under irradiation. (author)

  16. Optimum Tolerance Design Using Component-Amount and Mixture-Amount Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.; Ozler, Cenk; Sehirlioglu, Ali Kemal

    2013-08-01

    One type of tolerance design problem involves optimizing component and assembly tolerances to minimize the total cost (sum of manufacturing cost and quality loss). Previous literature recommended using traditional response surface (RS) designs and models to solve this type of tolerance design problem. In this article, component-amount (CA) and mixture-amount (MA) approaches are proposed as more appropriate for solving this type of tolerance design problem. The advantages of the CA and MA approaches over the RS approach are discussed. Reasons for choosing between the CA and MA approaches are also discussed. The CA and MA approaches (experimental design, response modeling, and optimization) are illustrated using real examples.

  17. The cytoplasm of living cells: a functional mixture of thousands of components

    International Nuclear Information System (INIS)

    Sear, Richard P

    2005-01-01

    Inside every living cell is the cytoplasm: a fluid mixture of thousands of different macromolecules, predominantly proteins. This mixture is where most of the biochemistry that enables living cells to function occurs, and it is perhaps the most complex liquid on earth. Here we take an inventory of what is actually in this mixture. Recent genome-sequencing work has given us, for the first time, at least some information on all of these thousands of components. Having done so, we consider two physical phenomena in the cytoplasm: diffusion and possible phase separation. Diffusion is slower in the highly crowded cytoplasm than in dilute solution. Reasonable estimates of this slow-down can be obtained and their consequences explored; for example, monomer-dimer equilibria are established approximately 20 times more slowly than in a dilute solution. Phase separation, in all but exceptional cells, appears not to be a problem, despite the high density and the correspondingly strong protein-protein interactions. We suggest that this may be partially a by-product of the evolution of other properties, and partially a result of the huge number of components present.

  18. Decoupling multimode vibrational relaxations in multi-component gas mixtures: Analysis of sound relaxational absorption spectra

    International Nuclear Information System (INIS)

    Zhang Ke-Sheng; Wang Shu; Zhu Ming; Ding Yi; Hu Yi

    2013-01-01

    Decoupling the complicated vibrational-vibrational (V-V) coupling of a multimode vibrational relaxation remains a challenge for analyzing sound relaxational absorption in multi-component gas mixtures. In our previous work [Acta Phys. Sin. 61 174301 (2012)], an analytical model to predict the sound absorption from vibrational relaxation in a gas medium was proposed. In this paper, we develop the model to decouple the V-V coupled energy into each vibrational-translational deexcitation path, and analyze how the multimode relaxations form the peaks of sound absorption spectra in gas mixtures. We prove that a multimode relaxation is the sum of its decoupled single-relaxation processes, and that only a decoupled process with a significant isochoric molar heat can be observed as an absorption peak. The decoupling model clarifies the essential processes behind the peaks in spectra arising from multimode relaxations in multi-component gas mixtures. The simulation validates the proposed decoupling model. (electromagnetism, optics, acoustics, heat transfer, classical mechanics, and fluid dynamics)

  19. Straightforward dimensionless experimental formulae for flash point of binary mixtures of two flammable components

    Directory of Open Access Journals (Sweden)

    Hristova Mariana

    2012-01-01

    Full Text Available Dimensionless experimental formulae based on a rational reciprocal function have been developed for the correlation of flash-point data of binary mixtures of two flammable components. The formulae are based on data obtained from flash-point experiments. The proposed approach requires only two coefficients, the molar fractions of the components, and the flash-point temperatures of the pure flammable components to be known in advance. Literature data were used for verification and validation of the formulae; the obtained results indicate that the accuracy is comparable to, and to some extent better than, that of conventional flash-point prediction models. Dimensional analysis and scaling of the data were performed in order to define the correct construction of the equation fitting flash-point data in dimensionless form, using the independent variables suggested by Catoire. A Stefan number relevant to the flash point of a single compound or a blend has been defined.
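
    A rational reciprocal correlation of the kind described can be fitted by ordinary linear least squares after rewriting it in linear form. The specific functional form T(x) = (a + bx)/(1 + cx), the coefficient values, and the synthetic data below are assumptions for illustration, not the paper's actual dimensionless formulae.

```python
import numpy as np

def fit_rational_reciprocal(x, T):
    """Fit T(x) = (a + b*x) / (1 + c*x) by rewriting it as the linear
    relation T = a + b*x - c*(x*T) and solving ordinary least squares."""
    A = np.column_stack([np.ones_like(x), x, -x * T])
    a, b, c = np.linalg.lstsq(A, T, rcond=None)[0]
    return a, b, c

# Synthetic flash-point data in kelvin; the endpoints play the role of the
# pure-component flash points (x = mole fraction of component 1).
a_true, b_true, c_true = 320.0, 160.0, 0.6
x = np.linspace(0.0, 1.0, 11)
T = (a_true + b_true * x) / (1.0 + c_true * x)
a, b, c = fit_rational_reciprocal(x, T)
print(round(a, 1), round(b, 1), round(c, 2))
```

    With real experimental data the same linearized least-squares step applies; only the fitted coefficients change.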

  20. Maximum likelihood estimation of semiparametric mixture component models for competing risks data.

    Science.gov (United States)

    Choi, Sangbum; Huang, Xuelin

    2014-09-01

    In the analysis of competing risks data, the cumulative incidence function is a useful quantity for characterizing the crude risk of failure from a specific event type. In this article, we consider an efficient semiparametric analysis of mixture component models on cumulative incidence functions. Under the proposed mixture model, latency survival regressions given the event type are performed through a class of semiparametric models that encompasses the proportional hazards model and the proportional odds model, allowing for time-dependent covariates. The marginal proportions of the occurrences of cause-specific events are assessed by a multinomial logistic model. Our mixture modeling approach is advantageous in that it jointly estimates the model parameters associated with all competing risks under consideration, satisfying the constraint that the cumulative probabilities of failing from the various causes add up to one given any covariates. We develop a novel maximum likelihood scheme based on semiparametric regression analysis that facilitates efficient and reliable estimation. Statistical inferences can be conveniently made from the inverse of the observed information matrix. We establish the consistency and asymptotic normality of the proposed estimators. We validate small-sample properties with simulations and demonstrate the methodology with a data set from a study of follicular lymphoma. © 2014, The International Biometric Society.

  1. Partitioning detectability components in populations subject to within-season temporary emigration using binomial mixture models.

    Science.gov (United States)

    O'Donnell, Katherine M; Thompson, Frank R; Semlitsch, Raymond D

    2015-01-01

    Detectability of individual animals is highly variable and nearly always less than one; binomial mixture models can account for multiple sources of variation in detectability. The state process of the hierarchical model describes the ecological mechanisms that generate spatial and temporal patterns in abundance, while the observation model accounts for the imperfect nature of counting individuals due to temporary emigration and false absences. We illustrate our model's potential advantages, including the allowance of temporary emigration between sampling periods, with a case study of southern red-backed salamanders, Plethodon serratus. We fit our model and a standard binomial mixture model to counts of terrestrial salamanders surveyed at 40 sites during 3-5 surveys each spring and fall of 2010-2012. Our model generated parameter estimates similar to those of the standard binomial mixture model. Aspect was the best predictor of salamander abundance in our case study; abundance increased as aspect became more northeasterly. Increased time since rainfall strongly decreased salamander surface activity (i.e., availability for sampling), while higher amounts of woody cover objects and rocks increased conditional detection probability (i.e., the probability of capture given that an animal is exposed to sampling). By explicitly accounting for both components of detectability, we increased congruence between our statistical modeling and our ecological understanding of the system. We stress the importance of choosing survey locations and protocols that maximize species availability and conditional detection probability to increase the reliability of population parameter estimates.
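
    The standard binomial (N-)mixture model that the paper extends can be sketched as follows: the latent site abundance is summed out of the likelihood, and abundance and detection parameters are estimated jointly. This is the basic Royle-type model without the within-season temporary-emigration component developed in the article; all names and the synthetic survey data are illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import binom, poisson

def nmix_negloglik(params, y, n_max=150):
    """Negative log-likelihood of the basic binomial (N-)mixture model:
    latent site abundance N_i ~ Poisson(lam), repeated counts
    y_it ~ Binomial(N_i, p); N is marginalized over 0..n_max."""
    lam = np.exp(params[0])
    p = 1.0 / (1.0 + np.exp(-params[1]))
    Ns = np.arange(n_max + 1)
    prior = poisson.pmf(Ns, lam)
    ll = 0.0
    for counts in y:                                   # one site at a time
        lik_given_N = np.prod(binom.pmf(counts[:, None], Ns, p), axis=0)
        ll += np.log(np.sum(prior * lik_given_N) + 1e-300)
    return -ll

rng = np.random.default_rng(4)
N = rng.poisson(20.0, size=40)                         # true abundance per site
y = rng.binomial(N[:, None], 0.4, size=(40, 4))        # 4 repeat counts, p = 0.4
fit = minimize(nmix_negloglik, x0=[np.log(10.0), 0.0], args=(y,), method="Nelder-Mead")
lam_hat = np.exp(fit.x[0])
p_hat = 1.0 / (1.0 + np.exp(-fit.x[1]))
print(round(lam_hat, 1), round(p_hat, 2))
```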

  2. Analysis of Influence of Foaming Mixture Components on Structure and Properties of Foam Glass

    Science.gov (United States)

    Karandashova, N. S.; Goltsman, B. M.; Yatsenko, E. A.

    2017-11-01

    It is recommended to use high-quality thermal insulation materials to increase the energy efficiency of buildings. One of the best thermal insulation materials is foam glass: a durable, porous material that is resistant to the effects of almost any substance. Glass foaming is a complex process that depends on the foaming mode and the composition of the initial mixture. This paper discusses the influence of all components of the mixture (glass powder, foaming agent, enveloping material and water) on the structure of the foam glass. It was determined that the glass powder is the basis of the future material. The foaming agent forms a gas phase in the process of thermal decomposition; this gas foams the viscous glass mass, while the unreacted residue changes the colour of the material. The enveloping agent slows the decomposition of the foaming agent, preventing its premature burning out, and in addition helps to accelerate the sintering of the glass particles. The introduction of water reduces the viscosity of the foaming mixture, making it evenly distributed, and also promotes the formation of water gas that additionally foams the glass mass. The optimal composition for producing foam glass with a density of 150 kg/m³ is defined according to the results of the research.

  3. Nonstationary Frequency Analysis of Extreme Floods Using the Time-varying Two-component Mixture Distributions

    Science.gov (United States)

    Yan, L.; Xiong, L.; Liu, D.; Hu, T.; Xu, C. Y.

    2016-12-01

    The basic IID assumption of traditional flood frequency analysis has been challenged by nonstationarity. The most popular practice for analyzing the nonstationarity of flood series is to use a fixed single-type probability distribution incorporating time-varying moments. However, the type of probability distribution can be both complex, because of distinct flood populations, and time-varying under changing environments. To allow investigation of this complex nature, the time-varying two-component mixture distributions (TTMD) method is proposed in this study, which considers time variation not only in the moments of the component distributions but also in the weighting coefficients. Having identified the existence of mixed flood populations on the basis of circular statistics, the proposed TTMD was applied to model the annual maximum flood series (AMFS) of two stations in the Weihe River basin (WRB), with the model parameters calibrated by the meta-heuristic maximum likelihood (MHML) method. The performance of TTMD was evaluated with different diagnostic plots and indexes and compared with stationary single-type distributions, stationary mixture distributions and time-varying single-type distributions. The results highlighted the advantages of using TTMD models and physically based covariates in nonstationary flood frequency analysis. Moreover, the optimal TTMD models were found to be capable of settling the issue of nonstationarity and capturing the mixed flood populations satisfactorily. It is concluded that the TTMD model is a good alternative for nonstationary frequency analysis and can be applied to other regions with mixed flood populations.

  4. Effect of Substrate Wetting on the Morphology and Dynamics of Phase Separating Multi-Component Mixture

    Science.gov (United States)

    Goyal, Abheeti; Toschi, Federico; van der Schoot, Paul

    2017-11-01

    We study the morphological evolution and dynamics of phase separation of a multi-component mixture in a thin film constrained by a substrate. Specifically, we have explored the surface-directed spinodal decomposition of a multicomponent mixture numerically by free-energy lattice Boltzmann (LB) simulations. The distinguishing feature of this model over the Shan-Chen (SC) model is that we have explicit and independent control over the free-energy functional and the equation of state (EoS) of the system. This vastly expands the range of physical systems that can be realistically simulated by LB simulations. We investigate the effect of composition, film thickness and substrate wetting on the phase morphology and the mechanism of growth in the vicinity of the substrate. The phase morphology and the averaged domain size in the vicinity of the substrate fluctuate greatly, due to the wetting of the substrate, in both the parallel and perpendicular directions. Additionally, we describe how the model presented here can be extended to include an arbitrary number of fluid components.

  5. Highly Selective Upgrading of Biomass-Derived Alcohol Mixtures for Jet/Diesel-Fuel Components.

    Science.gov (United States)

    Liu, Qiang; Xu, Guoqiang; Wang, Xicheng; Liu, Xiaoran; Mu, Xindong

    2016-12-20

    In light of the increasing concern about the energy and environmental problems caused by the combustion of petroleum-based fuels (e.g., jet and diesel fuels), the development of new procedures for their sustainable production from renewable biomass-derived platform compounds has recently attracted tremendous attention. Long-chain ketones/alcohols are promising fuel components owing to fuel properties that closely resemble those of traditional fuels. The focus of this report is the production of long-chain ketones/alcohols by direct upgrading of biomass-derived short-chain alcohol mixtures (e.g., isopropanol-butanol-ethanol mixtures) in pure water. An efficient Pd catalyst system was developed for these highly selective transformations. Long-chain ketones/alcohols (C8-C19), which can be used as precursors for renewable jet/diesel fuel, were obtained in good-to-high selectivity (>90%) using the developed Pd catalyst. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. CO2 Removal from Multi-component Gas Mixtures Utilizing Spiral-Wound Asymmetric Membranes

    International Nuclear Information System (INIS)

    Said, W.B.; Fahmy, M.F.M.; Gad, F.K.; EI-Aleem, G.A.

    2004-01-01

    A systematic procedure and a computer program have been developed for simulating the performance of a spiral-wound gas permeator for CO2 removal from natural gas and other hydrocarbon streams. The simulation program is based on the approximate multi-component model derived by Qi and Henson (1), in addition to the membrane parameters (permeability and selectivity) obtained from the binary simulation program (2). Applying the multi-component program to the same data used by Qi and Henson to evaluate the deviation of the approximate model from the basic transport model gives results that are more accurate than those of the approximate model and very close to those of the basic transport model, while requiring significantly less than 1% of the computation time. The program was successfully applied to data from the membrane unit of the Salam gas plant at Khalda Petroleum Company, Egypt, for the separation of CO2 from hydrocarbons in an eight-component mixture, to estimate the stage cut and the residue and permeate compositions, and gave results that matched the actual gas chromatography analysis measured by the laboratory

  7. Human toxicology of chemical mixtures toxic consequences beyond the impact of one-component product and environmental exposures

    CERN Document Server

    Zeliger, Harold I

    2011-01-01

    In this important reference work, Zeliger catalogs the known effects of chemical mixtures on the human body and also proposes a framework for understanding and predicting their actions in terms of lipophile (fat soluble)/hydrophile (water soluble) interactions. The author's focus is on illnesses that ensue following exposures to mixtures of chemicals that cannot be attributed to any one component of the mixture. In the first part the mechanisms of chemical absorption at a molecular and macromolecular level are explained, as well as the body's methods of defending itself against xenobiotic intrusion. Part II examines the sources of the chemicals discussed, looking at air and water pollution, food additives, pharmaceuticals, etc. Part III, which includes numerous case studies, examines specific effects of particular mixtures on particular body systems and organs and presents a theoretical framework for predicting what the effects of uncharacterized mixtures might be. Part IV covers regulatory requirements and t...

  8. Phase equilibria for mixtures containing very many components. development and application of continuous thermodynamics for chemical process design

    International Nuclear Information System (INIS)

    Cotterman, R.L.; Bender, R.; Prausnitz, J.M.

    1984-01-01

    For some multicomponent mixtures, where detailed chemical analysis is not feasible, the composition of the mixture may be described by a continuous distribution function of some convenient macroscopic property such as normal boiling point or molecular weight. To attain a quantitative description of phase equilibria for such mixtures, this work has developed thermodynamic procedures for continuous systems; this procedure is called continuous thermodynamics. To illustrate, continuous thermodynamics is used to calculate dew points for natural-gas mixtures, solvent loss in a high-pressure absorber, and liquid-liquid phase equilibria in a polymer fractionation process. Continuous thermodynamics provides a rational method for calculating phase equilibria for those mixtures where complete chemical analysis is not available but where the composition can be given by some statistical description. While continuous thermodynamics is only the logical limit of the well-known pseudo-component method, it is more efficient than that method because it is less arbitrary and often requires less computer time

  9. Flow boiling heat transfer coefficients at cryogenic temperatures for multi-component refrigerant mixtures of nitrogen-hydrocarbons

    Science.gov (United States)

    Ardhapurkar, P. M.; Sridharan, Arunkumar; Atrey, M. D.

    2014-01-01

    The recuperative heat exchanger governs the overall performance of the mixed-refrigerant Joule-Thomson cryocooler. In these heat exchangers, a non-azeotropic refrigerant mixture of nitrogen and hydrocarbons undergoes boiling and condensation simultaneously at cryogenic temperatures. Hence, the design of such heat exchangers is crucial. However, due to the lack of empirical correlations for predicting the two-phase heat transfer coefficients of multi-component mixtures at low temperatures, the design of these heat exchangers is difficult.

  10. Extraction of lipid components from hibiscus seeds by supercritical carbon dioxide and ethanol mixtures

    Energy Technology Data Exchange (ETDEWEB)

    Holser, Ronald A.; King, J. W. (Jerry W.); Bost, G.

    2002-01-01

    The genus Hibiscus exhibits great diversity in the production of natural materials with edible and industrial applications. The seeds of twelve varieties of Hibiscus were investigated as a source of triglycerides and phospholipids that could be used in functional foods. Lipid components were extracted from seed samples ground to a nominal particle diameter of 0.1 mm. Extractions were performed with an ISCO model 3560 supercritical fluid extractor using carbon dioxide and a mixture of carbon dioxide modified with ethanol. The neutral lipids were extracted with carbon dioxide at 80 °C and 5370 MPa for 45 min. Polar lipids were subsequently extracted with a mixture of carbon dioxide and 15% ethanol at the same temperature and pressure. High-performance liquid chromatography (HPLC) was used to analyze the extracts for the major neutral and polar lipid classes, using a silica column with a hexane/isopropanol/water solvent gradient and ultraviolet (UV) and evaporative light scattering (ELSD) detectors. An aliquot of each triglyceride fraction was trans-methylated with sodium methoxide and analyzed by gas chromatography to obtain the corresponding fatty acid methyl esters. The total lipids extracted ranged from 8.5% for a variety indigenous to Madagascar (H. calyphyllus) to 20% for a hybrid species (Georgia Rose); the average oil yield was 11.4% for the other varieties tested. The fatty acid methyl ester analysis showed a high degree of unsaturation for all varieties tested, e.g., 75-83%. Oleic, linoleic, and linolenic acids were the predominant unsaturated fatty acids, with only minor amounts of C14, C18, and C20 saturated fatty acids measured. Palmitic acid was identified as the predominant saturated fatty acid. The distribution of the major phospholipids, i.e., phosphatidylethanolamine, phosphatidic acid, phosphatidylserine, phosphatidylcholine, and lysophosphatidylcholine, was found to vary significantly among the hibiscus species examined

  11. Damage Detection of Refractory Based on Principal Component Analysis and Gaussian Mixture Model

    Directory of Open Access Journals (Sweden)

    Changming Liu

    2018-01-01

    Full Text Available The acoustic emission (AE) technique is a common approach to identifying damage in refractories; however, the analysis is complex since as many as fifteen parameters are involved, which calls for effective data processing and classification algorithms to reduce the level of complexity. In this paper, experiments involving three-point bending tests of refractories were conducted and AE signals were collected. A new data processing method was developed that merges parameters playing similar roles in describing the damage and reduces the dimensionality. By means of principal component analysis (PCA) for dimension reduction, the fifteen related parameters can be reduced to two parameters, which are linear combinations of the fifteen original parameters and are taken as the indexes for damage classification. Based on the proposed approach, a Gaussian mixture model was integrated with the Bayesian information criterion to group the AE signals into two damage categories, which accounted for 99% of all damage. Electron microscope scanning of the refractories verified the two types of damage.
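
    The PCA dimension-reduction step described above can be sketched as follows. The synthetic 15-parameter AE data and the simple split along the first principal component are illustrative assumptions; the paper additionally fits a Gaussian mixture model selected by the Bayesian information criterion in the reduced space.

```python
import numpy as np

def pca_reduce(X, k=2):
    """Center the data and project onto the first k principal components
    via SVD; also return the fraction of variance each PC explains."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = S ** 2 / np.sum(S ** 2)
    return Xc @ Vt[:k].T, explained[:k]

rng = np.random.default_rng(5)
# Synthetic AE hits: two damage classes whose 15 correlated parameters
# differ mainly along one latent direction (hypothetical data).
direction = rng.normal(size=15)
direction /= np.linalg.norm(direction)
class_a = 2.0 * direction + rng.normal(scale=0.3, size=(200, 15))
class_b = -2.0 * direction + rng.normal(scale=0.3, size=(200, 15))
X = np.vstack([class_a, class_b])
scores, explained = pca_reduce(X, k=2)
labels = (scores[:, 0] > 0).astype(int)     # crude split along the first PC
print(np.round(explained[0], 2))
```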

  12. Investigations of an excimer laser working with a four-component gaseous mixture He-Kr:Xe-HCl

    Science.gov (United States)

    Iwanejko, Leszek; Pokora, Ludwik J.

    1991-08-01

    The paper presents the working conditions of an XeCl excimer laser with an untypical gas mixture based on Kr:Xe instead of pure Xe. This choice was motivated by the need to replace imported and expensive Xe with gaseous components accessible in Poland. The aim of the investigations reported here was to determine the range of the laser's external parameters over which it works properly with the new gas mixture. The laser pulse output energy and the pulse duration as a function of supply voltage and mixture composition are presented, and the range of proper working conditions for the laser with the new He-Kr:Xe-HCl mixture was determined. Analysis of the experimental results showed that the new mixture yields output energy and pulse duration comparable with those obtained for the He-Xe-HCl mixture. Spectral investigations showed no influence of the Kr present in the mixture on the generation spectrum of the laser.

  13. METHODS OF ANALYSIS AND CLASSIFICATION OF THE COMPONENTS OF GRAIN MIXTURES BASED ON MEASURING THE REFLECTION AND TRANSMISSION SPECTRA

    Directory of Open Access Journals (Sweden)

    Artem O. Donskikh

    2017-10-01

    Full Text Available The paper considers methods of classification of grain mixture components based on spectral analysis in visible and near-infrared wavelength ranges using various measurement approaches - reflection, transmission and combined spectrum methods. It also describes the experimental measuring units used and suggests the prototype of a multispectral grain mixture analyzer. The results of the spectral measurement were processed using neural network based classification algorithms. The probabilities of incorrect recognition for various numbers of spectral parts and combinations of spectral methods were estimated. The paper demonstrates that combined usage of two spectral analysis methods leads to higher classification accuracy and allows for reducing the number of the analyzed spectral parts. A detailed description of the proposed measurement device for high-performance real-time multispectral analysis of the components of grain mixtures is given.
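    The paper's claim that combining two spectral methods raises classification accuracy can be illustrated with a toy nearest-centroid sketch on synthetic "reflection" and "transmission" features. This is not the authors' neural-network classifier or their measured spectra; every number below is invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# synthetic spectra for two grain classes: 5 reflection + 5 transmission bands
def make_class(refl_level, trans_level, n):
    refl = rng.normal(refl_level, 1.0, (n, 5))
    trans = rng.normal(trans_level, 1.0, (n, 5))
    return np.hstack([refl, trans])

train_a, train_b = make_class(0.0, 0.0, 200), make_class(0.8, 0.8, 200)
test_a, test_b = make_class(0.0, 0.0, 200), make_class(0.8, 0.8, 200)

def nearest_centroid_accuracy(cols):
    # classify test spectra by distance to each class centroid,
    # using only the selected feature columns
    ca, cb = train_a[:, cols].mean(0), train_b[:, cols].mean(0)
    def acc(X, want_a):
        da = np.linalg.norm(X[:, cols] - ca, axis=1)
        db = np.linalg.norm(X[:, cols] - cb, axis=1)
        correct = (da < db) if want_a else (db < da)
        return correct.mean()
    return (acc(test_a, True) + acc(test_b, False)) / 2

refl_only = nearest_centroid_accuracy(slice(0, 5))   # reflection bands only
combined = nearest_centroid_accuracy(slice(0, 10))   # reflection + transmission
print(round(refl_only, 2), round(combined, 2))
```

Because the combined feature vector carries more class-separating information, the combined accuracy exceeds the single-method accuracy, which is the qualitative effect the abstract reports.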

  14. Introducing Students to Gas Chromatography-Mass Spectrometry Analysis and Determination of Kerosene Components in a Complex Mixture

    Science.gov (United States)

    Pacot, Giselle Mae M.; Lee, Lyn May; Chin, Sung-Tong; Marriott, Philip J.

    2016-01-01

    Gas chromatography-mass spectrometry (GC-MS) and GC-tandem MS (GC-MS/MS) are useful in many separation and characterization procedures. GC-MS is now a common tool in industry and research, and increasingly, GC-MS/MS is applied to the measurement of trace components in complex mixtures. This report describes an upper-level undergraduate experiment…

  15. The Development Of A Theoretical Lean Culture Causal Framework To Support The Effective Implementation Of Lean In Automotive Component Manufacturers

    Directory of Open Access Journals (Sweden)

    Van der Merwe, Karl Robert

    2014-05-01

    Full Text Available Although it is generally accepted that lean manufacturing improves operational performance, many organisations are struggling to adapt to the lean philosophy. The purpose of this study is to contribute to a more effective strategy for implementing the lean manufacturing improvement philosophy. The study sets out both to integrate well-researched findings and theories related to generic organisational culture with more recent research and experience related to lean culture, and to examine the role that culture plays in the effective implementation of lean manufacturing principles and techniques. The ultimate aim of this exercise is to develop a theoretical lean culture causal framework.

  16. Multidimensional profiling of components in complex mixtures of natural products for metabolic analysis, proof of concept: application to Quillaja saponins.

    Science.gov (United States)

    Bankefors, Johan; Nord, Lars I; Kenne, Lennart

    2010-02-01

    A method for separation and detection of major and minor components in complex mixtures has been developed, utilising two-dimensional high-performance liquid chromatography (2D-HPLC) combined with electrospray ionisation ion-trap multiple-stage mass spectrometry (ESI-ITMS(n)). Chromatographic conditions were matched with mass spectrometric detection to maximise the number of components that could be separated. The described procedure has proven useful to discern several hundreds of saponin components when applied to Quillaja saponaria Molina bark extracts. The discrimination of each saponin component relies on the fact that three coordinates (x, y, z) for each component can be derived from the retention time of the two chromatographic steps (x, y) and the m/z-values from the multiple-stage mass spectrometry (z(n), n=1, 2, ...). Thus an improved graphical representation was obtained by combining retention times from the two-stage separation with +MS(1) (z(1)) and the additional structural information from the second mass stage +MS(2) (z(2), z(3)) corresponding to the main fragment ions. By this approach three-dimensional plots can be made that reveal both the chromatographic and structural properties of a specific mixture which can be useful in fingerprinting of complex mixtures. 2009 Elsevier B.V. All rights reserved.

  17. ARTS: A System-Level Framework for Modeling MPSoC Components and Analysis of their Causality

    DEFF Research Database (Denmark)

    Mahadevan, Shankar; Storgaard, Michael; Madsen, Jan

    2005-01-01

    Designing a complex heterogeneous multiprocessor System-on-Chip (MPSoC) requires support for modeling and analysis of the different layers, i.e. application, operating system (OS) and platform architecture. This paper presents an abstract system-level modeling framework, called ARTS, to support the MPSoC designers in modeling the different layers and understanding their causalities. While others have developed tools for static analysis and modeled limited correlations (processor-memory or processor-communication), our model captures the impact of dynamic and unpredictable OS behaviour on processor, memory and communication performance. In particular, we focus on analyzing the impact of application mapping on processor and memory utilization, taking the on-chip communication latency into account. A case study of a real-time multimedia application consisting of 114 tasks on a 6-processor…

  18. Kinetic behavior of Fe(o,o-EDDHA)-humic substance mixtures in several soil components and in calcareous soils.

    Science.gov (United States)

    Cerdán, Mar; Alcañiz, Sara; Juárez, Margarita; Jordá, Juana D; Bermúdez, Dolores

    2007-10-31

    Ferric ethylenediamine-N,N'-bis(o-hydroxyphenylacetic) acid chelate (Fe(o,o-EDDHA)) is one of the most effective Fe fertilizers in calcareous soils. However, humic substances are occasionally combined with iron chelates in drip irrigation systems in order to lower costs. The reactivity of iron chelate-humic substance mixtures in several soil components and in calcareous soils was investigated through interaction tests, and their behavior was compared to the application of iron chelates and humic substances separately. Two commercial humic substances and two Fe(o,o-EDDHA) chelates (one synthesized in the laboratory and one commercial) were used to prepare iron chelate-humic substance mixtures at 50% (w/w). Various soil components (calcium carbonate, gibbsite, amorphous iron oxide, hematite, tenorite, zincite, amorphous Mn oxide, and peat) and three calcareous soils were shaken for 15 days with the mixtures and with iron chelate and humic substance solutions. The kinetic behavior of Fe(o,o-EDDHA) and Fe non-(o,o-EDDHA) (Fe bonded to (o,p-EDDHA) and other polycondensated ligands), and of the different nutrients solubilized after the interaction assay, was determined. The results showed that, compared to the iron chelate solutions, the mixtures did not significantly reduce the retention of Fe(o,o-EDDHA) and Fe non-(o,o-EDDHA) in the soil components and the calcareous soils, but they did produce changes in the retention rate. Moreover, the competition between humic substances and synthetic chelating agents for complexing metal cations limited the effectiveness of the mixtures in mobilizing nutrients from the substrates. The presence of Fe(o,p-EDDHA) and other byproducts in the commercial iron chelate had an important effect on the evolution of Fe(o,o-EDDHA) and on the nutrient solubilization process.

  19. Method to assess component contribution to toxicity of complex mixtures: Assessment of puberty acquisition in rats exposed to disinfection byproducts.

    Science.gov (United States)

    Parvez, Shahid; Rice, Glenn E; Teuschler, Linda K; Simmons, Jane Ellen; Speth, Thomas F; Richardson, Susan D; Miltner, Richard J; Hunter, E Sidney; Pressman, Jonathan G; Strader, Lillian F; Klinefelter, Gary R; Goldman, Jerome M; Narotsky, Michael G

    2017-08-01

    A method based on regression modeling was developed to discern the contribution of component chemicals to the toxicity of highly complex, environmentally realistic mixtures of disinfection byproducts (DBPs). Chemical disinfection of drinking water forms DBP mixtures. Because of concerns about possible reproductive and developmental toxicity, a whole mixture (WM) of DBPs produced by chlorination of a water concentrate was administered as drinking water to Sprague-Dawley (S-D) rats in a multigenerational study. Age of puberty acquisition, i.e., preputial separation (PPS) and vaginal opening (VO), was examined in male and female offspring, respectively. When compared to controls, a slight, but statistically significant delay in puberty acquisition was observed in females but not in males. WM-induced differences in the age at puberty acquisition were compared to those reported in S-D rats administered either a defined mixture (DM) of nine regulated DBPs or individual DBPs. Regression models were developed using individual animal data on age at PPS or VO from the DM study. Puberty acquisition data reported in the WM and individual DBP studies were then compared with the DM models. The delay in puberty acquisition observed in the WM-treated female rats could not be distinguished from delays predicted by the DM regression model, suggesting that the nine regulated DBPs in the DM might account for much of the delay observed in the WM. This method is applicable to mixtures of other types of chemicals and other endpoints. Copyright © 2017. Published by Elsevier B.V.
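    The comparison logic described above (fit a dose-response regression to defined-mixture data, then ask whether a whole-mixture observation falls within the model's prediction band) can be sketched in a few lines. All numbers below are invented for illustration and do not come from the study:

```python
import numpy as np

# hypothetical defined-mixture (DM) study: dose (arbitrary units) vs
# age at vaginal opening (days)
dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
age = np.array([32.1, 32.4, 32.8, 33.5, 35.0])

# linear dose-response regression fitted to the DM data
slope, intercept = np.polyfit(dose, age, 1)
resid_sd = np.std(age - (slope * dose + intercept), ddof=2)

# compare a whole-mixture (WM) observation against the DM prediction:
# if it falls within ~2 residual SDs, the DM components may account for it
wm_dose, wm_age = 2.0, 33.5
predicted = slope * wm_dose + intercept
consistent = abs(wm_age - predicted) < 2 * resid_sd
print(round(predicted, 2), consistent)
```

This mirrors the paper's conclusion in miniature: when the WM delay cannot be distinguished from the DM model's prediction, the DM chemicals may account for much of the observed effect.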

  20. The graphic representations for the one-dimensional solutions of problem from elastic mechanic deformations of two-component mixture

    Directory of Open Access Journals (Sweden)

    Ghenadie Bulgac

    2006-12-01

    Full Text Available In this paper we find the analytical solution of a simple one-dimensional unsteady elastic problem for a two-component mixture using the Laplace integral transformation. The integral transformation simplifies the initial system of motion equations so that analytical solutions can be found. The analytical solutions are represented graphically, both as functions of time at a fixed point of the medium and as functions of the horizontal coordinate at a fixed time.

  1. A Comprehensive Mixture of Tobacco Smoke Components Retards Orthodontic Tooth Movement via the Inhibition of Osteoclastogenesis in a Rat Model

    Directory of Open Access Journals (Sweden)

    Maya Nagaie

    2014-10-01

    Full Text Available Tobacco smoke is a complex mixture of numerous components. Nevertheless, most experiments have examined the effects of individual chemicals in tobacco smoke. The comprehensive effects of components on tooth movement and bone resorption remain unexplored. Here, we have shown that a comprehensive mixture of tobacco smoke components (TSCs) attenuated bone resorption through osteoclastogenesis inhibition, thereby retarding experimental tooth movement in a rat model. An elastic power chain (PC) inserted between the first and second maxillary molars robustly yielded experimental tooth movement within 10 days. TSC administration effectively retarded tooth movement since day 4. Histological evaluation disclosed that tooth movement induced bone resorption at two sites: in the bone marrow and the peripheral bone near the root. TSC administration significantly reduced the number of tartrate-resistant acid phosphatase (TRAP)-positive osteoclastic cells in the bone marrow cavity of the PC-treated dentition. An in vitro study indicated that the inhibitory effects of TSCs on osteoclastogenesis seemed directed more toward preosteoclasts than osteoblasts. These results indicate that the comprehensive mixture of TSCs might be a useful tool for detailed verification of the adverse effects of tobacco smoke, possibly contributing to the development of reliable treatments in various fields associated with bone resorption.

  2. A comprehensive mixture of tobacco smoke components retards orthodontic tooth movement via the inhibition of osteoclastogenesis in a rat model.

    Science.gov (United States)

    Nagaie, Maya; Nishiura, Aki; Honda, Yoshitomo; Fujiwara, Shin-Ichi; Matsumoto, Naoyuki

    2014-10-15

    Tobacco smoke is a complex mixture of numerous components. Nevertheless, most experiments have examined the effects of individual chemicals in tobacco smoke. The comprehensive effects of components on tooth movement and bone resorption remain unexplored. Here, we have shown that a comprehensive mixture of tobacco smoke components (TSCs) attenuated bone resorption through osteoclastogenesis inhibition, thereby retarding experimental tooth movement in a rat model. An elastic power chain (PC) inserted between the first and second maxillary molars robustly yielded experimental tooth movement within 10 days. TSC administration effectively retarded tooth movement since day 4. Histological evaluation disclosed that tooth movement induced bone resorption at two sites: in the bone marrow and the peripheral bone near the root. TSC administration significantly reduced the number of tartrate-resistant acid phosphatase (TRAP)-positive osteoclastic cells in the bone marrow cavity of the PC-treated dentition. An in vitro study indicated that the inhibitory effects of TSCs on osteoclastogenesis seemed directed more toward preosteoclasts than osteoblasts. These results indicate that the comprehensive mixture of TSCs might be a useful tool for detailed verification of the adverse effects of tobacco smoke, possibly contributing to the development of reliable treatments in various fields associated with bone resorption.

  3. Rapid identification of heterogeneous mixture components with hyperspectral coherent anti-Stokes Raman scattering imaging

    NARCIS (Netherlands)

    Garbacik, E.T.; Herek, Jennifer Lynn; Otto, Cornelis; Offerhaus, Herman L.

    2012-01-01

    For the rapid analysis of complicated heterogeneous mixtures, we have developed a method to acquire and intuitively display hyperspectral coherent anti-Stokes Raman scattering (CARS) images. The imaging is performed with a conventional optical setup based around an optical parametric oscillator.

  4. Mixture Experiments

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F.

    2007-12-01

    A mixture experiment involves combining two or more components in various proportions or amounts and then measuring one or more responses for the resulting end products. Other factors that affect the response(s), such as process variables and/or the total amount of the mixture, may also be studied in the experiment. A mixture experiment design specifies the combinations of mixture components and other experimental factors (if any) to be studied and the response variable(s) to be measured. Mixture experiment data analyses are then used to achieve the desired goals, which may include (i) understanding the effects of components and other factors on the response(s), (ii) identifying components and other factors with significant and nonsignificant effects on the response(s), (iii) developing models for predicting the response(s) as functions of the mixture components and any other factors, and (iv) developing end-products with desired values and uncertainties of the response(s). Given a mixture experiment problem, a practitioner must consider the possible approaches for designing the experiment and analyzing the data, and then select the approach best suited to the problem. Eight possible approaches include 1) component proportions, 2) mathematically independent variables, 3) slack variable, 4) mixture amount, 5) component amounts, 6) mixture process variable, 7) mixture of mixtures, and 8) multi-factor mixture. The article provides an overview of the mixture experiment designs, models, and data analyses for these approaches.
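    The defining constraint of the component-proportions approach (proportions are non-negative and sum to 1) leads to simplex-based designs. A small sketch generating the classic {q, m} simplex-lattice design, a standard construction in this field:

```python
from itertools import product

def simplex_lattice(q, m):
    """All q-component blends whose proportions are multiples of 1/m
    and sum to 1 -- the {q, m} simplex-lattice design."""
    return [tuple(c / m for c in combo)
            for combo in product(range(m + 1), repeat=q)
            if sum(combo) == m]

design = simplex_lattice(3, 2)   # {3, 2} lattice for a 3-component mixture
for point in design:
    print(point)
```

The {3, 2} lattice yields six blends: the three pure components and the three 50/50 binary blends, each a feasible mixture because its proportions sum to 1.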

  5. Solid-Liquid Equilibria for Many-component Mixtures Using Cubic-Plus-Association (CPA) equation of state

    DEFF Research Database (Denmark)

    Fettouhi, André; Thomsen, Kaj

    2010-01-01

    In the creation of liquefied natural gas, the formation of solids plays a substantial role; hence detailed knowledge is needed about solid-liquid equilibria (SLE). In this abstract we briefly summarize the work we have carried out at CERE over the past year on SLE for many-component mixtures using the Cubic-Plus-Association (CPA) equation of state. The components used in this work are highly relevant to the oil and gas industry and include light and heavy hydrocarbons, alcohols, water and carbon dioxide.

  6. Neural Correlates of Causal Power Judgments

    Directory of Open Access Journals (Sweden)

    Denise Dellarosa Cummins

    2014-12-01

    Full Text Available Causal inference is a fundamental component of cognition and perception. Probabilistic theories of causal judgment (most notably causal Bayes networks) derive causal judgments using metrics that integrate contingency information. But human estimates typically diverge from these normative predictions, because human causal power judgments are strongly influenced by beliefs concerning underlying causal mechanisms, and because of the way knowledge is retrieved from human memory during the judgment process. Neuroimaging studies indicate that the brain distinguishes causal events from mere covariation, and between perceived and inferred causality. Areas involved in error prediction are also activated, implying automatic activation of possible exception cases during causal decision-making.
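    One contingency-integrating metric of the kind alluded to above is causal power, which rescales the raw contingency ΔP by the headroom the candidate cause has to produce the effect. The specific formula below is Cheng's power PC model, an assumption on my part rather than something stated in this abstract:

```python
def causal_power(p_e_given_c, p_e_given_not_c):
    """Generative causal power: delta-P divided by the probability that
    the effect is absent without the cause (Cheng's power PC model)."""
    delta_p = p_e_given_c - p_e_given_not_c
    return delta_p / (1.0 - p_e_given_not_c)

# identical delta-P (0.4) at different base rates yields different powers,
# one way normative metrics and raw contingency can diverge
print(causal_power(0.9, 0.5))
print(causal_power(0.5, 0.1))
```

Both cases have ΔP = 0.4, but the powers differ (0.8 versus roughly 0.44), illustrating why a metric that integrates contingency information need not track ΔP alone.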

  7. Volatility of components of saturated vapours of UCl4-CsCl and UCl4-LiCl molten mixtures

    International Nuclear Information System (INIS)

    Smirnov, M.V.; Kudyakov, V.Ya.; Salyulev, A.B.; Komarov, V.E.; Posokhin, Yu.V.; Afonichkin, V.K.

    1979-01-01

    The flow method has been used to measure the volatility of the components of molten UCl4-CsCl and UCl4-LiCl mixtures containing 2.0, 5.0, 12.0, 25.0, 33.0, 50.0, 67.0, and 83.0 mol.% UCl4 within the temperature ranges 903-1188 K and 740-1200 K, respectively. The chemical composition of the saturated vapours above the molten salts has been determined. The molten mixtures in question exhibit negative deviation from ideal behaviour. It was concluded that the vapour phase contains, along with monomeric UCl4, LiCl and CsCl and the dimers Li2Cl2 and Cs2Cl2, double compounds with the most probable composition MeUCl5. Their absolute contribution to the total pressure above the molten UCl4-CsCl mixtures is considerably smaller than above the UCl4-LiCl mixtures.

  8. Causal universe

    CERN Document Server

    Ellis, George FR; Pabjan, Tadeusz

    2013-01-01

    Written by philosophers, cosmologists, and physicists, this collection of essays deals with causality, which is a core issue for both science and philosophy. Readers will learn about different types of causality in complex systems and about new perspectives on this issue based on physical and cosmological considerations. In addition, the book includes essays pertaining to the problem of causality in ancient Greek philosophy, and to the problem of God's relation to the causal structures of nature viewed in the light of contemporary physics and cosmology.

  9. Isolation of EPR spectra and estimation of spin-states in two-component mixtures of paramagnets.

    Science.gov (United States)

    Chabbra, Sonia; Smith, David M; Bode, Bela E

    2018-04-26

    The presence of multiple paramagnetic species can lead to overlapping electron paramagnetic resonance (EPR) signals. This complication can be a critical obstacle for the use of EPR to unravel mechanisms and aid the understanding of earth-abundant metal catalysis. Furthermore, redox or spin-crossover processes can result in the simultaneous presence of metal centres in different oxidation or spin states. In this contribution, pulse EPR experiments on model systems containing discrete mixtures of Cr(I) and Cr(III) or Cu(II) and Mn(II) complexes demonstrate the feasibility of the separation of the EPR spectra of these species by inversion recovery filters and the identification of the relevant spin states by transient nutation experiments. We demonstrate the isolation of component spectra and identification of spin states in a mixture of catalyst precursors. The usefulness of the approach is emphasised by monitoring the fate of the chromium species upon activation of an industrially used precatalyst system.

  10. Approximation of Unknown Multivariate Probability Distributions by Using Mixtures of Product Components: A Tutorial

    Czech Academy of Sciences Publication Activity Database

    Grim, Jiří

    2017-01-01

    Roč. 31, č. 9 (2017), č. článku 1750028. ISSN 0218-0014 R&D Projects: GA ČR GA17-18407S Institutional support: RVO:67985556 Keywords : multivariate statistics * product mixtures * naive Bayes models * EM algorithm * pattern recognition * neural networks * expert systems * image analysis Subject RIV: IN - Informatics, Computer Science OBOR OECD: Computer sciences, information science, bioinformathics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 0.994, year: 2016 http://library.utia.cas.cz/separaty/2017/RO/grim-0475182.pdf
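    A mixture of product components of the kind the tutorial covers (here, Bernoulli naive-Bayes components) can be estimated with a few lines of EM. A self-contained sketch on synthetic binary data, not taken from the tutorial itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic binary data drawn from two product (Bernoulli) components
true_p = np.array([[0.9, 0.9, 0.1, 0.1],
                   [0.1, 0.1, 0.9, 0.9]])
z = rng.integers(0, 2, 400)                       # latent component labels
X = (rng.random((400, 4)) < true_p[z]).astype(float)

# EM for a 2-component mixture of product distributions
K, (N, D) = 2, X.shape
w = np.full(K, 0.5)                               # mixture weights
p = rng.uniform(0.3, 0.7, (K, D))                 # Bernoulli parameters
for _ in range(50):
    # E-step: responsibilities under the product-component likelihood
    like = np.prod(np.where(X[:, None, :] == 1.0, p, 1.0 - p), axis=2) * w
    r = like / like.sum(axis=1, keepdims=True)
    # M-step: weighted re-estimation of weights and Bernoulli parameters
    w = r.mean(axis=0)
    p = (r.T @ X) / r.sum(axis=0)[:, None]

print(np.round(p, 1))   # recovered parameters, up to label swap
```

The product form makes the per-component likelihood a simple product over features, which is exactly what keeps these mixtures tractable in high dimensions.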

  11. Causal mapping

    DEFF Research Database (Denmark)

    Rasmussen, Lauge Baungaard

    2006-01-01

    The lecture note explains how to use the causal mapping method as well as the theoretical framework associated with the method.

  12. Ground tire rubber (GTR) as a component material in concrete mixtures for paving concrete.

    Science.gov (United States)

    2015-02-01

    This research was done to investigate whether the problems associated with flexibility and temperature sensitivity (expansion and contraction) in roadway concrete pavements can be addressed by replacing some of the fine or coarse aggregate component w...

  13. [Biodegradability of the components of natural hydrocarbon mixtures previously submitted to landfarming].

    Science.gov (United States)

    Pucci, G N; Pucci, O H

    2003-01-01

    The complex composition of crude oil and of the hydrocarbons in the waste from the different stages of the oil industry makes this product a mixture that presents various difficulties for its elimination by biological methods. The objective of this paper was to study the biodegradation potential of autochthonous bacterial communities on hydrocarbons obtained from four polluted places that had been subjected to a landfarming bioremediation system for a decade. The results showed a marked difference in the biodegradability of the three main fractions of crude oil (aliphatic, aromatic, and polar) obtained by column chromatography. All fractions were used as carbon and energy sources. There were variations among the fractions in biomass production as well as in biodegradation kinetics, according to the composition of each fraction.

  14. [Non-alcoholic fatty liver disease, as a component of the metabolic syndrome, and its causal correlations with other extrahepatic diseases].

    Science.gov (United States)

    Halmos, Tamás; Suba, Ilona

    2017-12-01

    Non-alcoholic fatty liver disease is the most common non-infectious chronic liver disease of our age, covering the spectrum of diseases associated with increased fat accumulation in the hepatocytes. Its development is promoted by a sedentary life-style, over-feeding, and certain genetic predispositions. Its prevalence in the adult population, even in Hungary, is ~30%. In some cases the disease may progress into non-alcoholic steatohepatitis, later into fibrosis, and rarely into primary hepatocellular cancer. Fatty liver is closely and bidirectionally related to the metabolic syndrome and type 2 diabetes, and there is now a general consensus that fatty liver is the hepatic manifestation of the metabolic syndrome. The importance of fatty liver has been highly emphasized recently: in addition to its progression into steatohepatitis, its causal relationship with numerous extrahepatic disorders has been discovered. In our overview, we deal with the epidemiology and pathomechanism of the disease, and discuss the possibilities of diagnosis, its relationship with the intestinal microbiota, its recently recognized correlations with bile acids and their receptors, and its supposed correlations with the circadian CLOCK system. We then review those extrahepatic disorders that have been shown to be causally linked to non-alcoholic fatty liver disease, emphasizing the metabolic syndrome/type 2 diabetes, cardiovascular disorders, chronic kidney disease, sleep apnea/hypoventilation syndrome, inflammatory bowel disease, Alzheimer's disease, osteoporosis, and psoriasis. Based on the above, it can be stated that high-risk individuals with non-alcoholic fatty liver disease need systemic care, including the detection of the other components of this systemic pathological condition. While no specific therapy for the disease is yet known, life-style changes and the adequate use of available medicines can prevent disease progression. Promising research…

  15. Direct measurements of mass-specific optical cross sections of single-component aerosol mixtures.

    Science.gov (United States)

    Radney, James G; Ma, Xiaofei; Gillis, Keith A; Zachariah, Michael R; Hodges, Joseph T; Zangmeister, Christopher D

    2013-09-03

    The optical properties of atmospheric aerosols vary widely, being dependent upon particle composition, morphology, and mixing state. This diversity and complexity of aerosols motivates measurement techniques that can discriminate and quantify a variety of single- and multicomponent aerosols that are both internally and externally mixed. Here, we present a new combination of techniques to directly measure the mass-specific extinction and absorption cross sections of laboratory-generated aerosols that are relevant to atmospheric studies. Our approach employs a tandem differential mobility analyzer, an aerosol particle mass analyzer, cavity ring-down and photoacoustic spectrometers, and a condensation particle counter. This suite of instruments enables measurement of aerosol particle size, mass, extinction and absorption coefficients, and aerosol number density, respectively. Taken together, these observables yield the mass-specific extinction and absorption cross sections without the need to model particle morphology or account for sample collection artifacts. Here we demonstrate the technique in a set of case studies which involve complete separation of aerosol by charge, separation of an external mixture by mass, and discrimination between particle types by effective density and single-scattering albedo.
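    The paper's central quantity, a mass-specific cross section, is simply an optical coefficient divided by the aerosol mass concentration inferred from the number density and single-particle mass the instrument suite measures. A unit-handling sketch with invented values (the numbers are illustrative, not from the paper):

```python
def mass_specific_cross_section(coeff_inv_Mm, number_per_cm3, mass_fg):
    """Convert an extinction/absorption coefficient (inverse megameters),
    a number density (cm^-3) and a single-particle mass (femtograms)
    into a mass-specific cross section in m^2/g."""
    mass_conc_g_per_m3 = number_per_cm3 * 1e6 * mass_fg * 1e-15  # g/m^3
    coeff_per_m = coeff_inv_Mm * 1e-6                            # 1/m
    return coeff_per_m / mass_conc_g_per_m3

# illustrative values: 50 Mm^-1, 1000 particles/cm^3, 5 fg per particle
print(mass_specific_cross_section(50.0, 1000.0, 5.0))
```

Because every input is measured directly (CRDS/PAS coefficients, CPC number density, APM particle mass), the ratio needs no assumed particle morphology, which is the point the abstract makes.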

  16. Study of component distribution in pharmaceutical binary powder mixtures by near infrared chemical imaging

    Directory of Open Access Journals (Sweden)

    Manel Bautista

    2012-12-01

    Full Text Available Near infrared chemical imaging (NIR-CI) has recently emerged as an effective technique for extracting spatial information from pharmaceutical products in an expeditious, non-destructive and non-invasive manner. These features have turned it into a useful tool for controlling various steps in drug production processes. Imaging techniques provide a vast amount of both spatial and spectral information that can be acquired in a very short time. Such a huge amount of data requires the use of efficient and fast methods to extract the relevant information, and chemometric methods have proved especially useful for this purpose. In this study, we assessed the usefulness of the correlation coefficient (CC) between the spectra of samples and the pure spectra of the active pharmaceutical ingredient (API) and the excipients to check for correct ingredient distribution in pharmaceutical binary preparations blended in the laboratory. Visual information about pharmaceutical component distribution can be obtained by using the CC. The performance of this model construction methodology for binary samples was compared with various other common multivariate methods, including partial least squares, multivariate curve resolution and classical least squares. Based on the results, correlation coefficients are a powerful tool for the rapid assessment of correct component distribution and for quantitative analysis of pharmaceutical binary formulations. For samples of three or more components, it has been shown that if the objective is only to determine the uniformity of blending, the CC image map is very good for this, and easy and fast to compute.
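    The CC approach described above reduces to computing a Pearson correlation between each pixel spectrum and a pure-component spectrum. A small synthetic sketch (hypothetical spectra and a tiny 4x4 "image", not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical pure-component spectra (API and excipient, 50 wavelengths)
api = np.sin(np.linspace(0, 3, 50))
excip = np.cos(np.linspace(0, 3, 50))

# a 4x4-pixel image as 16 spectra: left half API-rich, right half excipient-rich
frac = np.where(np.arange(16) % 4 < 2, 0.9, 0.1)
cube = frac[:, None] * api + (1 - frac[:, None]) * excip
cube += rng.normal(0, 0.01, cube.shape)           # small measurement noise

def cc_map(cube, pure):
    """Pearson correlation of every pixel spectrum with a pure spectrum."""
    c = cube - cube.mean(axis=1, keepdims=True)
    p = pure - pure.mean()
    return (c @ p) / (np.linalg.norm(c, axis=1) * np.linalg.norm(p))

cc = cc_map(cube, api)
print(cc.reshape(4, 4).round(2))
```

Pixels rich in the API correlate strongly with the pure API spectrum, so the reshaped CC map directly visualizes where that ingredient sits, which is the paper's rapid uniformity check.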

  17. Riemann solvers for multi-component gas mixtures with temperature dependent heat capacities

    International Nuclear Information System (INIS)

    Beccantini, A.

    2001-01-01

    This thesis represents a contribution to the development of upwind splitting schemes for the Euler equations for ideal gaseous mixtures and to their investigation in computing multidimensional flows in irregular geometries. In the preliminary part we develop and investigate the parameterization of the shock and rarefaction curves in the phase space. We then apply them to perform several field-by-field decompositions of the Riemann problem: the entropy-respecting one, the one which supposes that both genuinely non-linear (GNL) waves are shocks (shock-shock) and the one which supposes that both GNL waves are rarefactions (rarefaction-rarefaction). We emphasize that their analysis is fundamental in developing Riemann solvers: the simpler the field-by-field decomposition, the simpler the Riemann solver based on it. As the specific heat capacities of the gases depend on the temperature, the shock-shock field-by-field decomposition is the easiest to perform. In the second part of the thesis, we therefore develop an upwind splitting scheme based on this decomposition. We then investigate its robustness, precision and CPU-time consumption with respect to some of the most popular upwind splitting schemes for polytropic/non-polytropic ideal gases. 1-D test cases show that this scheme is both precise (exact capturing of stationary shocks and stationary contacts) and robust in dealing with strong shock and rarefaction waves. Multidimensional test cases show that it suffers from some of the typical deficiencies which affect upwind splitting schemes capable of exactly capturing stationary contact discontinuities, i.e. the development of non-physical instabilities in computing strong shock waves. In the final part, we use the high-order multidimensional solver developed here to compute fully-developed detonation flows. (author)

  18. Gaussian Mixture Model with Variable Components for Full Waveform LiDAR Data Decomposition and RJMCMC Algorithm

    Directory of Open Access Journals (Sweden)

    ZHAO Quanhua

    2015-12-01

    Full waveform LiDAR data record the signal of the backscattered laser pulse. The elevation and energy information of ground targets can be effectively obtained by decomposition of the full waveform LiDAR data; waveform decomposition is therefore the key to full waveform LiDAR data processing. However, determining the number of components in waveform decomposition is a central and difficult problem. To this end, this paper presents a method which can determine the number automatically. First, a given full waveform LiDAR signal is modeled on the assumption that the energy recorded at elevation points satisfies a Gaussian mixture distribution. A constraint function is defined to steer the model toward fitting the waveform, and a corresponding probability distribution based on this function is constructed in Gibbs form. The Bayesian paradigm is followed to build the waveform decomposition model. A RJMCMC (reversible jump Markov chain Monte Carlo) scheme is then used to simulate the decomposition model, which determines the number of components and decomposes the waveform into a group of Gaussian distributions. In the RJMCMC algorithm, the designed move types include updating the parameter vector, splitting or merging Gaussian components, and the birth or death of a Gaussian component. Results obtained from ICESat-GLAS waveform data of different areas show that the proposed algorithm is efficient and promising.
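
A full RJMCMC sampler is beyond a short example, but the underlying waveform model, a sum of Gaussian components, can be illustrated with a greedy peak-subtraction decomposition: a crude stand-in for the trans-dimensional search, with all parameters and the synthetic waveform purely illustrative:

```python
import math

def gaussian(t, A, mu, sigma):
    """One Gaussian waveform component."""
    return A * math.exp(-0.5 * ((t - mu) / sigma) ** 2)

def decompose(waveform, dt=1.0, max_components=5, tol=0.2):
    """Greedy stand-in for the paper's RJMCMC model search: repeatedly
    take the strongest remaining peak, estimate a Gaussian from its
    half-maximum width (FWHM = 2.355 sigma), and subtract it."""
    resid = list(waveform)
    comps = []
    peak0 = max(waveform)
    for _ in range(max_components):
        A = max(resid)
        i = resid.index(A)
        if A < tol * peak0:          # nothing significant left
            break
        j = i                         # right half-maximum crossing
        while j + 1 < len(resid) and resid[j + 1] > A / 2:
            j += 1
        k = i                         # left half-maximum crossing
        while k - 1 >= 0 and resid[k - 1] > A / 2:
            k -= 1
        sigma = max(dt, (j - k) * dt / 2.355)
        comps.append((A, i * dt, sigma))
        resid = [r - gaussian(n * dt, A, i * dt, sigma)
                 for n, r in enumerate(resid)]
    return comps

# synthetic two-return waveform (noise-free, arbitrary parameters)
w = [gaussian(t, 1.0, 20.0, 3.0) + gaussian(t, 0.6, 50.0, 4.0)
     for t in range(80)]
comps = decompose(w)
```

On this two-return waveform the sketch recovers two components near elevations 20 and 50; the RJMCMC scheme of the paper instead samples the number of components from its posterior.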

  19. Epidemiological causality.

    Science.gov (United States)

    Morabia, Alfredo

    2005-01-01

    Epidemiological methods, which combine population thinking and group comparisons, can primarily identify causes of disease in populations. There is therefore a tension between our intuitive notion of a cause, which we want to be deterministic and invariant at the individual level, and the epidemiological notion of causes, which are invariant only at the population level. Epidemiologists have heretofore given a pragmatic solution to this tension. Causal inference in epidemiology consists in checking the logical coherence of a causality statement and determining whether what has been found grossly contradicts what we think we already know: how strong is the association? Is there a dose-response relationship? Does the cause precede the effect? Is the effect biologically plausible? This approach to causal inference can be traced back to the philosophers David Hume and John Stuart Mill. On the other hand, the mode of establishing causality devised by Jakob Henle and Robert Koch, which has been fruitful in bacteriology, requires that in every instance the effect invariably follows the cause (e.g., inoculation of the Koch bacillus and tuberculosis). This is incompatible with epidemiological causality, which has to deal with probabilistic effects (e.g., smoking and lung cancer) and is therefore invariant only for the population.

  20. Predicting the vapor–liquid equilibrium of hydrocarbon binary mixtures and polymer solutions using predetermined pure component parameters

    International Nuclear Information System (INIS)

    Ryu, Sang Kyu; Bae, Young Chan

    2012-01-01

    Highlights: ► We have developed a close-packed lattice model for chain-like molecules. ► The chain length dependence determined from Monte-Carlo simulation results was used. ► To consider the volume effect, hole theory and two mixing steps were used. ► A lattice fluid equation of state (LF-EoS) is presented for the VLE of hydrocarbon mixtures. ► Pure polymer solution data are correlated with the LF-EoS. - Abstract: In our previous work, a new close-packed lattice model was developed for multi-component systems of chain fluids, taking into account the chain length dependence obtained from Monte-Carlo (MC) simulation results. In this work, we further extend this model to describe pressure, volume and temperature (PVT) properties, such as vapor–liquid equilibrium (VLE). To consider the effect of pressure on the phase behavior, the volume change effect is taken into account by introducing holes into the incompressible lattice model with two mixing steps. The corresponding new lattice fluid equation of state (LF-EoS) is applied to predict the thermodynamic properties of pure and binary mixtures of hydrocarbons as well as pure polymer solutions. The results of the proposed model are compared to other predictive approaches based on VLE calculations using predetermined pure-component model parameters without further adjustment. Thermodynamic properties predicted using the method developed in this work are consistent with the experimental data.

  1. Trace analysis in complex mixtures using a high-component filtering strategy with liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Zhang, Hui; Wang, Shao-Qing; Liu, Ying; Luo, Li-Ping; Liu, Peng; Qi, Lian-Wen; Li, Ping

    2012-11-01

    Trace constituents are widely present in complex mixtures, and trace analysis is challenging because of the unpredictable matrix. In this work, a high-component filtering strategy was developed for improved analysis of trace constituents in complex samples by liquid chromatography-mass spectrometry (LC-MS). Using a specifically designed chromatographic apparatus, the high-abundance fractions were filtered out prior to LC-MS analysis. The sample's complexity was thereby reduced, and the loading amount for the remaining low-level fractions could be considerably increased. The application of this approach was illustrated with an analytically challenging sample, a traditional Chinese herbal medicine named Compound Danshen Sample. We observed that the loss rate for 12 analytes during the filtering procedure ranged from 6.54 to 26.11% but showed stable repeatability (low RSD values), allowing six low-abundance compounds that cannot be quantified by traditional methods to be tested by the filtering method. Qualitative and quantitative trace analysis can be expected to improve greatly when the sample loading is increased as a result of filtering out the high-level targets. The proposed strategy is promising for monitoring trace constituents in diverse complex mixtures in the analytical fields of pharmaceutics, metabonomics and the environment. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Predicting the vapor-liquid equilibrium of hydrocarbon binary mixtures and polymer solutions using predetermined pure component parameters

    Science.gov (United States)

    Ryu, Sang Kyu; Bae, Young Chan

    2012-05-01

    In our previous work, a new close-packed lattice model was developed for multi-component systems of chain fluids, taking into account the chain length dependence obtained from Monte-Carlo (MC) simulation results. In this work, we further extend this model to describe pressure, volume and temperature (PVT) properties, such as vapor-liquid equilibrium (VLE). To consider the effect of pressure on the phase behavior, the volume change effect is taken into account by introducing holes into the incompressible lattice model with two mixing steps. The corresponding new lattice fluid equation of state (LF-EoS) is applied to predict the thermodynamic properties of pure and binary mixtures of hydrocarbons as well as pure polymer solutions. The results of the proposed model are compared to other predictive approaches based on VLE calculations using predetermined pure-component model parameters without further adjustment. Thermodynamic properties predicted using the method developed in this work are consistent with the experimental data.

  3. Retrieving simulated volcanic, desert dust and sea-salt particle properties from two/three-component particle mixtures using UV-VIS polarization lidar and T matrix

    Directory of Open Access Journals (Sweden)

    G. David

    2013-07-01

    During transport by advection, atmospheric nonspherical particles, such as volcanic ash, desert dust or sea-salt particles, experience several chemical and physical processes, leading to a complex vertical atmospheric layering at remote sites where intrusion episodes occur. In this paper, a new methodology is proposed to analyse this complex vertical layering in the case of two/three-component particle external mixtures. The methodology relies on an analysis of the spectral and polarization properties of the light backscattered by atmospheric particles, combining a sensitive and accurate UV-VIS polarization lidar experiment with T-matrix numerical simulations and air-mass back trajectories. The Lyon UV-VIS polarization lidar is used to efficiently partition the particle mixture into its nonspherical components, while the T-matrix method is used to simulate the backscattering and depolarization properties of nonspherical volcanic ash, desert dust and sea-salt particles. It is shown that the particle mixture's depolarization ratio δp differs from the nonspherical particles' depolarization ratio δns due to the presence of spherical particles in the mixture. Hence, after identifying a tracer for nonspherical particles, particle backscattering coefficients specific to each nonspherical component can be retrieved in a two-component external mixture. For three-component mixtures, the spectral properties of light must in addition be exploited by using a dual-wavelength polarization lidar. Hence, for the first time, in a three-component external mixture, the nonsphericity of each particle is taken into account in a so-called 2β + 2δ formalism. Applications of this new methodology are then demonstrated in two case studies carried out in Lyon, France, related to the mixing of Eyjafjallajökull volcanic ash with sulfate particles (a two-component mixture) and to the mixing of dust with sea-salt and water-soluble particles (a three-component mixture).
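
The two-component partition step can be sketched with a standard depolarization-ratio separation formula of the type used in polarization-lidar studies (e.g. Tesche et al. 2009); the function name and the numerical values below are illustrative, and this is not the paper's 2β + 2δ formalism:

```python
def nonspherical_backscatter(beta_p, delta_p, delta_ns, delta_s=0.0):
    """Backscatter coefficient of the nonspherical component in a
    two-component external mixture, from the measured particle
    depolarization ratio delta_p and the pure-component ratios
    delta_ns (nonspherical) and delta_s (spherical)."""
    lo, hi = min(delta_s, delta_ns), max(delta_s, delta_ns)
    if not (lo <= delta_p <= hi):
        raise ValueError("delta_p must lie between the component ratios")
    # fraction of the particle backscatter due to nonspherical particles
    frac = ((delta_p - delta_s) * (1.0 + delta_ns)
            / ((delta_ns - delta_s) * (1.0 + delta_p)))
    return frac * beta_p
```

When the measured ratio equals the pure nonspherical value, the whole backscatter is attributed to the nonspherical component; when it equals the spherical value, none of it is.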

  4. Efficient numerical methods for simulating surface tension of multi-component mixtures with the gradient theory of fluid interfaces

    KAUST Repository

    Kou, Jisheng

    2015-08-01

    Surface tension significantly impacts subsurface flow and transport, and it is the main cause of the capillary effect, a major mechanism of immiscible two-phase flow in systems with a strong wettability preference. In this paper, we consider the numerical simulation of the surface tension of multi-component mixtures with the gradient theory of fluid interfaces. Major numerical challenges include that the system of Euler-Lagrange equations must be solved on an infinite interval and that its coefficient matrix is not positive definite. We construct a linear transformation to reduce the Euler-Lagrange equations and naturally introduce a path function, which is proven to be a monotonic function of the spatial coordinate variable. By using the linear transformation and the path function, we overcome the above difficulties and develop efficient methods for calculating the interface and its interior compositions. Moreover, the computation of the surface tension is also simplified. The proposed methods do not need to solve the differential equation system, and they are easy to implement in practical applications. Numerical examples are tested to verify the efficiency of the proposed methods. © 2014 Elsevier B.V.
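
For a single component, the gradient-theory surface tension can be computed with the density itself as the integration path, avoiding the Euler-Lagrange ODE in the same spirit as the paper's path function. A sketch with an illustrative double-well excess grand potential (the influence parameter `c` and the potential are made up for the example):

```python
import math

def surface_tension(c, delta_omega, rho_v, rho_l, n=2000):
    """Gradient-theory surface tension of a planar interface,
        sigma = integral from rho_v to rho_l of sqrt(2*c*DeltaOmega(rho)) drho,
    evaluated with the midpoint rule (single-component sketch)."""
    h = (rho_l - rho_v) / n
    total = 0.0
    for i in range(n):
        rho = rho_v + (i + 0.5) * h   # midpoint of each subinterval
        total += math.sqrt(2.0 * c * delta_omega(rho)) * h
    return total

# illustrative double-well excess grand potential with minima at 0 and 1;
# the analytic result for this choice is sqrt(2)/6 ≈ 0.23570
sigma = surface_tension(1.0, lambda r: (r * (1.0 - r)) ** 2, 0.0, 1.0)
```

The quadrature reproduces the analytic value of this toy integral to three decimals, which is the kind of check one would run before moving to real equations of state.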

  5. Paradoxical Behavior of Granger Causality

    Science.gov (United States)

    Witt, Annette; Battaglia, Demian; Gail, Alexander

    2013-03-01

    Granger causality is a standard tool for the description of directed interactions of network components and is popular in many scientific fields including econometrics, neuroscience and climate science. For time series that can be modeled as bivariate auto-regressive processes, we analytically derive an expression for spectrally decomposed Granger causality (SDGC) and show that this quantity depends on only two out of four groups of model parameters. We then present examples of such processes whose SDGC exhibits paradoxical behavior, in the sense that causality is high in frequency ranges with low spectral power. To avoid misinterpretations of Granger causality analysis, we propose to complement it with partial spectral analysis. Our findings are illustrated by an example from brain electrophysiology. Finally, we draw implications for the conventional definition of Granger causality. Bernstein Center for Computational Neuroscience Goettingen
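
Time-domain Granger causality for lag-1 bivariate models is the log ratio of the restricted to the full residual variance. A minimal pure-Python sketch on simulated data; the paper's spectral decomposition is not reproduced, and all simulation parameters are illustrative:

```python
import math
import random

def var1_granger(x, y):
    """Time-domain Granger causality x -> y for lag-1 AR models:
    log of (restricted residual sum of squares / full RSS)."""
    Y, Y1, X1 = y[1:], y[:-1], x[:-1]
    # restricted model: y_t = a * y_{t-1}
    a_r = sum(yt * y1 for yt, y1 in zip(Y, Y1)) / sum(v * v for v in Y1)
    rss_r = sum((yt - a_r * y1) ** 2 for yt, y1 in zip(Y, Y1))
    # full model: y_t = a * y_{t-1} + b * x_{t-1}, 2x2 normal equations
    syy = sum(v * v for v in Y1)
    sxx = sum(v * v for v in X1)
    sxy = sum(u * v for u, v in zip(Y1, X1))
    ry = sum(yt * y1 for yt, y1 in zip(Y, Y1))
    rx = sum(yt * x1 for yt, x1 in zip(Y, X1))
    det = syy * sxx - sxy * sxy
    a_f = (ry * sxx - rx * sxy) / det
    b_f = (rx * syy - ry * sxy) / det
    rss_f = sum((yt - a_f * y1 - b_f * x1) ** 2
                for yt, y1, x1 in zip(Y, Y1, X1))
    return math.log(rss_r / rss_f)

# simulate a unidirectionally coupled pair: x drives y with lag 1
rng = random.Random(0)
x, y = [0.0], [0.0]
for _ in range(2000):
    y.append(0.3 * y[-1] + 0.8 * x[-1] + rng.gauss(0.0, 1.0))
    x.append(0.5 * x[-1] + rng.gauss(0.0, 1.0))
```

On this data the causality in the driving direction is large while the reverse direction is near zero; the paper's point is that the spectral decomposition of this quantity can behave counterintuitively even when the time-domain value does not.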

  6. Quantification of Multiple Components of Complex Aluminum-Based Adjuvant Mixtures by Using Fourier Transform Infrared Spectroscopy and Partial Least Squares Modeling.

    Science.gov (United States)

    Dowling, Quinton M; Kramer, Ryan M

    2017-01-01

    Fourier transform infrared (FTIR) spectroscopy is widely used in the pharmaceutical industry for process monitoring, compositional quantification, and characterization of critical quality attributes in complex mixtures. Advantages over other spectroscopic measurements include ease of sample preparation, quantification of multiple components from a single measurement, and the ability to quantify optically opaque samples. This method describes the use of a multivariate model for quantifying a TLR4 agonist (GLA) adsorbed onto aluminum oxyhydroxide (Alhydrogel®) using FTIR spectroscopy, and may be adapted to quantify other complex aluminum-based adjuvant mixtures.
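
The idea of quantifying multiple components from a single spectrum can be illustrated with classical least squares (CLS), a simpler relative of the partial least squares model described in the chapter. Everything below (reference spectra, band shapes, the 0.7/0.3 blend) is synthetic:

```python
def cls_quantify(spectrum, ref1, ref2):
    """Classical least-squares (CLS) unmixing: find concentrations
    c1, c2 minimizing ||spectrum - c1*ref1 - c2*ref2||^2 via the
    2x2 normal equations."""
    s11 = sum(a * a for a in ref1)
    s22 = sum(b * b for b in ref2)
    s12 = sum(a * b for a, b in zip(ref1, ref2))
    b1 = sum(a * m for a, m in zip(ref1, spectrum))
    b2 = sum(b * m for b, m in zip(ref2, spectrum))
    det = s11 * s22 - s12 * s12
    return ((b1 * s22 - b2 * s12) / det, (b2 * s11 - b1 * s12) / det)

# synthetic reference spectra of two components and a 0.7/0.3 blend
ref1 = [0.0, 0.2, 1.0, 0.2, 0.0, 0.0, 0.0]
ref2 = [0.0, 0.0, 0.0, 0.1, 0.6, 1.0, 0.3]
mix = [0.7 * a + 0.3 * b for a, b in zip(ref1, ref2)]
c1, c2 = cls_quantify(mix, ref1, ref2)
```

CLS assumes the pure-component spectra are known and additive; PLS, as used in the chapter, instead learns latent factors from calibration samples and handles correlated or unknown backgrounds better.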

  7. Differential memory persistence of odour mixture and components in newborn rabbits: Competition between the whole and its parts.

    Directory of Open Access Journals (Sweden)

    Gérard eCoureaud

    2014-06-01

    Interacting with the mother during the daily nursing, newborn rabbits experience her body odour cues. In particular, the mammary pheromone (MP) contained in rabbit milk triggers the typical behaviour which helps them localize and seize the nipples. It also promotes very rapid appetitive learning of simple or complex stimuli (odorants or mixtures) through associative conditioning. We previously showed that 24 h after MP-induced conditioning to odorants A (ethyl isobutyrate) or B (ethyl maltol), newborn rabbits perceive the AB mixture in a weak configural way, i.e. they perceive the odour of the AB configuration in addition to the odours of the elements. Moreover, after conditioning to the mixture, elimination of the memories of A and B does not affect the memory of AB, suggesting independent elemental and configural memories of the mixture. Here, we evaluated whether configural memory persistence differs from elemental persistence. First, whereas 1- or 3-day-old pups conditioned to A or B maintained their responsiveness to the conditioned odorant for 4 days, those conditioned to AB did not respond to the mixture after the same retention period. Second, pups conditioned to AB still responded to A and B 4 days after conditioning, which indicates stronger retention of the elements than of the configuration when all the information is learned together. Third, we determined whether the memory of the elements competes with the memory of the configuration: after conditioning to AB, when the memories of A and B were erased using pharmacological treatment, the memory of the mixture was extended to day 5. Thus, newborn rabbits have access to both elemental and configural information in certain odour mixtures, and competition between these distinct representations of the mixture influences the persistence of their memories. Such effects certainly occur in the natural context of mother-pup interactions and may contribute to early acquisition of knowledge about the

  8. Analysis of Causality Relationship of Components of Socio-ecological and Socio-economical System for Management of the Outermost Small Islands: A Case of Lingayan Island, Central Sulawesi

    Directory of Open Access Journals (Sweden)

    Mohammad Saleh Lubis

    2014-05-01

    Indonesia has more than 17,506 islands, and 92 of them are outermost small islands. Lingayan is one of them, located northwest of Sulawesi Island, and it has a geostrategic role in determining the sea boundaries of the Indonesian state (NKRI), including the territorial seas, the exclusive economic zone and the continental shelf. Recently, the coastal ecosystems of Lingayan have degraded and the island's economy is weak, so they cannot support the survival of the inhabitants. This condition could weaken the geostrategic role in accordance with Article 121, Chapter VIII of the United Nations Convention on the Law of the Sea (UNCLOS). For these reasons, the study aims to examine and assess the causal relations of components in the socio-ecological and socio-economical systems as a basis for management of Lingayan Island, targeting conservation of coastal ecosystems and growth of the inhabitants' business economy. Causal relations among components were built using Structural Equation Modeling (SEM) with the AMOS method and 40 constructed indicators, and the most suitable programs were determined using the Analytical Hierarchy Process (AHP). The research showed a relationship between the components of the socio-ecological system, as indicated by the fit of the causal-relation path diagram (chi square = 236.994, RMSEA = 0.083, GFI = 0.884), and a relationship between the components of the socio-economical system (chi square = 192.824, RMSEA = 0.081, GFI = 0.900). The most appropriate programs are seaweed cultivation (34.0%) and restoration (23.4%).
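
The AHP step that ranks the candidate programs reduces to extracting the principal eigenvector of a pairwise comparison matrix. A minimal power-iteration sketch (the SEM/AMOS analysis is not reproduced; the weights below are made up):

```python
def ahp_priorities(M, iters=100):
    """Priority weights from an AHP pairwise comparison matrix:
    power iteration toward the principal eigenvector, normalized
    to sum to one."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [vi / s for vi in v]
    return w

# perfectly consistent matrix built from known weights 0.5 : 0.3 : 0.2
true_w = [0.5, 0.3, 0.2]
M = [[wi / wj for wj in true_w] for wi in true_w]
weights = ahp_priorities(M)
```

For a perfectly consistent matrix the priorities are recovered exactly; real expert judgments are inconsistent, and AHP practice adds a consistency-ratio check before trusting the ranking.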

  9. Obtaining the cumulative k-distribution of a gas mixture from those of its components. [radiative transfer in stratosphere

    Science.gov (United States)

    Gerstell, M. F.

    1993-01-01

    A review of the convolution theorem for obtaining the cumulative k-distribution of a gas mixture, proven in Goody et al. (1989), and a discussion of its application to natural spectra are presented. Computational optimizations for use in analyzing high-altitude gas mixtures are introduced, together with comparisons of their results and criteria for deciding which altitudes are 'high' in this context. A few relevant features of the supporting test software are examined, along with some spectrally integrated results and the circumstances that might permit substituting the method of principal absorbers.
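
The convolution theorem under review states that, for spectrally uncorrelated gases, the k-distribution density of the mixture is the convolution of the component densities. A discrete sketch, assuming a uniform k grid (the function name and toy densities are illustrative):

```python
def mix_k_density(f1, f2, dk):
    """Goody et al. (1989) convolution theorem, discretized: the
    mixture's k-distribution density is the convolution of the
    component densities on a uniform k grid of spacing dk."""
    n1, n2 = len(f1), len(f2)
    f = [0.0] * (n1 + n2 - 1)
    for i in range(n1):
        for j in range(n2):
            f[i + j] += f1[i] * f2[j] * dk
    return f

# two normalized toy densities on a uniform k grid (dk = 0.5)
dk = 0.5
f1 = [0.5, 0.5, 0.5, 0.5]   # each sums to 1 when multiplied by dk
f2 = [1.0, 0.5, 0.5, 0.0]
fmix = mix_k_density(f1, f2, dk)
```

Normalization is preserved by the convolution, which is the first sanity check one would apply before feeding real spectral data through such a routine.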

  10. Causally nonseparable processes admitting a causal model

    International Nuclear Information System (INIS)

    Feix, Adrien; Araújo, Mateus; Brukner, Caslav

    2016-01-01

    A recent framework of quantum theory with no global causal order predicts the existence of ‘causally nonseparable’ processes. Some of these processes produce correlations incompatible with any causal order (they violate so-called ‘causal inequalities’ analogous to Bell inequalities) while others do not (they admit a ‘causal model’ analogous to a local model). Here we show for the first time that bipartite causally nonseparable processes with a causal model exist, and give evidence that they have no clear physical interpretation. We also provide an algorithm to generate processes of this kind and show that they have nonzero measure in the set of all processes. We demonstrate the existence of processes which stop violating causal inequalities but are still causally nonseparable when mixed with a certain amount of ‘white noise’. This is reminiscent of the behavior of Werner states in the context of entanglement and nonlocality. Finally, we provide numerical evidence for the existence of causally nonseparable processes which have a causal model even when extended with an entangled state shared among the parties. (paper)

  11. Nature of the interfacial region between cementitious mixtures and rocks from the Palo Duro Basin and other seal components

    International Nuclear Information System (INIS)

    Wakeley, L.D.; Roy, D.M.

    1986-03-01

    Using the interface zone as an indicator of compatibility, preliminary tests were run on cement-based formulations designed for shaft sealing in conjunction with evaporite and clastic rocks of the Palo Duro Basin, one of several potential sites for a high-level radioactive waste repository. Emphasis focused on two formulations, both designed to be slightly expansive. Mixture 83-05 was tested in combination with anhydrite and siltstone; a comparable mixture (83-03) containing salt was used with the halite. Cement, rocks and their respective interfaces were examined using X-ray diffraction, optical microscopy and scanning electron microscopy. Bond strengths between rock and cement, as well as between selected steels and grout, were determined as a function of curing conditions and pretest surface treatment. Permeabilities of cement/rock and cement/steel composites were also determined. Bond strength and permeability were found to vary with curing conditions as well as with surface treatment.

  12. Application of approximations for joint cumulative k-distributions for mixtures to FSK radiation heat transfer in multi-component high temperature non-LTE plasmas

    International Nuclear Information System (INIS)

    Maurente, André; França, Francis H.R.; Miki, Kenji; Howell, John R.

    2012-01-01

    Approximations for joint cumulative k-distributions of mixtures are efficient for full spectrum k-distribution (FSK) computations. These approximations reduce the database needed to perform FSK computation compared to the direct approach, which uses cumulative k-distributions computed from the spectrum of the mixture, and are also less computationally expensive than techniques in which an RTE must be solved for each component of the mixture. The aim of the present paper is to extend the approximations for joint cumulative k-distributions to non-LTE media. To do so, an FSK formulation for non-LTE media, well suited to be applied along with approximations for joint cumulative k-distributions, is presented. The application of the proposed methodology is demonstrated by solving the radiation heat transfer in non-LTE high-temperature plasmas composed of N, O, N2, NO, N2+ and mixtures of these species. The two most efficient approximations, namely superposition and multiplication, are employed and analyzed.
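
The two approximations named at the end can be sketched in a few lines; the real method operates on tabulated spectral databases and non-LTE weighting functions, all of which are omitted here, and the toy grids below are made up:

```python
def superposition(k1_of_g, k2_of_g):
    """Superposition scheme: component absorption coefficients are
    added at equal cumulative fraction g."""
    return [k1 + k2 for k1, k2 in zip(k1_of_g, k2_of_g)]

def multiplication(g1_of_k, g2_of_k):
    """Multiplication scheme: assuming statistically uncorrelated
    spectra, g_mix(k) is approximated by g1(k) * g2(k)."""
    return [g1 * g2 for g1, g2 in zip(g1_of_k, g2_of_k)]

# toy k(g) curves for two species, evaluated at the same g points
k_mix = superposition([0.1, 0.5, 2.0], [0.2, 0.4, 1.0])
# toy cumulative distributions g(k) for two species at the same k points
g_mix = multiplication([0.2, 0.6, 1.0], [0.3, 0.8, 1.0])
```

Both outputs inherit the monotonicity of their inputs, and the multiplication result never exceeds either component's cumulative distribution, which is the qualitative behavior the schemes are built on.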

  13. Separation of the components of the TBP-H2 MBP-HDBP-H3PO4 mixture

    International Nuclear Information System (INIS)

    Pires, M.A.F.; Abrao, A.

    1981-04-01

    Several schemes were studied for the separation of dibutylphosphoric acid (HDBP), monobutylphosphoric acid (H2MBP) and orthophosphoric acid (H3PO4), the hydrolytic and radiolytic degradation products of tri-n-butylphosphate (TBP). For the resolution of an HDBP, H2MBP and H3PO4 mixture in TBP-diluent, or in TBP-diluent-heavy metal nitrate (U-VI, Th-IV or Zr-IV), techniques such as ion exchange chromatography, ion chromatography and separation on a chromatographic alumina column were investigated. For the identification, determination and analytical follow-up of the several systems studied, techniques such as refractive index measurement, electrical conductivity measurement, molecular spectrophotometry and gas chromatography were applied. Special emphasis was given to the separation using an alumina column, on which HDBP was retained and selectively eluted for its separation from TBP-varsol-uranyl nitrate mixtures. This analytical procedure was applied to samples coming from the Uranium Purification Pilot Plant in operation at the Centro de Engenharia Quimica (IPEN). (Author)

  14. Linear regression analysis and its application to multivariate chromatographic calibration for the quantitative analysis of two-component mixtures.

    Science.gov (United States)

    Dinç, Erdal; Ozdemir, Abdil

    2005-01-01

    A multivariate chromatographic calibration technique was developed for the quantitative analysis of binary mixtures of enalapril maleate (EA) and hydrochlorothiazide (HCT) in tablets in the presence of losartan potassium (LST). The mathematical algorithm of the multivariate chromatographic calibration technique is based on linear regression equations constructed from the relationship between concentration and peak area at a five-wavelength set. The algorithm of this calibration model, which has a simple mathematical content, is briefly described. The approach is a powerful mathematical tool for optimal chromatographic multivariate calibration and for the elimination of fluctuations arising from instrumental and experimental conditions. The multivariate chromatographic calibration reduces the multivariate linear regression functions to a univariate data set. The model was validated by analyzing various synthetic binary mixtures and by using the standard addition technique. The developed calibration technique was applied to the analysis of real pharmaceutical tablets containing EA and HCT. The results were compared with those obtained by a classical HPLC method, and the proposed multivariate chromatographic calibration was observed to give better results than classical HPLC.
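
The calibration step reduces to fitting a line between concentration and peak area and inverting it for unknown samples. A minimal single-wavelength sketch (the paper combines five wavelengths; all numbers below are synthetic):

```python
def fit_line(conc, area):
    """Ordinary least-squares calibration line: area = m*conc + b."""
    n = len(conc)
    mc, ma = sum(conc) / n, sum(area) / n
    m = (sum((c - mc) * (a - ma) for c, a in zip(conc, area))
         / sum((c - mc) ** 2 for c in conc))
    return m, ma - m * mc

def predict_conc(area, m, b):
    """Invert the calibration line for an unknown sample."""
    return (area - b) / m

# noise-free synthetic calibration data: area = 12.5*conc + 0.4
concs = [1.0, 2.0, 3.0, 4.0, 5.0]
areas = [12.5 * c + 0.4 for c in concs]
m, b = fit_line(concs, areas)
unknown = predict_conc(25.4, m, b)   # ≈ 2.0
```

Repeating this fit at several wavelengths and reconciling the predictions is what turns the univariate lines into the multivariate calibration of the paper.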

  15. Causality in Science

    Directory of Open Access Journals (Sweden)

    Cristina Puente Águeda

    2011-10-01

    Causality is a fundamental notion in every field of science. Since the times of Aristotle, causal relationships have been a matter of study as a way to generate knowledge and provide explanations. In this paper I review the notion of causality across different scientific areas such as physics, biology and engineering. In science, causality is usually seen as a precise relation: the same cause always provokes the same effect. But in the everyday world, the links between cause and effect are frequently imprecise or imperfect in nature. Fuzzy logic offers an adequate framework for dealing with imperfect causality, so a few notions of fuzzy causality are introduced.

  16. A Functional Monomer Is Not Enough: Principal Component Analysis of the Influence of Template Complexation in Pre-Polymerization Mixtures on Imprinted Polymer Recognition and Morphology

    Directory of Open Access Journals (Sweden)

    Kerstin Golker

    2014-11-01

    In this report, principal component analysis (PCA) has been used to explore the influence of template complexation in the pre-polymerization phase on template molecularly imprinted polymer (MIP) recognition and polymer morphology. A series of 16 bupivacaine MIPs were studied. The ethylene glycol dimethacrylate (EGDMA)-crosslinked polymers had either methacrylic acid (MAA) or methyl methacrylate (MMA) as the functional monomer, and the stoichiometry between template, functional monomer and crosslinker was varied. The polymers were characterized using radioligand equilibrium binding experiments, gas sorption measurements, swelling studies and data extracted from molecular dynamics (MD) simulations of all-component pre-polymerization mixtures. The molar fraction of the functional monomer in the MAA-polymers contributed to describing both the binding, surface area and pore volume. Interestingly, weak positive correlations between the swelling behavior and the rebinding characteristics of the MAA-MIPs were exposed. Polymers prepared with MMA as a functional monomer and a polymer prepared with only EGDMA were found to share the same characteristics, such as poor rebinding capacities, as well as similar surface area and pore volume, independent of the molar fraction of MMA used in synthesis. The use of PCA for interpreting relationships between MD-derived descriptions of events in the pre-polymerization mixture, recognition properties and morphologies of the corresponding polymers illustrates the potential of PCA as a tool for better understanding these complex materials and for their rational design.

  17. A functional monomer is not enough: principal component analysis of the influence of template complexation in pre-polymerization mixtures on imprinted polymer recognition and morphology.

    Science.gov (United States)

    Golker, Kerstin; Karlsson, Björn C G; Rosengren, Annika M; Nicholls, Ian A

    2014-11-10

    In this report, principal component analysis (PCA) has been used to explore the influence of template complexation in the pre-polymerization phase on template molecularly imprinted polymer (MIP) recognition and polymer morphology. A series of 16 bupivacaine MIPs were studied. The ethylene glycol dimethacrylate (EGDMA)-crosslinked polymers had either methacrylic acid (MAA) or methyl methacrylate (MMA) as the functional monomer, and the stoichiometry between template, functional monomer and crosslinker was varied. The polymers were characterized using radioligand equilibrium binding experiments, gas sorption measurements, swelling studies and data extracted from molecular dynamics (MD) simulations of all-component pre-polymerization mixtures. The molar fraction of the functional monomer in the MAA-polymers contributed to describing both the binding, surface area and pore volume. Interestingly, weak positive correlations between the swelling behavior and the rebinding characteristics of the MAA-MIPs were exposed. Polymers prepared with MMA as a functional monomer and a polymer prepared with only EGDMA were found to share the same characteristics, such as poor rebinding capacities, as well as similar surface area and pore volume, independent of the molar fraction MMA used in synthesis. The use of PCA for interpreting relationships between MD-derived descriptions of events in the pre-polymerization mixture, recognition properties and morphologies of the corresponding polymers illustrates the potential of PCA as a tool for better understanding these complex materials and for their rational design.
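
The PCA step can be sketched in pure Python: center the data, form the sample covariance matrix, and extract the leading eigenvector by power iteration. This is a minimal stand-in for a full PCA of the MD-derived descriptors; the toy data are made up:

```python
def first_pc(data, iters=200):
    """Leading principal component of row-wise observations via
    power iteration on the sample covariance matrix."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in data]   # center
    C = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
          for b in range(d)] for a in range(d)]                   # covariance
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# toy data whose variance is dominated by the direction (1, 0.1)
pc = first_pc([[i, 0.1 * i] for i in range(-5, 6)])
```

The recovered direction is (1, 0.1) up to normalization and sign; subsequent components would be obtained by deflating the covariance matrix and iterating again.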

  18. Stratification of mixtures in evaporating liquid films occurs only for a range of volume fractions of the smaller component

    Science.gov (United States)

    Sear, Richard P.

    2018-04-01

    I model the drying of a liquid film containing small and big colloid particles. Fortini et al. [Phys. Rev. Lett. 116, 118301 (2016)] studied these films with both computer simulation and experiment. They found that at the end of drying, the mixture had stratified with a layer of the smaller particles on top of the big particles. I develop a simple model for this process. The model has two ingredients: arrest of the diffusion of the particles at high density and diffusiophoretic motion of the big particles due to gradients in the volume fraction of the small particles. The model predicts that stratification only occurs over a range of initial volume fractions of the smaller colloidal species. Above and below this range, the downward diffusiophoretic motion of the big particles is too slow to remove the big particles from the top of the film, and so there is no stratification. In agreement with earlier work, the model also predicts that large Péclet numbers for drying are needed to see stratification.
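
The drying Péclet number invoked at the end compares evaporation to particle diffusion. A minimal estimate, assuming a Stokes-Einstein diffusivity in a water-like solvent; the function name and all numerical inputs are illustrative:

```python
import math

def drying_peclet(film_height, evap_speed, particle_radius,
                  temperature=293.0, viscosity=1.0e-3):
    """Drying Peclet number Pe = H * v_ev / D, with the particle
    diffusivity D from the Stokes-Einstein relation
    D = kB*T / (6*pi*eta*r). SI units throughout."""
    kB = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
    D = kB * temperature / (6.0 * math.pi * viscosity * particle_radius)
    return film_height * evap_speed / D

# Pe scales linearly with particle radius, so big particles have the
# larger Peclet number for the same film and evaporation rate
pe_small = drying_peclet(1e-5, 1e-7, 1e-8)   # ~10 nm particle
pe_big = drying_peclet(1e-5, 1e-7, 1e-7)     # ~100 nm particle
```

This size dependence is why the big particles are the slow, easily-overtaken species in such models; the stratification criterion itself additionally involves the small-particle volume fraction, as the paper shows.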

  19. Causality in Europeanization Research

    DEFF Research Database (Denmark)

    Lynggaard, Kennet

    2012-01-01

    to develop discursive institutional analytical frameworks and something that comes close to the formulation of hypotheses on the effects of European Union (EU) policies and institutions on domestic change. Even if these efforts so far do not necessarily amount to substantive theories or claims of causality... of discursive causalities towards more substantive claims of causality between EU policy and institutional initiatives and domestic change.

  20. In vitro activity of essential oils of Lippia sidoides and Lippia gracilis and their major chemical components against Thielaviopsis paradoxa, causal agent of stem bleeding in coconut palms

    Directory of Open Access Journals (Sweden)

    Rejane Rodrigues da Costa e Carvalho

    2013-01-01

Full Text Available Essential oils of Lippia sidoides, Lippia gracilis and their main chemical components were investigated for in vitro control of Thielaviopsis paradoxa. Mycelial growth and conidial production of the pathogen were inhibited by the essential oil of L. sidoides at all concentrations tested (0.2; 0.5; 1.0; 3.0 µL mL-1). L. sidoides oil contained 42.33% thymol and 4.56% carvacrol, while L. gracilis oil contained 10% thymol and 41.7% carvacrol. Mycelial growth and conidial production of T. paradoxa were completely inhibited by thymol at a concentration of 0.3 µL mL-1. The results suggest that thymol could potentially be used for controlling coconut stem bleeding.

  1. Study of Influence of Composite Materials Components on Properties of Concrete Mixtures and Concrete in Time Dynamics

    Science.gov (United States)

    Butakova, M. D.; Gorbunov, S. P.

    2017-11-01

Concrete is a construction mixture that typically consists of several main components: cement, water, and various fillers. As the grout hardens, it forms an artificial stone used in many applications that demand strength, stability, and durability. To improve the main characteristics of concrete, various additives are introduced into the mix; these substances can also accelerate construction and reduce costs. Additives are especially important when paving airfields and roads, building moorings, laying pools and other hydraulic engineering structures, and constructing monolithic industrial facilities and housing. The article examines the composition and quantity of complex organomineral additives, and the duration and conditions of structure formation in the composites.

  2. Component-wise exergy and energy analysis of vapor compression refrigeration system using mixture of R134a and LPG as refrigerant

    Science.gov (United States)

    Gill, Jatinder; Singh, Jagdev

    2017-11-01

In this work, an experimental examination was carried out using a mixture of R134a and LPG refrigerant (R134a and LPG in a proportion of 28:72 by weight) as a replacement for R134a in a vapor compression refrigeration system. Exergy and energy tests were carried out at different evaporator and condenser temperatures under controlled environmental conditions. The results showed that the exergy destruction in the compressor, condenser, evaporator, and capillary tube of the R134a/LPG refrigeration system was lower by approximately 11.13-3.41%, 2.24-3.43%, 12.02-13.47% and 1.54-5.61%, respectively. The compressor exhibits the highest exergy destruction, followed by the condenser, evaporator, and capillary tube, in both refrigeration systems. The refrigeration capacity and COP of the R134a/LPG refrigeration system were higher than those of the R134a system by about 7.04-11.41% and 15.1-17.82%, and its compressor power consumption was lower by about 3.83-8.08%. The miscibility of the R134a/LPG blend with mineral oil was also found to be good. The R134a and LPG refrigerant mixture proposed in this study therefore performs better than R134a in component-wise exergy and energy analyses under similar experimental conditions.
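The headline comparison reduces to simple arithmetic on COP = refrigeration capacity / compressor power. A minimal sketch with placeholder numbers (not the paper's measured data):

```python
# COP and the relative change between a refrigerant blend and its baseline.
# All numeric values below are assumed for illustration only.
def cop(refrigeration_capacity_W, compressor_power_W):
    """Coefficient of performance of a refrigeration system."""
    return refrigeration_capacity_W / compressor_power_W

def relative_change_percent(blend_value, baseline_value):
    """Percentage change of the blend relative to the baseline."""
    return 100.0 * (blend_value - baseline_value) / baseline_value

cop_r134a = cop(1000.0, 400.0)  # baseline system (assumed values)
cop_blend = cop(1070.0, 380.0)  # blend: more capacity for less power (assumed)
print(relative_change_percent(cop_blend, cop_r134a))  # positive => higher COP
```

A higher capacity combined with lower compressor power necessarily raises the COP, which is the pattern the study reports for the R134a/LPG blend.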

  3. Development of meniscus substitutes using a mixture of biocompatible polymers and extra cellular matrix components by electrospinning.

    Science.gov (United States)

    López-Calzada, G; Hernandez-Martínez, A R; Cruz-Soto, M; Ramírez-Cardona, M; Rangel, D; Molina, G A; Luna-Barcenas, G; Estevez, M

    2016-04-01

Despite the significant advances in the meniscus tissue engineering field, it is difficult to recreate the complex structure and organization of the collagenous matrix of the meniscus. In this work, we developed a meniscus prototype to be used as a substitute or scaffold for the regeneration of the meniscal matrix, recreating the differential morphology of the meniscus by electrospinning. Synthetic biocompatible polymers were combined with the extracellular matrix component collagen and used to replicate the meniscus. We studied the correlation between mechanical and structural properties of the polymer blend as a function of collagen concentration. Fibers were collected on the surface of a rapidly rotating precast mold to accurately replicate each sectional morphology of the meniscus; different electro-tissues were produced. Detailed XRD analyses revealed structural changes developed by electrospinning. We succeeded in integrating all these electro-tissues into a complete synthetic meniscus. Vascularization tests were performed to assess the potential use of our novel polymeric blend for promising meniscus regeneration. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Behavioural responses of Anopheles gambiae sensu stricto to components of human breath, sweat and urine depend on mixture composition and concentration.

    Science.gov (United States)

    Qiu, Y T; Smallegange, R C; VAN Loon, J J A; Takken, W

    2011-09-01

    Host-seeking behaviour of the anthropophilic malaria vector Anopheles gambiae sensu stricto (Diptera: Culicidae) is mediated predominantly by olfactory cues. Several hundreds of odour components have been identified from human emanations, but only a few have been proven to act as attractants or synergists in the host-seeking behaviour of female An. gambiae. In previous work, aromatics, alcohols and ketones in human odours were found to elicit electrophysiological activity in antennal olfactory neurons of female An. gambiae. However, the behavioural effects of these compounds have not been investigated. In this study, behavioural responses of female An. gambiae to components of human breath, urine and sweat at a series of concentrations, or a single concentration in the case of acetone, were examined in combination with ammonia and L-lactic acid in a dual-choice olfactometer. The results showed that at specific concentrations 4-ethylphenol, indole, 3-methyl-1-butanol and two ketones inhibited the attractive effect of a mixture of ammonia and lactic acid. Acetone on its own was not attractive; however, when combined with lactic acid, the binary mixture was attractive. When combined with ammonia, acetone inhibited the attractiveness exerted by ammonia alone. Dodecanol and dimethyldisulphide did not affect the attraction exerted by ammonia and lactic acid at any of the concentrations tested. By contrast, a human-specific armpit odour, 7-octenoic acid, augmented the attraction exerted by the combination of ammonia and lactic acid at a specific dosage. © 2010 The Authors. Medical and Veterinary Entomology © 2010 The Royal Entomological Society.

  5. Influence of Physiological Gastrointestinal Surfactant Ratio on the Equilibrium Solubility of BCS Class II Drugs Investigated Using a Four Component Mixture Design

    Science.gov (United States)

    2017-01-01

The absorption of poorly water-soluble drugs is influenced by the luminal gastrointestinal fluid content and composition, which control solubility. Simulated intestinal fluids have been introduced into dissolution testing including endogenous amphiphiles and digested lipids at physiological levels; however, in vivo individual variation exists in the concentrations of these components, which will alter drug absorption through an effect on solubility. The use of a factorial design of experiment and varying media by introducing different levels of bile, lecithin, and digested lipids has been previously reported, but here we investigate the solubility variation of poorly soluble drugs through more complex biorelevant amphiphile interactions. A four-component mixture design was conducted to understand the solubilization capacity and interactions of bile salt, lecithin, oleate, and monoglyceride with a constant total concentration (11.7 mM) but varying molar ratios. The equilibrium solubility of seven low solubility acidic (zafirlukast), basic (aprepitant, carvedilol), and neutral (fenofibrate, felodipine, griseofulvin, and spironolactone) drugs was investigated. Solubility results are comparable with literature values and also our own previously published design of experiment studies. Results indicate that solubilization is not a sum accumulation of individual amphiphile concentrations, but a drug specific effect through interactions of mixed amphiphile compositions with the drug. This is probably due to a combined interaction of drug characteristics; for example, lipophilicity, molecular shape, and ionization with amphiphile components, which can generate specific drug–micelle affinities. The proportion of each component can have a remarkable influence on solubility with, in some cases, the highest and lowest points close to each other. A single-point solubility measurement in a fixed composition simulated media or human intestinal fluid sample will therefore provide

  6. Influence of Physiological Gastrointestinal Surfactant Ratio on the Equilibrium Solubility of BCS Class II Drugs Investigated Using a Four Component Mixture Design.

    Science.gov (United States)

    Zhou, Zhou; Dunn, Claire; Khadra, Ibrahim; Wilson, Clive G; Halbert, Gavin W

    2017-12-04

The absorption of poorly water-soluble drugs is influenced by the luminal gastrointestinal fluid content and composition, which control solubility. Simulated intestinal fluids have been introduced into dissolution testing including endogenous amphiphiles and digested lipids at physiological levels; however, in vivo individual variation exists in the concentrations of these components, which will alter drug absorption through an effect on solubility. The use of a factorial design of experiment and varying media by introducing different levels of bile, lecithin, and digested lipids has been previously reported, but here we investigate the solubility variation of poorly soluble drugs through more complex biorelevant amphiphile interactions. A four-component mixture design was conducted to understand the solubilization capacity and interactions of bile salt, lecithin, oleate, and monoglyceride with a constant total concentration (11.7 mM) but varying molar ratios. The equilibrium solubility of seven low solubility acidic (zafirlukast), basic (aprepitant, carvedilol), and neutral (fenofibrate, felodipine, griseofulvin, and spironolactone) drugs was investigated. Solubility results are comparable with literature values and also our own previously published design of experiment studies. Results indicate that solubilization is not a sum accumulation of individual amphiphile concentrations, but a drug specific effect through interactions of mixed amphiphile compositions with the drug. This is probably due to a combined interaction of drug characteristics; for example, lipophilicity, molecular shape, and ionization with amphiphile components, which can generate specific drug-micelle affinities. The proportion of each component can have a remarkable influence on solubility with, in some cases, the highest and lowest points close to each other. A single-point solubility measurement in a fixed composition simulated media or human intestinal fluid sample will therefore provide a
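A mixture design of this kind varies the molar ratios while holding the total concentration fixed. A minimal sketch of generating such candidate compositions, assuming a simplex-lattice grid (the step size is an illustrative choice, not the study's actual design):

```python
from itertools import product

# Sketch of a four-component mixture design at a fixed total concentration
# (11.7 mM, as in the study). Each design point is a tuple of molar fractions
# summing to 1, scaled to the fixed total. The grid step is an assumption.
TOTAL_mM = 11.7
STEP = 0.25  # simplex-lattice step: fractions in {0, 0.25, 0.5, 0.75, 1}

def simplex_lattice(n_components, step):
    """All compositions on a simplex lattice whose fractions sum to 1."""
    n_steps = round(1 / step)
    points = []
    for combo in product(range(n_steps + 1), repeat=n_components):
        if sum(combo) == n_steps:
            points.append(tuple(c * step for c in combo))
    return points

design = simplex_lattice(4, STEP)
# Scale fractions of (bile salt, lecithin, oleate, monoglyceride) to 11.7 mM:
concentrations = [tuple(f * TOTAL_mM for f in point) for point in design]
print(len(design))  # number of candidate blends on this lattice
```

Every point keeps the total at 11.7 mM, so the design probes only the effect of the component ratios, exactly the variable the study isolates.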

  7. Causality in Classical Physics

    Indian Academy of Sciences (India)

    IAS Admin

Classical physics encompasses the study of physical phenomena which range from local (a point) to nonlocal (a region) in space and/or time. We discuss the concept of spatial and temporal nonlocality. However, one of the likely implications pertaining to nonlocality is non-causality. We study causality in the context of ...

  8. Causality in Classical Electrodynamics

    Science.gov (United States)

    Savage, Craig

    2012-01-01

    Causality in electrodynamics is a subject of some confusion, especially regarding the application of Faraday's law and the Ampere-Maxwell law. This has led to the suggestion that we should not teach students that electric and magnetic fields can cause each other, but rather focus on charges and currents as the causal agents. In this paper I argue…

  9. Causality in demand

    DEFF Research Database (Denmark)

    Nielsen, Max; Jensen, Frank; Setälä, Jari

    2011-01-01

    to fish demand. On the German market for farmed trout and substitutes, it is found that supply sources, i.e. aquaculture and fishery, are not the only determinant of causality. Storing, tightness of management and aggregation level of integrated markets might also be important. The methodological......This article focuses on causality in demand. A methodology where causality is imposed and tested within an empirical co-integrated demand model, not prespecified, is suggested. The methodology allows different causality of different products within the same demand system. The methodology is applied...... implication is that more explicit focus on causality in demand analyses provides improved information. The results suggest that frozen trout forms part of a large European whitefish market, where prices of fresh trout are formed on a relatively separate market. Redfish is a substitute on both markets...

  10. Non-Causal Computation

    Directory of Open Access Journals (Sweden)

    Ämin Baumeler

    2017-07-01

    Full Text Available Computation models such as circuits describe sequences of computation steps that are carried out one after the other. In other words, algorithm design is traditionally subject to the restriction imposed by a fixed causal order. We address a novel computing paradigm beyond quantum computing, replacing this assumption by mere logical consistency: We study non-causal circuits, where a fixed time structure within a gate is locally assumed whilst the global causal structure between the gates is dropped. We present examples of logically consistent non-causal circuits outperforming all causal ones; they imply that suppressing loops entirely is more restrictive than just avoiding the contradictions they can give rise to. That fact is already known for correlations as well as for communication, and we here extend it to computation.

  11. The suitability of concentration addition for predicting the effects of multi-component mixtures of up to 17 anti-androgens with varied structural features in an in vitro AR antagonist assay

    Energy Technology Data Exchange (ETDEWEB)

    Ermler, Sibylle; Scholze, Martin; Kortenkamp, Andreas, E-mail: andreas.kortenkamp@brunel.ac.uk

    2011-12-15

The risks associated with human exposures to chemicals capable of antagonising the effects of endogenous androgens have attracted considerable recent interest. Exposure is typically to large numbers of chemicals with androgen receptor (AR) antagonist activity, yet there is limited evidence of the combined effects of multi-component mixtures of these chemicals. A few in vitro studies with mixtures of up to six AR antagonists suggest that the concept of concentration addition (CA) provides good approximations of experimentally observed mixture effects, but studies with larger numbers of anti-androgens, and with more varied structural features, are missing. Here we show that the mixture effects of up to 17 AR antagonists, comprising compounds as diverse as UV-filter substances, parabens, perfluorinated compounds, bisphenol-A, benzo[a]pyrene, synthetic musks, antioxidants and polybrominated biphenyls, can be predicted well on the basis of the anti-androgenicity of the single components using the concept of CA. We tested these mixtures in an in vitro AR-dependent luciferase reporter gene assay, based on MDA-kb2 cells. The effects of further mixtures, composed of four and six anti-androgens, could be predicted accurately by CA. However, there was a shortfall from expected additivity with a ten-component mixture at two different mixture ratios, but attempts to attribute these deviations to differential expression of hormone-metabolising CYP isoforms did not produce conclusive results. CA provides good approximations of in vitro mixture effects of anti-androgens with varying structural features. -- Highlights: • Humans are exposed to a large number of androgen receptor antagonists. • There is limited evidence of the combined effects of anti-androgenic chemicals. • We modelled the predictability of combined effects of up to 17 anti-androgens. • We tested the
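Under concentration addition, the predicted effect concentration of a mixture with fixed component ratios follows from the harmonic-mean-like combination ECx_mix = 1 / Σ(p_i / ECx_i). A minimal sketch with invented EC50 values (not data from the study):

```python
# Concentration addition (CA) prediction for a mixture with fixed ratios:
# ECx_mix = 1 / sum_i(p_i / ECx_i), where p_i is the fraction of component i
# in the mixture and ECx_i its single-compound effect concentration.
def ca_effect_concentration(fractions, ec_values):
    """Predicted mixture effect concentration under concentration addition."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "mixture fractions must sum to 1"
    return 1.0 / sum(p / ec for p, ec in zip(fractions, ec_values))

# Three hypothetical AR antagonists mixed in equal proportions (placeholder
# EC50 values in uM, not taken from the paper):
ec50s = [2.0, 4.0, 8.0]
fractions = [1 / 3, 1 / 3, 1 / 3]
print(ca_effect_concentration(fractions, ec50s))  # predicted mixture EC50
```

A shortfall from additivity, as the paper reports for the ten-component mixture, means the observed mixture EC is higher than this CA prediction.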

  12. Causality and headache triggers

    Science.gov (United States)

    Turner, Dana P.; Smitherman, Todd A.; Martin, Vincent T.; Penzien, Donald B.; Houle, Timothy T.

    2013-01-01

    Objective The objective of this study was to explore the conditions necessary to assign causal status to headache triggers. Background The term “headache trigger” is commonly used to label any stimulus that is assumed to cause headaches. However, the assumptions required for determining if a given stimulus in fact has a causal-type relationship in eliciting headaches have not been explicated. Methods A synthesis and application of Rubin’s Causal Model is applied to the context of headache causes. From this application the conditions necessary to infer that one event (trigger) causes another (headache) are outlined using basic assumptions and examples from relevant literature. Results Although many conditions must be satisfied for a causal attribution, three basic assumptions are identified for determining causality in headache triggers: 1) constancy of the sufferer; 2) constancy of the trigger effect; and 3) constancy of the trigger presentation. A valid evaluation of a potential trigger’s effect can only be undertaken once these three basic assumptions are satisfied during formal or informal studies of headache triggers. Conclusions Evaluating these assumptions is extremely difficult or infeasible in clinical practice, and satisfying them during natural experimentation is unlikely. Researchers, practitioners, and headache sufferers are encouraged to avoid natural experimentation to determine the causal effects of headache triggers. Instead, formal experimental designs or retrospective diary studies using advanced statistical modeling techniques provide the best approaches to satisfy the required assumptions and inform causal statements about headache triggers. PMID:23534872

  13. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
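The Simpson's paradox that the BK-Plot is designed to display can be shown with a few lines of arithmetic. A sketch using classic kidney-stone-style counts (a standard textbook example, not data from the paper):

```python
# Simpson's paradox with a binary confounder (stone size): the treatment that
# wins within every stratum loses in the pooled data, because the confounder
# is unevenly distributed across treatments. All counts are the well-known
# textbook example, used here purely as an illustration.
def rate(successes, total):
    return successes / total

# small stones: A 81/87 vs B 234/270; large stones: A 192/263 vs B 55/80
a_small, b_small = rate(81, 87), rate(234, 270)
a_large, b_large = rate(192, 263), rate(55, 80)
a_pooled, b_pooled = rate(81 + 192, 87 + 263), rate(234 + 55, 270 + 80)

per_stratum_winner_is_A = a_small > b_small and a_large > b_large
pooled_winner_is_A = a_pooled > b_pooled
print(per_stratum_winner_is_A, pooled_winner_is_A)  # True, False: the reversal
```

Treatment A wins in both strata yet loses after pooling; plotting the stratum rates against the mixing proportions is exactly what makes the reversal visible on a BK-Plot.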

  14. Modified headspace solid-phase microextraction for the determination of quantitative relationships between components of mixtures consisting of alcohols, esters, and ethers - impact of the vapor pressure difference of the compounds.

    Science.gov (United States)

    Dawidowicz, Andrzej Lech; Szewczyk, Joanna; Dybowski, Michal P

    2017-07-01

The quantitative relationship between analytes established by the headspace solid-phase microextraction procedure for multicomponent mixtures depends not only on the character and strength of the interactions of individual components with the solid-phase microextraction fiber but also on their vapor pressure in the applied headspace solid-phase microextraction system. This study proves that vapor pressure is of minor importance when the sample is dissolved/suspended in a low-volatility liquid of the same physicochemical character as that of the solid-phase microextraction fiber coating. This is demonstrated for mixtures of alcohols, esters, ethers and their selected representatives by applying a headspace solid-phase microextraction system composed of a Carbowax fiber and sample solutions in polyethylene glycol. The observed differences in the quantitative relations between components of the examined mixtures established by direct analysis and by modified headspace solid-phase microextraction are insignificant (F_exp below the critical value). This results from the reduced influence of the vapor-pressure difference between individual components of the examined mixture in the applied headspace solid-phase microextraction system, owing to the low component concentrations in the polyethylene glycol suspensions (Raoult's law) and to the strong specific interactions of analyte molecules with polyethylene glycol molecules. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Dynamics and causality constraints

    International Nuclear Information System (INIS)

    Sousa, Manoelito M. de

    2001-04-01

The physical meaning and the geometrical interpretation of causality implementation in classical field theories are discussed. Causality constraints in field theory are kinematical constraints dynamically implemented via solutions of the field equation, but in the limit of zero distance from the field sources part of these constraints carries a dynamical content that explains away old problems of classical electrodynamics, with deep implications for the nature of physical interactions. (author)

  16. High-power gas-discharge excimer ArF, KrCl, KrF and XeCl lasers utilising two-component gas mixtures without a buffer gas

    Science.gov (United States)

    Razhev, A. M.; Kargapol'tsev, E. S.; Churkin, D. S.

    2016-03-01

Results of an experimental study of the influence of the gas mixture (laser active medium) composition on the output energy and total efficiency of gas-discharge excimer lasers based on ArF* (193 nm), KrCl* (222 nm), KrF* (248 nm) and XeCl* (308 nm) molecules operating without a buffer gas are presented. The optimal ratios of the gas components of the active medium (from the viewpoint of maximum output energy) are found, which provide efficient operation of the laser sources. It is experimentally confirmed that for gas-discharge excimer lasers based on inert-gas halides the presence of a buffer gas in the active medium is not a necessary condition for efficient operation. For the first time in two-component gas mixtures of repetitively pulsed gas-discharge excimer lasers operating on electron transitions of the excimer molecules ArF*, KrCl*, KrF* and XeCl*, a pulse energy of up to 170 mJ was obtained under pumping by a transverse volume electric discharge in a low-pressure gas mixture without a buffer gas, together with a high pulsed output power (up to 24 MW) at a FWHM KrF-laser pulse duration of 7 ns. The maximum total efficiency obtained in the experiments with two-component gas mixtures of the KrF and XeCl lasers was 0.8%.

  17. Evaluation of the H-point standard additions method (HPSAM) and the generalized H-point standard additions method (GHPSAM) for the UV-analysis of two-component mixtures.

    Science.gov (United States)

    Hund, E; Massart, D L; Smeyers-Verbeke, J

    1999-10-01

    The H-point standard additions method (HPSAM) and two versions of the generalized H-point standard additions method (GHPSAM) are evaluated for the UV-analysis of two-component mixtures. Synthetic mixtures of anhydrous caffeine and phenazone as well as of atovaquone and proguanil hydrochloride were used. Furthermore, the method was applied to pharmaceutical formulations that contain these compounds as active drug substances. This paper shows both the difficulties that are related to the methods and the conditions by which acceptable results can be obtained.
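For context, the underlying two-component problem is linear: by Beer's law the mixture absorbance at each wavelength is a linear combination of the two pure-component spectra, so with known pure spectra the concentrations follow from least squares. This generic sketch (with invented Gaussian band shapes) illustrates that linear structure; it is not the HPSAM procedure itself, which instead exploits measurements at specially chosen wavelength pairs:

```python
import numpy as np

# Two-component UV mixture resolution by ordinary least squares, assuming the
# pure-component spectra are known. Spectra and concentrations are invented.
wavelengths = np.linspace(220, 320, 50)
# Assumed pure-component absorptivity profiles (Gaussian bands):
eps_1 = np.exp(-((wavelengths - 250) / 15.0) ** 2)
eps_2 = np.exp(-((wavelengths - 280) / 20.0) ** 2)

# Synthetic noise-free mixture spectrum (Beer's law, unit path length):
true_conc = np.array([0.3, 0.7])
mixture = eps_1 * true_conc[0] + eps_2 * true_conc[1]

# Solve the overdetermined linear system for the two concentrations:
A = np.column_stack([eps_1, eps_2])
estimated, *_ = np.linalg.lstsq(A, mixture, rcond=None)
print(estimated)  # recovers [0.3, 0.7] for noise-free data
```

HPSAM improves on this naive approach in real samples because it corrects for the interferent's contribution without requiring its full spectrum at every wavelength.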

  18. Dynamics Of Causal Sets

    CERN Document Server

    Rideout, D P

    2001-01-01

    The Causal Set approach to quantum gravity asserts that spacetime, at its smallest length scale, has a discrete structure. This discrete structure takes the form of a locally finite order relation, where the order, corresponding with the macroscopic notion of spacetime causality, is taken to be a fundamental aspect of nature. After an introduction to the Causal Set approach, this thesis considers a simple toy dynamics for causal sets. Numerical simulations of the model provide evidence for the existence of a continuum limit. While studying this toy dynamics, a picture arises of how the dynamics can be generalized in such a way that the theory could hope to produce more physically realistic causal sets. By thinking in terms of a stochastic growth process, and positing some fundamental principles, we are led almost uniquely to a family of dynamical laws (stochastic processes) parameterized by a countable sequence of coupling constants. This result is quite promising in that we now know how to speak of dynamics ...

  19. Thermophysical Properties of Hydrocarbon Mixtures

    Science.gov (United States)

    SRD 4 NIST Thermophysical Properties of Hydrocarbon Mixtures (PC database for purchase)   Interactive computer program for predicting thermodynamic and transport properties of pure fluids and fluid mixtures containing up to 20 components. The components are selected from a database of 196 components, mostly hydrocarbons.

  20. A quantum causal discovery algorithm

    Science.gov (United States)

    Giarmatzi, Christina; Costa, Fabio

    2018-03-01

    Finding a causal model for a set of classical variables is now a well-established task—but what about the quantum equivalent? Even the notion of a quantum causal model is controversial. Here, we present a causal discovery algorithm for quantum systems. The input to the algorithm is a process matrix describing correlations between quantum events. Its output consists of different levels of information about the underlying causal model. Our algorithm determines whether the process is causally ordered by grouping the events into causally ordered non-signaling sets. It detects if all relevant common causes are included in the process, which we label Markovian, or alternatively if some causal relations are mediated through some external memory. For a Markovian process, it outputs a causal model, namely the causal relations and the corresponding mechanisms, represented as quantum states and channels. Our algorithm opens the route to more general quantum causal discovery methods.

  1. Causal inference in econometrics

    CERN Document Server

    Kreinovich, Vladik; Sriboonchitta, Songsak

    2016-01-01

    This book is devoted to the analysis of causal inference which is one of the most difficult tasks in data analysis: when two phenomena are observed to be related, it is often difficult to decide whether one of them causally influences the other one, or whether these two phenomena have a common cause. This analysis is the main focus of this volume. To get a good understanding of the causal inference, it is important to have models of economic phenomena which are as accurate as possible. Because of this need, this volume also contains papers that use non-traditional economic models, such as fuzzy models and models obtained by using neural networks and data mining techniques. It also contains papers that apply different econometric models to analyze real-life economic dependencies.

  2. Perceptual causality in children.

    Science.gov (United States)

    Schlottmann, Anne; Allen, Deborah; Linderoth, Carina; Hesketh, Sarah

    2002-01-01

    Three experiments considered the development of perceptual causality in children from 3 to 9 years of age (N = 176 in total). Adults tend to see cause and effect even in schematic, two-dimensional motion events: Thus, if square A moves toward B, which moves upon contact, they report that A launches B--physical causality. If B moves before contact, adults report that B tries to escape from A--social or psychological causality. A brief pause between movements eliminates such impressions. Even infants in the first year of life are sensitive to causal structure in both contact and no-contact events, but previous research with talking-age children found poor verbal reports. The present experiments used a picture-based forced-choice task to reduce linguistic demands. Observers saw eight different animations involving squares A and B. Events varied in whether or not these agents made contact; whether or not there was a delay at the closest point; and whether they moved rigidly or with a rhythmic, nonrigid "caterpillar" motion. Participants of all ages assigned events with contact to the physical domain and events without contact to the psychological domain. In addition, participants of all ages chose causality more often for events without delay than with delay, but these events became more distinct over the preschool range. The manipulation of agent motion had only minor and inconsistent effects across studies, even though children of all ages considered only the nonrigid motion to be animal-like. These results agree with the view that perceptual causality is available early in development.

  3. Regression to Causality

    DEFF Research Database (Denmark)

    Bordacconi, Mats Joe; Larsen, Martin Vinæs

    2014-01-01

    Humans are fundamentally primed for making causal attributions based on correlations. This implies that researchers must be careful to present their results in a manner that inhibits unwarranted causal attribution. In this paper, we present the results of an experiment that suggests regression...... more likely. Our experiment drew on a sample of 235 university students from three different social science degree programs (political science, sociology and economics), all of whom had received substantial training in statistics. The subjects were asked to compare and evaluate the validity...

  4. Causality and Free Will

    Czech Academy of Sciences Publication Activity Database

    Hvorecký, Juraj

    2012-01-01

Roč. 19, Supp.2 (2012), s. 64-69 ISSN 1335-0668 R&D Projects: GA ČR(CZ) GAP401/12/0833 Institutional support: RVO:67985955 Keywords : consciousness * free will * determinism * causality Subject RIV: AA - Philosophy ; Religion

  5. Explaining through causal mechanisms

    NARCIS (Netherlands)

    Biesbroek, Robbert; Dupuis, Johann; Wellstead, Adam

    2017-01-01

    This paper synthesizes and builds on recent critiques of the resilience literature; namely that the field has largely been unsuccessful in capturing the complexity of governance processes, in particular cause–effects relationships. We demonstrate that absence of a causal model is reflected in the

  6. Genotoxicity evaluation of multi-component mixtures of polyaromatic hydrocarbons (PAHs), arsenic, cadmium, and lead using flow cytometry based micronucleus test in HepG2 cells.

    Science.gov (United States)

    Muthusamy, Sasikumar; Peng, Cheng; Ng, Jack C

    2018-03-01

Some polyaromatic hydrocarbons (PAHs) and metals are known human carcinogens, and combined toxicity data for these co-contaminants are important for assessing their health risk. In this study, we evaluated the combined genotoxicity, AhR activity, and cell cycle parameters of four PAHs (benzo[a]pyrene (B[a]P), naphthalene (Nap), phenanthrene (Phe) and pyrene (Pyr)) and three metals (arsenic (As), cadmium (Cd), and lead (Pb)) in HepG2 cells using a flow cytometry based micronucleus (MN) test, a CAFLUX assay, and a nuclear fluorescence assay, respectively. The mixtures of B[a]P and metals induced up to a four-fold increase in MN formation compared to B[a]P alone. Higher-order combinations of PAHs and metals did not significantly increase MN formation. Mixtures of metals or non-carcinogenic PAHs were found to increase or decrease the aryl hydrocarbon receptor (AhR) activation of B[a]P in the HepG2 cell based CAFLUX assay. Overall, the results showed that the combined genotoxicity of PAHs and metals in HepG2 cells varies depending on the concentrations and number of the chemicals present in the mixtures, and the effects of higher-order combinations appear to be largely unpredictable from binary combinations. In this study, we have demonstrated the use of the flow cytometry based MN test to screen the genotoxicity of environmental chemicals and their mixtures. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Genotoxicity but not the AhR-mediated activity of PAHs is inhibited by other components of complex mixtures of ambient air pollutants

    Czech Academy of Sciences Publication Activity Database

    Líbalová, Helena; Krčková, S.; Uhlířová, Kateřina; Milcová, Alena; Schmuczerová, Jana; Cigánek, M.; Kléma, J.; Machala, M.; Šrám, Radim; Topinka, Jan

    2014-01-01

    Roč. 225, č. 3 (2014), s. 350-357 ISSN 0378-4274 R&D Projects: GA ČR GAP503/11/0142 Institutional support: RVO:68378041 Keywords: air pollution * DNA adducts * complex mixtures Subject RIV: DN - Health Impact of the Environment Quality Impact factor: 3.262, year: 2014

  8. Experimental determination of critical data of multi-component mixtures containing potential gasoline additives 2-butanol by a flow-type apparatus

    International Nuclear Information System (INIS)

    He, Maogang; Xin, Nan; Wang, Chengjie; Liu, Yang; Zhang, Ying; Liu, Xiangyang

    2016-01-01

    Graphical abstract: Experimental critical pressures of the 2-butanol + hexane + heptane system. - Highlights: • Critical properties of six binary systems and two ternary systems were measured. • Six binary systems containing 2-butanol show non-ideal behavior in their Tc–x1 curves. • The non-ideal behavior of mixtures with 2-butanol is related to azeotropy. • Experimental data for the binary systems were fitted well with the Redlich–Kister equation. • Critical surfaces of the ternary systems were plotted using Cibulka's expressions. - Abstract: In this work, we used a flow method to measure the critical properties of six binary mixtures (2-butanol + cyclohexane, 2-butanol + hexane, 2-butanol + heptane, 2-butanol + octane, 2-butanol + nonane and 2-butanol + decane) and two ternary mixtures (2-butanol + hexane + heptane and 2-butanol + octane + decane). The critical properties were determined by observing the disappearance and reappearance of the gas–liquid phase meniscus in a quartz glass tube. The standard uncertainties of the temperatures and pressures for both binary and ternary mixtures were estimated to be less than 0.2 K and 5.2 kPa, respectively. These critical data define the boundaries of the two-phase regions of the related mixture systems. The six binary systems show non-ideal behavior in the loci of their critical temperatures. We used the Redlich–Kister equations to correlate the critical temperatures and pressures of these systems and list the binary interaction parameters. The maximum average absolute deviation (AAD) between experimental data and results calculated from the Redlich–Kister equations is 0.038% for critical temperatures and 0.244% for critical pressures. Moreover, the two ternary systems are newly reported and were correlated by Cibulka's and Singh's expressions; their maximum AADs for critical temperatures and critical pressures are 0.103% and 0.433%, respectively.
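    The Redlich–Kister correlation named in this record can be sketched numerically. In the sketch below, the pure-component critical temperatures and the expansion coefficients are invented for illustration (they are not the paper's fitted values); the A_i coefficients are recovered from synthetic (x1, Tc) data by linear least squares:

    ```python
    import numpy as np

    # Hypothetical pure-component critical temperatures (K), illustrative only.
    Tc1, Tc2 = 536.0, 507.6

    def redlich_kister_Tc(x1, coeffs):
        """Binary critical temperature via a Redlich-Kister expansion:
        Tc = x1*Tc1 + x2*Tc2 + x1*x2 * sum_i A_i*(x1 - x2)**i, with x2 = 1 - x1."""
        x2 = 1.0 - x1
        excess = sum(A * (x1 - x2) ** i for i, A in enumerate(coeffs))
        return x1 * Tc1 + x2 * Tc2 + x1 * x2 * excess

    def fit_redlich_kister(x1, Tc, order=2):
        """Least-squares fit of the A_i coefficients from (x1, Tc) data."""
        x2 = 1.0 - x1
        # Design matrix columns: x1*x2*(x1 - x2)**i for each expansion term.
        M = np.column_stack([x1 * x2 * (x1 - x2) ** i for i in range(order + 1)])
        residual = Tc - (x1 * Tc1 + x2 * Tc2)  # subtract the ideal (linear) part
        coeffs, *_ = np.linalg.lstsq(M, residual, rcond=None)
        return coeffs

    # Synthetic "measurements" generated from known coefficients, then recovered.
    x1 = np.linspace(0.05, 0.95, 10)
    true = np.array([-40.0, 12.0, 5.0])
    Tc_data = redlich_kister_Tc(x1, true)
    fitted = fit_redlich_kister(x1, Tc_data, order=2)
    print(np.round(fitted, 6))  # should recover [-40, 12, 5] for noiseless data
    ```

    With real measurements the fit would of course not be exact, and the reported AADs quantify that misfit.
    
    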

  9. Optimal causal inference: estimating stored information and approximating causal architecture.

    Science.gov (United States)

    Still, Susanne; Crutchfield, James P; Ellison, Christopher J

    2010-09-01

    We introduce an approach to inferring the causal architecture of stochastic dynamical systems that extends rate-distortion theory to use causal shielding--a natural principle of learning. We study two distinct cases of causal inference: optimal causal filtering and optimal causal estimation. Filtering corresponds to the ideal case in which the probability distribution of measurement sequences is known, giving a principled method to approximate a system's causal structure at a desired level of representation. We show that in the limit in which a model-complexity constraint is relaxed, filtering finds the exact causal architecture of a stochastic dynamical system, known as the causal-state partition. From this, one can estimate the amount of historical information the process stores. More generally, causal filtering finds a graded model-complexity hierarchy of approximations to the causal architecture. Abrupt changes in the hierarchy, as a function of approximation, capture distinct scales of structural organization. For nonideal cases with finite data, we show how the correct number of the underlying causal states can be found by optimal causal estimation. A previously derived model-complexity control term allows us to correct for the effect of statistical fluctuations in probability estimates and thereby avoid overfitting.

  10. Illness causal beliefs in Turkish immigrants.

    Science.gov (United States)

    Minas, Harry; Klimidis, Steven; Tuncer, Can

    2007-07-24

    People hold a wide variety of beliefs concerning the causes of illness. Such beliefs vary across cultures and, among immigrants, may be influenced by many factors, including level of acculturation, gender, level of education, and experience of illness and treatment. This study examines illness causal beliefs in Turkish immigrants in Australia. Causal beliefs about somatic and mental illness were examined in a sample of 444 members of the Turkish population of Melbourne. The socio-demographic characteristics of the sample were broadly similar to those of the Melbourne Turkish community. Five issues were examined: the structure of causal beliefs; the relative frequency of natural, supernatural and metaphysical beliefs; the ascription of somatic, mental, or both somatic and mental conditions to the various causes; the correlations of belief types with socio-demographic, modernizing and acculturation variables; and the relationship between causal beliefs and current illness. First, principal components analysis revealed two broad factors, accounting for 58 percent of the variation in scores on illness belief scales, distinctly interpretable as natural and supernatural beliefs. Second, beliefs in natural causes were more frequent than beliefs in supernatural causes. Third, some causal beliefs were commonly linked to both somatic and mental conditions while others were regarded as more specific to either somatic or mental disorders. Last, there was a range of correlations between endorsement of belief types and factors defining heterogeneity within the community, including demographic factors, indicators of modernizing and acculturative processes, and the current presence of illness. Results supported the classification of causal beliefs proposed by Murdock, Wilson & Frederick, with a division into natural and supernatural causes. While belief in natural causes is more common, belief in supernatural causes persists despite modernizing and acculturative influences. Different

  11. Space, time and causality

    International Nuclear Information System (INIS)

    Lucas, J.R.

    1984-01-01

    Originating from lectures given to first year undergraduates reading physics and philosophy or mathematics and philosophy, formal logic is applied to issues and the elucidation of problems in space, time and causality. No special knowledge of relativity theory or quantum mechanics is needed. The text is interspersed with exercises and each chapter is preceded by a suggested 'preliminary reading' and followed by 'further reading' references. (U.K.)

  12. Operator ordering and causality

    OpenAIRE

    Plimak, L. I.; Stenholm, S. T.

    2011-01-01

    It is shown that causality violations [M. de Haan, Physica 132A, 375, 397 (1985)], emerging when the conventional definition of the time-normal operator ordering [P.L.Kelley and W.H.Kleiner, Phys.Rev. 136, A316 (1964)] is taken outside the rotating wave approximation, disappear when the amended definition [L.P. and S.S., Annals of Physics, 323, 1989 (2008)] of this ordering is used.

  13. X-ray fluorescence analysis of Cr6+ component in mixtures of Cr2O3 and K2CrO4

    International Nuclear Information System (INIS)

    Tochio, Tatsunori; Sakakura, Shusuke; Oohashi, Hirofumi

    2010-01-01

    X-ray fluorescence analysis using Cr Kα spectra was applied to determine the mixing ratio of Cr6+ to (Cr6+ + Cr3+) in several mixtures of K2CrO4 and Cr2O3. Because the K2CrO4 powder contained large particles more than 50 μm in diameter, it was ground with a pestle and mortar for about 8 h. The coarse particles still remaining were removed with a 325-mesh (44 μm) sieve in order to reduce the difference in absorption effects between emissions from Cr6+ and those from Cr3+. The mixing ratio, K2CrO4/(K2CrO4 + Cr2O3), of the five mixtures investigated was 0.50, 0.40, 0.20, 0.10, and 0.05 by weight, respectively. Each spectrum obtained was analyzed by decomposing it into two reference spectra, those of the two pure materials K2CrO4 and Cr2O3, with a constant background. For the mixtures containing more than 20 wt% K2CrO4, the relative deviation from the true value is less than ∼5%. On the other hand, when the K2CrO4 content decreases below 10 wt%, the relative deviation grows to 20-25%. The errors arising from the peak separation of the spectra were estimated by applying our method to five computationally generated data sets for each mixture, taking into account the uncertainty in the total counts of real measurements. (author)

  14. X-ray fluorescence analysis of Cr(6+) component in mixtures of Cr(2)O(3) and K(2)CrO(4).

    Science.gov (United States)

    Tochio, Tatsunori; Sakakura, Shusuke; Oohashi, Hirofumi; Mizota, Hirohisa; Zou, Yanhui; Ito, Yoshiaki; Fukushima, Sei; Tanuma, Shigeo; Shoji, Takashi; Fujimura, Hajime; Yamashita, Michiru

    2010-01-01

    X-ray fluorescence analysis using Cr K(alpha) spectra was applied to determine the mixing ratio of Cr(6+) to (Cr(6+) + Cr(3+)) in several mixtures of K(2)CrO(4) and Cr(2)O(3). Because the K(2)CrO(4) powder contained large particles more than 50 microm in diameter, it was ground with a pestle and mortar for about 8 h. The coarse particles still remaining were removed with a 325-mesh (44 microm) sieve in order to reduce the difference in absorption effects between emissions from Cr(6+) and those from Cr(3+). The mixing ratio, K(2)CrO(4)/(K(2)CrO(4) + Cr(2)O(3)), of the five mixtures investigated was 0.50, 0.40, 0.20, 0.10, and 0.05 by weight, respectively. Each spectrum obtained was analyzed by decomposing it into two reference spectra, those of the two pure materials K(2)CrO(4) and Cr(2)O(3), with a constant background. For the mixtures containing more than 20 wt% K(2)CrO(4), the relative deviation from the true value is less than approximately 5%. On the other hand, when the K(2)CrO(4) content decreases below 10 wt%, the relative deviation grows to 20-25%. The errors arising from the peak separation of the spectra were estimated by applying our method to five computationally generated data sets for each mixture, taking into account the uncertainty in the total counts of real measurements.
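    The decomposition described in this record is a linear least-squares problem: the measured spectrum is modeled as a weighted sum of the two pure-material reference spectra plus a constant background, and the fitted weights give the mixing ratio. A minimal sketch with synthetic Gaussian stand-ins for the reference spectra (not real Cr Kα data; all peak positions and widths are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical reference spectra of the two pure materials on a shared grid.
    energy = np.linspace(5380, 5440, 300)                  # eV, illustrative
    ref_cr6 = np.exp(-0.5 * ((energy - 5415) / 2.5) ** 2)  # stand-in for K2CrO4 (Cr6+)
    ref_cr3 = np.exp(-0.5 * ((energy - 5405) / 3.0) ** 2)  # stand-in for Cr2O3 (Cr3+)

    # A "measured" mixture spectrum: weighted references + constant background + noise.
    w6, w3, bg = 0.2, 0.8, 0.05
    measured = w6 * ref_cr6 + w3 * ref_cr3 + bg + rng.normal(0, 1e-3, energy.size)

    # Linear least squares: measured ~ a*ref_cr6 + b*ref_cr3 + c
    M = np.column_stack([ref_cr6, ref_cr3, np.ones_like(energy)])
    (a, b, c), *_ = np.linalg.lstsq(M, measured, rcond=None)

    ratio = a / (a + b)  # estimated Cr6+ fraction of the fitted spectral weights
    print(round(float(ratio), 3))
    ```

    The record's observation that the error grows at low K2CrO4 content corresponds to the weight `a` becoming small relative to the noise, which inflates the relative uncertainty of the ratio.
    
    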

  15. Effect of permethrin, anthracene and mixture exposure on shell components, enzymatic activities and proteins status in the Mediterranean clam Venerupis decussata

    Energy Technology Data Exchange (ETDEWEB)

    Sellami, Badreddine, E-mail: sellamibadreddine@gmail.com [Laboratory of Environment Biomonitoring, Coastal Ecology Unit, Faculty of Sciences of Bizerta, University of Carthage, 7021 Zarzouna (Tunisia); Khazri, Abdelhafidh [Laboratory of Environment Biomonitoring, Coastal Ecology Unit, Faculty of Sciences of Bizerta, University of Carthage, 7021 Zarzouna (Tunisia); Mezni, Amine [Unit of Research 99/UR12-30, Department of Chemistry, Faculty of Sciences of Bizerte, 7021 Jarzouna (Tunisia); Louati, Héla; Dellali, Mohamed; Aissa, Patricia; Mahmoudi, Ezzeddine; Beyrem, Hamouda [Laboratory of Environment Biomonitoring, Coastal Ecology Unit, Faculty of Sciences of Bizerta, University of Carthage, 7021 Zarzouna (Tunisia); Sheehan, David, E-mail: d.sheehan@ucc.ie [Environmental Research Institute and Department of Biochemistry, University College Cork, Western Gateway Building, Western Road, Cork (Ireland)

    2015-01-15

    Highlights: • We assessed the toxicity of anthracene, permethrin and their mixture on clams. • Tissue- and stressor-dependent changes were observed in biochemical responses. • Permethrin induces a phase transition from aragonite to calcite in shell structure. • Interactive effects were observed on digestive gland and gill biomarkers. • Both approaches offer a new perspective on risk assessment of organic pollution. - Abstract: Anthracene (ANT) and permethrin (PER) are two of the more toxic compounds reaching the marine environment. This study aimed to determine the impact of these molecules on Venerupis decussata, an economically important species cultured on the Tunisian coast. Shell structure and its possible transformation upon exposure to the two contaminants were studied by X-ray diffraction and gravimetric analyses. Results revealed a phase transition in shell composition from aragonite to calcite after exposure to PER and to a mixture of PER and ANT (Mix), but not after exposure to ANT alone. Catalase (CAT), superoxide dismutase (SOD) and glutathione transferase (GST) activities were determined in the digestive gland and gills after exposure to ANT, PER and Mix to assess the impact of the contamination on the oxidative status of V. decussata. Enzyme activities increased in the digestive gland after PER treatment and in the gills after ANT treatment. PER exposure significantly reduced the levels of free thiols and increased the levels of carbonylated proteins in the digestive gland compared to controls. In contrast, ANT exposure significantly reduced free thiols and increased the number of carbonylated proteins in the gills. Mix induced additive effects as measured by both enzymatic and proteomic approaches. The present study suggests that PER has a strong effect on shell structure, that PER and ANT exposure generate compound-dependent oxidative stress in the tissues of V. decussata, and that a mixture of the two compounds has synergistic effects on biochemical response.

  16. Effect of permethrin, anthracene and mixture exposure on shell components, enzymatic activities and proteins status in the Mediterranean clam Venerupis decussata

    International Nuclear Information System (INIS)

    Sellami, Badreddine; Khazri, Abdelhafidh; Mezni, Amine; Louati, Héla; Dellali, Mohamed; Aissa, Patricia; Mahmoudi, Ezzeddine; Beyrem, Hamouda; Sheehan, David

    2015-01-01

    Highlights: • We assessed the toxicity of anthracene, permethrin and their mixture on clams. • Tissue- and stressor-dependent changes were observed in biochemical responses. • Permethrin induces a phase transition from aragonite to calcite in shell structure. • Interactive effects were observed on digestive gland and gill biomarkers. • Both approaches offer a new perspective on risk assessment of organic pollution. - Abstract: Anthracene (ANT) and permethrin (PER) are two of the more toxic compounds reaching the marine environment. This study aimed to determine the impact of these molecules on Venerupis decussata, an economically important species cultured on the Tunisian coast. Shell structure and its possible transformation upon exposure to the two contaminants were studied by X-ray diffraction and gravimetric analyses. Results revealed a phase transition in shell composition from aragonite to calcite after exposure to PER and to a mixture of PER and ANT (Mix), but not after exposure to ANT alone. Catalase (CAT), superoxide dismutase (SOD) and glutathione transferase (GST) activities were determined in the digestive gland and gills after exposure to ANT, PER and Mix to assess the impact of the contamination on the oxidative status of V. decussata. Enzyme activities increased in the digestive gland after PER treatment and in the gills after ANT treatment. PER exposure significantly reduced the levels of free thiols and increased the levels of carbonylated proteins in the digestive gland compared to controls. In contrast, ANT exposure significantly reduced free thiols and increased the number of carbonylated proteins in the gills. Mix induced additive effects as measured by both enzymatic and proteomic approaches. The present study suggests that PER has a strong effect on shell structure, that PER and ANT exposure generate compound-dependent oxidative stress in the tissues of V. decussata, and that a mixture of the two compounds has synergistic effects on biochemical response.

  17. Causal Entropic Forces

    Science.gov (United States)

    Wissner-Gross, A. D.; Freer, C. E.

    2013-04-01

    Recent advances in fields ranging from cosmology to computer science have hinted at a possible deep connection between intelligence and entropy maximization, but no formal physical relationship between them has yet been established. Here, we explicitly propose a first step toward such a relationship in the form of a causal generalization of entropic forces that we find can cause two defining behaviors of the human “cognitive niche”—tool use and social cooperation—to spontaneously emerge in simple physical systems. Our results suggest a potentially general thermodynamic model of adaptive behavior as a nonequilibrium process in open systems.

  18. Analogy in causal inference: rethinking Austin Bradford Hill's neglected consideration.

    Science.gov (United States)

    Weed, Douglas L

    2018-05-01

    The purpose of this article was to rethink and resurrect Austin Bradford Hill's "criterion" of analogy as an important consideration in causal inference. In epidemiology today, analogy is either completely ignored (e.g., in many textbooks), equated with biologic plausibility or coherence, or aligned with the scientist's imagination. None of these examples, however, captures Hill's description of analogy. His words suggest that there may be something gained by contrasting two bodies of evidence, one from an established causal relationship, the other not. Coupled with developments in the methods of systematic assessment of evidence (including but not limited to meta-analysis), analogy can be restructured as a key component in causal inference. This new approach will require that a collection (a library) of known cases of causal inference (i.e., bodies of evidence involving established causal relationships) be developed. This library would likely include causal assessments by organizations such as the International Agency for Research on Cancer, the National Toxicology Program, and the United States Environmental Protection Agency. In addition, a process for describing the key features of a causal relationship would need to be developed, along with what will be considered paradigm cases of causation. Finally, it will be important to develop ways to objectively compare a "new" body of evidence with the relevant paradigm case of causation. Analogy, along with all other existing methods and causal considerations, may improve our ability to identify causal relationships. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Phase behaviour of symmetric binary mixtures with partially miscible components in slit-like pores. Application of the fundamental measure density functional approach

    CERN Document Server

    Martínez, A; Patrykiejew, A; Sokolowski, S

    2003-01-01

    We investigate adsorption in slit-like pores of model symmetric binary mixtures exhibiting demixing in bulk phase, by using a density functional approach. Our focus is on the evaluation of the first-order phase transitions in adsorbed fluids and the lines separating mixed and demixed phases. The scenario for phase transitions is sensitive to the pore width and to the energy of adsorption. Both these parameters can change the phase diagrams of the confined fluid. In particular, for relatively wide pores and for strong wall-fluid interactions, the demixing line can precede the first-order transition. Moreover, a competition between layering transitions and demixing within particular layers also leads to further enrichment of the phase diagram.

  20. Trokomponentne granulisane smeše na bazi heksogena, aluminijuma i polistirena kao flegmatizatora / Three-component granular mixtures on the basis of hexogen, aluminium and polystirene as a binder

    Directory of Open Access Journals (Sweden)

    Mirjana N. Lukić-Anđelković

    2009-10-01

    Full Text Available This paper describes the preparation and characteristics of three-component RDX/Al/PS mixtures. The hexogen binder used is polystyrene, a thermostable polymer with good coating properties for bonding explosives. A constant content of 5% m/m PS was applied for different contents of hexogen and aluminium; the aluminium content in the mixtures was 10, 15, 20, and 25% m/m. The composition of the bonded explosives was examined and the detonation velocity of the mixtures was determined.

  1. Causal inference based on counterfactuals

    Directory of Open Access Journals (Sweden)

    Höfler M

    2005-09-01

    Full Text Available Abstract Background: The counterfactual or potential-outcome model has become increasingly standard for causal inference in epidemiological and medical studies. Discussion: This paper provides an overview of the counterfactual and related approaches. A variety of conceptual as well as practical issues that arise when estimating causal effects are reviewed. These include causal interactions, imperfect experiments, adjustment for confounding, time-varying exposures, competing risks and the probability of causation. It is argued that the counterfactual model of causal effects captures the main aspects of causality in the health sciences and relates to many statistical procedures. Summary: Counterfactuals are the basis of causal inference in medicine and epidemiology. Nevertheless, the estimation of counterfactual differences poses several difficulties, primarily in observational studies. These problems, however, reflect fundamental barriers only when learning from observations, and this does not invalidate the counterfactual concept.
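    The counterfactual model and adjustment for confounding can be illustrated with a small simulation (all numbers invented): a confounder biases the naive treated-vs-untreated contrast, while back-door adjustment (standardization over the confounder) recovers the true causal effect defined by the potential outcomes:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 200_000

    # Invented example: a binary confounder z affects both treatment and outcome.
    z = rng.binomial(1, 0.5, n)
    t = rng.binomial(1, np.where(z == 1, 0.8, 0.2))  # treatment depends on z
    y0 = 1.0 + 2.0 * z + rng.normal(0.0, 1.0, n)     # potential outcome, untreated
    y1 = y0 + 1.0                                    # true individual causal effect: 1.0
    y = np.where(t == 1, y1, y0)                     # only one potential outcome observed

    naive = y[t == 1].mean() - y[t == 0].mean()      # confounded contrast, biased upward

    # Back-door adjustment: average the stratum-specific contrasts weighted by P(z).
    adjusted = sum(
        (y[(t == 1) & (z == v)].mean() - y[(t == 0) & (z == v)].mean()) * (z == v).mean()
        for v in (0, 1)
    )
    print(round(float(naive), 2), round(float(adjusted), 2))
    ```

    The adjusted estimate is close to the true effect of 1.0, while the naive contrast is not; this is the counterfactual gap that the observational-study difficulties in the abstract refer to.
    
    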

  2. Causal Reasoning with Mental Models

    Science.gov (United States)

    2014-08-08

    In broad terms, three strands of evidence corroborate the model theory of causal deductions.

  3. Path integrals on causal sets

    International Nuclear Information System (INIS)

    Johnston, Steven

    2009-01-01

    We describe a quantum mechanical model for particle propagation on a causal set. The model involves calculating a particle propagator by summing amplitudes assigned to trajectories within the causal set. This 'discrete path integral' is calculated using a matrix geometric series. Amplitudes are given which, when the causal set is generated by sprinkling points into 1+1 or 3+1 Minkowski spacetime, ensure that the particle propagator agrees, in a suitable sense, with the retarded causal propagator for the Klein-Gordon equation.

  4. Causality Statistical Perspectives and Applications

    CERN Document Server

    Berzuini, Carlo; Bernardinelli, Luisa

    2012-01-01

    A state of the art volume on statistical causality Causality: Statistical Perspectives and Applications presents a wide-ranging collection of seminal contributions by renowned experts in the field, providing a thorough treatment of all aspects of statistical causality. It covers the various formalisms in current use, methods for applying them to specific problems, and the special requirements of a range of examples from medicine, biology and economics to political science. This book:Provides a clear account and comparison of formal languages, concepts and models for statistical causality. Addr

  5. Structural Equations and Causal Explanations: Some Challenges for Causal SEM

    Science.gov (United States)

    Markus, Keith A.

    2010-01-01

    One common application of structural equation modeling (SEM) involves expressing and empirically investigating causal explanations. Nonetheless, several aspects of causal explanation that have an impact on behavioral science methodology remain poorly understood. It remains unclear whether applications of SEM should attempt to provide complete…

  6. Influence of Physiological Gastrointestinal Surfactant Ratio on the Equilibrium Solubility of BCS Class II Drugs Investigated Using a Four Component Mixture Design

    OpenAIRE

    Zhou, Zhou; Dunn, Claire; Khadra, Ibrahim; Wilson, Clive G.; Halbert, Gavin W.

    2017-01-01

    The absorption of poorly water-soluble drugs is influenced by the luminal gastrointestinal fluid content and composition, which control solubility. Simulated intestinal fluids have been introduced into dissolution testing including endogenous amphiphiles and digested lipids at physiological levels; however, in vivo individual variation exists in the concentrations of these components, which will alter drug absorption through an effect on solubility. The use of a factorial design of experiment...

  7. A Bayesian nonparametric approach to causal inference on quantiles.

    Science.gov (United States)

    Xu, Dandan; Daniels, Michael J; Winterstein, Almut G

    2018-02-25

    We propose a Bayesian nonparametric approach (BNP) for causal inference on quantiles in the presence of many confounders. In particular, we define relevant causal quantities and specify BNP models to avoid bias from restrictive parametric assumptions. We first use Bayesian additive regression trees (BART) to model the propensity score and then construct the distribution of potential outcomes given the propensity score using a Dirichlet process mixture (DPM) of normals model. We thoroughly evaluate the operating characteristics of our approach and compare it to Bayesian and frequentist competitors. We use our approach to answer an important clinical question involving acute kidney injury using electronic health records. © 2018, The International Biometric Society.
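    The two-stage idea in this record (first model the propensity score, then model outcomes given the propensity score) can be sketched with simpler parametric stand-ins; the paper's BART and DPM models are beyond a short example, so this hypothetical sketch uses a Newton-fitted logistic propensity model and inverse-probability weighting on simulated data:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000

    # Simulated data: a continuous confounder z drives both treatment and outcome.
    z = rng.normal(0.0, 1.0, n)
    t = rng.binomial(1, 1.0 / (1.0 + np.exp(-z)))    # treatment, confounded by z
    y = 2.0 * t + 3.0 * z + rng.normal(0.0, 1.0, n)  # true causal effect of t is 2.0

    def fit_logistic(X, t, steps=25):
        """Stage 1 stand-in: logistic-regression propensity model via Newton's method."""
        beta = np.zeros(X.shape[1])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-X @ beta))
            grad = X.T @ (t - p)
            hess = (X * (p * (1.0 - p))[:, None]).T @ X
            beta += np.linalg.solve(hess, grad)
        return beta

    X = np.column_stack([np.ones(n), z])
    e_hat = 1.0 / (1.0 + np.exp(-X @ fit_logistic(X, t)))  # estimated propensity scores

    # Stage 2 stand-in: Hajek-normalized inverse-probability weighting.
    ate = (np.sum(t * y / e_hat) / np.sum(t / e_hat)
           - np.sum((1 - t) * y / (1 - e_hat)) / np.sum((1 - t) / (1 - e_hat)))
    print(round(float(ate), 2))
    ```

    The estimate lands near the true effect of 2.0. The paper replaces both stages with flexible Bayesian models (BART for the propensity score, a DPM of normals for the outcome distribution) precisely to avoid the parametric assumptions this sketch makes.
    
    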

  8. Re-thinking local causality

    NARCIS (Netherlands)

    Friederich, Simon

    There is widespread belief in a tension between quantum theory and special relativity, motivated by the idea that quantum theory violates J. S. Bell's criterion of local causality, which is meant to implement the causal structure of relativistic space-time. This paper argues that if one takes the

  9. Expert Causal Reasoning and Explanation.

    Science.gov (United States)

    Kuipers, Benjamin

    The relationship between cognitive psychologists and researchers in artificial intelligence carries substantial benefits for both. An ongoing investigation of causal reasoning in medical problem-solving systems illustrates this interaction. This paper traces a dialectic of sorts in which three different types of causal reasoning for medical…

  10. Introduction to causal dynamical triangulations

    DEFF Research Database (Denmark)

    Görlich, Andrzej

    2013-01-01

    The method of causal dynamical triangulations is a non-perturbative and background-independent approach to a quantum theory of gravity. In this review we present recent results obtained within the four-dimensional model of causal dynamical triangulations. We describe the phase structure of the model...

  11. Covariation in Natural Causal Induction.

    Science.gov (United States)

    Cheng, Patricia W.; Novick, Laura R.

    1991-01-01

    Biases and models usually offered by cognitive and social psychology and by philosophy to explain causal induction are evaluated with respect to focal sets (contextually determined sets of events over which covariation is computed). A probabilistic contrast model is proposed as underlying covariation computation in natural causal induction. (SLD)

  12. Causal aspects of diffraction

    International Nuclear Information System (INIS)

    Crawford, G.N.

    1981-01-01

    The analysis is directed at a causal description of photon diffraction, which is explained in terms of a wave exerting real forces and providing actual guidance to each quantum of energy. An undulatory PSI wave is associated with each photon, and this wave is assumed to imply more than an informative probability function, so that it actually carries real energy, in much the same way as does an electro-magnetic wave. Whether or not it may be in some way related to the electromagnetic wave is left as a matter of on-going concern. A novel application of the concept of a minimum energy configuration is utilized; that is, a system of energy quanta seeks out relative positions and orientations of least mutual energy, much as an electron seeks its Bohr radius as a position of least mutual energy. Thus the concept implies more a guiding interaction of the PSI waves than an interfering cancellation of these waves. Similar concepts have been suggested by L. de Broglie and D. Bohm

  13. Clear message for causality

    Energy Technology Data Exchange (ETDEWEB)

    Steinberg, Aephraim M. [Institute for Experimental Physics, University of Vienna, Vienna (Austria)

    2003-12-01

    Experiment confirms that information cannot be transmitted faster than the speed of light. Ever since Einstein stated that nothing can travel faster than light, physicists have delighted in finding exceptions. One after another, observations of such 'superluminal' propagation have been made. However, while some image or pattern- such as the motion of a spotlight projected on a distant wall - might have appeared to travel faster than light, it seemed that there was no way to use the superluminal effect to transmit energy or information. In recent years, the superluminal propagation of light pulses through certain media has led to renewed controversy. In 1995, for example, Guenther Nimtz of the University of Cologne encoded Mozart's 40th Symphony on a microwave beam, which he claimed to have transmitted at a speed faster than light. Others maintain that such a violation of Einstein's speed limit would wreak havoc on our most fundamental ideas about causality, allowing an effect to precede its cause. Relativity teaches us that sending a signal faster than light would be equivalent to sending it backwards in time. (U.K.)

  14. Kernel Method for Nonlinear Granger Causality

    Science.gov (United States)

    Marinazzo, Daniele; Pellicoro, Mario; Stramaglia, Sebastiano

    2008-04-01

    Important information on the structure of complex systems can be obtained by measuring to what extent the individual components exchange information among each other. The linear Granger approach, to detect cause-effect relationships between time series, has emerged in recent years as a leading statistical technique to accomplish this task. Here we generalize Granger causality to the nonlinear case using the theory of reproducing kernel Hilbert spaces. Our method performs linear Granger causality in the feature space of suitable kernel functions, assuming arbitrary degree of nonlinearity. We develop a new strategy to cope with the problem of overfitting, based on the geometry of reproducing kernel Hilbert spaces. Applications to coupled chaotic maps and physiological data sets are presented.
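    For intuition, the linear Granger test that this record generalizes with kernels can be sketched directly: series x Granger-causes y if adding x's past reduces the residual variance of predicting y from y's own past. A minimal sketch on simulated data with one lag (the kernel version would replace the linear regressions with regressions in a kernel feature space):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 5000

    # Simulated pair of AR(1) series in which x drives y but not the reverse.
    x = np.zeros(n)
    y = np.zeros(n)
    for i in range(1, n):
        x[i] = 0.5 * x[i - 1] + rng.normal()
        y[i] = 0.5 * y[i - 1] + 0.4 * x[i - 1] + rng.normal()

    def granger_ratio(target, driver, lag=1):
        """Residual variance of predicting `target` from its own past alone, divided
        by the residual variance when the driver's past is added. Values well above 1
        indicate (linear) Granger causality from driver to target."""
        past_t, past_d, future = target[:-lag], driver[:-lag], target[lag:]
        own = np.column_stack([past_t, np.ones_like(past_t)])
        both = np.column_stack([past_t, past_d, np.ones_like(past_t)])
        res_own = future - own @ np.linalg.lstsq(own, future, rcond=None)[0]
        res_both = future - both @ np.linalg.lstsq(both, future, rcond=None)[0]
        return res_own.var() / res_both.var()

    r_xy = granger_ratio(y, x)  # clearly above 1: x Granger-causes y
    r_yx = granger_ratio(x, y)  # close to 1: y does not Granger-cause x
    print(round(r_xy, 3), round(r_yx, 3))
    ```

    A nonlinear coupling would evade this linear test, which is the motivation for the kernel generalization in the record.
    
    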

  15. Mixtures Estimation and Applications

    CERN Document Server

    Mengersen, Kerrie; Titterington, Mike

    2011-01-01

    This book uses the EM (expectation-maximization) algorithm to simultaneously estimate the missing data and the unknown parameters associated with a data set. The parameters describe the component distributions of the mixture; these distributions may be continuous or discrete. The editors provide a complete account of the applications, mathematical structure and statistical analysis of finite mixture distributions, along with MCMC computational methods and a range of detailed discussions covering applications of the methods; the book features chapters from leading experts on the subject.
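As an illustration of the book's central tool, here is a minimal sketch (not taken from the book) of EM for a one-dimensional two-component Gaussian mixture; the function name and the toy data are illustrative:

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=200, seed=0):
    """Minimal EM for a 1-D Gaussian mixture: alternates the E-step
    (posterior component responsibilities) and the M-step (weighted MLE)."""
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, size=k, replace=False)   # init means at random data points
    var = np.full(k, np.var(x))
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances from weighted data
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

# Toy data: two well-separated components
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-3, 1, 400), rng.normal(3, 1, 600)])
w, mu, var = em_gmm_1d(data)
```

On well-separated data like this, the recovered means and weights land close to the generating values; with overlapping components, EM is sensitive to initialization and is usually restarted several times.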

  16. Principal stratification in causal inference.

    Science.gov (United States)

    Frangakis, Constantine E; Rubin, Donald B

    2002-03-01

    Many scientific problems require that treatment comparisons be adjusted for posttreatment variables, but the estimands underlying standard methods are not causal effects. To address this deficiency, we propose a general framework for comparing treatments adjusting for posttreatment variables that yields principal effects based on principal stratification. Principal stratification with respect to a posttreatment variable is a cross-classification of subjects defined by the joint potential values of that posttreatment variable under each of the treatments being compared. Principal effects are causal effects within a principal stratum. The key property of principal strata is that they are not affected by treatment assignment and therefore can be used just as any pretreatment covariate, such as age category. As a result, the central property of our principal effects is that they are always causal effects and do not suffer from the complications of standard posttreatment-adjusted estimands. We discuss briefly that such principal causal effects are the link between three recent applications with adjustment for posttreatment variables: (i) treatment noncompliance, (ii) missing outcomes (dropout) following treatment noncompliance, and (iii) censoring by death. We then attack the problem of surrogate or biomarker endpoints, where we show, using principal causal effects, that all current definitions of surrogacy, even when perfectly true, do not generally have the desired interpretation as causal effects of treatment on outcome. We go on to formulate estimands based on principal stratification and principal causal effects and show their superiority.

  17. Multi-response optimization using Taguchi design and principal component analysis for removing binary mixture of alizarin red and alizarin yellow from aqueous solution by nano γ-alumina.

    Science.gov (United States)

    Zolgharnein, Javad; Asanjrani, Neda; Bagtash, Maryam; Azimi, Gholamhasan

    2014-05-21

    The nanostructure of γ-alumina was used as an effective adsorbent for the simultaneous removal of a mixture of alizarin red and alizarin yellow from aqueous solutions. The Taguchi design and principal component analysis were applied to explore effective parameters for achieving a higher adsorption capacity and removal percentage of the binary mixture containing alizarin red and alizarin yellow. Seven factors, including temperature, contact time, initial pH value, shaker rate, sorbent dose, and the initial concentrations of alizarin red and alizarin yellow, were considered at three levels through the Taguchi technique. An L27 orthogonal array was used to determine the signal-to-noise (S/N) ratio. The removal percentage (R%) and adsorption capacity (q) of the above-mentioned dyes were then transformed into S/N ratios. The Taguchi method indicates that the solution pH makes the largest contribution to controlling the removal percentages of alizarin red and alizarin yellow. Under optimal conditions, maximum removal percentages of 99% and 78.5%, and uptake capacities of 54.4 and 39.0 mg g(-1), were obtained for alizarin red and alizarin yellow, respectively. Isotherm modeling and kinetic investigations showed that the Langmuir, modified Langmuir, and pseudo-second-order models describe both the adsorption equilibrium and the kinetic behavior well. Fourier transform infrared analysis also confirmed the involvement of the active sites of nano γ-alumina in the adsorption process.
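The larger-the-better S/N ratio used in this kind of Taguchi analysis has a standard form; a minimal sketch follows (the function name and replicate values are illustrative, not taken from the paper):

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-the-better signal-to-noise ratio:
    S/N = -10 * log10( (1/n) * sum(1 / y_i^2) )."""
    n = len(values)
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in values) / n)

# Illustrative replicate removal percentages for one factor setting
snr = sn_larger_is_better([99.0, 98.7, 99.2])
```

Applied to replicate removal percentages, a higher S/N ratio indicates a factor setting whose response is both high and consistent across replicates.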

  18. Methodology for the identification of triterpene mixture components by {sup 13} C NMR

    Energy Technology Data Exchange (ETDEWEB)

    Olea, Roberto S.G.

    1990-12-31

    This work describes a methodology for the identification of complex triterpene mixtures by {sup 13} C NMR. The use of {sup 13} C NMR techniques, such as the acquisition of noise-decoupled spectra and the DEPT 135 and DEPT 90 sequences, allowed the identification of components of triterpene mixtures with identical functionality through comparison of the observed {sup 13} C NMR chemical shifts with those reported in the literature. The method proved to be especially helpful in the identification of triterpenes by analysis of the chemical shifts assignable to doubly bonded carbons, since the particular position of such double bonds is characteristic of some triterpene skeletons. Application of this methodology indicated the presence of bauerenol, {alpha}-amyrin and {beta}-amyrin in Acmanthera latifolis Griseb. (Malpighiaceae); of germanicone, lupenone, {alpha}-amyrenone and {beta}-amyrenone in Alibertia macrophylla A. Rich. (Rubiaceae); of {alpha}-amyrin acetate, lupeol acetate and {beta}-amyrin acetate in Vernonia polyanthes Schreb. (Asteraceae); and of {alpha}-amyrenone, {beta}-amyrenone, boehmerone, friedelin, lupenone, {alpha}-amyrin, {beta}-amyrin and glutinol in Scoparia dulcis L. (Scrophulariaceae). (author). 37 refs., 93 figs.
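The comparison step of such a methodology - matching observed shifts against literature values within a tolerance - can be sketched as follows. The reference shifts below are approximate values for the olefinic carbons of two common triterpene skeletons and are included only for illustration, not as the paper's data:

```python
def match_triterpene(observed, library, tol=0.5):
    """Return library entries whose reference olefinic-carbon shifts (ppm)
    are all matched by some observed shift within `tol` ppm."""
    hits = []
    for name, ref_shifts in library.items():
        if all(any(abs(o - r) <= tol for o in observed) for r in ref_shifts):
            hits.append(name)
    return hits

# Approximate, illustrative reference shifts (ppm) for the C-12/C-13
# olefinic carbons of two skeleton types
library = {"olean-12-ene": [121.7, 145.2], "urs-12-ene": [124.3, 139.6]}
observed = [121.6, 145.0, 38.8, 29.7]
hits = match_triterpene(observed, library)
```

Because the double-bond carbons fall in skeleton-specific ranges, a match on that small set of shifts already narrows the candidate skeletons considerably.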

  19. Causal boundary for stably causal space-times

    International Nuclear Information System (INIS)

    Racz, I.

    1987-12-01

    The usual boundary constructions for space-times often yield an unsatisfactory boundary set. This problem is reviewed and a new solution is proposed. An explicit identification rule is given on the set of the ideal points of the space-time. This construction leads to a satisfactory boundary point set structure for stably causal space-times. The topological properties of the resulting causal boundary construction are examined. For the stably causal space-times each causal curve has a unique endpoint on the boundary set according to the extended Alexandrov topology. The extension of the space-time through the boundary is discussed. To describe the singularities the defined boundary sets have to be separated into two disjoint sets. (D.Gy.) 8 refs

  20. Discrete causal theory emergent spacetime and the causal metric hypothesis

    CERN Document Server

    Dribus, Benjamin F

    2017-01-01

    This book evaluates and suggests potentially critical improvements to causal set theory, one of the best-motivated approaches to the outstanding problems of fundamental physics. Spacetime structure is of central importance to physics beyond general relativity and the standard model. The causal metric hypothesis treats causal relations as the basis of this structure. The book develops the consequences of this hypothesis under the assumption of a fundamental scale, with smooth spacetime geometry viewed as emergent. This approach resembles causal set theory, but differs in important ways; for example, the relative viewpoint, emphasizing relations between pairs of events, and relationships between pairs of histories, is central. The book culminates in a dynamical law for quantum spacetime, derived via generalized path summation.

  1. [Using 2-DCOS to identify the molecular spectrum peaks for the isomer in the multi-component mixture gases Fourier transform infrared analysis].

    Science.gov (United States)

    Zhao, An-Xin; Tang, Xiao-Jun; Zhang, Zhong-Hua; Liu, Jun-Hua

    2014-10-01

    Generalized two-dimensional correlation spectroscopy and Fourier transform infrared spectroscopy were used to identify hydrocarbon isomers in mixed gases by enhancing the resolution of the absorption spectra. The Fourier transform infrared spectra of n-butane and iso-butane and the concentration-perturbed two-dimensional correlation infrared spectra were analyzed as an example. Across the full band and at the main absorption peak wavelengths, the Fourier transform infrared spectra of the single-component gases are similar; when the gases are mixed, the absorption peaks overlap and are difficult to identify. The synchronous and asynchronous spectra of the two-dimensional correlation spectrum can clearly distinguish iso-butane and n-butane and their respective characteristic absorption peak intensities. Iso-butane has strong characteristic absorption lines at 2,893 and 2,954 cm(-1), and n-butane at 2,895 and 2,965 cm(-1). The analysis in this paper preliminarily verifies that two-dimensional infrared correlation spectroscopy can be used for resolution enhancement in quantitative Fourier transform infrared analysis.
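The synchronous part of a generalized 2D correlation spectrum is simply the covariance of the perturbation-dependent (dynamic) spectra across channels; a minimal sketch with an illustrative toy data set, not the paper's spectra:

```python
import numpy as np

def synchronous_spectrum(spectra):
    """Generalized 2D-COS synchronous spectrum.

    spectra: (m, n) array of m spectra measured under increasing perturbation
    (here: concentration) at n wavenumber channels.
    Phi[j, k] > 0 means bands j and k vary in the same direction under the
    perturbation; Phi[j, k] < 0 means they vary in opposite directions.
    """
    dynamic = spectra - spectra.mean(axis=0)   # dynamic (mean-centered) spectra
    m = spectra.shape[0]
    return dynamic.T @ dynamic / (m - 1)

# Toy example: bands 0 and 1 grow together, band 2 shrinks as they grow
t = np.linspace(0.0, 1.0, 5)[:, None]
spectra = np.hstack([1.0 * t, 2.0 * t, -1.0 * t])
phi = synchronous_spectrum(spectra)
```

Cross-peaks of the synchronous map (off-diagonal entries of `phi`) are what let overlapped bands be assigned to the component whose concentration change they track.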

  2. Classical planning and causal implicatures

    DEFF Research Database (Denmark)

    Blackburn, Patrick Rowan; Benotti, Luciana

    In this paper we motivate and describe a dialogue manager (called Frolog) which uses classical planning to infer causal implicatures. A causal implicature is a type of Gricean relation implicature, a highly context dependent form of inference. As we shall see, causal implicatures are important for understanding the structure of task-oriented dialogues. Such dialogues locate conversational acts in contexts containing both pending tasks and the acts which bring them about. The ability to infer causal implicatures lets us interleave decisions about "how to sequence actions" with decisions about "when to generate clarification requests"; as a result we can model task-oriented dialogue as an interactive process locally structured by negotiation of the underlying task. We give several examples of Frolog-human dialog, discuss the limitations imposed by the classical planning paradigm, and indicate ...

  3. Functional equations with causal operators

    CERN Document Server

    Corduneanu, C

    2003-01-01

    Functional equations encompass most of the equations used in applied science and engineering: ordinary differential equations, integral equations of the Volterra type, equations with delayed argument, and integro-differential equations of the Volterra type. The basic theory of functional equations includes functional differential equations with causal operators. Functional Equations with Causal Operators explains the connection between equations with causal operators and the classical types of functional equations encountered by mathematicians and engineers. It details the fundamentals of linear equations and stability theory and provides several applications and examples.

  4. A novel method dependent only on the mixture information (MIM) for evaluating the toxicity of mixture

    International Nuclear Information System (INIS)

    Zhang Jin; Liu Shushen; Liu Hailing; Zhu Xiangwei; Mi Xiaojuan

    2011-01-01

    Compound contamination and toxicity interaction necessitate the development of models that provide insight into the combined toxicity of chemicals. In this paper, a novel and simple model dependent only on the mixture information (MIM) was developed. First, the concentration-response data of seven groups of binary and multi-component (pseudo-binary) mixtures with different mixture ratios to Vibrio qinghaiensis sp.-Q67 were determined using microplate toxicity analysis. Then, a suitable non-linear function was selected to fit the data. It was found that there are good linear correlations between the location parameter (α) and the mixture ratio (p) of a component, and between the steepness (β) and p. Based on these correlations, a mixture toxicity model independent of pure-component toxicity profiles was built. The model can be used to accurately estimate the toxicities of the seven groups of mixtures, which greatly simplifies the predictive procedure for the combined toxicity. - Highlights: → We develop a mixture toxicity model based only on mixture toxicity profiles. → We regard all multi-component mixtures as pseudo-binary mixtures. → The model is built by using a set of uniform design mixtures. → The model is validated by using a set of fixed concentration ratio mixtures. → The model can also predict the toxicity of external mixtures. - A novel method that depends only on mixture information, not on pure-component toxicity profiles, for evaluating combined toxicity.
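The key empirical step described here - linear trends of the location (α) and steepness (β) parameters in the mixture ratio p, plugged back into a concentration-response function to predict curves for unseen ratios - can be sketched as follows. The function names, the synthetic parameter trends, and the Weibull response form are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

def fit_parameter_trends(ratios, alphas, betas):
    """Fit the reported linear trends alpha(p) and beta(p): location and
    steepness of each concentration-response curve vary linearly with the
    mixture ratio p of a component."""
    a_coef = np.polyfit(ratios, alphas, 1)   # [slope, intercept]
    b_coef = np.polyfit(ratios, betas, 1)
    return a_coef, b_coef

def predict_response(c, p, a_coef, b_coef):
    """Weibull-type concentration-response E = 1 - exp(-exp(alpha + beta*log10 c))
    at concentration c for a mixture ratio p not used in the fit."""
    alpha = np.polyval(a_coef, p)
    beta = np.polyval(b_coef, p)
    return 1.0 - np.exp(-np.exp(alpha + beta * np.log10(c)))

# Synthetic, noiseless parameter trends for illustration
ratios = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
alphas = -1.0 + 1.5 * ratios
betas = 1.0 + 0.5 * ratios
a_coef, b_coef = fit_parameter_trends(ratios, alphas, betas)
```

Once the two linear trends are fitted, a full concentration-response curve for any intermediate ratio follows without any pure-component toxicity data, which is the point of the MIM approach.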

  5. Violation of causality in f( T) gravity

    Science.gov (United States)

    Otalora, G.; Rebouças, M. J.

    2017-11-01

    In the standard formulation, the f(T) field equations are not invariant under local Lorentz transformations, and thus the theory does not inherit the causal structure of special relativity. Actually, even local violation of causality can occur in this formulation of f(T) gravity. A locally Lorentz covariant f(T) gravity theory has been devised recently, and this local causality problem seems to have been overcome. The non-locality question, however, is left open. If gravitation is to be described by this covariant f(T) gravity theory there are a number of issues that ought to be examined in its context, including the question as to whether its field equations allow homogeneous Gödel-type solutions, which necessarily lead to violation of causality on a non-local scale. Here, to look into the potentialities and difficulties of the covariant f(T) theories, we examine whether they admit Gödel-type solutions. We take a combination of a perfect fluid with electromagnetic plus a scalar field as source, and determine a general Gödel-type solution, which contains special solutions in which the essential parameter of Gödel-type geometries, m^2, defines any class of homogeneous Gödel-type geometries. We show that solutions of the trigonometric and linear classes (m^2 electromagnetic field matter component. We extended to the context of covariant f(T) gravity a theorem which ensures that any perfect-fluid homogeneous Gödel-type solution defines the same set of Gödel tetrads h_A^{μ} up to a Lorentz transformation. We also showed that the single massless scalar field generates a Gödel-type solution with no closed time-like curves. Even though the covariant f(T) gravity restores Lorentz covariance of the field equations and the local validity of the causality principle, the bare existence of the Gödel-type solutions makes apparent that the covariant formulation of f(T) gravity does not preclude non-local violation of causality in the form of closed time-like curves.

  6. Classical planning and causal implicatures

    DEFF Research Database (Denmark)

    Blackburn, Patrick Rowan; Benotti, Luciana

    In this paper we motivate and describe a dialogue manager (called Frolog) which uses classical planning to infer causal implicatures. A causal implicature is a type of Gricean relation implicature, a highly context dependent form of inference. As we shall see, causal implicatures are important for understanding the structure of task-oriented dialogues. Such dialogues locate conversational acts in contexts containing both pending tasks and the acts which bring them about. The ability to infer causal implicatures lets us interleave decisions about "how to sequence actions" with decisions about "when to generate clarification requests"; as a result we can model task-oriented dialogue as an interactive process locally structured by negotiation of the underlying task. We give several examples of Frolog-human dialog, discuss the limitations imposed by the classical planning paradigm, and indicate ...

  7. Consciousness and the "Causal Paradox"

    OpenAIRE

    Velmans, Max

    1996-01-01

    Viewed from a first-person perspective consciousness appears to be necessary for complex, novel human activity - but viewed from a third-person perspective consciousness appears to play no role in the activity of brains, producing a "causal paradox". To resolve this paradox one needs to distinguish consciousness of processing from consciousness accompanying processing or causing processing. Accounts of consciousness/brain causal interactions switch between first- and third-person perspectives...

  8. Identifying Mixtures of Mixtures Using Bayesian Estimation

    Science.gov (United States)

    Malsiner-Walli, Gertraud; Frühwirth-Schnatter, Sylvia; Grün, Bettina

    2017-01-01

    The use of a finite mixture of normal distributions in model-based clustering allows us to capture non-Gaussian data clusters. However, identifying the clusters from the normal components is challenging and in general either achieved by imposing constraints on the model or by using post-processing procedures. Within the Bayesian framework, we propose a different approach based on sparse finite mixtures to achieve identifiability. We specify a hierarchical prior, where the hyperparameters are carefully selected such that they are reflective of the cluster structure aimed at. In addition, this prior allows us to estimate the model using standard MCMC sampling methods. In combination with a post-processing approach which resolves the label switching issue and results in an identified model, our approach allows us to simultaneously (1) determine the number of clusters, (2) flexibly approximate the cluster distributions in a semiparametric way using finite mixtures of normals and (3) identify cluster-specific parameters and classify observations. The proposed approach is illustrated in two simulation studies and on benchmark datasets. Supplementary materials for this article are available online. PMID:28626349

  9. Causality and analyticity in optics

    International Nuclear Information System (INIS)

    Nussenzveig, H.M.

    In order to provide an overall picture of the broad range of optical phenomena that are directly linked with the concepts of causality and analyticity, the following topics are briefly reviewed, emphasizing recent developments: 1) Derivation of dispersion relations for the optical constants of general linear media from causality. Application to the theory of natural optical activity. 2) Derivation of sum rules for the optical constants from causality and from the short-time response function (asymptotic high-frequency behavior). Average spectral behavior of optical media. Applications. 3) Role of spectral conditions. Analytic properties of coherence functions in quantum optics. Reconstruction theorem. 4) Phase retrieval problems. 5) Inverse scattering problems. 6) Solution of nonlinear evolution equations in optics by inverse scattering methods. Application to self-induced transparency. Causality in nonlinear wave propagation. 7) Analytic continuation in frequency and angular momentum. Complex singularities. Resonances and natural-mode expansions. Regge poles. 8) Wigner's causal inequality. Time delay. Spatial displacements in total reflection. 9) Analyticity in diffraction theory. Complex angular momentum theory of Mie scattering. Diffraction as a barrier tunnelling effect. Complex trajectories in optics. (Author)

  10. Hierarchical organisation of causal graphs

    International Nuclear Information System (INIS)

    Dziopa, P.

    1993-01-01

    This paper deals with the design of a supervision system using a hierarchy of models formed by graphs, in which the variables are the nodes and the causal relations between the variables are the arcs. To obtain a representation of the variables' evolutions which contains only the relevant features of their real evolutions, the causal relations are completed with qualitative transfer functions (QTFs) which reproduce roughly the behaviour of classical transfer functions. Major improvements have been made in the building of the hierarchical organization. First, the basic variables of the uppermost level and the causal relations between them are chosen. The next graph is built by adding intermediary variables to the upper graph. When the undermost graph has been built, the transfer function parameters corresponding to its causal relations are identified. The second task consists in the upwelling of the information from the undermost graph to the uppermost one. A fusion procedure for the causal relations has been designed to compute the QTFs relevant for each level. This procedure aims to reduce the number of parameters needed to represent an evolution at a high level of abstraction. These techniques have been applied to the hierarchical modelling of a nuclear process. (authors). 8 refs., 12 figs

  11. Dynamically redundant particle components in mixtures

    International Nuclear Information System (INIS)

    Lukacs, B.; Martinas, K.

    1984-10-01

    Examples are shown for cases in which the number of different kinds of particles in a system is not necessarily equal to the number of particle degrees of freedom in the thermodynamical sense, and at the same time, the observed dynamics of the evolution of the system does not indicate a definite number of degrees of freedom. The possibility of introducing dynamically redundant particles is discussed. (author)

  12. Effects of Crude Oil/Dispersant Mixture and Dispersant Components on PPARγ Activity in Vitro and in Vivo: Identification of Dioctyl Sodium Sulfosuccinate (DOSS; CAS #577-11-7) as a Probable Obesogen.

    Science.gov (United States)

    Temkin, Alexis M; Bowers, Robert R; Magaletta, Margaret E; Holshouser, Steven; Maggi, Adriana; Ciana, Paolo; Guillette, Louis J; Bowden, John A; Kucklick, John R; Baatz, John E; Spyropoulos, Demetri D

    2016-01-01

    DOSS is a putative obesogen worthy of further study, including epidemiological and clinical investigations into laxative prescriptions consisting of DOSS. Temkin AM, Bowers RR, Magaletta ME, Holshouser S, Maggi A, Ciana P, Guillette LJ, Bowden JA, Kucklick JR, Baatz JE, Spyropoulos DD. 2016. Effects of crude oil/dispersant mixture and dispersant components on PPARγ activity in vitro and in vivo: identification of dioctyl sodium sulfosuccinate (DOSS; CAS #577-11-7) as a probable obesogen. Environ Health Perspect 124:112-119; http://dx.doi.org/10.1289/ehp.1409672.

  13. Entropy for theories with indefinite causal structure

    International Nuclear Information System (INIS)

    Markes, Sonia; Hardy, Lucien

    2011-01-01

    Any theory with definite causal structure has a defined past and future, be it defined by light cones or an absolute time scale. Entropy is a concept that has traditionally relied on a definite notion of causality. However, without a definite notion of causality, the concept of entropy is not entirely lost. Indefinite causal structure results from combining probabilistic predictions and dynamical space-time. The causaloid framework lays the mathematical groundwork for treating indefinite causal structure. In this paper, we build on the causaloid mathematics and define a causally-unbiased entropy for an indefinite causal structure. In defining a causally-unbiased entropy, an emergent idea of causality arises in the form of a measure of causal connectedness, termed the Q factor.

  14. A Causal Theory of Modality

    Directory of Open Access Journals (Sweden)

    José Tomás Alvarado

    2009-08-01

    This work presents a causal conception of metaphysical modality in which a state of affairs is metaphysically possible if and only if it can be caused (in the past, the present or the future) by current entities. The conception is contrasted with what is called the "combinatorial" conception of modality, in which everything can co-exist with anything else. This work explains how the notion of 'causality' should be construed in the causal theory, how the modalities thus defined differ from nomological modality, how accessibility relations between possible worlds should be interpreted, and what the relation is between the causal conception and the necessity of origin.

  15. Introductive remarks on causal inference

    Directory of Open Access Journals (Sweden)

    Silvana A. Romio

    2013-05-01

    One of the more challenging issues in epidemiological research is being able to provide an unbiased estimate of the causal exposure-disease effect, to assess the possible etiological mechanisms and the implications for public health. A major source of bias is confounding, which can spuriously create or mask the causal relationship. In the last ten years, methodological research has been developed to better define the concept of causation in epidemiology, and some important achievements have resulted in new statistical models. In this review, we aim to show how a technique well known to statisticians, i.e. standardization, can be seen as a method to estimate causal effects, equivalent under certain conditions to the inverse probability of treatment weighting procedure.
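The equivalence mentioned at the end can be demonstrated on a toy data set: with a discrete confounder and nonparametric (saturated) models, standardization and inverse probability of treatment weighting give exactly the same answer. This sketch uses illustrative data, not data from the review:

```python
# Toy confounded data: rows of (L, A, Y) -- confounder, exposure, outcome.
# Constructed for illustration; any data set with a discrete confounder works.
data = ([(0, 0, 0)] * 30 + [(0, 0, 1)] * 10 + [(0, 1, 0)] * 5 + [(0, 1, 1)] * 5
        + [(1, 0, 0)] * 5 + [(1, 0, 1)] * 5 + [(1, 1, 0)] * 10 + [(1, 1, 1)] * 30)

def standardization(data):
    """Sum over l of [E(Y|A=1,L=l) - E(Y|A=0,L=l)] * P(L=l)."""
    n = len(data)
    effect = 0.0
    for l in {row[0] for row in data}:
        stratum = [row for row in data if row[0] == l]
        y1 = [r[2] for r in stratum if r[1] == 1]
        y0 = [r[2] for r in stratum if r[1] == 0]
        effect += (sum(y1) / len(y1) - sum(y0) / len(y0)) * len(stratum) / n
    return effect

def iptw(data):
    """mean(A*Y/e(L)) - mean((1-A)*Y/(1-e(L))), with e(l) = P(A=1|L=l)."""
    n = len(data)
    e = {}
    for l in {row[0] for row in data}:
        stratum = [row for row in data if row[0] == l]
        e[l] = sum(r[1] for r in stratum) / len(stratum)
    t1 = sum(r[1] * r[2] / e[r[0]] for r in data) / n
    t0 = sum((1 - r[1]) * r[2] / (1 - e[r[0]]) for r in data) / n
    return t1 - t0
```

On this data the crude risk difference is 0.4, inflated by confounding, while both adjusted estimators agree exactly on 0.25: the algebraic equivalence the review alludes to.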

  16. Quantum theory and local causality

    CERN Document Server

    Hofer-Szabó, Gábor

    2018-01-01

    This book summarizes the results of research the authors have pursued in recent years on the problem of implementing Bell's notion of local causality in local physical theories and relating it to other important concepts and principles in the foundations of physics, such as the Common Cause Principle, Bell's inequalities, the EPR (Einstein-Podolsky-Rosen) scenario, and various other locality and causality concepts. The book is intended for philosophers of science with an interest in the formal background of the sciences, philosophers of physics, and physicists working in the foundations of physics.

  17. Causal feedbacks in climate change

    NARCIS (Netherlands)

    Nes, van E.H.; Scheffer, M.; Brovkin, V.; Lenton, T.M.; Ye, H.; Deyle, E.; Sugihara, G.

    2015-01-01

    The statistical association between temperature and greenhouse gases over glacial cycles is well documented [1], but the causality behind this correlation remains difficult to extract directly from the data. A time lag of CO2 behind Antarctic temperature - originally thought to hint at a driving role for ...

  18. Granger Causality and Unit Roots

    DEFF Research Database (Denmark)

    Rodríguez-Caballero, Carlos Vladimir; Ventosa-Santaulària, Daniel

    2014-01-01

    ..., eventually rejecting the null hypothesis even when the series are independent of each other. Moreover, controlling for these deterministic elements (in the auxiliary regressions of the test) does not preclude the possibility of drawing erroneous inferences. Granger-causality tests should not be used under stochastic nonstationarity, a property typically found in many macroeconomic variables.
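The spurious-inference phenomenon behind this warning is easy to reproduce: two independent random walks are far more likely to look strongly correlated than two independent white-noise series, so regression-based tests over-reject. A minimal sketch (function name and parameters are illustrative):

```python
import numpy as np

def spurious_fraction(walk, n=200, reps=500, seed=1):
    """Fraction of replications in which two *independent* series show
    |sample correlation| > 0.5. Random walks (unit-root processes) trigger
    this far more often than stationary white noise."""
    rng = np.random.default_rng(seed)
    count = 0
    for _ in range(reps):
        a, b = rng.normal(size=(2, n))
        if walk:
            a, b = a.cumsum(), b.cumsum()   # turn the innovations into random walks
        if abs(np.corrcoef(a, b)[0, 1]) > 0.5:
            count += 1
    return count / reps
```

The same mechanism inflates the auxiliary regressions of a Granger-causality test, which is why the abstract recommends against applying the test to stochastically nonstationary series without appropriate pre-treatment.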

  19. Separation of gas mixtures

    International Nuclear Information System (INIS)

    1981-01-01

    Apparatus is described for the separation of a gaseous plasma mixture into components in some of which the original concentration of a specific ion has been greatly increased or decreased, comprising: a source for converting the gaseous mixture into a train of plasma packets; an open-ended vessel with a main section and at least one branch section, adapted to enclose along predetermined tracks the original plasma packets in the main section, and the separated plasma components in the branch sections; drive means for generating travelling magnetic waves along the predetermined tracks with the magnetic flux vector of the waves transverse to each of the tracks; and means for maintaining phase coherence between the plasma packets and the magnetic waves at a value needed for accelerating the components of the packets to different velocities and in such different directions that the plasma of each packet is divided into distinctly separate packets in some of which the original concentration of a specific ion has been greatly increased or decreased, and which plasma packets are collected from the branch sections of the vessels. (author)

  20. Liquid class predictor for liquid handling of complex mixtures

    Science.gov (United States)

    Seglke, Brent W [San Ramon, CA; Lekin, Timothy P [Livermore, CA

    2008-12-09

    A method of establishing liquid classes of complex mixtures for liquid handling equipment. The mixtures are composed of components and the equipment has equipment parameters. The first step comprises preparing a response curve for the components. The next step comprises using the response curve to prepare a response indicator for the mixtures. The next step comprises deriving a model that relates the components and the mixtures to establish the liquid classes.

  1. Simulation of mixture microstructures via particle packing models and their direct comparison with real mixtures

    Science.gov (United States)

    Gulliver, Eric A.

    ... particle-size distributions or mixture composition. Control charts based on tessellation measurements were used for direct, quantitative comparisons between real and simulated mixtures. Four sets of simulated and real mixtures were examined. Data from the real mixtures matched the simulated data when the samples were well mixed and the particle size distributions and volume fractions of the components were identical. Analysis of mixture components that occupied less than approximately 10 vol% of the mixture was not practical unless the particle size of the component was extremely small and excellent-quality, high-resolution compositional micrographs of the real sample were available. These methods of analysis should allow future researchers to systematically evaluate and predict the impact and importance of variables such as component volume fraction and component particle size distribution as they pertain to the uniformity of powder mixture microstructures.

  2. Supercritical separation process for complex organic mixtures

    Science.gov (United States)

    Chum, H.L.; Filardo, G.

    1990-10-23

    A process is disclosed for separating low molecular weight components from complex aqueous organic mixtures. The process includes preparing a separation solution of supercritical carbon dioxide with an effective amount of an entrainer to modify the solvation power of the supercritical carbon dioxide and extract preselected low molecular weight components. The separation solution is maintained at a temperature of at least about 70 C and a pressure of at least about 1,500 psi. The separation solution is then contacted with the organic mixtures while maintaining the temperature and pressure as above until the mixtures and solution reach equilibrium to extract the preselected low molecular weight components from the organic mixtures. Finally, the entrainer/extracted components portion of the equilibrium mixture is isolated from the separation solution. 1 fig.

  3. Spatial correlation in 2D and 3D thin films of conserved binary mixtures in the presence of wetting of substrates by the preferred majority component: interpretation in real scenario

    Science.gov (United States)

    Singh, Satya Pal

    2012-09-01

    Spinodal decomposition has attracted fresh attention in the past few decades with the advent of new computational problems in very thin geometries. Nanodrops evolve during the phase-separation process, which itself interplays with the wetting forces and gives rise to structures of importance for a wide range of technological applications, from spherical nanomagnetic domains to magnetic strips. MC simulation programs for the 2D and 3D cases were written for surface-directed phase separation (using the Metropolis algorithm) to observe the spatial correlation as it varies with time; the correlation shows polynomial-fitting behavior in the 2D case and follows a peculiar trend with time, especially in the early stages of evolution, indicating a colossal behavior. The two-point correlations (Pearson's linear xy correlation function), when evaluated in the 3D case, do not show any important oscillatory behavior but instead confirm the two regimes: phase separation (or mixing) and wetting. The direct generalization of the xy correlation to an xyz correlation in the 3D case (i.e., the product of the three moments) does not seem to be reliable, because it moves to six or seven decimal places and thus comes at the cost of a loss in confidence limit. Thus, the 3D simulation confirms the two-regime behavior, indicating that the same colossal behavior seen in the 2D case can exist in a real 3D thin film of a random binary mixture. The colossal behavior obtained in the 2D problem is therefore retained, and this may indicate a quantized or discrete colossal behavior for certain sets of composition and interface parameters over definite but small time periods. The corresponding density profiles are also plotted to confirm the distributions of the two components. Such computational studies may help in developing theoretical models for the observed phenomena and in searching for new events at the very bottom of the scale.
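A conserved binary mixture of the kind simulated here is typically handled with Kawasaki (spin-exchange) Metropolis moves, which conserve composition by construction. The following is a generic minimal sketch of such dynamics, not the author's program; the lattice size, temperature and sweep count are illustrative:

```python
import numpy as np

NEIGHBOR_STEPS = [(0, 1), (1, 0), (0, -1), (-1, 0)]

def site_energy(lat, i, j):
    """Ising-like bond energy of site (i, j) with its four periodic neighbours."""
    L = lat.shape[0]
    s = lat[i, j]
    return -s * (lat[(i + 1) % L, j] + lat[(i - 1) % L, j]
                 + lat[i, (j + 1) % L] + lat[i, (j - 1) % L])

def kawasaki_sweep(lat, beta, rng):
    """One Metropolis sweep with Kawasaki (spin-exchange) dynamics: swapping
    neighbouring unlike 'spins' (species) conserves the composition, the
    defining constraint of a conserved binary mixture."""
    L = lat.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        di, dj = NEIGHBOR_STEPS[rng.integers(4)]
        i2, j2 = (i + di) % L, (j + dj) % L
        if lat[i, j] == lat[i2, j2]:
            continue  # swapping identical species changes nothing
        e_old = site_energy(lat, i, j) + site_energy(lat, i2, j2)
        lat[i, j], lat[i2, j2] = lat[i2, j2], lat[i, j]
        e_new = site_energy(lat, i, j) + site_energy(lat, i2, j2)
        # Metropolis acceptance; on rejection, swap back
        if e_new - e_old > 0 and rng.random() >= np.exp(-beta * (e_new - e_old)):
            lat[i, j], lat[i2, j2] = lat[i2, j2], lat[i, j]

# Illustrative run: 16x16 lattice at a 50:50 composition
rng = np.random.default_rng(0)
L = 16
lat = np.ones(L * L, dtype=int)
lat[: L * L // 2] = -1
rng.shuffle(lat)
lat = lat.reshape(L, L)
before = lat.sum()
for _ in range(5):
    kawasaki_sweep(lat, beta=1.0, rng=rng)
```

Spin-flip (Glauber) dynamics would not do here: it changes the composition, whereas exchange moves keep the order parameter conserved, which is what makes domain coarsening in a binary mixture slower than in a non-conserved system.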

  4. Two roads to noncommutative causality

    International Nuclear Information System (INIS)

    Besnard, Fabien

    2015-01-01

We review the physical motivations and the mathematical results obtained so far in the isocone-based approach to noncommutative causality. We also give a briefer account of the alternative framework of Franco and Eckstein, which is based on Lorentzian spectral triples. We compare the two theories on the simple example of the product geometry of the Minkowski plane by the finite noncommutative space with algebra M₂(C). (paper)

  5. Concept of statistical causality and local martingales

    Directory of Open Access Journals (Sweden)

    Valjarević Dragana

    2016-01-01

Full Text Available In this paper we consider a statistical concept of causality in continuous time on filtered probability spaces, based on Granger's definitions of causality. The given causality concept is closely connected to the preservation of the local martingale property when the filtration is enlarged: the local martingale remains unpredictable when the amount of information is increased. We prove that the preservation of this property is equivalent to the given concept of causality.

  6. Obesity and infection: reciprocal causality.

    Science.gov (United States)

    Hainer, V; Zamrazilová, H; Kunešová, M; Bendlová, B; Aldhoon-Hainerová, I

    2015-01-01

Associations between different infectious agents and obesity have been reported in humans for over thirty years. In many cases, as in nosocomial infections, this relationship reflects the greater susceptibility of obese individuals to infection due to impaired immunity. In such cases, the infection is not related to obesity as a causal factor but represents a complication of obesity. In contrast, several infections have been suggested as potential causal factors in human obesity. However, evidence of a causal linkage to human obesity has only been provided for adenovirus 36 (Adv36). This virus activates lipogenic and proinflammatory pathways in adipose tissue, yet improves insulin sensitivity, lipid profile and hepatic steatosis. The E4orf1 gene of Adv36 exerts insulin-sensitizing effects but is devoid of the virus's pro-inflammatory modalities. The development of a vaccine to prevent Adv36-induced obesity, or the use of E4orf1 as a ligand for novel antidiabetic drugs, could open new horizons in the prophylaxis and treatment of obesity and diabetes. More experimental and clinical studies are needed to elucidate the mutual relations between infection and obesity, identify additional infectious agents causing human obesity, and define the conditions that predispose obese individuals to specific infections.

  7. Information flow and causality as rigorous notions ab initio

    Science.gov (United States)

    Liang, X. San

    2016-11-01

Information flow, or information transfer, the widely applicable general physics notion, can be rigorously derived from first principles, rather than axiomatically proposed as an ansatz. Its logical association with causality is firmly rooted in the dynamical system that lies beneath. The principle of nil causality, which reads: an event is not causal to another if the evolution of the latter is independent of the former, and which transfer entropy analysis and the Granger causality test fail to verify in many situations, turns out to be a proven theorem here. Established in this study are the information flows among the components of time-discrete mappings and time-continuous dynamical systems, both deterministic and stochastic. They have been obtained explicitly in closed form and put to application with benchmark systems such as the Kaplan-Yorke map, the Rössler system, the baker transformation, the Hénon map, and stochastic potential flow. Besides unraveling the causal relations expected from the respective systems, some of the applications show that the information flow structure underlying a complex trajectory pattern can be tractable. For linear systems, the resulting remarkably concise formula asserts analytically that causation implies correlation, while correlation does not imply causation, providing a mathematical basis for the long-standing philosophical debate over causation versus correlation.

  8. Behavioural Pattern of Causality Parameter of Autoregressive ...

    African Journals Online (AJOL)

    In this paper, a causal form of Autoregressive Moving Average process, ARMA (p, q) of various orders and behaviour of the causality parameter of ARMA model is investigated. It is deduced that the behaviour of causality parameter ψi depends on positive and negative values of autoregressive parameter φ and moving ...

  9. Causal knowledge and reasoning in decision making

    NARCIS (Netherlands)

    Hagmayer, Y.; Witteman, C.L.M.

    2017-01-01

    Normative causal decision theories argue that people should use their causal knowledge in decision making. Based on these ideas, we argue that causal knowledge and reasoning may support and thereby potentially improve decision making based on expected outcomes, narratives, and even cues. We will

  10. The argumentative impact of causal relations

    DEFF Research Database (Denmark)

    Nielsen, Anne Ellerup

    1996-01-01

    such as causality, explanation and justification. In certain types of discourse, causal relations also imply an intentional element. This paper describes the way in which the semantic and pragmatic functions of causal markers can be accounted for in terms of linguistic and rhetorical theories of argumentation....

  11. Deciding which chemical mixtures risk assessment methods work best for what mixtures

    International Nuclear Information System (INIS)

    Teuschler, Linda K.

    2007-01-01

    The most commonly used chemical mixtures risk assessment methods involve simple notions of additivity and toxicological similarity. Newer methods are emerging in response to the complexities of chemical mixture exposures and effects. Factors based on both science and policy drive decisions regarding whether to conduct a chemical mixtures risk assessment and, if so, which methods to employ. Scientific considerations are based on positive evidence of joint toxic action, elevated human exposure conditions or the potential for significant impacts on human health. Policy issues include legislative drivers that may mandate action even though adequate toxicity data on a specific mixture may not be available and risk assessment goals that impact the choice of risk assessment method to obtain the amount of health protection desired. This paper discusses three important concepts used to choose among available approaches for conducting a chemical mixtures risk assessment: (1) additive joint toxic action of mixture components; (2) toxicological interactions of mixture components; and (3) chemical composition of complex mixtures. It is proposed that scientific support for basic assumptions used in chemical mixtures risk assessment should be developed by expert panels, risk assessment methods experts, and laboratory toxicologists. This is imperative to further develop and refine quantitative methods and provide guidance on their appropriate applications. Risk assessors need scientific support for chemical mixtures risk assessment methods in the form of toxicological data on joint toxic action for high priority mixtures, statistical methods for analyzing dose-response for mixtures, and toxicological and statistical criteria for determining sufficient similarity of complex mixtures
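The dose-additivity concept named in point (1) above has a standard quantitative form, the Hazard Index: each component contributes a hazard quotient (exposure divided by its reference dose), and the quotients are summed. The sketch below is a minimal illustration of that arithmetic only; the function name and the two-chemical example values are hypothetical, not data from this paper.

```python
def hazard_index(exposures, reference_doses):
    """Hazard Index under the dose-additivity assumption.

    exposures and reference_doses are dicts keyed by chemical name,
    in consistent units (e.g. mg/kg-day). Each component contributes a
    hazard quotient HQ_i = exposure_i / RfD_i; the mixture HI is their sum.
    """
    return sum(dose / reference_doses[chem] for chem, dose in exposures.items())

# Hypothetical two-chemical mixture; an HI above 1 would flag potential concern.
hi = hazard_index({"chem_a": 0.5, "chem_b": 1.0},
                  {"chem_a": 1.0, "chem_b": 4.0})
print(hi)  # 0.75
```

An HI below 1 under additivity does not rule out interaction effects, which is exactly why the paper's concept (2), toxicological interactions, requires separate treatment.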

  12. Norms and customs: causally important or causally impotent?

    Science.gov (United States)

    Jones, Todd

    2010-01-01

In this article, I argue that norms and customs, despite frequently being described as causes of behavior in the social sciences and in ordinary conversation, cannot really cause behavior. Terms like "norms" seem to refer to philosophically disreputable disjunctive properties. More problematically, even if they do not, or even if there can be disjunctive properties after all, I argue that norms and customs still cannot cause behavior. The social sciences would be better off without referring to properties like norms and customs as if they could be causal.

  13. A theory of causal learning in children: causal maps and Bayes nets.

    Science.gov (United States)

    Gopnik, Alison; Glymour, Clark; Sobel, David M; Schulz, Laura E; Kushnir, Tamar; Danks, David

    2004-01-01

    The authors outline a cognitive and computational account of causal learning in children. They propose that children use specialized cognitive systems that allow them to recover an accurate "causal map" of the world: an abstract, coherent, learned representation of the causal relations among events. This kind of knowledge can be perspicuously understood in terms of the formalism of directed graphical causal models, or Bayes nets. Children's causal learning and inference may involve computations similar to those for learning causal Bayes nets and for predicting with them. Experimental results suggest that 2- to 4-year-old children construct new causal maps and that their learning is consistent with the Bayes net formalism.

  14. Some properties of explosive mixtures containing peroxides

    International Nuclear Information System (INIS)

    Zeman, Svatopluk; Trzcinski, Waldemar A.; Matyas, Robert

    2008-01-01

This study concerns mixtures of triacetone triperoxide (3,3,6,6,9,9-hexamethyl-1,2,4,5,7,8-hexoxonane, TATP) and ammonium nitrate (AN), with added water (W) as the case may be, and dry mixtures of TATP with urea nitrate (UN). Relative performances (RP) of the mixtures and their individual components, relative to TNT, were determined by means of a ballistic mortar. The detonation energies, E₀, and detonation velocities, D, were calculated for the mixtures studied by means of the thermodynamic code CHEETAH. Relationships have been found, and are discussed, between the RP and the E₀ values related to unit volume of the gaseous detonation products of these mixtures. These relationships, together with those between RP and the oxygen balance values of the mixtures studied, indicate different types of participation of AN and UN in the explosive decomposition of the respective mixtures. Dry TATP/UN mixtures exhibit lower RP than analogous TATP/AN mixtures containing up to 25% water. Depending on the water content, the TATP/AN mixtures possess higher detonability values than the ANFO explosives. A semi-logarithmic relationship between the D values and the oxygen coefficients has been derived for all the mixtures studied at a charge density of 1000 kg m⁻³. Among the mixtures studied, this relationship distinguishes several samples of the 'tertiary explosives' type as well as samples that approach 'high explosives' in their performances and detonation velocities.

  15. Model structure selection in convolutive mixtures

    DEFF Research Database (Denmark)

    Dyrholm, Mads; Makeig, Scott; Hansen, Lars Kai

    2006-01-01

The CICAAR algorithm (convolutive independent component analysis with an auto-regressive inverse model) allows separation of white (i.i.d.) source signals from convolutive mixtures. We introduce a source color model as a simple extension to the CICAAR which allows for a more parsimonious...... representation in many practical mixtures. The new filter-CICAAR allows Bayesian model selection and can help answer questions like: 'Are we actually dealing with a convolutive mixture?'. We try to answer this question for EEG data....

  16. A theory of causal learning in children: Causal maps and Bayes nets

    OpenAIRE

    Gopnik, A; Glymour, C; Sobel, D M; Schulz, L E; Kushnir, T; Danks, D

    2004-01-01

    The authors outline a cognitive and computational account of causal learning in children. They propose that children use specialized cognitive systems that allow them to recover an accurate "causal map" of the world: an abstract, coherent, learned representation of the causal relations among events. This kind of knowledge can be perspicuously understood in terms of the formalism of directed graphical causal models, or Bayes nets. Children's causal learning and inference may involve computatio...

  17. Identifying Causality from Alarm Observations

    DEFF Research Database (Denmark)

    Kirchhübel, Denis; Zhang, Xinxin; Lind, Morten

    on an abstracted model of the mass and energy flows in the system. The application of MFM for root cause analysis based alarm grouping has been demonstrated and can be extended to reason about the direction of causality considering the entirety of the alarms present in the system for more comprehensive decision...... support. This contribution presents the foundation for combining the cause and consequence propagation of multiple observations from the system based on an MFM model. The proposed logical reasoning matches actually observed alarms to the propagation analysis in MFM to distinguish plausible causes...

  18. Random number generators and causality

    International Nuclear Information System (INIS)

    Larrondo, H.A.; Martin, M.T.; Gonzalez, C.M.; Plastino, A.; Rosso, O.A.

    2006-01-01

    We advance a prescription to randomize physical or algorithmic Random Number Generators (RNG's) that do not pass Marsaglia's DIEHARD test suite and discuss a special physical quantifier, based on an intensive statistical complexity measure, that is able to adequately assess the improvements produced thereby. Eight RNG's are evaluated and the associated results are compared to those obtained by recourse to Marsaglia's DIEHARD test suite. Our quantifier, which is evaluated using causality arguments, can forecast whether a given RNG will pass the above mentioned test
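The abstract does not give the quantifier's formula. As a hedged illustration of the general idea of an ordinal, causality-aware quantifier (in the Bandt-Pompe spirit that underlies intensive statistical complexity measures), a normalized permutation entropy separates a patterned sequence from a well-randomized one. The function below is an assumption-laden sketch, not the authors' actual quantifier.

```python
import math
from collections import Counter

import numpy as np

def permutation_entropy(series, order=3):
    """Normalized Bandt-Pompe permutation entropy: 0 for a fully ordered
    sequence, approaching 1 for a well-randomized one."""
    patterns = Counter(
        tuple(np.argsort(series[i:i + order]))      # ordinal pattern of each window
        for i in range(len(series) - order + 1)
    )
    probs = np.array(list(patterns.values()), dtype=float)
    probs /= probs.sum()
    h = -(probs * np.log(probs)).sum()
    return float(h / math.log(math.factorial(order)))

rng = np.random.default_rng(0)
print(permutation_entropy(list(range(100))) == 0.0)           # a counter is fully ordered
print(permutation_entropy(rng.standard_normal(3000)) > 0.95)  # a good RNG looks maximally disordered
```

A poor generator whose outputs carry temporal structure would fall between these extremes, which is the sense in which such causality-based quantifiers complement pass/fail batteries like DIEHARD.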

  19. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map, with its unidirectional causality feature. Despite its apparent popularity, criticisms of this causality have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested econometrically with the Granger causality test on 45 data points. However, the historical data proved insufficient to establish the causality models, as only 40% of the causal linkages were supported. Expert knowledge was suggested for use in situations where historical data are insufficient. The Delphi method was therefore selected and conducted to obtain consensus on the existence of causality among 15 selected experts through 3 rounds of questionnaires; the study revealed that only 20% of the propositions were not supported. Both methods found bidirectional causality, demonstrating significant dynamic environmental complexity through interaction among measures. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme-condition tests were conducted on the developed SD model to ensure its ability to mimic reality, and its robustness and validity as a platform for causality analysis. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, where very limited work has been done.
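The pairwise Granger test used in the abstract is easy to sketch: a lagged variable "Granger-causes" another if adding its history significantly reduces prediction error. The bivariate one-lag F-statistic below, run on synthetic data, is a textbook illustration, not the study's econometric setup.

```python
import numpy as np

def granger_f(x, y):
    """One-lag Granger test of 'x helps predict y': F-statistic comparing
    the restricted model y_t ~ y_{t-1} with y_t ~ y_{t-1} + x_{t-1}."""
    yt, ylag, xlag = y[1:], y[:-1], x[:-1]

    def rss(design):
        beta, *_ = np.linalg.lstsq(design, yt, rcond=None)
        resid = yt - design @ beta
        return float(resid @ resid)

    ones = np.ones_like(ylag)
    rss_r = rss(np.column_stack([ylag, ones]))        # restricted model
    design_u = np.column_stack([ylag, xlag, ones])    # unrestricted model
    rss_u = rss(design_u)
    dof = len(yt) - design_u.shape[1]
    return (rss_r - rss_u) / (rss_u / dof)            # one extra parameter tested

# Synthetic series in which x drives y but not the other way around.
rng = np.random.default_rng(42)
x = rng.standard_normal(500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()
print(granger_f(x, y) > granger_f(y, x))  # expect a far larger F for x -> y
```

With only 45 observations, as in the study, the test's low power is exactly why so few linkages were supported and why expert elicitation was added.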

  20. Causal reasoning and models of cognitive tasks for naval nuclear power plant operators

    International Nuclear Information System (INIS)

    Salazar-Ferrer, P.

    1995-06-01

In complex industrial process control, causal reasoning appears as a major component of operators' cognitive tasks. It is tightly linked to diagnosis, prediction of normal and failure states, and explanation. This work provides a detailed review of the literature on causal reasoning. A synthesis is proposed as a model of causal reasoning in process control. This model integrates distinct approaches from cognitive science: notably qualitative physics, Bayesian networks, knowledge-based systems, and cognitive psychology. Our model defines a framework for the analysis of causal human errors in simulated naval nuclear power plant fault management. Through the methodological framework of critical incident analysis, we define a classification of errors and difficulties linked to causal reasoning, based on its surface characteristics. More elementary component activities of causal reasoning are identified as the origin of these errors. The applications cover functional specification of man-machine interfaces and the design of operator support systems, as well as nuclear safety. In addition, we integrate the model of causal reasoning into a model of cognitive tasks in process control. (authors). 106 refs., 49 figs., 8 tabs

  1. CADDIS Volume 1. Stressor Identification: About Causal Assessment

    Science.gov (United States)

    An introduction to the history of our approach to causal assessment, A chronology of causal history and philosophy, An introduction to causal history and philosophy, References for the Causal Assessment Background section of Stressor Identification

  2. Robust classification using mixtures of dependency networks

    DEFF Research Database (Denmark)

    Gámez, José A.; Mateo, Juan L.; Nielsen, Thomas Dyhre

    2008-01-01

    -ups are often obtained at the expense of accuracy. In this paper we try to address this issue through the use of mixtures of dependency networks. To reduce learning time and improve robustness when dealing with data sparse classes, we outline methods for reusing calculations across mixture components. Finally...

  3. An Algorithmic Information Calculus for Causal Discovery and Reprogramming Systems

    KAUST Repository

    Zenil, Hector

    2017-09-08

We introduce a conceptual framework and an interventional calculus to steer and manipulate systems based on their intrinsic algorithmic probability, using the universal principles of the theory of computability and algorithmic information. By applying sequences of controlled interventions to systems and networks, we estimate how changes in their algorithmic information content are reflected in positive/negative shifts towards and away from randomness. The strong connection between approximations to algorithmic complexity (the size of the shortest generating mechanism) and causality induces a sequence of perturbations that ranks the network elements by their steering capabilities. This new dimension unmasks a separation between causal and non-causal components, providing a suite of powerful parameter-free algorithms of wide applicability, ranging from optimal dimension reduction to maximal-randomness analysis and system control. We introduce methods for reprogramming systems that do not require full knowledge of, or access to, the system's actual kinetic equations or any probability distributions. A causal interventional analysis of synthetic and regulatory biological networks reveals how algorithmic reprogramming qualitatively reshapes the system's dynamic landscape. For example, during cellular differentiation we find a decrease in the number of elements, corresponding to a transition away from randomness, and a combination of the system's intrinsic properties and its capability to be algorithmically reprogrammed can reconstruct an epigenetic landscape. The interventional calculus is broadly applicable to predictive causal inference in systems such as networks, and is of relevance to a variety of machine and causal learning techniques driving model-based approaches to better understanding and manipulating complex systems.

  4. Probabilistic causality and radiogenic cancers

    International Nuclear Information System (INIS)

    Groeer, P.G.

    1986-01-01

A review and scrutiny of the literature on probability and probabilistic causality shows that it is possible, under certain assumptions, to estimate the probability that a certain type of cancer diagnosed in an individual exposed to radiation prior to diagnosis was caused by this exposure. Diagnosis of this causal relationship, like diagnosis of any disease - malignant or not - always requires some subjective judgment by the diagnostician. It is, therefore, illusory to believe that tables based on actuarial data can provide objective estimates of the chance that a cancer diagnosed in an individual is radiogenic. It is argued that such tables can only provide a base from which the diagnostician(s) deviate in one direction or the other according to his (their) individual (consensual) judgment. Acceptance of a physician's diagnostic judgment by patients is commonplace. Similarly widespread acceptance of expert judgment by claimants in radiation compensation cases does not presently exist. Judicious use of the present radioepidemiological tables prepared by the Working Group of the National Institutes of Health, or of updated future versions of similar tables, may improve the situation. 20 references

  5. Space and time in perceptual causality

    Directory of Open Access Journals (Sweden)

    Benjamin Straube

    2010-04-01

Full Text Available Inferring causality is a fundamental feature of human cognition that allows us to theorize about and predict future states of the world. Michotte suggested that humans automatically perceive causality based on certain perceptual features of events. However, individual differences in judgments of perceptual causality cast doubt on Michotte's view. To gain insight into the neural basis of individual differences in the perception of causality, our participants judged causal relationships in animations of a blue ball colliding with a red ball (a launching event) while fMRI data were acquired. Spatial continuity and temporal contiguity were varied parametrically in these stimuli. We did not find consistent brain-activation differences between trials judged as caused and those judged as non-caused, making it unlikely that humans have a universal instantiation of perceptual causality in the brain. However, participants were slower to respond to, and showed greater neural activity for, violations of causality, suggesting that humans are biased to expect causal relationships when moving objects appear to interact. Our participants demonstrated considerable individual differences in their sensitivity to spatial and temporal characteristics in perceiving causality. These qualitative differences in sensitivity to time or space in perceiving causality were instantiated in individual differences in activation of the left basal ganglia or right parietal lobe, respectively. Thus, the perception that the movement of one object causes the movement of another is triggered by elemental spatial and temporal sensitivities, which are themselves instantiated in specific, distinct neural networks.

  6. The Functions of Danish Causal Conjunctions

    Directory of Open Access Journals (Sweden)

    Rita Therkelsen

    2004-01-01

Full Text Available In the article I propose an analysis of the Danish causal conjunctions fordi, siden and for based on the framework of Danish Functional Grammar. As conjunctions they relate two clauses, and their semantics have in common that they indicate a causal relationship between the clauses. The causal conjunctions differ in their distribution: siden conjoins a subordinate clause and a main clause, for conjoins two main clauses, and fordi is able to do both. Methodologically I have based my analysis on these distributional properties, comparing siden and fordi conjoining a subordinate and a main clause, and comparing for and fordi conjoining two main clauses, following the thesis that they would establish a causal relationship between different kinds of content. My main findings are that fordi establishes a causal relationship between the events referred to by the two clauses, the whole utterance functioning as a statement of this causal relationship. Siden presupposes such a general causal relationship between the two events and puts forward the causing event as a reason for assuming, wishing or ordering the caused event; siden thus establishes a causal relationship between an event and a speech act. For equally presupposes a general causal relationship between two events and establishes a causal relationship between speech acts; fordi conjoining two main clauses is able to do this too, but in this position it also maintains its event-relating ability, the interpretation depending on contextual factors.

  7. Shear viscosity of liquid mixtures Mass dependence

    CERN Document Server

    Kaushal, R

    2002-01-01

Expressions for the zeroth, second, and fourth sum rules of the transverse stress autocorrelation function of a two-component fluid have been derived. These sum rules and Mori's memory-function formalism have been used to study the shear viscosity of Ar-Kr and isotopic mixtures. The theoretical result is found to be in good agreement with the computer simulation result for the Ar-Kr mixture. The mass dependence of the shear viscosity for different mole fractions shows that the deviation from the ideal linear model arises even from the mass difference between the two species of the fluid mixture. At higher mass ratios the shear viscosity of the mixture is not explained by any of the empirical models.

  8. Linear causal modeling with structural equations

    CERN Document Server

    Mulaik, Stanley A

    2009-01-01

    Emphasizing causation as a functional relationship between variables that describe objects, Linear Causal Modeling with Structural Equations integrates a general philosophical theory of causation with structural equation modeling (SEM) that concerns the special case of linear causal relations. In addition to describing how the functional relation concept may be generalized to treat probabilistic causation, the book reviews historical treatments of causation and explores recent developments in experimental psychology on studies of the perception of causation. It looks at how to perceive causal

  9. Electromagnetic pulses, localized and causal

    Science.gov (United States)

    Lekner, John

    2018-01-01

    We show that pulse solutions of the wave equation can be expressed as time Fourier superpositions of scalar monochromatic beam wave functions (solutions of the Helmholtz equation). This formulation is shown to be equivalent to Bateman's integral expression for solutions of the wave equation, for axially symmetric solutions. A closed-form one-parameter solution of the wave equation, containing no backward-propagating parts, is constructed from a beam which is the tight-focus limit of two families of beams. Application is made to transverse electric and transverse magnetic pulses, with evaluation of the energy, momentum and angular momentum for a pulse based on the general localized and causal form. Such pulses can be represented as superpositions of photons. Explicit total energy and total momentum values are given for the one-parameter closed-form pulse.

  10. Space-time as a causal set

    International Nuclear Information System (INIS)

    Bombelli, L.; Lee, J.; Meyer, D.; Sorkin, R.D.

    1987-01-01

    We propose that space-time at the smallest scales is in reality a causal set: a locally finite set of elements endowed with a partial order corresponding to the macroscopic relation that defines past and future. We explore how a Lorentzian manifold can approximate a causal set, noting in particular that the thereby defined effective dimensionality of a given causal set can vary with length scale. Finally, we speculate briefly on the quantum dynamics of causal sets, indicating why an appropriate choice of action can reproduce general relativity in the classical limit

  11. Tools for Detecting Causality in Space Systems

    Science.gov (United States)

    Johnson, J.; Wing, S.

    2017-12-01

Complex systems such as the solar and magnetospheric environment often exhibit patterns of behavior that suggest underlying organizing principles. Causality is a key organizing principle that is particularly difficult to establish in strongly coupled nonlinear systems, but essential for understanding and modeling their behavior. While traditional methods of time-series analysis can identify linear correlations, they do not adequately quantify the distinction between causal and coincidental dependence. We discuss tools for detecting causality, including Granger causality, transfer entropy, conditional redundancy, and convergent cross mapping. The tools are illustrated by applications to magnetospheric and solar physics, including radiation belt, Dst (a magnetospheric state variable), substorm, and solar-cycle dynamics.
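Of the tools listed, transfer entropy is the easiest to sketch for discretized signals: it measures how much the history of one series reduces uncertainty about the next value of another, beyond the target's own history. The plug-in estimator below (binary symbols, one-step history) is a minimal illustration under those stated assumptions, not the authors' implementation.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{x->y} in bits for symbol sequences, one-step
    history: sum over p(y_next, y_prev, x_prev) of
    log2[ p(y_next | y_prev, x_prev) / p(y_next | y_prev) ]."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))
    n = len(triples)
    p_full = Counter(triples)                            # (y_next, y_prev, x_prev)
    p_yy = Counter((yn, yp) for yn, yp, _ in triples)    # (y_next, y_prev)
    p_yx = Counter((yp, xp) for _, yp, xp in triples)    # (y_prev, x_prev)
    p_y = Counter(yp for _, yp, _ in triples)            # (y_prev,)
    te = 0.0
    for (yn, yp, xp), c in p_full.items():
        te += (c / n) * np.log2(c * p_y[yp] / (p_yy[yn, yp] * p_yx[yp, xp]))
    return te

rng = np.random.default_rng(7)
x = rng.integers(0, 2, 1000)
y = np.empty_like(x)
y[0], y[1:] = 0, x[:-1]               # y is a one-step delayed copy of x
print(transfer_entropy(x, y) > 0.9)   # roughly 1 bit flows from x to y
print(transfer_entropy(y, x) < 0.1)   # essentially nothing flows back
```

The asymmetry of the estimate under exchange of the two series is what makes it a directional, rather than merely correlational, diagnostic.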

  12. Does causal action facilitate causal perception in infants younger than 6 months of age?

    Science.gov (United States)

    Rakison, David H; Krogh, Lauren

    2012-01-01

    Previous research has established that infants are unable to perceive causality until 6¼ months of age. The current experiments examined whether infants' ability to engage in causal action could facilitate causal perception prior to this age. In Experiment 1, 4½-month-olds were randomly assigned to engage in causal action experience via Velcro sticky mittens or not engage in causal action because they wore non-sticky mittens. Both groups were then tested in the visual habituation paradigm to assess their causal perception. Infants who engaged in causal action - but not those without this causal action experience - perceived the habituation events as causal. Experiment 2 used a similar design to establish that 4½-month-olds are unable to generalize their own causal action to causality observed in dissimilar objects. These data are the first to demonstrate that infants under 6 months of age can perceive causality, and have implications for the mechanisms underlying the development of causal perception. © 2011 Blackwell Publishing Ltd.

  13. Mixture Experiments and their Application in Agricultural Research

    OpenAIRE

    Raza, Irum; Masood, M. Asif; Mahmood, Rashid

    2013-01-01

The present study was designed to show the applicability of mixture designs in the Agricultural Research System and to fit an appropriate mixture regression model, making the response variables functions of the proportions of the mixture components. Data on four components, namely neem oil, garlic oil, clove oil and tobacco extract (ml), were collected from a field experiment conducted by the Honeybee Research Institute, NARC. The main goal of the experiment was to check whether blending two components hav...

  14. Optimal mixture experiments

    CERN Document Server

    Sinha, B K; Pal, Manisha; Das, P

    2014-01-01

The book dwells mainly on the optimality aspects of mixture designs. As mixture models are a special case of regression models, a general discussion of regression designs is presented, including topics such as continuous designs, the de la Garza phenomenon, Loewner order domination, equivalence theorems for different optimality criteria, and standard optimality results for single-variable polynomial regression and multivariate linear and quadratic regression models. This is followed by a review of the available literature on estimation of parameters in mixture models. Based on recent research findings, the volume also introduces optimal mixture designs for estimation of optimum mixing proportions in different mixture models, including Scheffé's quadratic model, the Darroch-Waller model, the log-contrast model, mixture-amount models, random coefficient models and the multi-response model. Robust mixture designs and mixture designs in blocks are also reviewed. Moreover, some applications of mixture desig...
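Scheffé's quadratic model mentioned above has the canonical form E(y) = Σ βᵢxᵢ + Σ_{i<j} βᵢⱼxᵢxⱼ, with no intercept and the proportions summing to one. The fit below on a {3,2} simplex-lattice design is a generic sketch with made-up coefficients, not an example from the book.

```python
import numpy as np
from itertools import combinations

def scheffe_quadratic_matrix(X):
    """Model matrix for Scheffé's quadratic mixture model (no intercept):
    the pure proportions x_i plus the blending cross-products x_i * x_j."""
    q = X.shape[1]
    cols = [X[:, i] for i in range(q)]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(q), 2)]
    return np.column_stack(cols)

# {3,2} simplex-lattice design: the pure blends and the 50:50 binary blends.
X = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1],
              [0.5, 0.5, 0], [0.5, 0, 0.5], [0, 0.5, 0.5]])
beta_true = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])   # made-up coefficients
y = scheffe_quadratic_matrix(X) @ beta_true             # noise-free responses
beta_hat, *_ = np.linalg.lstsq(scheffe_quadratic_matrix(X), y, rcond=None)
print(np.allclose(beta_hat, beta_true))  # saturated design: exact recovery
```

The six-run lattice is saturated for the six-parameter quadratic model, which is why the coefficients are recovered exactly; optimality questions of the kind the book treats arise once runs are scarce or responses are noisy.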

  15. Causal ubiquity in quantum physics. A superluminal and local-causal physical ontology

    International Nuclear Information System (INIS)

    Neelamkavil, Raphael

    2014-01-01

    A fixed highest criterial velocity (of light) in STR (special theory of relativity) is a convention for a layer of physical inquiry. QM (Quantum Mechanics) avoids action-at-a-distance using this concept, but accepts non-causality and action-at-a-distance in EPR (Einstein-Podolsky-Rosen paradox) entanglement experiments. Even in such allegedly non-causal processes, something exists processually in extension-motion, between the causal and the non-causal. If STR theoretically allows real-valued superluminal communication between EPR entangled particles, quantum processes become fully causal. That is, the QM world is sub-luminally, luminally and superluminally local-causal throughout, and the Law of Causality is ubiquitous in the micro-world. Thus, "probabilistic causality" is a merely epistemic term.

  16. Causal ubiquity in quantum physics. A superluminal and local-causal physical ontology

    Energy Technology Data Exchange (ETDEWEB)

    Neelamkavil, Raphael

    2014-07-01

    A fixed highest criterial velocity (of light) in STR (special theory of relativity) is a convention for a layer of physical inquiry. QM (Quantum Mechanics) avoids action-at-a-distance using this concept, but accepts non-causality and action-at-a-distance in EPR (Einstein-Podolsky-Rosen paradox) entanglement experiments. Even in such allegedly non-causal processes, something exists processually in extension-motion, between the causal and the non-causal. If STR theoretically allows real-valued superluminal communication between EPR entangled particles, quantum processes become fully causal. That is, the QM world is sub-luminally, luminally and superluminally local-causal throughout, and the Law of Causality is ubiquitous in the micro-world. Thus, "probabilistic causality" is a merely epistemic term.

  17. Quasi-Experimental Designs for Causal Inference

    Science.gov (United States)

    Kim, Yongnam; Steiner, Peter

    2016-01-01

    When randomized experiments are infeasible, quasi-experimental designs can be exploited to evaluate causal treatment effects. The strongest quasi-experimental designs for causal inference are regression discontinuity designs, instrumental variable designs, matching and propensity score designs, and comparative interrupted time series designs. This…

  18. Causal random geometry from stochastic quantization

    DEFF Research Database (Denmark)

    Ambjørn, Jan; Loll, R.; Westra, W.

    2010-01-01

    In this short note we review a recently found formulation of two-dimensional causal quantum gravity defined through Causal Dynamical Triangulations and stochastic quantization. This procedure enables one to extract the nonperturbative quantum Hamiltonian of the random surface model including the sum over topologies. Interestingly, the generally fictitious stochastic time corresponds to proper time on the geometries.

  19. Special Relativity, Causality and Quantum Mechanics-2

    Indian Academy of Sciences (India)

    Special Relativity, Causality and Quantum Mechanics - 2. Guruprasad Kar, Samir Kunkri and Sujit K. Choudhary. General Article, Resonance – Journal of Science Education, Volume 11, Issue 9. Keywords: causality; quantum entanglement; cloning; local realism; completely positive maps.

  20. mediation: R Package for Causal Mediation Analysis

    Directory of Open Access Journals (Sweden)

    Dustin Tingley

    2014-09-01

    In this paper, we describe the R package mediation for conducting causal mediation analysis in applied empirical research. In many scientific disciplines, the goal of researchers is not only estimating causal effects of a treatment but also understanding the process by which the treatment causally affects the outcome. Causal mediation analysis is frequently used to assess potential causal mechanisms. The mediation package implements a comprehensive suite of statistical tools for conducting such an analysis. The package is organized into two distinct approaches. Using the model-based approach, researchers can estimate causal mediation effects and conduct sensitivity analysis under the standard research design. Furthermore, the design-based approach provides several analysis tools that are applicable under different experimental designs. This approach requires weaker assumptions than the model-based approach. We also implement a statistical method for dealing with multiple (causally dependent) mediators, which are often encountered in practice. Finally, the package also offers a methodology for assessing causal mediation in the presence of treatment noncompliance, a common problem in randomized trials.
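    Under linear mediator and outcome models, the average causal mediation effect (ACME) reduces to a product of regression coefficients. The following is a minimal Python sketch of that idea (the package itself is in R; the function name here is hypothetical, and the no-unmeasured-confounding assumption is required for a causal reading):

```python
import numpy as np

def mediation_effects(t, m, y):
    """Product-of-coefficients sketch of causal mediation analysis.

    Assumes linear models and no unmeasured confounding (strong,
    untestable assumptions); illustrative only, not the R package's API.
    """
    # Mediator model: m = a0 + a1 * t
    a1, a0 = np.polyfit(t, m, 1)
    # Outcome model: y = b0 + b1 * t + b2 * m (ordinary least squares)
    X = np.column_stack([np.ones_like(t), t, m])
    b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]
    acme = a1 * b2  # average causal mediation (indirect) effect
    ade = b1        # average direct effect
    return acme, ade
```

    In the R package these point estimates come with bootstrap or quasi-Bayesian uncertainty intervals and sensitivity analysis, which this sketch omits.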

  1. Causal Mediation Analysis: Warning! Assumptions Ahead

    Science.gov (United States)

    Keele, Luke

    2015-01-01

    In policy evaluations, interest may focus on why a particular treatment works. One tool for understanding why treatments work is causal mediation analysis. In this essay, I focus on the assumptions needed to estimate mediation effects. I show that there is no "gold standard" method for the identification of causal mediation effects. In…

  2. Stability Studies of a Mixture of Paracetamol and Ascorbic Acid ...

    African Journals Online (AJOL)

    Purpose: To determine the effect of the temperature of water used for the preparation of paracetamol and ascorbic acid mixture on its stability, as well as to assess the influence of humidity on the stability of single components and their mixtures. Methods: The stability of the mixtures in aqueous medium was evaluated with ...

  3. Plant relations in mixtures of Trifolium subterraneum cv. Midmar: II ...

    African Journals Online (AJOL)

    Land Equivalent Ratios (LER) were calculated for mixtures of Trifolium subterraneum and Lolium multiflorum in terms of dry matter production and crude protein production. This ratio denotes the yield advantage, if any, of a specific mixture against the pure stands of the different components. The mixture receiving 240 kg N ...
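    The LER computation itself is simple: each component's mixture yield is divided by its pure-stand yield, and the ratios are summed; LER > 1 indicates a yield advantage of the mixture over the pure stands. A minimal sketch with hypothetical numbers (not the study's data):

```python
def land_equivalent_ratio(mixture_yields, pure_yields):
    """LER = sum_i (yield of component i in mixture / yield in pure stand)."""
    return sum(m / p for m, p in zip(mixture_yields, pure_yields))

# Hypothetical example: clover yields 3.0 t/ha in mixture vs 5.0 t/ha pure,
# ryegrass 2.5 t/ha in mixture vs 5.0 t/ha pure.
ler = land_equivalent_ratio([3.0, 2.5], [5.0, 5.0])  # 0.6 + 0.5 = 1.1
```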

  4. Bayesian D-Optimal Choice Designs for Mixtures

    NARCIS (Netherlands)

    A. Ruseckaite (Aiste); P.P. Goos (Peter); D. Fok (Dennis)

    2014-01-01

    Consumer products and services can often be described as mixtures of ingredients. Examples are the mixture of ingredients in a cocktail and the mixture of different components of waiting time (e.g., in-vehicle and out-of-vehicle travel time) in a transportation

  5. Heterogeneous Causal Effects and Sample Selection Bias

    DEFF Research Database (Denmark)

    Breen, Richard; Choi, Seongsoo; Holm, Anders

    2015-01-01

    The role of education in the process of socioeconomic attainment is a topic of long-standing interest to sociologists and economists. Recently there has been growing interest not only in estimating the average causal effect of education on outcomes such as earnings, but also in estimating how causal effects might vary over individuals or groups. In this paper we point out one of the under-appreciated hazards of seeking to estimate heterogeneous causal effects: conventional selection bias (that is, selection on baseline differences) can easily be mistaken for heterogeneity of causal effects. This might lead us to find heterogeneous effects when the true effect is homogeneous, or to wrongly estimate not only the magnitude but also the sign of heterogeneous effects. We apply a test for the robustness of heterogeneous causal effects in the face of varying degrees and patterns of selection bias...

  6. Repair of Partly Misspecified Causal Diagrams.

    Science.gov (United States)

    Oates, Chris J; Kasza, Jessica; Simpson, Julie A; Forbes, Andrew B

    2017-07-01

    Errors in causal diagrams elicited from experts can lead to the omission of important confounding variables from adjustment sets and render causal inferences invalid. In this report, a novel method is presented that repairs a misspecified causal diagram through the addition of edges. These edges are determined using a data-driven approach designed to provide improved statistical efficiency relative to de novo structure learning methods. Our main assumption is that the expert is "directionally informed," meaning that "false" edges provided by the expert would not create cycles if added to the "true" causal diagram. The overall procedure is cast as a preprocessing technique that is agnostic to subsequent causal inferences. Results based on simulated data and data derived from an observational cohort illustrate the potential for data-assisted elicitation in epidemiologic applications. See video abstract at, http://links.lww.com/EDE/B208.
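    The "directionally informed" assumption can be operationalized as a cycle check: a candidate edge is admissible only if adding it would not close a directed cycle in the elicited diagram. A small sketch of that check (illustrative only, not the authors' full procedure, which also selects edges from data):

```python
from collections import defaultdict

def creates_cycle(edges, new_edge):
    """Return True if adding the directed edge new_edge = (u, v) to the
    graph given by `edges` would create a cycle, i.e. if v already
    reaches u along existing edges."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
    target, start = new_edge  # a cycle arises iff v reaches u
    stack, seen = [start], set()
    while stack:
        node = stack.pop()
        if node == target:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(adj[node])
    return False
```

    For example, with the elicited diagram A -> B -> C, the candidate edge C -> A would be rejected, while A -> C is admissible.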

  7. Separation of organic azeotropic mixtures by pervaporation

    Energy Technology Data Exchange (ETDEWEB)

    Baker, R.W.

    1991-12-01

    Distillation is a commonly used separation technique in the petroleum refining and chemical processing industries. However, there are a number of potential separations involving azeotropic and close-boiling organic mixtures that cannot be separated efficiently by distillation. Pervaporation is a membrane-based process that uses selective permeation through membranes to separate liquid mixtures. Because the separation process is not affected by the relative volatility of the mixture components being separated, pervaporation can be used to separate azeotropes and close-boiling mixtures. Our results showed that pervaporation membranes can be used to separate azeotropic mixtures efficiently, a result that is not achievable with simple distillation. The membranes were 5-10 times more permeable to one of the components of the mixture, concentrating it in the permeate stream. For example, the membrane was 10 times more permeable to ethanol than methyl ethyl ketone, producing 60% ethanol permeate from an azeotropic mixture of ethanol and methyl ethyl ketone containing 18% ethanol. For the ethyl acetate/water mixture, the membranes showed a very high selectivity to water (> 300) and the permeate was 50-100 times enriched in water relative to the feed. The membranes had permeate fluxes on the order of 0.1-1 kg/m²·h in the operating range of 55-70°C. Higher fluxes were obtained by increasing the operating temperature.

  8. Causal ubiquity in quantum physics a superluminal and local-causal physical ontology

    CERN Document Server

    Neelamkavil, Raphael

    2014-01-01

    A fixed highest criterial velocity (of light) in STR (special theory of relativity) is a convention for a layer of physical inquiry. QM (Quantum Mechanics) avoids action-at-a-distance using this concept, but accepts non-causality and action-at-a-distance in EPR (Einstein-Podolsky-Rosen-Paradox) entanglement experiments. Even in such allegedly non-causal processes, something exists processually in extension-motion, between the causal and the non-causal. If STR theoretically allows real-valued superluminal communication between EPR entangled particles, quantum processes become fully causal. That

  9. Causal systems categories: differences in novice and expert categorization of causal phenomena.

    Science.gov (United States)

    Rottman, Benjamin M; Gentner, Dedre; Goldwater, Micah B

    2012-07-01

    We investigated the understanding of causal systems categories--categories defined by common causal structure rather than by common domain content--among college students. We asked students who were either novices or experts in the physical sciences to sort descriptions of real-world phenomena that varied in their causal structure (e.g., negative feedback vs. causal chain) and in their content domain (e.g., economics vs. biology). Our hypothesis was that there would be a shift from domain-based sorting to causal sorting with increasing expertise in the relevant domains. This prediction was borne out: the novice groups sorted primarily by domain and the expert group sorted by causal category. These results suggest that science training facilitates insight about causal structures. Copyright © 2012 Cognitive Science Society, Inc.

  10. Liquids and liquid mixtures

    CERN Document Server

    Rowlinson, J S; Baldwin, J E; Buckingham, A D; Danishefsky, S

    2013-01-01

    Liquids and Liquid Mixtures, Third Edition explores the equilibrium properties of liquids and liquid mixtures and relates them to the properties of the constituent molecules using the methods of statistical thermodynamics. Topics covered include the critical state, fluid mixtures at high pressures, and the statistical thermodynamics of fluids and mixtures. This book consists of eight chapters and begins with an overview of the liquid state and the thermodynamic properties of liquids and liquid mixtures, including vapor pressure and heat capacities. The discussion then turns to the thermodynami

  11. Perfect posterior simulation for mixture and hidden Markov models

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Breyer, Laird A.; Roberts, Gareth O.

    2010-01-01

    In this paper we present an application of the read-once coupling from the past algorithm to problems in Bayesian inference for latent statistical models. We describe a method for perfect simulation from the posterior distribution of the unknown mixture weights in a mixture model. Our method is extended to a more general mixture problem, where unknown parameters exist for the mixture components, and to a hidden Markov model.

  12. A Convex Hull Formulation for the Design of Optimal Mixtures

    OpenAIRE

    Jonuzaj, S; Adjiman, CS

    2016-01-01

    The design of mixtures plays an important role in improving process and product performance but is challenging because it requires finding the optimal number, identities and compositions of mixture components and using nonlinear property models. To address this, a general modeling framework for mixture design problems is presented. It integrates Generalized Disjunctive Programming (GDP) into Computer-Aided Mixture/blend Design via Hull Reformulation (HR). The design methodology is applied suc...

  13. A Simple Test for Causality in Volatility

    Directory of Open Access Journals (Sweden)

    Chia-Lin Chang

    2017-03-01

    An early development in testing for causality (technically, Granger non-causality) in the conditional variance (or volatility) associated with financial returns was the portmanteau statistic for non-causality in the variance of Cheng and Ng (1996). A subsequent development was the Lagrange Multiplier (LM) test of non-causality in the conditional variance by Hafner and Herwartz (2006), who provided simulation results to show that their LM test was more powerful than the portmanteau statistic for sample sizes of 1000 and 4000 observations. While the LM test for causality proposed by Hafner and Herwartz (2006) is an interesting and useful development, it is nonetheless arbitrary. In particular, the specification on which the LM test is based does not rely on an underlying stochastic process, so the alternative hypothesis is also arbitrary, which can affect the power of the test. The purpose of the paper is to derive a simple test for causality in volatility that provides regularity conditions arising from the underlying stochastic process, namely a random coefficient autoregressive process, and a test for which the (quasi-) maximum likelihood estimates have valid asymptotic properties under the null hypothesis of non-causality. The simple test is intuitively appealing as it is based on an underlying stochastic process, is sympathetic to Granger's (1969, 1988) notion of time series predictability, is easy to implement, and has a regularity condition that is not available in the LM test.
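    For intuition, Granger non-causality in volatility can be illustrated by applying the usual regression-based F test to squared returns as a crude volatility proxy. This is a didactic sketch, not the random coefficient autoregressive test derived in the paper:

```python
import numpy as np

def granger_f_stat(x, y):
    """Lag-1 F statistic for the null that x[t-1] does not help predict
    y[t] once y[t-1] is included (Granger non-causality in mean).
    Applied to squared returns, a rough volatility proxy; illustrative only."""
    Y = y[1:]
    Z_r = np.column_stack([np.ones(len(Y)), y[:-1]])          # restricted model
    Z_u = np.column_stack([np.ones(len(Y)), y[:-1], x[:-1]])  # unrestricted model
    rss_r = np.sum((Y - Z_r @ np.linalg.lstsq(Z_r, Y, rcond=None)[0]) ** 2)
    rss_u = np.sum((Y - Z_u @ np.linalg.lstsq(Z_u, Y, rcond=None)[0]) ** 2)
    q = 1             # number of restrictions
    k = Z_u.shape[1]  # parameters in the unrestricted model
    return ((rss_r - rss_u) / q) / (rss_u / (len(Y) - k))
```

    Under the null of non-causality the statistic is approximately F(1, n - 3), so large values reject non-causality; the paper's point is that such tests should be grounded in an explicit stochastic process rather than an ad hoc specification.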

  14. Some properties of explosive mixtures containing peroxides

    Energy Technology Data Exchange (ETDEWEB)

    Zeman, Svatopluk [Institute of Energetic Materials, Faculty of Chemical Technology, University of Pardubice, CZ-532 10 Pardubice (Czech Republic)], E-mail: svatopluk.zeman@upce.cz; Trzcinski, Waldemar A. [Institute of Chemistry, Military University of Technology, PL-00-908 Warsaw 49 (Poland); Matyas, Robert [Institute of Energetic Materials, Faculty of Chemical Technology, University of Pardubice, CZ-532 10 Pardubice (Czech Republic)

    2008-06-15

    This study concerns mixtures of triacetone triperoxide (3,3,6,6,9,9-hexamethyl-1,2,4,5,7,8-hexoxonane, TATP) and ammonium nitrate (AN) with added water (W), as the case may be, and dry mixtures of TATP with urea nitrate (UN). Relative performances (RP) of the mixtures and their individual components, relative to TNT, were determined by means of a ballistic mortar. The detonation energies, E_0, and detonation velocities, D, were calculated for the mixtures studied by means of the thermodynamic code CHEETAH. Relationships have been found and are discussed between the RP and the E_0 values related to unit volume of gaseous products of detonation of these mixtures. These relationships, together with those between RP and oxygen balance values of the mixtures studied, indicate different types of participation of AN and UN in the explosive decomposition of the respective mixtures. Dry TATP/UN mixtures exhibit lower RP than analogous TATP/AN mixtures containing up to 25% of water. Depending on the water content, the TATP/AN mixtures possess higher detonability values than ANFO explosives. A semi-logarithmic relationship between the D values and oxygen coefficients has been derived for all the mixtures studied at the charge density of 1000 kg m⁻³. Among the mixtures studied, this relationship distinguishes several samples of the type of 'tertiary explosives' as well as samples that approach 'high explosives' in their performances and detonation velocities.

  15. On the origin of Hill's causal criteria.

    Science.gov (United States)

    Morabia, A

    1991-09-01

    The rules to assess causation formulated by the eighteenth century Scottish philosopher David Hume are compared to Sir Austin Bradford Hill's causal criteria. The strength of the analogy between Hume's rules and Hill's causal criteria suggests that, irrespective of whether Hume's work was known to Hill or Hill's predecessors, Hume's thinking expresses a point of view still widely shared by contemporary epidemiologists. The lack of systematic experimental proof to causal inferences in epidemiology may explain the analogy of Hume's and Hill's, as opposed to Popper's, logic.

  16. Causality and Time in Historical Institutionalism

    DEFF Research Database (Denmark)

    Mahoney, James; Mohamedali, Khairunnisa; Nguyen, Christoph

    2016-01-01

    This chapter explores the dual concern with causality and time in historical institutionalism using a graphical approach. The analysis focuses on three concepts that are central to this field: critical junctures, gradual change, and path dependence. The analysis makes explicit and formal the logic underlying studies that use these "causal-temporal" concepts. The chapter shows visually how causality and temporality are linked to one another in varying ways depending on the particular pattern of change. The chapter provides new tools for describing and understanding change in historical-institutional...

  17. Dual Causality and the Autonomy of Biology.

    Science.gov (United States)

    Bock, Walter J

    2017-03-01

    Ernst Mayr's concept of dual causality in biology with the two forms of causes (proximate and ultimate) continues to provide an essential foundation for the philosophy of biology. They are equivalent to functional (=proximate) and evolutionary (=ultimate) causes with both required for full biological explanations. The natural sciences can be classified into nomological, historical nomological and historical dual causality, the last including only biology. Because evolutionary causality is unique to biology and must be included for all complete biological explanations, biology is autonomous from the physical sciences.

  18. Tutorial for mixture-process experiments with an industrial application

    Directory of Open Access Journals (Sweden)

    Luiz Henrique Abreu Dal Bello

    2011-12-01

    This article presents a tutorial on mixture-process experiments and a case study of a chemical compound used in the delay mechanism for starting a rocket engine. The compound consists of a three-component mixture. Besides the mixture components, two process variables are considered. For model selection, the use of an information criterion proved to be efficient in the case under study. A linear regression model was fitted. Through the developed model, the optimal proportions of the mixture components and the levels of the process variables were determined.
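    Mixture regression models of this kind are typically Scheffé polynomials fitted without an intercept, since the component proportions sum to one. A minimal sketch of fitting the quadratic form by least squares (a generic illustration under that assumption, not the article's actual model, which also includes process variables):

```python
import numpy as np

def fit_scheffe_quadratic(props, response):
    """Least-squares fit of the Scheffe quadratic mixture model
        E[y] = sum_i b_i x_i + sum_{i<j} b_ij x_i x_j,
    where each row of `props` holds component proportions summing to 1.
    No intercept is included because the proportions are constrained."""
    q = props.shape[1]
    cols = [props[:, i] for i in range(q)]                  # linear blending terms
    cols += [props[:, i] * props[:, j]                      # binary interaction terms
             for i in range(q) for j in range(i + 1, q)]
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, response, rcond=None)
    return beta
```

    The fitted coefficients can then be optimized over the simplex to find the best-performing blend.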

  19. Different mathematical processing of absorption, ratio and derivative spectra for quantification of mixtures containing minor component: An application to the analysis of the recently co-formulated antidiabetic drugs; canagliflozin and metformin

    Science.gov (United States)

    Lotfy, Hayam M.; Mohamed, Dalia; Elshahed, Mona S.

    2018-01-01

    In the presented work several spectrophotometric methods were performed for the quantification of canagliflozin (CGZ) and metformin hydrochloride (MTF) simultaneously in their binary mixture. Two of these methods, response correlation (RC) and advanced balance point-spectrum subtraction (ABP-SS), were developed and introduced for the first time in this work, where the latter method (ABP-SS) was performed on both the zero-order and the first-derivative spectra of the drugs. Besides, two recently established methods, advanced amplitude modulation (AAM) and advanced absorbance subtraction (AAS), were also accomplished. All the proposed methods were validated in accordance with the ICH guidelines, where all methods were proved to be accurate and precise. Additionally, the linearity range, limit of detection and limit of quantification were determined, and the selectivity was examined through the analysis of laboratory-prepared mixtures and the combined dosage form of the drugs. The proposed methods were capable of determining the two drugs in the ratio present in the pharmaceutical formulation CGZ:MTF (1:17) without the requirement of any preliminary separation, further dilution or standard spiking. The results obtained by the proposed methods were in compliance with the reported chromatographic method when compared statistically, proving the absence of any significant difference in accuracy and precision between the proposed and reported methods.

  20. Model structure selection in convolutive mixtures

    DEFF Research Database (Denmark)

    Dyrholm, Mads; Makeig, S.; Hansen, Lars Kai

    2006-01-01

    The CICAAR algorithm (convolutive independent component analysis with an auto-regressive inverse model) allows separation of white (i.i.d.) source signals from convolutive mixtures. We introduce a source color model as a simple extension to the CICAAR which allows for a more parsimonious representation in many practical mixtures. The new filter-CICAAR allows Bayesian model selection and can help answer questions like: 'Are we actually dealing with a convolutive mixture?'. We try to answer this question for EEG data.

  1. Neurotoxicity of Metal Mixtures.

    Science.gov (United States)

    Andrade, V M; Aschner, M; Marreilha Dos Santos, A P

    2017-01-01

    Metals are the oldest toxins known to humans. Metals differ from other toxic substances in that they are neither created nor destroyed by humans (Casarett and Doull's Toxicology: the basic science of poisons, 8th edn. McGraw-Hill, London, 2013). Metals are of great importance in our daily life, and their frequent use makes them omnipresent and a constant source of human exposure. Metals such as arsenic [As], lead [Pb], mercury [Hg], aluminum [Al] and cadmium [Cd] do not have any specific role in an organism and can be toxic even at low levels. The Substance Priority List of the Agency for Toxic Substances and Disease Registry (ATSDR) ranked substances based on a combination of their frequency, toxicity, and potential for human exposure. In this list, As, Pb, Hg, and Cd occupy the first, second, third, and seventh positions, respectively (ATSDR, Priority list of hazardous substances. U.S. Department of Health and Human Services, Public Health Service, Atlanta, 2016). Besides existing individually, these metals are also (or mainly) found as mixtures in various parts of the ecosystem (Cobbina SJ, Chen Y, Zhou Z, Wux X, Feng W, Wang W, Mao G, Xu H, Zhang Z, Wua X, Yang L, Chemosphere 132:79-86, 2015). Interactions among components of a mixture may change toxicokinetics and toxicodynamics (Spurgeon DJ, Jones OAH, Dorne J-L, Svendsen C, Swain S, Stürzenbaum SR, Sci Total Environ 408:3725-3734, 2010) and may result in greater (synergistic) toxicity (Lister LJ, Svendsen C, Wright J, Hooper HL, Spurgeon DJ, Environ Int 37:663-670, 2011). This is particularly worrisome when the components of the mixture individually attack the same organs. On the other hand, metals such as manganese [Mn], iron [Fe], copper [Cu], and zinc [Zn] are essential metals, and their presence in the body below or above homeostatic levels can also lead to disease states (Annangi B, Bonassi S, Marcos R, Hernández A, Mutat Res 770(Pt A):140-161, 2016). Pb, As, Cd, and Hg can induce Fe, Cu, and Zn

  2. Violation of causality in f(T) gravity

    Energy Technology Data Exchange (ETDEWEB)

    Otalora, G. [Pontificia Universidad Catolica de Valparaiso, Instituto de Fisica, Valparaiso (Chile); Reboucas, M.J. [Centro Brasileiro de Pesquisas Fisicas, Rio de Janeiro, RJ (Brazil)

    2017-11-15

    In the standard formulation, the f(T) field equations are not invariant under local Lorentz transformations, and thus the theory does not inherit the causal structure of special relativity. Actually, even local violation of causality can occur in this formulation of f(T) gravity. A locally Lorentz covariant f(T) gravity theory has been devised recently, and this local causality problem seems to have been overcome. The non-locality question, however, is left open. If gravitation is to be described by this covariant f(T) gravity theory, there are a number of issues that ought to be examined in its context, including the question as to whether its field equations allow homogeneous Goedel-type solutions, which necessarily leads to violation of causality on a non-local scale. Here, to look into the potentialities and difficulties of the covariant f(T) theories, we examine whether they admit Goedel-type solutions. We take a combination of a perfect fluid with an electromagnetic field plus a scalar field as source, and determine a general Goedel-type solution, which contains special solutions in which the essential parameter of Goedel-type geometries, m^2, defines any class of homogeneous Goedel-type geometries. We show that solutions of the trigonometric and linear classes (m^2 < 0 and m = 0) are permitted only for the combined matter sources with an electromagnetic field matter component. We extended to the context of covariant f(T) gravity a theorem which ensures that any perfect-fluid homogeneous Goedel-type solution defines the same set of Goedel tetrads h_A^μ up to a Lorentz transformation. We also showed that the single massless scalar field generates a Goedel-type solution with no closed time-like curves. Even though the covariant f(T) gravity restores Lorentz covariance of the field equations and the local validity of the causality principle, the bare existence of the Goedel-type solutions makes apparent that the covariant formulation of f(T) gravity

  3. Coking technology using packed coal mixtures

    Energy Technology Data Exchange (ETDEWEB)

    Kuznichenko, V.M.; Shteinberg, Eh.A.; Tolstoi, A.P. (Khar' kovskii Nauchno-Issledovatel' skii Uglekhimicheskii Institut, Kharkov (Ukrainian SSR))

    1991-08-01

    Discusses coking of packed coal charges in the FRG, USSR, France, India, Poland and Czechoslovakia. The following aspects are evaluated: types of weakly caking coals that are used as components of packed mixtures, energy consumption of packing, effects of coal mixture packing on coke oven design, number of coke ovens in a battery, heating temperature, coking time, coke properties, investment and operating cost. Statistical data that characterize the Saarberg packing process used in the FRG are analyzed. Packing coal mixtures for coking improves coke quality and reduces environmental pollution. 4 refs.

  4. Selecting appropriate cases when tracing causal mechanisms

    DEFF Research Database (Denmark)

    Beach, Derek; Pedersen, Rasmus Brun

    2016-01-01

    The last decade has witnessed a resurgence of interest in studying the causal mechanisms linking causes and outcomes in the social sciences. This article explores the overlooked implications for case selection when tracing mechanisms using in-depth case studies. Our argument is that existing case selection guidelines are appropriate for research aimed at making cross-case claims about causal relationships, where case selection is primarily used to control for other causes. However, existing guidelines are not in alignment with case-based research that aims to trace mechanisms, where the goal is to unpack the causal mechanism between X and Y, enabling causal inferences to be made because empirical evidence is provided for how the mechanism actually operated in a particular case. The in-depth, within-case tracing of how mechanisms operate in particular cases produces what can be termed mechanistic...

  5. Causality Between Urban Concentration and Environmental Quality

    Directory of Open Access Journals (Sweden)

    Amin Pujiati

    2015-08-01

    A population concentrated in urban areas can cause external diseconomies for the environment if it exceeds the carrying capacity of the space and the urban economy. Conversely, the better the quality of the environment, the higher the concentration of population in urban areas becomes. This study aims to analyze the causal relationship between urban concentration and environmental quality in urban agglomeration areas. The secondary data used in the study were obtained from the Central Bureau of Statistics and the City Government for 2000 to 2013. The analytical methods used are Granger causality and descriptive analysis. The Granger causality results showed no reciprocal causality between urban concentration and environmental quality, but a unidirectional relationship from urban concentration to environmental quality. This means that increasing urban concentration leads to decreased environmental quality.

  6. Risk and causality in newspaper reporting.

    Science.gov (United States)

    Boholm, Max

    2009-11-01

    The study addresses the textual representation of risk and causality in news media reporting. The analytical framework combines two theoretical perspectives: media frame analysis and the philosophy of causality. Empirical data derive from selected newspaper articles on risks in the Göta älv river valley in southwest Sweden from 1994 to 2007. News media content was coded and analyzed with respect to causal explanations of risk issues. At the level of individual articles, this study finds that the media provide simple causal explanations of risks such as water pollution, landslides, and flooding. Furthermore, these explanations are constructed, or framed, in various ways, the same risk being attributed to different causes in different articles. However, the study demonstrates that a fairly complex picture of risks in the media emerges when extensive material is analyzed systematically.

  7. Rate-Agnostic (Causal) Structure Learning.

    Science.gov (United States)

    Plis, Sergey; Danks, David; Freeman, Cynthia; Calhoun, Vince

    2015-12-01

    Causal structure learning from time series data is a major scientific challenge. Extant algorithms assume that measurements occur sufficiently quickly; more precisely, they assume approximately equal system and measurement timescales. In many domains, however, measurements occur at a significantly slower rate than the underlying system changes, but the size of the timescale mismatch is often unknown. This paper develops three causal structure learning algorithms, each of which discovers all dynamic causal graphs that explain the observed measurement data, perhaps given undersampling. That is, these algorithms all learn causal structure in a "rate-agnostic" manner: they do not assume any particular relation between the measurement and system timescales. We apply these algorithms to data from simulations to gain insight into the challenge of undersampling.
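
The timescale-mismatch problem described above can be made concrete with a toy linear system (an illustrative example, not the paper's algorithms): if the true dynamics form a chain 0 -> 1 -> 2 and we observe only every second sample, the effective transition matrix is the square of the true one, which contains an apparent direct edge from node 0 to node 2.

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Hypothetical linear system x_{t+1} = A @ x_t whose only edges are 0 -> 1 -> 2
A = [[0.9, 0.0, 0.0],
     [0.5, 0.9, 0.0],
     [0.0, 0.5, 0.9]]

# Observing only every second sample yields the effective dynamics A @ A
A2 = matmul(A, A)
# A2[2][0] is nonzero: node 0 now appears to drive node 2 directly
```

A rate-agnostic learner must treat both A and A2 (and all higher undersampling rates) as candidate explanations of the measured data.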

  8. Causales de ausencia de responsabilidad penal

    Directory of Open Access Journals (Sweden)

    Jaime Sandoval Fernández

    2003-01-01

Full Text Available This paper deals with the grounds for exclusion of criminal liability, especially those that operate on the wrongfulness of the act. As subtopics, it delimits the concept of criminal liability and its absence. It studies the main theories on the relationship between typicity and unlawfulness and their bearing on Colombian criminal law. Finally, it contains a proposal on how the grounds of article 32 of the Penal Code should be grouped.

  9. Exact Fit of Simple Finite Mixture Models

    Directory of Open Access Journals (Sweden)

    Dirk Tasche

    2014-11-01

Full Text Available How to forecast next year's portfolio-wide credit default rate based on last year's default observations and the current score distribution? A classical approach to this problem consists of fitting a mixture of the conditional score distributions observed last year to the current score distribution. This is a special (simple) case of a finite mixture model where the mixture components are fixed and only the weights of the components are estimated. The optimum weights provide a forecast of next year's portfolio-wide default rate. We point out that the maximum-likelihood (ML) approach to fitting the mixture distribution not only gives an optimum but even an exact fit if we allow the mixture components to vary but keep their density ratio fixed. From this observation we can conclude that the standard default rate forecast based on last year's conditional default rates will always be located between last year's portfolio-wide default rate and the ML forecast for next year. As an application example, cost quantification is then discussed. We also discuss how the mixture model based estimation methods can be used to forecast total loss. This involves the reinterpretation of an individual classification problem as a collective quantification problem.
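
A minimal sketch of the fixed-components mixture fit described above, under illustrative assumptions (two Gaussian score densities, where the paper works with arbitrary conditional score distributions): with the component densities held fixed, the ML weight can be found by iterating the EM update for the weights alone.

```python
import math
import random

def normal_pdf(v, mu, sigma=1.0):
    return math.exp(-0.5 * ((v - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def fit_weight(data, f1, f2, iters=200):
    """EM updates for the weight of component 1; the densities stay fixed."""
    w = 0.5
    for _ in range(iters):
        resp = [w * f1(v) / (w * f1(v) + (1 - w) * f2(v)) for v in data]
        w = sum(resp) / len(resp)           # M-step: weight = mean responsibility
    return w

random.seed(0)
# 30% "defaulter" scores ~ N(0, 1), 70% "survivor" scores ~ N(3, 1) -- illustrative
data = [random.gauss(0, 1) if random.random() < 0.3 else random.gauss(3, 1)
        for _ in range(5000)]
w_hat = fit_weight(data, lambda v: normal_pdf(v, 0.0), lambda v: normal_pdf(v, 3.0))
```

The recovered weight plays the role of the forecast portfolio-wide default rate in the abstract's setting.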

  10. Kant on causal laws and powers.

    Science.gov (United States)

    Henschen, Tobias

    2014-12-01

    The aim of the paper is threefold. Its first aim is to defend Eric Watkins's claim that for Kant, a cause is not an event but a causal power: a power that is borne by a substance, and that, when active, brings about its effect, i.e. a change of the states of another substance, by generating a continuous flow of intermediate states of that substance. The second aim of the paper is to argue against Watkins that the Kantian concept of causal power is not the pre-critical concept of real ground but the category of causality, and that Kant holds with Hume that causal laws cannot be inferred non-inductively (that he accordingly has no intention to show in the Second analogy or elsewhere that events fall under causal laws). The third aim of the paper is to compare the Kantian position on causality with central tenets of contemporary powers ontology: it argues that unlike the variants endorsed by contemporary powers theorists, the Kantian variants of these tenets are resistant to objections that neo-Humeans raise to these tenets.

  11. Causal reasoning and models of cognitive tasks for naval nuclear power plant operators; Raisonnement causal et modelisation de l`activite cognitive d`operateurs de chaufferie nucleaire navale

    Energy Technology Data Exchange (ETDEWEB)

    Salazar-Ferrer, P.

    1995-06-01

In complex industrial process control, causal reasoning appears as a major component of operators' cognitive tasks. It is tightly linked to diagnosis, prediction of normal and failure states, and explanation. This work provides a detailed review of the literature on causal reasoning. A synthesis is proposed as a model of causal reasoning in process control. This model integrates distinct approaches in cognitive science: especially qualitative physics, Bayesian networks, knowledge-based systems, and cognitive psychology. Our model defines a framework for the analysis of causal human errors in simulated naval nuclear power plant fault management. Through the methodological framework of critical incident analysis we define a classification of errors and difficulties linked to causal reasoning. This classification is based on shallow characteristics of causal reasoning. More elementary component activities in causal reasoning are identified as the origin of these errors. The applications cover the fields of functional specification for man-machine interfaces, operator support system design, and nuclear safety. In addition, we integrate the model of causal reasoning into a model of cognitive tasks in process control. (authors). 106 refs., 49 figs., 8 tabs.

  12. Entanglement entropy in causal set theory

    Science.gov (United States)

    Sorkin, Rafael D.; Yazdi, Yasaman K.

    2018-04-01

Entanglement entropy is now widely accepted as having deep connections with quantum gravity. It is therefore desirable to understand it in the context of causal sets, especially since they provide in a natural manner the UV cutoff needed to render entanglement entropy finite. Formulating a notion of entanglement entropy in a causal set is not straightforward because the type of canonical hypersurface-data on which its definition typically relies is not available. Instead, we appeal to the more global expression given in Sorkin (2012, arXiv:1205.2953) which, for a Gaussian scalar field, expresses the entropy of a spacetime region in terms of the field's correlation function within that region (its 'Wightman function' W(x, x')). Carrying this formula over to the causal set, one obtains an entropy which is both finite and of a Lorentz invariant nature. We evaluate this global entropy-expression numerically for certain regions (primarily order-intervals or 'causal diamonds') within causal sets of 1 + 1 dimensions. For the causal-set counterpart of the entanglement entropy, we obtain, in the first instance, a result that follows a (spacetime) volume law instead of the expected (spatial) area law. We find, however, that one obtains an area law if one truncates the commutator function ('Pauli–Jordan operator') and the Wightman function by projecting out the eigenmodes of the Pauli–Jordan operator whose eigenvalues are too close to zero according to a geometrical criterion which we describe more fully below. In connection with these results and the questions they raise, we also study the 'entropy of coarse-graining' generated by thinning out the causal set, and we compare it with what one obtains by similarly thinning out a chain of harmonic oscillators, finding the same, 'universal' behaviour in both cases.

  13. Preschoolers prefer to learn causal information

    Directory of Open Access Journals (Sweden)

    Aubry Alvarez

    2015-02-01

Full Text Available Young children, in general, appear to have a strong drive to explore the environment in ways that reveal its underlying causal structure. But are they really attuned specifically to causal information in this quest for understanding, or do they show equal interest in other types of non-obvious information about the world? To answer this question, we introduced 20 three-year-old children to two puppets who were anxious to tell the child about a set of novel artifacts and animals. One puppet consistently described causal properties of the items while the other puppet consistently described carefully matched non-causal properties of the same items. After a familiarization period in which children learned which type of information to expect from each informant, children were given the opportunity to choose which they wanted to hear describe each of eight pictured test items. On average, children chose to hear from the informant that provided causal descriptions on 72% of the trials. This preference for causal information has important implications for explaining the role of conceptual information in supporting early learning and may suggest means for maximizing interest and motivation in young children.

  14. Psychiatric comorbidity and causal disease models.

    Science.gov (United States)

    van Loo, Hanna M; Romeijn, Jan-Willem; de Jonge, Peter; Schoevers, Robert A

    2013-12-01

In psychiatry, comorbidity is the rule rather than the exception. Up to 45% of all patients are classified as having more than one psychiatric disorder. These high rates of comorbidity have led to a debate concerning the interpretation of this phenomenon. Some authors emphasize the problematic character of the high rates of comorbidity because they indicate the absence of zones of rarity between disorders. Others consider comorbid conditions to be a validator for a particular reclassification of diseases. In this paper we will show that these at first sight contrasting interpretations of comorbidity are based on similar assumptions about disease models. The underlying ideas are, first, that high rates of comorbidity are the result of the absence of causally defined diseases in psychiatry, and second, that causal disease models are preferable to non-causal disease models. We will argue that there are good reasons to seek causal understanding of psychiatric disorders, but that causal disease models will not rule out high rates of comorbidity, neither in psychiatry nor in medicine in general. By bringing these underlying assumptions to the fore, we hope to clear the ground for a different understanding of comorbidity, and of models for psychiatric diseases. Copyright © 2012 Elsevier Inc. All rights reserved.

  15. The Relevance of Causal Social Construction

    Directory of Open Access Journals (Sweden)

    Marques Teresa

    2017-02-01

Full Text Available Social constructionist claims are surprising and interesting when they entail that presumably natural kinds are in fact socially constructed. The claims are interesting because of their theoretical and political importance. Authors like Díaz-León argue that constitutive social construction is more relevant for achieving social justice than causal social construction. This paper challenges this claim. Assuming there are socially salient groups that are discriminated against, the paper presents a dilemma: if there were no constitutively constructed social kinds, the causes of the discrimination of existing social groups would have to be addressed, and understanding causal social construction would be relevant to achieve social justice. On the other hand, not all possible constitutively socially constructed kinds are actual social kinds. If an existing social group is constitutively constructed as a social kind K, the fact that it actually exists as a K has social causes. Again, causal social construction is relevant. The paper argues that (i) for any actual social kind X, if X is constitutively socially constructed as K, then it is also causally socially constructed; and (ii) causal social construction is at least as relevant as constitutive social construction for concerns of social justice. For illustration, I draw upon two phenomena that are presumed to contribute towards the discrimination of women: (i) the poor performance effects of stereotype threat, and (ii) the silencing effects of gendered language use.

  16. 7 CFR 201.12a - Lawn and turf seed mixtures.

    Science.gov (United States)

    2010-01-01

    7 CFR 201.12a (2010), Labeling Agricultural Seeds: Seed mixtures intended for lawn and turf purposes shall be designated as a mixture on the label and each seed component shall be...

  17. Riemann solvers for multi-component gas mixtures with temperature dependent heat capacities; Solveurs de riemann pour des melanges de gaz parfaits avec capacites calorifiques dependant de la temperature

    Energy Technology Data Exchange (ETDEWEB)

    Beccantini, A

    2001-07-01

This thesis represents a contribution to the development of upwind splitting schemes for the Euler equations for ideal gaseous mixtures and their investigation in computing multidimensional flows in irregular geometries. In the preliminary part we develop and investigate the parameterization of the shock and rarefaction curves in the phase space. Then, we apply them to perform some field-by-field decompositions of the Riemann problem: the entropy-respecting one, the one which supposes that genuinely-non-linear (GNL) waves are both shocks (shock-shock one) and the one which supposes that GNL waves are both rarefactions (rarefaction-rarefaction one). We emphasize that their analysis is fundamental in Riemann solver development: the simpler the field-by-field decomposition, the simpler the Riemann solver based on it. As the specific heat capacities of the gases depend on the temperature, the shock-shock field-by-field decomposition is the easiest to perform. Then, in the second part of the thesis, we develop an upwind splitting scheme based on such a decomposition. Afterwards, we investigate its robustness, precision and CPU-time consumption, with respect to some of the most popular upwind splitting schemes for polytropic/non-polytropic ideal gases. 1-D test-cases show that this scheme is both precise (exact capturing of stationary shocks and stationary contacts) and robust in dealing with strong shock and rarefaction waves. Multidimensional test-cases show that it suffers from some of the typical deficiencies which affect the upwind splitting schemes capable of exactly capturing stationary contact discontinuities, i.e. the development of non-physical instabilities in computing strong shock waves. In the final part, we use the high-order multidimensional solver here developed to compute fully-developed detonation flows. (author)
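
As a hedged illustration of the kind of upwind flux such schemes are built from (not the thesis's solver, and with a constant heat capacity rather than the temperature-dependent capacities it treats), here is an HLL approximate Riemann flux for the 1-D Euler equations of a polytropic ideal gas:

```python
import math

GAMMA = 1.4  # constant-gamma polytropic gas; the thesis treats T-dependent capacities

def flux(U):
    """Physical flux of the 1-D Euler equations for state U = (rho, rho*u, E)."""
    rho, mom, E = U
    u = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * u * u)
    return [mom, mom * u + p, (E + p) * u]

def hll_flux(UL, UR):
    """HLL approximate Riemann solver flux between left and right states."""
    def speed(U):
        rho, mom, E = U
        u = mom / rho
        p = (GAMMA - 1.0) * (E - 0.5 * rho * u * u)
        return u, math.sqrt(GAMMA * p / rho)
    uL, aL = speed(UL)
    uR, aR = speed(UR)
    sL = min(uL - aL, uR - aR)              # simple wave-speed estimates
    sR = max(uL + aL, uR + aR)
    FL, FR = flux(UL), flux(UR)
    if sL >= 0.0:
        return FL                           # flow supersonic to the right
    if sR <= 0.0:
        return FR                           # flow supersonic to the left
    return [(sR * fl - sL * fr + sL * sR * (ur - ul)) / (sR - sL)
            for fl, fr, ul, ur in zip(FL, FR, UL, UR)]
```

A minimal consistency check: for identical left and right states the HLL flux reduces to the physical flux.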

  18. Perception of trigeminal mixtures.

    Science.gov (United States)

    Filiou, Renée-Pier; Lepore, Franco; Bryant, Bruce; Lundström, Johan N; Frasnelli, Johannes

    2015-01-01

    The trigeminal system is a chemical sense allowing for the perception of chemosensory information in our environment. However, contrary to smell and taste, we lack a thorough understanding of the trigeminal processing of mixtures. We, therefore, investigated trigeminal perception using mixtures of 3 relatively receptor-specific agonists together with one control odor in different proportions to determine basic perceptual dimensions of trigeminal perception. We found that 4 main dimensions were linked to trigeminal perception: sensations of intensity, warmth, coldness, and pain. We subsequently investigated perception of binary mixtures of trigeminal stimuli by means of these 4 perceptual dimensions using different concentrations of a cooling stimulus (eucalyptol) mixed with a stimulus that evokes warmth perception (cinnamaldehyde). To determine if sensory interactions are mainly of central or peripheral origin, we presented stimuli in a physical "mixture" or as a "combination" presented separately to individual nostrils. Results showed that mixtures generally yielded higher ratings than combinations on the trigeminal dimensions "intensity," "warm," and "painful," whereas combinations yielded higher ratings than mixtures on the trigeminal dimension "cold." These results suggest dimension-specific interactions in the perception of trigeminal mixtures, which may be explained by particular interactions that may take place on peripheral or central levels. © The Author 2014. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  19. Causal binding of actions to their effects.

    Science.gov (United States)

    Buehner, Marc J; Humphreys, Gruffydd R

    2009-10-01

    According to widely held views in cognitive science harking back to David Hume, causality cannot be perceived directly, but instead is inferred from patterns of sensory experience, and the quality of these inferences is determined by perceivable quantities such as contingency and contiguity. We report results that suggest a reversal of Hume's conjecture: People's sense of time is warped by the experience of causality. In a stimulus-anticipation task, participants' response behavior reflected a shortened experience of time in the case of target stimuli participants themselves had generated, relative to equidistant, equally predictable stimuli they had not caused. These findings suggest that causality in the mind leads to temporal binding of cause and effect, and extend and generalize beyond earlier claims of intentional binding between action and outcome.

  20. Normalizing the causality between time series

    Science.gov (United States)

    Liang, X. San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.
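
Liang's information-flow formula is concise enough to sketch directly (a minimal illustration: the AR model and its coefficients below are assumptions, and the significance testing and the normalization discussed in the abstract are omitted):

```python
import random

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / (len(a) - 1)

def liang_flow(x1, x2, dt=1.0):
    """ML estimator of Liang's information flow from series x2 into series x1."""
    d1 = [(x1[i + 1] - x1[i]) / dt for i in range(len(x1) - 1)]  # Euler derivative
    a, b = x1[:-1], x2[:-1]
    c11, c22, c12 = cov(a, a), cov(b, b), cov(a, b)
    c1d1, c2d1 = cov(a, d1), cov(b, d1)
    return (c11 * c12 * c2d1 - c12 ** 2 * c1d1) / (c11 ** 2 * c22 - c11 * c12 ** 2)

random.seed(2)
x, y = [0.0], [0.0]
for _ in range(20000):                      # y drives x; x never feeds back into y
    x_new = 0.5 * x[-1] + 0.5 * y[-1] + random.gauss(0, 1)
    y_new = 0.5 * y[-1] + random.gauss(0, 1)
    x.append(x_new)
    y.append(y_new)

t_yx = liang_flow(x, y)  # flow y -> x: clearly nonzero
t_xy = liang_flow(y, x)  # flow x -> y: near zero
```

The one-way asymmetry recovered here mirrors the IBM-to-GE finding described in the abstract.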

  1. Causal inheritance in plane wave quotients

    International Nuclear Information System (INIS)

    Hubeny, Veronika E.; Rangamani, Mukund; Ross, Simon F.

    2003-01-01

    We investigate the appearance of closed timelike curves in quotients of plane waves along spacelike isometries. First we formulate a necessary and sufficient condition for a quotient of a general spacetime to preserve stable causality. We explicitly show that the plane waves are stably causal; in passing, we observe that some pp-waves are not even distinguishing. We then consider the classification of all quotients of the maximally supersymmetric ten-dimensional plane wave under a spacelike isometry, and show that the quotient will lead to closed timelike curves iff the isometry involves a translation along the u direction. The appearance of these closed timelike curves is thus connected to the special properties of the light cones in plane wave spacetimes. We show that all other quotients preserve stable causality

  2. Spatial hypersurfaces in causal set cosmology

    International Nuclear Information System (INIS)

    Major, Seth A; Rideout, David; Surya, Sumati

    2006-01-01

    Within the causal set approach to quantum gravity, a discrete analogue of a spacelike region is a set of unrelated elements, or an antichain. In the continuum approximation of the theory, a moment-of-time hypersurface is well represented by an inextendible antichain. We construct a richer structure corresponding to a thickening of this antichain containing non-trivial geometric and topological information. We find that covariant observables can be associated with such thickened antichains and transitions between them, in classical sequential growth models of causal sets. This construction highlights the difference between the covariant measure on causal set cosmology and the standard sum-over-histories approach: the measure is assigned to completed histories rather than to histories on a restricted spacetime region. The resulting re-phrasing of the sum-over-histories may be fruitful in other approaches to quantum gravity

  3. Composition of nonflowing impregnating cable mixture

    Energy Technology Data Exchange (ETDEWEB)

    Dimitrov, D.M.; Andreyev, V.G.; Koralski, G.I.; Petkova, N.; Velikova, D.G.

    1979-08-10

    A composition of a nonflowing electrical insulation cable mixture is patented which is based on petroleum oil, colophony, polyolefinic wax, polyethylene and ceresine containing 1-20 percent per mixture of butadiene-styrol latex. Example (parts by weight). Cable impregnating oil has the following composition: colophony 5, petroleum oil 7.7, butadiene-styrol latex 5.0, polyolefin wax 10, low pressure polyethylene 3. In order to obtain a mixture latex is added to petroleum oil heated to 80 degrees, the mixture is heated to 110 degrees, and then to 130 degrees and the other components are added as it is vigorously mixed. The oil obtained features the following properties: drop point 110 degrees, penetration at 25 degrees 142, specific volumetric resistance at 80 degrees 1.485 x 10 13, angle of dielectric losses 0.0049, dielectric strength at 20 degrees 240 kW/cm.

  4. How causal analysis can reveal autonomy in models of biological systems

    Science.gov (United States)

    Marshall, William; Kim, Hyunju; Walker, Sara I.; Tononi, Giulio; Albantakis, Larissa

    2017-11-01

Standard techniques for studying biological systems largely focus on their dynamical or, more recently, their informational properties, usually taking either a reductionist or holistic perspective. Yet, studying only individual system elements or the dynamics of the system as a whole disregards the organizational structure of the system: whether there are subsets of elements with joint causes or effects, and whether the system is strongly integrated or composed of several loosely interacting components. Integrated information theory offers a theoretical framework to (1) investigate the compositional cause-effect structure of a system and to (2) identify causal borders of highly integrated elements comprising local maxima of intrinsic cause-effect power. Here we apply this comprehensive causal analysis to a Boolean network model of the fission yeast (Schizosaccharomyces pombe) cell cycle. We demonstrate that this biological model features a non-trivial causal architecture, whose discovery may provide insights about the real cell cycle that could not be gained from holistic or reductionist approaches. We also show how some specific properties of this underlying causal architecture relate to the biological notion of autonomy. Ultimately, we suggest that analysing the causal organization of a system, including key features like intrinsic control and stable causal borders, should prove relevant for distinguishing life from non-life, and thus could also illuminate the origin of life problem. This article is part of the themed issue 'Reconceptualizing the origins of life'.
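
The object of study, a Boolean network, can be sketched with a toy three-node example (a hypothetical negative-feedback loop, not the S. pombe cell-cycle model itself); the causal analysis in the paper operates on exactly this kind of deterministic update rule and its attractor structure:

```python
from itertools import product

# Hypothetical three-node Boolean network (NOT the S. pombe model):
# A activates B, B activates C, C inhibits A -- a negative-feedback loop
def step(state):
    a, b, c = state
    return (int(not c), a, b)

def attractor(state):
    """Iterate until a state repeats; return the cycle that is entered."""
    seen = []
    while state not in seen:
        seen.append(state)
        state = step(state)
    return tuple(seen[seen.index(state):])

def canon(cycle):
    """Rotation-invariant representative, so phases of one cycle compare equal."""
    return min(cycle[i:] + cycle[:i] for i in range(len(cycle)))

# Map every one of the 2^3 states to its attractor
cycles = {canon(attractor(s)) for s in product((0, 1), repeat=3)}
```

This toy network has two limit cycles (of lengths 6 and 2); a cause-effect analysis asks how such global dynamics decompose into the causal contributions of individual nodes and subsets.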

  5. SEPARATION OF FLUID MIXTURES

    Science.gov (United States)

    Lipscomb, R.; Craig, A.; Labrow, S.; Dunn, J.F.

    1958-10-28

An apparatus is presented for separating gaseous mixtures by selectively freezing a constituent of the mixture and subsequently separating the frozen gas. The gas mixture is passed through a cylinder fitted with a cooling jacket, causing one gas to freeze on the walls of the cylinder. A set of scraper blades is provided in the interior of the cylinder, and as the blades oscillate, the frozen gas is scraped to the bottom of the cylinder. Means are provided for the frozen material to pass into a heating chamber where it is vaporized and the product gas collected.

  6. Morse theory on timelike and causal curves

    International Nuclear Information System (INIS)

    Everson, J.; Talbot, C.J.

    1976-01-01

    It is shown that the set of timelike curves in a globally hyperbolic space-time manifold can be given the structure of a Hilbert manifold under a suitable definition of 'timelike.' The causal curves are the topological closure of this manifold. The Lorentzian energy (corresponding to Milnor's energy, except that the Lorentzian inner product is used) is shown to be a Morse function for the space of causal curves. A fixed end point index theorem is obtained in which a lower bound for the index of the Hessian of the Lorentzian energy is given in terms of the sum of the orders of the conjugate points between the end points. (author)

  7. Inferring causality from noisy time series data

    DEFF Research Database (Denmark)

    Mønster, Dan; Fusaroli, Riccardo; Tylén, Kristian

    2016-01-01

Convergent Cross-Mapping (CCM) has shown high potential to perform causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps. We find that CCM fails to infer accurate coupling strength and even causality direction in synchronized time-series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits higher levels of robustness. Finally, we propose that controlled noise
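
A minimal CCM sketch for unidirectionally coupled logistic maps (the map parameters and coupling strength are illustrative assumptions; the noise manipulations studied in the abstract are omitted): because x drives y, x's history is recoverable from y's delay embedding, but not vice versa.

```python
import math

def ccm_skill(driver, target, E=2, tau=1):
    """Cross-map the driver series from the target's delay embedding (CCM)."""
    offset = (E - 1) * tau
    pts = [tuple(target[t - k * tau] for k in range(E))      # shadow manifold
           for t in range(offset, len(target))]
    preds, truth = [], []
    for i, p in enumerate(pts):
        # E + 1 nearest neighbours of p on the manifold, excluding p itself
        near = sorted(((math.dist(p, q), j)
                       for j, q in enumerate(pts) if j != i))[:E + 1]
        d0 = near[0][0] or 1e-12
        w = [math.exp(-d / d0) for d, _ in near]             # exponential weights
        s = sum(w)
        preds.append(sum(wi * driver[offset + j] for wi, (_, j) in zip(w, near)) / s)
        truth.append(driver[offset + i])
    mp, mt = sum(preds) / len(preds), sum(truth) / len(truth)
    num = sum((a - mp) * (b - mt) for a, b in zip(preds, truth))
    den = math.sqrt(sum((a - mp) ** 2 for a in preds)
                    * sum((b - mt) ** 2 for b in truth))
    return num / den                       # Pearson correlation = cross-map skill

# Unidirectionally coupled logistic maps: x drives y, y never feeds back into x
x, y = [0.4], [0.2]
for _ in range(1000):
    x.append(x[-1] * (3.8 - 3.8 * x[-1]))
    y.append(y[-1] * (3.6 - 3.6 * y[-1] - 0.1 * x[-2]))      # x[-2] is x at time t

skill_xy = ccm_skill(x, y)  # recover the driver x from y's manifold: high
skill_yx = ccm_skill(y, x)  # recover y from x's manifold: low
```

The synchronization failure mode the abstract describes appears when the coupling is made strong enough that y slaves to x, at which point both directions cross-map well and the asymmetry is lost.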

  8. Causal interpretation of stochastic differential equations

    DEFF Research Database (Denmark)

    Sokol, Alexander; Hansen, Niels Richard

    2014-01-01

We give a causal interpretation of stochastic differential equations (SDEs) by defining the postintervention SDE resulting from an intervention in an SDE. We show that under Lipschitz conditions, the solution to the postintervention SDE is equal to a uniform limit in probability of postintervention structural equation models based on the Euler scheme of the original SDE, thus relating our definition to mainstream causal concepts. We prove that when the driving noise in the SDE is a Lévy process, the postintervention distribution is identifiable from the generator of the SDE.
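
The Euler-scheme construction suggests a simple numerical sketch (an illustration of the idea, not the paper's formal definition): simulate the SDE with Euler-Maruyama steps and implement an intervention by overwriting the intervened coordinate at every step. The drift, the noise level, and the intervention value below are assumptions.

```python
import random

def euler_sde(drift, sigma, x0, dt, n, do=None, rng=None):
    """Euler-Maruyama simulation; `do` maps coordinate index -> forced value."""
    rng = rng or random.Random(0)
    x = list(x0)
    for _ in range(n):
        f = drift(x)
        x = [xi + fi * dt + sigma * rng.gauss(0, dt ** 0.5)
             for xi, fi in zip(x, f)]
        if do:
            for i, v in do.items():
                x[i] = v                    # the intervention: overwrite the coordinate
    return x

def drift(x):
    # X1 relaxes towards X2; X2 relaxes towards 0
    return [x[1] - x[0], -x[1]]

# Under do(X2 := 2.0), X1 becomes an Ornstein-Uhlenbeck process centred at 2
finals = [euler_sde(drift, 0.1, [0.0, 0.0], 0.01, 2000, do={1: 2.0},
                    rng=random.Random(seed))[0] for seed in range(200)]
mean_x1 = sum(finals) / len(finals)
```

Without the intervention both coordinates would relax towards 0, so the shifted mean of X1 is entirely an effect of the intervention on X2.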

  9. Toxicity of combined mixtures of nanoparticles to plants.

    Science.gov (United States)

    Jośko, Izabela; Oleszczuk, Patryk; Skwarek, Ewa

    2017-06-05

Increasing production and use of nanoproducts results in the release and dispersal of nanoparticles (NPs) into the environment. Once released into various environmental components, NPs may interact with numerous pollutants, including other NPs. This research aimed at assessing the toxicity of combined binary mixtures of NPs. The study focused on assessing mixtures of NPs believed to be toxic (nano-ZnO+nano-CuO) and of nano-ZnO/nano-CuO with NPs that are insignificantly toxic or non-toxic (nano-TiO2/nano-Cr2O3/nano-Fe2O3). The toxicity of the combined mixtures was compared with the toxicity of the individual mixtures of NPs (the sum of effects triggered by the individual types of NPs comprising the respective mixtures). Toxicity evaluation was based on two parameters, seed germination and inhibition of root growth, with respect to four plant species: Lepidium sativum, Linum utisassimmum, Cucumis sativus and Triticum aestivum. The findings showed combined mixtures of NPs to be significantly less toxic in comparison to individual mixtures, irrespective of their components. Within the range of concentrations used, the greatest differences between the toxicities of the mixtures were reported at the 100 mg L-1 concentration. The toxicity levels of combined and individual mixtures might have been determined by a lower total concentration of the Zn and Cu metals and a greater aggregation of particles in combined mixtures than in individual mixtures. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Detecting causal drivers and empirical prediction of the Indian Summer Monsoon

    Science.gov (United States)

    Di Capua, G.; Vellore, R.; Raghavan, K.; Coumou, D.

    2017-12-01

The Indian summer monsoon (ISM) is crucial for the economy, society and natural ecosystems of the Indian peninsula. Predicting the total seasonal rainfall at several months' lead time would help to plan effective water management strategies, improve flood and drought protection programs and prevent humanitarian crises. However, the complexity and strong internal variability of the ISM circulation system make skillful seasonal forecasting challenging. Moreover, novel tools are needed to adequately identify the low-frequency and far-away processes which influence ISM behavior. We applied a Response-Guided Causal Precursor Detection (RGCPD) scheme, a novel empirical prediction method which unites a response-guided community detection scheme with a causal discovery algorithm (CEN). These tools allow us to assess causal pathways between different components of the ISM circulation system and with far-away regions in the tropics, mid-latitudes and Arctic. The scheme has successfully been used to identify causal precursors of the stratospheric polar vortex, enabling skillful predictions at (sub)seasonal timescales (Kretschmer et al. 2016, J. Clim.; Kretschmer et al. 2017, GRL). We analyze observed ISM monthly rainfall over the monsoon trough region. Applying causal discovery techniques, we identify several causal precursor communities in the fields of 2m temperature, sea level pressure and snow depth over Eurasia. Specifically, our results suggest that surface temperature conditions in both tropical and Arctic regions contribute to ISM variability. A linear regression prediction model based on the identified set of communities has good hindcasting skill at 4-5 months lead time. Further, we separate El Niño, La Niña and ENSO-neutral years from each other and find that the causal precursors differ depending on ENSO state. The ENSO-state dependent causal precursors give even higher skill, especially for La Niña years when the ISM is relatively strong. These

  11. Pretense, Counterfactuals, and Bayesian Causal Models: Why What Is Not Real Really Matters

    Science.gov (United States)

    Weisberg, Deena S.; Gopnik, Alison

    2013-01-01

    Young children spend a large portion of their time pretending about non-real situations. Why? We answer this question by using the framework of Bayesian causal models to argue that pretending and counterfactual reasoning engage the same component cognitive abilities: disengaging with current reality, making inferences about an alternative…

  12. CausalTrail: Testing hypothesis using causal Bayesian networks [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Daniel Stöckel

    2015-12-01

Full Text Available Summary Causal Bayesian networks are a special class of Bayesian networks in which the hierarchy directly encodes the causal relationships between the variables. This allows one to compute the effect of interventions, which are external changes to the system caused by e.g. gene knockouts or an administered drug. Whereas numerous packages for constructing causal Bayesian networks are available, hardly any program targeted at downstream analysis exists. In this paper we present CausalTrail, a tool for performing reasoning on causal Bayesian networks using the do-calculus. CausalTrail's features include multiple data import methods, a flexible query language for formulating hypotheses, and an intuitive graphical user interface. The program is able to account for missing data and can thus be readily applied in multi-omics settings, where it is common that not all measurements are performed for all samples. Availability and Implementation: CausalTrail is implemented in C++ using the Boost and Qt5 libraries. It can be obtained from https://github.com/dstoeckel/causaltrail
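
CausalTrail itself is a C++ tool; as a language-neutral illustration of the do-calculus reasoning it performs (a hypothetical three-node network with confounder Z -> X, Z -> Y and edge X -> Y, not one of CausalTrail's own examples), the following sketch contrasts an observational conditional with an interventional one computed by truncated factorization:

```python
# Hypothetical network: Z is a confounder (Z -> X, Z -> Y) and X -> Y
pZ = {0: 0.5, 1: 0.5}
pX1_given_Z = {0: 0.2, 1: 0.8}              # P(X=1 | Z=z)
pY1_given_XZ = {(0, 0): 0.1, (0, 1): 0.4,   # P(Y=1 | X=x, Z=z)
                (1, 0): 0.5, (1, 1): 0.8}

# Observational conditional P(Y=1 | X=1), marginalising over the confounder
num = sum(pZ[z] * pX1_given_Z[z] * pY1_given_XZ[(1, z)] for z in (0, 1))
den = sum(pZ[z] * pX1_given_Z[z] for z in (0, 1))
p_obs = num / den

# Interventional P(Y=1 | do(X=1)) by truncated factorisation: the edge
# Z -> X is cut, so Z keeps its marginal distribution
p_do = sum(pZ[z] * pY1_given_XZ[(1, z)] for z in (0, 1))
```

The two quantities differ because conditioning on X = 1 also shifts the distribution of the confounder Z, whereas the intervention does not; this gap is exactly what do-calculus queries expose.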

  13. Somatoform disorders and causal attributions in patients with suspected allergies: Do somatic causal attributions matter?

    Science.gov (United States)

    Groben, Sylvie; Hausteiner, Constanze

    2011-03-01

    Somatic causal illness attributions are being considered as potential positive criteria for somatoform disorders (SFDs) in DSM-V. The aim of this study was to investigate whether patients diagnosed with SFDs tend towards a predominantly somatic attribution style. We compared the causal illness attributions of 48 SFD and 149 non-somatoform disorder patients, in a sample of patients presenting for an allergy diagnostic work-up, and those of 47 controls hospitalised for allergen-specific venom immunotherapy. The SFD diagnosis was established by means of the Structured Clinical Interview for DSM-IV. Both spontaneous and prompted causal illness attributions were recorded through interview and by means of the causal dimension of the Revised Illness Perception Questionnaire (IPQ-R), respectively. Patients' spontaneous and prompted responses were assigned to a psychosocial, somatic, or mixed attribution style. Both in the free-response task and in their responses to the IPQ-R, SFD patients were no more likely than their nonsomatoform counterparts to focus on somatic explanations for their symptoms. They were just as likely to make psychosocial or mixed causal attributions. However, patients with SFDs were significantly more likely to find fault with medical care in the past. Our data do not support the use of somatic causal illness attributions as positive criteria for SFDs. They confirm the dynamic and multidimensional nature of causal illness attributions. Clinical implications of these findings are discussed. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. Spectrometric mixture analysis: An unexpected wrinkle

    Indian Academy of Sciences (India)

    Administrator

    The resulting calculation is simple, and typically yields a determinant ratio which, at least for a small number of mixture components, .... nes, both caffeine and theobromine have acid–base equilibria that make their ultraviolet spectra potentially pH-dependent. For a quantitative application of Beer's law we must therefore ...

  15. Mixture toxicity of PBT-like chemicals

    DEFF Research Database (Denmark)

    Syberg, Kristian; Dai, Lina; Ramskov, Tina

    beyond that of the individual components. Firstly, the effects of three chemicals with PBT-like properties (acetyl cedrene, pyrene and triclosan) were examined on the freshwater snail, Potamopyrgus antipodarum. Secondly, mixture bioaccumulation of the same three chemicals was assessed experimentally...

  16. Localization and causality in relativistic quantum mechanics

    International Nuclear Information System (INIS)

    Perez, J.F.; Wilde, I.F.

    It is shown that in relativistic quantum mechanics there is no criterion for the strict localization of a state in a bounded space-time region compatible with causality, translation covariance and the spectral condition (or positivity of energy together with Lorentz covariance)

  17. Catastrophizing and Causal Beliefs in Whiplash

    NARCIS (Netherlands)

    Buitenhuis, J.; de Jong, P. J.; Jaspers, J. P. C.; Groothoff, J. W.

    2008-01-01

    Study Design. Prospective cohort study. Objective. This study investigates the role of pain catastrophizing and causal beliefs with regard to severity and persistence of neck complaints after motor vehicle accidents. Summary of Background Data. In previous research on low back pain, somatoform

  18. Special Relativity, Causality and Quantum Mechanics - 1

    Indian Academy of Sciences (India)

    information theory in general and quantum non-locality and entanglement in particular. S Kunkri - current research interest is the role of entanglement in quantum information processing and the connection between quantum operations and causality. S K Choudhary - current research interest is the study of ...

  19. Marriage and Anomie: A Causal Argument

    Science.gov (United States)

    Lee, Gary R.

    1974-01-01

    A sample of 394 married couples is employed to test the possibility of an association between marital satisfaction and personal (attitudinal) anomie. The hypothesis is supported. Conclusions are offered relevant to anomie theory, and to utilization of marital and family phenomena as independent variables in causal explanations of nonfamily events.…

  20. Causal Measurement Models: Can Criticism Stimulate Clarification?

    Science.gov (United States)

    Markus, Keith A.

    2016-01-01

    In their 2016 work, Aguirre-Urreta et al. provided a contribution to the literature on causal measurement models that enhances clarity and stimulates further thinking. Aguirre-Urreta et al. presented a form of statistical identity involving mapping onto the portion of the parameter space involving the nomological net, relationships between the…

  1. Causal Meta-Analysis : Methodology and Applications

    NARCIS (Netherlands)

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological)

  2. A Causal Model of Faculty Turnover Intentions.

    Science.gov (United States)

    Smart, John C.

    1990-01-01

    A causal model assesses the relative influence of individual attributes, institutional characteristics, contextual-work environment variables, and multiple measures of job satisfaction on faculty intentions to leave their current institutions. Factors considered include tenure status, age, institutional status, governance style, organizational…

  3. Black Hole Complementarity and Violation of Causality

    OpenAIRE

    Rozenblit, Moshe

    2017-01-01

    Analysis of a massive shell collapsing on a solid sphere shows that black hole complementarity (BHC) violates causality in its effort to save information conservation. In particular, this note describes a hypothetical contraption based on BHC that would allow the transfer of information from the future to the present.

  4. THE CAUSAL TEXTURE OF TRADE UNION ENVIRONMENTS

    African Journals Online (AJOL)

    Admin

    This paper is an attempt to fill an important gap in the existing literature on trade unions by providing a more adequate theoretical formulation of trade union environments. The discussion suggests that unlike the environment of business and related organisations whose causal texture is understood in terms of uncertainty, ...

  5. Are bruxism and the bite causally related?

    NARCIS (Netherlands)

    Lobbezoo, F.; Ahlberg, J.; Manfredini, D.; Winocur, E.

    2012-01-01

    In the dental profession, the belief that bruxism and dental (mal-)occlusion (‘the bite’) are causally related is widespread. The aim of this review was to critically assess the available literature on this topic. A PubMed search of the English-language literature, using the query ‘Bruxism [Majr

  6. Sequential causal learning in humans and rats

    NARCIS (Netherlands)

    Lu, H.; Rojas, R.R.; Beckers, T.; Yuille, A.; Love, B.C.; McRae, K.; Sloutsky, V.M.

    2008-01-01

    Recent experiments (Beckers, De Houwer, Pineño, & Miller, 2005;Beckers, Miller, De Houwer, & Urushihara, 2006) have shown that pretraining with unrelated cues can dramatically influence the performance of humans in a causal learning paradigm and rats in a standard Pavlovian conditioning paradigm.

  7. Dimensional reduction in causal set gravity

    Science.gov (United States)

    Carlip, S.

    2015-12-01

    Results from a number of different approaches to quantum gravity suggest that the effective dimension of spacetime may drop to d = 2 at small scales. I show that two different dimensional estimators in causal set theory display the same behavior, and argue that a third, the spectral dimension, may exhibit a related phenomenon of ‘asymptotic silence.’

  8. The Causal Relationship between Financial Development and ...

    African Journals Online (AJOL)

    The study employs cointegration, vector error correction model and Granger causality test to ascertain causation between financial development and economic performance in Tanzania. Economic performance is measured by the real GDP, whereas proxies for financial development are: the ratio of money supply to nominal ...

  9. Causal and Teleological Explanations in Biology

    Science.gov (United States)

    Yip, Cheng-Wai

    2009-01-01

    A causal explanation in biology focuses on the mechanism by which a biological process is brought about, whereas a teleological explanation considers the end result, in the context of the survival of the organism, as a reason for certain biological processes or structures. There is a tendency among students to offer a teleological explanation…

  10. Special Relativity, Causality and Quantum Mechanics - 2

    Indian Academy of Sciences (India)

    Peaceful Coexistence of Special Relativity and Quantum Mechanics. As discussed in Part 1, in the framework of the special theory of relativity, causality holds. This can be stated as follows: there is a finite speed for any signal, i.e., for anything that carries information, and the highest speed for any signal is identical to the ...

  11. Causal Relationship between Teachers' Job Performance and ...

    African Journals Online (AJOL)

    The study investigated teachers' job performance and students' academic achievement in secondary schools for the existence of bi-causal relationship in Nigeria. The ex-post facto research design was adopted in the study. The population of the study covered all the Economic teachers and senior school students in class ...

  12. Introducing mechanics by tapping core causal knowledge

    NARCIS (Netherlands)

    Klaassen, C.W.J.M.; Westra, A.S.; Emmett, K.M.; Eijkelhof, H.M.C.; Lijnse, P.L.

    2008-01-01

    This article concerns an outline of an introductory mechanics course. It is based on the argument that various uses of the concept of force (e.g. from Kepler, Newton and everyday life) share an explanatory strategy based on core causal knowledge. The strategy consists of (a) the idea that a force

  13. Causality and analyticity in quantum fields theory

    International Nuclear Information System (INIS)

    Iagolnitzer, D.

    1992-01-01

    This is a presentation of results on the causal and analytical structure of Green functions and collision amplitudes in field theories, for massive particles of one type, with a positive mass and zero spin. (A.B.)

  14. Causality relationship between energy demand and economic ...

    African Journals Online (AJOL)

    This paper attempts to examine the causal relationship between electricity demand and economic growth in Nigeria using data for 1970 – 2003. The study uses the Johansen cointegration VAR approach. The ADF and Phillips – Perron test statistics were used to test for stationarity of the data. It was found that the data were ...

  15. The Causal Priority of Form in Aristotle

    Directory of Open Access Journals (Sweden)

    Kathrin Koslicki

    2014-12-01

    In various texts (e.g., Met. Z.17), Aristotle assigns priority to form, in its role as a principle and cause, over matter and the matter-form compound. Given the central role played by this claim in Aristotle's search for primary substance in the Metaphysics, it is important to understand what motivates him in locating the primary causal responsibility for a thing's being what it is with the form, rather than the matter. According to Met. Theta.8, actuality [energeia/entelecheia] in general is prior to potentiality [dunamis] in three ways, viz., in definition, time and substance. I propose an explicitly causal reading of this general priority claim, as it pertains to the matter-form relationship. The priority of form over matter in definition, time and substance, in my view, is best explained by appeal to the role of form as the formal, efficient and final cause of the matter-form compound, respectively, while the posteriority of matter to form according to all three notions of priority is most plausibly accounted for by the fact that the causal contribution of matter is limited to its role as material cause. When approached from this angle, the work of Met. Theta.8 can be seen to lend direct support to the more specific and explicitly causal priority claim we encounter in Met. Z.17, viz., that form is prior to matter in its role as the principle and primary cause of a matter-form compound's being what it is.

  16. A quantum probability model of causal reasoning

    Directory of Open Access Journals (Sweden)

    Jennifer S Trueblood

    2012-05-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects, thus proving to be a viable new candidate for modeling human judgment.

  17. Special Relativity, Causality and Quantum Mechanics - 1

    Indian Academy of Sciences (India)

    We discuss the significance of Einstein's second postulate of the special theory of relativity (STR) stipulating the constancy of the speed of light in vacuum. The causality that follows from the STR may be a more general principle to organize our knowledge of all phenomena. In particular, quantum dynamics can be derived ...

  18. Special Relativity, Causality and Quantum Mechanics - 2

    Indian Academy of Sciences (India)

    tum world. An example of a game which can be won exploiting quantum entanglement, but which can never be won classically, is described. Peaceful Coexistence of Special Relativity and Quantum Mechanics. As discussed in Part 1, in the framework of the special theory of relativity, causality holds. This can be stated.

  19. Probable autoimmune causal relationship between periodontitis and ...

    African Journals Online (AJOL)

    Periodontitis is a multifactorial disease with microbial dental plaque as the initiator of periodontal disease. However, the manifestation and progression of the disease is influenced by a wide variety of determinants and factors. The strongest type of causal relationship is the association of systemic and periodontal disease.

  20. Predicting skin permeability from complex chemical mixtures.

    Science.gov (United States)

    Riviere, Jim E; Brooks, James D

    2005-10-15

    Occupational and environmental exposure to topical chemicals is usually in the form of complex chemical mixtures, yet risk assessment is based on experimentally derived data from individual chemical exposures from a single, usually aqueous vehicle, or from computed physicochemical properties. We present an approach using hybrid quantitative structure permeation relationship (QSPeR) models in which absorption through porcine skin flow-through diffusion cells is well predicted by a QSPeR model describing the individual penetrants, coupled with a mixture factor (MF) that accounts for physicochemical properties of the vehicle/mixture components. The baseline equation is log kp = c + m·MF + a·Σα2(H) + b·Σβ2(H) + s·π2(H) + r·R2 + v·Vx, where Σα2(H) is the hydrogen-bond donor acidity, Σβ2(H) is the hydrogen-bond acceptor basicity, π2(H) is the dipolarity/polarizability, R2 represents the excess molar refractivity, and Vx is the McGowan volume of the penetrants of interest; c, m, a, b, s, r, and v are strength coefficients coupling these descriptors to the skin permeability (kp) of 12 penetrants (atrazine, chlorpyrifos, ethylparathion, fenthion, methylparathion, nonylphenol, p-nitrophenol, pentachlorophenol, phenol, propazine, simazine, and triazine) in 24 mixtures. Mixtures consisted of full factorial combinations of vehicles (water, ethanol, propylene glycol) and additives (sodium lauryl sulfate, methyl nicotinate). An additional set of 4 penetrants (DEET, SDS, permethrin, ricinoleic acid) in different mixtures was included to assess the applicability of this approach. This resulted in a dataset of 16 compounds administered in 344 treatment combinations. Across all exposures with no MF, R2 for absorption was 0.62. With the MF, correlations increased up to 0.78. Parameters correlated to the MF include refractive index, polarizability and log (1/Henry's law constant) of the mixture components. These factors should not be considered final.
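Because the baseline equation is linear in the Abraham-type descriptors, a permeability estimate is a simple weighted sum. The coefficients and descriptor values below are invented placeholders for illustration, not the fitted values from the study:

```python
# Evaluating a QSPeR-style baseline equation of the form
#   log kp = c + m*MF + a*SumAlpha2H + b*SumBeta2H + s*pi2H + r*R2 + v*Vx
# Coefficients and descriptor values are hypothetical placeholders.
coeffs = {"c": -2.5, "m": 0.3, "a": -0.5, "b": -1.2,
          "s": -0.4, "r": 0.6, "v": 1.8}

def log_kp(desc, mixture_factor):
    """Linear QSPeR-style prediction of log skin permeability."""
    return (coeffs["c"]
            + coeffs["m"] * mixture_factor
            + coeffs["a"] * desc["alpha2H"]   # H-bond donor acidity
            + coeffs["b"] * desc["beta2H"]    # H-bond acceptor basicity
            + coeffs["s"] * desc["pi2H"]      # dipolarity/polarizability
            + coeffs["r"] * desc["R2"]        # excess molar refractivity
            + coeffs["v"] * desc["Vx"])       # McGowan volume

# illustrative descriptor values for a phenol-like penetrant
phenol_like = {"alpha2H": 0.60, "beta2H": 0.30, "pi2H": 0.89,
               "R2": 0.81, "Vx": 0.775}
print(log_kp(phenol_like, mixture_factor=0.1))
```

The mixture factor enters as just one more linear term, which is what lets a single fitted model carry vehicle/mixture information across many penetrants.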

  1. mixtools: An R Package for Analyzing Mixture Models

    Directory of Open Access Journals (Sweden)

    Tatiana Benaglia

    2009-10-01

    The mixtools package for R provides a set of functions for analyzing a variety of finite mixture models. These functions include both traditional methods, such as EM algorithms for univariate and multivariate normal mixtures, and newer methods that reflect some recent research in finite mixture models. In the latter category, mixtools provides algorithms for estimating parameters in a wide range of different mixture-of-regression contexts, in multinomial mixtures such as those arising from discretizing continuous multivariate data, in nonparametric situations where the multivariate component densities are completely unspecified, and in semiparametric situations such as a univariate location mixture of symmetric but otherwise unspecified densities. Many of the algorithms of the mixtools package are EM algorithms or are based on EM-like ideas, so this article includes an overview of EM algorithms for finite mixture models.
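The univariate normal-mixture EM that packages like mixtools implement (as `normalmixEM`) can be sketched in a few dozen lines. This minimal two-component re-implementation is for illustration only; it is not mixtools code, and the initialization strategy is an arbitrary choice:

```python
import math
import random

def norm_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def em_two_normals(data, iters=200):
    """EM for a two-component univariate Gaussian mixture."""
    # crude initialization: split the sorted data in half
    xs = sorted(data)
    half = len(xs) // 2
    w = 0.5
    mu1, mu2 = sum(xs[:half]) / half, sum(xs[half:]) / (len(xs) - half)
    sd1 = sd2 = (max(xs) - min(xs)) / 4 or 1.0
    for _ in range(iters):
        # E-step: posterior responsibility of component 1 for each point
        r = [w * norm_pdf(x, mu1, sd1) /
             (w * norm_pdf(x, mu1, sd1) + (1 - w) * norm_pdf(x, mu2, sd2))
             for x in data]
        # M-step: weighted maximum-likelihood updates
        n1 = sum(r)
        w = n1 / len(data)
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / (len(data) - n1)
        sd1 = math.sqrt(sum(ri * (x - mu1) ** 2
                            for ri, x in zip(r, data)) / n1) or 1e-6
        sd2 = math.sqrt(sum((1 - ri) * (x - mu2) ** 2
                            for ri, x in zip(r, data)) / (len(data) - n1)) or 1e-6
    return w, (mu1, sd1), (mu2, sd2)

random.seed(1)
data = ([random.gauss(0, 1) for _ in range(300)]
        + [random.gauss(5, 1) for _ in range(300)])
w, c1, c2 = em_two_normals(data)
print(round(w, 2), [round(v, 2) for v in c1], [round(v, 2) for v in c2])
```

On this well-separated synthetic sample the estimates land near the generating parameters (weight ≈ 0.5, means near 0 and 5); real mixture fitting also has to contend with label switching and local optima, which mixtools addresses with multiple starts.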

  2. Mixture of Regression Models with Single-Index

    OpenAIRE

    Xiang, Sijia; Yao, Weixin

    2016-01-01

    In this article, we propose a class of semiparametric mixture regression models with single-index. We argue that many recently proposed semiparametric/nonparametric mixture regression models can be considered special cases of the proposed model. However, unlike existing semiparametric mixture regression models, the new proposed model can easily incorporate multivariate predictors into the nonparametric components. Backfitting estimates and the corresponding algorithms have been proposed for...

  3. Bayesian D-Optimal Choice Designs for Mixtures

    OpenAIRE

    Aiste Ruseckaite; Peter Goos; Dennis Fok

    2014-01-01

    Consumer products and services can often be described as mixtures of ingredients. Examples are the mixture of ingredients in a cocktail and the mixture of different components of waiting time (e.g., in-vehicle and out-of-vehicle travel time) in a transportation setting. Choice experiments may help to determine how the respondents' choice of a product or service is affected by the combination of ingredients. In such studies, individuals are confronted with sets of ...

  4. Papular urticaria: A review of causal agents in Colombia.

    Science.gov (United States)

    Lozano, Ana Milena; López, Juan Felipe; Zakzuk, Josefina; García, Elizabeth

    2016-12-01

    Papular urticaria is a chronic allergic reaction induced by insect bites, which is common in the tropics. The objective of this review was to examine in depth the epidemiological and immunological aspects of this disease, focusing on data published in Latin American countries. We conducted a non-systematic review of the literature through electronic search on the epidemiology of papular urticaria, the entomological characteristics of the causative agents and associated immunological mechanisms. Several reports from medical centers suggest that papular urticaria is common in Latin America. Only one epidemiological survey designed to estimate prevalence of papular urticaria has been published, reporting that about a quarter of children under six years of age are affected by this condition in Bogotá. There is evidence on the causal relationship among exposure to indoor fleas, poverty and papular urticaria in Bogotá, a representative city of the Andean altitudes. Information about causal insects in warmer tropical areas is scarce, although from clinical reports Aedes aegypti and Culex quinquefasciatus appear to be the most common. Th2 cellular-mediated mechanisms are involved in its pathogenesis, which explains its delayed hypersensitivity. The role of immunoglobulin E is not clear in this disease. Insect-derived antigens directly involved in papular urticaria etiology are unknown. However, it is possible that common molecules among causal insects mediate cross-reactive reactions, such as the Cte f 2 allergen, found in cat fleas, and its counterparts in mosquitoes. Papular urticaria is a frequent disease in Latin America that should be further investigated. Immunological characterization of the molecular components that cause this condition may solve questions about its pathogenesis.

  5. Some properties of explosive mixtures containing peroxides Part I. Relative performance and detonation of mixtures with triacetone triperoxide.

    Science.gov (United States)

    Zeman, Svatopluk; Trzciński, Waldemar A; Matyás, Robert

    2008-06-15

    This study concerns mixtures of triacetone triperoxide (3,3,6,6,9,9-hexamethyl-1,2,4,5,7,8-hexoxonane, TATP) and ammonium nitrate (AN) with added water (W), as the case may be, and dry mixtures of TATP with urea nitrate (UN). Relative performances (RP) of the mixtures and their individual components, relative to TNT, were determined by means of ballistic mortar. The detonation energies, E0, and detonation velocities, D, were calculated for the mixtures studied by means of the thermodynamic code CHEETAH. Relationships have been found and are discussed between the RP and the E0 values related to unit volume of gaseous products of detonation of these mixtures. These relationships together with those between RP and oxygen balance values of the mixtures studied indicate different types of participation of AN and UN in the explosive decomposition of the respective mixtures. Dry TATP/UN mixtures exhibit lower RP than analogous mixtures TATP/AN containing up to 25% of water. Depending on the water content, the TATP/AN mixtures possess higher detonability values than the ANFO explosives. A semi-logarithmic relationship between the D values and oxygen coefficients has been derived for all the mixtures studied at the charge density of 1000 kg m(-3). Among the mixtures studied, this relationship distinguishes several samples of the type of "tertiary explosives" as well as samples that approach "high explosives" in their performances and detonation velocities.

  6. [Construction of Three-Dimensional Isobologram for Ternary Pollutant Mixtures].

    Science.gov (United States)

    2015-12-01

    Isobolographic analysis is widely used in the interaction assessment of binary mixtures. However, how to construct a three-dimensional (3D) isobologram for the assessment of toxicity interactions within ternary mixtures has not yet been reported. The main purpose of this paper is to develop a 3D isobologram in which the relative concentrations of the three components act as the three coordinate axes in 3D space, in order to examine the toxicity interaction within ternary mixtures. Taking six commonly used pesticides in China, including three herbicides (2,4-D, desmetryne and simetryn) and three insecticides (dimethoate, imidacloprid and propoxur), as the mixture components, the uniform design ray procedure (UD-Ray) was used to rationally design the concentration compositions of the components in the ternary mixtures so as to effectively and comprehensively reflect the range of actual environmental concentrations. The luminescent inhibition toxicities of the single pesticides and their ternary mixtures to Vibrio fischeri at various concentration levels were determined by microplate toxicity analysis. Selecting concentration addition (CA) as the additivity reference, 3D isobolograms were constructed to study the toxicity interactions of the various ternary mixtures. The results showed that the 3D isobologram can clearly and directly exhibit the toxicity interactions of ternary mixtures, and extends the use of isobolographic analysis to ternary mixtures.
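The concentration addition (CA) reference used in such isobolographic work predicts a mixture effect concentration from the single-compound values and the mixture composition. A minimal sketch, with invented EC50s rather than the pesticide data from the study:

```python
# Concentration addition (CA) reference model:
#   EC50_mix = 1 / sum_i (p_i / EC50_i)
# where p_i is the relative fraction of component i in the mixture.
# The EC50 values below are invented for illustration.

def ca_ec50(fractions, ec50s):
    """Predicted mixture EC50 under concentration addition."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    return 1.0 / sum(p / e for p, e in zip(fractions, ec50s))

ec50s = [2.0, 8.0, 4.0]          # hypothetical single-compound EC50s (mg/L)
fractions = [1 / 3, 1 / 3, 1 / 3]  # equal-fraction ternary mixture
print(ca_ec50(fractions, ec50s))
```

Points on an isobologram then compare observed mixture EC values against this CA prediction: observed values below the CA surface indicate synergism, values above it antagonism.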

  7. Self-consistent calculation of atomic structure for mixture

    International Nuclear Information System (INIS)

    Meng Xujun; Bai Yun; Sun Yongsheng; Zhang Jinglin; Zong Xiaoping

    2000-01-01

    Based on a relativistic Hartree-Fock-Slater self-consistent average-atom model, the atomic structure of a mixture is studied by summing up the component volumes in the mixture. The algorithmic procedure for solving both the group of Thomas-Fermi equations and the self-consistent atomic structure is presented in detail, and some numerical results are discussed

  8. Quantitative measurement of mixtures by terahertz time-domain ...

    Indian Academy of Sciences (India)

    ... absorption coefficients of the components in each mixture were linearly proportional to their concentrations in the mixture. The results from analysis were in agreement with actual values with a relative error of less than 7%. The quantitative method will help in the detection of illegal drugs, poisons and dangerous materials ...

  9. The causal link between energy and output growth: Evidence from Markov switching Granger causality

    International Nuclear Information System (INIS)

    Kandemir Kocaaslan, Ozge

    2013-01-01

    In this paper we empirically investigate the causal link between energy consumption and economic growth employing a Markov switching Granger causality analysis. We carry out our investigation using annual U.S. real GDP, total final energy consumption and total primary energy consumption data which cover the period between 1968 and 2010. We find that there are significant changes in the causal relation between energy consumption and economic growth over the sample period under investigation. Our results show that total final energy consumption and total primary energy consumption have significant predictive content for real economic activity in the U.S. economy. Furthermore, the causality running from energy consumption to output growth seems to be strongly apparent particularly during the periods of economic downturn and energy crisis. We also document that output growth has predictive power in explaining total energy consumption. Furthermore, the power of output growth in predicting total energy consumption is found to diminish after the mid-1980s. - Highlights: • Total energy consumption has predictive content for real economic activity. • The causality from energy to output growth is apparent in the periods of recession. • The causality from energy to output growth is strong in the periods of energy crisis. • Output growth has predictive power in explaining total energy consumption. • The power of output growth in explaining energy diminishes after the mid-1980s
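The plain (single-regime) Granger test underlying this kind of analysis compares a restricted lag regression against one augmented with lags of the candidate cause, via an F-statistic. The sketch below uses synthetic data and one lag; the paper's Markov switching extension additionally lets the causal link vary by regime, which is not implemented here:

```python
import random

def ols_rss(X, y):
    """Residual sum of squares from OLS via normal equations."""
    k, n = len(X[0]), len(X)
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)]
         for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    # Gaussian elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c]
                              for c in range(r + 1, k))) / A[r][r]
    return sum((yi - sum(xi[c] * beta[c] for c in range(k))) ** 2
               for xi, yi in zip(X, y))

def granger_f(x, y):
    """F-statistic for 'x Granger-causes y' with one lag of each series."""
    rows_r = [[1.0, y[t - 1]] for t in range(1, len(y))]            # restricted
    rows_u = [[1.0, y[t - 1], x[t - 1]] for t in range(1, len(y))]  # + lagged x
    target = y[1:]
    rss_r, rss_u = ols_rss(rows_r, target), ols_rss(rows_u, target)
    n = len(target)
    return (rss_r - rss_u) / (rss_u / (n - 3))

random.seed(0)
x = [random.gauss(0, 1) for _ in range(500)]
y = [0.0]
for t in range(1, 500):               # y is driven by lagged x, so the
    y.append(0.8 * x[t - 1] + random.gauss(0, 0.5))  # F-statistic is large
print(granger_f(x, y))
```

Under the null of no Granger causality the statistic follows an F(1, n-3) distribution, so values far above the critical value (here, orders of magnitude above) reject non-causality.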

  10. Causal knowledge and the development of inductive reasoning.

    Science.gov (United States)

    Bright, Aimée K; Feeney, Aidan

    2014-06-01

    We explored the development of sensitivity to causal relations in children's inductive reasoning. Children (5-, 8-, and 12-year-olds) and adults were given trials in which they decided whether a property known to be possessed by members of one category was also possessed by members of (a) a taxonomically related category or (b) a causally related category. The direction of the causal link was either predictive (prey→predator) or diagnostic (predator→prey), and the property that participants reasoned about established either a taxonomic or causal context. There was a causal asymmetry effect across all age groups, with more causal choices when the causal link was predictive than when it was diagnostic. Furthermore, context-sensitive causal reasoning showed a curvilinear development, with causal choices being most frequent for 8-year-olds regardless of context. Causal inductions decreased thereafter because 12-year-olds and adults made more taxonomic choices when reasoning in the taxonomic context. These findings suggest that simple causal relations may often be the default knowledge structure in young children's inductive reasoning, that sensitivity to causal direction is present early on, and that children over-generalize their causal knowledge when reasoning. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Assessing statistical significance in causal graphs.

    Science.gov (United States)

    Chindelevitch, Leonid; Loh, Po-Ru; Enayetallah, Ahmed; Berger, Bonnie; Ziemek, Daniel

    2012-02-20

    Causal graphs are an increasingly popular tool for the analysis of biological datasets. In particular, signed causal graphs--directed graphs whose edges additionally have a sign denoting upregulation or downregulation--can be used to model regulatory networks within a cell. Such models allow prediction of downstream effects of regulation of biological entities; conversely, they also enable inference of causative agents behind observed expression changes. However, due to their complex nature, signed causal graph models present special challenges with respect to assessing statistical significance. In this paper we frame and solve two fundamental computational problems that arise in practice when computing appropriate null distributions for hypothesis testing. First, we show how to compute a p-value for agreement between observed and model-predicted classifications of gene transcripts as upregulated, downregulated, or neither. Specifically, how likely are the classifications to agree to the same extent under the null distribution of the observed classification being randomized? This problem, which we call "Ternary Dot Product Distribution" owing to its mathematical form, can be viewed as a generalization of Fisher's exact test to ternary variables. We present two computationally efficient algorithms for computing the Ternary Dot Product Distribution and investigate its combinatorial structure analytically and numerically to establish computational complexity bounds. Second, we develop an algorithm for efficiently performing random sampling of causal graphs. This enables p-value computation under a different, equally important null distribution obtained by randomizing the graph topology but keeping fixed its basic structure: connectedness and the positive and negative in- and out-degrees of each vertex. We provide an algorithm for sampling a graph from this distribution uniformly at random. We also highlight theoretical challenges unique to signed causal graphs.
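The first problem can be approximated by brute force: shuffle the observed ternary labels and count how often the agreement score is at least as large as the one observed. The Monte Carlo stand-in below illustrates the null distribution in question (the paper derives exact, efficient algorithms instead; the labels here are invented toy data):

```python
import random

# Each transcript is labeled +1 (up), -1 (down) or 0 (neither).
# Agreement score = ternary dot product of predicted and observed labels:
# +1 for a matching sign, -1 for an opposite sign, 0 if either is neutral.

def score(pred, obs):
    return sum(p * o for p, o in zip(pred, obs))

def permutation_p_value(pred, obs, n_perm=20000, seed=0):
    """P(score >= observed) under random shuffles of the observed labels.
    A Monte Carlo stand-in for the exact Ternary Dot Product Distribution."""
    rng = random.Random(seed)
    observed = score(pred, obs)
    shuffled = obs[:]
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if score(pred, shuffled) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one to avoid a zero p-value

pred = [1, 1, -1, 0, 1, -1, -1, 0, 1, -1]   # model-predicted calls
obs  = [1, 1, -1, 0, 1, -1,  0, 0, 1, -1]   # observed calls
print(permutation_p_value(pred, obs))       # small: agreement unlikely by chance
```

The exact distribution replaces this sampling with combinatorics over the label multisets, which is what makes the method practical for genome-scale inputs.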

  12. Prenatal nutrition, epigenetics and schizophrenia risk: can we test causal effects?

    Science.gov (United States)

    Kirkbride, James B; Susser, Ezra; Kundakovic, Marija; Kresovich, Jacob K; Davey Smith, George; Relton, Caroline L

    2012-06-01

    We posit that maternal prenatal nutrition can influence offspring schizophrenia risk via epigenetic effects. In this article, we consider evidence that prenatal nutrition is linked to epigenetic outcomes in offspring and schizophrenia in offspring, and that schizophrenia is associated with epigenetic changes. We focus upon one-carbon metabolism as a mediator of the pathway between perturbed prenatal nutrition and the subsequent risk of schizophrenia. Although post-mortem human studies demonstrate DNA methylation changes in brains of people with schizophrenia, such studies cannot establish causality. We suggest a testable hypothesis that utilizes a novel two-step Mendelian randomization approach, to test the component parts of the proposed causal pathway leading from prenatal nutritional exposure to schizophrenia. Applied here to a specific example, such an approach is applicable for wider use to strengthen causal inference of the mediating role of epigenetic factors linking exposures to health outcomes in population-based studies.

  13. An Empirical Approach to Sufficient Similarity: Combining Exposure Data and Mixtures Toxicology Data

    Science.gov (United States)

    Marshall, Scott; Gennings, Chris; Teuschler, Linda K.; Stork, LeAnna G.; Tornero-Velez, Rogelio; Crofton, Kevin M.; Rice, Glenn E.

    2013-01-01

    When assessing risks posed by environmental chemical mixtures, whole mixture approaches are preferred to component approaches. When toxicological data on whole mixtures as they occur in the environment are not available, EPA guidance states that toxicity data from a mixture considered “sufficiently similar” to the environmental mixture can serve as a surrogate. We propose a novel method to examine whether mixtures are sufficiently similar, when exposure data and mixture toxicity study data from at least one representative mixture are available. We define sufficient similarity using equivalence testing methodology comparing the distance between benchmark dose estimates for mixtures in both data rich and data poor cases. We construct a “similar mixtures risk indicator” (analogous to the hazard index) on sufficiently similar mixtures linking exposure data with mixtures toxicology data. The methods are illustrated using pyrethroid mixtures occurrence data collected in child care centers (CCC) and dose-response data examining acute neurobehavioral effects of pyrethroid mixtures in rats. Our method shows that the mixtures from 90% of the CCCs were sufficiently similar to the dose-response study mixture. Using exposure estimates for a hypothetical child, the 95th percentile of the (weighted) similar mixtures risk indicator (SMRI) for these sufficiently similar mixtures was 0.20 (i.e., where SMRI < 1, less concern; SMRI > 1, more concern). PMID:23398277
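
    A hazard-index-style indicator of the kind described above can be sketched as a weighted sum of exposure-to-benchmark-dose ratios; the function name and weighting scheme here are illustrative, not the paper's exact definition:

```python
def similar_mixtures_risk_indicator(exposures, benchmark_doses, weights=None):
    """Hazard-index-style indicator: a (weighted) sum of exposure/benchmark-dose
    ratios across mixture components. Values above 1 suggest more concern."""
    if weights is None:
        weights = [1.0] * len(exposures)
    return sum(w * e / b
               for w, e, b in zip(weights, exposures, benchmark_doses))
```

    For example, exposures of 0.1 and 0.2 against benchmark doses of 1.0 and 2.0 give an indicator of 0.2, well below the concern threshold of 1.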

  14. Violence in psychosis: conceptualizing its causal relationship with risk factors

    NARCIS (Netherlands)

    Lamsma, J.; Harte, J.M.

    2015-01-01

    Background: While statistically robust, the association between psychosis and violence remains causally unexplained. Objective: To provide an overview of possible causal pathways between risk factors and violence in psychosis. Methods: A structured narrative review of relevant studies published

  15. Elements of Causal Inference: Foundations and Learning Algorithms

    DEFF Research Database (Denmark)

    Peters, Jonas Martin; Janzing, Dominik; Schölkopf, Bernhard

    A concise and self-contained introduction to causal inference, increasingly important in data science and machine learning.

  16. The Hankel transform of causal distributions

    Directory of Open Access Journals (Sweden)

    Manuel A. Aguirre T.

    2012-03-01

    In this note we evaluate the unidimensional distributional Hankel transform of \dfrac{x_{+}^{\alpha-1}}{\Gamma(\alpha)}, \dfrac{x_{-}^{\alpha-1}}{\Gamma(\alpha)}, \dfrac{|x|^{\alpha-1}}{\Gamma(\frac{\alpha}{2})}, \dfrac{|x|^{\alpha-1}\,\mathrm{sgn}(x)}{\Gamma(\frac{\alpha+1}{2})} and (x \pm i0)^{\alpha-1}, and then we extend the formulae to certain kinds of n-dimensional distributions called "causal" and "anti-causal" distributions. We evaluate the distributional Hankel transform of \dfrac{(m^2+P)_{-}^{\alpha-1}}{\Gamma(\alpha)}, \dfrac{|m^2+P|^{\alpha-1}}{\Gamma(\frac{\alpha}{2})}, \dfrac{|m^2+P|^{\alpha-1}\,\mathrm{sgn}(m^2+P)}{\Gamma(\frac{\alpha+1}{2})} and (m^2+P \pm i0)^{\alpha-1}.

  17. Finite quantum electrodynamics the causal approach

    CERN Document Server

    Scharf, Günter

    2014-01-01

    In this classic text for advanced undergraduates and graduate students of physics, author Günter Scharf carefully analyzes the role of causality in quantum electrodynamics. His approach offers full proofs and detailed calculations of scattering processes in a mathematically rigorous manner. This third edition contains Scharf's revisions and corrections plus a brief new Epilogue on gauge invariance of quantum electrodynamics to all orders. The book begins with Dirac's theory, followed by the quantum theory of free fields and causal perturbation theory, a powerful method that avoids ultraviolet divergences and solves the infrared problem by means of the adiabatic limit. Successive chapters explore properties of the S-matrix — such as renormalizability, gauge invariance, and unitarity — the renormalization group, and interactive fields. Additional topics include electromagnetic couplings and the extension of the methods to non-abelian gauge theories. Each chapter is supplemented with problems, and four appe...

  18. Granger-Causality Maps of Diffusion Processes

    Czech Academy of Sciences Publication Activity Database

    Wahl, B.; Feudel, U.; Hlinka, Jaroslav; Wächter, M.; Peinke, J.; Freund, J.A.

    2016-01-01

    Roč. 93, č. 2 16 February (2016), č. článku 022213. ISSN 2470-0045 R&D Projects: GA ČR GA13-23940S; GA MZd(CZ) NV15-29835A Institutional support: RVO:67985807 Keywords : Granger causality * stochastic process * diffusion process * nonlinear dynamical systems Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 2.366, year: 2016

  19. On the causality relations in thermoelectricity

    Science.gov (United States)

    Vázquez, Federico; López de Haro, Mariano; Figueroa, Aldo

    2018-01-01

    The relationship between the causality principle and the existence of couplings between different thermodynamic driving forces in a given phenomenon is discussed. The case of thermoelectricity is explicitly analyzed. A transport equation is derived for the propagation of thermal disturbances in a sample after an electric potential difference is applied. The consequences of the non-hyperbolic character of this equation and the need for investigating its possible connection with nonequilibrium thermodynamics formulations are pointed out.

  20. I-optimal mixture designs

    OpenAIRE

    GOOS, Peter; JONES, Bradley; SYAFITRI, Utami

    2013-01-01

    In mixture experiments, the factors under study are proportions of the ingredients of a mixture. The special nature of the factors in a mixture experiment necessitates specific types of regression models, and specific types of experimental designs. Although mixture experiments usually are intended to predict the response(s) for all possible formulations of the mixture and to identify optimal proportions for each of the ingredients, little research has been done concerning their I-optimal desi...

  1. Causal Relationship between Construction Production and GDP in Turkey

    OpenAIRE

    Hakkı Kutay Bolkol

    2015-01-01

    This study empirically investigates the causal relationship between construction production and GDP for Turkey during 2005Q1-2013Q4 period. Because it is found that, there is no cointegration which means there is no long run relationship between variables, VAR Granger Causality Method is used to test the causality in short run. The findings reveal that, the causality runs from GDP to Building Production and Building Production to Non-Building Production (i.e. bidirectional relationship). Find...

  2. Handling hybrid and missing data in constraint-based causal discovery to study the etiology of ADHD.

    Science.gov (United States)

    Sokolova, Elena; von Rhein, Daniel; Naaijen, Jilly; Groot, Perry; Claassen, Tom; Buitelaar, Jan; Heskes, Tom

    2017-01-01

    Causal discovery is an increasingly important method for data analysis in the field of medical research. In this paper, we consider two challenges in causal discovery that occur very often when working with medical data: a mixture of discrete and continuous variables and a substantial amount of missing values. To the best of our knowledge, there are no methods that can handle both challenges at the same time. In this paper, we develop a new method that can handle these challenges based on the assumption that data are missing at random and that continuous variables obey a non-paranormal distribution. We demonstrate the validity of our approach for causal discovery on simulated data as well as on two real-world data sets from a monetary incentive delay task and a reversal learning task. Our results help in the understanding of the etiology of attention-deficit/hyperactivity disorder (ADHD).
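
    The non-paranormal assumption means each continuous variable can be rank-transformed to approximate normality before constraint-based search. A minimal sketch of the standard Gaussian-copula rank transform (not the authors' full missing-data method):

```python
import numpy as np
from scipy.stats import norm, rankdata

def nonparanormal_transform(x):
    """Map a continuous variable to approximate normality via its ranks
    (Gaussian copula / non-paranormal assumption): z_i = Phi^{-1}(r_i/(n+1))."""
    x = np.asarray(x, dtype=float)
    ranks = rankdata(x)  # average ranks handle ties
    return norm.ppf(ranks / (len(x) + 1))
```

    The transform is monotone, so it preserves rank correlations while making each marginal approximately Gaussian, which is what licenses Gaussian-based conditional-independence tests on mixed data.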

  3. A time domain frequency-selective multivariate Granger causality approach.

    Science.gov (United States)

    Leistritz, Lutz; Witte, Herbert

    2016-08-01

    The investigation of effective connectivity is one of the major topics in computational neuroscience to understand the interaction between spatially distributed neuronal units of the brain. Thus, a wide variety of methods has been developed during the last decades to investigate functional and effective connectivity in multivariate systems. Their spectrum ranges from model-based to model-free approaches with a clear separation into time and frequency range methods. We present in this simulation study a novel time domain approach based on Granger's principle of predictability, which allows frequency-selective considerations of directed interactions. It is based on a comparison of prediction errors of multivariate autoregressive models fitted to systematically modified time series. These modifications are based on signal decompositions, which enable a targeted cancellation of specific signal components with specific spectral properties. Depending on the embedded signal decomposition method, a frequency-selective or data-driven signal-adaptive Granger Causality Index may be derived.
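
    Granger's principle of predictability compares the prediction errors of models fitted with and without the candidate cause. A minimal bivariate, time-domain sketch (without the frequency-selective signal decomposition the paper adds):

```python
import numpy as np

def granger_causality_index(x, y, p=2):
    """Time-domain Granger causality y -> x: compare residual variances of an
    AR(p) model for x with and without lagged values of y (index >= 0)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    lags_x = np.array([x[t - p:t] for t in range(p, n)])
    lags_y = np.array([y[t - p:t] for t in range(p, n)])
    target = x[p:]

    def resid_var(design):
        design = np.column_stack([np.ones(len(target)), design])
        beta, *_ = np.linalg.lstsq(design, target, rcond=None)
        r = target - design @ beta
        return float(np.mean(r ** 2))

    restricted = resid_var(lags_x)                     # own past only
    full = resid_var(np.hstack([lags_x, lags_y]))      # own past + y's past
    return np.log(restricted / full)
```

    On data where y drives x (but not vice versa), the index for y → x is clearly positive while the index for x → y stays near zero.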

  4. STAR-POLYMER -- COLLOID MIXTURES

    Directory of Open Access Journals (Sweden)

    J.Dzubiella

    2002-01-01

    Recent results in theory and simulation of star-polymer--colloid mixtures are reviewed. We present the effective interaction between hard, colloidal particles and star polymers in a good solvent derived by monomer-resolved Molecular Dynamics simulations and theoretical arguments. The relevant parameters are the size ratio q between the stars and the colloids, as well as the number of polymeric arms f (functionality) attached to the common center of the star. By covering a wide range of q's ranging from zero (star against a flat wall) up to about 0.5, we establish analytical forms for the star-colloid interaction which are in excellent agreement with simulation results. By employing this cross interaction and the effective interactions between stars and colloids themselves, a demixing transition in the fluid phase is observed and systematically investigated for different arm numbers and size ratios. The demixing binodals are compared with experimental observations and found to be consistent. Furthermore, we map the full two-component system onto an effective one-component description for the colloids, by inverting the two-component Ornstein-Zernike equations. Some recent results for the depletion interaction and freezing transitions are shown.

  5. A new approach to causality in the frequency domain

    OpenAIRE

    Mehmet Dalkir

    2004-01-01

    This study refers to the earlier work of analysis in the frequency domain. A different definition of causality is made, and its implications to the general idea of causality are discussed. The causality relationship between two monetary aggregates, simple sum and Divisia indices, and their relation with the personal income is analyzed using wavelet time-scale decomposition.

  6. A Quantitative Causal Model Theory of Conditional Reasoning

    Science.gov (United States)

    Fernbach, Philip M.; Erb, Christopher D.

    2013-01-01

    The authors propose and test a causal model theory of reasoning about conditional arguments with causal content. According to the theory, the acceptability of modus ponens (MP) and affirming the consequent (AC) reflect the conditional likelihood of causes and effects based on a probabilistic causal model of the scenario being judged. Acceptability…

  7. How to Be Causal: Time, Spacetime and Spectra

    Science.gov (United States)

    Kinsler, Paul

    2011-01-01

    I explain a simple definition of causality in widespread use, and indicate how it links to the Kramers-Kronig relations. The specification of causality in terms of temporal differential equations then shows us the way to write down dynamical models so that their causal nature "in the sense used here" should be obvious to all. To extend existing…

  8. Pathway Analysis and the Search for Causal Mechanisms

    Science.gov (United States)

    Weller, Nicholas; Barnes, Jeb

    2016-01-01

    The study of causal mechanisms interests scholars across the social sciences. Case studies can be a valuable tool in developing knowledge and hypotheses about how causal mechanisms function. The usefulness of case studies in the search for causal mechanisms depends on effective case selection, and there are few existing guidelines for selecting…

  9. Causal structure in categorical quantum mechanics

    Science.gov (United States)

    Lal, Raymond Ashwin

    Categorical quantum mechanics is a way of formalising the structural features of quantum theory using category theory. It uses compound systems as the primitive notion, which is formalised by using symmetric monoidal categories. This leads to an elegant formalism for describing quantum protocols such as quantum teleportation. In particular, categorical quantum mechanics provides a graphical calculus that exposes the information flow of such protocols in an intuitive way. However, the graphical calculus also reveals surprising features of these protocols; for example, in the quantum teleportation protocol, information appears to flow `backwards-in-time'. This leads to question of how causal structure can be described within categorical quantum mechanics, and how this might lead to insight regarding the structural compatibility between quantum theory and relativity. This thesis is concerned with the project of formalising causal structure in categorical quantum mechanics. We begin by studying an abstract view of Bell-type experiments, as described by `no-signalling boxes', and we show that under time-reversal no-signalling boxes generically become signalling. This conflicts with the underlying symmetry of relativistic causal structure. This leads us to consider the framework of categorical quantum mechanics from the perspective of relativistic causal structure. We derive the properties that a symmetric monoidal category must satisfy in order to describe systems in such a background causal structure. We use these properties to define a new type of category, and this provides a formal framework for describing protocols in spacetime. We explore this new structure, showing how it leads to an understanding of the counter-intuitive information flow of protocols in categorical quantum mechanics. We then find that the formal properties of our new structure are naturally related to axioms for reconstructing quantum theory, and we show how a reconstruction scheme based on

  10. Two-Microphone Separation of Speech Mixtures

    DEFF Research Database (Denmark)

    Pedersen, Michael Syskind; Wang, DeLiang; Larsen, Jan

    2008-01-01

    Separation of speech mixtures, often referred to as the cocktail party problem, has been studied for decades. In many source separation tasks, the separation method is limited by the assumption of at least as many sensors as sources. Further, many methods require that the number of signals within...... combined, independent component analysis (ICA) and binary time–frequency (T–F) masking. By estimating binary masks from the outputs of an ICA algorithm, it is possible in an iterative way to extract basis speech signals from a convolutive mixture. The basis signals are afterwards improved by grouping...... similar signals. Using two microphones, we can separate, in principle, an arbitrary number of mixed speech signals. We show separation results for mixtures with as many as seven speech signals under instantaneous conditions. We also show that the proposed method is applicable to segregate speech signals...
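
    The core idea of refining rough source estimates (e.g., ICA outputs) by binary time-frequency masking can be sketched as follows. This toy version simply assigns each T-F cell of the mixture to whichever estimate dominates it; the paper's iterative extraction and grouping of basis signals is omitted:

```python
import numpy as np
from scipy.signal import stft, istft

def binary_mask_separate(mix, est1, est2, nperseg=256):
    """Refine two rough source estimates by binary time-frequency masking:
    each T-F cell of the mixture is assigned to the dominant estimate."""
    _, _, M = stft(mix, nperseg=nperseg)
    _, _, S1 = stft(est1, nperseg=nperseg)
    _, _, S2 = stft(est2, nperseg=nperseg)
    mask1 = np.abs(S1) > np.abs(S2)        # binary mask for source 1
    _, out1 = istft(M * mask1, nperseg=nperseg)
    _, out2 = istft(M * ~mask1, nperseg=nperseg)
    return out1, out2
```

    For two sources with disjoint spectral support (e.g., two sinusoids at different frequencies), the masked reconstructions correlate almost perfectly with the true sources.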

  11. Causal beliefs about depression in different cultural groups – What do cognitive psychological theories of causal learning and reasoning predict?

    OpenAIRE

    York eHagmayer; Neele eEngelmann

    2014-01-01

    Cognitive psychological research focuses on causal learning and reasoning while cognitive anthropological and social science research tend to focus on systems of beliefs. Our aim was to explore how these two types of research can inform each other. Cognitive psychological theories (causal model theory and causal Bayes nets) were used to derive predictions for systems of causal beliefs. These predictions were then applied to lay theories of depression as a specific test case. A systematic lite...

  12. Segregation of granular binary mixtures by a ratchet mechanism.

    Science.gov (United States)

    Farkas, Zénó; Szalai, Ferenc; Wolf, Dietrich E; Vicsek, Tamás

    2002-02-01

    We report on a segregation scheme for granular binary mixtures, where the segregation is performed by a ratchet mechanism realized by a vertically shaken asymmetric sawtooth-shaped base in a quasi-two-dimensional box. We have studied this system by computer simulations and found that most binary mixtures can be segregated using an appropriately chosen ratchet, even when the particles in the two components have the same size and differ only in their normal restitution coefficient or friction coefficient. These results suggest that the components of otherwise nonsegregating granular mixtures may be separated using our method.

  13. A Bayesian Theory of Sequential Causal Learning and Abstract Transfer.

    Science.gov (United States)

    Lu, Hongjing; Rojas, Randall R; Beckers, Tom; Yuille, Alan L

    2016-03-01

    Two key research issues in the field of causal learning are how people acquire causal knowledge when observing data that are presented sequentially, and the level of abstraction at which learning takes place. Does sequential causal learning solely involve the acquisition of specific cause-effect links, or do learners also acquire knowledge about abstract causal constraints? Recent empirical studies have revealed that experience with one set of causal cues can dramatically alter subsequent learning and performance with entirely different cues, suggesting that learning involves abstract transfer, and such transfer effects involve sequential presentation of distinct sets of causal cues. It has been demonstrated that pre-training (or even post-training) can modulate classic causal learning phenomena such as forward and backward blocking. To account for these effects, we propose a Bayesian theory of sequential causal learning. The theory assumes that humans are able to consider and use several alternative causal generative models, each instantiating a different causal integration rule. Model selection is used to decide which integration rule to use in a given learning environment in order to infer causal knowledge from sequential data. Detailed computer simulations demonstrate that humans rely on the abstract characteristics of outcome variables (e.g., binary vs. continuous) to select a causal integration rule, which in turn alters causal learning in a variety of blocking and overshadowing paradigms. When the nature of the outcome variable is ambiguous, humans select the model that yields the best fit with the recent environment, and then apply it to subsequent learning tasks. Based on sequential patterns of cue-outcome co-occurrence, the theory can account for a range of phenomena in sequential causal learning, including various blocking effects, primacy effects in some experimental conditions, and apparently abstract transfer of causal knowledge. Copyright © 2015
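
    Model selection between causal integration rules can be illustrated with a crude grid-based marginal likelihood comparing a noisy-OR rule against a capped linear-sum rule; the functions below are an illustrative sketch, not the authors' implementation:

```python
import numpy as np
from itertools import product

def noisy_or(c1, c2, w1, w2):
    """P(effect) under a noisy-OR integration of two binary cues."""
    return 1.0 - (1.0 - w1 * c1) * (1.0 - w2 * c2)

def linear_sum(c1, c2, w1, w2):
    """P(effect) under a linear-sum integration, capped at 1."""
    return min(1.0, w1 * c1 + w2 * c2)

def log_marginal(data, rule, grid=np.linspace(0.05, 0.95, 10)):
    """Crude marginal likelihood: average the data likelihood over a uniform
    grid prior on the two causal strengths, then take the log."""
    total = 0.0
    for w1, w2 in product(grid, repeat=2):
        loglik = 0.0
        for c1, c2, e in data:  # (cue1, cue2, effect) triples
            p = float(np.clip(rule(c1, c2, w1, w2), 1e-6, 1 - 1e-6))
            loglik += np.log(p if e else 1.0 - p)
        total += np.exp(loglik)
    return np.log(total / len(grid) ** 2)
```

    On trial data generated from a noisy-OR environment, the noisy-OR model attains the higher marginal likelihood and would be selected for subsequent learning.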

  14. Developing Causal Understanding with Causal Maps: The Impact of Total Links, Temporal Flow, and Lateral Position of Outcome Nodes

    Science.gov (United States)

    Jeong, Allan; Lee, Woon Jee

    2012-01-01

    This study examined some of the methodological approaches used by students to construct causal maps in order to determine which approaches help students understand the underlying causes and causal mechanisms in a complex system. This study tested the relationship between causal understanding (ratio of root causes correctly/incorrectly identified,…

  15. Improved gas mixtures for gas-filled particle detectors

    Science.gov (United States)

    Christophorou, L.G.; McCorkle, D.L.; Maxey, D.V.; Carter, J.G.

    Improved binary and ternary gas mixtures for gas-filled particle detectors are provided. The components are chosen on the basis of the principle that the first component is one gas or mixture of two gases having a large electron scattering cross section at energies of about 0.5 eV and higher, and the second component is a gas (Ar) having a very small cross section at and below about 0.5 eV; whereby fast electrons in the gaseous mixture are slowed into the energy range of about 0.5 eV where the cross section for the mixture is small and hence the electron mean free path is large. The reduction in both the cross section and the electron energy results in an increase in the drift velocity of the electrons in the gas mixtures over that for the separate components for a range of E/P (pressure-reduced electric field) values. Several gas mixtures are provided that give faster response in gas-filled detectors for convenient E/P ranges as compared with conventional gas mixtures.

  16. Gas mixtures for gas-filled radiation detectors

    International Nuclear Information System (INIS)

    Carter, J.G.; Christophorou, L.G.; Maxey, D.V.; McCorkle, D.L.

    1982-01-01

    Improved binary and ternary gas mixtures for gas-filled radiation detectors are provided. The components are chosen on the basis of the principle that the first component is one molecular gas or mixture of two molecular gases having a large electron scattering cross section at energies of about 0.5 eV and higher, and the second component is a noble gas having a very small cross section at and below about 1.0 eV, whereby fast electrons in the gaseous mixture are slowed into the energy range of about 0.5 eV where the cross section for the mixture is small and hence the electron mean free path is large. The reduction in both the cross section and the electron energy results in an increase in the drift velocity of the electrons in the gas mixtures over that for the separate components for a range of E/P (pressure-reduced electric field) values. Several gas mixtures are provided that give faster response in gas-filled detectors for convenient E/P ranges as compared with conventional gas mixtures.

  17. Information causality from an entropic and a probabilistic perspective

    International Nuclear Information System (INIS)

    Al-Safi, Sabri W.; Short, Anthony J.

    2011-01-01

    The information causality principle is a generalization of the no-signaling principle which implies some of the known restrictions on quantum correlations. But despite its clear physical motivation, information causality is formulated in terms of a rather specialized game and figure of merit. We explore different perspectives on information causality, discussing the probability of success as the figure of merit, a relation between information causality and the nonlocal "inner-product game," and the derivation of a quadratic bound for these games. We then examine an entropic formulation of information causality with which one can obtain the same results, arguably in a simpler fashion.

  18. The adsorption of Benzene-Ethylene Dichloride Mixtures on Activated Carbon

    Science.gov (United States)

    Miao, T.; Tang, H. M.; Cheng, Z. X.

    2018-02-01

    The single component adsorption of benzene and ethylene dichloride and also the adsorption of binary mixtures of benzene and ethylene dichloride have been studied in a small fixed isothermal bed containing activated carbon (AC). Results indicate that an empirical Langmuir isotherm fits the experimental data for single components. An extended form of the empirical Langmuir isotherm, in which the parameters are obtained from single component data, satisfactorily describes the adsorption of binary mixtures. Breakthrough curves of both components could be predicted with good precision. This paper analyses the adsorption behaviour of a mixture of VOCs (benzene–ethylene dichloride) on AC, due to the lack of information regarding the adsorption of mixtures.
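
    The extended (competitive) Langmuir isotherm referred to above has a standard closed form, q_i = q_max,i b_i C_i / (1 + Σ_j b_j C_j), with parameters taken from the single-component fits. A minimal sketch:

```python
def extended_langmuir(conc, q_max, b):
    """Extended (competitive) Langmuir isotherm for a mixture:
    q_i = q_max_i * b_i * C_i / (1 + sum_j b_j * C_j),
    where conc are concentrations, q_max saturation loadings,
    and b the single-component affinity constants."""
    denom = 1.0 + sum(bj * cj for bj, cj in zip(b, conc))
    return [qm * bi * ci / denom for qm, bi, ci in zip(q_max, b, conc)]
```

    With a single component the expression reduces to the ordinary Langmuir isotherm, and adding a competitor always lowers each component's loading, as the shared denominator shows.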

  19. Other components

    International Nuclear Information System (INIS)

    Anon.

    1993-01-01

    This chapter includes descriptions of electronic and mechanical components which do not merit a chapter to themselves. Other hardware requires mention because of particularly high tolerance or intolerance of exposure to radiation. A more systematic analysis of radiation responses of structures which are definable by material was given in section 3.8. The components discussed here are field effect transistors, transducers, temperature sensors, magnetic components, superconductors, mechanical sensors, and miscellaneous electronic components

  20. Emergent Geometry from Entropy and Causality

    Science.gov (United States)

    Engelhardt, Netta

    In this thesis, we investigate the connections between the geometry of spacetime and aspects of quantum field theory such as entanglement entropy and causality. This work is motivated by the idea that spacetime geometry is an emergent phenomenon in quantum gravity, and that the physics responsible for this emergence is fundamental to quantum field theory. Part I of this thesis is focused on the interplay between spacetime and entropy, with a special emphasis on entropy due to entanglement. In general spacetimes, there exist locally-defined surfaces sensitive to the geometry that may act as local black hole boundaries or cosmological horizons; these surfaces, known as holographic screens, are argued to have a connection with the second law of thermodynamics. Holographic screens obey an area law, suggestive of an association with entropy; they are also distinguished surfaces from the perspective of the covariant entropy bound, a bound on the total entropy of a slice of the spacetime. This construction is shown to be quite general, and is formulated in both classical and perturbatively quantum theories of gravity. The remainder of Part I uses the Anti-de Sitter/ Conformal Field Theory (AdS/CFT) correspondence to both expand and constrain the connection between entanglement entropy and geometry. The AdS/CFT correspondence posits an equivalence between string theory in the "bulk" with AdS boundary conditions and certain quantum field theories. In the limit where the string theory is simply classical General Relativity, the Ryu-Takayanagi and more generally, the Hubeny-Rangamani-Takayanagi (HRT) formulae provide a way of relating the geometry of surfaces to entanglement entropy. A first-order bulk quantum correction to HRT was derived by Faulkner, Lewkowycz and Maldacena. This formula is generalized to include perturbative quantum corrections in the bulk at any (finite) order. Hurdles to spacetime emergence from entanglement entropy as described by HRT and its quantum

  1. Mixture based outlier filtration

    Czech Academy of Sciences Publication Activity Database

    Pecherková, Pavla; Nagy, Ivan

    2006-01-01

    Roč. 46, č. 2 (2006), s. 30-35 ISSN 1210-2709 R&D Projects: GA MŠk 1M0572; GA MDS 1F43A/003/120 Institutional research plan: CEZ:AV0Z10750506 Keywords : data filtration * system modelling * mixture models Subject RIV: BD - Theory of Information http://library.utia.cas.cz/prace/20060165.pdf

  2. Computer simulation-molecular-thermodynamic framework to predict the micellization behavior of mixtures of surfactants: application to binary surfactant mixtures.

    Science.gov (United States)

    Iyer, Jaisree; Mendenhall, Jonathan D; Blankschtein, Daniel

    2013-05-30

    We present a computer simulation-molecular-thermodynamic (CSMT) framework to model the micellization behavior of mixtures of surfactants in which hydration information from all-atomistic simulations of surfactant mixed micelles and monomers in aqueous solution is incorporated into a well-established molecular-thermodynamic framework for mixed surfactant micellization. In addition, we address the challenges associated with the practical implementation of the CSMT framework by formulating a simpler mixture CSMT model based on a composition-weighted average approach involving single-component micelle simulations of the mixture constituents. We show that the simpler mixture CSMT model works well for all of the binary surfactant mixtures considered, except for those containing alkyl ethoxylate surfactants, and rationalize this finding molecularly. The mixture CSMT model is then utilized to predict mixture CMCs, and we find that the predicted CMCs compare very well with the experimental CMCs for various binary mixtures of linear surfactants. This paper lays the foundation for the mixture CSMT framework, which can be used to predict the micellization properties of mixtures of surfactants that possess a complex chemical architecture, and are therefore not amenable to traditional molecular-thermodynamic modeling.
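
    As a simple point of comparison with such predictions, the classical ideal-mixing (Clint) estimate combines single-component CMCs as 1/CMC_mix = Σ_i α_i/CMC_i; a minimal sketch (this is the textbook ideal model, not the CSMT framework described above):

```python
def ideal_mixture_cmc(mole_fractions, cmcs):
    """Ideal-mixing (Clint) estimate of a mixture CMC:
    1/CMC_mix = sum_i alpha_i / CMC_i, with alpha_i the bulk
    surfactant mole fractions (summing to 1)."""
    return 1.0 / sum(a / c for a, c in zip(mole_fractions, cmcs))
```

    For a 50:50 mixture of surfactants with CMCs of 8.0 and 2.0 mM, the ideal estimate is 3.2 mM, weighted toward the lower-CMC component.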

  3. A Causal Theory of Mnemonic Confabulation.

    Science.gov (United States)

    Bernecker, Sven

    2017-01-01

    This paper attempts to answer the question of what defines mnemonic confabulation vis-à-vis genuine memory. The two extant accounts of mnemonic confabulation as "false memory" and as ill-grounded memory are shown to be problematic, for they cannot account for the possibility of veridical confabulation, ill-grounded memory, and well-grounded confabulation. This paper argues that the defining characteristic of mnemonic confabulation is that it lacks the appropriate causal history. In the confabulation case, there is no proper counterfactual dependence of the state of seeming to remember on the corresponding past representation.

  4. A Causal Theory of Mnemonic Confabulation

    Directory of Open Access Journals (Sweden)

    Sven Bernecker

    2017-07-01

    This paper attempts to answer the question of what defines mnemonic confabulation vis-à-vis genuine memory. The two extant accounts of mnemonic confabulation as “false memory” and as ill-grounded memory are shown to be problematic, for they cannot account for the possibility of veridical confabulation, ill-grounded memory, and well-grounded confabulation. This paper argues that the defining characteristic of mnemonic confabulation is that it lacks the appropriate causal history. In the confabulation case, there is no proper counterfactual dependence of the state of seeming to remember on the corresponding past representation.

  5. De Broglie's causal interpretations of quantum mechanics

    International Nuclear Information System (INIS)

    Yoav Ben-Dov

    1989-01-01

    In this article we trace the history of de Broglie's two causal interpretations of quantum mechanics, namely the double solution and the pilot wave theories, at the two periods in which he developed them: 1924-27 and 1952 onwards. Examining the reasons for which he always preferred the first theory to the second, reasons that are mainly concerned with the question of the physical nature of the quantum wave function, we try to show the continuity and the coherence of his underlying vision.

  6. [Therapy of polyneuropathies. Causal and symptomatic].

    Science.gov (United States)

    Müller-Felber, W

    2001-05-28

    In the first instance, polyneuropathies are treated causally. The most common underlying causes are diabetes mellitus and alcohol abuse. In a large number of patients with polyneuropathy, however, the underlying cause cannot be definitively identified. For these, but equally for patients with an etiologically clear polyneuropathy, a stock-taking of clinical symptoms should be carried out and, where indicated, symptomatic treatment initiated. In addition to medication aimed at combating pain, muscular spasm, and autonomic functional disorders, and at the prevention of thrombosis, physical measures (physiotherapy, foot care, orthopedic shoes) are of primary importance.

  7. Conditional Granger Causality of Diffusion Processes

    Czech Academy of Sciences Publication Activity Database

    Wahl, B.; Feudel, U.; Hlinka, Jaroslav; Wächter, M.; Peinke, J.; Freund, J.A.

    2017-01-01

    Roč. 90, č. 10 (2017), č. článku 197. ISSN 1434-6028 R&D Projects: GA ČR GA13-23940S; GA MZd(CZ) NV15-29835A Institutional support: RVO:67985807 Keywords : Granger causality * stochastic process * diffusion process * nonlinear dynamical systems Subject RIV: BB - Applied Statistics, Operational Research OBOR OECD: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8) Impact factor: 1.461, year: 2016

  8. Extracting Spurious Latent Classes in Growth Mixture Modeling with Nonnormal Errors

    Science.gov (United States)

    Guerra-Peña, Kiero; Steinley, Douglas

    2016-01-01

    Growth mixture modeling is generally used for two purposes: (1) to identify mixtures of normal subgroups and (2) to approximate oddly shaped distributions by a mixture of normal components. Often in applied research this methodology is applied to both of these situations indiscriminately, using the same fit statistics and likelihood ratio tests. This…

  9. Foundational perspectives on causality in large-scale brain networks

    Science.gov (United States)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  10. Are bruxism and the bite causally related?

    Science.gov (United States)

    Lobbezoo, F; Ahlberg, J; Manfredini, D; Winocur, E

    2012-07-01

    In the dental profession, the belief that bruxism and dental (mal-)occlusion ('the bite') are causally related is widespread. The aim of this review was to critically assess the available literature on this topic. A PubMed search of the English-language literature, using the query 'Bruxism [Majr] AND (Dental Occlusion [Majr] OR Malocclusion [Majr])', yielded 93 articles, of which 46 papers were finally included in the present review*. Part of the included publications dealt with the possible associations between bruxism and aspects of occlusion, from which it was concluded that neither for occlusal interferences nor for factors related to the anatomy of the oro-facial skeleton, there is any evidence available that they are involved in the aetiology of bruxism. Instead, there is a growing awareness of other factors (viz. psychosocial and behavioural ones) being important in the aetiology of bruxism. Another part of the included papers assessed the possible mediating role of occlusion between bruxism and its purported consequences (e.g. tooth wear, loss of periodontal tissues, and temporomandibular pain and dysfunction). Even though most dentists agree that bruxism may have several adverse effects on the masticatory system, for none of these purported adverse effects, evidence for a mediating role of occlusion and articulation has been found to date. Hence, based on this review, it should be concluded that to date, there is no evidence whatsoever for a causal relationship between bruxism and the bite. © 2012 Blackwell Publishing Ltd.

  11. [Antibiotic resistance by nosocomial infections' causal agents].

    Science.gov (United States)

    Salazar-Holguín, Héctor Daniel; Cisneros-Robledo, María Elena

    2016-01-01

    Antibiotic resistance among the causal agents of nosocomial infections (NI) is a serious global problem that also affects the Mexican Institute of Social Security's Regional General Hospital 1 in Chihuahua, Mexico, although with local features that needed to be specified and evaluated in order to arrive at an effective therapy. An observational, descriptive, and prospective study was conducted by means of active surveillance throughout 2014 to detect nosocomial infections and, for each case, to perform an epidemiologic study, culture, and antibiogram in order to identify the causal agents and their antibiotic resistance and sensitivity. Among 13,527 hospital discharges, 1,079 displayed NI (8%); infections related to vascular lines, surgical sites, pneumonia, and the urinary tract stood out, together accounting for two thirds of the total. Culture and antibiogram were performed for 300 of them (27.8%), identifying 31 bacterial species, of which seven accounted for most isolates (77.9%): Escherichia coli, Staphylococcus aureus and epidermidis, Pseudomonas aeruginosa, Acinetobacter baumannii, Klebsiella pneumoniae, and Enterobacter cloacae. These showed multiresistance to the 34 tested antibiotics, with the exception of seven antibiotics to which resistance was low or absent: vancomycin, teicoplanin, linezolid, quinupristin-dalfopristin, piperacillin-tazobactam, amikacin, and the carbapenems. When these results were contrasted with the recommendations of the clinical practice guidelines, several contradictions arose; the guidelines must therefore be taken with reservations and tested in each hospital, by means of cultures and antibiograms, in practically every case of nosocomial infection.

  12. Evidence for online processing during causal learning.

    Science.gov (United States)

    Liu, Pei-Pei; Luhmann, Christian C

    2015-03-01

    Many models of learning describe both the end product of learning (e.g., causal judgments) and the cognitive mechanisms that unfold on a trial-by-trial basis. However, the methods employed in the literature typically provide only indirect evidence about the unfolding cognitive processes. Here, we utilized a simultaneous secondary task to measure cognitive processing during a straightforward causal-learning task. The results from three experiments demonstrated that covariation information is not subject to uniform cognitive processing. Instead, we observed systematic variation in the processing dedicated to individual pieces of covariation information. In particular, observations that are inconsistent with previously presented covariation information appear to elicit greater cognitive processing than do observations that are consistent with previously presented covariation information. In addition, the degree of cognitive processing appears to be driven by learning per se, rather than by nonlearning processes such as memory and attention. Overall, these findings suggest that monitoring learning processes at a finer level may provide useful psychological insights into the nature of learning.

  13. Diagnostic reasoning using qualitative causal models

    International Nuclear Information System (INIS)

    Sudduth, A.L.

    1992-01-01

    The application of expert systems to reasoning problems involving real-time data from plant measurements has been a topic of much research, but few practical systems have been deployed. One obstacle to wider use of expert systems in applications involving real-time data is the lack of adequate knowledge representation methodologies for dynamic processes. Knowledge bases composed mainly of rules have disadvantages when applied to dynamic processes and real-time data. This paper describes a methodology for the development of qualitative causal models that can be used as knowledge bases for reasoning about process dynamic behavior. These models provide a systematic method for knowledge base construction, considerably reducing the engineering effort required. They also offer much better opportunities for verification and validation of the knowledge base, thus increasing the possibility of the application of expert systems to reasoning about mission critical systems. Starting with the Signed Directed Graph (SDG) method that has been successfully applied to describe the behavior of diverse dynamic processes, the paper shows how certain non-physical behaviors that result from abstraction may be eliminated by applying causal constraint to the models. The resulting Extended Signed Directed Graph (ESDG) may then be compiled to produce a model for use in process fault diagnosis. This model based reasoning methodology is used in the MOBIAS system being developed by Duke Power Company under EPRI sponsorship. 15 refs., 4 figs
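The signed-directed-graph idea described above can be illustrated in a few lines of code: nodes are process variables, edges carry a sign, and a root deviation is propagated through the graph to predict the qualitative fault signature. The sketch below is a generic illustration with an invented process fragment and invented variable names, not the MOBIAS system or the ESDG compilation it describes.

```python
# Tiny signed-directed-graph (SDG) propagation sketch. Edges carry +1/-1: a
# positive edge propagates a deviation with the same sign, a negative edge
# inverts it. Hypothetical fragment: opening a valve raises flow; higher flow
# raises the tank level and lowers the temperature of a cooled stream.
from collections import deque

EDGES = {
    "valve_opening": [("flow", +1)],
    "flow": [("tank_level", +1), ("temperature", -1)],
}

def propagate(source, sign):
    """Predict qualitative deviations (+1 high, -1 low) caused by a root fault."""
    seen = {source: sign}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for succ, edge_sign in EDGES.get(node, []):
            if succ not in seen:           # keep the first explanation found
                seen[succ] = seen[node] * edge_sign
                queue.append(succ)
    return seen

print(propagate("valve_opening", +1))
# {'valve_opening': 1, 'flow': 1, 'tank_level': 1, 'temperature': -1}
```

Fault diagnosis then runs this in reverse: given an observed pattern of deviations, candidate root causes are those nodes whose predicted signatures match the measurements.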

  14. Introducing mechanics by tapping core causal knowledge

    International Nuclear Information System (INIS)

    Klaassen, Kees; Westra, Axel; Emmett, Katrina; Eijkelhof, Harrie; Lijnse, Piet

    2008-01-01

    This article concerns an outline of an introductory mechanics course. It is based on the argument that various uses of the concept of force (e.g. from Kepler, Newton and everyday life) share an explanatory strategy based on core causal knowledge. The strategy consists of (a) the idea that a force causes a deviation from how an object would move of its own accord (i.e. its force-free motion), and (b) an incentive to search, where the motion deviates from the assumed force-free motion, for recurring configurations with which such deviations can be correlated (interaction theory). Various assumptions can be made concerning both the force-free motion and the interaction theory, thus giving rise to a variety of specific explanations. Kepler's semi-implicit intuition about the force-free motion is rest, Newton's explicit assumption is uniform rectilinear motion, while in everyday explanations a diversity of pragmatic suggestions can be recognized. The idea is that the explanatory strategy, once made explicit by drawing on students' intuitive causal knowledge, can be made to function for students as an advance organizer, in the sense of a general scheme that they recognize but do not yet know how to detail for scientific purposes.

  15. Causal Inference in the Perception of Verticality.

    Science.gov (United States)

    de Winkel, Ksander N; Katliar, Mikhail; Diers, Daniel; Bülthoff, Heinrich H

    2018-04-03

    The perceptual upright is thought to be constructed by the central nervous system (CNS) as a vector sum; by combining estimates on the upright provided by the visual system and the body's inertial sensors with prior knowledge that upright is usually above the head. Recent findings furthermore show that the weighting of the respective sensory signals is proportional to their reliability, consistent with a Bayesian interpretation of a vector sum (Forced Fusion, FF). However, violations of FF have also been reported, suggesting that the CNS may rely on a single sensory system (Cue Capture, CC), or choose to process sensory signals based on inferred signal causality (Causal Inference, CI). We developed a novel alternative-reality system to manipulate visual and physical tilt independently. We tasked participants (n = 36) to indicate the perceived upright for various (in-)congruent combinations of visual-inertial stimuli, and compared models based on their agreement with the data. The results favor the CI model over FF, although this effect became unambiguous only for large discrepancies (±60°). We conclude that the notion of a vector sum does not provide a comprehensive explanation of the perception of the upright, and that CI offers a better alternative.

  16. Separating Underdetermined Convolutive Speech Mixtures

    DEFF Research Database (Denmark)

    Pedersen, Michael Syskind; Wang, DeLiang; Larsen, Jan

    2006-01-01

    We propose a method for underdetermined blind source separation of convolutive mixtures. The proposed framework is applicable to the separation of instantaneous as well as convolutive speech mixtures. It is possible to iteratively extract each speech signal from the mixture by combining blind source separation...

  17. Mixtures of truncated basis functions

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael

    2012-01-01

    In this paper we propose a framework, called mixtures of truncated basis functions (MoTBFs), for representing general hybrid Bayesian networks. The proposed framework generalizes both the mixture of truncated exponentials (MTEs) framework and the mixture of polynomials (MoPs) framework. Similar...

  18. Separation of organic azeotropic mixtures by pervaporation. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    Baker, R.W.

    1991-12-01

    Distillation is a commonly used separation technique in the petroleum refining and chemical processing industries. However, there are a number of potential separations involving azeotropic and close-boiling organic mixtures that cannot be separated efficiently by distillation. Pervaporation is a membrane-based process that uses selective permeation through membranes to separate liquid mixtures. Because the separation process is not affected by the relative volatility of the mixture components being separated, pervaporation can be used to separate azeotropes and close-boiling mixtures. Our results showed that pervaporation membranes can be used to separate azeotropic mixtures efficiently, a result that is not achievable with simple distillation. The membranes were 5–10 times more permeable to one of the components of the mixture, concentrating it in the permeate stream. For example, the membrane was 10 times more permeable to ethanol than methyl ethyl ketone, producing 60% ethanol permeate from an azeotropic mixture of ethanol and methyl ethyl ketone containing 18% ethanol. For the ethyl acetate/water mixture, the membranes showed a very high selectivity to water (>300), and the permeate was 50–100 times enriched in water relative to the feed. The membranes had permeate fluxes on the order of 0.1–1 kg/m²·h in the operating range of 55–70 °C. Higher fluxes were obtained by increasing the operating temperature.
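The permeate enrichment reported above can be related to the feed composition through the standard separation-factor relation used in membrane separations. The sketch below is that textbook relation, not the report's own model; with the report's rough numbers (a 10-fold permeability ratio and an 18% ethanol feed) it returns a permeate fraction of the same order as the reported 60% (differences reflect how selectivity is defined and measured).

```python
# Permeate composition from a pervaporation separation factor (a textbook
# relation, not taken from the report): beta = [y/(1-y)] / [x/(1-x)],
# solved for the permeate fraction y of the preferentially permeating species.

def permeate_fraction(x_feed, beta):
    """Permeate mass/mole fraction y given feed fraction x and separation factor beta."""
    ratio = beta * x_feed / (1.0 - x_feed)
    return ratio / (1.0 + ratio)

# Assumed inputs echoing the report: beta ~ 10, feed 18% ethanol.
y = permeate_fraction(0.18, 10.0)
print(f"permeate ethanol fraction ≈ {y:.2f}")  # prints "permeate ethanol fraction ≈ 0.69"
```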

  19. Causal attribution and psychobiological response to competition in young men.

    Science.gov (United States)

    Salvador, Alicia; Costa, Raquel; Hidalgo, Vanesa; González-Bono, Esperanza

    2017-06-01

    A contribution to a special issue on Hormones and Human Competition. Psychoneuroendocrine effects of competition have been widely accepted as a clear example of the relationship between androgens and aggressive/dominant behavior in humans. However, results about the effects of competitive outcomes are quite heterogeneous, suggesting that personal and contextual factors play a moderating role in this relationship. To further explore these dimensions, we aimed to examine (i) the effect of competition and its outcome on the psychobiological response to a laboratory competition in young men, and (ii) the moderating role of some cognitive dimensions such as causal attributions. To do so, we compared the responses of 56 healthy young men faced with two competitive tasks with different instructions. Twenty-eight men carried out a task whose instructions led subjects to think the outcome was due to their personal performance ("merit" task), whereas 28 other men faced a task whose outcome was attributable to luck ("chance" task). In both cases, outcome was manipulated by the experimenter. Salivary steroid hormones (testosterone and cortisol), cardiovascular variables (heart rate and blood pressure), and emotional state (mood and anxiety) were measured at different moments before, during and after both tasks. Our results did not support the "winner-loser effect" because no significant differences were found in the responses of winners and losers. However, significantly higher values on the testosterone and cardiovascular variables, along with slight decreases in positive mood, were associated with the merit-based competition, but not the chance-based condition. In addition, an exploratory factorial analysis grouped the response components into two patterns traditionally related to more active or more passive behaviors. Thus, our results suggest that the perception of contributing to the outcome is relevant in the psychobiological response to competition in men. 
Overall, our…

  20. Do transcendent universals confer causal powers? [¿Confieren poderes causales los universales trascendentes?]

    Directory of Open Access Journals (Sweden)

    José Tomás Alvarado Marambio

    2013-11-01

    Full Text Available This work discusses the so-called ‘Eleatic’ argument against the existence of transcendent universals, i.e. universals that do not require instantiation in order to exist. The Eleatic Principle states that everything that exists produces a difference in the causal powers of something. As transcendent universals seem not to produce such a difference, they seem not to exist. The argument depends crucially on the justification and interpretation of the Eleatic Principle. It is argued, first, that it is not at all clear that the principle is justified and, second, that there are several alternatives for its interpretation, depending on the theories one endorses about modality and causality. Anti-realist theories of modality or causality are not well suited to making sense of what a ‘causal power’ should be; nor is a realist theory of causality conjoined with a combinatorial theory of possible worlds. A ‘causal power’ seems to be better understood in connection with a realist, non-reductionist theory of causality and a causal theory of modality. With the Eleatic Principle understood in this way, however, it is argued that transcendent universals do ‘produce’ a difference in causal powers, for every causal connection requires such universals for its existence.

  1. Trivariate causality between economic growth, urbanisation and electricity consumption in Angola: Cointegration and causality analysis

    International Nuclear Information System (INIS)

    Solarin, Sakiru Adebola; Shahbaz, Muhammad

    2013-01-01

    This paper investigates the causal relationship between economic growth, urbanisation and electricity consumption in the case of Angola, while utilizing the data over the period of 1971–2009. We have applied Lee and Strazicich (2003. The Review of Economics and Statistics 63, 1082–1089; 2004. Working Paper. Department of Economics, Appalachian State University) unit root tests to examine the stationarity properties of the series. Using the Gregory–Hansen structural break cointegration procedure as a complement, we employ the ARDL bounds test to investigate long run relationships. The VECM Granger causality test is subsequently used to examine the direction of causality between economic growth, urbanisation, and electricity consumption. Our results indicate the existence of long run relationships. We further observe evidence in favour of bidirectional causality between electricity consumption and economic growth. The feedback hypothesis is also found between urbanisation and economic growth. Urbanisation and electricity consumption Granger cause each other. We conclude that Angola is an energy-dependent country. Consequently, the relevant authorities should boost electricity production as one of the means of achieving sustainable economic development in the long run. - Highlights: • We consider the link between electricity consumption and economic growth in Angola. • Urbanisation is added to turn the research into a trivariate investigation. • Various time series procedures are used. • Results show that increasing electricity will improve economic growth in Angola. • Results show urbanisation reduced economic growth during the civil war

  2. God Does Not Play Dice: Causal Determinism and Preschoolers' Causal Inferences

    Science.gov (United States)

    Schulz, Laura E.; Sommerville, Jessica

    2006-01-01

    Three studies investigated children's belief in causal determinism. If children are determinists, they should infer unobserved causes whenever observed causes appear to act stochastically. In Experiment 1, 4-year-olds saw a stochastic generative cause and inferred the existence of an unobserved inhibitory cause. Children traded off inferences…

  3. Quantum causality conceptual issues in the causal theory of quantum mechanics

    CERN Document Server

    Riggs, Peter J; French, Steven RD

    2009-01-01

    This is a treatise devoted to the foundations of quantum physics and the role that causality plays in the microscopic world governed by the laws of quantum mechanics. The book is controversial and will engender some lively debate on the various issues raised.

  4. Causality as a Rigorous Notion and Quantitative Causality Analysis with Time Series

    Science.gov (United States)

    Liang, X. S.

    2017-12-01

    Given two time series, can one faithfully tell, in a rigorous and quantitative way, the cause and effect between them? Here we show that this important and challenging question (one of the major challenges in the science of big data), which is of interest in a wide variety of disciplines, has a positive answer. Particularly, for linear systems, the maximal likelihood estimator of the causality from a series X2 to another series X1, written T2→1, turns out to be concise in form: T2→1 = (C11 C12 C2,d1 − C12² C1,d1) / (C11² C22 − C11 C12²), where Cij (i,j = 1,2) is the sample covariance between Xi and Xj, and Ci,dj the covariance between Xi and ΔXj/Δt, the difference approximation of dXj/dt using the Euler forward scheme. An immediate corollary is that causation implies correlation, but not vice versa, resolving the long-standing debate over causation versus correlation. The above formula has been validated with touchstone series purportedly generated with one-way causality that evade the classical approaches such as the Granger causality test and transfer entropy analysis. It has also been applied successfully to the investigation of many real problems. Through a simple analysis with the stock series of IBM and GE, an unusually strong one-way causality is identified from the former to the latter in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a "Giant" for the computer market. Another example presented here regards the cause-effect relation between the two climate modes, El Niño and the Indian Ocean Dipole (IOD). In general, these modes are mutually causal, but the causality is asymmetric. To El Niño, the information flowing from the IOD manifests itself as a propagation of uncertainty from the Indian Ocean. In the third example, an unambiguous one-way causality is found between CO2 and the global mean temperature anomaly. While it is confirmed that CO2 indeed drives the recent global warming
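The estimator quoted in the abstract is straightforward to compute from sample covariances. The sketch below codes it directly, assuming a unit time step and the Euler forward difference; the one-way coupled series are synthetic and purely illustrative.

```python
import numpy as np

# Maximum-likelihood estimate of the information flow T_{2->1} from the
# covariance formula quoted above (linear systems, Euler forward differences).

def liang_t21(x1, x2, dt=1.0):
    """Estimated information flow from series x2 to series x1."""
    dx1 = (x1[1:] - x1[:-1]) / dt              # Euler forward estimate of dx1/dt
    a, b = x1[:-1], x2[:-1]
    C = np.cov(np.vstack([a, b, dx1]))         # 3x3 sample covariance matrix
    c11, c12, c22 = C[0, 0], C[0, 1], C[1, 1]
    c1d1, c2d1 = C[0, 2], C[1, 2]
    return (c11 * c12 * c2d1 - c12**2 * c1d1) / (c11**2 * c22 - c11 * c12**2)

# Synthetic one-way coupled pair: x2 drives x1, x1 never feeds back into x2.
rng = np.random.default_rng(0)
n = 20000
x1, x2 = np.zeros(n), np.zeros(n)
for t in range(n - 1):
    x1[t + 1] = 0.5 * x1[t] + 0.4 * x2[t] + rng.normal()
    x2[t + 1] = 0.7 * x2[t] + rng.normal()

print(liang_t21(x1, x2))  # clearly nonzero: information flows from x2 to x1
print(liang_t21(x2, x1))  # near zero: no flow in the reverse direction
```

Swapping the arguments estimates the reverse flow, so the asymmetry of the causality (the abstract's El Niño/IOD point) falls out of the same formula.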

  5. Causality, randomness, intelligibility, and the epistemology of the cell.

    Science.gov (United States)

    Dougherty, Edward R; Bittner, Michael L

    2010-06-01

    Because the basic unit of biology is the cell, biological knowledge is rooted in the epistemology of the cell, and because life is the salient characteristic of the cell, its epistemology must be centered on its livingness, not its constituent components. The organization and regulation of these components in the pursuit of life constitute the fundamental nature of the cell. Thus, regulation sits at the heart of biological knowledge of the cell and the extraordinary complexity of this regulation conditions the kind of knowledge that can be obtained, in particular, the representation and intelligibility of that knowledge. This paper is essentially split into two parts. The first part discusses the inadequacy of everyday intelligibility and intuition in science and the consequent need for scientific theories to be expressed mathematically without appeal to commonsense categories of understanding, such as causality. Having set the backdrop, the second part addresses biological knowledge. It briefly reviews modern scientific epistemology from a general perspective and then turns to the epistemology of the cell. In analogy with a multi-faceted factory, the cell utilizes a highly parallel distributed control system to maintain its organization and regulate its dynamical operation in the face of both internal and external changes. Hence, scientific knowledge is constituted by the mathematics of stochastic dynamical systems, which model the overall relational structure of the cell and how these structures evolve over time, stochasticity being a consequence of the need to ignore a large number of factors while modeling relatively few in an extremely complex environment.

  6. Electronic components

    CERN Document Server

    Colwell, Morris A

    1976-01-01

    Electronic Components provides a basic grounding in the practical aspects of using and selecting electronics components. The book describes the basic requirements needed to start practical work on electronic equipment, resistors and potentiometers, capacitance, and inductors and transformers. The text discusses semiconductor devices such as diodes, thyristors and triacs, transistors and heat sinks, logic and linear integrated circuits (I.C.s) and electromechanical devices. Common abbreviations applied to components are provided. Constructors and electronics engineers will find the book useful

  7. Component testing

    International Nuclear Information System (INIS)

    Hutchings, M.T.; Schofield, Peter; Seymour, W.A.J.

    1986-01-01

    A method for non-destructive testing of an industrial component to ascertain if it is a single crystal, and to find the crystal orientations of those parts of the component which are single crystals, involves irradiating the component with a monochromatic collimated neutron beam. Diffracted neutron beams are observed live by means of LiF/ZnS composite screen, an image intensifier and a television camera and screen. (author)

  8. Initiation of explosive mixtures having multi-sized structures

    Science.gov (United States)

    Vasil'ev, A. A.; Vasiliev, V. A.; Trotsyuk, A. V.

    2016-10-01

    The theory of strong blast was used as the basis for an experimental method of determining the energy of a source that provides initiation of a combustible mixture. For mono-fuel mixtures, the following parameters were determined experimentally: the critical initiation energy of a cylindrical detonation wave in the mixtures 2H2+O2 and C2H2+2.5O2 (exploding wire), and the critical initiation energy of a spherical detonation in a mixture of C2H2+2.5O2 (electrical discharge). Similarly, for double-fuel mixtures of acetylene - nitrous oxide - oxygen (which have bifurcation cellular structures), the critical initiation energy of a spherical wave was also determined. It was found that for the mixture stoichiometric in both fuel components, the critical energy of the mixture with the bifurcation structure was several times lower than the critical energy for the mono-fuel mixture, in which the cell size at a given pressure is determined by the large scale of the bifurcation cells. This result shows that the critical energy decreases with an increase in the number of "hot spots", which are the numerous areas of collision of transverse waves of large and small scales in a mixture with bifurcation properties.

  9. utilization of ensiled metabolizable mixture of cassava peel and caged

    African Journals Online (AJOL)

    Toluwande

    2011-09-05

    … component of farm animal feed has been documented [8, 9]. Therefore, this investigation was carried out to study the response of broiler chickens to a fermented mixture of cassava peel and caged layers' manure (obtained as farm residue or waste) fed as a component of the energy source in the feed.

  10. Entanglement, non-Markovianity, and causal non-separability

    Science.gov (United States)

    Milz, Simon; Pollock, Felix A.; Le, Thao P.; Chiribella, Giulio; Modi, Kavan

    2018-03-01

    Quantum mechanics, in principle, allows for processes with indefinite causal order. However, most of these causal anomalies have not yet been detected experimentally. We show that every such process can be simulated experimentally by means of non-Markovian dynamics with a measurement on additional degrees of freedom. In detail, we provide an explicit construction to implement arbitrary acausal processes. Furthermore, we give necessary and sufficient conditions for open system dynamics with measurement to yield processes that respect causality locally, and find that tripartite entanglement and nonlocal unitary transformations are crucial requirements for the simulation of causally indefinite processes. These results show a direct connection between three counter-intuitive concepts: entanglement, non-Markovianity, and causal non-separability.

  11. A MATLAB toolbox for Granger causal connectivity analysis.

    Science.gov (United States)

    Seth, Anil K

    2010-02-15

    Assessing directed functional connectivity from time series data is a key challenge in neuroscience. One approach to this problem leverages a combination of Granger causality analysis and network theory. This article describes a freely available MATLAB toolbox--'Granger causal connectivity analysis' (GCCA)--which provides a core set of methods for performing this analysis on a variety of neuroscience data types including neuroelectric, neuromagnetic, functional MRI, and other neural signals. The toolbox includes core functions for Granger causality analysis of multivariate steady-state and event-related data, functions to preprocess data, assess statistical significance and validate results, and to compute and display network-level indices of causal connectivity including 'causal density' and 'causal flow'. The toolbox is deliberately small, enabling its easy assimilation into the repertoire of researchers. It is however readily extensible given proficiency with the MATLAB language. Copyright 2009 Elsevier B.V. All rights reserved.
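The kind of analysis the toolbox performs can be illustrated with a minimal pairwise Granger-causality statistic: past values of X "Granger-cause" Y if they reduce the prediction error of Y beyond what past Y alone achieves. The sketch below is an independent Python illustration, not part of the MATLAB GCCA toolbox; the variance-ratio statistic and the synthetic data are standard textbook choices.

```python
import numpy as np

# Minimal pairwise Granger-causality statistic (illustrative sketch; not the
# GCCA toolbox, which adds preprocessing, significance testing, and
# network-level indices such as causal density and causal flow).

def _lags(v, p):
    """Columns v_{t-1}, ..., v_{t-p} for t = p .. len(v)-1."""
    return np.column_stack([v[p - k: len(v) - k] for k in range(1, p + 1)])

def granger_stat(y, x, p=2):
    """log(restricted residual variance / full residual variance);
    values > 0 mean past x helps predict y beyond past y alone."""
    target = y[p:]
    Zr = np.column_stack([np.ones(len(target)), _lags(y, p)])   # past y only
    Zf = np.column_stack([Zr, _lags(x, p)])                     # plus past x
    res_r = target - Zr @ np.linalg.lstsq(Zr, target, rcond=None)[0]
    res_f = target - Zf @ np.linalg.lstsq(Zf, target, rcond=None)[0]
    return float(np.log(res_r.var() / res_f.var()))

# Synthetic pair: x drives y with a one-step delay; x itself is white noise.
rng = np.random.default_rng(1)
n = 5000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.3 * y[t - 1] + 0.8 * x[t - 1] + 0.5 * rng.normal()

print(granger_stat(y, x))  # large: x Granger-causes y
print(granger_stat(x, y))  # near zero: y does not Granger-cause x
```

Applied to every ordered pair of channels, statistics like this yield the directed network on which the toolbox's 'causal density' and 'causal flow' indices are computed.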

  12. Energy consumption and economic growth: Parametric and non-parametric causality testing for the case of Greece

    International Nuclear Information System (INIS)

    Dergiades, Theologos; Martinopoulos, Georgios; Tsoulfidis, Lefteris

    2013-01-01

    The objective of this paper is to contribute towards the understanding of the linear and non-linear causal linkages between energy consumption and economic activity, making use of annual time series data of Greece for the period 1960–2008. Our study has two salient features: first, the total energy consumption has been adjusted for qualitative differences among its constituent components through the thermodynamics of energy conversion. In doing so, we rule out the possibility of a misleading inference with respect to causality due to aggregation bias. Second, the investigation of the causal linkage between economic growth and the quality-adjusted total energy consumption is conducted within a non-linear context. Our empirical results reveal significant unidirectional causal linkages, both linear and non-linear, running from total useful energy to economic growth. These findings may provide valuable information for the design of more effective energy policies with respect to both the consumption of energy and environmental protection. - Highlights: ► The energy consumption and economic growth nexus is investigated for Greece. ► A quality-adjusted energy series is used in our analysis. ► The causality testing procedure is conducted within a non-linear context. ► A causality running from energy consumption to economic growth is verified.

  13. The Interplay of Implicit Causality, Structural Heuristics, and Anaphor Type in Ambiguous Pronoun Resolution.

    Science.gov (United States)

    Järvikivi, Juhani; van Gompel, Roger P G; Hyönä, Jukka

    2017-06-01

    Two visual-world eye-tracking experiments investigating pronoun resolution in Finnish examined the time course of implicit causality information relative to both grammatical role and order-of-mention information. Experiment 1 showed an effect of implicit causality that appeared at the same time as the first-mention preference. Furthermore, when we counterbalanced the semantic roles of the verbs, we found no effect of grammatical role, suggesting that the commonly observed subject preference has a large semantic component. Experiment 2 showed that both the personal pronoun hän and the demonstrative tämä preferred the antecedent consistent with the implicit causality bias; tämä was not interpreted as referring to the semantically non-prominent entity. In contrast, structural prominence affected hän and tämä differently: we found a first-mention preference for hän, but a second-mention preference for tämä. The results suggest that semantic implicit causality information has an immediate effect on pronoun resolution and that its use is not delayed relative to order-of-mention information. Furthermore, they show that order-of-mention differentially affects different types of anaphoric expressions, whereas semantic information has the same effect across types.

  14. Causal attribution among women with breast cancer

    Directory of Open Access Journals (Sweden)

    Ana Carolina W. B. Peuker

    2016-01-01

    Full Text Available Causal attribution among women with breast cancer was studied. The study included 157 outpatient women with breast cancer. A form for sociodemographic and clinical data and the Revised Illness Perception Questionnaire (IPQ-R) were used. The results showed that women attributed breast cancer primarily to psychological causes, which does not correspond to the multifactorial causes validated by the scientific community. Providing high-quality, patient-centered care requires sensitivity to these women's beliefs about the causes of their cancer and awareness of how such beliefs can influence health behaviors after diagnosis. If women with breast cancer attribute the illness to modifiable factors, they may maintain a healthier lifestyle, improving their recovery and decreasing the probability of cancer recurrence.

  15. Delinquency among pathological gamblers: A causal approach.

    Science.gov (United States)

    Meyer, G; Fabian, T

    1992-03-01

    In a comprehensive research project on gamblers in self-help groups in West Germany one object of investigation was the question of whether or not pathological gambling has a criminogenic effect. 54.5% of the 437 members of Gamblers Anonymous interviewed stated that they had committed illegal actions in order to obtain money for gambling. Comparisons of this sub-group with those interviewees who did not admit having committed criminal offences show distinct differences: Those who admitted illegal action were more excessive in their gambling behavior and experienced a higher degree of subjective satisfaction through gambling. They also showed a more pronounced problem behavior and more psychosocial problems because of gambling. A multiple regression within the framework of path analysis was computed in order to explore causal links between pathological gambling and delinquency. The results support the hypothesis that pathological gambling can lead to delinquent behavior. Forensic implications are discussed.

  16. Causality Constraints in Conformal Field Theory

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Causality places nontrivial constraints on QFT in Lorentzian signature, for example fixing the signs of certain terms in the low energy Lagrangian. In d-dimensional conformal field theory, we show how such constraints are encoded in crossing symmetry of Euclidean correlators, and derive analogous constraints directly from the conformal bootstrap (analytically). The bootstrap setup is a Lorentzian four-point function corresponding to propagation through a shockwave. Crossing symmetry fixes the signs of certain log terms that appear in the conformal block expansion, which constrains the interactions of low-lying operators. As an application, we use the bootstrap to rederive the well known sign constraint on the (∂φ)4 coupling in effective field theory, from a dual CFT. We also find constraints on theories with higher spin conserved currents. Our analysis is restricted to scalar correlators, but we argue that similar methods should also impose nontrivial constraints on the interactions of spinni...

  17. A study in cosmology and causal thermodynamics

    International Nuclear Information System (INIS)

    Oliveira, H.P. de.

    1986-01-01

    Special-relativistic thermodynamic theories for reversible and irreversible processes in continuous media are studied. The formalism for equilibrium and non-equilibrium configurations, and theories that include the presence of gravitational fields, are discussed. A nebular model in contraction, with dissipative processes described by heat flux and volumetric viscosity, is analysed thermodynamically. This model is represented by a plane conformal metric. The temperature, pressure, entropy, and entropy production are calculated within the thermodynamic formalism that adopts the hypothesis of local equilibrium. The same analysis is carried out using causal thermodynamics, which establishes a local non-equilibrium entropy. Homogeneous and isotropic cosmological models are investigated using the new phenomenological equation for volumetric viscosity derived from causal thermodynamics. The resulting models have flat spatial sections (K=0), and some are free of singularities. The energy conditions are verified and the entropy production is calculated for physically reasonable models. (M.C.K.) [pt

  18. On the causal set–continuum correspondence

    International Nuclear Information System (INIS)

    Saravani, Mehdi; Aslanbeigi, Siavash

    2014-01-01

    We present two results that concern certain aspects of the question: when is a causal set well approximated by a Lorentzian manifold? The first result is a theorem that shows that the number–volume correspondence, if required to hold even for arbitrarily small regions, is best realized via Poisson sprinkling. The second result concerns a family of lattices in 1+1 dimensional Minkowski space, known as Lorentzian lattices, which we show provide a much better number–volume correspondence than Poisson sprinkling for large volumes. We argue, however, that this feature should not persist in higher dimensions. We conclude by conjecturing a form of the aforementioned theorem that holds under weaker assumptions, namely that Poisson sprinkling provides the best number–volume correspondence in 3+1 dimensions for spacetime regions with macroscopically large volumes. (paper)
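The number–volume correspondence via Poisson sprinkling is easy to illustrate in 1+1 dimensions. The sketch below (Python; the density is illustrative) sprinkles a causal diamond and reads off the induced causal order:

```python
import numpy as np

# Poisson sprinkling of a causal diamond in 1+1 Minkowski space.
# In light-cone coordinates (u, v) the diamond is the unit square and the
# flat volume element is just du dv. The density rho is illustrative.
rng = np.random.default_rng(0)
rho = 200.0
n = rng.poisson(rho * 1.0)           # number-volume correspondence: <N> = rho * V
u, v = rng.random(n), rng.random(n)

# Induced causal order on the sprinkled points:
# point i precedes point j iff u_i < u_j and v_i < v_j.
prec = (u[:, None] < u[None, :]) & (v[:, None] < v[None, :])
print(n, prec.sum())                 # points sprinkled, causal relations
```

Because the number of points in any subregion is Poisson with mean rho times its volume, the correspondence holds for arbitrarily small regions, which is the property singled out by the theorem above.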

  19. Exploring Torus Universes in Causal Dynamical Triangulations

    DEFF Research Database (Denmark)

    Budd, Timothy George; Loll, R.

    2013-01-01

    Motivated by the search for new observables in nonperturbative quantum gravity, we consider Causal Dynamical Triangulations (CDT) in 2+1 dimensions with the spatial topology of a torus. This system is of particular interest, because one can study not only the global scale factor, but also global shape variables in the presence of arbitrary quantum fluctuations of the geometry. Our initial investigation focusses on the dynamics of the scale factor and uncovers a qualitatively new behaviour, which leads us to investigate a novel type of boundary conditions for the path integral. Apart from setting the stage for the analysis of shape dynamics on the torus, the new set-up highlights the role of nontrivial boundaries and topology.

  20. Equity Theory Ratios as Causal Schemas.

    Science.gov (United States)

    Arvanitis, Alexios; Hantzi, Alexandra

    2016-01-01

    Equity theory approaches justice evaluations based on ratios of exchange inputs to exchange outcomes. Situations are evaluated as just if ratios are equal and unjust if unequal. We suggest that equity ratios serve a more fundamental cognitive function than the evaluation of justice. More particularly, we propose that they serve as causal schemas for exchange outcomes, that is, they assist in determining whether certain outcomes are caused by inputs of other people in the context of an exchange process. Equality or inequality of ratios in this sense points to an exchange process. Indeed, Study 1 shows that different exchange situations, such as disproportional or balanced proportional situations, create perceptions of give-and-take on the basis of equity ratios. Study 2 shows that perceptions of justice are based more on communicatively accepted rules of interaction than equity-based evaluations, thereby offering a distinction between an attribution and an evaluation cognitive process for exchange outcomes.

  1. Equity Theory Ratios as Causal Schemas

    Directory of Open Access Journals (Sweden)

    Alexios Arvanitis

    2016-08-01

    Full Text Available Equity theory approaches justice evaluations based on ratios of exchange inputs to exchange outcomes. Situations are evaluated as just if ratios are equal and unjust if unequal. We suggest that equity ratios serve a more fundamental cognitive function than the evaluation of justice. More particularly, we propose that they serve as causal schemas for exchange outcomes, that is, they assist in determining whether certain outcomes are caused by inputs of other people in the context of an exchange process. Equality or inequality of ratios in this sense points to an exchange process. Indeed, Study 1 shows that different exchange situations, such as disproportional or balanced proportional situations, create perceptions of give-and-take on the basis of equity ratios. Study 2 shows that perceptions of justice are based more on communicatively accepted rules of interaction than equity-based evaluations, thereby offering a distinction between an attribution and an evaluation cognitive process for exchange outcomes.

  2. Modeling and analysis of personal exposures to VOC mixtures using copulas

    Science.gov (United States)

    Su, Feng-Chiao; Mukherjee, Bhramar; Batterman, Stuart

    2014-01-01

    Environmental exposures typically involve mixtures of pollutants, which must be understood to evaluate cumulative risks, that is, the likelihood of adverse health effects arising from two or more chemicals. This study uses several powerful techniques to characterize dependency structures of mixture components in personal exposure measurements of volatile organic compounds (VOCs), with the aims of advancing the understanding of environmental mixtures, improving the ability to model mixture components in a statistically valid manner, and demonstrating broadly applicable techniques. We first describe characteristics of mixtures and introduce several terms, including the mixture fraction, which represents a mixture component's share of the total concentration of the mixture. Next, using VOC exposure data collected in the Relationship of Indoor Outdoor and Personal Air (RIOPA) study, mixtures are identified using positive matrix factorization (PMF) and by toxicological mode of action. Dependency structures of mixture components are examined using mixture fractions and modeled using copulas, which address dependencies of multiple variables across the entire distribution. Five candidate copulas (Gaussian, t, Gumbel, Clayton, and Frank) were evaluated, and the performance of the fitted models was assessed using simulation and mixture fractions. Cumulative cancer risks are calculated for mixtures, and results from copulas and multivariate lognormal models are compared to risks calculated using the observed data. Results obtained using the RIOPA dataset showed four VOC mixtures, representing gasoline vapor, vehicle exhaust, chlorinated solvents and disinfection by-products, and cleaning products and odorants. Often a single compound dominated the mixture; however, the mixture fractions were generally heterogeneous, in that the VOC composition of the mixture changed with concentration.
Three mixtures were identified by mode of action, representing VOCs associated with hematopoietic, liver
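The copula approach separates the dependence structure from the marginal distributions. A minimal sketch of a Gaussian copula with lognormal marginals follows; the component names, marginal parameters, and correlation are hypothetical, not RIOPA estimates:

```python
import numpy as np
from scipy import stats

# Gaussian copula with lognormal marginals for two hypothetical mixture
# components (names, parameters, and rho are illustrative).
rng = np.random.default_rng(0)
rho = 0.8
cov = np.array([[1.0, rho], [rho, 1.0]])

# correlated normals -> uniforms (the copula) -> marginal quantiles
z = rng.multivariate_normal([0.0, 0.0], cov, size=20000)
u = stats.norm.cdf(z)
voc_a = stats.lognorm.ppf(u[:, 0], s=1.0, scale=1.5)   # hypothetical VOC 1
voc_b = stats.lognorm.ppf(u[:, 1], s=0.8, scale=4.0)   # hypothetical VOC 2

# The rank correlation is set by the copula, not by the marginals.
print(round(float(stats.spearmanr(voc_a, voc_b)[0]), 2))
```

Because the copula fixes the dependence across the entire distribution, heavy-tailed marginals can be swapped in without altering the joint rank structure, which is what makes the approach attractive for modeling exposure mixtures.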

  3. Spinodal decomposition of chemically reactive binary mixtures

    Science.gov (United States)

    Lamorgese, A.; Mauri, R.

    2016-08-01

    We simulate the influence of a reversible isomerization reaction on the phase segregation process occurring after spinodal decomposition of a deeply quenched regular binary mixture, restricting attention to systems wherein material transport occurs solely by diffusion. Our theoretical approach follows a diffuse-interface model of partially miscible binary mixtures wherein the coupling between reaction and diffusion is addressed within the frame of nonequilibrium thermodynamics, leading to a linear dependence of the reaction rate on the chemical affinity. Ultimately, the rate for an elementary reaction depends on the local part of the chemical potential difference since reaction is an inherently local phenomenon. Based on two-dimensional simulation results, we express the competition between segregation and reaction as a function of the Damköhler number. For a phase-separating mixture with components having different physical properties, a skewed phase diagram leads, at large times, to a system converging to a single-phase equilibrium state, corresponding to the absolute minimum of the Gibbs free energy. This conclusion continues to hold for the critical phase separation of an ideally perfectly symmetric binary mixture, where the choice of final equilibrium state at large times depends on the initial mean concentration being slightly larger or less than the critical concentration.
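The diffusion-only segregation dynamics described above are commonly modeled by Cahn-Hilliard-type equations. Below is a minimal explicit sketch without the reaction term, in dimensionless units with purely illustrative parameters; it is not the authors' model or code:

```python
import numpy as np

# Explicit-Euler Cahn-Hilliard sketch of spinodal decomposition after a
# critical quench (dimensionless; parameters illustrative; no reaction term).
rng = np.random.default_rng(0)
N, dt, kappa, M = 64, 0.01, 1.0, 1.0
c = 0.1 * (rng.random((N, N)) - 0.5)    # small fluctuations around c = 0

def lap(f):
    """5-point periodic Laplacian on a unit-spacing grid."""
    return (np.roll(f, 1, 0) + np.roll(f, -1, 0)
            + np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4 * f)

mass0 = c.mean()
for _ in range(5000):
    mu = c**3 - c - kappa * lap(c)      # chemical potential of a regular mixture
    c += dt * M * lap(mu)               # conserved, purely diffusive transport

# Conserved dynamics keep the mean composition fixed while fluctuations
# grow into segregated domains (the variance increases toward saturation).
print(f"mass drift: {abs(c.mean() - mass0):.1e}, variance: {c.var():.2f}")
```

Coupling a reaction term to these dynamics, as in the paper, adds a source term to the right-hand side whose strength relative to diffusion is measured by the Damköhler number.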

  4. Predicting diffusivities in dense fluid mixtures

    Directory of Open Access Journals (Sweden)

    C. DARIVA

    1999-09-01

    Full Text Available In this work the Enskog solution of the Boltzmann equation, as corrected by Speedy, together with the Weeks-Chandler-Andersen (WCA) perturbation theory of liquids, is employed in correlating and predicting self-diffusivities of dense fluids. This theory is then used to estimate mutual diffusion coefficients of solutes at infinite dilution in sub- and supercritical solvents. We have also investigated the behavior of Fick diffusion coefficients in the proximity of a binary vapor-liquid critical point, since this subject is of great interest for extraction purposes. The approach presented here, which makes use of a density- and temperature-dependent hard-sphere diameter, is shown to be excellent for predicting diffusivities in dense pure fluids and fluid mixtures. The calculations involved highly nonideal mixtures as well as systems with high molecular asymmetry. The predicted diffusivities are in good agreement with the experimental data for the pure and binary systems. The methodology proposed here makes use only of pure-component information and mixture densities. Simple algebraic relations are proposed, without any binary adjustable parameters, and can be readily used for estimating diffusivities in multicomponent mixtures.

  5. Pool Boiling of Hydrocarbon Mixtures on Water

    Energy Technology Data Exchange (ETDEWEB)

    Boee, R.

    1996-09-01

    In maritime transport of liquefied natural gas (LNG) there is a risk of spilling cryogenic liquid onto water. The present doctoral thesis discusses transient boiling experiments in which liquid hydrocarbons were poured onto water and left to boil off. Composition changes during boiling are believed to be connected with the initiation of rapid phase transition in LNG spilled on water. 64 experimental runs were carried out, 14 using pure liquid methane, 36 using methane-ethane, and 14 using methane-propane binary mixtures of different composition. The water surface was open to the atmosphere and covered an area of 200 cm{sup 2} at 25 - 40{sup o}C. The heat flux was obtained by monitoring the change of mass vs time. The void fraction in the boiling layer was measured with a gamma densitometer, and a method for adapting this measurement concept to the case of a boiling cryogenic liquid mixture is suggested. Significant differences in the boil-off characteristics between pure methane and binary mixtures revealed by previous studies are confirmed. Pure methane is in film boiling, whereas the mixtures appear to enter the transitional boiling regime with only small amounts of the second component added. The results indicate that the common assumption that LNG will be in film boiling on water because of the high temperature difference may be questioned. Comparison with previous work shows that at this small scale the results are influenced by the experimental apparatus and procedures. 66 refs., 76 figs., 28 tabs.

  6. Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables

    Science.gov (United States)

    Barnett, Lionel; Barrett, Adam B.; Seth, Anil K.

    2009-12-01

    Granger causality is a statistical notion of causal influence based on prediction via vector autoregression. Developed originally in the field of econometrics, it has since found application in a broader arena, particularly in neuroscience. More recently transfer entropy, an information-theoretic measure of time-directed information transfer between jointly dependent processes, has gained traction in a similarly wide field. While it has been recognized that the two concepts must be related, the exact relationship has until now not been formally described. Here we show that for Gaussian variables, Granger causality and transfer entropy are entirely equivalent, thus bridging autoregressive and information-theoretic approaches to data-driven causal inference.
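For Gaussian variables the equivalence can be checked numerically: both quantities reduce to conditional variances (Schur complements of the covariance matrix), with transfer entropy equal to half the Granger log-ratio. A sketch with an illustrative VAR(1), not drawn from the paper:

```python
import numpy as np

# Simulate a bivariate Gaussian VAR(1) in which X drives Y (toy parameters).
rng = np.random.default_rng(1)
n = 2000
x = np.zeros(n); y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
    y[t] = 0.4 * y[t - 1] + 0.6 * x[t - 1] + rng.standard_normal()

yt, yp, xp = y[1:], y[:-1], x[:-1]

def cond_var(target, conditioners):
    """Schur complement: Var(target | conditioners) from the sample covariance."""
    Z = np.column_stack([target] + conditioners)
    S = np.cov(Z, rowvar=False)
    return S[0, 0] - S[0, 1:] @ np.linalg.solve(S[1:, 1:], S[1:, 0])

# Granger causality: log ratio of restricted to full residual variance.
gc = np.log(cond_var(yt, [yp]) / cond_var(yt, [yp, xp]))

# Transfer entropy as a difference of Gaussian conditional entropies,
# H(A|B) = 0.5 * ln(2*pi*e * Var(A|B)).
h_restricted = 0.5 * np.log(2 * np.pi * np.e * cond_var(yt, [yp]))
h_full = 0.5 * np.log(2 * np.pi * np.e * cond_var(yt, [yp, xp]))
te = h_restricted - h_full

print(np.isclose(te, gc / 2))  # True: TE = GC/2 for Gaussian variables
```

The entropy constants cancel in the difference, which is exactly why the two measures coincide (up to the factor of 2) in the Gaussian case.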

  7. Causal topology in future and past distinguishing spacetimes

    Science.gov (United States)

    Parrikar, Onkar; Surya, Sumati

    2011-08-01

    The causal structure of a strongly causal spacetime is particularly well endowed. Not only does it determine the conformal spacetime geometry when the spacetime dimension n > 2, as shown by Malament and Hawking-King-McCarthy (MHKM), but also the manifold dimension. The MHKM result, however, applies more generally to spacetimes satisfying the weaker causality condition of future and past distinguishability (FPD), and it is an important question whether the causal structure of such spacetimes can determine the manifold dimension. In this work, we show that the answer to this question is in the affirmative. We investigate the properties of future or past distinguishing spacetimes and show that their causal structures determine the manifold dimension. This gives a non-trivial generalization of the MHKM theorem and suggests that there is a causal topology for FPD spacetimes which encodes manifold dimension and which is strictly finer than the Alexandrov topology. We show that such a causal topology does exist. We construct it using a convergence criterion based on sequences of 'chain intervals' which are the causal analogues of null geodesic segments. We show that when the region of strong causality violation satisfies a local achronality condition, this topology is equivalent to the manifold topology in an FPD spacetime.

  8. Causal topology in future and past distinguishing spacetimes

    Energy Technology Data Exchange (ETDEWEB)

    Parrikar, Onkar [Birla Institute of Technology and Science-Pilani, Goa campus, Goa 403 726 (India); Surya, Sumati, E-mail: ssurya@rri.res.in [Raman Research Institute, CV Raman Ave, Sadashivanagar, Bangalore 560 080 (India)

    2011-08-07

    The causal structure of a strongly causal spacetime is particularly well endowed. Not only does it determine the conformal spacetime geometry when the spacetime dimension n > 2, as shown by Malament and Hawking-King-McCarthy (MHKM), but also the manifold dimension. The MHKM result, however, applies more generally to spacetimes satisfying the weaker causality condition of future and past distinguishability (FPD), and it is an important question whether the causal structure of such spacetimes can determine the manifold dimension. In this work, we show that the answer to this question is in the affirmative. We investigate the properties of future or past distinguishing spacetimes and show that their causal structures determine the manifold dimension. This gives a non-trivial generalization of the MHKM theorem and suggests that there is a causal topology for FPD spacetimes which encodes manifold dimension and which is strictly finer than the Alexandrov topology. We show that such a causal topology does exist. We construct it using a convergence criterion based on sequences of 'chain intervals' which are the causal analogues of null geodesic segments. We show that when the region of strong causality violation satisfies a local achronality condition, this topology is equivalent to the manifold topology in an FPD spacetime.

  9. A general, multivariate definition of causal effects in epidemiology.

    Science.gov (United States)

    Flanders, W Dana; Klein, Mitchel

    2015-07-01

    Population causal effects are often defined as contrasts of average individual-level counterfactual outcomes, comparing different exposure levels. Common examples include causal risk differences and risk ratios. These and most other examples emphasize effects on disease onset, a reflection of the usual epidemiological interest in disease occurrence. Exposure effects on other health characteristics, such as prevalence or conditional risk of a particular disability, can be important as well, but contrasts involving these other measures may often be dismissed as non-causal. For example, an observed prevalence ratio might often be viewed as an estimator of a causal incidence ratio and hence subject to bias. In this manuscript, we provide and evaluate a definition of causal effects that generalizes those previously available. A key part of the generalization is that contrasts used in the definition can involve multivariate, counterfactual outcomes, rather than only univariate outcomes. An important consequence of our generalization is that, using it, one can properly define causal effects based on a wide variety of additional measures. Examples include causal prevalence ratios and differences and causal conditional risk ratios and differences. We illustrate how these additional measures can be useful, natural, easily estimated, and of public health importance. Furthermore, we discuss conditions for valid estimation of each type of causal effect, and how improper interpretation or inferences for the wrong target population can be sources of bias.
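The contrasts defined here are averages over individual-level potential outcomes. A toy numeric sketch with simulated counterfactuals (the risks are illustrative, not from the paper):

```python
import numpy as np

# Simulated individual-level counterfactual (potential) outcomes:
# y0[i], y1[i] = disease indicator for person i under no exposure / exposure.
rng = np.random.default_rng(0)
n = 100_000
y0 = rng.random(n) < 0.10           # 10% baseline risk (illustrative)
y1 = rng.random(n) < 0.15           # 15% risk under exposure (illustrative)

# Population causal contrasts: compare averages of the two counterfactuals.
risk_diff = y1.mean() - y0.mean()   # causal risk difference
risk_ratio = y1.mean() / y0.mean()  # causal risk ratio
print(round(risk_diff, 3), round(risk_ratio, 2))
```

The generalization in the paper allows the outcome vector per individual to be multivariate (e.g. disease and disability status), so that contrasts such as causal prevalence ratios are defined in exactly the same averaged-counterfactual form.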

  10. Realistic environmental mixtures of micropollutants in surface, drinking, and recycled water: herbicides dominate the mixture toxicity toward algae.

    Science.gov (United States)

    Tang, Janet Y M; Escher, Beate I

    2014-06-01

    Mixture toxicity studies with herbicides have focused on a few priority components that are most likely to cause environmental impacts, and experimental mixtures were often designed as equipotent mixtures; however, real-world mixtures are made up of chemicals with different modes of toxic action at arbitrary concentration ratios. The toxicological significance of environmentally realistic mixtures has only been scarcely studied. Few studies have simultaneously compared the mixture effect of water samples with designed reference mixtures comprised of the ratios of analytically detected concentrations in toxicity tests. In the present study, the authors address the effect of herbicides and other chemicals on inhibition of photosynthesis and algal growth rate. The authors tested water samples including secondary treated wastewater effluent, recycled water, drinking water, and storm water in the combined algae assay. The detected chemicals were mixed in the concentration ratios detected, and the biological effects of the water samples were compared with the designed mixtures of individual detected chemicals to quantify the fraction of effect caused by unknown chemicals. The results showed that herbicides dominated the algal toxicity in these environmentally realistic mixtures, and the contribution by the non-herbicides was negligible. A 2-stage model, which used concentration addition within the groups of herbicides and non-herbicides followed by the model of independent action to predict the mixture effect of the two groups, could predict the experimental mixture toxicity effectively, but the concentration addition model for herbicides was robust and sufficient for complex mixtures. Therefore, the authors used the bioanalytical equivalency concept to derive effect-based trigger values for algal toxicity for monitoring water quality in recycled and surface water. All water samples tested would be compliant with the proposed trigger values associated with the
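The two-stage model can be sketched directly: concentration addition sums toxic units within a group assumed to share a mode of action, and independent action multiplies the unaffected fractions across groups. The dose-response curves and concentrations below are hypothetical, not the study's data:

```python
import numpy as np

# Two-stage mixture model sketch: concentration addition (CA) within a
# similarly acting group, independent action (IA) between groups.

def effect(conc, ec50, hill=1.0):
    """Fractional effect of one chemical (log-logistic dose-response)."""
    return conc**hill / (conc**hill + ec50**hill)

# Group 1: two herbicides assumed to share a mode of action -> CA.
c = np.array([0.5, 1.0]); ec50 = np.array([2.0, 8.0])   # hypothetical values
toxic_units = (c / ec50).sum()                  # CA sums EC50-scaled doses
e_herbicides = toxic_units / (toxic_units + 1.0)  # joint effect at summed TU

# Group 2: one dissimilarly acting compound -> combined with IA.
e_other = effect(3.0, ec50=30.0)
e_total = 1.0 - (1.0 - e_herbicides) * (1.0 - e_other)
print(round(e_herbicides, 3), round(e_total, 3))  # -> 0.273 0.339
```

As the abstract notes, when one group (here the herbicides) dominates, the plain CA prediction for that group is already close to the two-stage result.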

  11. Research of Deformation of Clay Soil Mixtures

    OpenAIRE

    Romas Girkontas; Tadas Tamošiūnas; Andrius Savickas

    2014-01-01

    The aim of this article is to determine the deformations of clay soils and clay soil mixtures during drying. The experiments consisted of: a) bridges of clay and of clay mixtures (height ~0.30 m, span ~1.00 m); b) tiles of clay and of clay, sand, and straw (height, length, width); c) cylinders of clay; of clay and straw; and of clay, straw, and sand (diameter; height). Based on the findings, recommendations for the application of clay and clay-mixture drying technology were presented. During the experiment the clay bridge bear...

  12. Gaussian Process-Mixture Conditional Heteroscedasticity.

    Science.gov (United States)

    Platanios, Emmanouil A; Chatzis, Sotirios P

    2014-05-01

    Generalized autoregressive conditional heteroscedasticity (GARCH) models have long been considered as one of the most successful families of approaches for volatility modeling in financial return series. In this paper, we propose an alternative approach based on methodologies widely used in the field of statistical machine learning. Specifically, we propose a novel nonparametric Bayesian mixture of Gaussian process regression models, each component of which models the noise variance process that contaminates the observed data as a separate latent Gaussian process driven by the observed data. This way, we essentially obtain a Gaussian process-mixture conditional heteroscedasticity (GPMCH) model for volatility modeling in financial return series. We impose a nonparametric prior with power-law nature over the distribution of the model mixture components, namely the Pitman-Yor process prior, to allow for better capturing modeled data distributions with heavy tails and skewness. Finally, we provide a copula-based approach for obtaining a predictive posterior for the covariances over the asset returns modeled by means of a postulated GPMCH model. We evaluate the efficacy of our approach in a number of benchmark scenarios, and compare its performance to state-of-the-art methodologies.

  13. Concomitant variables in finite mixture models

    NARCIS (Netherlands)

    Wedel, M

    The standard mixture model, the concomitant variable mixture model, the mixture regression model and the concomitant variable mixture regression model all enable simultaneous identification and description of groups of observations. This study reviews the different ways in which dependencies among
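The standard mixture model mentioned above is typically fit by expectation-maximization. A minimal univariate two-component sketch on synthetic data (not from the reviewed study; concomitant-variable extensions add covariates to the weights or component regressions):

```python
import numpy as np

# Minimal EM fit of a two-component univariate Gaussian mixture
# on synthetic data (illustrative parameters and starting values).
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 1.0, 400), rng.normal(5.0, 1.0, 600)])

w = np.array([0.5, 0.5]); mu = np.array([-1.0, 6.0]); var = np.array([1.0, 1.0])
for _ in range(100):
    # E-step: posterior responsibility of each component for each point.
    dens = w * np.exp(-0.5 * (data[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted updates of weights, means, variances.
    nk = resp.sum(axis=0)
    w = nk / len(data)
    mu = (resp * data[:, None]).sum(axis=0) / nk
    var = (resp * (data[:, None] - mu) ** 2).sum(axis=0) / nk

print(np.round(mu, 1), np.round(w, 1))  # recovers the two groups
```

The mixture regression and concomitant-variable variants reviewed in the abstract replace the constant means and weights with functions of explanatory variables, but the E/M alternation is the same.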

  14. SANS studies of critical phenomena in ternary mixtures

    CERN Document Server

    Bulavyn, L A; Hohryakov, A; Garamus, V; Avdeev, M; Almasy, L

    2002-01-01

    Critical behaviour of a quasi-binary liquid mixture is investigated by small-angle neutron scattering. Analysis of the changes of the critical parameters, caused by addition of a small amount of electrolyte into the binary mixture 3-methylpyridine-heavy water, shows that the third component does not change the 3D Ising-type behaviour of the system; a crossover towards the mean-field behaviour is not observed. (orig.)

  15. Pseudoideal detonation of mechanoactivated mixtures of ammonium perchlorate with nanoaluminum

    Science.gov (United States)

    Shevchenko, A. A.; Dolgoborodov, A. Yu; Brazhnikov, M. A.; Kirilenko, V. G.

    2018-01-01

    Detonation properties of mechanochemically activated ammonium perchlorate–aluminum (AP–Al) mixtures with increased detonation velocity were studied. For compositions with nanoscale aluminum, a nonmonotonic dependence of the detonation velocity on reciprocal diameter was obtained. The results generally showed that the combined use of mechanical activation and nanoscale components in explosive mixtures can significantly increase the detonability and reduce the critical diameter to d_cr = 7 mm.

  16. Equation of state of strongly coupled plasma mixtures

    International Nuclear Information System (INIS)

    DeWitt, H.E.

    1984-01-01

    Thermodynamic properties of strongly coupled (high density) plasmas of mixtures of light elements have been obtained by Monte Carlo simulations. For an assumed uniform charge background the equation of state of ionic mixtures is a simple extension of the one-component plasma EOS. More realistic electron screening effects are treated in linear response theory and with an appropriate electron dielectric function. Results have been obtained for the ionic pair distribution functions, and for the electric microfield distribution

  17. The Dynamic Causal Relationship between Electricity Consumption and Economic Growth in Ghana: A Trivariate Causality Model

    Directory of Open Access Journals (Sweden)

    Bernard N. Iyke

    2014-06-01

    Full Text Available This paper examines the dynamic causal relationship between electricity consumption and economic growth in Ghana within a trivariate ARDL framework, for the period 1971–2012. The paper obviates the variable omission bias, and the use of cross-sectional techniques, that characterise most existing studies. The results show that there is a distinct causal flow from economic growth to electricity consumption, both in the short run and in the long run. This finding supports the growth-led electricity consumption hypothesis, as documented in the literature. The paper urges policymakers in Ghana to resort to alternative sources of electric power generation, in order to reduce any future pressures on the current sources of electricity production. Appropriate monetary policies must also be put in place, in order to accommodate potential inflation hikes stemming from excessive demands for electricity in the near future.

  18. Mutagenicity of complex mixtures

    International Nuclear Information System (INIS)

    Pelroy, R.A.

    1985-01-01

    The effect of coal-derived complex chemical mixtures on the mutagenicity of 6-aminochrysene (6-AC) was determined with Salmonella typhimurium TA98. Previous results suggested that the mutagenic potency of 6-AC for TA98 in the standard microsomal activation (Ames) assay increased if it was presented to the cells mixed with high-boiling coal liquids (CL) from the solvent refined coal (SRC) process. In this year's work, the apparent mutational synergism of CL and 6-AC was independently verified in a fluctuation bioassay which allowed quantitation of mutational frequencies and cell viability. The results of this assay system were similar to those in the Ames assay. Moreover, the fluctation assay revealed that mutagenesis and cellular toxicity induced by 6-AC were both strongly enhanced if 6-AC was presented to the cells mixed in a high-boiling CL. 4 figures

  19. Automatic Control of the Concrete Mixture Homogeneity in Cycling Mixers

    Science.gov (United States)

    Anatoly Fedorovich, Tikhonov; Drozdov, Anatoly

    2018-03-01

    The article describes the factors affecting concrete mixture quality related to the moisture content of aggregates, since the effectiveness of concrete mixture production is largely determined by the availability of quality-management tools at all stages of the technological process. It is established that unaccounted-for moisture in aggregates adversely affects the homogeneity of the concrete mixture and, accordingly, the strength of building structures. A new control method and an automatic control system for concrete mixture homogeneity during the mixing of components are proposed: the task of producing a uniform concrete mixture is handled by an automatic control system for the kneading-and-mixing machinery with operational automatic control of homogeneity. Theoretical underpinnings of homogeneity control are presented, relating homogeneity to a change in the frequency of vibrodynamic vibrations of the mixer body. The structure of the technical means of the automatic control system for regulating the water supply is determined by the change in concrete mixture homogeneity during the continuous mixing of components. The following technical means for implementing automatic control have been chosen: vibro-acoustic sensors, remote terminal units, electropneumatic control actuators, etc. To identify the quality indicator of automatic control, the system is described by a structural flowchart with transfer functions that determine the ACS operation in the transient dynamic mode.

  20. Identification and separation of DNA mixtures using peak area information

    DEFF Research Database (Denmark)

    Cowell, R.G.; Lauritzen, Steffen Lilholt; Mortera, J.

    We show how probabilistic expert systems can be used to analyse forensic identification problems involving DNA mixture traces using quantitative peak area information. Peak area is modelled with conditional Gaussian distributions. The expert system can be used not only for ascertaining whether individuals whose profiles have been measured have contributed to the mixture, but also for predicting the DNA profiles of unknown contributors by separating the mixture into its individual components. The potential of our methodology is illustrated on case data examples and compared with alternative approaches.

  1. Lattice Boltzmann model for thermal binary-mixture gas flows.

    Science.gov (United States)

    Kang, Jinfen; Prasianakis, Nikolaos I; Mantzaras, John

    2013-05-01

    A lattice Boltzmann model for thermal gas mixtures is derived. The kinetic model is designed in a way that combines properties of two previous literature models, namely, (a) a single-component thermal model and (b) a multicomponent isothermal model. A comprehensive platform for the study of various practical systems involving multicomponent mixture flows with large temperature differences is constructed. The governing thermohydrodynamic equations include the mass, momentum, energy conservation equations, and the multicomponent diffusion equation. The present model is able to simulate mixtures with adjustable Prandtl and Schmidt numbers. Validation in several flow configurations with temperature and species concentration ratios up to nine is presented.

  2. World oil and agricultural commodity prices: Evidence from nonlinear causality

    International Nuclear Information System (INIS)

    Nazlioglu, Saban

    2011-01-01

    The increasing co-movements between the world oil and agricultural commodity prices have renewed interest in determining price transmission from oil prices to those of agricultural commodities. This study extends the literature on the oil-agricultural commodity prices nexus, which particularly concentrates on nonlinear causal relationships between the world oil and three key agricultural commodity prices (corn, soybeans, and wheat). To this end, the linear causality approach of Toda-Yamamoto and the nonparametric causality method of Diks-Panchenko are applied to the weekly data spanning from 1994 to 2010. The linear causality analysis indicates that the oil prices and the agricultural commodity prices do not influence each other, which supports evidence on the neutrality hypothesis. In contrast, the nonlinear causality analysis shows that: (i) there are nonlinear feedbacks between the oil and the agricultural prices, and (ii) there is a persistent unidirectional nonlinear causality running from the oil prices to the corn and to the soybeans prices. The findings from the nonlinear causality analysis therefore provide clues for better understanding the recent dynamics of the agricultural commodity prices and some policy implications for policy makers, farmers, and global investors. This study also suggests the directions for future studies. - Research highlights: → This study determines the price transmission mechanisms between the world oil and three key agricultural commodity prices (corn, soybeans, and wheat). → The linear and nonlinear cointegration and causality methods are carried out. → The linear causality analysis supports evidence on the neutrality hypothesis. → The nonlinear causality analysis shows that there is a persistent unidirectional causality from the oil prices to the corn and to the soybeans prices.
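The linear leg of such an analysis rests on the standard Granger regression idea: lags of the oil price "Granger-cause" a commodity price if adding them significantly reduces the residual sum of squares of an autoregression. A minimal bivariate sketch on synthetic data (the study itself uses the Toda-Yamamoto and Diks-Panchenko procedures; the function, data, and coefficients here are illustrative, not the paper's):

```python
import numpy as np

def granger_f(y, x, p=2):
    """F-statistic for H0: lags of x do not improve the prediction of y
    beyond y's own lags (bivariate linear Granger non-causality)."""
    T = len(y)
    target = y[p:]
    ylags = [y[p - k:T - k] for k in range(1, p + 1)]
    xlags = [x[p - k:T - k] for k in range(1, p + 1)]
    Xr = np.column_stack([np.ones(T - p)] + ylags)          # restricted model
    Xu = np.column_stack([np.ones(T - p)] + ylags + xlags)  # unrestricted model
    rss = lambda X: np.sum((target - X @ np.linalg.lstsq(X, target, rcond=None)[0]) ** 2)
    df2 = (T - p) - Xu.shape[1]
    return ((rss(Xr) - rss(Xu)) / p) / (rss(Xu) / df2)

# Synthetic series in which x drives y but not vice versa.
rng = np.random.default_rng(0)
T = 500
x = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.3 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.standard_normal()

f_xy = granger_f(y, x)  # tests x -> y: should be large
f_yx = granger_f(x, y)  # tests y -> x: should be small
```

A large F for one direction and a small F for the other is the pattern behind the "unidirectional causality" findings reported above; nonparametric tests such as Diks-Panchenko replace the linear regressions with kernel-based conditional-independence measures.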

  3. Inferring Causalities in Landscape Genetics: An Extension of Wright's Causal Modeling to Distance Matrices.

    Science.gov (United States)

    Fourtune, Lisa; Prunier, Jérôme G; Paz-Vinas, Ivan; Loot, Géraldine; Veyssière, Charlotte; Blanchet, Simon

    2018-04-01

    Identifying landscape features that affect functional connectivity among populations is a major challenge in fundamental and applied sciences. Landscape genetics combines landscape and genetic data to address this issue, with the main objective of disentangling direct and indirect relationships among an intricate set of variables. Causal modeling has strong potential to address the complex nature of landscape genetic data sets. However, this statistical approach was not initially developed to address the pairwise distance matrices commonly used in landscape genetics. Here, we aimed to extend the applicability of two causal modeling methods, namely maximum-likelihood path analysis and the directional separation test, by developing statistical approaches aimed at handling distance matrices and improving functional connectivity inference. Using simulations, we showed that these approaches greatly improved the robustness of the absolute (using a frequentist approach) and relative (using an information-theoretic approach) fits of the tested models. We used an empirical data set combining genetic information on a freshwater fish species (Gobio occitaniae) and detailed landscape descriptors to demonstrate the usefulness of causal modeling to identify functional connectivity in wild populations. Specifically, we demonstrated how direct and indirect relationships involving altitude, temperature, and oxygen concentration influenced within- and between-population genetic diversity of G. occitaniae.

  4. NMRI Measurements of Flow of Granular Mixtures

    Science.gov (United States)

    Nakagawa, Masami; Waggoner, R. Allen; Fukushima, Eiichi

    1996-01-01

    We investigate the complex 3D behavior of granular mixtures in shaking and shearing devices. NMRI can non-invasively measure the concentration, velocity, and velocity fluctuations of flows of suitable particles. We investigate the origins of wall-shear-induced convection flow of single-component particles by measuring the flow and fluctuating motion of particles near rough boundaries. We also investigate whether a mixture of different-sized particles segregates into its component species under the influence of external shaking and shearing disturbances. These non-invasive measurements will reveal the true nature of the convecting flow properties and wall disturbance. For experiments in a reduced-gravity environment, we will design a lightweight NMR imager. The proof-of-principle development will prepare for the construction of a complete spaceborne system to perform experiments in space.

  5. Mixture optimization for mixed gas Joule-Thomson cycle

    Science.gov (United States)

    Detlor, J.; Pfotenhauer, J.; Nellis, G.

    2017-12-01

    An appropriate gas mixture can provide lower temperatures and higher cooling power when used in a Joule-Thomson (JT) cycle than is possible with a pure fluid. However, selecting gas mixtures to meet specific cooling loads and cycle parameters is a challenging design problem. This study focuses on the development of a computational tool to optimize gas mixture compositions for specific operating parameters, and expands on prior research by exploring higher heat rejection temperatures and lower pressure ratios. A mixture optimization model has been developed which determines an optimal three-component mixture by maximizing the minimum isothermal enthalpy change, Δh_T, that occurs over the temperature range. This allows optimal mixture compositions to be determined for a mixed-gas JT system with load temperatures down to 110 K and supply temperatures above room temperature, for pressure ratios as small as 3:1. The mixture optimization model has been paired with a separate evaluation of the percentage of the heat exchanger that operates in the two-phase range, in order to begin the process of selecting a mixture for experimental investigation.
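The selection criterion described, maximizing the minimum isothermal enthalpy difference Δh_T over the load-to-supply temperature range, can be sketched as a brute-force search over three-component compositions. The property function below is a toy stand-in (a real tool would evaluate Δh_T from an equation of state or a fluid-property database); all names, peak temperatures, and numbers are illustrative:

```python
import itertools

def min_isothermal_dh(comp, t_low=110.0, t_high=300.0, steps=20):
    """Toy stand-in for min over T of the isothermal enthalpy difference
    Delta-h_T between the low- and high-pressure streams. Each component
    contributes most near a hypothetical 'best' temperature of its own."""
    peaks = (150.0, 220.0, 280.0)  # hypothetical per-component peak temperatures
    values = []
    for i in range(steps + 1):
        t = t_low + (t_high - t_low) * i / steps
        values.append(sum(x / (1.0 + ((t - p) / 40.0) ** 2)
                          for x, p in zip(comp, peaks)))
    return min(values)

def best_mixture(grid=10):
    """Enumerate three-component mole fractions on a grid and keep the
    composition maximizing the minimum Delta-h_T over the temperature range."""
    best, best_val = None, -1.0
    for i, j in itertools.product(range(grid + 1), repeat=2):
        if i + j <= grid:
            comp = (i / grid, j / grid, (grid - i - j) / grid)
            val = min_isothermal_dh(comp)
            if val > best_val:
                best, best_val = comp, val
    return best, best_val
```

Because the objective is a worst-case (min over T) quantity, a blended composition can beat any pure component: the components cover different parts of the temperature range, which is the qualitative reason mixed-gas JT cycles outperform pure fluids.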

  6. Thermodynamic study of (anthracene + phenanthrene) solid state mixtures.

    Science.gov (United States)

    Rice, James W; Fu, Jinxia; Sandström, Emma; Ditto, Jenna C; Suuberg, Eric M

    2015-11-01

    Polycyclic aromatic hydrocarbons (PAH) are common components of many materials, such as petroleum and various types of tars. They are generally present in mixtures, occurring both naturally and as byproducts of fuel processing operations. It is important to understand the thermodynamic properties of such mixtures in order to better understand and predict their behavior (i.e., fate and transport) in the environment and in industrial operations. To better characterize the thermodynamic behavior of PAH mixtures, the phase behavior of a binary (anthracene + phenanthrene) system was studied by differential scanning calorimetry, X-ray diffraction, and the Knudsen effusion technique. Mixtures of (anthracene + phenanthrene) exhibit non-ideal mixture behavior. They form a lower-melting, phenanthrene-rich phase with an initial melting temperature of 372 K (identical to the melting temperature of pure phenanthrene) and a vapor pressure of roughly ln(P/Pa) = -2.38. The phenanthrene-rich phase coexists with an anthracene-rich phase when the mole fraction of phenanthrene (x_P) in the mixture is less than or equal to 0.80. Mixtures initially at x_P = 0.90 consist entirely of the phenanthrene-rich phase and sublime at nearly constant vapor pressure and composition, consistent with azeotrope-like behavior. Quasi-azeotropy was also observed for very high-content anthracene mixtures (2.5 < x_P < 5), indicating that anthracene may accommodate very low levels of phenanthrene in its crystal structure.

  7. Structural and Physical Properties of Ionic Liquid Mixtures

    Science.gov (United States)

    Cha, Seoncheol; Kim, Doseok

    Ionic liquids are materials consisting only of cations and anions that exist in the liquid phase below 100 °C. They are called designer solvents, as the physical properties of the materials can be tuned by changing their constituent ions. Mixing ionic liquids is a new way of maximizing this advantage, because the material properties can be changed continuously in the mixture. The excess molar volumes, the difference between the molar volumes of the mixtures and a linear interpolation between the volumes of the pure components, have been found to differ significantly for some ionic liquid mixtures, but the origin of this difference is not well understood. The different microstructures of the mixtures, which can range from a simple mixture of the two constituent ionic liquids to a structure different from those of the pure materials, have been suggested as the origin of this difference. We investigated ionic liquid mixture systems by IR spectroscopy, utilizing a particular peak in the IR spectrum for the moiety participating in hydrogen bonding (νC(2)-H) that changes sensitively with the change of the anion in the ionic liquid. The absorbance of νC(2)-H changed proportionally to the composition for mixtures consisting of halide anions. By contrast, the absorbance changed nonlinearly for mixtures in which one of the anions had multiple interaction sites.

  8. Causality in cancer research: a journey through models in molecular epidemiology and their philosophical interpretation

    Directory of Open Access Journals (Sweden)

    Paolo Vineis

    2017-06-01

    Full Text Available Abstract In the last decades, Systems Biology (including cancer research) has been driven by technology, statistical modelling and bioinformatics. In this paper we try to bring biological and philosophical thinking back. We thus aim at making different traditions of thought compatible: (a) causality in epidemiology and in philosophical theorizing, notably the “sufficient-component-cause framework” and the “mark transmission” approach; (b) new acquisitions about disease pathogenesis, e.g. the “branched model” in cancer, and the role of biomarkers in this process; (c) the burgeoning of omics research, with a large number of “signals” and of associations that need to be interpreted. In the paper we first summarize the current views on carcinogenesis, and then explore the relevance of current philosophical interpretations of “cancer causes”. We try to offer a unifying framework to incorporate biomarkers and omic data into causal models, referring to a position called “evidential pluralism”. According to this view, causal reasoning is based both on “evidence of difference-making” (e.g. associations) and on “evidence of underlying biological mechanisms”. We conceptualize the way scientists detect and trace signals in terms of information transmission, which is a generalization of the mark transmission theory developed by philosopher Wesley Salmon. Our approach can help us conceptualize how heterogeneous factors, such as micro- and macro-biological and psycho-social factors, are causally linked. This is important not only to understand cancer etiology, but also to design public health policies that target the right causal factors at the macro-level.

  9. Causal Relations Drive Young Children's Induction, Naming, and Categorization

    Science.gov (United States)

    Opfer, John E.; Bulloch, Megan J.

    2007-01-01

    A number of recent models and experiments have suggested that evidence of early category-based induction is an artifact of perceptual cues provided by experimenters. We tested these accounts against the prediction that different relations (causal versus non-causal) determine the types of perceptual similarity by which children generalize. Young…

  10. Theories of conduct disorder: a causal modelling analysis

    NARCIS (Netherlands)

    Krol, N.P.C.M.; Morton, J.; Bruyn, E.E.J. De

    2004-01-01

    Background: If a clinician has to make decisions on diagnosis and treatment, he or she is confronted with a variety of causal theories. In order to compare these theories a neutral terminology and notational system is needed. The Causal Modelling framework involving three levels of description –

  11. Thinking Fast and Slow about Causality: Response to Palinkas

    Science.gov (United States)

    Marsh, Jeanne C.

    2014-01-01

    Larry Palinkas advances the developing science of social work by providing an explanation of how social science research methods, both qualitative and quantitative, can improve our capacity to draw causal inferences. Understanding causal relations and making causal inferences, with the promise of being able to predict and control outcomes, is…

  12. Cause and Event: Supporting Causal Claims through Logistic Models

    Science.gov (United States)

    O'Connell, Ann A.; Gray, DeLeon L.

    2011-01-01

    Efforts to identify and support credible causal claims have received intense interest in the research community, particularly over the past few decades. In this paper, we focus on the use of statistical procedures designed to support causal claims for a treatment or intervention when the response variable of interest is dichotomous. We identify…

  13. Financial networks based on Granger causality: A case study

    NARCIS (Netherlands)

    Papana, A.; Kyrtsou, C.; Kugiumtzis, D.; Diks, C.

    Connectivity analysis is performed on a long financial record of 21 international stock indices employing a linear and a nonlinear causality measure, the conditional Granger causality index (CGCI) and the partial mutual information on mixed embedding (PMIME), respectively. Both measures aim to

  14. An Information Processing Approach to Children's Causal Reasoning.

    Science.gov (United States)

    Siegler, Robert S.

    This paper questions evidence for the thesis that causal reasoning of older children is more logical than that of younger ones, and describes two experiments which attempted to determine (1) whether there are true developmental differences in causal reasoning, and (2) what explanations for developmental differences can be supported. In the first…

  15. Causality in 1+1-dimensional Yukawa model-II

    Indian Academy of Sciences (India)

    2013-10-01

    Oct 1, 2013 ... shown that the effective model can be interpreted as a field theory of a bound state. We study causality in such a ... the motivation pertaining to causality violation in the bound states. In §3 condition of .... Consider a diagram with n external scalars, L fermion loops, V vertices, IF internal fermion lines and IB ...

  16. Forces and Motion: How Young Children Understand Causal Events

    Science.gov (United States)

    Goksun, Tilbe; George, Nathan R.; Hirsh-Pasek, Kathy; Golinkoff, Roberta M.

    2013-01-01

    How do children evaluate complex causal events? This study investigates preschoolers' representation of "force dynamics" in causal scenes, asking whether (a) children understand how single and dual forces impact an object's movement and (b) this understanding varies across cause types (Cause, Enable, Prevent). Three-and-a half- to…

  17. Causal pathways between substance use disorders and personality pathology

    NARCIS (Netherlands)

    Verheul, R.; van den Brink, W.

    2005-01-01

    A high co-occurrence between personality and substance use disorders suggests causal relationships between these conditions. Most empirical evidence strongly supports causal pathways in which (pathological) personality traits contribute to the development of a substance use disorder (i.e., primary

  18. Child Care Subsidy Use and Child Development: Potential Causal Mechanisms

    Science.gov (United States)

    Hawkinson, Laura E.

    2011-01-01

    Research using an experimental design is needed to provide firm causal evidence on the impacts of child care subsidy use on child development, and on underlying causal mechanisms since subsidies can affect child development only indirectly via changes they cause in children's early experiences. However, before costly experimental research is…

  19. Sartre's Contingency of Being and Asouzu's Principle of Causality ...

    African Journals Online (AJOL)

    The position of this work is that all contingent beings have a causal agent. This position is taken as a result of trying to delve into the issue of contingency and causality of being which has been discussed by many philosophers of diverse epochs of philosophy. This work tries to participate in the debate of whether contingent ...

  20. A note on mental content in the Causal Theory

    African Journals Online (AJOL)

    A note on mental content in the Causal Theory. JP Smit. Department of Philosophy, Stellenbosch University, Private Bag X1, 7600 Matieland, South Africa. E-mail: jps@sun.ac.za. Kripke's causal theory requires that downstream users of a name must have the intention to use the name in the same way that upstream users ...

  1. Identification of Mixed Causal-Noncausal Models in Finite Samples

    NARCIS (Netherlands)

    Hecq, Alain; Lieb, Lenard; Telg, Sean

    2016-01-01

    Gouriéroux and Zakoïan (2013) propose to use noncausal models to parsimoniously capture nonlinear features often observed in financial time series and in particular bubble phenomena. In order to distinguish causal autoregressive processes from purely noncausal or mixed causal-noncausal ones, one has

  2. Causal Discourse Analyzer: Improving Automated Feedback on Academic ESL Writing

    Science.gov (United States)

    Chukharev-Hudilainen, Evgeny; Saricaoglu, Aysel

    2016-01-01

    Expressing causal relations plays a central role in academic writing. While it is important that writing instructors assess and provide feedback on learners' causal discourse, it could be a very time-consuming task. In this respect, automated writing evaluation (AWE) tools may be helpful. However, to date, there have been no AWE tools capable of…

  3. Thinking in a Foreign language reduces the causality bias.

    Science.gov (United States)

    Díaz-Lago, Marcos; Matute, Helena

    2018-02-01

    The purpose of this research is to investigate the impact of a foreign language on the causality bias (i.e., the illusion that two events are causally related when they are not). We predict that using a foreign language could reduce the illusions of causality. A total of 36 native English speakers participated in Experiment 1, 80 native Spanish speakers in Experiment 2. They performed a standard contingency learning task, which can be used to detect causal illusions. Participants who performed the task in their native tongue replicated the illusion of causality effect, whereas those performing the task in their foreign language were more accurate in detecting that the two events were causally unrelated. Our results suggest that presenting the information in a foreign language could be used as a strategy to debias individuals against causal illusions, thereby facilitating more accurate judgements and decisions in non-contingent situations. They also contribute to the debate on the nature and underlying mechanisms of the foreign language effect, given that the illusion of causality is rooted in basic associative processes.
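The "standard contingency learning task" mentioned above scores the objective cue-outcome relation with ΔP = P(outcome | cue) - P(outcome | no cue); the causality bias is a non-zero causal judgement when ΔP = 0, typically strongest when the outcome base rate is high. A minimal sketch (the 2x2 cell counts are made up for illustration):

```python
def delta_p(a, b, c, d):
    """Objective contingency from a 2x2 table of trial counts:
    a: cue present, outcome present    b: cue present, outcome absent
    c: cue absent,  outcome present    d: cue absent,  outcome absent"""
    return a / (a + b) - c / (c + d)

# Null-contingency, high-outcome-density design: the outcome occurs 75% of
# the time whether or not the cue is present, so the true contingency is 0,
# yet participants often report a causal link (the illusion of causality).
null_design = delta_p(30, 10, 30, 10)      # -> 0.0
positive_design = delta_p(30, 10, 10, 30)  # -> 0.5
```

Detecting that ΔP is zero in the first design is exactly the judgement the foreign-language participants made more accurately.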

  4. Is there a causal relationship between alcohol and HIV? Implications ...

    African Journals Online (AJOL)

    There is now conclusive evidence of a causal linkage between heavy drinking patterns and/or alcohol use disorders and the worsening of the disease course for HIV. However, while alcohol usage is consistently associated with the prevalence and incidence of HIV, further research is needed to substantiate causality in ...

  5. Temporal and Statistical Information in Causal Structure Learning

    Science.gov (United States)

    McCormack, Teresa; Frosch, Caren; Patrick, Fiona; Lagnado, David

    2015-01-01

    Three experiments examined children's and adults' abilities to use statistical and temporal information to distinguish between common cause and causal chain structures. In Experiment 1, participants were provided with conditional probability information and/or temporal information and asked to infer the causal structure of a 3-variable mechanical…

  6. Weighting-Based Sensitivity Analysis in Causal Mediation Studies

    Science.gov (United States)

    Hong, Guanglei; Qin, Xu; Yang, Fan

    2018-01-01

    Through a sensitivity analysis, the analyst attempts to determine whether a conclusion of causal inference could be easily reversed by a plausible violation of an identification assumption. Analytic conclusions that are harder to alter by such a violation are expected to add a higher value to scientific knowledge about causality. This article…

  7. Non-Bayesian Inference: Causal Structure Trumps Correlation

    Science.gov (United States)

    Bes, Benedicte; Sloman, Steven; Lucas, Christopher G.; Raufaste, Eric

    2012-01-01

    The study tests the hypothesis that conditional probability judgments can be influenced by causal links between the target event and the evidence even when the statistical relations among variables are held constant. Three experiments varied the causal structure relating three variables and found that (a) the target event was perceived as more…

  8. Proceeding From Observed Correlation to Causal Inference: The Use of Natural Experiments.

    Science.gov (United States)

    Rutter, Michael

    2007-12-01

    This article notes five reasons why a correlation between a risk (or protective) factor and some specified outcome might not reflect environmental causation. In keeping with numerous other writers, it is noted that a causal effect is usually composed of a constellation of components acting in concert. The study of causation, therefore, will necessarily be informative on only one or more subsets of such components. There is no such thing as a single basic necessary and sufficient cause. Attention is drawn to the need (albeit unobservable) to consider the counterfactual (i.e., what would have happened if the individual had not had the supposed risk experience). Fifteen possible types of natural experiments that may be used to test causal inferences with respect to naturally occurring prior causes (rather than planned interventions) are described. These comprise five types of genetically sensitive designs intended to control for possible genetic mediation (as well as dealing with other issues), six uses of twin or adoptee strategies to deal with other issues such as selection bias or the contrasts between different environmental risks, two designs to deal with selection bias, regression discontinuity designs to take into account unmeasured confounders, and the study of contextual effects. It is concluded that, taken in conjunction, natural experiments can be very helpful in both strengthening and weakening causal inferences. © 2007 Association for Psychological Science.

  9. Statistical causal inferences and their applications in public health research

    CERN Document Server

    Wu, Pan; Chen, Ding-Geng

    2016-01-01

    This book compiles and presents new developments in statistical causal inference. The accompanying data and computer programs are publicly available so readers may replicate the model development and data analysis presented in each chapter. In this way, methodology is taught so that readers may implement it directly. The book brings together experts engaged in causal inference research to present and discuss recent issues in causal inference methodological development. This is also a timely look at causal inference applied to scenarios that range from clinical trials to mediation and public health research more broadly. In an academic setting, this book will serve as a reference and guide to a course in causal inference at the graduate level (Master's or Doctorate). It is particularly relevant for students pursuing degrees in Statistics, Biostatistics and Computational Biology. Researchers and data analysts in public health and biomedical research will also find this book to be an important reference.

  10. Causal Relationship between Construction Production and GDP in Turkey

    Directory of Open Access Journals (Sweden)

    Hakkı Kutay Bolkol

    2015-12-01

    Full Text Available This study empirically investigates the causal relationship between construction production and GDP for Turkey during the 2005Q1-2013Q4 period. Because no cointegration is found, meaning there is no long-run relationship between the variables, the VAR Granger causality method is used to test causality in the short run. The findings reveal that causality runs from GDP to building production and from building production to non-building production (i.e., a bidirectional relationship). The findings of this paper suggest that, because there is no long-run relationship between construction production (building and non-building) and GDP, and because in the short run causality runs from GDP to construction production, a growth strategy based mainly on construction-sector growth is not a good idea for Turkey.

  12. Causal Relationship Between Relative Price Variability and Inflation in Turkey:

    Directory of Open Access Journals (Sweden)

    Nebiye Yamak

    2016-09-01

    Full Text Available This study investigates the causal relationship between inflation and relative price variability in Turkey for the period January 2003-January 2014, using panel data. In the study, the Granger (1969) non-causality test for heterogeneous panel data models developed by Dumitrescu and Hurlin (2012) is utilized to determine the causal relations between the inflation rate and relative price variability. The panel data consist of 4123 observations: 133 time observations and 31 cross-section observations. The results of the panel causality test indicate that there is bidirectional causality between the inflation rate and relative price variability, supporting neither the imperfect information model of Lucas nor the menu cost model of Ball and Mankiw.

  13. G-computation demonstration in causal mediation analysis.

    Science.gov (United States)

    Wang, Aolin; Arah, Onyebuchi A

    2015-10-01

    Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings.
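With linear mediator and outcome models, the natural direct and indirect effects reduce to known coefficient combinations (NDE = b1, NIE = a1*b2), which makes the simulation-based recipe easy to check. A hedged sketch of the Monte Carlo g-computation step on synthetic data (the model, variable names, and coefficients are illustrative, not the study's):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
a = rng.integers(0, 2, n).astype(float)          # exposure
m = 0.5 * a + rng.standard_normal(n)             # mediator model: a1 = 0.5
y = 1.0 * a + 2.0 * m + rng.standard_normal(n)   # outcome model: b1 = 1.0, b2 = 2.0

def ols(X, t):
    """Least-squares fit with intercept; returns the coefficient vector."""
    return np.linalg.lstsq(np.column_stack([np.ones(len(t)), *X]), t, rcond=None)[0]

g0, g1 = ols([a], m)          # fitted mediator model: m ~ a
c0, c1, c2 = ols([a, m], y)   # fitted outcome model:  y ~ a + m

def sim_y(a_set, a_med):
    """g-computation: draw the mediator under exposure a_med, then average
    the predicted outcome under exposure a_set (a nested potential outcome)."""
    m_cf = g0 + g1 * a_med + rng.standard_normal(n)
    return np.mean(c0 + c1 * a_set + c2 * m_cf)

nde = sim_y(1, 0) - sim_y(0, 0)   # natural direct effect   (true value: 1.0)
nie = sim_y(1, 1) - sim_y(1, 0)   # natural indirect effect (true value: 0.5 * 2.0 = 1.0)
```

Contrasting other pairs of nested potential outcomes in the same way yields the controlled direct and interaction components of the 3- and 4-way decompositions; in practice the point estimates would be wrapped in a bootstrap loop for confidence intervals, as the abstract describes.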

  14. Symmetries, causality problems and neutrino fields in antipode universes

    International Nuclear Information System (INIS)

    Sasse, F.D.

    1986-01-01

    The S³×R and H³×R Lie groups are characterized, and, using continuous deformations of the S³ and H³ subgroup algebras, Lorentz space-time metrics with g_E left invariance and g_D right invariance are introduced. The topology of sections of these space-times is investigated and its relation to global causality problems is shown. In the search for g_E and g_D isometries, it is shown that in the class of metrics associated with the H³×R topology there is a particular case which admits a G₇ group of isometries. It is shown that the g_E and g_D metrics characterize universes in which the vortices of the cosmological fluid (when present) are opposite relative to the inertial compass. In the particular coordinate system used, these metrics differ only by a coordinate-inversion transformation. Neutrinos interacting with the geometry of these space-times are considered. It is shown that the physical transformation which consists in reversing the universe rotation and reversing a determined component of the neutrino momentum takes the universe with the g_E (g_D) metric containing neutrinos of a given helicity to one with the g_D (g_E) metric containing neutrinos of the opposite helicity. Thus, neutrinos can be used to physically distinguish these two universes, supposing that in a given universe neutrinos always have one type of helicity. (M.C.K.) [pt

  15. G-computation demonstration in causal mediation analysis

    International Nuclear Information System (INIS)

    Wang, Aolin; Arah, Onyebuchi A.

    2015-01-01

    Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings

  16. Beyond Markov: Accounting for independence violations in causal reasoning.

    Science.gov (United States)

    Rehder, Bob

    2018-03-06

    Although many theories of causal cognition are based on causal graphical models, a key property of such models, the independence relations stipulated by the Markov condition, is routinely violated by human reasoners. This article presents three new accounts of those independence violations, accounts that share the assumption that people's understanding of the correlational structure of data generated from a causal graph differs from that stipulated by the causal graphical model framework. To distinguish these models, experiments assessed how people reason with causal graphs that are larger than those tested in previous studies. A traditional common-cause network (Y₁←X→Y₂) was extended so that the effects themselves had effects (Z₁←Y₁←X→Y₂→Z₂). A traditional common-effect network (Y₁→X←Y₂) was extended so that the causes themselves had causes (Z₁→Y₁→X←Y₂←Z₂). Subjects' inferences were most consistent with the beta-Q model, in which consistent states of the world, those in which variables are either mostly all present or mostly all absent, are viewed as more probable than stipulated by the causal graphical model framework. Substantial variability in subjects' inferences was also observed, with the result that substantial minorities of subjects were best fit by one of the other models (the dual-prototype or leaky-gate models). The discrepancy between normative and human causal cognition stipulated by these models is foundational in the sense that they locate the error not in people's causal reasoning but rather in their causal representations. As a result, they are applicable to any cognitive theory grounded in causal graphical models, including theories of analogy, learning, explanation, categorization, decision-making, and counterfactual reasoning. Preliminary evidence that independence violations indeed generalize to other judgment types is presented. Copyright © 2018 Elsevier Inc. All rights reserved.
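    The Markov condition at issue can be made concrete on the extended common-cause chain Z₁←Y₁←X→Y₂→Z₂: given X, the two branches are independent, so learning Y₁ should not change beliefs about Y₂. A small enumeration sketch over binary variables (the conditional probabilities are made up for illustration):

```python
from itertools import product

# Z1 <- Y1 <- X -> Y2 -> Z2, all variables binary.
p_x1 = 0.5                       # P(X = 1)
p_y_given_x = {1: 0.8, 0: 0.2}   # P(Y_i = 1 | X = x)
p_z_given_y = {1: 0.9, 0: 0.1}   # P(Z_i = 1 | Y_i = y)

def bern(p, v):
    """Probability that a Bernoulli(p) variable takes value v."""
    return p if v == 1 else 1 - p

def joint(x, y1, y2, z1, z2):
    """Joint distribution factorized according to the causal graph."""
    return (bern(p_x1, x)
            * bern(p_y_given_x[x], y1) * bern(p_y_given_x[x], y2)
            * bern(p_z_given_y[y1], z1) * bern(p_z_given_y[y2], z2))

def prob(**fixed):
    """Marginal probability of the partial assignment in `fixed`."""
    total = 0.0
    for vals in product([0, 1], repeat=5):
        assign = dict(zip(("x", "y1", "y2", "z1", "z2"), vals))
        if all(assign[k] == v for k, v in fixed.items()):
            total += joint(**assign)
    return total

# Markov condition: P(Y2=1 | X=1) equals P(Y2=1 | X=1, Y1=1).
p_a = prob(y2=1, x=1) / prob(x=1)
p_b = prob(y2=1, x=1, y1=1) / prob(x=1, y1=1)
```

    The two conditionals coincide exactly for the graphical model; it is precisely this screening-off property that human reasoners tend to violate.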

  17. Physics Without Causality — Theory and Evidence

    Science.gov (United States)

    Shoup, Richard

    2006-10-01

    The principle of cause and effect is deeply rooted in human experience, so much so that it is routinely and tacitly assumed throughout science, even by scientists working in areas where time symmetry is theoretically ingrained, as it is in both classical and quantum physics. Experiments are said to cause their results, not the other way around. In this informal paper, we argue that this assumption should be replaced with a more general notion of mutual influence — bi-directional relations or constraints on joint values of two or more variables. From an analysis based on quantum entropy, it is proposed that quantum measurement is a unitary three-interaction, with no collapse, no fundamental randomness, and no barrier to backward influence. Experimental results suggesting retrocausality are seen frequently in well-controlled laboratory experiments in parapsychology and elsewhere, especially where a random element is included. Certain common characteristics of these experiments give the appearance of contradicting well-established physical laws, thus providing an opportunity for deeper understanding and important clues that must be addressed by any explanatory theory. We discuss how retrocausal effects and other anomalous phenomena can be explained without major injury to existing physical theory. A modified quantum formalism can give new insights into the nature of quantum measurement, randomness, entanglement, causality, and time.

  18. A new test of multivariate nonlinear causality.

    Science.gov (United States)

    Bai, Zhidong; Hui, Yongchang; Jiang, Dandan; Lv, Zhihui; Wong, Wing-Keung; Zheng, Shurong

    2018-01-01

    The multivariate nonlinear Granger causality test developed by Bai et al. (2010) (Mathematics and Computers in Simulation. 2010; 81: 5-17) plays an important role in detecting the dynamic interrelationships between two groups of variables. Following the idea of the Hiemstra-Jones (HJ) test proposed by Hiemstra and Jones (1994) (Journal of Finance. 1994; 49(5): 1639-1664), they attempt to establish a central limit theorem (CLT) for their test statistic by applying the asymptotic properties of multivariate U-statistics. However, Bai et al. (2016) (2016; arXiv: 1701.03992) revisit the HJ test and find that the test statistic given by HJ is NOT a function of U-statistics, which implies that neither the CLT proposed by Hiemstra and Jones (1994) nor the one extended by Bai et al. (2010) is valid for statistical inference. In this paper, we re-estimate the probabilities and re-establish the CLT of the new test statistic. Numerical simulation shows that our new estimates are consistent and that our new test exhibits reasonable size and power.
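    The HJ statistic at the center of this debate is built from ratios of correlation integrals: fractions of time-pair embeddings lying within a distance ε under the max norm. A bivariate sketch of that ingredient with one lag on each series (an illustration of the classical HJ construction, not the multivariate test or the corrected CLT of this paper):

```python
import numpy as np

def corr_integral(series_list, eps):
    """Fraction of time pairs (s, t) whose embeddings in every listed
    (n, d) series are within eps under the max norm."""
    n = series_list[0].shape[0]
    close = np.ones((n, n), dtype=bool)
    for s in series_list:
        dist = np.abs(s[:, None, :] - s[None, :, :]).max(axis=2)
        close &= dist < eps
    iu = np.triu_indices(n, k=1)
    return close[iu].mean()

def hj_statistic(x, y, eps=1.5):
    """Hiemstra-Jones-style statistic for 'x nonlinearly causes y'
    with one lag on each series; approximately zero under non-causality."""
    yp = y[:-1, None]                      # y_t, the "past" embedding
    yf = np.column_stack([y[:-1], y[1:]])  # (y_t, y_{t+1}): past plus lead
    xp = x[:-1, None]                      # x_t
    C1 = corr_integral([yf, xp], eps)
    C2 = corr_integral([yp, xp], eps)
    C3 = corr_integral([yf], eps)
    C4 = corr_integral([yp], eps)
    return C1 / C2 - C3 / C4
```

    C1/C2 estimates the conditional probability that the y-leads stay close given that the y- and x-pasts are close; under non-causality it should match the unconditional ratio C3/C4.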

  19. Reconstructing Causal Biological Networks through Active Learning.

    Directory of Open Access Journals (Sweden)

    Hyunghoon Cho

    Full Text Available Reverse-engineering of biological networks is a central problem in systems biology. Intervention data, such as gene knockouts or knockdowns, are typically used for teasing apart causal relationships among genes. Under time or resource constraints, one needs to carefully choose which intervention experiments to carry out. Previous approaches for selecting the most informative interventions have largely focused on discrete Bayesian networks. However, continuous Bayesian networks are of great practical interest, especially in the study of complex biological systems and their quantitative properties. In this work, we present an efficient, information-theoretic active learning algorithm for Gaussian Bayesian networks (GBNs), which serve as important models for gene regulatory networks. In addition to providing linear-algebraic insights unique to GBNs, leading to significant runtime improvements, we demonstrate the effectiveness of our method on data simulated with GBNs and the DREAM4 network inference challenge data sets. Our method generally leads to faster recovery of the underlying network structure and faster convergence to the final distribution of confidence scores over candidate graph structures using the full data, in comparison to random selection of intervention experiments.

  20. From causal dynamical triangulations to astronomical observations

    Science.gov (United States)

    Mielczarek, Jakub

    2017-09-01

    This letter discusses phenomenological aspects of dimensional reduction predicted by the Causal Dynamical Triangulations (CDT) approach to quantum gravity. The deformed form of the dispersion relation for the fields defined on the CDT space-time is reconstructed. Using the Fermi satellite observations of the GRB 090510 source, we find that the energy scale of the dimensional reduction is E* > 0.7 √(4 - d_UV) · 10^10 GeV (95% CL), where d_UV is the value of the spectral dimension in the UV limit. By applying the deformed dispersion relation to the cosmological perturbations, it is shown that, in a scenario where the primordial perturbations are formed in the UV region, the scalar power spectrum P_S ∝ k^(n_S - 1), where n_S - 1 ≈ 3r(d_UV - 2)/((d_UV - 1)r - 48). Here, r is the tensor-to-scalar ratio. We find that, within the considered model, the deviation from scale invariance (n_S = 1) predicted by CDT is in contradiction with the up-to-date Planck and BICEP2 data.
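    For readability, the energy-scale bound and the scalar spectral index discussed above can be written in display form:

```latex
E_{*} > 0.7\,\sqrt{4 - d_{\mathrm{UV}}}\;\cdot 10^{10}\ \mathrm{GeV}
\quad (95\%\ \mathrm{CL}),
\qquad
\mathcal{P}_{S} \propto k^{\,n_{S}-1},
\qquad
n_{S} - 1 \approx \frac{3\,r\,(d_{\mathrm{UV}} - 2)}{(d_{\mathrm{UV}} - 1)\,r - 48}.
```

    Note that the expression yields n_S = 1 exactly when d_UV = 2, so any predicted deviation from scale invariance comes from a UV spectral dimension different from 2.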

  1. Causality in noncommutative space-time

    Energy Technology Data Exchange (ETDEWEB)

    Neves, M.J.; Abreu, E.M.C. [Universidade Federal Rural do Rio de Janeiro (UFRRJ), Seropedica, RJ (Brazil)

    2011-07-01

    Full text: Space-time noncommutativity has been investigated in the last years as a real possibility to describe physics at the fundamental scale. This subject is associated with many tough issues in physics, i.e., strings, gravity, noncommutative field theories and others. The first formulation of a noncommutative spacetime was proposed by Snyder in 1947, where the object of noncommutativity is considered as a constant matrix that breaks the Lorentz symmetry. His objective was to get rid of the infinities that plague quantum field theory. Unfortunately, it was not a success. Here we consider an alternative recent formulation, known as the Doplicher-Fredenhagen-Roberts-Amorim (DFRA) algebra, in which the object of noncommutativity is treated as an ordinary coordinate by constructing an extended space-time with 4 + 6 dimensions, the (x + φ) space-time. In this way, the Lorentz symmetry is preserved in the DFRA algebra. A quantum field theory is constructed in accordance with the DFRA Poincare algebra, as well as a Lagrangian density formulation, by means of the Klein-Gordon equation in this (x + φ) space-time. We analyze the aspects of causality by studying the advanced and retarded Green functions. (author)
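    As background for the causality analysis via Green functions, the commutative benchmark that the DFRA construction generalizes is the microcausality condition together with the support property of the retarded Green function (standard textbook statements for the free Klein-Gordon field, quoted up to normalization conventions):

```latex
[\phi(x), \phi(y)] = 0 \quad \text{for spacelike } x - y,
\qquad
G_{\mathrm{ret}}(x - y) \;\propto\; \theta(x^{0} - y^{0})\,
\langle 0|\,[\phi(x), \phi(y)]\,|0\rangle ,
```

    so that G_ret is supported inside the closed forward light cone; the question studied above is whether the analogous properties survive in the extended (x + φ) space-time.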

  2. Mixture reduction algorithms for target tracking in clutter

    Science.gov (United States)

    Salmond, David J.

    1990-10-01

    The Bayesian solution of the problem of tracking a target in random clutter gives rise to Gaussian mixture distributions, which are composed of an ever increasing number of components. To implement such a tracking filter, the growth of components must be controlled by approximating the mixture distribution. A popular and economical scheme is the Probabilistic Data Association Filter (PDAF), which reduces the mixture to a single Gaussian component at each time step. However, this approximation may destroy valuable information, especially if several significant, well-spaced components are present. In this paper, two new algorithms for reducing Gaussian mixture distributions are presented. These techniques preserve the mean and covariance of the mixture, and the final approximation is itself a Gaussian mixture. The reduction is achieved by successively merging pairs of components or groups of components until their number is reduced to some specified limit. Further reduction will then proceed only while the approximation to the main features of the original distribution is still good. The performance of the most economical of these algorithms has been compared with that of the PDAF for the problem of tracking a single target which moves in a plane according to a second-order model. A linear sensor which measures target position is corrupted by uniformly distributed clutter. Given a detection probability of unity and perfect knowledge of initial target position and velocity, this problem depends on only two non-dimensional parameters. Monte Carlo simulation has been employed to identify the region of this parameter space where significant performance improvement is obtained over the PDAF.
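    The moment-preserving pairwise merge underlying such reduction algorithms has a closed form: the merged weight is the sum, the merged mean is the weight-averaged mean, and the merged covariance adds the spread between the component means. A sketch of the generic formulas (not Salmond's specific merge-selection criterion):

```python
import numpy as np

def merge_pair(w1, mu1, P1, w2, mu2, P2):
    """Moment-preserving merge of two Gaussian mixture components.

    Returns (w, mu, P) such that the single Gaussian carries the same
    weight, mean and covariance as the two-component sub-mixture.
    """
    w = w1 + w2
    mu = (w1 * mu1 + w2 * mu2) / w
    d1, d2 = mu1 - mu, mu2 - mu
    # Within-component covariances plus between-means spread terms.
    P = (w1 * (P1 + np.outer(d1, d1)) + w2 * (P2 + np.outer(d2, d2))) / w
    return w, mu, P
```

    Repeatedly merging the closest pair under some dissimilarity measure, until the component budget is met, yields the reduced mixture.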

  3. Complex mixtures biostudies

    International Nuclear Information System (INIS)

    Springer, D.L.

    1987-01-01

    The objective of the project is to identify potential adverse biological activities associated with human exposures to complex organic mixtures (COM) from energy-related industries. Studies to identify the influence of chemical class fractions from a COM on the initiating activity of a known carcinogen, benzo(a)pyrene (BaP), demonstrated that the polycyclic aromatic hydrocarbons (PAH) and nitrogen-containing polycyclic aromatic compound (NPAC) fractions were the most effective inhibitors of initiation. In an effort to determine the contribution of BaP to the initiating activity of the COM, binding of radiolabeled BaP to mouse skin DNA was measured. Results indicated that binding of BaP to DNA decreased in the presence of the COM, so that at initiating COM doses BaP binding was near the limit of detection. Addition of unlabeled BaP to the COM at an amount similar to that originally present in the COM did not significantly increase the binding. Studies to determine the rates of disappearance of carcinogenic PAH from the site of application on the skin indicated that half-lives for PAH differed by a factor of about 2. Analytical methods developed to identify PAH from COM which covalently bind to DNA demonstrated that the lower limit of detection is approximately 200 picograms. Developmental studies demonstrated that both pregnant rats and mice treated dermally with a high-boiling COM developed fetuses with major malformations including cleft palate, small lungs, edema, and sagittal suture hemorrhages. 3 figures, 5 tables

  4. Causal beliefs about depression in different cultural groups-what do cognitive psychological theories of causal learning and reasoning predict?

    Science.gov (United States)

    Hagmayer, York; Engelmann, Neele

    2014-01-01

    Cognitive psychological research focuses on causal learning and reasoning, while cognitive anthropological and social science research tends to focus on systems of beliefs. Our aim was to explore how these two types of research can inform each other. Cognitive psychological theories (causal model theory and causal Bayes nets) were used to derive predictions for systems of causal beliefs. These predictions were then applied to lay theories of depression as a specific test case. A systematic literature review on causal beliefs about depression was conducted, including original, quantitative research. Thirty-six studies investigating 13 non-Western and 32 Western cultural groups were analyzed by classifying assumed causes and preferred forms of treatment into common categories. Relations between beliefs and treatment preferences were assessed. Substantial agreement between cultural groups was found with respect to the impact of observable causes. Stress was generally rated as most important. Less agreement resulted for hidden, especially supernatural causes. Causal beliefs were clearly related to treatment preferences in Western groups, while evidence was mostly lacking for non-Western groups. Overall predictions were supported, but there were considerable methodological limitations. Pointers to future research, which may combine studies on causal beliefs with experimental paradigms on causal reasoning, are given.

  5. Causal beliefs about depression in different cultural groups—what do cognitive psychological theories of causal learning and reasoning predict?

    Science.gov (United States)

    Hagmayer, York; Engelmann, Neele

    2014-01-01

    Cognitive psychological research focuses on causal learning and reasoning, while cognitive anthropological and social science research tends to focus on systems of beliefs. Our aim was to explore how these two types of research can inform each other. Cognitive psychological theories (causal model theory and causal Bayes nets) were used to derive predictions for systems of causal beliefs. These predictions were then applied to lay theories of depression as a specific test case. A systematic literature review on causal beliefs about depression was conducted, including original, quantitative research. Thirty-six studies investigating 13 non-Western and 32 Western cultural groups were analyzed by classifying assumed causes and preferred forms of treatment into common categories. Relations between beliefs and treatment preferences were assessed. Substantial agreement between cultural groups was found with respect to the impact of observable causes. Stress was generally rated as most important. Less agreement resulted for hidden, especially supernatural causes. Causal beliefs were clearly related to treatment preferences in Western groups, while evidence was mostly lacking for non-Western groups. Overall predictions were supported, but there were considerable methodological limitations. Pointers to future research, which may combine studies on causal beliefs with experimental paradigms on causal reasoning, are given. PMID:25505432

  6. Predicting Causal Relationships from Biological Data: Applying Automated Causal Discovery on Mass Cytometry Data of Human Immune Cells.

    Science.gov (United States)

    Triantafillou, Sofia; Lagani, Vincenzo; Heinze-Deml, Christina; Schmidt, Angelika; Tegner, Jesper; Tsamardinos, Ioannis

    2017-10-05

    Learning the causal relationships that define a molecular system allows us to predict how the system will respond to different interventions. Distinguishing causality from mere association typically requires randomized experiments. Methods for automated causal discovery from limited experiments exist, but have so far rarely been tested in systems biology applications. In this work, we apply state-of-the-art causal discovery methods on a large collection of public mass cytometry data sets, measuring intra-cellular signaling proteins of the human immune system and their response to several perturbations. We show how different experimental conditions can be used to facilitate causal discovery, and apply two fundamental methods that produce context-specific causal predictions. Causal predictions were reproducible across independent data sets from two different studies, but often disagree with the KEGG pathway databases. Within this context, we discuss the caveats we need to overcome for automated causal discovery to become a part of the routine data analysis in systems biology.

  7. Causal beliefs about depression in different cultural groups – What do cognitive psychological theories of causal learning and reasoning predict?

    Directory of Open Access Journals (Sweden)

    York eHagmayer

    2014-11-01

    Full Text Available Cognitive psychological research focusses on causal learning and reasoning, while cognitive anthropological and social science research tends to focus on systems of beliefs. Our aim was to explore how these two types of research can inform each other. Cognitive psychological theories (causal model theory and causal Bayes nets) were used to derive predictions for systems of causal beliefs. These predictions were then applied to lay theories of depression as a specific test case. A systematic literature review on causal beliefs about depression was conducted, including original, quantitative research. Thirty-six studies investigating 13 non-Western and 32 Western cultural groups were analysed by classifying assumed causes and preferred forms of treatment into common categories. Relations between beliefs and treatment preferences were assessed. Substantial agreement between cultural groups was found with respect to the impact of observable causes. Stress was generally rated as most important. Less agreement resulted for hidden, especially supernatural causes. Causal beliefs were clearly related to treatment preferences in Western groups, while evidence was mostly lacking for non-Western groups. Overall predictions were supported, but there were considerable methodological limitations. Pointers to future research, which may combine studies on causal beliefs with experimental paradigms on causal reasoning, are given.

  8. Predicting Causal Relationships from Biological Data: Applying Automated Causal Discovery on Mass Cytometry Data of Human Immune Cells

    KAUST Repository

    Triantafillou, Sofia

    2017-09-29

    Learning the causal relationships that define a molecular system allows us to predict how the system will respond to different interventions. Distinguishing causality from mere association typically requires randomized experiments. Methods for automated causal discovery from limited experiments exist, but have so far rarely been tested in systems biology applications. In this work, we apply state-of-the-art causal discovery methods on a large collection of public mass cytometry data sets, measuring intra-cellular signaling proteins of the human immune system and their response to several perturbations. We show how different experimental conditions can be used to facilitate causal discovery, and apply two fundamental methods that produce context-specific causal predictions. Causal predictions were reproducible across independent data sets from two different studies, but often disagree with the KEGG pathway databases. Within this context, we discuss the caveats we need to overcome for automated causal discovery to become a part of the routine data analysis in systems biology.

  9. Causality and subjectivity in discourse : The meaning and use of causal connectives in spontaneous conversation, chat interactions and written text

    NARCIS (Netherlands)

    Sanders, T.J.M.|info:eu-repo/dai/nl/075243911; Spooren, W.P.M.S.

    Many languages of the world have connectives to express causal relations at the discourse level. Often, language users systematically prefer one lexical item (because) over another (even highly similar) one (since) to express a causal relationship. Such choices provide a window on speakers'

  10. Causation or only correlation? Application of causal inference graphs for evaluating causality in nano-QSAR models

    Science.gov (United States)

    Sizochenko, Natalia; Gajewicz, Agnieszka; Leszczynski, Jerzy; Puzyn, Tomasz

    2016-03-01

    In this paper, we suggest that causal inference methods could be efficiently used in Quantitative Structure-Activity Relationships (QSAR) modeling as additional validation criteria within quality evaluation of the model. Verification of the relationships between descriptors and toxicity or other activity in the QSAR model has a vital role in understanding the mechanisms of action. The well-known phrase ``correlation does not imply causation'' reflects the insight that a descriptor statistically correlated with the endpoint may not cause the emergence of this endpoint. Hence, paradigmatic shifts must be undertaken when moving from traditional statistical correlation analysis to causal analysis of multivariate data. Methods of causal discovery have been applied for broader physical insight into mechanisms of action and interpretation of the developed nano-QSAR models. Previously developed nano-QSAR models for the toxicity of 17 nano-sized metal oxides towards E. coli bacteria have been validated by means of the causality criteria. Using the descriptors confirmed by the causal technique, we have developed new models consistent with a straightforward causal-reasoning account. It was proven that causal inference methods are able to provide a more robust mechanistic interpretation of the developed nano-QSAR models.

  11. Easy and flexible mixture distributions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; Mabit, Stefan L.

    2013-01-01

    We propose a method to generate flexible mixture distributions that are useful for estimating models such as the mixed logit model using simulation. The method is easy to implement, yet it can approximate essentially any mixture distribution. We test it with good results in a simulation study...

  12. Systematic study of RPC performances in polluted or varying gas mixtures compositions: an online monitor system for the RPC gas mixture at LHC

    CERN Document Server

    Capeans, M; Mandelli, B

    2012-01-01

    The correct gas mixture is fundamental for the correct and safe operation of Resistive Plate Chamber (RPC) detector systems. A small change in the percentages of the gas mixture components can alter the RPC performance, and this would affect the data quality in the ALICE, ATLAS and CMS experiments at CERN. Constant monitoring of the gas mixture injected into the RPCs would avoid such problems. A systematic study has been performed to understand RPC performance with several gas mixture compositions and in the presence of common gas impurities. The systematic analysis of several RPC performance parameters in different gas mixtures allows the rapid identification of any variation in the RPC gas mixture. A set-up for the online monitoring of the RPC gas mixture in the LHC gas systems is also proposed.

  13. Pycnonuclear reaction rates for binary ionic mixtures

    Science.gov (United States)

    Ichimaru, S.; Ogata, S.; Van Horn, H. M.

    1992-01-01

    Through a combination of compositional scaling arguments and examinations of Monte Carlo simulation results for the interparticle separations in binary-ionic mixture (BIM) solids, we have derived parameterized expressions for the BIM pycnonuclear rates as generalizations of those in one-component solids obtained previously by Salpeter and Van Horn and by Ogata et al. We have thereby discovered a catalyzing effect of the heavier elements, which enhances the rates of reactions among the lighter elements when the charge ratio exceeds a critical value of approximately 2.3.

  14. Processing of Positive-Causal and Negative-Causal Coherence Relations in Primary School Children and Adults: A Test of the Cumulative Cognitive Complexity Approach in German

    Science.gov (United States)

    Knoepke, Julia; Richter, Tobias; Isberner, Maj-Britt; Naumann, Johannes; Neeb, Yvonne; Weinert, Sabine

    2017-01-01

    Establishing local coherence relations is central to text comprehension. Positive-causal coherence relations link a cause and its consequence, whereas negative-causal coherence relations add a contrastive meaning (negation) to the causal link. According to the cumulative cognitive complexity approach, negative-causal coherence relations are…

  15. Nonlinearity of bituminous mixtures

    Science.gov (United States)

    Mangiafico, S.; Babadopulos, L. F. A. L.; Sauzéat, C.; Di Benedetto, H.

    2018-02-01

    This paper presents an experimental characterization of the strain dependency of the complex modulus of bituminous mixtures for strain amplitude levels lower than about 110 μm/m. A series of strain amplitude sweep tests are performed at different temperatures (8, 10, 12 and 14°C) and frequencies (0.3, 1, 3 and 10 Hz), during which complex modulus is monitored. For each combination of temperature and frequency, four maximum strain amplitudes are targeted (50, 75, 100 and 110 μm/m). For each of them, two series of 50 loading cycles are applied, respectively at decreasing and increasing strain amplitudes. Before each decreasing strain sweep and after each increasing strain sweep, 5 cycles are performed at constant maximum targeted strain amplitude. Experimental results show that the behavior of the studied material is strain dependent. The norm of the complex modulus decreases and phase angle increases with strain amplitude. Results are presented in Black and Cole-Cole plots, where characteristic directions of nonlinearity can be identified. Both the effects of nonlinearity in terms of the complex modulus variation and of the direction of nonlinearity in Black space seem to validate the time-temperature superposition principle with the same shift factors as for linear viscoelasticity. The comparison between results obtained during increasing and decreasing strain sweeps suggests the existence of another phenomenon occurring during cyclic loading, which appears to systematically induce a decrease of the norm of the complex modulus and an increase of the phase angle, regardless of the type of the strain sweep (increasing or decreasing).

  16. Causal inference in neuronal time-series using adaptive decomposition.

    Science.gov (United States)

    Rodrigues, João; Andrade, Alexandre

    2015-04-30

    The assessment of directed functional connectivity from neuronal data is increasingly common in neuroscience by applying measures based in the Granger causality (GC) framework. Although initially these consisted in simple analyses based on directionality strengths, current methods aim to discriminate causal effects both in time and frequency domain. We study the effect of adaptive data analysis on the GC framework by combining empirical mode decomposition (EMD) and causal analysis of neuronal signals. EMD decomposes data into simple amplitude and phase modulated oscillatory modes, the intrinsic mode functions (IMFs), from which it is possible to compute their instantaneous frequencies (IFs). Hence, we propose a method where causality is estimated between IMFs with comparable IFs, in a static or time-varying procedure, and then attributed to the frequencies corresponding to the IF of the driving IMF for improved frequency localization. We apply a thorough simulation framework involving all possible combinations of EMD algorithms with causality metrics and realistically simulated datasets. Results show that synchrosqueezing wavelet transform and noise-assisted multivariate EMD, paired with generalized partial directed coherence or with Geweke's GC, provide the highest sensitivity and specificity results. Compared to standard causal analysis, the output of selected representative instances of this methodology result in the fulfillment of performance criteria in a well-known benchmark with real animal epicranial recordings and improved frequency resolution for simulated neural data. This study presents empirical evidence that adaptive data analysis is a fruitful addition to the existing causal framework. Copyright © 2015 Elsevier B.V. All rights reserved.
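    One building block of this pipeline, computing an instantaneous-frequency track for an oscillatory mode, can be sketched with an FFT-based analytic signal (a generic textbook construction, not the authors' EMD code; the function name is illustrative):

```python
import numpy as np

def instantaneous_frequency(mode, fs):
    """Instantaneous frequency (Hz) of a narrow-band mode via the
    analytic signal z = mode + i * Hilbert(mode)."""
    n = len(mode)
    spec = np.fft.fft(mode)
    # Build the Hilbert-transform multiplier: keep DC, double positive
    # frequencies, zero the negative ones.
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(spec * h)
    phase = np.unwrap(np.angle(analytic))
    return np.diff(phase) * fs / (2 * np.pi)
```

    In the method described above, causality would then be estimated only between modes whose instantaneous-frequency tracks are comparable, and attributed to the frequency of the driving mode.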

  17. Consumer involvement in the product with general causality orientations

    Directory of Open Access Journals (Sweden)

    Matanović Jelena

    2012-01-01

    The main objective of this research was to establish the predictive value of causality orientations for types of consumer involvement in a product. In addition, possible differences in the expression of causality orientations and involvement dimensions across different groups of respondents were tested. The assumption is that some of the five types of consumer involvement (pleasure, importance, sign, risk importance, risk probability) can be predicted from the dominant type of causality orientation (autonomous, controlled, impersonal). The research was conducted on a sample of 178 consumers in the Republic of Serbia, varying in sex, age, marital and working status, level of education, and purchasing power. The General Causality Orientation Scale (Deci and Ryan, 1985) and the Involvement Profile (Laurent and Kapferer, 1985) were used. Pleasure was the most strongly expressed dimension of involvement. Regarding sociodemographic variables, the expression of individual involvement dimensions differed by sex and education, while differences in general causality orientations existed only by sex. The following causality orientations emerged as significant predictors of involvement type: autonomy for pleasure, controlled for importance and risk importance, and impersonal for risk probability. Overall, however, it was not possible to predict involvement type solely on the basis of respondents' general causality orientation.

  18. Causality between Prices and Wages: VECM Analysis for EU-27

    Directory of Open Access Journals (Sweden)

    Adriatik Hoxha

    2010-09-01

    The literature on causality, as well as the empirical evidence, clearly shows that there are two opposing groups of economists who support different hypotheses with respect to the flow of causality in the price-wage relationship. The first group argues that causality runs from wages to prices, whereas the second argues that the effect flows from prices to wages. Nonetheless, the literature review suggests that there is at least some consensus that researchers' conclusions may be contingent on the type of data employed or the econometric model applied, or even that the relationship may alter with economic cycles. This paper empirically examines the price-wage causal relationship in EU-27 using OLS and VECM analysis, and provides robust evidence in support of a bilateral causal relationship between prices and wages, both in the long run and in the short run. Prior to designing and estimating the econometric model we performed stationarity tests for the employed price, wage and productivity variables. Additionally, we specified the model taking into account the lag order as well as the rank of co-integration for the co-integrated variables. Furthermore, we applied the respective restrictions on the parameters of the estimated VECM. The evidence resulting from model robustness checks indicates that the results are statistically robust. Although far from closing the issue of causality between prices and wages, this paper at least provides some fresh evidence for the case of EU-27.
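
    The error-correction logic behind a VECM can be illustrated on simulated cointegrated series with a minimal two-step Engle-Granger sketch (this is not the paper's EU-27 estimation; the "price" and "wage" series and all parameters below are invented):

```python
import random

def ols(X, y):
    """Ordinary least squares via normal equations and Gaussian elimination."""
    n, k = len(y), len(X[0])
    A = [[sum(X[t][i] * X[t][j] for t in range(n)) for j in range(k)] for i in range(k)]
    b = [sum(X[t][i] * y[t] for t in range(n)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv], b[col], b[piv] = A[piv], A[col], b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

rng = random.Random(1)
n = 2000
z = [0.0]                                  # common stochastic trend (random walk)
for _ in range(n - 1):
    z.append(z[-1] + rng.gauss(0, 1))
x = [zi + rng.gauss(0, 1) for zi in z]     # "prices"
y = [zi + rng.gauss(0, 1) for zi in z]     # "wages": x and y are cointegrated

# Step 1: long-run (cointegrating) regression y = a + b*x; keep the residuals.
a, b = ols([[1.0, xi] for xi in x], y)
e = [y[t] - a - b * x[t] for t in range(n)]

# Step 2: error-correction regression: dy[t] = c0 + c1*dx[t] + gamma*e[t-1].
X2 = [[1.0, x[t] - x[t - 1], e[t - 1]] for t in range(1, n)]
dy = [y[t] - y[t - 1] for t in range(1, n)]
c0, c1, gamma = ols(X2, dy)
# gamma < 0: deviations from the long-run relation are corrected over time.
```

    The negative error-correction coefficient gamma is the signature a VECM looks for: when wages drift above their long-run relation with prices, subsequent wage growth slows.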

  19. New Insights into Signed Path Coefficient Granger Causality Analysis.

    Science.gov (United States)

    Zhang, Jian; Li, Chong; Jiang, Tianzi

    2016-01-01

    Granger causality analysis, a time series analysis technique derived from econometrics, has been applied in an ever-increasing number of publications in the field of neuroscience, including fMRI, EEG/MEG, and fNIRS. The present study focuses on the validity of "signed path coefficient Granger causality," a Granger-causality-derived analysis method that has been adopted by many fMRI studies in the last few years. This method generally estimates the causal effect among time series by an order-1 autoregression, and defines a positive or negative coefficient as an "excitatory" or "inhibitory" influence. In the current work we conducted a series of computations on resting-state fMRI data and simulation experiments to illustrate that the signed path coefficient method is flawed and untenable, because the autoregressive coefficients are not always consistent with the real causal relationships, which inevitably leads to erroneous conclusions. Overall, our findings suggest that the applicability of this kind of causality analysis is rather limited; researchers should be more cautious in applying signed path coefficient Granger causality to fMRI data to avoid misinterpretation.
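
    The pitfall the authors describe can be reproduced in a few lines: two phase-shifted observations of one oscillation, with no causal link between them, still produce a large negative order-1 "path coefficient" that would naively be read as an inhibitory influence (a minimal illustration with made-up signals, not the study's fMRI computations):

```python
import math

def ols(X, y):
    """Ordinary least squares via normal equations and Gaussian elimination."""
    n, k = len(y), len(X[0])
    A = [[sum(X[t][i] * X[t][j] for t in range(n)) for j in range(k)] for i in range(k)]
    b = [sum(X[t][i] * y[t] for t in range(n)) for i in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv], b[col], b[piv] = A[piv], A[col], b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Two observations of one underlying oscillation, phase-shifted: no causal link.
n = 500
x = [math.sin(0.3 * t) for t in range(n)]
y = [math.sin(0.3 * t + 1.0) for t in range(n)]

# Order-1 "signed path coefficient": regress y[t] on y[t-1] and x[t-1].
_, coef_y, coef_x = ols([[1.0, y[t - 1], x[t - 1]] for t in range(1, n)], y[1:])
# coef_x equals -sin(0.3)/sin(1.0), about -0.35, although x has no influence on y,
# so reading its sign as "inhibitory" would be a misinterpretation.
```

    Here the exact coefficient can be derived by trigonometric identities, which is why the regression recovers a strongly negative value despite the complete absence of causal influence.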

  20. Causal Learning in Gambling Disorder: Beyond the Illusion of Control.

    Science.gov (United States)

    Perales, José C; Navas, Juan F; Ruiz de Lara, Cristian M; Maldonado, Antonio; Catena, Andrés

    2017-06-01

    Causal learning is the ability to progressively incorporate raw information about dependencies between events, or between one's behavior and its outcomes, into beliefs about the causal structure of the world. Although some cognitive biases in gambling disorder can be described as alterations of causal learning involving gambling-relevant cues, behaviors, and outcomes, general causal learning mechanisms in gamblers have not been systematically investigated. In the present study, we compared gambling disorder patients against controls in an instrumental causal learning task. Evidence of illusion of control, namely, overestimation of the relationship between one's behavior and an uncorrelated outcome, showed up only in gamblers with strong current symptoms. Interestingly, this effect was part of a more complex pattern in which gambling disorder patients manifested a poorer ability to discriminate between null and positive contingencies. Additionally, these anomalies were related to gambling severity and current gambling disorder symptoms. Gambling-related biases, as measured by a standard psychometric tool, correlated with performance in the causal learning task, but not in the expected direction. Indeed, the performance of gamblers with stronger biases tended to resemble that of controls, which could imply that anomalies of causal learning processes play a role in gambling disorder but do not seem to underlie gambling-specific biases, at least in a simple, direct way.

  1. Causalidade e epidemiologia Causality and epidemiology

    Directory of Open Access Journals (Sweden)

    Rita Barradas Barata

    1997-06-01

    Este texto trata da questão da causalidade em epidemiologia. Começa com um breve retrospecto histórico para recuperar os diversos sentidos dados ao conceito pelos principais filósofos ocidentais. Em seguida, considera as raízes históricas da epidemiologia enquanto disciplina científica e as transformações que o conceito de causa sofreu em seu âmbito. Estabelecidas essas premissas, analisa-se o desenvolvimento da epidemiologia no século XX e a crise de paradigma que enfrenta na atualidade. Como saídas para a crise, no que se refere à questão da causalidade, examina três alternativas: a epidemiologia social, a crítica popperiana e os aportes da biologia molecular. Finalmente, comenta a necessidade de uma nova teoria epidemiológica construída a partir da teoria da complexidade. In examining the issue of causality within epidemiology, the text begins with a brief historical overview that reclaims the different meanings which the West's main philosophers have lent to this concept. It next delves into the historical roots of epidemiology as a scientific discipline and the transformations the concept of cause has undergone within this realm. With these presuppositions in place, the text goes on to analyse the 20th-century development of epidemiology and the crisis it currently faces in terms of paradigm. Three alternatives are explored as ways out of this crisis: social epidemiology, Popperian criticism and the contributions of molecular biology. Lastly, the text discusses the need for a new epidemiological theory grounded in the theory of complexity.

  2. A new correlation for nucleate pool boiling of aqueous mixtures

    International Nuclear Information System (INIS)

    Thome, J.R.; Shakir, S.

    1987-01-01

    A new mixture boiling correlation was developed for nucleate pool boiling of aqueous mixtures on plain, smooth tubes. The semi-empirical correlation models the rise in the local bubble point temperature in a mixture caused by the preferential evaporation of the more volatile component during bubble growth. This rise varies from zero at low heat fluxes (where only single-phase natural convection is present) up to nearly the entire boiling range at the peak heat flux (where latent heat transport is dominant). The boiling range, which is the temperature difference between the dew point and bubble point of a mixture, is used to characterize phase equilibrium effects. An exponential term models the rise in the local bubble point temperature as a function of heat flux. The correlation was compared against binary mixture boiling data for ethanol-water, methanol-water, n-propanol-water, and acetone-water. The majority of the data were predicted to within 20%. Further experimental research is currently underway to obtain multicomponent boiling data for aqueous mixtures with up to five components and for wider boiling ranges.

  3. A Complex Systems Approach to Causal Discovery in Psychiatry.

    Directory of Open Access Journals (Sweden)

    Glenn N Saxe

    Conventional research methodologies and data analytic approaches in psychiatric research are unable to reliably infer causal relations without experimental designs, or to make inferences about the functional properties of the complex systems in which psychiatric disorders are embedded. This article describes a series of studies to validate a novel hybrid computational approach--the Complex Systems-Causal Network (CS-CN) method--designed to integrate causal discovery within a complex systems framework for psychiatric research. The CS-CN method was first applied to an existing dataset on psychopathology in 163 children hospitalized with injuries (validation study). Next, it was applied to a much larger dataset of traumatized children (replication study). Finally, the CS-CN method was applied in a controlled experiment using a 'gold standard' dataset for causal discovery and compared with other methods for accurately detecting causal variables (resimulation controlled experiment). The CS-CN method successfully detected a causal network of 111 variables and 167 bivariate relations in the initial validation study. This causal network had well-defined adaptive properties, and a set of variables was found that disproportionally contributed to these properties. Modeling the removal of these variables resulted in significant loss of adaptive properties. The CS-CN method was successfully applied in the replication study and performed better than traditional statistical methods, and similarly to state-of-the-art causal discovery algorithms, in the causal detection experiment. The CS-CN method was validated, replicated, and yielded both novel and previously validated findings related to risk factors and potential treatments of psychiatric disorders. The novel approach yields both fine-grain (micro) and high-level (macro) insights and thus represents a promising approach for complex systems-oriented research in psychiatry.

  4. Dye mixtures for ultrafast wavelength shifters

    International Nuclear Information System (INIS)

    Gangopadhyay, S.; Liu, L.; Palsule, C.; Borst, W.; Wigmans, R.

    1994-01-01

    Particle detectors based on scintillation processes have been used since the discovery of radium about 100 years ago. The fast signals that can be obtained with these detectors, although often considered a nice asset, were rarely essential for the success of experiments. However, the new generation of high energy particle accelerators requires particle detectors with fast response times. The authors have produced fast wavelength shifters using mixtures of various Coumarin dyes with DCM in epoxy polymers (DGEBA+HHPA) and measured the properties of these wavelength shifters. The particular mixtures were chosen because there is substantial overlap between the emission spectrum of Coumarin and the absorption spectrum of DCM. The continuous-wave and time-resolved fluorescence spectra have been studied as a function of component concentration to optimize the decay times, emission peaks and quantum yields. The mean decay times of these mixtures are in the range of 2.5--4.5 ns. The mean decay time increases with an increase in Coumarin concentration at a fixed DCM concentration, or with a decrease in DCM concentration at a fixed Coumarin concentration. This indicates that the energy transfer is radiative at lower relative DCM concentrations and becomes non-radiative at higher DCM concentrations.

  5. Non-Genomic Effects of Xenoestrogen Mixtures

    Science.gov (United States)

    Viñas, René; Jeng, Yow-Jiun; Watson, Cheryl S.

    2012-01-01

    Xenoestrogens (XEs) are chemicals derived from a variety of natural and anthropogenic sources that can interfere with endogenous estrogens by either mimicking or blocking their responses via non-genomic and/or genomic signaling mechanisms. Disruption of estrogens’ actions through the less-studied non-genomic pathway can alter such functional end points as cell proliferation, peptide hormone release, catecholamine transport, and apoptosis, among others. Studies of potentially adverse effects due to mixtures and to low doses of endocrine-disrupting chemicals have recently become more feasible, though few so far have included actions via the non-genomic pathway. Physiologic estrogens and XEs evoke non-monotonic dose responses, with different compounds having different patterns of actions dependent on concentration and time, making mixture assessments all the more challenging. In order to understand the spectrum of toxicities and their mechanisms, future work should focus on carefully studying individual and mixture components across a range of concentrations and cellular pathways in a variety of tissue types. PMID:23066391

  6. Non-Genomic Effects of Xenoestrogen Mixtures

    Directory of Open Access Journals (Sweden)

    Yow-Jiun Jeng

    2012-07-01

    Xenoestrogens (XEs) are chemicals derived from a variety of natural and anthropogenic sources that can interfere with endogenous estrogens by either mimicking or blocking their responses via non-genomic and/or genomic signaling mechanisms. Disruption of estrogens’ actions through the less-studied non-genomic pathway can alter such functional end points as cell proliferation, peptide hormone release, catecholamine transport, and apoptosis, among others. Studies of potentially adverse effects due to mixtures and to low doses of endocrine-disrupting chemicals have recently become more feasible, though few so far have included actions via the non-genomic pathway. Physiologic estrogens and XEs evoke non-monotonic dose responses, with different compounds having different patterns of actions dependent on concentration and time, making mixture assessments all the more challenging. In order to understand the spectrum of toxicities and their mechanisms, future work should focus on carefully studying individual and mixture components across a range of concentrations and cellular pathways in a variety of tissue types.

  7. Statistical mechanical theory of fluid mixtures

    Science.gov (United States)

    Zhao, Yueqiang; Wu, Zhengming; Liu, Weiwei

    2014-01-01

    A general statistical mechanical theory of fluid mixtures (liquid mixtures and gas mixtures) is developed based on the statistical mechanical expression for the chemical potential of components in the grand canonical ensemble, which gives some new relationships between thermodynamic quantities (equilibrium ratio Ki, separation factor α and activity coefficient γi) and the ensemble average potential energy u for one molecule. The statistical mechanical expressions for separation factor α and activity coefficient γi derived in this work allow fluid phase equilibrium calculations to be performed simply and efficiently by molecular simulation, or by a statistical thermodynamic approach (based on the saturated-vapor pressure of the pure substance) that does not need microscopic intermolecular pair potential functions. The physical meaning of the activity coefficient γi in the liquid phase is discussed in detail from the viewpoint of molecular thermodynamics. The calculated vapor-liquid equilibrium (VLE) properties of the argon-methane, methanol-water and n-hexane-benzene systems agree well with experimental data from the literature, which indicates that this model is accurate and reliable in the prediction of VLE properties for small, large and strongly associating molecules; furthermore, the statistical mechanical expressions for separation factor α and activity coefficient γi are compatible with classical thermodynamic equations and the quantum mechanical COSMO-SAC approach.

  8. Component Rhinoplasty

    OpenAIRE

    Mohmand, Muhammad Humayun; Ahmad, Muhammad

    2014-01-01

    BACKGROUND According to statistics of the American Society of Plastic Surgeons, cosmetic rhinoplasty was the second most frequently performed cosmetic surgery. This study shares our experience with component rhinoplasty. METHODS From 2004 to 2010, all patients who underwent aesthetic nasal surgery were enrolled. Patients requiring only correction of septal deviation and those presenting with cleft lip nasal deformity were excluded. All procedures were performed under general anaesthesia with ope...

  9. Hyperfrequency components

    Science.gov (United States)

    1994-09-01

    The document has a collection of 19 papers (11 on technologies, 8 on applications) by 26 authors and coauthors. Technological topics include: evolution from conventional HEMTs to double-heterojunction and planar types of pseudomorphic HEMTs; MMIC R&D and production aspects for very-low-noise, low-power, and very-low-noise, high-power applications; hyperfrequency CAD tools; parametric measurements of hyperfrequency components on plug-in cards for design and in-process testing uses; design of Class B power amplifiers and millimetric-wave, bigrid-transistor mixers, exemplifying combined use of three major types of physical simulation in electrical modeling of microwave components; FETs for power amplification at up to 110 GHz; and production, characterization, and nonlinear applications of resonant tunnel diodes. Applications topics include: development of active modules for major European programs; tubes versus solid-state components in hyperfrequency applications; status and potentialities of national and international cooperative R&D on MMICs and CAD of hyperfrequency circuitry; attainable performance levels in multifunction MMIC applications; the state of the art in MESFET power amplifiers (Bands S, C, X, Ku); creating a hyperfrequency functions library of parametrizable reference cells or macrocells; and design of a single-stage, low-noise, Band-W amplifier toward development of a three-stage amplifier.

  10. Causal influence in linear Langevin networks without feedback

    Science.gov (United States)

    Auconi, Andrea; Giansanti, Andrea; Klipp, Edda

    2017-04-01

    The intuition of causation is so fundamental that almost every research study in life sciences refers to this concept. However, a widely accepted formal definition of causal influence between observables is still missing. In the framework of linear Langevin networks without feedback (linear response models) we propose a measure of causal influence based on a new decomposition of information flows over time. We discuss its main properties and we compare it with other information measures like the transfer entropy. We are currently unable to extend the definition of causal influence to systems with a general feedback structure and nonlinearities.

  11. Causality Constraints on Hadron Production In High Energy Collisions

    CERN Document Server

    Castorina, P

    2014-01-01

    For hadron production in high energy collisions, causality requirements lead to the counterpart of the cosmological horizon problem: the production occurs in a number of causally disconnected regions of finite space-time size. As a result, globally conserved quantum numbers (charge, strangeness, baryon number) must be conserved locally in spatially restricted correlation clusters. This provides a theoretical basis for the observed suppression of strangeness production in elementary interactions (pp, e^+e^-). In contrast, the space-time superposition of many collisions in heavy ion interactions largely removes these causality constraints, resulting in an ideal hadronic resonance gas in full equilibrium.

  12. Causality and prediction: differences and points of contact

    Directory of Open Access Journals (Sweden)

    Luis Carlos Silva Ayçaguer, PhD

    2014-09-01

    This contribution presents the differences between variables that might play a causal role in a certain process and those valuable only for predicting the outcome. Some considerations are made about the role of association, temporal precedence, and biases in both cases: the study of causality and predictive modeling. In that context, several relevant aspects related to the design of the corresponding studies are briefly reviewed, and some of the mistakes often committed in handling causality and prediction are illustrated.

  13. Spatial Causality. An application to the Deforestation Process in Bolivia

    Directory of Open Access Journals (Sweden)

    Javier Aliaga

    2011-01-01

    This paper analyses the causes of deforestation for a representative set of Bolivian municipalities. The literature on environmental economics insists on the importance of physical and social factors; we focus on the latter group of variables. Our objective is to identify causal mechanisms between these risk factors and the problem of deforestation. To this end, we present a testing strategy for spatial causality based on a sequence of Lagrange multipliers. The results we obtain for the Bolivian case only partially confirm the traditional view of the problem of deforestation. Indeed, we only find unequivocal signs of causality in relation to the structure of property rights.

  14. Evaporation dynamics and Marangoni number estimation for sessile picoliter liquid drop of binary mixture solution

    OpenAIRE

    Lebedev-Stepanov Peter; Kobelev Alexander; Efimov Sergey

    2016-01-01

    We propose an evaporation model for a picoliter sessile drop of a binary solvent mixture (with components infinitely soluble in each other) based on the Hu and Larson solution for a single-solvent sessile drop and Raoult's law for the saturated vapor density of the components of the binary mixture, over a wide range of dimensionless molar concentrations of the components. A concentration Marangoni number estimation for such a system is also considered for predicting the liquid flow structure for further applications ...
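
    Raoult's law, which the model uses for the saturated vapor over the binary mixture, reduces to a couple of lines (a generic ideal-mixture sketch with illustrative pure-component vapor pressures, not the paper's solvent pair):

```python
# Raoult's law for an ideal binary mixture: each component contributes its
# pure-component saturation pressure weighted by its liquid mole fraction.
def raoult(x1, p1_sat, p2_sat):
    """Return total vapor pressure and vapor-phase mole fraction of component 1."""
    p_total = x1 * p1_sat + (1.0 - x1) * p2_sat
    y1 = x1 * p1_sat / p_total   # vapor is enriched in the more volatile component
    return p_total, y1

# Illustrative values (kPa) for a volatile solvent (1) in a water-like solvent (2).
p_total, y1 = raoult(x1=0.3, p1_sat=30.0, p2_sat=3.0)
# p_total = 0.3*30 + 0.7*3 = 11.1 kPa; y1 = 9/11.1, roughly 0.81
```

    The enrichment of the vapor in the more volatile component is what makes the drop's composition, and hence its evaporation rate and Marangoni flows, change over time.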

  15. Poisson Mixture Regression Models for Heart Disease Prediction

    Science.gov (United States)

    Erol, Hamza

    2016-01-01

    Early heart disease control can be achieved through high disease prediction and diagnosis efficiency. This paper focuses on the use of model-based clustering techniques to predict and diagnose heart disease via Poisson mixture regression models. Analysis and application of Poisson mixture regression models are addressed under two different classes: standard and concomitant variable mixture regression models. Results show that a two-component concomitant variable Poisson mixture regression model predicts heart disease better than both the standard Poisson mixture regression model and the ordinary general linear Poisson regression model, due to its lower Bayesian Information Criterion value. Furthermore, a zero-inflated Poisson mixture regression model turned out to be the best model overall for heart disease prediction, as it both clusters individuals into high- or low-risk categories and predicts the rate of heart disease componentwise given the available clusters. It is deduced that heart disease prediction can be done effectively by identifying the major risks componentwise using a Poisson mixture regression model. PMID:27999611
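
    A stripped-down version of the clustering idea -- a two-component Poisson mixture fitted by EM, without the regression or concomitant-variable parts of the paper's models -- can be sketched as follows (all rates and sample sizes are invented):

```python
import math
import random

def log_pois(x, lam):
    """Log of the Poisson pmf, computed in log space for stability."""
    return -lam + x * math.log(lam) - math.lgamma(x + 1)

def fit_two_poisson_mixture(data, iters=200):
    """EM for a two-component Poisson mixture; returns (weight1, lam1, lam2)."""
    lam1, lam2, w = min(data) + 1.0, max(data) / 2.0 + 1.0, 0.5
    for _ in range(iters):
        # E-step: responsibility of component 1 for each observation.
        r = []
        for xv in data:
            p1 = w * math.exp(log_pois(xv, lam1))
            p2 = (1.0 - w) * math.exp(log_pois(xv, lam2))
            r.append(p1 / (p1 + p2))
        # M-step: update the weight and the two rates from the responsibilities.
        s = sum(r)
        w = s / len(data)
        lam1 = sum(ri * xv for ri, xv in zip(r, data)) / s
        lam2 = sum((1 - ri) * xv for ri, xv in zip(r, data)) / (len(data) - s)
    return w, lam1, lam2

def sample_poisson(lam, rng):   # Knuth's multiplication method
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

# Simulate counts from a "low-risk" group (rate 2) and a "high-risk" group (rate 10).
rng = random.Random(42)
data = [sample_poisson(2.0, rng) for _ in range(1200)] + \
       [sample_poisson(10.0, rng) for _ in range(800)]
w, lam1, lam2 = fit_two_poisson_mixture(data)
lam_low, lam_high = sorted([lam1, lam2])   # EM recovers rates near 2 and 10
```

    The full regression models replace the constant rates with log-linear functions of covariates, but the E-step/M-step alternation and the component responsibilities work the same way.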

  16. Explosibility boundaries for fly ash/pulverized fuel mixtures.

    Science.gov (United States)

    Dastidar, A G; Amyotte, P R

    2002-05-27

    Incomplete combustion and subsequent fuel contamination of a waste stream can pose a serious explosion hazard. An example of this type of incident is the contamination of fly ash with unburned pulverized coal. The coal, if present in sufficient quantities in the mixture, can act as a fuel source for a potential explosion. Experiments were conducted in a 20-L Siwek explosibility test chamber to determine the minimum fuel contamination of fly ash required to form an explosible mixture. A sample of fly ash from Ontario Power Generation (OPG) (Ont., Canada) was artificially contaminated with Pittsburgh pulverized coal dust (the surrogate used to represent unburned fuel dust). Additionally, the influence of fly ash particle size on the amount of fuel contaminant required to form an explosible mixture was examined. Fine and coarse size fractions of fly ash were obtained by screening the original sample of OPG fly ash. The results show that at least 21% Pittsburgh pulverized coal (or 10% volatile matter) was required to form an explosible mixture of the original fly ash sample and coal dust. The results also illustrate that fly ash particle size is important when examining the explosibility of the mixture. The fine size fraction of fly ash required a minimum of 25% coal dust (12% volatile matter) in the mixture for explosibility, whereas the coarse fly ash required only 10% coal dust (7% volatile matter). Thus, the larger the particle size of the inert fly ash component in the mixture, the greater the hazard.

  17. Dynamics and causalities of atmospheric and oceanic data identified by complex networks and Granger causality analysis

    Science.gov (United States)

    Charakopoulos, A. K.; Katsouli, G. A.; Karakasidis, T. E.

    2018-04-01

    Understanding the underlying processes, and extracting detailed characteristics of the spatiotemporal dynamics of the ocean and atmosphere as well as their interaction, is of significant interest and has not been thoroughly established. The purpose of this study was to examine the performance of two methodologies for identifying spatiotemporal dynamic characteristics and patterns among atmospheric and oceanic variables from Seawatch buoys in the Aegean and Ionian Seas, provided by the Hellenic Center for Marine Research (HCMR). The first approach involves cross-correlation analysis in an attempt to investigate time-lagged relationships; further, to identify the direction of interactions between the variables, we applied the Granger causality method. In the second approach, the time series are converted into complex networks, and the main topological network properties, such as degree distribution, average path length, diameter, modularity and clustering coefficient, are evaluated. Our results show that the proposed complex network analysis of time series can lead to the extraction of hidden spatiotemporal characteristics. Our findings also indicate high levels of positive and negative correlation and causality among variables, both from the same buoy and between buoys at different stations, which cannot be determined from simple statistical measures.
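
    One standard way to convert a time series into a complex network, as the second approach requires, is the natural visibility graph. A compact sketch (the generic algorithm, not necessarily the exact construction used in the study; the test signal is illustrative):

```python
import math

def visibility_graph(series):
    """Natural visibility graph: nodes are time points; (a, b) are linked when
    every intermediate sample lies strictly below the line of sight between them."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[b] + (series[a] - series[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

# A noisy periodic signal standing in for a buoy record (illustrative).
series = [math.sin(0.4 * t) + 0.1 * math.cos(3.1 * t) for t in range(100)]
edges = visibility_graph(series)
degree = [0] * len(series)
for a, b in edges:
    degree[a] += 1
    degree[b] += 1
# Consecutive samples always see each other, so every node has degree >= 1
# and the resulting graph is connected.
```

    Network measures such as the degree distribution or clustering coefficient are then computed on `edges`; different dynamical regimes of the original series leave distinct topological signatures.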

  18. Permeation of aromatic solvent mixtures through nitrile protective gloves.

    Science.gov (United States)

    Chao, Keh-Ping; Hsu, Ya-Ping; Chen, Su-Yi

    2008-05-30

    The permeation of binary and ternary mixtures of benzene, toluene, ethyl benzene and p-xylene through nitrile gloves was investigated using the ASTM F739 test cell. The more slowly permeating component of a mixture was accelerated, showing a shorter breakthrough time than in its pure form. Larger differences in solubility parameter between a solvent mixture and the glove resulted in lower permeation rates. Solubility parameter theory thus provides a potential approach to interpreting the changes in permeation properties for BTEX mixtures through nitrile gloves. Using a one-dimensional diffusion model based on Fick's law, the permeation concentrations of the ASTM F739 experiments were appropriately simulated with the estimated diffusion coefficient and solubility. This study provides a foundation for risk assessment of the potential dermal exposure of workers wearing protective gloves.
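
    The one-dimensional Fickian picture behind such permeation models can be sketched with an explicit finite-difference scheme (nondimensional units with D = L = C0 = 1; glove-specific diffusion coefficients and thicknesses would simply rescale the results):

```python
# Explicit FTCS scheme for dC/dt = D * d2C/dx2 across a membrane of thickness L,
# with the challenge side held at C0 and the collection side held at zero.
nx = 41
dx = 1.0 / (nx - 1)
dt = 0.2 * dx * dx               # satisfies the stability limit dt < dx^2 / 2
C = [0.0] * nx
C[0] = 1.0                       # challenge-side boundary concentration
t, t_break, flux = 0.0, None, 0.0
while t < 2.0:
    C = ([C[0]] +
         [C[i] + dt / dx ** 2 * (C[i + 1] - 2 * C[i] + C[i - 1])
          for i in range(1, nx - 1)] +
         [0.0])
    t += dt
    flux = (C[-2] - C[-1]) / dx  # outgoing flux at the collection side (D = 1)
    if t_break is None and flux > 0.01:
        t_break = t              # "breakthrough": 1% of the steady-state flux
# At steady state the profile is linear and the flux approaches D * C0 / L = 1.
```

    In dimensional form the breakthrough time scales as L^2/D, which is why thicker gloves and less compatible (lower-D) solvent/glove pairings delay breakthrough.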

  19. Forced convection heat transfer to air/water vapor mixtures

    International Nuclear Information System (INIS)

    Richards, D.R.; Florschuetz, L.W.

    1986-01-01

    Heat transfer coefficients were measured using both dry air and air/water vapor mixtures in the same forced convection cooling test rig (jet array impingement configurations) with mass ratios of water vapor to air up to 0.23. The primary objective was to verify by direct experiment that selected existing methods for evaluation of viscosity and thermal conductivity of air/water vapor mixtures could be used with confidence to predict heat transfer coefficients for such mixtures using as a basis heat transfer data for dry air only. The property evaluation methods deemed most appropriate require as a basis a measured property value at one mixture composition in addition to the property values for the pure components.

  20. Relativistic Causality and Quasi-Orthomodular Algebras

    Science.gov (United States)

    Nobili, Renato

    2006-05-01

    The concept of fractionability or decomposability into parts of a physical system has its mathematical counterpart in the lattice-theoretic concept of orthomodularity. Systems with a finite number of degrees of freedom can be decomposed in different ways, corresponding to different groupings of the degrees of freedom. The orthomodular structure of these simple systems is trivially manifest. The problem then arises as to whether the same property is shared by physical systems with an infinite number of degrees of freedom, in particular by the quantum relativistic ones. The latter case was approached several years ago by Haag and Schroer (1962; Haag, 1992), who started by noting that the causally complete sets of Minkowski spacetime form an orthomodular lattice and posed the question of whether the subalgebras of local observables, with topological supports on such subsets, themselves form a corresponding orthomodular lattice. Were it so, the way would be paved to interpreting spacetime as an intrinsic property of a local quantum field algebra. Surprisingly enough, however, the hoped-for property does not hold for local algebras of free fields with superselection rules. The possibility seems instead to be open if the local currents that govern the superselection rules are driven by gauge fields. Thus, in the framework of local quantum physics, the request for algebraic orthomodularity seems to imply physical interactions! Despite its charm, however, such a request appears plagued by ambiguities and criticalities that make it an ill-posed problem. The proposers themselves, indeed, concluded that the orthomodular correspondence hypothesis is too strong to have a chance of being practicable. Thus, the idea was neither taken seriously by the proposers nor further investigated by others up to a reasonable degree of clarification. This paper is an attempt to re-formulate and well-pose the problem. It will be shown that the idea is viable provided that the algebra of

  1. A framework for Bayesian nonparametric inference for causal effects of mediation.

    Science.gov (United States)

    Kim, Chanmin; Daniels, Michael J; Marcus, Bess H; Roy, Jason A

    2017-06-01

    We propose a Bayesian non-parametric (BNP) framework for estimating causal effects of mediation, the natural direct and indirect effects. The strategy is to do this in two parts. Part 1 is a flexible model (using BNP) for the observed data distribution. Part 2 is a set of uncheckable assumptions with sensitivity parameters that, in conjunction with Part 1, allows identification and estimation of the causal parameters, and allows for uncertainty about these assumptions via priors on the sensitivity parameters. For Part 1, we specify a Dirichlet process mixture of multivariate normals as a prior on the joint distribution of the outcome, mediator, and covariates. This approach allows us to obtain a (simple) closed form of each marginal distribution. For Part 2, we consider two sets of assumptions: (a) the standard sequential ignorability (Imai et al., 2010) and (b) a weakened set of conditional-independence-type assumptions introduced in Daniels et al. (2012), and propose sensitivity analyses for both. We use this approach to assess mediation in a physical activity promotion trial. © 2016, The International Biometric Society.
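
    Setting the BNP machinery aside, the natural direct/indirect decomposition under sequential ignorability can be illustrated with plain linear working models, where the indirect effect reduces to the familiar product of coefficients. The following stdlib-Python sketch uses simulated data; all variable names and coefficient values are hypothetical and not taken from the paper.

```python
import random

def ols(X, y):
    """Least-squares coefficients via the normal equations, solved by
    Gaussian elimination with partial pivoting. Each row of X carries a
    leading 1 for the intercept."""
    p = len(X[0])
    A = [[sum(X[i][j] * X[i][k] for i in range(len(X))) for k in range(p)]
         for j in range(p)]                         # X^T X
    b = [sum(X[i][j] * y[i] for i in range(len(X))) for j in range(p)]  # X^T y
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for k in range(col, p):
                A[r][k] -= f * A[col][k]
            b[r] -= f * b[col]
    beta = [0.0] * p
    for r in range(p - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][k] * beta[k] for k in range(r + 1, p))) / A[r][r]
    return beta

# Simulated trial: treatment A_, mediator M_ = 0.5*A_ + noise,
# outcome Y_ = 1.0*A_ + 2.0*M_ + noise (true NDE = 1.0, NIE = 0.5*2.0 = 1.0).
random.seed(2)
A_ = [random.randint(0, 1) for _ in range(5000)]
M_ = [0.5 * a + random.gauss(0, 1) for a in A_]
Y_ = [1.0 * a + 2.0 * m + random.gauss(0, 1) for a, m in zip(A_, M_)]

a_coef = ols([[1, a] for a in A_], M_)                  # mediator model
y_coef = ols([[1, a, m] for a, m in zip(A_, M_)], Y_)   # outcome model
nde = y_coef[1]               # natural direct effect
nie = a_coef[1] * y_coef[2]   # natural indirect effect (product of coefficients)
print(round(nde, 2), round(nie, 2))
```

    Under sequential ignorability and correctly specified linear models, both quantities are consistent for the causal estimands; the paper's contribution is precisely to relax these parametric restrictions.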

  2. Variance Components

    CERN Document Server

    Searle, Shayle R; McCulloch, Charles E

    1992-01-01

    WILEY-INTERSCIENCE PAPERBACK SERIES. The Wiley-Interscience Paperback Series consists of selected books that have been made more accessible to consumers in an effort to increase global appeal and general circulation. With these new unabridged softcover volumes, Wiley hopes to extend the lives of these works by making them available to future generations of statisticians, mathematicians, and scientists. ". . .Variance Components is an excellent book. It is organized and well written, and provides many references to a variety of topics. I recommend it to anyone with interest in linear models.".

  3. Quantum Events and Causal Reductionism (Eventos Quânticos e Reducionismo Causal)

    Directory of Open Access Journals (Sweden)

    Osvaldo Pessoa Jr.

    2013-09-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2013v17n3p365 This paper is the first step in an investigation of whether microscopic events can be reduced to a mereological composition of elementary events, especially in biological systems. The hypothesis is made that, between events in which quanta are exchanged, there is causal flow, but strictly speaking no events take place. A causal event is characterized by the possibility of an intervention or manipulation. Thus, three types of quantum mechanical events may be found: (1) detection of a quantum of energy; (2) confinement by an apparatus in a Glauber coherent state; (3) null-result measurement (without exchange of quanta). The paper explores these three types of elementary causal events, and sets forth, as the next step, the investigation of the causal events involved in the action of a molecular motor.

  4. Is Host-Based Anomaly Detection + Temporal Correlation = Worm Causality?

    National Research Council Canada - National Science Library

    Sekar, Vyas; Xie, Yinglian; Reiter, Michael K; Zhang, Hui

    2007-01-01

    Epidemic-spreading attacks (e.g., worm and botnet propagation) have a natural notion of attack causality - a single network flow causes a victim host to get infected and subsequently spread the attack...

  5. Quantifying 'causality' in complex systems: understanding transfer entropy.

    Directory of Open Access Journals (Sweden)

    Fatimah Abdul Razak

    Full Text Available 'Causal' direction is of great importance when dealing with complex systems. Often big volumes of data in the form of time series are available, and it is important to develop methods that can inform about possible causal connections between the different observables. Here we investigate the ability of the Transfer Entropy measure to identify causal relations embedded in emergent coherent correlations. We do this first by applying Transfer Entropy to an amended Ising model. In addition we use a simple Random Transition model to test the reliability of Transfer Entropy as a measure of 'causal' direction in the presence of stochastic fluctuations. In particular we systematically study the effect of the finite size of data sets.
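
    For discrete sequences, transfer entropy can be estimated with a simple plug-in (histogram) estimator. Below is a minimal stdlib-Python sketch with lag-1 histories; the toy data is hypothetical and not taken from the paper's Ising or Random Transition models.

```python
import math
import random
from collections import Counter

def transfer_entropy(source, target):
    """Plug-in estimate of the transfer entropy T_{source -> target} in bits,
    using lag-1 histories: sum over states of
    p(x_{t+1}, x_t, y_t) * log2[ p(x_{t+1}|x_t, y_t) / p(x_{t+1}|x_t) ]."""
    n = len(target) - 1
    triples = Counter()   # (x_next, x_prev, y_prev)
    pairs_xy = Counter()  # (x_prev, y_prev)
    pairs_xx = Counter()  # (x_next, x_prev)
    singles = Counter()   # x_prev
    for t in range(n):
        x_prev, y_prev, x_next = target[t], source[t], target[t + 1]
        triples[(x_next, x_prev, y_prev)] += 1
        pairs_xy[(x_prev, y_prev)] += 1
        pairs_xx[(x_next, x_prev)] += 1
        singles[x_prev] += 1
    te = 0.0
    for (x_next, x_prev, y_prev), c in triples.items():
        p_joint = c / n
        p_cond_xy = c / pairs_xy[(x_prev, y_prev)]
        p_cond_x = pairs_xx[(x_next, x_prev)] / singles[x_prev]
        te += p_joint * math.log2(p_cond_xy / p_cond_x)
    return te

# Toy check: the target copies the source with a one-step delay, so
# information flows from source to target but not the other way around.
random.seed(0)
y = [random.randint(0, 1) for _ in range(5000)]
x = [0] + y[:-1]                     # x_{t+1} = y_t
te_forward = transfer_entropy(y, x)  # should be close to 1 bit
te_backward = transfer_entropy(x, y) # should be close to 0 bits
print(te_forward, te_backward)
```

    As the abstract warns, plug-in estimates on finite data sets carry a positive bias, which is exactly the finite-size effect the paper studies systematically.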

  6. Mixed Causal-Noncausal Autoregressions with Strictly Exogenous Regressors

    NARCIS (Netherlands)

    Hecq, Alain; Issler, J.V.; Telg, Sean

    2017-01-01

    The mixed autoregressive causal-noncausal model (MAR) has been proposed to estimate economic relationships involving explosive roots in their autoregressive part, as they have stationary forward solutions. In previous work, possible exogenous variables in economic relationships are substituted into

  7. Defining the Locus of Developmental Differences in Children's Causal Reasoning

    Science.gov (United States)

    Siegler, Robert S.

    1975-01-01

    Five experiments were performed in the area of children's causal reasoning to validate a previously reported developmental difference, to examine the role of a possible mediating mechanism, and to test a number of competing theoretical interpretations. (GO)

  8. Management’s causal reasoning on performance and earnings management

    NARCIS (Netherlands)

    Aerts, W.A.A.; Zhang, S.

    2014-01-01

    We investigate the association between the intensity of causal reasoning on performance in a firm’s annual management commentary and its earnings management propensity. Anticipated earnings management concerns are argued to constitute a significant accountability predicament, bringing management to

  9. On Storks and Babies: Correlation, Causality and Field Experiments

    Directory of Open Access Journals (Sweden)

    Lambrecht Anja

    2016-11-01

    Full Text Available The explosion of available data has created much excitement among marketing practitioners about their ability to better understand the impact of marketing investments. Big data allows for detecting patterns, and it often seems plausible to interpret them as causal. While it is quite obvious that storks do not bring babies, marketing relationships are usually less clear. Apparent “causalities” often fail to hold up under examination. If marketers want to avoid walking into a causality trap, they need to conduct field experiments to detect true causal relationships. In the present digital environment, experiments are easier than ever to execute. However, they need to be prepared and interpreted with great care in order to deliver meaningful and genuinely causal results that help improve marketing decisions.

  10. Energy consumption and economic growth: A causality analysis for Greece

    International Nuclear Information System (INIS)

    Tsani, Stela Z.

    2010-01-01

    This paper investigates the causal relationship between aggregated and disaggregated levels of energy consumption and economic growth for Greece over the period 1960-2006, applying the time-series procedure proposed by Toda and Yamamoto (1995). At the aggregated level, empirical findings suggest a uni-directional causal relationship running from total energy consumption to real GDP. At disaggregated levels, the evidence suggests a bi-directional causal relationship between industrial and residential energy consumption and real GDP; this is not the case for transport energy consumption, for which no causal relationship is identified in either direction. The importance of these findings lies in their policy implications: to address energy import dependence and environmental concerns without hindering economic growth, emphasis should be put on the demand side and on energy efficiency improvements.

  11. First report of Chryseobacterium indologenes as causal agent for ...

    African Journals Online (AJOL)

    First report of Chryseobacterium indologenes as causal agent for crown rot of papaya (Carica papaya L.) in peninsular Malaysia. B.N.M. Din, J Kadir, M.S. Hailmi, K Sijam, N.A. Badaluddin, Z Suhaili ...

  12. QED representation for the net of causal loops

    Science.gov (United States)

    Ciolli, Fabio; Ruzzi, Giuseppe; Vasselli, Ezio

    2015-06-01

    The present work tackles the existence of local gauge symmetries in the setting of Algebraic Quantum Field Theory (AQFT). The net of causal loops, previously introduced by the authors, is a model-independent construction of a covariant net of local C*-algebras on any 4-dimensional globally hyperbolic space-time, aimed at capturing structural properties of any reasonable quantum gauge theory. Representations of this net can be described by causal and covariant connection systems, and local gauge transformations arise as maps between equivalent connection systems. The present paper completes these abstract results by realizing QED as a representation of the net of causal loops in Minkowski space-time. More precisely, we map the quantum electromagnetic field Fμν, not free in general, into a representation of the net of causal loops and show that the corresponding connection system and the local gauge transformations find a counterpart in terms of Fμν.

  13. Mendelian Randomization versus Path Models: Making Causal Inferences in Genetic Epidemiology.

    Science.gov (United States)

    Ziegler, Andreas; Mwambi, Henry; König, Inke R

    2015-01-01

    The term Mendelian randomization is popular in the current literature. The first aim of this work is to describe the idea of Mendelian randomization studies and the assumptions required for drawing valid conclusions. The second aim is to contrast Mendelian randomization and path modeling when different 'omics' levels are considered jointly. We define Mendelian randomization as introduced by Katan in 1986, and review its crucial assumptions. We introduce path models as the relevant additional component to the current use of Mendelian randomization studies in 'omics'. Real data examples for the association between lipid levels and coronary artery disease illustrate the use of path models. Numerous assumptions underlie Mendelian randomization, and they are difficult to fulfill in applications. Path models are suitable for investigating causality, and they should not be mixed up with the term Mendelian randomization. In many applications, path modeling would be the appropriate analysis in addition to a simple Mendelian randomization analysis. Mendelian randomization and path models use different concepts for causal inference. Path modeling, but not simple Mendelian randomization analysis, is well suited to study causality with different levels of 'omics' data. © 2015 S. Karger AG, Basel.
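
    The core Mendelian randomization idea, using a genotype as an instrument to bypass confounding of the exposure-outcome relation, can be illustrated with the simple ratio (Wald) estimator. The stdlib-Python sketch below uses simulated data with hypothetical coefficients and variable names, not the lipid data analyzed in the paper.

```python
import random

def cov(a, b):
    """Population covariance of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / n

# Simulated data: an unobserved confounder u distorts the naive regression,
# but the genotype g affects the outcome only through the exposure (the
# core Mendelian randomization assumption), so the ratio estimator
# recovers the true effect (0.8 here).
random.seed(3)
n = 20000
g = [random.randint(0, 2) for _ in range(n)]   # instrument (genotype, 0/1/2)
u = [random.gauss(0, 1) for _ in range(n)]     # unobserved confounder
x = [0.5 * gi + ui + random.gauss(0, 1) for gi, ui in zip(g, u)]        # exposure
y = [0.8 * xi + 1.5 * ui + random.gauss(0, 1) for xi, ui in zip(x, u)]  # outcome

est = cov(g, y) / cov(g, x)    # Wald (ratio) IV estimator
naive = cov(x, y) / cov(x, x)  # confounded ordinary regression slope
print(est, naive)              # est near 0.8; naive biased upward
```

    The contrast between `est` and `naive` shows why the instrument matters; the paper's point is that the validity of `est` still hinges on assumptions (no pleiotropy, no instrument-confounder association) that are difficult to verify.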

  14. Daily Grind: A Comparison of Causality Orientations, Emotions, and Fantasy Sport Participation.

    Science.gov (United States)

    Dwyer, Brendan; Weiner, James

    2018-03-01

    In 2015, daily fantasy football entered the fantasy sports market as an offshoot of the traditional, season-long form of the game. With quicker payouts and less commitment, the new activity has drawn comparisons to other forms of illegal gambling, and the determination of whether it is primarily a game of skill or of chance has become the center of the comparison. For the most part, legal commentators and society in general view traditional, season-long fantasy football as an innocuous, social activity governed equally by both skill and chance. Little evidence exists, however, about participant perception of skill and chance components in daily fantasy football. The current study surveyed 535 daily and traditional-only fantasy football participants in order to understand differences and similarities in the causality orientations of participation (skill or chance). In addition, enjoyment and anxiety were tested for mediating effects on causality orientations and consumption behavior. The results suggest the differences between the activities are not extreme. However, differences were found in which causality orientations influenced enjoyment and which emotion mediated the relationship between perceived skill and consumption.

  15. mixsmsn: Fitting Finite Mixture of Scale Mixture of Skew-Normal Distributions

    Directory of Open Access Journals (Sweden)

    Marcos Oliveira Prates

    2013-09-01

    Full Text Available We present the R package mixsmsn, which implements routines for maximum likelihood estimation (via an expectation-maximization (EM)-type algorithm) in finite mixture models with components belonging to the class of scale mixtures of the skew-normal distribution, which we call the FMSMSN models. Both univariate and multivariate responses are considered. It is possible to fix the number of components of the mixture to be fitted, but there exists an option that transfers this responsibility to an automated procedure, through the analysis of several model choice criteria. Plotting routines to generate histograms, plug-in densities, and contour plots using the fitted models' output are also available. The precision of the EM estimates can be evaluated through their estimated standard deviations, which can be obtained by the provision of an approximation of the associated information matrix for each particular model in the FMSMSN family. A function to generate artificial samples from several elements of the family is also supplied. Finally, two real data sets are analyzed in order to show the usefulness of the package.
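
    mixsmsn itself is an R package; as a language-neutral illustration of the EM-type loop that such mixture fitting rests on, here is a stdlib-Python sketch for the plain two-component univariate normal mixture, the symmetric special case of the skew-normal family. The initialisation strategy and the simulated data are hypothetical, not taken from the package.

```python
import math
import random

def em_two_normals(data, iters=200):
    """EM for a two-component univariate Gaussian mixture.
    Returns (weights, means, standard deviations)."""
    data = list(data)
    # Crude initialisation: split the sorted sample at the median.
    ds = sorted(data)
    half = len(ds) // 2
    mu = [sum(ds[:half]) / half, sum(ds[half:]) / (len(ds) - half)]
    sd = [max(1e-3, (max(data) - min(data)) / 4.0)] * 2
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for xv in data:
            d = [w[k] / (sd[k] * math.sqrt(2 * math.pi)) *
                 math.exp(-0.5 * ((xv - mu[k]) / sd[k]) ** 2) for k in range(2)]
            s = d[0] + d[1]
            resp.append([d[0] / s, d[1] / s])
        # M-step: weighted updates of weights, means, and variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * xv for r, xv in zip(resp, data)) / nk
            var = sum(r[k] * (xv - mu[k]) ** 2 for r, xv in zip(resp, data)) / nk
            sd[k] = math.sqrt(max(var, 1e-6))
    return w, mu, sd

random.seed(1)
sample = ([random.gauss(0, 1) for _ in range(1000)] +
          [random.gauss(5, 1) for _ in range(1000)])
w, mu, sd = em_two_normals(sample)
print(sorted(mu))  # component means roughly 0 and 5
```

    The skew-normal and scale-mixture extensions in FMSMSN change the E-step densities and M-step updates, but the alternating structure above is the same.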

  16. Deconstructing events: the neural bases for space, time, and causality.

    Science.gov (United States)

    Kranjec, Alexander; Cardillo, Eileen R; Schmidt, Gwenda L; Lehet, Matthew; Chatterjee, Anjan

    2012-01-01

    Space, time, and causality provide a natural structure for organizing our experience. These abstract categories allow us to think relationally in the most basic sense; understanding simple events requires one to represent the spatial relations among objects, the relative durations of actions or movements, and the links between causes and effects. The present fMRI study investigates the extent to which the brain distinguishes between these fundamental conceptual domains. Participants performed a 1-back task with three conditions of interest (space, time, and causality). Each condition required comparing relations between events in a simple verbal narrative. Depending on the condition, participants were instructed to either attend to the spatial, temporal, or causal characteristics of events, but between participants each particular event relation appeared in all three conditions. Contrasts compared neural activity during each condition against the remaining two and revealed how thinking about events is deconstructed neurally. Space trials recruited neural areas traditionally associated with visuospatial processing, primarily bilateral frontal and occipitoparietal networks. Causality trials activated areas previously found to underlie causal thinking and thematic role assignment, such as left medial frontal and left middle temporal gyri, respectively. Causality trials also produced activations in SMA, caudate, and cerebellum; cortical and subcortical regions associated with the perception of time at different timescales. The time contrast, however, produced no significant effects. This pattern, indicating negative results for time trials but positive effects for causality trials in areas important for time perception, motivated additional overlap analyses to further probe relations between domains. The results of these analyses suggest a closer correspondence between time and causality than between time and space.

  17. Informational and Causal Architecture of Continuous-time Renewal Processes

    Science.gov (United States)

    Marzen, Sarah; Crutchfield, James P.

    2017-07-01

    We introduce the minimal maximally predictive models (ε-machines) of processes generated by certain hidden semi-Markov models. Their causal states are either discrete, mixed, or continuous random variables, and causal-state transitions are described by partial differential equations. As an application, we present a complete analysis of the ε-machines of continuous-time renewal processes. This leads to closed-form expressions for their entropy rate, statistical complexity, excess entropy, and differential information anatomy rates.

  18. Newton's Law of Universal Gravitation and Hume's Conception of Causality

    OpenAIRE

    Slavov, Matias

    2013-01-01

    This article investigates the relationship between Hume’s causal philosophy and Newton’s philosophy of nature. I claim that Newton’s experimentalist methodology in gravity research is an important background for understanding Hume’s conception of causality: Hume sees the relation of cause and effect as not being founded on a priori reasoning, similar to the way that Newton criticized non-empirical hypotheses about the properties of gravity. However, according to Hume’s criteria of...

  19. Self Occlusion and Disocclusion in Causal Video Object Segmentation

    Science.gov (United States)

    2015-12-18

    Self-Occlusion and Disocclusion in Causal Video Object Segmentation. Yanchao Yang, Ganesh Sundaramoorthi, and Stefano Soatto. The related literature spans video segmentation, tracking, optical flow, and motion segmentation; other motion segmentation approaches perform clustering of optical flow, often non-causally, whereas the goal here is to segment objects without over-segmenting them.

  20. Ultra-Wideband Electromagnetic Pulse Propagation through Causal Media

    Science.gov (United States)

    2016-03-04

    AFRL-AFOSR-VA-TR-2016-0112: Ultra-Wideband Electromagnetic Pulse Propagation through Causal Media. Natalie Cartwright, Research Foundation of the State... Grant FA9550-13-1-0013, program element 61102F. Abstract: When an electromagnetic pulse travels through a dispersive material, each frequency of the transmitted pulse changes in both