Causal inference, probability theory, and graphical insights.
Baker, Stuart G
2013-11-10
Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
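The reversal that Baker's BK-Plot visualises can be reproduced with elementary probability alone. The sketch below uses the classic kidney-stone numbers (invented for illustration, not data from the article): treatment A is better within both strata of a binary confounder, yet looks worse after pooling.

```python
# Simpson's paradox with a binary confounder (severity), illustrative counts.

def rate(successes, total):
    return successes / total

# stratified success counts: confounder U in {"mild", "severe"}
data = {
    "A": {"mild": (81, 87),  "severe": (192, 263)},
    "B": {"mild": (234, 270), "severe": (55, 80)},
}

for u in ("mild", "severe"):
    ra, rb = rate(*data["A"][u]), rate(*data["B"][u])
    print(f"{u}: A={ra:.2f}  B={rb:.2f}  -> A better: {ra > rb}")

# pooled over U, the comparison reverses
pool = lambda t: rate(sum(s for s, _ in data[t].values()),
                      sum(n for _, n in data[t].values()))
print(f"pooled: A={pool('A'):.2f}  B={pool('B'):.2f}  -> A better: {pool('A') > pool('B')}")
```

The reversal arises because treatment A is given disproportionately often in the severe stratum, where success rates are lower for everyone.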
DEFF Research Database (Denmark)
Kreiner, Svend; Christensen, Karl Bang
Rasch models; Partial Credit models; Rating Scale models; Item bias; Differential item functioning; Local independence; Graphical models
DEFF Research Database (Denmark)
Jensen, Finn Verner; Nielsen, Thomas Dyhre
2016-01-01
Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical models are Bayesian networks. The structural part of a Bayesian graphical model is a graph consisting of nodes...
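The compactness claim can be made concrete with a toy chain network. The probability tables below are invented for illustration; the point is only that the factorised product P(A)P(B|A)P(C|B) is itself a valid joint distribution.

```python
# A three-node Bayesian network A -> B -> C: the joint distribution
# factorises as P(A, B, C) = P(A) * P(B|A) * P(C|B).
from itertools import product

P_A = {0: 0.6, 1: 0.4}
P_B_given_A = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}  # P_B_given_A[a][b]
P_C_given_B = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}  # P_C_given_B[b][c]

def joint(a, b, c):
    return P_A[a] * P_B_given_A[a][b] * P_C_given_B[b][c]

# the factorised joint still sums to 1 over all configurations
total = sum(joint(a, b, c) for a, b, c in product((0, 1), repeat=3))
print(round(total, 10))  # 1.0
```

The compactness is what makes large networks tractable: the three tables above hold 10 numbers, whereas an unstructured joint over three binary variables would need 7 free parameters, and the gap grows exponentially with the number of nodes.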
Causally nonseparable processes admitting a causal model
International Nuclear Information System (INIS)
Feix, Adrien; Araújo, Mateus; Brukner, Caslav
2016-01-01
A recent framework of quantum theory with no global causal order predicts the existence of ‘causally nonseparable’ processes. Some of these processes produce correlations incompatible with any causal order (they violate so-called ‘causal inequalities’ analogous to Bell inequalities) while others do not (they admit a ‘causal model’ analogous to a local model). Here we show for the first time that bipartite causally nonseparable processes with a causal model exist, and give evidence that they have no clear physical interpretation. We also provide an algorithm to generate processes of this kind and show that they have nonzero measure in the set of all processes. We demonstrate the existence of processes which stop violating causal inequalities but are still causally nonseparable when mixed with a certain amount of ‘white noise’. This is reminiscent of the behavior of Werner states in the context of entanglement and nonlocality. Finally, we provide numerical evidence for the existence of causally nonseparable processes which have a causal model even when extended with an entangled state shared among the parties. (paper)
DEFF Research Database (Denmark)
Højsgaard, Søren; Edwards, David; Lauritzen, Steffen
Graphical models in their modern form have been around since the late 1970s and appear today in many areas of the sciences. Along with the ongoing developments of graphical models, a number of different graphical modeling software programs have been written over the years. In recent years many of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In addition, the book provides examples of how more advanced aspects of graphical modeling can be represented and handled within R. Topics covered in the seven chapters include graphical models for contingency tables, Gaussian and mixed graphical models, Bayesian networks and modeling high dimensional data.
Højsgaard, Søren; Lauritzen, Steffen
2012-01-01
Graphical models in their modern form have been around since the late 1970s and appear today in many areas of the sciences. Along with the ongoing developments of graphical models, a number of different graphical modeling software programs have been written over the years. In recent years many of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In addition, the book provides examples of how more advanced aspects of graphical modeling can be represented and handled within R.
CAUSAL INFERENCE WITH A GRAPHICAL HIERARCHY OF INTERVENTIONS.
Shpitser, Ilya; Tchetgen, Eric Tchetgen
2016-12-01
Identifying causal parameters from observational data is fraught with subtleties due to the issues of selection bias and confounding. In addition, more complex questions of interest, such as effects of treatment on the treated and mediated effects, may not always be identified even in data where treatment assignment is known and under investigator control, or may be identified under one causal model but not another. Increasingly complex effects of interest, coupled with a diversity of causal models in use, have resulted in a fragmented view of identification. This fragmentation makes it unnecessarily difficult to determine if a given parameter is identified (and in what model), and what assumptions must hold for this to be the case. This, in turn, complicates the development of estimation theory and sensitivity analysis procedures. In this paper, we give a unifying view of a large class of causal effects of interest, including novel effects not previously considered, in terms of a hierarchy of interventions, and show that identification theory for this large class reduces to an identification theory of random variables under interventions from this hierarchy. Moreover, we show that one type of intervention in the hierarchy is naturally associated with queries identified under the Finest Fully Randomized Causally Interpretable Structure Tree Graph (FFRCISTG) model of Robins (via the extended g-formula), and another is naturally associated with queries identified under the Non-Parametric Structural Equation Model with Independent Errors (NPSEM-IE) of Pearl, via a more general functional we call the edge g-formula. Our results motivate the study of estimation theory for the edge g-formula, since we show it arises both in mediation analysis, and in settings where treatment assignment has unobserved causes, such as models associated with Pearl's front-door criterion.
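The edge g-formula itself is beyond a short sketch, but the ordinary point-treatment g-formula it generalises is easy to state and compute: standardise the conditional outcome mean over the marginal covariate distribution, E[Y(a)] = Σ_l E[Y | A=a, L=l] P(L=l). The covariate distribution and outcome means below are invented for illustration.

```python
# Point-treatment g-formula (standardisation) with made-up numbers.

P_L = {0: 0.5, 1: 0.5}                   # marginal distribution of confounder L
E_Y = {(0, 0): 0.1, (0, 1): 0.5,          # E[Y | A=a, L=l]
       (1, 0): 0.3, (1, 1): 0.7}

def g_formula(a):
    """Counterfactual mean E[Y(a)] under conditional exchangeability given L."""
    return sum(E_Y[(a, l)] * P_L[l] for l in P_L)

effect = g_formula(1) - g_formula(0)     # average causal effect
print(round(effect, 3))  # 0.2
```

Note this identification only holds under the usual assumptions (consistency, positivity, and no unmeasured confounding given L); the paper's contribution is precisely to organise when such formulas, and their extensions, are valid.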
Directory of Open Access Journals (Sweden)
A. Jackson Stenner
2013-08-01
Rasch's unidimensional models for measurement show how to connect object measures (e.g., reader abilities), measurement mechanisms (e.g., machine-generated cloze reading items), and observational outcomes (e.g., counts correct on reading instruments). Substantive theory shows what interventions or manipulations to the measurement mechanism can be traded off against a change to the object measure to hold the observed outcome constant. A Rasch model integrated with a substantive theory dictates the form and substance of permissible interventions. Rasch analysis, absent construct theory and an associated specification equation, is a black box in which understanding may be more illusory than not. Finally, the quantitative hypothesis can be tested by comparing theory-based trade-off relations with observed trade-off relations. Only quantitative variables (as measured) support such trade-offs. Note that to test the quantitative hypothesis requires more than manipulation of the algebraic equivalencies in the Rasch model or descriptively fitting data to the model. A causal Rasch model involves experimental intervention/manipulation on either reader ability or text complexity or a conjoint intervention on both simultaneously to yield a successful prediction of the resultant observed outcome (count correct). We conjecture that when this type of manipulation is introduced for individual reader text encounters and model predictions are consistent with observations, the quantitative hypothesis is sustained.
Stenner, A Jackson; Fisher, William P; Stone, Mark H; Burdick, Donald S
2013-01-01
Rasch's unidimensional models for measurement show how to connect object measures (e.g., reader abilities), measurement mechanisms (e.g., machine-generated cloze reading items), and observational outcomes (e.g., counts correct on reading instruments). Substantive theory shows what interventions or manipulations to the measurement mechanism can be traded off against a change to the object measure to hold the observed outcome constant. A Rasch model integrated with a substantive theory dictates the form and substance of permissible interventions. Rasch analysis, absent construct theory and an associated specification equation, is a black box in which understanding may be more illusory than not. Finally, the quantitative hypothesis can be tested by comparing theory-based trade-off relations with observed trade-off relations. Only quantitative variables (as measured) support such trade-offs. Note that to test the quantitative hypothesis requires more than manipulation of the algebraic equivalencies in the Rasch model or descriptively fitting data to the model. A causal Rasch model involves experimental intervention/manipulation on either reader ability or text complexity or a conjoint intervention on both simultaneously to yield a successful prediction of the resultant observed outcome (count correct). We conjecture that when this type of manipulation is introduced for individual reader text encounters and model predictions are consistent with observations, the quantitative hypothesis is sustained.
Stenner, A. Jackson; Fisher, William P.; Stone, Mark H.; Burdick, Donald S.
2013-01-01
Rasch's unidimensional models for measurement show how to connect object measures (e.g., reader abilities), measurement mechanisms (e.g., machine-generated cloze reading items), and observational outcomes (e.g., counts correct on reading instruments). Substantive theory shows what interventions or manipulations to the measurement mechanism can be traded off against a change to the object measure to hold the observed outcome constant. A Rasch model integrated with a substantive theory dictates the form and substance of permissible interventions. Rasch analysis, absent construct theory and an associated specification equation, is a black box in which understanding may be more illusory than not. Finally, the quantitative hypothesis can be tested by comparing theory-based trade-off relations with observed trade-off relations. Only quantitative variables (as measured) support such trade-offs. Note that to test the quantitative hypothesis requires more than manipulation of the algebraic equivalencies in the Rasch model or descriptively fitting data to the model. A causal Rasch model involves experimental intervention/manipulation on either reader ability or text complexity or a conjoint intervention on both simultaneously to yield a successful prediction of the resultant observed outcome (count correct). We conjecture that when this type of manipulation is introduced for individual reader text encounters and model predictions are consistent with observations, the quantitative hypothesis is sustained. PMID:23986726
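For readers unfamiliar with the model these abstracts build on, a minimal sketch of the dichotomous Rasch model: the probability of a correct response depends only on the difference between person ability θ and item difficulty b, which is exactly why equal-sized interventions on ability and difficulty can be traded off to hold the predicted outcome constant. The values below are illustrative.

```python
# Dichotomous Rasch model: P(X = 1 | theta, b) = exp(theta - b) / (1 + exp(theta - b)).
import math

def rasch(theta, b):
    """Probability of a correct response for ability theta, difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# trade-off property: raising ability and difficulty by the same amount
# leaves the predicted outcome unchanged, because only theta - b matters
print(round(rasch(1.0, 0.5), 4))
print(round(rasch(2.0, 1.5), 4))  # identical to the line above
```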
Graphical models for genetic analyses
DEFF Research Database (Denmark)
Lauritzen, Steffen Lilholt; Sheehan, Nuala A.
2003-01-01
This paper introduces graphical models as a natural environment in which to formulate and solve problems in genetics and related areas. Particular emphasis is given to the relationships among various local computation algorithms which have been developed within the hitherto mostly separate areas of graphical models and genetics. The potential of graphical models is explored and illustrated through a number of example applications where the genetic element is substantial or dominating.
Causal reasoning with mental models
Khemlani, Sangeet S.; Barbey, Aron K.; Johnson-Laird, Philip N.
2014-01-01
This paper outlines the model-based theory of causal reasoning. It postulates that the core meanings of causal assertions are deterministic and refer to temporally-ordered sets of possibilities: A causes B to occur means that given A, B occurs, whereas A enables B to occur means that given A, it is possible for B to occur. The paper shows how mental models represent such assertions, and how these models underlie deductive, inductive, and abductive reasoning yielding explanations. It reviews evidence both to corroborate the theory and to account for phenomena sometimes taken to be incompatible with it. Finally, it reviews neuroscience evidence indicating that mental models for causal inference are implemented within lateral prefrontal cortex. PMID:25389398
Causal reasoning with mental models.
Khemlani, Sangeet S; Barbey, Aron K; Johnson-Laird, Philip N
2014-01-01
This paper outlines the model-based theory of causal reasoning. It postulates that the core meanings of causal assertions are deterministic and refer to temporally-ordered sets of possibilities: A causes B to occur means that given A, B occurs, whereas A enables B to occur means that given A, it is possible for B to occur. The paper shows how mental models represent such assertions, and how these models underlie deductive, inductive, and abductive reasoning yielding explanations. It reviews evidence both to corroborate the theory and to account for phenomena sometimes taken to be incompatible with it. Finally, it reviews neuroscience evidence indicating that mental models for causal inference are implemented within lateral prefrontal cortex.
Causal reasoning with mental models
Directory of Open Access Journals (Sweden)
Sangeet eKhemlani
2014-10-01
This paper outlines the model-based theory of causal reasoning. It postulates that the core meanings of causal assertions are deterministic and refer to temporally-ordered sets of possibilities: A causes B to occur means that given A, B occurs, whereas A enables B to occur means that given A, it is possible for B to occur. The paper shows how mental models represent such assertions, and how these models underlie deductive, inductive, and abductive reasoning yielding explanations. It reviews evidence both to corroborate the theory and to account for phenomena sometimes taken to be incompatible with it. Finally, it reviews neuroscience evidence indicating that mental models for causal inference are implemented within lateral prefrontal cortex.
Transforming Graphical System Models to Graphical Attack Models
DEFF Research Database (Denmark)
Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, Rene Rydhof
2016-01-01
Manually identifying possible attacks on an organisation is a complex undertaking; many different factors must be considered, and the resulting attack scenarios can be complex and hard to maintain as the organisation changes. System models provide a systematic representation of organisations. This work presents an approach to transforming graphical system models to graphical attack models in the form of attack trees. Based on an asset in the model, our transformations result in an attack tree that represents attacks by all possible actors in the model, after which the actor in question has obtained the asset.
Causal Reasoning with Mental Models
2014-08-08
The initial rubric is equivalent to an exclusive disjunction between the two causal assertions. It yields the following two mental models: … are important, whereas the functions of artifacts are important (Ahn, 1998). A genetic code is accordingly more critical to being a goat than …
Modeling of causality with metamaterials
International Nuclear Information System (INIS)
Smolyaninov, Igor I
2013-01-01
Hyperbolic metamaterials may be used to model a 2 + 1-dimensional Minkowski space–time in which the role of time is played by one of the spatial coordinates. When a metamaterial is built and illuminated with a coherent extraordinary laser beam, the stationary pattern of light propagation inside the metamaterial may be treated as a collection of particle world lines, which represents a complete ‘history’ of this 2 + 1-dimensional space–time. While this model may be used to build interesting space–time analogs, such as metamaterial ‘black holes’ and a metamaterial ‘big bang’, it lacks causality: since light inside the metamaterial may propagate back and forth along the ‘timelike’ spatial coordinate, events in the ‘future’ may affect events in the ‘past’. Here we demonstrate that a more sophisticated metamaterial model may fix this deficiency via breaking the mirror and temporal (PT) symmetries of the original model and producing one-way propagation along the ‘timelike’ spatial coordinate. The resulting 2 + 1-dimensional Minkowski space–time appears to be causal. This scenario may be considered as a metamaterial model of the Wheeler–Feynman absorber theory of causality. (paper)
Modeling chemical kinetics graphically
Heck, A.
2012-01-01
In literature on chemistry education it has often been suggested that students, at high school level and beyond, can benefit in their studies of chemical kinetics from computer supported activities. Use of system dynamics modeling software is one of the suggested quantitative approaches that could
Learning Graphical Models With Hubs.
Tan, Kean Ming; London, Palma; Mohan, Karthik; Lee, Su-In; Fazel, Maryam; Witten, Daniela
2014-10-01
We consider the problem of learning a high-dimensional graphical model in which there are a few hub nodes that are densely-connected to many other nodes. Many authors have studied the use of an ℓ1 penalty in order to learn a sparse graph in the high-dimensional setting. However, the ℓ1 penalty implicitly assumes that each edge is equally likely and independent of all other edges. We propose a general framework to accommodate more realistic networks with hub nodes, using a convex formulation that involves a row-column overlap norm penalty. We apply this general framework to three widely-used probabilistic graphical models: the Gaussian graphical model, the covariance graph model, and the binary Ising model. An alternating direction method of multipliers algorithm is used to solve the corresponding convex optimization problems. On synthetic data, we demonstrate that our proposed framework outperforms competitors that do not explicitly model hub nodes. We illustrate our proposal on a webpage data set and a gene expression data set.
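A schematic sketch of the penalty idea (the exact row-column overlap norm in the paper differs in details such as diagonal handling and tuning constants): decompose the parameter matrix as Θ = Z + V + Vᵀ, price the sparse part Z edge-by-edge with an ℓ1 norm, and price the hub part V column-by-column with a group ℓ2 norm, so that a few dense columns (hubs) become cheap.

```python
# Schematic hub penalty: l1 on the sparse part, column-wise group l2 on the
# hub part. Tuning constants and the example matrices are illustrative only.
import numpy as np

def hub_penalty(Z, V, lam1=1.0, lam2=1.0, lam3=1.0):
    off = ~np.eye(Z.shape[0], dtype=bool)              # ignore diagonal entries
    return (lam1 * np.abs(Z[off]).sum()                # sparse edges, priced individually
            + lam2 * np.abs(V[off]).sum()              # sparsity within the hub part
            + lam3 * np.linalg.norm(V, axis=0).sum())  # column-wise l2: dense hub columns

# a dense column in V (a hub node) is priced as a group by the l2 term,
# rather than edge-by-edge as a plain l1 penalty would price it
V_hub = np.zeros((4, 4))
V_hub[:, 0] = 0.5                                      # node 0 connects to everyone
print(hub_penalty(np.zeros((4, 4)), V_hub))
```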
Linear causal modeling with structural equations
Mulaik, Stanley A
2009-01-01
Emphasizing causation as a functional relationship between variables that describe objects, Linear Causal Modeling with Structural Equations integrates a general philosophical theory of causation with structural equation modeling (SEM) that concerns the special case of linear causal relations. In addition to describing how the functional relation concept may be generalized to treat probabilistic causation, the book reviews historical treatments of causation and explores recent developments in experimental psychology on studies of the perception of causation. It looks at how to perceive causal
A Causal Model of Faculty Research Productivity.
Bean, John P.
A causal model of faculty research productivity was developed through a survey of the literature. Models of organizational behavior, organizational effectiveness, and motivation were synthesized into a causal model of productivity. Two general types of variables were assumed to affect individual research productivity: institutional variables and…
Pearl, Judea
2000-03-01
Written by one of the pre-eminent researchers in the field, this book provides a comprehensive exposition of modern analysis of causation. It shows how causality has grown from a nebulous concept into a mathematical theory with significant applications in the fields of statistics, artificial intelligence, philosophy, cognitive science, and the health and social sciences. Pearl presents a unified account of the probabilistic, manipulative, counterfactual and structural approaches to causation, and devises simple mathematical tools for analyzing the relationships between causal connections, statistical associations, actions and observations. The book will open the way for including causal analysis in the standard curriculum of statistics, artificial intelligence, business, epidemiology, social science and economics. Students in these areas will find natural models, simple identification procedures, and precise mathematical definitions of causal concepts that traditional texts have tended to evade or make unduly complicated. This book will be of interest to professionals and students in a wide variety of fields. Anyone who wishes to elucidate meaningful relationships from data, predict effects of actions and policies, assess explanations of reported events, or form theories of causal understanding and causal speech will find this book stimulating and invaluable.
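One of the identification results the book is known for fits in a few lines. Under the front-door criterion (X affects Y only through a mediator M, and the unobserved X-Y confounder does not touch M), the interventional distribution is identified as P(y | do(x)) = Σ_m P(m | x) Σ_x' P(y | m, x') P(x'). All probability tables below are invented for illustration.

```python
# Front-door adjustment: identify P(Y=1 | do(X=x)) from observational tables,
# despite an unobserved confounder of X and Y, via the mediator M.

P_X = {0: 0.5, 1: 0.5}                                  # marginal of X
P_M_given_X = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}  # P(M=m | X=x)
P_Y_given_MX = {(0, 0): 0.1, (0, 1): 0.3,
                (1, 0): 0.5, (1, 1): 0.7}               # P(Y=1 | M=m, X=x)

def p_y_do_x(x):
    return sum(P_M_given_X[x][m]
               * sum(P_Y_given_MX[(m, xp)] * P_X[xp] for xp in P_X)
               for m in (0, 1))

print(round(p_y_do_x(1) - p_y_do_x(0), 3))  # interventional contrast: 0.28
```

The inner sum re-weights the M-to-Y effect by the marginal of X, which is what severs the back-door path through the unobserved confounder.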
Causal Modelling in Evaluation Research.
Winteler, Adolf
1983-01-01
A study applied path analysis methods, using new techniques of causal analysis, to the problem of predicting the achievement, dropout rate, and satisfaction of university students. Besides providing explanations, the technique indicates possible remedial measures. (MSE)
Graphical interpretation of numerical model results
International Nuclear Information System (INIS)
Drewes, D.R.
1979-01-01
Computer software has been developed to produce high quality graphical displays of data from a numerical grid model. The code uses an existing graphical display package (DISSPLA) and overcomes some of the problems of both line-printer output and traditional graphics. The software has been designed to be flexible enough to handle arbitrarily placed computation grids and a variety of display requirements
Mastering probabilistic graphical models using Python
Ankan, Ankur
2015-01-01
If you are a researcher or a machine learning enthusiast, or are working in the data science field and have a basic idea of Bayesian learning or probabilistic graphical models, this book will help you to understand the details of graphical models and use them in your data science problems.
Rehder, Bob
2017-01-01
This article assesses how people reason with categories whose features are related in causal cycles. Whereas models based on causal graphical models (CGMs) have enjoyed success modeling category-based judgments as well as a number of other cognitive phenomena, CGMs are only able to represent causal structures that are acyclic. A number of new…
Graphical Model Debugger Framework for Embedded Systems
DEFF Research Database (Denmark)
Zeng, Kebin
2010-01-01
Model Driven Software Development has offered a faster way to design and implement embedded real-time software by moving the design to a model level, and by transforming models to code. However, the testing of embedded systems has remained at the code level. This paper presents a Graphical Model Debugger Framework, providing an auxiliary avenue of analysis of system models at runtime by executing generated code and updating models synchronously, which allows embedded developers to focus on the model level. With the model debugger, embedded developers can graphically test their design model.
Graphical modelling software in R - status
DEFF Research Database (Denmark)
Detlefsen, Claus; Højsgaard, Søren; Lauritzen, Steffen L
2007-01-01
Graphical models in their modern form have been around for nearly a quarter of a century. Various computer programs for inference in graphical models have been developed over that period. Some examples of free software programs are BUGS (Thomas 1994), CoCo (Badsberg 2001), Digram (Klein, Keiding, and Kreiner 1995), MIM (Edwards 2000), and Tetrad (Glymour, Scheines, Spirtes, and Kelley 1987). The gR initiative (Lauritzen 2002) aims at making graphical models available in R (R Development Core Team 2006). A small grant from the Danish Science Foundation supported this initiative. We will summarize the results of the initiative so far. Specifically we will illustrate some of the R packages for graphical modelling currently on CRAN and discuss their strengths and weaknesses.
Exploring Causal Models of Educational Achievement.
Parkerson, Jo Ann; And Others
1984-01-01
This article evaluates five causal model of educational productivity applied to learning science in a sample of 882 fifth through eighth graders. Each model explores the relationship between achievement and a combination of eight constructs: home environment, peer group, media, ability, social environment, time on task, motivation, and…
Intelligent diagnosis of jaundice with dynamic uncertain causality graph model
Hao, Shao-rui; Geng, Shi-chao; Fan, Lin-xiao; Chen, Jia-jia; Zhang, Qin; Li, Lan-juan
2017-01-01
Jaundice is a common and complex clinical symptom potentially occurring in hepatology, general surgery, pediatrics, infectious diseases, gynecology, and obstetrics, and it is fairly difficult to distinguish the cause of jaundice in clinical practice, especially for general practitioners in less developed regions. With collaboration between physicians and artificial intelligence engineers, a comprehensive knowledge base relevant to jaundice was created based on demographic information, symptoms, physical signs, laboratory tests, imaging diagnosis, medical histories, and risk factors. Then a diagnostic modeling and reasoning system using the dynamic uncertain causality graph was proposed. A modularized modeling scheme was presented to reduce the complexity of model construction, providing multiple perspectives and arbitrary granularity for disease causality representations. A “chaining” inference algorithm and weighted logic operation mechanism were employed to guarantee the exactness and efficiency of diagnostic reasoning under situations of incomplete and uncertain information. Moreover, the causal interactions among diseases and symptoms intuitively demonstrated the reasoning process in a graphical manner. Verification was performed using 203 randomly pooled clinical cases, and the accuracy was 99.01% and 84.73%, respectively, with or without laboratory tests in the model. The solutions were more explicable and convincing than common methods such as Bayesian Networks, further increasing the objectivity of clinical decision-making. The promising results indicated that our model could be potentially used in intelligent diagnosis and help decrease public health expenditure. PMID:28471111
Intelligent diagnosis of jaundice with dynamic uncertain causality graph model.
Hao, Shao-Rui; Geng, Shi-Chao; Fan, Lin-Xiao; Chen, Jia-Jia; Zhang, Qin; Li, Lan-Juan
2017-05-01
Jaundice is a common and complex clinical symptom potentially occurring in hepatology, general surgery, pediatrics, infectious diseases, gynecology, and obstetrics, and it is fairly difficult to distinguish the cause of jaundice in clinical practice, especially for general practitioners in less developed regions. With collaboration between physicians and artificial intelligence engineers, a comprehensive knowledge base relevant to jaundice was created based on demographic information, symptoms, physical signs, laboratory tests, imaging diagnosis, medical histories, and risk factors. Then a diagnostic modeling and reasoning system using the dynamic uncertain causality graph was proposed. A modularized modeling scheme was presented to reduce the complexity of model construction, providing multiple perspectives and arbitrary granularity for disease causality representations. A "chaining" inference algorithm and weighted logic operation mechanism were employed to guarantee the exactness and efficiency of diagnostic reasoning under situations of incomplete and uncertain information. Moreover, the causal interactions among diseases and symptoms intuitively demonstrated the reasoning process in a graphical manner. Verification was performed using 203 randomly pooled clinical cases, and the accuracy was 99.01% and 84.73%, respectively, with or without laboratory tests in the model. The solutions were more explicable and convincing than common methods such as Bayesian Networks, further increasing the objectivity of clinical decision-making. The promising results indicated that our model could be potentially used in intelligent diagnosis and help decrease public health expenditure.
Light reflection models for computer graphics.
Greenberg, D P
1989-04-14
During the past 20 years, computer graphic techniques for simulating the reflection of light have progressed so that today images of photorealistic quality can be produced. Early algorithms considered direct lighting only, but global illumination phenomena with indirect lighting, surface interreflections, and shadows can now be modeled with ray tracing, radiosity, and Monte Carlo simulations. This article describes the historical development of computer graphic algorithms for light reflection and pictorially illustrates what will be commonly available in the near future.
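The "direct lighting only" algorithms the article describes reduce to local shading formulas evaluated at each surface point. A hedged sketch of one such classic formula, Lambertian diffuse plus Phong specular, with illustrative coefficients:

```python
# Local shading at one surface point: Lambertian diffuse + Phong specular.
# Coefficients (kd, ks, shininess) and vectors are illustrative.
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def shade(normal, to_light, to_eye, kd=0.7, ks=0.3, shininess=16):
    n, l, e = normalize(normal), normalize(to_light), normalize(to_eye)
    diffuse = kd * max(dot(n, l), 0.0)              # Lambert: depends on n.l only
    r = tuple(2.0 * dot(n, l) * ni - li             # reflect light dir about n
              for ni, li in zip(n, l))
    specular = ks * max(dot(r, e), 0.0) ** shininess  # Phong highlight lobe
    return diffuse + specular

# head-on geometry: full diffuse plus full specular
print(round(shade((0, 0, 1), (0, 0, 1), (0, 0, 1)), 3))  # 1.0
```

Global illumination methods (ray tracing, radiosity, Monte Carlo) replace the single light term here with recursive or integrated contributions from the rest of the scene.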
A quantum probability model of causal reasoning
Directory of Open Access Journals (Sweden)
Jennifer S Trueblood
2012-05-01
People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects, thus proving to be a viable new candidate for modeling human judgment.
Causal Measurement Models: Can Criticism Stimulate Clarification?
Markus, Keith A.
2016-01-01
In their 2016 work, Aguirre-Urreta et al. provided a contribution to the literature on causal measurement models that enhances clarity and stimulates further thinking. Aguirre-Urreta et al. presented a form of statistical identity involving mapping onto the portion of the parameter space involving the nomological net, relationships between the…
A Causal Model of Faculty Turnover Intentions.
Smart, John C.
1990-01-01
A causal model assesses the relative influence of individual attributes, institutional characteristics, contextual-work environment variables, and multiple measures of job satisfaction on faculty intentions to leave their current institutions. Factors considered include tenure status, age, institutional status, governance style, organizational…
Graphical Model Theory for Wireless Sensor Networks
International Nuclear Information System (INIS)
Davis, William B.
2002-01-01
Information processing in sensor networks, with many small processors, demands a theory of computation that allows the minimization of processing effort, and the distribution of this effort throughout the network. Graphical model theory provides a probabilistic theory of computation that explicitly addresses complexity and decentralization for optimizing network computation. The junction tree algorithm, for decentralized inference on graphical probability models, can be instantiated in a variety of applications useful for wireless sensor networks, including: sensor validation and fusion; data compression and channel coding; expert systems with decentralized data structures and efficient local queries; and pattern classification and machine learning. Graphical models for these applications are sketched, and a model of dynamic sensor validation and fusion is presented in more depth, to illustrate the junction tree algorithm.
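The decentralized fusion described in this abstract can be illustrated with a minimal sketch: each sensor node contributes a local likelihood message, and the fused posterior is their normalized product, as in message passing on a tree. The state space, prior, and sensor noise models below are illustrative assumptions, not taken from the paper.

```python
# Minimal decentralized sensor-fusion sketch: the fused posterior is the
# normalized product of the prior and per-sensor likelihood messages.
states = (0, 1)            # e.g. "no event" / "event" (illustrative)
prior = {0: 0.9, 1: 0.1}
# Per-sensor likelihoods P(reading | state); rows sum to 1 over readings.
sensor_a = {0: {"lo": 0.8, "hi": 0.2}, 1: {"lo": 0.3, "hi": 0.7}}
sensor_b = {0: {"lo": 0.7, "hi": 0.3}, 1: {"lo": 0.2, "hi": 0.8}}

def fuse(readings):
    # Multiply the prior by each sensor's likelihood message, then normalize.
    unnorm = {s: prior[s] for s in states}
    for model, r in readings:
        for s in states:
            unnorm[s] *= model[s][r]
    z = sum(unnorm.values())
    return {s: unnorm[s] / z for s in states}

post = fuse([(sensor_a, "hi"), (sensor_b, "hi")])
```

Two "hi" readings pull the posterior for the event state from a 0.1 prior to roughly even odds, which is the qualitative behaviour a junction tree computes exactly on larger networks.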
Transforming Graphical System Models To Graphical Attack Models
Ivanova, Marieta Georgieva; Probst, Christian W.; Hansen, René Rydhof; Kammüller, Florian; Mauw, S.; Kordy, B.
2015-01-01
Manually identifying possible attacks on an organisation is a complex undertaking; many different factors must be considered, and the resulting attack scenarios can be complex and hard to maintain as the organisation changes. System models provide a systematic representation of organisations that
Structure and Strength in Causal Induction
Griffiths, Thomas L.; Tenenbaum, Joshua B.
2005-01-01
We present a framework for the rational analysis of elemental causal induction--learning about the existence of a relationship between a single cause and effect--based upon causal graphical models. This framework makes precise the distinction between causal structure and causal strength: the difference between asking whether a causal relationship…
Efficiently adapting graphical models for selectivity estimation
DEFF Research Database (Denmark)
Tzoumas, Kostas; Deshpande, Amol; Jensen, Christian S.
2013-01-01
cardinality estimation without making the independence assumption. By carefully using concepts from the field of graphical models, we are able to factor the joint probability distribution over all the attributes in the database into small, usually two-dimensional distributions, without a significant loss...... in estimation accuracy. We show how to efficiently construct such a graphical model from the database using only two-way join queries, and we show how to perform selectivity estimation in a highly efficient manner. We integrate our algorithms into the PostgreSQL DBMS. Experimental results indicate...
Building probabilistic graphical models with Python
Karkera, Kiran R
2014-01-01
This is a short, practical guide that allows data scientists to understand the concepts of graphical models and enables them to try them out using small Python code snippets, without being too mathematically complicated. If you are a data scientist who knows about machine learning and wants to enhance your knowledge of graphical models, such as Bayesian networks, in order to use them to solve real-world problems using Python libraries, this book is for you. This book is intended for those who have some Python and machine learning experience, or are exploring the machine learning field.
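In the spirit of the book's small Python snippets, here is a minimal sketch of a discrete Bayesian network queried by exhaustive enumeration, using the classic rain/sprinkler/wet-grass example; the numbers are illustrative, not drawn from the text.

```python
from itertools import product

# Rain -> Sprinkler, (Rain, Sprinkler) -> WetGrass; all CPTs illustrative.
P_R = {1: 0.2, 0: 0.8}
P_S_given_R = {1: {1: 0.01, 0: 0.99}, 0: {1: 0.4, 0: 0.6}}   # P_S_given_R[r][s]
P_W_given_SR = {                                              # P_W_given_SR[(s, r)][w]
    (1, 1): {1: 0.99, 0: 0.01}, (1, 0): {1: 0.9, 0: 0.1},
    (0, 1): {1: 0.8, 0: 0.2},   (0, 0): {1: 0.0, 0: 1.0},
}

def joint(r, s, w):
    # Chain-rule factorization implied by the network structure.
    return P_R[r] * P_S_given_R[r][s] * P_W_given_SR[(s, r)][w]

def posterior_rain_given_wet():
    # P(R=1 | W=1) by summing the joint over the hidden sprinkler variable.
    num = sum(joint(1, s, 1) for s in (0, 1))
    den = sum(joint(r, s, 1) for r, s in product((0, 1), repeat=2))
    return num / den
```

Enumeration is exponential in the number of variables, which is exactly why the libraries the book covers implement smarter inference; for three binary nodes it is a faithful reference answer.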
Formal Analysis of Graphical Security Models
DEFF Research Database (Denmark)
Aslanyan, Zaruhi
, software components and human actors interacting with each other to form so-called socio-technical systems. The importance of socio-technical systems to modern societies requires verifying their security properties formally, while their inherent complexity makes manual analyses impracticable. Graphical...... models for security offer an unrivalled opportunity to describe socio-technical systems, for they allow to represent different aspects like human behaviour, computation and physical phenomena in an abstract yet uniform manner. Moreover, these models can be assigned a formal semantics, thereby allowing...... formal verification of their properties. Finally, their appealing graphical notations enable to communicate security concerns in an understandable way also to non-experts, often in charge of the decision making. This dissertation argues that automated techniques can be developed on graphical security...
Planar graphical models which are easy
Energy Technology Data Exchange (ETDEWEB)
Chertkov, Michael [Los Alamos National Laboratory; Chernyak, Vladimir [WAYNE STATE UNIV
2009-01-01
We describe a rich family of binary-variable statistical mechanics models on planar graphs which are equivalent to Gaussian Grassmann graphical models (free fermions). Calculating the partition function (weighted counting) in these models is easy (of polynomial complexity), as it reduces to evaluating determinants of matrices whose size is linear in the number of variables. In particular, this family of models covers the Holographic Algorithms of Valiant and extends the Gauge Transformations discussed in our previous works.
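The free-fermion determinant reduction itself is too long for a short sketch, but the "weighted counting" it speeds up is easy to state: the partition function of a small binary-spin model, summed over all 2^n configurations. A brute-force version on a cycle (couplings illustrative) shows the quantity whose computation the paper makes polynomial:

```python
import math
from itertools import product

def partition_function(n, J):
    """Brute-force partition function of an n-spin cycle with coupling J.

    Z = sum over all 2^n spin configurations of exp(J * sum_i s_i s_{i+1}).
    Exponential in n; the paper's determinant reduction is polynomial.
    """
    z = 0.0
    for spins in product((-1, 1), repeat=n):
        energy = sum(J * spins[i] * spins[(i + 1) % n] for i in range(n))
        z += math.exp(energy)
    return z
```

At J = 0 every configuration has weight 1, so Z is just the count 2^n, which makes the "weighted counting" reading of the partition function concrete.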
Probabilistic reasoning with graphical security models
Kordy, Barbara; Pouly, Marc; Schweitzer, Patrick
This work provides a computational framework for meaningful probabilistic evaluation of attack–defense scenarios involving dependent actions. We combine the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. In order
Diagnostic reasoning using qualitative causal models
International Nuclear Information System (INIS)
Sudduth, A.L.
1992-01-01
The application of expert systems to reasoning problems involving real-time data from plant measurements has been a topic of much research, but few practical systems have been deployed. One obstacle to wider use of expert systems in applications involving real-time data is the lack of adequate knowledge representation methodologies for dynamic processes. Knowledge bases composed mainly of rules have disadvantages when applied to dynamic processes and real-time data. This paper describes a methodology for the development of qualitative causal models that can be used as knowledge bases for reasoning about process dynamic behavior. These models provide a systematic method for knowledge base construction, considerably reducing the engineering effort required. They also offer much better opportunities for verification and validation of the knowledge base, thus increasing the possibility of the application of expert systems to reasoning about mission critical systems. Starting with the Signed Directed Graph (SDG) method that has been successfully applied to describe the behavior of diverse dynamic processes, the paper shows how certain non-physical behaviors that result from abstraction may be eliminated by applying causal constraint to the models. The resulting Extended Signed Directed Graph (ESDG) may then be compiled to produce a model for use in process fault diagnosis. This model based reasoning methodology is used in the MOBIAS system being developed by Duke Power Company under EPRI sponsorship. 15 refs., 4 figs
Graphical models for inferring single molecule dynamics
Directory of Open Access Journals (Sweden)
Gonzalez Ruben L
2010-10-01
Full Text Available Abstract Background The recent explosion of experimental techniques in single molecule biophysics has generated a variety of novel time series data requiring equally novel computational tools for analysis and inference. This article describes in general terms how graphical modeling may be used to learn from biophysical time series data using the variational Bayesian expectation maximization algorithm (VBEM). The discussion is illustrated by the example of single-molecule fluorescence resonance energy transfer (smFRET) versus time data, where the smFRET time series is modeled as a hidden Markov model (HMM) with Gaussian observables. A detailed description of smFRET is provided as well. Results The VBEM algorithm returns the model’s evidence and an approximating posterior parameter distribution given the data. The former provides a metric for model selection via maximum evidence (ME), and the latter a description of the model’s parameters learned from the data. ME/VBEM provide several advantages over the more commonly used approach of maximum likelihood (ML) optimized by the expectation maximization (EM) algorithm, the most important being a natural form of model selection and a well-posed (non-divergent) optimization problem. Conclusions The results demonstrate the utility of graphical modeling for inference of dynamic processes in single molecule biophysics.
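The likelihood computation underlying such an HMM with Gaussian observables can be sketched with the standard scaled forward algorithm; the two-state parameters below are illustrative placeholders, not fitted smFRET values.

```python
import math

# Two-state HMM with Gaussian emissions (illustrative parameters).
means, sigma = (0.2, 0.8), 0.1          # per-state emission means, shared std
pi = (0.5, 0.5)                          # initial state distribution
A = ((0.95, 0.05), (0.05, 0.95))         # transition matrix A[j][k]

def gauss(x, mu, s):
    return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

def log_likelihood(series):
    """Scaled forward algorithm: log p(series) under the HMM above."""
    ll = 0.0
    alpha = [pi[k] for k in (0, 1)]
    for t, x in enumerate(series):
        if t > 0:                        # propagate through the transitions
            alpha = [sum(alpha[j] * A[j][k] for j in (0, 1)) for k in (0, 1)]
        alpha = [alpha[k] * gauss(x, means[k], sigma) for k in (0, 1)]
        z = sum(alpha)                   # rescale each step to avoid underflow
        ll += math.log(z)
        alpha = [a / z for a in alpha]
    return ll
```

VBEM wraps a computation of this shape inside variational updates over the parameters; the forward pass is the piece that scores a candidate model against a time series.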
Stochastic Spectral Descent for Discrete Graphical Models
International Nuclear Information System (INIS)
Carlson, David; Hsieh, Ya-Ping; Collins, Edo; Carin, Lawrence; Cevher, Volkan
2015-01-01
Interest in deep probabilistic graphical models has increased in recent years, due to their state-of-the-art performance on many machine learning applications. Such models are typically trained with the stochastic gradient method, which can take a significant number of iterations to converge. Since the computational cost of gradient estimation is prohibitive even for modestly sized models, training becomes slow and practically usable models are kept small. In this paper we propose a new, largely tuning-free algorithm to address this problem. Our approach derives novel majorization bounds based on the Schatten norm. Intriguingly, the minimizers of these bounds can be interpreted as gradient methods in a non-Euclidean space. We thus propose using a stochastic gradient method in non-Euclidean space. We both provide simple conditions under which our algorithm is guaranteed to converge, and demonstrate empirically that our algorithm leads to dramatically faster training and improved predictive ability compared to stochastic gradient descent for both directed and undirected graphical models.
Dynamic Causal Models and Autopoietic Systems
Directory of Open Access Journals (Sweden)
OLIVIER DAVID
2007-01-01
Full Text Available Dynamic Causal Modelling (DCM) and the theory of autopoietic systems are two important conceptual frameworks. In this review, we suggest that they can be combined to answer important questions about self-organising systems like the brain. DCM has been developed recently by the neuroimaging community to explain, using biophysical models, how non-invasive brain imaging data are caused by neural processes. It allows one to ask mechanistic questions about the implementation of cerebral processes. In DCM the parameters of biophysical models are estimated from measured data and the evidence for each model is evaluated. This enables one to test different functional hypotheses (i.e., models) for a given data set. Autopoiesis and related formal theories of biological systems as autonomous machines represent a body of concepts with many successful applications. However, autopoiesis has remained largely theoretical and has not penetrated the empiricism of cognitive neuroscience. In this review, we try to show the connections that exist between DCM and autopoiesis. In particular, we propose a simple modification to standard formulations of DCM that includes autonomous processes. The idea is to exploit the machinery of the system identification of DCMs in neuroimaging to test the face validity of the autopoietic theory applied to neural subsystems. We illustrate the theoretical concepts and their implications for interpreting electroencephalographic signals acquired during amygdala stimulation in an epileptic patient. The results suggest that DCM represents a relevant biophysical approach to brain functional organisation, with a potential that is yet to be fully evaluated.
Bayesian graphical models for genomewide association studies.
Verzilli, Claudio J; Stallard, Nigel; Whittaker, John C
2006-07-01
As the extent of human genetic variation becomes more fully characterized, the research community is faced with the challenging task of using this information to dissect the heritable components of complex traits. Genomewide association studies offer great promise in this respect, but their analysis poses formidable difficulties. In this article, we describe a computationally efficient approach to mining genotype-phenotype associations that scales to the size of the data sets currently being collected in such studies. We use discrete graphical models as a data-mining tool, searching for single- or multilocus patterns of association around a causative site. The approach is fully Bayesian, allowing us to incorporate prior knowledge on the spatial dependencies around each marker due to linkage disequilibrium, which reduces considerably the number of possible graphical structures. A Markov chain-Monte Carlo scheme is developed that yields samples from the posterior distribution of graphs conditional on the data from which probabilistic statements about the strength of any genotype-phenotype association can be made. Using data simulated under scenarios that vary in marker density, genotype relative risk of a causative allele, and mode of inheritance, we show that the proposed approach has better localization properties and leads to lower false-positive rates than do single-locus analyses. Finally, we present an application of our method to a quasi-synthetic data set in which data from the CYP2D6 region are embedded within simulated data on 100K single-nucleotide polymorphisms. Analysis is quick (<5 min), and we are able to localize the causative site to a very short interval.
Theories of conduct disorder: a causal modelling analysis
Krol, N.P.C.M.; Morton, J.; Bruyn, E.E.J. De
2004-01-01
Background: If a clinician has to make decisions on diagnosis and treatment, he or she is confronted with a variety of causal theories. In order to compare these theories a neutral terminology and notational system is needed. The Causal Modelling framework involving three levels of description –
The Causal Foundations of Structural Equation Modeling
2012-02-16
and Baumrind (1993).” This, together with the steady influx of statisticians into the field, has left SEM researchers in a quandary about the…considerations. Journal of Personality and Social Psychology 51 1173–1182. Baumrind, D. (1993). Specious causal attributions in social sciences: The
ModelMate - A graphical user interface for model analysis
Banta, Edward R.
2011-01-01
ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.
Markov chain Monte Carlo methods in directed graphical models
DEFF Research Database (Denmark)
Højbjerre, Malene
Directed graphical models present data possessing a complex dependence structure, and MCMC methods are computer-intensive simulation techniques to approximate high-dimensional intractable integrals, which emerge in such models with incomplete data. MCMC computations in directed graphical models h...
Quantum Graphical Models and Belief Propagation
International Nuclear Information System (INIS)
Leifer, M.S.; Poulin, D.
2008-01-01
Belief Propagation algorithms acting on Graphical Models of classical probability distributions, such as Markov Networks, Factor Graphs and Bayesian Networks, are amongst the most powerful known methods for deriving probabilistic inferences amongst large numbers of random variables. This paper presents a generalization of these concepts and methods to the quantum case, based on the idea that quantum theory can be thought of as a noncommutative, operator-valued, generalization of classical probability theory. Some novel characterizations of quantum conditional independence are derived, and definitions of Quantum n-Bifactor Networks, Markov Networks, Factor Graphs and Bayesian Networks are proposed. The structure of Quantum Markov Networks is investigated and some partial characterization results are obtained, along the lines of the Hammersley-Clifford theorem. A Quantum Belief Propagation algorithm is presented and is shown to converge on 1-Bifactor Networks and Markov Networks when the underlying graph is a tree. The use of Quantum Belief Propagation as a heuristic algorithm in cases where it is not known to converge is discussed. Applications to decoding quantum error correcting codes and to the simulation of many-body quantum systems are described.
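A minimal classical sum-product sketch shows the setting in which belief propagation is exact: on a tree (here, a three-node chain x1 - x2 - x3) the message-passing marginal agrees with brute-force summation. The potentials are illustrative.

```python
from itertools import product

# Chain x1 - x2 - x3 over binary variables; illustrative potentials.
phi1 = {0: 0.3, 1: 0.7}                                          # unary on x1
psi12 = {(a, b): 2.0 if a == b else 1.0 for a in (0, 1) for b in (0, 1)}
psi23 = {(b, c): 3.0 if b == c else 1.0 for b in (0, 1) for c in (0, 1)}

def marginal_x2_bp():
    # Sum-product messages into x2 from each side, then normalize the belief.
    m1 = {b: sum(phi1[a] * psi12[(a, b)] for a in (0, 1)) for b in (0, 1)}
    m3 = {b: sum(psi23[(b, c)] for c in (0, 1)) for b in (0, 1)}
    belief = {b: m1[b] * m3[b] for b in (0, 1)}
    z = sum(belief.values())
    return {b: belief[b] / z for b in (0, 1)}

def marginal_x2_brute():
    # Exhaustive summation over all joint configurations, for comparison.
    z, marg = 0.0, {0: 0.0, 1: 0.0}
    for a, b, c in product((0, 1), repeat=3):
        w = phi1[a] * psi12[(a, b)] * psi23[(b, c)]
        z += w
        marg[b] += w
    return {b: marg[b] / z for b in (0, 1)}
```

On loopy graphs the two computations can disagree, which is where the paper's convergence analysis (classical and quantum) earns its keep.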
Graphical modeling and query language for hospitals.
Barzdins, Janis; Barzdins, Juris; Rencis, Edgars; Sostaks, Agris
2013-01-01
So far there has been little evidence that implementation of health information technologies (HIT) is leading to health care cost savings. One of the reasons for this lack of impact likely lies in the complexity of business process ownership in hospitals. The goal of our research is to develop a business model-based method for hospital use which would allow doctors to retrieve ad-hoc information directly from various hospital databases. We have developed a special domain-specific process modelling language called MedMod. Formally, we define the MedMod language as a profile on UML Class diagrams, but we also demonstrate it on examples, where we explain the semantics of all its elements informally. Moreover, we have developed the Process Query Language (PQL), which is based on the MedMod process definition language. The purpose of PQL is to allow a doctor to query (filter) runtime data of hospital processes described using MedMod. The MedMod language tries to overcome deficiencies in existing process modeling languages by allowing the specification of the loosely defined sequence of steps to be performed in a clinical process. The main advantages of PQL lie in two areas, usability and efficiency: 1) the view on data through the "glasses" of a familiar process; 2) the simple and easy-to-perceive means of setting filtering conditions, which require no more expertise than using spreadsheet applications; 3) the dynamic response to each step in the construction of the complete query, which shortens the learning curve greatly and reduces the error rate; and 4) the selected means of filtering and data retrieval, which allow queries to execute in O(n) time in the size of the dataset. We are about to continue developing this project with three further steps. First, we are planning to develop user-friendly graphical editors for the MedMod process modeling and query languages. The second step is to evaluate the usability of the proposed language and tool
GRAPHICAL MODELS OF THE AIRCRAFT MAINTENANCE PROCESS
Directory of Open Access Journals (Sweden)
Stanislav Vladimirovich Daletskiy
2017-01-01
Full Text Available The aircraft maintenance is realized by a rapid sequence of maintenance organizational and technical states; its research and analysis are carried out by statistical methods. The maintenance process includes aircraft technical states connected with the objective patterns of technical quality changes of the aircraft as a maintenance object, and organizational states which determine the subjective organization and planning process of aircraft use. The objective maintenance process is realized in the Maintenance and Repair System, which does not include maintenance organization and planning and is a set of related elements: aircraft, Maintenance and Repair measures, executors and documentation that sets rules for their interaction for maintaining the aircraft's reliability and readiness for flight. The aircraft organizational and technical states are considered, and their characteristics and heuristic estimates of connection in knots and arcs of graphs and of aircraft organizational states during regular maintenance and at technical state failure are given. It is shown that in real conditions of aircraft maintenance, planned aircraft technical state control, and maintenance control through it, is only defined by Maintenance and Repair conditions at a given Maintenance and Repair type and form structures, and correspondingly by setting principles of Maintenance and Repair work types for execution, due to maintenance, by aircraft and all its units maintenance and reconstruction strategies. The realization of the planned Maintenance and Repair process determines one of the constant maintenance components. The proposed graphical models allow one to reveal quantitative correlations between graph knots to improve maintenance processes by statistical research methods, which reduces manning, timetables and expenses for providing safe civil aviation aircraft maintenance.
Causal Bayes Model of Mathematical Competence in Kindergarten
Directory of Open Access Journals (Sweden)
Božidar Tepeš
2016-06-01
Full Text Available In this paper the authors define mathematical competences in the kindergarten. The basic objective was to measure mathematical competences, i.e., mathematical knowledge, skills and abilities, in mathematical education. The mathematical competences were grouped into two areas: arithmetic and geometry. The statistical sample consisted of 59 children, 65 to 85 months of age, from the kindergarten Milan Sachs in Zagreb. The authors describe 13 variables for measuring mathematical competences: five measuring variables for geometry and eight for arithmetic. The measuring variables are tasks which the children solved, with the results evaluated. From the measured mathematical competences the authors built a causal Bayes model using the free software Tetrad 5.2.1-3. The software generates many causal Bayes models, and the authors, as experts, chose the model of mathematical competences in the kindergarten. The causal Bayes model describes five levels of mathematical competences. At the end of the modeling the authors used a Bayes estimator. In the results, the authors use the causal Bayes model of mathematical competences to describe causal effects among competences, i.e., how intervention on some competences causes changes in other competences. The authors measure mathematical competences through their expectations as random variables: when the expectation of a competence was greater, the competence improved. Mathematical competences can thus be improved by intervention on causal competences. The levels of mathematical competences and the results of interventions on them can help mathematics teachers.
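The kind of causal effect of intervention described in this abstract can be sketched with the truncated-factorization rule on a toy network (a confounder U influencing both X and Y, plus X influencing Y); the probabilities are illustrative, not the kindergarten data.

```python
# Toy network: U -> X, U -> Y, X -> Y; all CPTs illustrative.
P_U = {1: 0.5, 0: 0.5}
P_X_given_U = {1: {1: 0.8, 0: 0.2}, 0: {1: 0.2, 0: 0.8}}     # P_X_given_U[u][x]
P_Y_given_XU = {(x, u): {1: p, 0: 1 - p}                      # P(Y | X=x, U=u)
                for (x, u), p in {(1, 1): 0.9, (1, 0): 0.6,
                                  (0, 1): 0.7, (0, 0): 0.1}.items()}

def p_y_given_x_obs(x):
    # Observational P(Y=1 | X=x): condition on X, so U stays correlated with X.
    num = sum(P_U[u] * P_X_given_U[u][x] * P_Y_given_XU[(x, u)][1] for u in (0, 1))
    den = sum(P_U[u] * P_X_given_U[u][x] for u in (0, 1))
    return num / den

def p_y_do_x(x):
    # Interventional P(Y=1 | do(X=x)): drop the factor P(X | U),
    # keep the rest of the factorization (truncated factorization).
    return sum(P_U[u] * P_Y_given_XU[(x, u)][1] for u in (0, 1))
```

The observational quantity exceeds the interventional one here because U inflates the association; the gap is exactly what an intervention-based reading of a causal Bayes model is meant to strip away.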
A methodology for acquiring qualitative knowledge for probabilistic graphical models
DEFF Research Database (Denmark)
Kjærulff, Uffe Bro; Madsen, Anders L.
2004-01-01
We present a practical and general methodology that simplifies the task of acquiring and formulating qualitative knowledge for constructing probabilistic graphical models (PGMs). The methodology efficiently captures and communicates expert knowledge, and has significantly eased the model...
A theory of causal learning in children: Causal maps and Bayes nets
Gopnik, A; Glymour, C; Sobel, D M; Schulz, L E; Kushnir, T; Danks, D
2004-01-01
The authors outline a cognitive and computational account of causal learning in children. They propose that children use specialized cognitive systems that allow them to recover an accurate "causal map" of the world: an abstract, coherent, learned representation of the causal relations among events. This kind of knowledge can be perspicuously understood in terms of the formalism of directed graphical causal models, or Bayes nets. Children's causal learning and inference may involve computatio...
The complete guide to blender graphics computer modeling and animation
Blain, John M
2014-01-01
Smoothly Leads Users into the Subject of Computer Graphics through the Blender GUIBlender, the free and open source 3D computer modeling and animation program, allows users to create and animate models and figures in scenes, compile feature movies, and interact with the models and create video games. Reflecting the latest version of Blender, The Complete Guide to Blender Graphics: Computer Modeling & Animation, 2nd Edition helps beginners learn the basics of computer animation using this versatile graphics program. This edition incorporates many new features of Blender, including developments
Causal Diagrams for Empirical Research
Pearl, Judea
1994-01-01
The primary aim of this paper is to show how graphical models can be used as a mathematical language for integrating statistical and subject-matter information. In particular, the paper develops a principled, nonparametric framework for causal inference, in which diagrams are queried to determine if the assumptions available are sufficient for identifying causal effects from non-experimental data. If so, the diagrams can be queried to produce mathematical expressions for causal effects in ter...
Causal Analysis for Performance Modeling of Computer Programs
Directory of Open Access Journals (Sweden)
Jan Lemeire
2007-01-01
Full Text Available Causal modeling and the accompanying learning algorithms provide useful extensions for in-depth statistical investigation and automation of performance modeling. We enlarged the scope of existing causal structure learning algorithms by using the form-free information-theoretic concept of mutual information and by introducing the complexity criterion for selecting direct relations among equivalent relations. The underlying probability distribution of experimental data is estimated by kernel density estimation. We then reported on the benefits of a dependency analysis and the decompositional capacities of causal models. Useful qualitative models, providing insight into the role of every performance factor, were inferred from experimental data. This paper reports on the results for a LU decomposition algorithm and on the study of the parameter sensitivity of the Kakadu implementation of the JPEG-2000 standard. Next, the analysis was used to search for generic performance characteristics of the applications.
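The form-free mutual-information measure of dependence used in this abstract can be sketched for discrete samples; the paper estimates densities by kernel methods, but plug-in counts are used here for brevity.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in mutual information (in nats) of a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)                 # empirical joint counts
    px = Counter(x for x, _ in pairs)    # empirical marginals
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

dependent = [(0, 0), (0, 0), (1, 1), (1, 1)]      # perfectly coupled
independent = [(0, 0), (0, 1), (1, 0), (1, 1)]    # product distribution
```

Zero mutual information corresponds to the independence tested by the structure-learning algorithms; the perfectly coupled sample scores log 2 nats, the independent one scores 0.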
A probabilistic graphical model based stochastic input model construction
International Nuclear Information System (INIS)
Wan, Jiang; Zabaras, Nicholas
2014-01-01
Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media
Optimal covariance selection for estimation using graphical models
Vichik, Sergey; Oshman, Yaakov
2011-01-01
We consider a problem encountered when trying to estimate a Gaussian random field using a distributed estimation approach based on Gaussian graphical models. Because of constraints imposed by estimation tools used in Gaussian graphical models, the a priori covariance of the random field is constrained to embed conditional independence constraints among a significant number of variables. The problem is, then: given the (unconstrained) a priori covariance of the random field, and the conditiona...
A general graphical user interface for automatic reliability modeling
Liceaga, Carlos A.; Siewiorek, Daniel P.
1991-01-01
Reported here is a general Graphical User Interface (GUI) for automatic reliability modeling of Processor Memory Switch (PMS) structures using a Markov model. This GUI is based on a hierarchy of windows. One window has graphical editing capabilities for specifying the system's communication structure, hierarchy, reconfiguration capabilities, and requirements. Other windows have field texts, popup menus, and buttons for specifying parameters and selecting actions. An example application of the GUI is given.
Causal Indicator Models: Unresolved Issues of Construction and Evaluation
West, Stephen G.; Grimm, Kevin J.
2014-01-01
These authors agree with Bainter and Bollen that causal effects represent a useful measurement structure in some applications. The structure of the science of the measurement problem should determine the model; the measurement model should not determine the science. They also applaud Bainter and Bollen's important reminder that the full…
An integrated introduction to computer graphics and geometric modeling
Goldman, Ronald
2009-01-01
… this book may be the first book on geometric modelling that also covers computer graphics. In addition, it may be the first book on computer graphics that integrates a thorough introduction to 'freedom' curves and surfaces and to the mathematical foundations for computer graphics. … the book is well suited for an undergraduate course. … The entire book is very well presented and obviously written by a distinguished and creative researcher and educator. It certainly is a textbook I would recommend. …-Computer-Aided Design, 42, 2010… Many books concentrate on computer programming and soon beco
A theory of causal learning in children: causal maps and Bayes nets.
Gopnik, Alison; Glymour, Clark; Sobel, David M; Schulz, Laura E; Kushnir, Tamar; Danks, David
2004-01-01
The authors outline a cognitive and computational account of causal learning in children. They propose that children use specialized cognitive systems that allow them to recover an accurate "causal map" of the world: an abstract, coherent, learned representation of the causal relations among events. This kind of knowledge can be perspicuously understood in terms of the formalism of directed graphical causal models, or Bayes nets. Children's causal learning and inference may involve computations similar to those for learning causal Bayes nets and for predicting with them. Experimental results suggest that 2- to 4-year-old children construct new causal maps and that their learning is consistent with the Bayes net formalism.
Modelling of JET diagnostics using Bayesian Graphical Models
Energy Technology Data Exchange (ETDEWEB)
Svensson, J. [IPP Greifswald, Greifswald (Germany); Ford, O. [Imperial College, London (United Kingdom); McDonald, D.; Hole, M.; Nessi, G. von; Meakins, A.; Brix, M.; Thomsen, H.; Werner, A.; Sirinelli, A.
2011-07-01
The mapping between physics parameters (such as densities, currents, flows, temperatures, etc.) defining the plasma 'state' under a given model and the raw observations of each plasma diagnostic will 1) depend on the particular physics model used, and 2) be inherently probabilistic, owing to uncertainties on both observations and instrumental aspects of the mapping, such as calibrations, instrument functions, etc. A flexible and principled way of modelling such interconnected probabilistic systems is through so-called Bayesian graphical models. Being an amalgam of graph theory and probability theory, Bayesian graphical models can simulate the complex interconnections between physics models and diagnostic observations from multiple heterogeneous diagnostic systems, making it relatively easy to optimally combine the observations from multiple diagnostics for joint inference on parameters of the underlying physics model, which in itself can be represented as part of the graph. At JET about 10 diagnostic systems have to date been modelled in this way, and this has led to a number of new results, including: the reconstruction of the flux surface topology and q-profiles without any specific equilibrium assumption, using information from a number of different diagnostic systems; profile inversions taking into account the uncertainties in the flux surface positions; and a substantial increase in accuracy of JET electron density and temperature profiles, including improved pedestal resolution, through the joint analysis of three diagnostic systems. It is believed that the Bayesian graph approach could potentially be utilised for very large sets of diagnostics, providing a generic data analysis framework for nuclear fusion experiments that would be able to optimally utilize the information from multiple diagnostics simultaneously, and where the explicit graph representation of the connections to underlying physics models could be used for sophisticated model testing. This
The gRbase Package for Graphical Modelling in R
DEFF Research Database (Denmark)
Højsgaard, Søren; Dethlefsen, Claus
We have developed a package, called gRbase, consisting of a number of classes and associated methods to support the analysis of data using graphical models. It is developed for the open source language R and is available for several platforms. The package is intended to be widely extendible and flexible so that package developers may implement further types of graphical models using the available methods. gRbase contains methods for representing data and for specifying models using a formal language, and is linked to dynamicGraph, an interactive graphical user interface for manipulating graphs. We show how these building blocks can be combined and integrated with inference engines in the special case of hierarchical log-linear models (undirected models).
Graphical means for inspecting qualitative models of system behaviour
Bouwer, A.; Bredeweg, B.
2010-01-01
This article presents the design and evaluation of a tool for inspecting conceptual models of system behaviour. The basis for this research is the Garp framework for qualitative simulation. This framework includes modelling primitives, such as entities, quantities and causal dependencies, which are
Integrating Surface Modeling into the Engineering Design Graphics Curriculum
Hartman, Nathan W.
2006-01-01
It has been suggested there is a knowledge base that surrounds the use of 3D modeling within the engineering design process and correspondingly within engineering design graphics education. While solid modeling receives a great deal of attention and discussion relative to curriculum efforts, and rightly so, surface modeling is an equally viable 3D…
Causal models in epidemiology: past inheritance and genetic future
Directory of Open Access Journals (Sweden)
Kriebel David
2006-07-01
The eruption of genetic research presents a tremendous opportunity for epidemiologists to improve our ability to identify causes of ill health. Epidemiologists have enthusiastically embraced the new tools of genomics and proteomics to investigate gene-environment interactions. We argue that neither the full import nor the limitations of such studies can be appreciated without clarifying the underlying theoretical models of interaction, etiologic fraction, and the fundamental concept of causality. We therefore explore different models of causality in the epidemiology of disease arising out of genes, environments, and the interplay between environments and genes. We begin from Rothman's "pie" model of necessary and sufficient causes, and then discuss newer approaches, which provide additional insights into multifactorial causal processes. These include directed acyclic graphs and structural equation models. Caution is urged in the application of two essential and closely related concepts found in many studies: interaction (effect modification) and the etiologic or attributable fraction. We review these concepts and present four important limitations. 1. Interaction is a fundamental characteristic of any causal process involving a series of probabilistic steps, and not a second-order phenomenon identified after first accounting for "main effects". 2. Standard methods of assessing interaction do not adequately consider the life course, and the temporal dynamics through which an individual's sufficient cause is completed. Different individuals may be at different stages of development along the path to disease, but this is not usually measurable; thus, for example, acquired susceptibility in children can be an important source of variation. 3. A distinction must be made between individual-based and population-level models; most epidemiologic discussions of causality fail to make this distinction. 4. At the population level, there is additional…
Causality in Psychiatry: A Hybrid Symptom Network Construct Model
Directory of Open Access Journals (Sweden)
Gerald Young
2015-11-01
Causality or etiology in psychiatry is marked by standard biomedical, reductionistic models (symptoms reflect the construct involved) that inform approaches to nosology, or classification, such as in the DSM-5 (Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition; American Psychiatric Association, 2013). However, network approaches to symptom interaction (i.e., symptoms are formative of the construct; e.g., McNally, Robinaugh, Wu, Wang, Deserno, & Borsboom, 2014, for PTSD, posttraumatic stress disorder) are being developed that speak to bottom-up processes in mental disorder, in contrast to the typical top-down psychological construct approach. The present article presents a hybrid top-down, bottom-up model of the relationship between symptoms and mental disorder, viewing symptom expression and its causal complex as a reciprocally dynamic system with multiple levels, from lower-order symptoms in interaction to higher-order constructs affecting them. The hybrid model hinges on a good understanding of the systems theory in which it is embedded, so the article reviews in depth nonlinear dynamical systems theory (NLDST). The article also applies the concept of emergent circular causality (Young, 2011) to symptom development. The conclusions consider that symptoms vary over several dimensions, including subjectivity, objectivity, conscious motivation effort, and unconscious influences, and the degree to which individual (e.g., meaning) and universal (e.g., causal) processes are involved. The opposition between science and skepticism is a complex one that the article addresses in final comments.
Computation of Probabilities in Causal Models of History of Science
Directory of Open Access Journals (Sweden)
Osvaldo Pessoa Jr.
2006-12-01
The aim of this paper is to investigate the ascription of probabilities in a causal model of an episode in the history of science. The aim of such a quantitative approach is to allow the implementation of the causal model in a computer, to run simulations. As an example, we look at the beginning of the science of magnetism, "explaining" — in a probabilistic way, in terms of a single causal model — why the field advanced in China but not in Europe (the difference is due to different prior probabilities of certain cultural manifestations). Given the number of years between the occurrences of two causally connected advances X and Y, one proposes a criterion for stipulating the value p(Y|X) of the conditional probability of an advance Y occurring, given X. Next, one must assume a specific form for the cumulative probability function p(Y|X)(t), which we take to be the time integral of an exponential distribution function, as is done in the physics of radioactive decay. Rules for calculating the cumulative functions for more than two events are mentioned, involving composition, disjunction and conjunction of causes. We also consider the problems involved in supposing that the appearance of events in time follows an exponential distribution, which arise from the fact that a composition of causes does not follow an exponential distribution, but a "hypoexponential" one. We suggest that a gamma distribution function might more adequately represent the appearance of advances.
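The exponential and hypoexponential functions the abstract mentions are simple enough to sketch directly. The rates and half-time below are arbitrary illustrative values, not from the paper; the hypoexponential CDF is the standard closed form for the sum of two independent exponential waiting times with distinct rates, as for a chain of two causally connected advances.

```python
import math

def exp_cdf(lam, t):
    """Cumulative probability that advance Y has appeared by time t after X,
    assuming an exponential waiting time with rate lam."""
    return 1.0 - math.exp(-lam * t)

def rate_from_half_time(t_half):
    """Calibrate the rate so that p(Y|X)(t_half) = 1/2."""
    return math.log(2.0) / t_half

def hypoexp_cdf(lam1, lam2, t):
    """CDF of the sum of two independent exponential waits with distinct
    rates (a chain X -> Z -> Y): the hypoexponential distribution."""
    assert lam1 != lam2
    return 1.0 - (lam2 * math.exp(-lam1 * t)
                  - lam1 * math.exp(-lam2 * t)) / (lam2 - lam1)
```

A quick check of the abstract's point: the exponential survival function is memoryless (S(s+t) = S(s)S(t)), while the hypoexponential one is not, so a composition of causes cannot itself be exponential.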
A Graphical User Interface to Generalized Linear Models in MATLAB
Directory of Open Access Journals (Sweden)
Peter Dunn
1999-07-01
Generalized linear models unite a wide variety of statistical models in a common theoretical framework. This paper discusses GLMLAB, software that enables such models to be fitted in the popular mathematical package MATLAB. It provides a graphical user interface to the powerful MATLAB computational engine to produce a program that is easy to use but has many features, including offsets, prior weights, and user-defined distributions and link functions. MATLAB's graphical capacities are also utilized to provide a number of simple residual diagnostic plots.
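GLMLAB itself is MATLAB software, but the fitting algorithm behind any such GLM tool, iteratively reweighted least squares (IRLS), is easy to sketch. Below is a minimal pure-Python IRLS fit for a Poisson GLM with a log link and a single covariate; the data and starting values are made up for illustration.

```python
import math

def fit_poisson_glm(x, y, max_iters=100):
    """Fit a Poisson GLM with log link, E[y] = exp(b0 + b1*x), by
    iteratively reweighted least squares (the standard GLM algorithm)."""
    b0, b1 = 0.0, 0.0
    for _ in range(max_iters):
        eta = [b0 + b1 * xi for xi in x]
        mu = [math.exp(e) for e in eta]
        # Working response and weights for the log link.
        z = [e + (yi - mi) / mi for e, yi, mi in zip(eta, y, mu)]
        w = mu
        # Solve the 2x2 weighted least-squares normal equations.
        s00 = sum(w)
        s01 = sum(wi * xi for wi, xi in zip(w, x))
        s11 = sum(wi * xi * xi for wi, xi in zip(w, x))
        t0 = sum(wi * zi for wi, zi in zip(w, z))
        t1 = sum(wi * xi * zi for wi, xi, zi in zip(w, x, z))
        det = s00 * s11 - s01 * s01
        nb0 = (s11 * t0 - s01 * t1) / det
        nb1 = (s00 * t1 - s01 * t0) / det
        done = abs(nb0 - b0) < 1e-12 and abs(nb1 - b1) < 1e-12
        b0, b1 = nb0, nb1
        if done:
            break
    return b0, b1

# Responses set exactly to the model means exp(0.5 + 0.3*x), so the score
# equations are satisfied at (0.5, 0.3) and IRLS recovers those values.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [math.exp(0.5 + 0.3 * xi) for xi in xs]
b0, b1 = fit_poisson_glm(xs, ys)
```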
Measured, modeled, and causal conceptions of fitness
Abrams, Marshall
2012-01-01
This paper proposes partial answers to the following questions: in what senses can fitness differences plausibly be considered causes of evolution? What relationships are there between fitness concepts used in empirical research, modeling, and abstract theoretical proposals? How does the relevance of different fitness concepts depend on research questions and methodological constraints? The paper develops a novel taxonomy of fitness concepts, beginning with type fitness (a property of a genotype or phenotype), token fitness (a property of a particular individual), and purely mathematical fitness. Type fitness includes statistical type fitness, which can be measured from population data, and parametric type fitness, which is an underlying property estimated by statistical type fitnesses. Token fitness includes measurable token fitness, which can be measured on an individual, and tendential token fitness, which is assumed to be an underlying property of the individual in its environmental circumstances. Some of the paper's conclusions can be outlined as follows: claims that fitness differences do not cause evolution are reasonable when fitness is treated as statistical type fitness, measurable token fitness, or purely mathematical fitness. Some of the ways in which statistical methods are used in population genetics suggest that what natural selection involves are differences in parametric type fitnesses. Further, it's reasonable to think that differences in parametric type fitness can cause evolution. Tendential token fitnesses, however, are not themselves sufficient for natural selection. Though parametric type fitnesses are typically not directly measurable, they can be modeled with purely mathematical fitnesses and estimated by statistical type fitnesses, which in turn are defined in terms of measurable token fitnesses. The paper clarifies the ways in which fitnesses depend on pragmatic choices made by researchers. PMID:23112804
Discrete Discriminant analysis based on tree-structured graphical models
DEFF Research Database (Denmark)
Perez de la Cruz, Gonzalo; Eslava, Guillermina
The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance, using estimated error rates, on real and simulated data. The results show that discriminant analysis based on tree-structured graphical models is a simple nonlinear method competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression.
The appliance of graphics modeling in nuclear plant information system
International Nuclear Information System (INIS)
Bai Zhe; Li Guofang
2010-01-01
Nuclear plants contain many sub-systems, such as operation management, manufacturing, inventory, human resources, and so forth. Standardized data graphics modelling technology can ensure data interaction, compress the design cycle, avoid replicated design, and ensure data integrity and consistency. A standardized data format, based on the STEP standard and compliant with XML, is a competent tool for exchanging data among the different sub-systems of a nuclear plant. To meet this demand, a data graphics modelling standard is proposed. The standard expresses the relationships between systems, within a system, and between data items. Graphic modelling effectively improves cooperation between systems, designers, engineers, operations, and support departments. It also provides a reliable and available data source for data mining and business intelligence. (authors)
The Gaussian Graphical Model in Cross-Sectional and Time-Series Data.
Epskamp, Sacha; Waldorp, Lourens J; Mõttus, René; Borsboom, Denny
2018-04-16
We discuss the Gaussian graphical model (GGM; an undirected network of partial correlation coefficients) and detail its utility as an exploratory data analysis tool. The GGM shows which variables predict one another, allows for sparse modeling of covariance structures, and may highlight potential causal relationships between observed variables. We describe its utility in three kinds of psychological data sets: data sets in which consecutive cases are assumed independent (e.g., cross-sectional data), temporally ordered data sets (e.g., n = 1 time series), and a mixture of the two (e.g., n > 1 time series). In time-series analysis, the GGM can be used to model the residual structure of a vector-autoregression analysis (VAR), also termed graphical VAR. Two network models can then be obtained: a temporal network and a contemporaneous network. When analyzing data from multiple subjects, a GGM can also be formed on the covariance structure of stationary means: the between-subjects network. We discuss the interpretation of these models and propose estimation methods to obtain these networks, which we implement in the R packages graphicalVAR and mlVAR. The methods are showcased in two empirical examples, and simulation studies on these methods are included in the supplementary materials.
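The GGM's edges are partial correlations, which can be read off the inverse covariance (precision) matrix K: the edge between variables i and j is -K_ij / sqrt(K_ii * K_jj). Below is a minimal sketch in plain Python, with a covariance matrix constructed (by hand, for illustration) so that the variables form a chain X - Y - Z: the marginal X-Z correlation is nonzero, but the partial one vanishes, so the GGM has no X-Z edge.

```python
def invert(m):
    """Gauss-Jordan inverse of a small square matrix (lists of lists)."""
    n = len(m)
    a = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(m)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        p = a[col][col]
        a[col] = [v / p for v in a[col]]
        for r in range(n):
            if r != col:
                f = a[r][col]
                a[r] = [v - f * pv for v, pv in zip(a[r], a[col])]
    return [row[n:] for row in a]

def partial_correlations(cov):
    """Edge weights of the Gaussian graphical model: standardize the
    precision matrix, flipping the sign of the off-diagonal entries."""
    k = invert(cov)
    n = len(k)
    return [[1.0 if i == j else -k[i][j] / (k[i][i] * k[j][j]) ** 0.5
             for j in range(n)] for i in range(n)]

# Correlations consistent with a chain X - Y - Z (r_xz = r_xy * r_yz),
# so X and Z are conditionally independent given Y.
cov = [[1.00, 0.80, 0.64],
       [0.80, 1.00, 0.80],
       [0.64, 0.80, 1.00]]
pcor = partial_correlations(cov)
```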
Track-stitching using graphical models and message passing
CSIR Research Space (South Africa)
Van der Merwe, LJ
2013-07-01
In order to stitch tracks together, two tasks are required, namely tracking and track stitching. In this study track stitching is performed using a graphical model and message passing (belief propagation) approach. Tracks are modelled as nodes in a...
Efficient probabilistic model checking on general purpose graphic processors
Bosnacki, D.; Edelkamp, S.; Sulewski, D.; Pasareanu, C.S.
2009-01-01
We present algorithms for parallel probabilistic model checking on general purpose graphic processing units (GPGPUs). For this purpose we exploit the fact that some of the basic algorithms for probabilistic model checking rely on matrix vector multiplication. Since this kind of linear algebraic
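The matrix-vector core the abstract refers to can be sketched on a toy example: computing reachability probabilities in a discrete-time Markov chain, a basic task in probabilistic model checking. Each sweep of the fixed-point iteration below is one sparse matrix-vector product, which is what a GPGPU implementation parallelizes. The gambler's-ruin chain is an invented example, not from the paper.

```python
def reach_probability(trans, target, start, tol=1e-12):
    """Probability of eventually reaching `target` in a finite Markov chain.
    The fixed-point iteration x <- A.x (with the target pinned to 1) is the
    standard scheme; each sweep is a sparse matrix-vector product."""
    states = list(trans)
    x = {s: (1.0 if s == target else 0.0) for s in states}
    while True:
        new = {s: (1.0 if s == target else
                   sum(p * x[t] for t, p in trans[s].items()))
               for s in states}
        if max(abs(new[s] - x[s]) for s in states) < tol:
            return new[start]
        x = new

# Gambler's ruin on states 0..3 with a fair coin; 0 and 3 are absorbing.
chain = {
    0: {0: 1.0},
    1: {0: 0.5, 2: 0.5},
    2: {1: 0.5, 3: 0.5},
    3: {3: 1.0},
}
p_win = reach_probability(chain, target=3, start=1)  # exact answer: 1/3
```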
Adaptive Inference on General Graphical Models
Acar, Umut A.; Ihler, Alexander T.; Mettu, Ramgopal; Sumer, Ozgur
2012-01-01
Many algorithms and applications involve repeatedly solving variations of the same inference problem; for example we may want to introduce new evidence to the model or perform updates to conditional dependencies. The goal of adaptive inference is to take advantage of what is preserved in the model and perform inference more rapidly than from scratch. In this paper, we describe techniques for adaptive inference on general graphs that support marginal computation and updates to the conditional ...
Engineering graphic modelling a workbook for design engineers
Tjalve, E; Frackmann Schmidt, F
2013-01-01
Engineering Graphic Modelling: A Practical Guide to Drawing and Design covers how engineering drawing relates to the design activity. The book describes modeled properties, such as the function, structure, form, material, dimension, and surface, as well as the coordinates, symbols, and types of projection of the drawing code. The text provides drawing techniques, such as freehand sketching, bold freehand drawing, drawing with a straightedge, a draughting machine or a plotter, and use of templates, and then describes the types of drawing. Graphic designers, design engineers, mechanical engine
MAGIC: Model and Graphic Information Converter
Herbert, W. C.
2009-01-01
MAGIC is a software tool capable of converting highly detailed 3D models from an open, standard format, VRML 2.0/97, into the proprietary DTS file format used by the Torque Game Engine from GarageGames. MAGIC is used to convert 3D simulations from authoritative sources into the data needed to run the simulations in NASA's Distributed Observer Network. The Distributed Observer Network (DON) is a simulation presentation tool built by NASA to facilitate the simulation sharing requirements of the Data Presentation and Visualization effort within the Constellation Program. DON is built on top of the Torque Game Engine (TGE) and has chosen TGE's Dynamix Three Space (DTS) file format to represent 3D objects within simulations.
Graphical models for inference under outcome-dependent sampling
DEFF Research Database (Denmark)
Didelez, V; Kreiner, S; Keiding, N
2010-01-01
We consider situations where data have been collected such that the sampling depends on the outcome of interest and possibly on further covariates, as for instance in case-control studies. Graphical models represent assumptions about the conditional independencies among the variables. By including a node for the sampling indicator, assumptions about the sampling process can be made explicit. We demonstrate how to read off such graphs whether consistent estimation of the association between exposure and outcome is possible. Moreover, we give sufficient graphical conditions for testing and estimating…
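A concrete instance of why outcome-dependent sampling needs care: under case-control sampling the odds ratio of exposure and outcome is preserved, while the risk ratio is not. The counts below are invented for illustration and are not from the paper.

```python
def odds_ratio(table):
    """Odds ratio of a 2x2 table [[a, b], [c, d]],
    rows = exposed/unexposed, columns = case/control."""
    (a, b), (c, d) = table
    return (a * d) / (b * c)

def risk_ratio(table):
    """Risk of being a case among the exposed vs the unexposed."""
    (a, b), (c, d) = table
    return (a / (a + b)) / (c / (c + d))

# Full population: 40/1000 exposed cases, 10/1000 unexposed cases.
population = [[40, 960], [10, 990]]

# Case-control sampling: keep every case, sample 10% of controls.
# Both cells of the control column shrink by the same factor, so the
# cross-product (odds) ratio is unchanged.
sampled = [[40, 96], [10, 99]]
```

Scaling a whole column of the 2x2 table multiplies both odds by the same factor, which cancels in the odds ratio; the risk ratio mixes columns and is distorted (4.0 in the population versus about 3.2 in the sample).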
Interactive computer graphics for bio-stereochemical modelling
Indian Academy of Sciences (India)
Proc. Indian Acad. Sci., Vol. 87 A (Chem. Sci.), No. 4, April 1978, pp. 95-113. Printed in India. Interactive computer graphics for bio-stereochemical modelling. Robert Rein, Shlomo Nir, Karen Haydock and Robert D. MacElroy. Department of Experimental Pathology, Roswell Park Memorial Institute, 666 Elm ...
Methods for teaching geometric modelling and computer graphics
Energy Technology Data Exchange (ETDEWEB)
Rotkov, S.I.; Faitel'son, Yu. Ts.
1992-05-01
This paper considers methods for teaching the methods and algorithms of geometric modelling and computer graphics to programmers, designers and users of CAD and computer-aided research systems. There is a bibliography that can be used to prepare lectures and practical classes. 37 refs., 1 tab.
Renormalization group approach to causal bulk viscous cosmological models
International Nuclear Information System (INIS)
Belinchon, J A; Harko, T; Mak, M K
2002-01-01
The renormalization group method is applied to the study of homogeneous and flat Friedmann-Robertson-Walker type universes, filled with a causal bulk viscous cosmological fluid. The starting point of the study is the consideration of the scaling properties of the gravitational field equations, the causal evolution equation of the bulk viscous pressure and the equations of state. The requirement of scale invariance imposes strong constraints on the temporal evolution of the bulk viscosity coefficient, temperature and relaxation time, thus leading to the possibility of obtaining the bulk viscosity coefficient-energy density dependence. For a cosmological model with bulk viscosity coefficient proportional to the Hubble parameter, we perform the analysis of the renormalization group flow around the scale-invariant fixed point, thereby obtaining the long-time behaviour of the scale factor.
Analysis of Local Dependence and Multidimensionality in Graphical Loglinear Rasch Models
DEFF Research Database (Denmark)
Kreiner, Svend; Christensen, Karl Bang
2004-01-01
Local independence; Multidimensionality; Differential item functioning; Uniform local dependence and DIF; Graphical Rasch models; Loglinear Rasch model
Reasoning with probabilistic and deterministic graphical models exact algorithms
Dechter, Rina
2013-01-01
Graphical models (e.g., Bayesian and constraint networks, influence diagrams, and Markov decision processes) have become a central paradigm for knowledge representation and reasoning in both artificial intelligence and computer science in general. These models are used to perform many reasoning tasks, such as scheduling, planning and learning, diagnosis and prediction, design, hardware and software verification, and bioinformatics. These problems can be stated as the formal tasks of constraint satisfaction and satisfiability, combinatorial optimization, and probabilistic inference. It is well
Type-2 fuzzy graphical models for pattern recognition
Zeng, Jia
2015-01-01
This book discusses how to combine type-2 fuzzy sets and graphical models to solve a range of real-world pattern recognition problems such as speech recognition, handwritten Chinese character recognition, topic modeling as well as human action recognition. It covers these recent developments while also providing a comprehensive introduction to the fields of type-2 fuzzy sets and graphical models. Though primarily intended for graduate students, researchers and practitioners in fuzzy logic and pattern recognition, the book can also serve as a valuable reference work for researchers without any previous knowledge of these fields. Dr. Jia Zeng is a Professor at the School of Computer Science and Technology, Soochow University, China. Dr. Zhi-Qiang Liu is a Professor at the School of Creative Media, City University of Hong Kong, China.
On a Graphical Technique for Evaluating Some Rational Expectations Models
DEFF Research Database (Denmark)
Johansen, Søren; Swensen, Anders R.
2011-01-01
Campbell and Shiller (1987) proposed a graphical technique for the present value model, which consists of plotting estimates of the spread and the theoretical spread as calculated from the cointegrated vector autoregressive model without imposing the restrictions implied by the present value model. In addition to getting a visual impression of the fit of the model, the purpose is to see if the two spreads are nevertheless similar as measured by correlation, variance ratio, and noise ratio. We extend these techniques to a number of rational expectations models and give a general definition of spread…
A hierarchical causal modeling for large industrial plants supervision
International Nuclear Information System (INIS)
Dziopa, P.; Leyval, L.
1994-01-01
A supervision system has to analyse the current state of the process and the way it will evolve after a modification of the inputs or a disturbance. It is proposed to base this analysis on a hierarchy of models, which differ in the number of variables involved and in the abstraction level used to describe their temporal evolution. In a first step, special attention is paid to building the causal models, starting from the most abstract one. Once the hierarchy of models has been built, the parameters of the most detailed model are estimated. Several models of different abstraction levels can be used for on-line prediction. These methods have been applied to a nuclear reprocessing plant. The abstraction level can be chosen on line by the operator. Moreover, when an abnormal process behaviour is detected, a more detailed model is automatically triggered in order to focus the operator's attention on the suspected subsystem. (authors). 11 refs., 11 figs
Design of Graphic Aggregation Model for Evaluation of Energy Systems
International Nuclear Information System (INIS)
An, Sang Ha; Jeong, Yong Hoon; Chang, Won Joon; Chang, Soon Heung; Kim, Sung Ho; Kim, Tae Woon
2006-01-01
Korea meets its growing electric power needs with a mix of nuclear, fossil, hydro and other energy sources. But fossil energy cannot be relied on forever, and public concern about the environment has changed. So it is time to plan the future energy mix considering multiple parameters such as economics, environment, social acceptance, and energy security. Multi-criteria aggregation models have been used in decision-making processes in which multiple variables must be considered, such as planning the energy mix. In this context, we designed the Graphic Aggregation Model for Evaluation of energy systems (GAME) for the dynamic analysis of decisions on energy systems. It can support Analytic Hierarchy Process (AHP) analysis based on a graphical user interface.
GENI: A graphical environment for model-based control
International Nuclear Information System (INIS)
Kleban, S.; Lee, M.; Zambre, Y.
1989-10-01
A new method to operate machine and beam simulation programs for accelerator control has been developed. Existing methods, although cumbersome, have been used in control systems for the commissioning and operation of many machines. We developed GENI, a generalized graphical interface to these programs for model-based control. This "object-oriented"-like environment is described and some typical applications are presented. 4 refs., 5 figs.
Graphical Gaussian models with edge and vertex symmetries
DEFF Research Database (Denmark)
Højsgaard, Søren; Lauritzen, Steffen L
2008-01-01
We introduce new types of graphical Gaussian models by placing symmetry restrictions on the concentration or correlation matrix. The models can be represented by coloured graphs, where parameters associated with edges or vertices of the same colour are restricted to being identical. We study the properties of such models and derive the necessary algorithms for calculating maximum likelihood estimates. We identify conditions under which restrictions on the concentration and correlation matrices are equivalent. This is for example the case when symmetries are generated by permutation…
A Causal Model of Consumer-Based Brand Equity
Directory of Open Access Journals (Sweden)
Szőcs Attila
2015-12-01
Branding literature suggests that consumer-based brand equity (CBBE) is a multidimensional construct. Starting from this approach and developing a conceptual multidimensional model, this study finds that CBBE is best modelled with a two-dimensional structure, and claims that it achieves this result by choosing the theoretically based causal specification. With a reflective specification, on the contrary, one will be able to fit almost any valid construct because of the halo effect and common method bias. In the final model, Trust (in quality) and Advantage cause the second-order Brand Equity. The two-dimensional brand equity model is intuitive, easy to interpret and easy to measure, and may thus be much more attractive to management as well.
Fritzson, Peter; Gunnarsson, Johan; Jirstrand, Mats
2002-01-01
MathModelica is an integrated interactive development environment for advanced system modeling and simulation. The environment integrates Modelica-based modeling and simulation with graphic design, advanced scripting facilities, integration of program code, test cases, graphics, documentation, mathematical type setting, and symbolic formula manipulation provided via Mathematica. The user interface consists of a graphical Model Editor and Notebooks. The Model Editor is a graphical user interfa...
Causal Inference and Model Selection in Complex Settings
Zhao, Shandong
Propensity score methods have become part of the standard toolkit for applied researchers who wish to ascertain causal effects from observational data. While they were originally developed for binary treatments, several researchers have proposed generalizations of the propensity score methodology for non-binary treatment regimes. In this article, we first review three main methods that generalize propensity scores in this direction, namely, inverse propensity weighting (IPW), the propensity function (P-FUNCTION), and the generalized propensity score (GPS), along with recent extensions of the GPS that aim to improve its robustness. We compare the assumptions, theoretical properties, and empirical performance of these methods. We propose three new methods that provide robust causal estimation based on the P-FUNCTION and GPS. While our proposed P-FUNCTION-based estimator performs well, we generally advise caution in that all available methods can be biased by model misspecification and extrapolation. In a related line of research, we consider adjustment for posttreatment covariates in causal inference. Even in a randomized experiment, observations might have different compliance performance under treatment and control assignment. This posttreatment covariate cannot be adjusted for using standard statistical methods. We review the principal stratification framework, which allows for modeling this effect as part of its Bayesian hierarchical models. We generalize the current model to add the possibility of adjusting for pretreatment covariates. We also propose a new estimator of the average treatment effect over the entire population. In a third line of research, we discuss the spectral line detection problem in high energy astrophysics. We carefully review how this problem can be statistically formulated as a precise hypothesis test with a point null hypothesis, why a usual likelihood ratio test does not apply to a problem of this nature, and a doable fix to correctly…
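The inverse propensity weighting (IPW) estimator reviewed in the abstract is short enough to sketch. Below is the Hajek (normalized-weights) variant for a binary treatment with known propensity scores; the four data points are invented, and no claim is made that this matches the article's own estimators.

```python
def ipw_ate(data):
    """Hajek (normalized) inverse-propensity-weighted estimate of the
    average treatment effect. `data` holds (treated, outcome, propensity)
    triples; propensities are assumed known, as in a designed experiment."""
    tw = [(y, 1.0 / e) for t, y, e in data if t == 1]
    cw = [(y, 1.0 / (1.0 - e)) for t, y, e in data if t == 0]
    mean_t = sum(y * w for y, w in tw) / sum(w for _, w in tw)
    mean_c = sum(y * w for y, w in cw) / sum(w for _, w in cw)
    return mean_t - mean_c

# Tiny invented data set: (treated, outcome, propensity score).
toy = [
    (1, 3.0, 0.50),
    (0, 1.0, 0.50),
    (1, 5.0, 0.75),
    (0, 2.0, 0.25),
]
```

Hand-checking: treated weighted mean (3*2 + 5*4/3)/(2 + 4/3) = 3.8, control weighted mean (1*2 + 2*4/3)/(2 + 4/3) = 1.4, so the estimate is 2.4.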
Exploring causal networks of bovine milk fatty acids in a multivariate mixed model context
DEFF Research Database (Denmark)
Bouwman, Aniek C; Valente, Bruno D; Janss, Luc L G
2014-01-01
Knowledge regarding causal relationships among traits is important for understanding complex biological systems. Structural equation models (SEM) can be used to quantify the causal relations between traits, allowing prediction of the outcomes of interventions applied to such a network. Such models are fitted conditionally on a causal structure among traits, represented by a directed acyclic graph, and an Inductive Causation (IC) algorithm can be used to search for causal structures. The aim of this study was to explore the space of causal structures involving bovine milk fatty acids and to select…
Graphics-based nuclear facility modeling and management
International Nuclear Information System (INIS)
Rod, S.R.
1991-07-01
Nuclear waste management facilities are characterized by their complexity, many unprecedented features, and numerous competing design requirements. This paper describes the development of comprehensive descriptive databases and three-dimensional models of nuclear waste management facilities and applies the database/model to an example facility. The important features of the facility database/model are its abilities to (1) process large volumes of site data, plant data, and nuclear material inventory data in an efficient, integrated manner; (2) produce many different representations of the data to fulfill information needs as they arise; (3) create a complete three-dimensional solid model of the plant with all related information readily accessible; and (4) support complete, consistent inventory control and plant configuration control. While the substantive heart of the system is the database, graphic visualization of the data vastly improves the clarity of the information presented. Graphic representations are a convenient framework for the presentation of plant and inventory data, allowing all types of information to be readily located and presented in a manner that is easily understood. 2 refs., 5 figs., 1 tab
A developmental approach to learning causal models for cyber security
Mugan, Jonathan
2013-05-01
To keep pace with our adversaries, we must expand the scope of machine learning and reasoning to address the breadth of possible attacks. One approach is to employ an algorithm to learn a set of causal models that describes the entire cyber network and each host end node. Such a learning algorithm would run continuously on the system and monitor activity in real time. With a set of causal models, the algorithm could anticipate novel attacks, take actions to thwart them, and predict their second-order effects. Such a system would generate a flood of information, and the algorithm would have to determine which streams of that flood were relevant in which situations. This paper presents the results of efforts toward the application of a developmental learning algorithm to the problem of cyber security. The algorithm is modeled on the principles of human developmental learning and is designed to allow an agent to learn about the computer system in which it resides through active exploration. Children are flexible learners who acquire knowledge by actively exploring their environment and making predictions about what they will find [1, 2], and our algorithm is inspired by the work of the developmental psychologist Jean Piaget [3]. Piaget described how children construct knowledge in stages and learn new concepts on top of those they already know. Developmental learning allows our algorithm to focus on the subsets of the environment that are most helpful for learning given its current knowledge. In experiments, the algorithm was able to learn the conditions for file exfiltration and use that knowledge to protect sensitive files.
Explaining quantum correlations through evolution of causal models
Harper, Robin; Chapman, Robert J.; Ferrie, Christopher; Granade, Christopher; Kueng, Richard; Naoumenko, Daniel; Flammia, Steven T.; Peruzzo, Alberto
2017-04-01
We propose a framework for the systematic and quantitative generalization of Bell's theorem using causal networks. We first consider the multiobjective optimization problem of matching observed data while minimizing the causal effect of nonlocal variables, and prove an inequality for the optimal region that both strengthens and generalizes Bell's theorem. To solve the optimization problem (rather than simply bound it), we develop a genetic algorithm that treats causal networks as individuals. By applying our algorithm to a photonic Bell experiment, we demonstrate the trade-off between the quantitative relaxation of one or more local causality assumptions and the ability of data to match quantum correlations.
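The evolutionary search described above can be sketched in miniature. The following is a hedged toy, not the authors' implementation: individuals are bit vectors standing in for causal-network structures, and the fitness function (here, agreement with a fixed hypothetical target) stands in for the match between a candidate network and observed data.

```python
import random

# Toy genetic algorithm over "causal networks" encoded as bit vectors of
# possible directed edges. TARGET is an invented best-fitting structure;
# in the real setting, fitness would score agreement with experimental data.
random.seed(0)

N_EDGES = 6                      # e.g. directed edges among a few nodes
TARGET = [1, 0, 1, 0, 0, 1]      # hypothetical best-fitting structure

def fitness(ind):
    # Stand-in for data match: agreement with the target structure.
    return sum(a == b for a, b in zip(ind, TARGET))

def mutate(ind, rate=0.1):
    return [1 - b if random.random() < rate else b for b in ind]

def crossover(a, b):
    cut = random.randrange(1, N_EDGES)
    return a[:cut] + b[cut:]

pop = [[random.randint(0, 1) for _ in range(N_EDGES)] for _ in range(20)]
for _ in range(50):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]               # elitist selection of the top half
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(10)]

best = max(pop, key=fitness)
print(fitness(best))
```

With elitism and per-bit mutation, the population converges toward the target structure after a few dozen generations on this tiny search space.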
Grms or graphical representation of model spaces. Vol. I Basics
International Nuclear Information System (INIS)
Duch, W.
1986-01-01
This book presents a novel approach to the many-body problem in quantum chemistry, nuclear shell theory, and solid-state theory. Many-particle model spaces are visualized using graphs, with each path of a graph labeling a single basis function or a subspace of functions. Spaces of very high dimension are represented by small graphs. Model spaces have a structure that is reflected in the architecture of the corresponding graphs, which in turn is reflected in the structure of the matrices corresponding to operators acting in these spaces. Insight into this structure leads to the formulation of very efficient computer algorithms. Calculation of matrix elements is reduced to the comparison of paths in a graph, without ever looking at the functions themselves. Using only very rudimentary mathematical tools, graphical rules for matrix element calculation in abelian cases are derived; in particular, the segmentation rules obtained in the unitary group approach are rederived. The graphs are solutions of Diophantine equations of the type appearing in different branches of applied mathematics. Graphical representation of model spaces should find as many applications as diagrammatic methods have found in perturbation theory.
Implementing the lattice Boltzmann model on commodity graphics hardware
International Nuclear Information System (INIS)
Kaufman, Arie; Fan, Zhe; Petkov, Kaloian
2009-01-01
Modern graphics processing units (GPUs) can perform general-purpose computations in addition to the native specialized graphics operations. Due to the highly parallel nature of graphics processing, the GPU has evolved into a many-core coprocessor that supports high data parallelism. Its performance has been growing at a rate of squared Moore's law, and its peak floating point performance exceeds that of the CPU by an order of magnitude. Therefore, it is a viable platform for time-sensitive and computationally intensive applications. The lattice Boltzmann model (LBM) computations are carried out via linear operations at discrete lattice sites, which can be implemented efficiently using a GPU-based architecture. Our simulations produce results comparable to the CPU version while improving performance by an order of magnitude. We have demonstrated that the GPU is well suited for interactive simulations in many applications, including simulating fire, smoke, lightweight objects in wind, jellyfish swimming in water, and heat shimmering and mirage (using the hybrid thermal LBM). We further advocate the use of a GPU cluster for large scale LBM simulations and for high performance computing. The Stony Brook Visual Computing Cluster has been the platform for several applications, including simulations of real-time plume dispersion in complex urban environments and thermal fluid dynamics in a pressurized water reactor. Major GPU vendors have been targeting the high performance computing market with GPU hardware implementations. Software toolkits such as NVIDIA CUDA provide a convenient development platform that abstracts the GPU and allows access to its underlying stream computing architecture. However, software programming for a GPU cluster remains a challenging task. We have therefore developed the Zippy framework to simplify GPU cluster programming. Zippy is based on global arrays combined with the stream programming model and it hides the low-level details of the
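The core LBM update that maps so well onto GPU hardware can be illustrated on the CPU. The sketch below is a generic D2Q9 lattice Boltzmann step (streaming plus BGK collision) in NumPy, whose array operations stand in for the per-site data parallelism a GPU would provide; the grid size and relaxation time tau are illustrative, not taken from the paper.

```python
import numpy as np

# Generic D2Q9 lattice Boltzmann step: streaming + BGK collision.
# On a GPU each lattice site would map to one thread; NumPy's
# whole-array operations emulate that data parallelism here.
nx, ny, tau = 32, 32, 0.6
# D2Q9 discrete velocity set and quadrature weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

f = np.ones((9, nx, ny)) * w[:, None, None]                # fluid at rest
f[1] += 0.01 * np.random.default_rng(0).random((nx, ny))   # small perturbation

def step(f):
    # Streaming: shift each population along its lattice velocity
    # (periodic boundaries via np.roll).
    f = np.array([np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
                  for i in range(9)])
    # Macroscopic density and velocity at every site
    rho = f.sum(axis=0)
    u = np.einsum('iab,ij->jab', f, c) / rho
    # BGK collision: relax toward the second-order equilibrium distribution
    cu = np.einsum('ij,jab->iab', c, u)
    usq = (u**2).sum(axis=0)
    feq = w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)
    return f - (f - feq) / tau

mass0 = f.sum()
for _ in range(10):
    f = step(f)
print(abs(f.sum() - mass0))  # streaming and BGK collision conserve mass
```

Both streaming (a pure shift) and BGK collision (the equilibrium shares the same density moment) conserve total mass, which makes conservation a convenient sanity check for any GPU port.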
GRAPHIC REALIZATION FOUNDATIONS OF LOGIC-SEMANTIC MODELING IN DIDACTICS
Directory of Open Access Journals (Sweden)
V. E. Steinberg
2017-01-01
Full Text Available Introduction. Few works to date have addressed the graphic method of logic-semantic modeling of knowledge. Meanwhile, interest in the method is growing as the visual component of information and educational sources becomes ever more prominent. The present publication is the authors' contribution to the search for new forms and means convenient for the visual and logical perception of training material, its assimilation, and the manipulation and transformation of elements of knowledge. The aim of the research is to justify the graphical implementation of the method of logic-semantic modeling of knowledge presented in natural language (the language of instruction) and to show the possibilities of applying figurative-conceptual models in student teaching. Methodology and research methods. The research methodology rests on activity-regulatory, system-multidimensional, and structural-invariant approaches and on the principle of multidimensionality. The graphic realization of logic-semantic models in learning technologies is based on didactic design using computer training programs. Results and scientific novelty. The social and anthropological-cultural bases for adapting the method of logic-semantic knowledge modeling to the problems of didactics are established and argued: a coordinate-invariant matrix structure is presented as the basis of logic-semantic models of figurative-conceptual nature, and the possibilities of using such models as multifunctional didactic regulators (support schemes, navigators through the content of the educational material and the learning activities carried out with it, etc.) are shown. The characteristics of the new teaching tools as objects of semiotics and as didactic regulators are considered, and their place and role in the structure of external and internal learning activities are pointed out.
A Probabilistic Graphical Model to Detect Chromosomal Domains
Heermann, Dieter; Hofmann, Andreas; Weber, Eva
To understand the nature of a cell, one needs to understand the structure of its genome. For this purpose, experimental techniques such as Hi-C, which detects chromosomal contacts, are used to probe the three-dimensional genomic structure. These experiments yield topological information, consistently showing a hierarchical subdivision of the genome into self-interacting domains across many organisms. Current methods for detecting these domains using the Hi-C contact matrix, i.e. a doubly stochastic matrix, are mostly based on the assumption that the domains are distinct and thus non-overlapping. To overcome this simplification and to unravel a possible nested domain structure, we developed a probabilistic graphical model that makes no a priori assumptions on the domain structure. Within this approach, the Hi-C contact matrix is analyzed using an Ising-like probabilistic graphical model whose coupling constants are proportional to the entries of the contact matrix. The results show clear boundaries between identified domains and the background. These domain boundaries depend on the coupling constant, so that a single matrix yields several clusters of different sizes, which show the self-interaction of the genome on different scales. This work was supported by a Grant from the International Human Frontier Science Program Organization (RGP0014/2014).
Statistical mechanics of sparse generalization and graphical model selection
International Nuclear Information System (INIS)
Lage-Castellanos, Alejandro; Pagnani, Andrea; Weigt, Martin
2009-01-01
One of the crucial tasks in many inference problems is the extraction of an underlying sparse graphical model from a given number of high-dimensional measurements. In machine learning, this is frequently achieved using, as a penalty term, the Lp norm of the model parameters, with p ≤ 1 for efficient dilution. Here we propose a statistical mechanics analysis of the problem in the setting of perceptron memorization and generalization. Using a replica approach, we are able to evaluate the relative performance of naive dilution (obtained by learning without dilution, followed by applying a threshold to the model parameters), L1 dilution (which is frequently used in convex optimization) and L0 dilution (which is optimal but computationally hard to implement). Whereas both Lp-diluted approaches clearly outperform the naive approach, we find a small region where L0 works almost perfectly and strongly outperforms the simpler-to-implement L1 dilution.
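The dilution schemes compared above can be illustrated on a toy problem. Note the paper analyzes perceptron learning with replica methods; the hedged sketch below instead uses sparse linear regression, with naive dilution as least squares plus thresholding and L1 dilution as proximal gradient descent (ISTA); all numbers are invented.

```python
import numpy as np

# Sparse teacher with only 3 nonzero weights out of 20; the student sees
# noisy linear measurements and must recover the sparse support.
rng = np.random.default_rng(1)
n, d = 200, 20
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.5, 1.0]
X = rng.standard_normal((n, d))
y = X @ w_true + 0.1 * rng.standard_normal(n)

# Naive dilution: learn without a penalty, then apply a threshold.
w_ls = np.linalg.lstsq(X, y, rcond=None)[0]
w_naive = np.where(np.abs(w_ls) > 0.5, w_ls, 0.0)

# L1 dilution: ISTA, i.e. gradient steps followed by soft-thresholding.
lam = 0.1
step = 1.0 / np.linalg.eigvalsh(X.T @ X / n).max()   # safe step size
w_l1 = np.zeros(d)
for _ in range(500):
    w_l1 = w_l1 - step * (X.T @ (X @ w_l1 - y) / n)
    w_l1 = np.sign(w_l1) * np.maximum(np.abs(w_l1) - step * lam, 0.0)

print(np.flatnonzero(w_naive), np.flatnonzero(np.abs(w_l1) > 0.1))
```

On this easy, well-conditioned toy both schemes recover the support {0, 1, 2}; the regimes where they differ sharply are exactly what the replica analysis in the abstract characterizes.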
Ice-sheet modelling accelerated by graphics cards
Brædstrup, Christian Fredborg; Damsgaard, Anders; Egholm, David Lundbek
2014-11-01
Studies of glaciers and ice sheets have increased the demand for high performance numerical ice flow models over the past decades. When exploring the highly non-linear dynamics of fast flowing glaciers and ice streams, or when coupling multiple flow processes for ice, water, and sediment, researchers are often forced to use super-computing clusters. As an alternative to conventional high-performance computing hardware, the Graphical Processing Unit (GPU) is capable of massively parallel computing while retaining a compact design and low cost. In this study, we present a strategy for accelerating a higher-order ice flow model using a GPU. By applying the newest GPU hardware, we achieve up to 180× speedup compared to a similar but serial CPU implementation. Our results suggest that GPU acceleration is a competitive option for ice-flow modelling when compared to CPU-optimised algorithms parallelised by the OpenMP or Message Passing Interface (MPI) protocols.
An Accurate and Dynamic Computer Graphics Muscle Model
Levine, David Asher
1997-01-01
A computer based musculo-skeletal model was developed at the University in the departments of Mechanical and Biomedical Engineering. This model accurately represents human shoulder kinematics. The result of this model is the graphical display of bones moving through an appropriate range of motion based on inputs of EMGs and external forces. The need existed to incorporate a geometric muscle model in the larger musculo-skeletal model. Previous muscle models did not accurately represent muscle geometries, nor did they account for the kinematics of tendons. This thesis covers the creation of a new muscle model for use in the above musculo-skeletal model. This muscle model was based on anatomical data from the Visible Human Project (VHP) cadaver study. Two-dimensional digital images from the VHP were analyzed and reconstructed to recreate the three-dimensional muscle geometries. The recreated geometries were smoothed, reduced, and sliced to form data files defining the surfaces of each muscle. The muscle modeling function opened these files during run-time and recreated the muscle surface. The modeling function applied constant volume limitations to the muscle and constant geometry limitations to the tendons.
Causal Effect Inference with Deep Latent-Variable Models
Louizos, C; Shalit, U.; Mooij, J.; Sontag, D.; Zemel, R.; Welling, M.
2017-01-01
Learning individual-level causal effects from observational data, such as inferring the most effective medication for a specific patient, is a problem of growing importance for policy makers. The most important aspect of inferring causal effects from observational data is the handling of
Cause and Event: Supporting Causal Claims through Logistic Models
O'Connell, Ann A.; Gray, DeLeon L.
2011-01-01
Efforts to identify and support credible causal claims have received intense interest in the research community, particularly over the past few decades. In this paper, we focus on the use of statistical procedures designed to support causal claims for a treatment or intervention when the response variable of interest is dichotomous. We identify…
Situation models and memory: the effects of temporal and causal information on recall sequence.
Brownstein, Aaron L; Read, Stephen J
2007-10-01
Participants watched an episode of the television show Cheers on video and then reported free recall. Recall sequence followed the sequence of events in the story; if one concept was observed immediately after another, it was recalled immediately after it. We also made a causal network of the show's story and found that recall sequence followed causal links; effects were recalled immediately after their causes. Recall sequence was more likely to follow causal links than temporal sequence, and most likely to follow causal links that were temporally sequential. Results were similar at 10-minute and 1-week delayed recall. This is the most direct and detailed evidence reported on sequential effects in recall. The causal network also predicted probability of recall; concepts with more links and concepts on the main causal chain were most likely to be recalled. This extends the causal network model to more complex materials than previous research.
Local fit evaluation of structural equation models using graphical criteria.
Thoemmes, Felix; Rosseel, Yves; Textor, Johannes
2018-03-01
Evaluation of model fit is critically important for every structural equation model (SEM), and sophisticated methods have been developed for this task. Among them are the χ² goodness-of-fit test, decomposition of the χ², derived measures like the popular root mean square error of approximation (RMSEA) or comparative fit index (CFI), or inspection of residuals or modification indices. Many of these methods provide a global approach to model fit evaluation: A single index is computed that quantifies the fit of the entire SEM to the data. In contrast, graphical criteria like d-separation or trek-separation allow derivation of implications that can be used for local fit evaluation, an approach that is hardly ever applied. We provide an overview of local fit evaluation from the viewpoint of SEM practitioners. In the presence of model misfit, local fit evaluation can potentially help in pinpointing where the problem with the model lies. For models that do fit the data, local tests can identify the parts of the model that are corroborated by the data. Local tests can also be conducted before a model is fitted at all, and they can be used even for models that are globally underidentified. We discuss appropriate statistical local tests, and provide applied examples. We also present novel software in R that automates this type of local fit evaluation. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
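One example of a graphically derived local test can be sketched as follows (a generic illustration, not the authors' R software): the chain model X → M → Y implies the d-separation X ⊥ Y | M, so the partial correlation of X and Y given M should be near zero even though X and Y are marginally correlated.

```python
import numpy as np

# Simulate data from the chain X -> M -> Y (coefficients are arbitrary).
rng = np.random.default_rng(2)
n = 5000
X = rng.standard_normal(n)
M = 0.8 * X + rng.standard_normal(n)
Y = 0.8 * M + rng.standard_normal(n)

def pcorr(a, b, given):
    # Partial correlation: correlate the residuals after regressing
    # both variables on the conditioning variable (plus an intercept).
    Z = np.column_stack([np.ones_like(given), given])
    ra = a - Z @ np.linalg.lstsq(Z, a, rcond=None)[0]
    rb = b - Z @ np.linalg.lstsq(Z, b, rcond=None)[0]
    return np.corrcoef(ra, rb)[0, 1]

marginal = np.corrcoef(X, Y)[0, 1]   # clearly nonzero
partial = pcorr(X, Y, M)             # near zero, as the model implies
print(round(marginal, 2), round(partial, 2))
```

A large partial correlation here would falsify this one local implication and point directly at the X-M-Y part of the model, without touching the rest of the SEM.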
Climate Modeling and Causal Identification for Sea Ice Predictability
Energy Technology Data Exchange (ETDEWEB)
Hunke, Elizabeth Clare [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Urrego Blanco, Jorge Rolando [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Urban, Nathan Mark [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2018-02-12
This project aims to better understand the causes of ongoing changes in the Arctic climate system, particularly as decreasing sea ice trends have been observed in recent decades and are expected to continue in the future. As part of the Sea Ice Prediction Network, a multi-agency effort to improve sea ice prediction products on seasonal-to-interannual time scales, our team is studying the sensitivity of sea ice to a collection of physical processes and feedback mechanisms in the coupled climate system. During 2017 we completed a set of climate model simulations using the fully coupled ACME-HiLAT model. The simulations consisted of experiments in which cloud, sea ice, and air-ocean turbulent exchange parameters previously identified as important drivers of output uncertainty in climate models were perturbed to account for parameter uncertainty in simulated climate variables. We conducted a sensitivity study of these parameters, building upon our previous study of standalone simulations (Urrego-Blanco et al., 2016, 2017). Using the results from the ensemble of coupled simulations, we are examining robust relationships between climate variables that emerge across the experiments. We are also using causal discovery techniques to identify interaction pathways among climate variables, which can help identify physical mechanisms and provide guidance in predictability studies. This work further builds on and leverages the large ensemble of standalone sea ice simulations produced in our previous w14_seaice project.
Dynamic Causal Modeling of the Cortical Responses to Wrist Perturbations
Directory of Open Access Journals (Sweden)
Yuan Yang
2017-09-01
Full Text Available Mechanical perturbations applied to the wrist joint typically evoke a stereotypical sequence of cortical and muscle responses. The early cortical responses (<100 ms) are thought to be involved in the "rapid" transcortical reaction to the perturbation, while the late cortical responses (>100 ms) are related to the "slow" transcortical reaction. Although previous studies indicated that both responses involve the primary motor cortex, it remains unclear whether both responses are engaged by the same effective connectivity in the cortical network. To answer this question, we investigated the effective connectivity of the cortical network after a "ramp-and-hold" mechanical perturbation, in both the early (<100 ms) and late (>100 ms) periods, using dynamic causal modeling. Ramp-and-hold perturbations were applied to the wrist joint while the subject maintained an isometric wrist flexion. Cortical activity was recorded using a 128-channel electroencephalogram (EEG). We investigated how the perturbation modulated the effective connectivity in the early and late periods. Bayesian model comparisons suggested that different effective connectivity networks are engaged in these two periods. For the early period, we found that only a few cortico-cortical connections were modulated, while more complicated connectivity was identified in the cortical network during the late period, with multiple modulated cortico-cortical connections. The limited early cortical network likely allows for a rapid muscle response without involving high-level cognitive processes, while the complexity of the late network may facilitate coordinated responses.
Development of virtual hands using animation software and graphical modelling
International Nuclear Information System (INIS)
Oliveira, Erick da S.; Junior, Alberico B. de C.
2016-01-01
Numerical dosimetry uses virtual anthropomorphic simulators to represent the human being in a computational framework and thus assess the risks associated with exposure to a radioactive source. Computer animation software has eased the development of these simulators: with knowledge of human anatomy alone, one can prepare various types of simulators (man, woman, child, and baby) in various positions (sitting, standing, running), or parts thereof (head, trunk, and limbs). These simulators are constructed from manipulable meshes, and owing to the versatility of the method one can create irradiation geometries that were not possible before. In this work, we have built an exposure scenario of a worker manipulating radioactive material in a radiopharmacy, using animation and graphical modelling software together with an anatomical database. (author)
Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…
Development of face recognition: Dynamic causal modelling of MEG data.
He, Wei; Johnson, Blake W
2018-04-01
Electrophysiological studies of adults indicate that brain activity is enhanced during viewing of repeated faces, at a latency of about 250 ms after the onset of the face (M250/N250). The present study aimed to determine if this effect was also present in preschool-aged children, whose brain activity was measured in a custom-sized pediatric MEG system. The results showed that, unlike adults, face repetition did not show any significant modulation of M250 amplitude in children; however children's M250 latencies were significantly faster for repeated than non-repeated faces. Dynamic causal modelling (DCM) of the M250 in both age groups tested the effects of face repetition within the core face network including the occipital face area (OFA), the fusiform face area (FFA), and the superior temporal sulcus (STS). DCM revealed that repetition of identical faces altered both forward and backward connections in children and adults; however the modulations involved inputs to both FFA and OFA in adults but only to OFA in children. These findings suggest that the amplitude-insensitivity of the immature M250 may be due to a weaker connection between the FFA and lower visual areas. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Graphic-based musculoskeletal model for biomechanical analyses and animation.
Chao, Edmund Y S
2003-04-01
The ability to combine physiology and engineering analyses with computer sciences has opened the door to the possibility of creating the 'Virtual Human' reality. This paper presents a broad foundation for a full-featured biomechanical simulator for the human musculoskeletal system physiology. This simulation technology unites the expertise in biomechanical analysis and graphic modeling to investigate joint and connective tissue mechanics at the structural level and to visualize the results in both static and animated forms together with the model. Adaptable anatomical models including prosthetic implants and fracture fixation devices and a robust computational infrastructure for static, kinematic, kinetic, and stress analyses under varying boundary and loading conditions are incorporated on a common platform, the VIMS (Virtual Interactive Musculoskeletal System). Within this software system, a manageable database containing long bone dimensions, connective tissue material properties and a library of skeletal joint system functional activities and loading conditions are also available and they can easily be modified, updated and expanded. Application software is also available to allow end-users to perform biomechanical analyses interactively. This paper details the design, capabilities, and features of the VIMS development at Johns Hopkins University, an effort possible only through academic and commercial collaborations. Examples using these models and the computational algorithms in a virtual laboratory environment are used to demonstrate the utility of this unique database and simulation technology. This integrated system will impact on medical education, basic research, device development and application, and clinical patient care related to musculoskeletal diseases, trauma, and rehabilitation.
The causal pie model: an epidemiological method applied to evolutionary biology and ecology.
Wensink, Maarten; Westendorp, Rudi G J; Baudisch, Annette
2014-05-01
A general concept for thinking about causality facilitates swift comprehension of results, and the vocabulary that belongs to the concept is instrumental in cross-disciplinary communication. The causal pie model has fulfilled this role in epidemiology and could be of similar value in evolutionary biology and ecology. In the causal pie model, outcomes result from sufficient causes. Each sufficient cause is made up of a "causal pie" of "component causes". Several different causal pies may exist for the same outcome. If and only if all component causes of a sufficient cause are present, that is, a causal pie is complete, does the outcome occur. The effect of a component cause hence depends on the presence of the other component causes that constitute some causal pie. Because all component causes are equally and fully causative for the outcome, the sum of causes for some outcome exceeds 100%. The causal pie model provides a way of thinking that maps into a number of recurrent themes in evolutionary biology and ecology: It charts when component causes have an effect and are subject to natural selection, and how component causes affect selection on other component causes; which partitions of outcomes with respect to causes are feasible and useful; and how to view the composition of a(n apparently homogeneous) population. The diversity of specific results that is directly understood from the causal pie model is a test for both the validity and the applicability of the model. The causal pie model provides a common language in which results across disciplines can be communicated and serves as a template along which future causal analyses can be made.
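The "if and only if some pie is complete" logic of the model is simple enough to state in code. The pies below are invented examples; note how component cause B has an effect only when A is also present, exactly the dependence between component causes described above.

```python
# Each sufficient cause is a "pie" (set) of component causes; the outcome
# occurs iff at least one pie is complete. Pie contents are invented.
pies = [{"A", "B"}, {"A", "C", "D"}]   # two sufficient causes, sharing A

def outcome(present):
    # True iff some sufficient cause has all of its components present.
    return any(pie <= present for pie in pies)

print(outcome({"A", "B"}))            # first pie complete
print(outcome({"A", "C"}))            # no pie complete
print(outcome({"B", "C", "D"}))       # A missing from both pies
print(outcome({"A", "C", "D", "E"}))  # second pie complete
```

Because A sits in every pie here, removing A prevents the outcome entirely, while B and {C, D} are each fully causative only within their own pie; this is why the causes of an outcome can "sum to more than 100%".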
Testing a Model of Diabetes Self-Care Management: A Causal Model Analysis with LISREL.
Nowacek, George A.; And Others
1990-01-01
A diabetes-management model is presented, which includes an attitudinal element and depicts relationships among causal elements. LISREL-VI was used to analyze data from 115 Type-I and 105 Type-II patients. The data did not closely fit the model. Results support the importance of the personal meaning of diabetes. (TJH)
Causal reasoning and models of cognitive tasks for naval nuclear power plant operators
International Nuclear Information System (INIS)
Salazar-Ferrer, P.
1995-06-01
In complex industrial process control, causal reasoning appears as a major component of operators' cognitive tasks. It is tightly linked to diagnosis, prediction of normal and failure states, and explanation. This work provides a detailed review of the literature on causal reasoning. A synthesis is proposed in the form of a model of causal reasoning in process control. This model integrates distinct approaches in cognitive science: notably qualitative physics, Bayesian networks, knowledge-based systems, and cognitive psychology. Our model defines a framework for the analysis of causal human errors in simulated naval nuclear power plant fault management. Through the methodological framework of critical incident analysis we define a classification of errors and difficulties linked to causal reasoning, based on shallow characteristics of causal reasoning; these errors are traced back to more elementary component activities of causal reasoning. The applications cover functional specification of man-machine interfaces, operator support system design, and nuclear safety. To complete this study, we integrate the model of causal reasoning into a model of cognitive tasks in process control. (authors). 106 refs., 49 figs., 8 tabs
A Statistical Graphical Model of the California Reservoir System
Taeb, A.; Reager, J. T.; Turmon, M.; Chandrasekaran, V.
2017-11-01
The recent California drought has highlighted the potential vulnerability of the state's water management infrastructure to multiyear dry intervals. Due to the high complexity of the network, dynamic storage changes in California reservoirs on a state-wide scale have previously been difficult to model using either traditional statistical or physical approaches. Indeed, although there is a significant line of research on exploring models for single (or a small number of) reservoirs, these approaches are not amenable to a system-wide modeling of the California reservoir network due to the spatial and hydrological heterogeneities of the system. In this work, we develop a state-wide statistical graphical model to characterize the dependencies among a collection of 55 major California reservoirs across the state; this model is defined with respect to a graph in which the nodes index reservoirs and the edges specify the relationships or dependencies between reservoirs. We obtain and validate this model in a data-driven manner based on reservoir volumes over the period 2003-2016. A key feature of our framework is a quantification of the effects of external phenomena that influence the entire reservoir network. We further characterize the degree to which physical factors (e.g., state-wide Palmer Drought Severity Index (PDSI), average temperature, snow pack) and economic factors (e.g., consumer price index, number of agricultural workers) explain these external influences. As a consequence of this analysis, we obtain a system-wide health diagnosis of the reservoir network as a function of PDSI.
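The edge semantics of such a model can be illustrated with invented data (three synthetic series rather than the 55 reservoirs): in a Gaussian graphical model, an edge between two nodes corresponds to a nonzero partial correlation, read off from the inverse covariance (precision) matrix.

```python
import numpy as np

# Three synthetic "reservoir" series: A drives B, while C is unrelated.
# (The real model also quantifies latent state-wide influences such as
# drought indices; that part is omitted from this sketch.)
rng = np.random.default_rng(3)
n = 20000
A = rng.standard_normal(n)
B = 0.7 * A + rng.standard_normal(n)
C = rng.standard_normal(n)

cov = np.cov(np.stack([A, B, C]))      # 3x3 sample covariance
prec = np.linalg.inv(cov)              # precision matrix
# Normalize off-diagonals to partial correlations:
#   pcor_ij = -prec_ij / sqrt(prec_ii * prec_jj)
d = np.sqrt(np.diag(prec))
pcor = -prec / np.outer(d, d)
print(np.round(pcor, 2))  # sizable A-B entry; near-zero entries for C
```

The estimated graph has a single edge A-B, matching the generating process; a sparse estimator (e.g. the graphical lasso) would enforce exact zeros rather than small values.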
Exploring manual asymmetries during grasping: a dynamic causal modeling approach.
Directory of Open Access Journals (Sweden)
Chiara eBegliomini
2015-02-01
Full Text Available Recordings of neural activity during grasping actions in macaques showed that grasp-related sensorimotor transformations are accomplished in a circuit constituted by the anterior part of the intraparietal sulcus (AIP) and the ventral (F5) and dorsal (F2) regions of the premotor area. In humans, neuroimaging studies have revealed the existence of a similar circuit, involving the putative homologs of macaque areas AIP, F5, and F2. These studies have mainly considered grasping movements performed with the right dominant hand, and only a few studies have measured brain activity associated with movements performed with the left non-dominant hand. As a consequence of this gap, how the brain controls grasping movements performed with the dominant and the non-dominant hand remains an open question. A functional magnetic resonance imaging (fMRI) experiment was conducted, and effective connectivity analysis (dynamic causal modelling, DCM) was used to assess how connectivity among grasping-related areas is modulated by hand (i.e., left or right) during the execution of grasping movements towards a small object requiring a precision grip. Results underlined boosted inter-hemispheric couplings between dorsal premotor cortices during the execution of movements performed with the left rather than the right dominant hand. More specifically, they suggest that the dorsal premotor cortices may play a fundamental role in monitoring the configuration of the fingers when grasping movements are performed by either the right or the left hand. This role becomes particularly evident when the less-skilled hand (i.e., the left hand) is used to perform such actions. The results are discussed in light of recent theories put forward to explain how parieto-frontal connectivity is modulated by the execution of prehensile movements.
Causal Agency Theory: Reconceptualizing a Functional Model of Self-Determination
Shogren, Karrie A.; Wehmeyer, Michael L.; Palmer, Susan B.; Forber-Pratt, Anjali J.; Little, Todd J.; Lopez, Shane
2015-01-01
This paper introduces Causal Agency Theory, an extension of the functional model of self-determination. Causal Agency Theory addresses the need for interventions and assessments pertaining to self-determination for all students and incorporates the significant advances in understanding of disability and in the field of positive psychology since the…
From Ordinary Differential Equations to Structural Causal Models: the deterministic case
Mooij, J.M.; Janzing, D.; Schölkopf, B.; Nicholson, A.; Smyth, P.
2013-01-01
We show how, and under which conditions, the equilibrium states of a first-order Ordinary Differential Equation (ODE) system can be described with a deterministic Structural Causal Model (SCM). Our exposition sheds more light on the concept of causality as expressed within the framework of
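The correspondence described above can be illustrated in the simplest (linear) case: at equilibrium, each variable of the ODE system satisfies a structural equation determined by its row of the coefficient matrix. The two-variable system below is hypothetical and chosen for illustration, not the authors' general construction.

```python
import numpy as np

# Linear first-order ODE system: dx/dt = A @ x + b.  At equilibrium dx/dt = 0,
# so each variable satisfies a structural equation read off its own row:
#   x_i = (b_i + sum_{j != i} A_ij x_j) / (-A_ii)
A = np.array([[-1.0, 0.5],
              [0.0, -2.0]])
b = np.array([1.0, 4.0])

x_eq = np.linalg.solve(-A, b)  # equilibrium state

# SCM-style structural equations evaluated at equilibrium:
x1 = (b[0] + A[0, 1] * x_eq[1]) / -A[0, 0]   # x1 := f1(x2)
x2 = b[1] / -A[1, 1]                          # x2 := f2()  (no parents)
print(x_eq, x1, x2)
```

The zero in `A[1, 0]` is what makes `x2` parentless in the induced structural model: the causal structure of the SCM mirrors the sparsity pattern of the ODE coupling.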
Graphical User Interface for Simulink Integrated Performance Analysis Model
Durham, R. Caitlyn
2009-01-01
The J-2X engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for the Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible, to save time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink model, without opening or altering the contents of the model. The GUI must allow test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink model, and get the output from the Simulink model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI constructs a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values from each test of the simulation so that they may be graphed and compared to other values.
DEFF Research Database (Denmark)
Kuhnert, Barbara; Lindner, Felix; Bentzen, Martin Mose
We introduce causal agency models as a modeling technique for representing and reasoning about ethical dilemmas. We find that ethical dilemmas, although they look similar on the surface, have very different causal structures. Based on their structural properties, as identified by the causal agency models, we cluster a set of dilemmas into Type 1 and Type 2 dilemmas. We observe that for Type 2 dilemmas, but not for Type 1 dilemmas, a utilitarian action dominates the possibility of refraining from action. Hence, we hypothesize, based on the model, that Type 2 dilemmas are perceived as less difficult...
A Bayesian Nonparametric Causal Model for Regression Discontinuity Designs
Karabatsos, George; Walker, Stephen G.
2013-01-01
The regression discontinuity (RD) design (Thistlethwaite & Campbell, 1960; Cook, 2008) provides a framework to identify and estimate causal effects from a non-randomized design. Each subject of an RD design is assigned to the treatment (versus assignment to a non-treatment) whenever her/his observed value of the assignment variable equals or…
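The sharp RD idea summarized above can be sketched in a few lines: treatment is assigned deterministically at a cutoff, and the jump in the outcome at the cutoff estimates the causal effect. This is an illustrative frequentist local-linear simulation with hypothetical numbers, not the Bayesian nonparametric model of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a sharp RD design: treatment whenever the assignment variable
# crosses the cutoff (all values below are hypothetical).
n, cutoff, effect = 2000, 0.0, 1.5
x = rng.uniform(-1, 1, n)              # assignment variable
t = (x >= cutoff).astype(float)        # deterministic assignment rule
y = 2.0 * x + effect * t + rng.normal(0, 0.5, n)

# Local linear fit on each side of the cutoff; the discontinuity in the
# fitted values at the cutoff estimates the effect for subjects near it.
h = 0.5                                # bandwidth (hypothetical choice)
left = (x < cutoff) & (x > cutoff - h)
right = (x >= cutoff) & (x < cutoff + h)
b_left = np.polyfit(x[left], y[left], 1)
b_right = np.polyfit(x[right], y[right], 1)
rd_estimate = np.polyval(b_right, cutoff) - np.polyval(b_left, cutoff)
print(round(rd_estimate, 2))  # close to the simulated effect of 1.5
```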
Causal Indicator Models Have Nothing to Do with Measurement
Howell, Roy D.; Breivik, Einar
2016-01-01
In this article, Roy Howell and Einar Breivik congratulate Aguirre-Urreta, M. I., Rönkkö, M., and Marakas, G. M. for their 2016 article "Omission of Causal Indicators: Consequences and Implications for Measurement," Measurement: Interdisciplinary Research and Perspectives, 14(3), 75-97. doi:10.1080/15366367.2016.1205935. They call it…
Revisiting Aristotle's causality: model for development in Nigeria ...
African Journals Online (AJOL)
Thus, the development equation must be balanced. Aristotle's theory of causality balances the equation. Since Aristotle has no theory of development, every individual, nation or industry in pursuit of development should seek to drive economic growth and human capital development together rather than focus on ...
Deadlock Detection Based on Automatic Code Generation from Graphical CSP Models
Jovanovic, D.S.; Liet, Geert K.; Broenink, Johannes F.; Karelse, F.
2004-01-01
The paper describes a way of using standard formal analysis tools for checking deadlock freedom in graphical models for CSP descriptions of concurrent systems. The models capture specification of a possible concurrent implementation of a system to be realized. Building the graphical models and
JACK - ANTHROPOMETRIC MODELING SYSTEM FOR SILICON GRAPHICS WORKSTATIONS
Smith, B.
1994-01-01
JACK is an interactive graphics program developed at the University of Pennsylvania that displays and manipulates articulated geometric figures. JACK is typically used to observe how a human mannequin interacts with its environment and what effects body types will have upon the performance of a task in a simulated environment. Any environment can be created, and any number of mannequins can be placed anywhere in that environment. JACK includes facilities to construct limited geometric objects, position figures, perform a variety of analyses on the figures, describe the motion of the figures and specify lighting and surface property information for rendering high quality images. JACK is supplied with a variety of body types pre-defined and known to the system. There are both male and female bodies, ranging from the 5th to the 95th percentile, based on NASA Standard 3000. Each mannequin is fully articulated and reflects the joint limitations of a normal human. JACK is an editor for manipulating previously defined objects known as "Peabody" objects. Used to describe the figures as well as the internal data structure for representing them, Peabody is a language with a powerful and flexible mechanism for representing connectivity between objects, both the joints between individual segments within a figure and arbitrary connections between different figures. Peabody objects are generally comprised of several individual figures, each one a collection of segments. Each segment has a geometry represented by PSURF files that consist of polygons or curved surface patches. Although JACK does not have the capability to create new objects, objects may be created by other geometric modeling programs and then translated into the PSURF format. Environment files are a collection of figures and attributes that may be dynamically moved under the control of an animation file. The animation facilities allow the user to create a sequence of commands that duplicate the movements of a
A Gaussian graphical model approach to climate networks
Energy Technology Data Exchange (ETDEWEB)
Zerenner, Tanja, E-mail: tanjaz@uni-bonn.de [Meteorological Institute, University of Bonn, Auf dem Hügel 20, 53121 Bonn (Germany); Friederichs, Petra; Hense, Andreas [Meteorological Institute, University of Bonn, Auf dem Hügel 20, 53121 Bonn (Germany); Interdisciplinary Center for Complex Systems, University of Bonn, Brühler Straße 7, 53119 Bonn (Germany); Lehnertz, Klaus [Department of Epileptology, University of Bonn, Sigmund-Freud-Straße 25, 53105 Bonn (Germany); Helmholtz Institute for Radiation and Nuclear Physics, University of Bonn, Nussallee 14-16, 53115 Bonn (Germany); Interdisciplinary Center for Complex Systems, University of Bonn, Brühler Straße 7, 53119 Bonn (Germany)
2014-06-15
Distinguishing between direct and indirect connections is essential when interpreting network structures in terms of dynamical interactions and stability. When constructing networks from climate data the nodes are usually defined on a spatial grid. The edges are usually derived from a bivariate dependency measure, such as Pearson correlation coefficients or mutual information. Thus, the edges indistinguishably represent direct and indirect dependencies. Interpreting climate data fields as realizations of Gaussian Random Fields (GRFs), we have constructed networks according to the Gaussian Graphical Model (GGM) approach. In contrast to the widely used method, the edges of GGM networks are based on partial correlations denoting direct dependencies. Furthermore, GRFs can be represented not only on points in space, but also by expansion coefficients of orthogonal basis functions, such as spherical harmonics. This leads to a modified definition of network nodes and edges in spectral space, which is motivated from an atmospheric dynamics perspective. We construct and analyze networks from climate data in grid point space as well as in spectral space, and derive the edges from both Pearson and partial correlations. Network characteristics, such as mean degree, average shortest path length, and clustering coefficient, reveal that the networks possess an ordered and strongly locally interconnected structure rather than small-world properties. Despite this, the network structures differ strongly depending on the construction method. Straightforward approaches to infer networks from climate data while not regarding any physical processes may contain too strong simplifications to describe the dynamics of the climate system appropriately.
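The contrast between Pearson and partial correlations that drives the GGM construction can be sketched with a toy chain in place of climate fields: two variables linked only through a third are marginally correlated, but their partial correlation (read off the precision matrix) vanishes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy chain X1 -> X2 -> X3: X1 and X3 are marginally correlated (an
# indirect dependency) but conditionally independent given X2.
n = 50000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(size=n)
x3 = 0.8 * x2 + rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

# Pearson correlations mix direct and indirect edges ...
corr = np.corrcoef(X, rowvar=False)

# ... while partial correlations, computed from the precision matrix
# K = Sigma^{-1} via rho_ij = -K_ij / sqrt(K_ii K_jj), keep direct edges only.
K = np.linalg.inv(np.cov(X, rowvar=False))
d = np.sqrt(np.diag(K))
partial = -K / np.outer(d, d)

print(round(corr[0, 2], 2), round(partial[0, 2], 2))
```

The X1-X3 Pearson correlation is clearly nonzero, while the corresponding partial correlation is statistically indistinguishable from zero: in a GGM network, no edge would be drawn between X1 and X3.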
Gaussian graphical modeling reveals specific lipid correlations in glioblastoma cells
Mueller, Nikola S.; Krumsiek, Jan; Theis, Fabian J.; Böhm, Christian; Meyer-Bäse, Anke
2011-06-01
Advances in high-throughput measurements of biological specimens necessitate the development of biologically driven computational techniques. To understand the molecular level of many human diseases, such as cancer, lipid quantifications have been shown to offer an excellent opportunity to reveal disease-specific regulations. The data analysis of the cell lipidome, however, remains a challenging task and cannot be accomplished solely based on intuitive reasoning. We have developed a method to identify a lipid correlation network which is entirely disease-specific. A powerful method to correlate experimentally measured lipid levels across the various samples is a Gaussian Graphical Model (GGM), which is based on partial correlation coefficients. In contrast to regular Pearson correlations, partial correlations aim to identify only direct correlations while eliminating indirect associations. Conventional GGM calculations on the entire dataset cannot, however, provide information on whether a correlation is truly disease-specific with respect to the disease samples and not a correlation of control samples. Thus, we implemented a novel differential GGM approach unraveling only the disease-specific correlations, and applied it to the lipidome of immortal Glioblastoma tumor cells. A large set of lipid species was measured by mass spectrometry in order to evaluate lipid remodeling in response to a combination of perturbations inducing programmed cell death, while the other perturbations served solely as biological controls. With the differential GGM, we were able to reveal Glioblastoma-specific lipid correlations to advance biomedical research on novel gene therapies.
Boué, Stéphanie; Talikka, Marja; Westra, Jurjen Willem; Hayes, William; Di Fabio, Anselmo; Park, Jennifer; Schlage, Walter K; Sewer, Alain; Fields, Brett; Ansari, Sam; Martin, Florian; Veljkovic, Emilija; Kenney, Renee; Peitsch, Manuel C; Hoeng, Julia
2015-01-01
With the wealth of publications and data available, powerful and transparent computational approaches are required to represent measured data and scientific knowledge in a computable and searchable format. We developed a set of biological network models, scripted in the Biological Expression Language, that reflect causal signaling pathways across a wide range of biological processes, including cell fate, cell stress, cell proliferation, inflammation, tissue repair and angiogenesis in the pulmonary and cardiovascular context. This comprehensive collection of networks is now freely available to the scientific community in a centralized web-based repository, the Causal Biological Network database, which is composed of over 120 manually curated and well annotated biological network models and can be accessed at http://causalbionet.com. The website accesses a MongoDB, which stores all versions of the networks as JSON objects and allows users to search for genes, proteins, biological processes, small molecules and keywords in the network descriptions to retrieve biological networks of interest. The content of the networks can be visualized and browsed. Nodes and edges can be filtered and all supporting evidence for the edges can be browsed and is linked to the original articles in PubMed. Moreover, networks may be downloaded for further visualization and evaluation. Database URL: http://causalbionet.com © The Author(s) 2015. Published by Oxford University Press.
BDgraph: An R Package for Bayesian Structure Learning in Graphical Models
Mohammadi, A.; Wit, E.C.
2017-01-01
Graphical models provide powerful tools to uncover complicated patterns in multivariate data and are commonly used in Bayesian statistics and machine learning. In this paper, we introduce an R package BDgraph which performs Bayesian structure learning for general undirected graphical models with
Beyond Markov: Accounting for independence violations in causal reasoning.
Rehder, Bob
2018-06-01
Although many theories of causal cognition are based on causal graphical models, a key property of such models, the independence relations stipulated by the Markov condition, is routinely violated by human reasoners. This article presents three new accounts of those independence violations, accounts that share the assumption that people's understanding of the correlational structure of data generated from a causal graph differs from that stipulated by the causal graphical model framework. To distinguish these models, experiments assessed how people reason with causal graphs that are larger than those tested in previous studies. A traditional common-cause network (Y1 ← X → Y2) was extended so that the effects themselves had effects (Z1 ← Y1 ← X → Y2 → Z2). A traditional common-effect network (Y1 → X ← Y2) was extended so that the causes themselves had causes (Z1 → Y1 → X ← Y2 ← Z2). Subjects' inferences were most consistent with the beta-Q model, in which consistent states of the world (those in which variables are either mostly all present or mostly all absent) are viewed as more probable than stipulated by the causal graphical model framework. Substantial variability in subjects' inferences was also observed, with the result that substantial minorities of subjects were best fit by one of the other models (the dual-prototype or leaky-gate models). The discrepancy between normative and human causal cognition stipulated by these models is foundational in the sense that they locate the error not in people's causal reasoning but rather in their causal representations. As a result, they are applicable to any cognitive theory grounded in causal graphical models, including theories of analogy, learning, explanation, categorization, decision-making, and counterfactual reasoning. Preliminary evidence that independence violations indeed generalize to other judgment types is presented. Copyright © 2018 Elsevier Inc. All rights reserved.
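The Markov property whose violation is at issue can be illustrated by sampling from the normative model: in a common-cause network, the state of one effect carries no extra information about the other once the cause is known. The network and probabilities below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(2)

# Common-cause network Y1 <- X -> Y2.  The Markov condition implies the
# effects are independent given the cause: P(Y2 | X, Y1) = P(Y2 | X).
n = 200000
x = rng.random(n) < 0.5
y1 = np.where(x, rng.random(n) < 0.8, rng.random(n) < 0.2)
y2 = np.where(x, rng.random(n) < 0.8, rng.random(n) < 0.2)

p_y2_given_x = y2[x].mean()            # condition on X = 1 only
p_y2_given_x_y1 = y2[x & y1].mean()    # additionally condition on Y1 = 1
print(round(p_y2_given_x, 2), round(p_y2_given_x_y1, 2))  # both near 0.8
```

Human reasoners, by contrast, tend to judge Y2 as more likely when Y1 is also present, which is exactly the independence violation the three accounts in the article set out to model.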
Counterfactual Graphical Models for Longitudinal Mediation Analysis with Unobserved Confounding
Shpitser, Ilya
2013-01-01
Questions concerning mediated causal effects are of great interest in psychology, cognitive science, medicine, social science, public health, and many other disciplines. For instance, about 60% of recent papers published in leading journals in social psychology contain at least one mediation test (Rucker, Preacher, Tormala, & Petty, 2011).…
Classical Causal Models for Bell and Kochen-Specker Inequality Violations Require Fine-Tuning
Directory of Open Access Journals (Sweden)
Eric G. Cavalcanti
2018-04-01
Nonlocality and contextuality are at the root of conceptual puzzles in quantum mechanics, and they are key resources for quantum advantage in information-processing tasks. Bell nonlocality is best understood as the incompatibility between quantum correlations and the classical theory of causality, applied to relativistic causal structure. Contextuality, on the other hand, is on a more controversial foundation. In this work, I provide a common conceptual ground between nonlocality and contextuality as violations of classical causality. First, I show that Bell inequalities can be derived solely from the assumptions of no signaling and no fine-tuning of the causal model. This removes two extra assumptions of a recent result by Wood and Spekkens and, remarkably, does not require any assumption related to independence of measurement settings, unlike all other derivations of Bell inequalities. I then introduce a formalism to represent contextuality scenarios within causal models and show that all classical causal models for violations of a Kochen-Specker inequality require fine-tuning. Thus, the quantum violation of classical causality goes beyond the case of spacelike-separated systems and already manifests in scenarios involving single systems.
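The classical bound that such causal models must obey can be checked by brute force: in a deterministic local hidden-variable model, the hidden variable fixes an outcome for each measurement setting, and enumerating all such strategies shows the CHSH expression never exceeds 2 (quantum mechanics reaches 2√2). This is a standard textbook check, not the paper's derivation.

```python
import itertools

# Deterministic local strategies assign outcomes a0, a1 (Alice's two
# settings) and b0, b1 (Bob's two settings) in {-1, +1}.  The CHSH value
# a0*b0 + a0*b1 + a1*b0 - a1*b1 is bounded by 2 over all 16 strategies,
# and mixtures of strategies cannot exceed the deterministic maximum.
best = max(
    a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1
    for a0, a1, b0, b1 in itertools.product([-1, 1], repeat=4)
)
print(best)  # -> 2
```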
Directory of Open Access Journals (Sweden)
Lina Zgaga
Vitamin D deficiency has been associated with increased risk of colorectal cancer (CRC), but a causal relationship has not yet been confirmed. We investigate the direction of causation between vitamin D and CRC by extending the conventional approaches to allow pleiotropic relationships and by explicitly modelling unmeasured confounders. Plasma 25-hydroxyvitamin D (25-OHD), genetic variants associated with 25-OHD and CRC, and other relevant information were available for 2645 individuals (1057 CRC cases and 1588 controls) and included in the model. We investigate whether 25-OHD is likely to be causally associated with CRC, or vice versa, by selecting the best modelling hypothesis according to Bayesian predictive scores. We examine consistency for a range of prior assumptions. Model comparison showed preference for the causal association between low 25-OHD and CRC over the reverse causal hypothesis. This was confirmed for posterior mean deviances obtained for both models (11.5 natural log units in favour of the causal model), and also for deviance information criteria (DIC) computed for a range of prior distributions. Overall, models ignoring hidden confounding or pleiotropy had significantly poorer DIC scores. Results suggest a causal association between 25-OHD and colorectal cancer, and support the need for randomised clinical trials for further confirmation.
Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...
Decision making in water resource planning: Models and computer graphics
Energy Technology Data Exchange (ETDEWEB)
Fedra, K; Carlsen, A J [ed.
1987-01-01
This paper describes some basic concepts of simulation-based decision support systems for water resources management and the role of symbolic, graphics-based user interfaces. Designed to allow direct and easy access to advanced methods of analysis and decision support for a broad and heterogeneous group of users, these systems combine data base management, system simulation, operations research techniques such as optimization, interactive data analysis, elements of advanced decision technology, and artificial intelligence, with a friendly and conversational, symbolic, display-oriented user interface. Important features of the interface are the use of several parallel or alternative styles of interaction and display, including colour graphics and natural language. Combining quantitative numerical methods with qualitative and heuristic approaches, and giving the user direct and interactive control over the systems function, human knowledge, experience and judgement are integrated with formal approaches into a tightly coupled man-machine system through an intelligent and easily accessible user interface. 4 drawings, 42 references.
How causal analysis can reveal autonomy in models of biological systems
Marshall, William; Kim, Hyunju; Walker, Sara I.; Tononi, Giulio; Albantakis, Larissa
2017-11-01
Standard techniques for studying biological systems largely focus on their dynamical or, more recently, their informational properties, usually taking either a reductionist or holistic perspective. Yet, studying only individual system elements or the dynamics of the system as a whole disregards the organizational structure of the system-whether there are subsets of elements with joint causes or effects, and whether the system is strongly integrated or composed of several loosely interacting components. Integrated information theory offers a theoretical framework to (1) investigate the compositional cause-effect structure of a system and to (2) identify causal borders of highly integrated elements comprising local maxima of intrinsic cause-effect power. Here we apply this comprehensive causal analysis to a Boolean network model of the fission yeast (Schizosaccharomyces pombe) cell cycle. We demonstrate that this biological model features a non-trivial causal architecture, whose discovery may provide insights about the real cell cycle that could not be gained from holistic or reductionist approaches. We also show how some specific properties of this underlying causal architecture relate to the biological notion of autonomy. Ultimately, we suggest that analysing the causal organization of a system, including key features like intrinsic control and stable causal borders, should prove relevant for distinguishing life from non-life, and thus could also illuminate the origin of life problem. This article is part of the themed issue 'Reconceptualizing the origins of life'.
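A minimal illustration of the kind of Boolean network dynamics analyzed in such models: each node updates synchronously from its inputs, and iterating the deterministic map until a state repeats reveals an attractor. The 3-node network and its update rules below are hypothetical and far simpler than the published fission-yeast model.

```python
# Toy 3-node Boolean network (hypothetical update rules, for illustration).
def step(state):
    a, b, c = state
    return (not c, a, a and b)  # synchronous update of all nodes

# Iterate the deterministic dynamics until a state repeats: the cycle
# reached from that repeat onward is an attractor of the network.
state, seen = (True, False, False), []
while state not in seen:
    seen.append(state)
    state = step(state)
attractor = seen[seen.index(state):]
print(len(attractor), "states in the attractor cycle")
```

Causal analyses like integrated information theory go beyond this state-transition picture, asking which subsets of nodes have irreducible joint cause-effect power over the rest.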
Graphics-based intelligent search and abstracting using Data Modeling
Jaenisch, Holger M.; Handley, James W.; Case, Carl T.; Songy, Claude G.
2002-11-01
This paper presents an autonomous text and context-mining algorithm that converts text documents into point clouds for visual search cues. This algorithm is applied to the task of data-mining a scriptural database comprised of the Old and New Testaments from the Bible and the Book of Mormon, Doctrine and Covenants, and the Pearl of Great Price. Results are generated which graphically show the scripture that represents the average concept of the database and the mining of the documents down to the verse level.
[Causal analysis approaches in epidemiology].
Dumas, O; Siroux, V; Le Moual, N; Varraso, R
2014-02-01
Epidemiological research is mostly based on observational studies. Whether such studies can provide evidence of causation remains discussed. Several causal analysis methods have been developed in epidemiology. This paper aims at presenting an overview of these methods: graphical models, path analysis and its extensions, and models based on the counterfactual approach, with a special emphasis on marginal structural models. Graphical approaches have been developed to allow synthetic representations of supposed causal relationships in a given problem. They serve as qualitative support in the study of causal relationships. The sufficient-component cause model has been developed to deal with the issue of multicausality raised by the emergence of chronic multifactorial diseases. Directed acyclic graphs are mostly used as a visual tool to identify possible confounding sources in a study. Structural equations models, the main extension of path analysis, combine a system of equations and a path diagram, representing a set of possible causal relationships. They allow quantifying direct and indirect effects in a general model in which several relationships can be tested simultaneously. Dynamic path analysis further takes into account the role of time. The counterfactual approach defines causality by comparing the observed event and the counterfactual event (the event that would have been observed if, contrary to the fact, the subject had received a different exposure than the one he actually received). This theoretical approach has shown limits of traditional methods to address some causality questions. In particular, in longitudinal studies, when there is time-varying confounding, classical methods (regressions) may be biased. Marginal structural models have been developed to address this issue. In conclusion, "causal models", though they were developed partly independently, are based on equivalent logical foundations. A crucial step in the application of these models is the
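The marginal structural models mentioned above are fitted with inverse-probability-of-treatment weighting. A single-time-point sketch with one binary confounder (all data-generating values hypothetical; the real appeal of MSMs is the time-varying case) shows the mechanics:

```python
import numpy as np

rng = np.random.default_rng(3)

# One binary confounder L affects both treatment A and outcome Y.
n = 100000
L = rng.random(n) < 0.5
pA = np.where(L, 0.8, 0.2)                      # treatment probability given L
A = rng.random(n) < pA
Y = 1.0 * A + 2.0 * L + rng.normal(0, 1, n)     # true causal effect of A is 1.0

# The naive treated-vs-untreated contrast is confounded by L ...
naive = Y[A].mean() - Y[~A].mean()

# ... while weighting each subject by 1 / P(A_i | L_i) creates a
# pseudo-population in which treatment is independent of L.
w = 1.0 / np.where(A, pA, 1.0 - pA)
ipw = np.average(Y[A], weights=w[A]) - np.average(Y[~A], weights=w[~A])
print(round(naive, 2), round(ipw, 2))  # naive is biased; IPW is near 1.0
```

In practice the weights come from a fitted treatment model rather than the known `pA`; with time-varying confounding, the weight is a product over time points.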
Co-occurrence rate networks: towards separate training for undirected graphical models
Zhu, Zhemin
2015-01-01
Dependence is a universal phenomenon which can be observed everywhere. In machine learning, probabilistic graphical models (PGMs) represent dependence relations with graphs. PGMs find wide applications in natural language processing (NLP), speech processing, computer vision, biomedicine, information
Interactive Gaussian Graphical Models for Discovering Depth Trends in ChemCam Data
Oyen, D. A.; Komurlu, C.; Lanza, N. L.
2018-04-01
Interactive Gaussian graphical models discover surface compositional features on rocks in ChemCam targets. Our approach visualizes shot-to-shot relationships among LIBS observations, and identifies the wavelengths involved in the trend.
Stochastic Analysis of a Queue Length Model Using a Graphics Processing Unit
Czech Academy of Sciences Publication Activity Database
Přikryl, Jan; Kocijan, J.
2012-01-01
Vol. 5, No. 2 (2012), pp. 55-62. ISSN 1802-971X. R&D Projects: GA MŠk(CZ) MEB091015. Institutional support: RVO:67985556. Keywords: graphics processing unit * GPU * Monte Carlo simulation * computer simulation * modeling. Subject RIV: BC - Control Systems Theory. http://library.utia.cas.cz/separaty/2012/AS/prikryl-stochastic analysis of a queue length model using a graphics processing unit.pdf
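A plain-Python sketch of the kind of Monte Carlo queue-length simulation such work accelerates on a GPU: many independent replications of a continuous-time M/M/1 queue, one thread per replication on the GPU, here a single CPU loop with hypothetical rates.

```python
import random

random.seed(0)

# One replication of an M/M/1 queue: arrivals at rate lam, services at
# rate mu; returns the queue length at the simulation horizon.
def simulate(lam=0.5, mu=1.0, horizon=1000.0):
    t, q = 0.0, 0
    while t < horizon:
        rate = lam + (mu if q > 0 else 0.0)   # total event rate
        t += random.expovariate(rate)          # time to next event
        if random.random() < lam / rate:
            q += 1                             # arrival
        elif q > 0:
            q -= 1                             # departure
    return q

# Independent replications (these would map to GPU threads).
lengths = [simulate() for _ in range(200)]
mean_q = sum(lengths) / len(lengths)
print(round(mean_q, 2))  # M/M/1 theory for rho = 0.5: rho / (1 - rho) = 1.0
```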
A Temporal-Causal Modelling Approach to Integrated Contagion and Network Change in Social Networks
Blankendaal, Romy; Parinussa, Sarah; Treur, Jan
2016-01-01
This paper introduces an integrated temporal-causal model for dynamics in social networks addressing the contagion principle by which states are affected mutually, and both the homophily principle and the more-becomes-more principle by which connections are adapted over time. The integrated model
TaskMaster: a prototype graphical user interface to a schedule optimization model
Banham, Stephen R.
1990-01-01
Approved for public release, distribution is unlimited This thesis investigates the use of current graphical interface techniques to build more effective computer-user interfaces to Operations Research (OR) schedule optimization models. The design is directed at the scheduling decision maker who possesses limited OR experience. The feasibility and validity of building an interface for this kind of user is demonstrated in the development of a prototype graphical user interface called TaskMa...
Stable Graphical Model Estimation with Random Forests for Discrete, Continuous, and Mixed Variables
Fellinghauer, Bernd; Bühlmann, Peter; Ryffel, Martin; von Rhein, Michael; Reinhardt, Jan D.
2011-01-01
A conditional independence graph is a concise representation of pairwise conditional independence among many variables. Graphical Random Forests (GRaFo) are a novel method for estimating pairwise conditional independence relationships among mixed-type, i.e., continuous and discrete, variables. The number of edges is a tuning parameter in any graphical model estimator and there is no obvious number that constitutes a good choice. Stability Selection helps choose this parameter with respect to...
Bayesian nonparametric generative models for causal inference with missing at random covariates.
Roy, Jason; Lum, Kirsten J; Zeldow, Bret; Dworkin, Jordan D; Re, Vincent Lo; Daniels, Michael J
2018-03-26
We propose a general Bayesian nonparametric (BNP) approach to causal inference in the point treatment setting. The joint distribution of the observed data (outcome, treatment, and confounders) is modeled using an enriched Dirichlet process. The combination of the observed data model and causal assumptions allows us to identify any type of causal effect (differences, ratios, or quantile effects), either marginally or for subpopulations of interest. The proposed BNP model is well-suited for causal inference problems, as it does not require parametric assumptions about the distribution of confounders and naturally leads to a computationally efficient Gibbs sampling algorithm. By flexibly modeling the joint distribution, we are also able to impute (via data augmentation) values for missing covariates within the algorithm under an assumption of ignorable missingness, obviating the need to create separate imputed data sets. This approach for imputing the missing covariates has the additional advantage of guaranteeing congeniality between the imputation model and the analysis model, and because we use a BNP approach, parametric models are avoided for imputation. The performance of the method is assessed using simulation studies. The method is applied to data from a cohort study of human immunodeficiency virus/hepatitis C virus co-infected patients. © 2018, The International Biometric Society.
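The identification step that turns a model of the observed-data joint distribution into a causal effect can be sketched with a simple parametric stand-in for the BNP model: once E[Y | A, L] and P(L) are available, the marginal effect follows by standardization (the g-formula). All data-generating values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

# Point-treatment data with one binary confounder L.
n = 100000
L = rng.random(n) < 0.3
A = rng.random(n) < np.where(L, 0.7, 0.3)        # treatment depends on L
Y = 2.0 * A + 1.5 * L + rng.normal(0, 1, n)      # true marginal effect: 2.0

# Standardization (g-formula):
#   E[Y(a)] = sum_l E[Y | A=a, L=l] * P(L=l)
# Here empirical stratum means stand in for the fitted outcome model.
pL = L.mean()
effect = sum(
    (Y[(A == 1) & (L == l)].mean() - Y[(A == 0) & (L == l)].mean()) * p
    for l, p in [(True, pL), (False, 1 - pL)]
)
print(round(effect, 2))  # recovers the true marginal effect of 2.0
```

The BNP approach replaces the stratum means with draws from the enriched-Dirichlet-process posterior, so the same averaging yields a full posterior for the effect rather than a point estimate.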
A 2D model of causal set quantum gravity: the emergence of the continuum
International Nuclear Information System (INIS)
Brightwell, Graham; Henson, Joe; Surya, Sumati
2008-01-01
Non-perturbative theories of quantum gravity inevitably include configurations that fail to resemble physically reasonable spacetimes at large scales. Often, these configurations are entropically dominant and pose an obstacle to obtaining the desired classical limit. We examine this 'entropy problem' in a model of causal set quantum gravity corresponding to a discretization of 2D spacetimes. Using results from the theory of partial orders, we show that, in the large volume or continuum limit, its partition function is dominated by causal sets which approximate to a region of 2D Minkowski space. This model of causal set quantum gravity thus overcomes the entropy problem and predicts the emergence of a physically reasonable geometry.
Capturing cognitive causal paths in human reliability analysis with Bayesian network models
International Nuclear Information System (INIS)
Zwirglmaier, Kilian; Straub, Daniel; Groth, Katrina M.
2017-01-01
In the last decade, Bayesian networks (BNs) have been identified as a powerful tool for human reliability analysis (HRA), with multiple advantages over traditional HRA methods. In this paper we illustrate how BNs can be used to include additional, qualitative causal paths to provide traceability. The proposed framework provides the foundation to resolve several needs frequently expressed by the HRA community. First, the developed extended BN structure reflects the causal paths found in the cognitive psychology literature, thereby addressing the need for causal traceability and a strong scientific basis in HRA. Secondly, the use of node reduction algorithms allows the BN to be condensed to a level of detail at which quantification is as straightforward as the techniques used in existing HRA. We illustrate the framework by developing a BN version of the critical data misperceived crew failure mode in the IDHEAS HRA method, which is currently under development at the US NRC. We illustrate how the model could be quantified with a combination of expert probabilities and information from operator performance databases such as SACADA. This paper lays the foundations necessary to expand the cognitive and quantitative foundations of HRA. - Highlights: • A framework for building traceable BNs for HRA, based on cognitive causal paths. • A qualitative BN structure, directly showing these causal paths, is developed. • Node reduction algorithms are used for making the BN structure quantifiable. • BN quantified through expert estimates and observed data (Bayesian updating). • The framework is illustrated for a crew failure mode of IDHEAS.
Futures Business Models for an IoT Enabled Healthcare Sector: A Causal Layered Analysis Perspective
Julius Francis Gomes; Sara Moqaddemerad
2016-01-01
Purpose: To facilitate futures business research by proposing a novel way to combine business models as a conceptual tool with futures research techniques. Design: A futures perspective is adopted to foresee business models of the Internet of Things (IoT) enabled healthcare sector, using business models as a futures business research tool. In doing so, business models are coupled with one of the most prominent foresight methodologies, Causal Layered Analysis (CLA). Qualitative analysis...
Verification of temporal-causal network models by mathematical analysis
Directory of Open Access Journals (Sweden)
Jan Treur
2016-04-01
Abstract: Usually, dynamic properties of models are analysed by conducting simulation experiments. But sometimes, as a kind of prediction, properties can also be found by calculation in a mathematical manner, without performing simulations. Examples of properties that can be explored in such a manner are: whether some values for the variables exist for which no change occurs (stationary points or equilibria), and how such values may depend on the parameters of the model and/or the initial values for the variables; whether certain variables in the model converge to some limit value (equilibria), and how this may depend on the parameters of the model and/or the initial values for the variables; whether certain variables show monotonically increasing or decreasing values over time (monotonicity); how fast convergence to a limit value takes place (convergence speed); and whether situations occur in which no convergence takes place but a specific sequence of values is repeated indefinitely (limit cycle). Such properties found in an analytic mathematical manner can be used for verification of the model, by checking them against the values observed in simulation experiments. If one of these properties is not fulfilled, then there is some error in the implementation of the model. In this paper, some methods to analyse such properties of dynamical models are described and illustrated for the Hebbian learning model and for dynamic connection strengths in social networks. The properties analysed by the methods discussed cover equilibria, increasing or decreasing trends, recurring patterns (limit cycles), and speed of convergence to equilibria.
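The verification idea above — derive an equilibrium analytically, then check it against simulated values — can be sketched for a Hebbian learning rule. The functional form, rates, and activation levels below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Assumed Hebbian learning rule (a common form in temporal-causal models):
#   d(omega)/dt = eta * a1 * a2 * (1 - omega) - zeta * omega
# Setting the derivative to zero gives the analytic equilibrium:
#   omega* = eta*a1*a2 / (eta*a1*a2 + zeta)

eta, zeta = 0.4, 0.1   # learning and extinction rates (illustrative)
a1, a2 = 0.8, 0.9      # constant activation levels of the two connected nodes

omega_analytic = eta * a1 * a2 / (eta * a1 * a2 + zeta)

# Simulate the same model by Euler integration and check convergence
# to the analytically derived stationary point.
omega, dt = 0.0, 0.01
for _ in range(20000):
    omega += dt * (eta * a1 * a2 * (1 - omega) - zeta * omega)

print(round(omega_analytic, 4), round(omega, 4))  # the two values should agree
```

If the simulated value failed to match the analytic equilibrium, that would indicate an implementation error — exactly the verification use of mathematical analysis the abstract describes.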
Causal Models for Mediation Analysis: An Introduction to Structural Mean Models.
Zheng, Cheng; Atkins, David C; Zhou, Xiao-Hua; Rhew, Isaac C
2015-01-01
Mediation analyses are critical to understanding why behavioral interventions work. To yield a causal interpretation, common mediation approaches must make an assumption of "sequential ignorability." The current article describes an alternative approach to causal mediation called structural mean models (SMMs). A specific SMM called a rank-preserving model (RPM) is introduced in the context of an applied example. Particular attention is given to the assumptions of both approaches to mediation. Applying both mediation approaches to the college student drinking data yields notable differences in the magnitude of effects. Simulated examples reveal instances in which the traditional approach can yield strongly biased results, whereas the RPM approach remains unbiased in these cases. At the same time, the RPM approach has its own assumptions that must be met for correct inference, such as the existence of a covariate that strongly moderates the effect of the intervention on the mediator, and no unmeasured confounders that also serve as moderators of the effect of the intervention or the mediator on the outcome. The RPM approach to mediation offers an alternative way to perform mediation analysis when there may be unmeasured confounders.
Interactions of information transfer along separable causal paths
Jiang, Peishi; Kumar, Praveen
2018-04-01
Complex systems arise as a result of interdependences between multiple variables, whose causal interactions can be visualized in a time-series graph. Transfer entropy and information partitioning approaches have been used to characterize such dependences. However, these approaches capture net information transfer occurring through a multitude of pathways involved in the interaction and as a result mask our ability to discern the causal interaction within a subgraph of interest through specific pathways. We build on recent developments of momentary information transfer along causal paths proposed by Runge [Phys. Rev. E 92, 062829 (2015), 10.1103/PhysRevE.92.062829] to develop a framework for quantifying information partitioning along separable causal paths. Momentary information transfer along causal paths captures the amount of information transfer between any two variables lagged at two specific points in time. Our approach expands this concept to characterize the causal interaction in terms of synergistic, unique, and redundant information transfer through separable causal paths. Through a graphical model, we analyze the impact of the separable and nonseparable causal paths and the causality structure embedded in the graph as well as the noise effect on information partitioning by using synthetic data generated from two coupled logistic equation models. Our approach can provide a valuable reference for an autonomous information partitioning along separable causal paths which form a causal subgraph influencing a target.
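A toy illustration of the directional information-transfer measures this abstract builds on (not the authors' momentary-information-transfer method): a plug-in transfer entropy estimate for two binary series, where Y is driven by the lagged value of X. The coupling, noise level, and history length of one are all assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000
x = rng.integers(0, 2, n)                    # driver: i.i.d. bits
y = np.empty(n, dtype=int)
y[0] = 0
flip = rng.random(n) < 0.1                   # 10% noise on the X -> Y coupling
y[1:] = np.where(flip[1:], 1 - x[:-1], x[:-1])

def transfer_entropy(src, tgt):
    """Plug-in estimate of TE(src -> tgt), history length 1, in bits."""
    trip = np.stack([tgt[1:], tgt[:-1], src[:-1]], axis=1)
    p = np.zeros((2, 2, 2))
    for a, b, c in trip:                     # joint counts p(tgt_{t+1}, tgt_t, src_t)
        p[a, b, c] += 1
    p /= p.sum()
    te = 0.0
    for a in range(2):
        for b in range(2):
            for c in range(2):
                if p[a, b, c] == 0:
                    continue
                p_bc = p[:, b, c].sum()      # p(tgt_t, src_t)
                p_b = p[:, b, :].sum()       # p(tgt_t)
                p_ab = p[a, b, :].sum()      # p(tgt_{t+1}, tgt_t)
                te += p[a, b, c] * np.log2((p[a, b, c] / p_bc) / (p_ab / p_b))
    return te

te_xy = transfer_entropy(x, y)
te_yx = transfer_entropy(y, x)
print(te_xy, te_yx)  # TE(X->Y) should clearly dominate TE(Y->X)
```

The asymmetry between the two estimates is what makes such measures usable for inferring directed edges in a time-series graph; the partitioning approach in the paper refines this by separating the pathways through which the information flows.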
On a Numerical and Graphical Technique for Evaluating some Models Involving Rational Expectations
DEFF Research Database (Denmark)
Johansen, Søren; Swensen, Anders Rygh
Campbell and Shiller (1987) proposed a graphical technique for the present value model which consists of plotting the spread and theoretical spread as calculated from the cointegrated vector autoregressive model. We extend these techniques to a number of rational expectation models and give...
The causal nexus between oil prices and equity market in the U.S.: A regime switching model
International Nuclear Information System (INIS)
Balcilar, Mehmet; Ozdemir, Zeynel Abidin
2013-01-01
The aim of this paper is to analyse the causal link between monthly oil futures price changes and a sub-grouping of S&P 500 stock index changes. The causal linkage between oil and stock markets is modelled using a vector autoregressive model with time-varying parameters so as to reflect changes in Granger causality over time. A Markov switching vector autoregressive (MS-VAR) model, in which the causal link between the series is stochastic and governed by an unobservable Markov chain, is used for inferring time-varying causality. Although we do not find any lead-lag type Granger causality, the results based on the MS-VAR model clearly show that the oil futures price has strong regime prediction power for a sub-grouping of the S&P 500 stock index during various sub-periods in the sample, while there is weak evidence for the regime prediction power of a sub-grouping of S&P 500 stock indexes. The regime-prediction non-causality tests on the MS-VAR model show that both variables are useful for making inference about the regime process and that the evidence on regime-prediction causality is primarily found in the equation describing a sub-grouping of S&P 500 stock market returns. The evidence from the conditional non-causality tests shows that past information on the other series fails to improve the one-step-ahead prediction for both oil futures and stock returns. - Highlights: • We analyse the causal links between oil futures price and a sub-grouping of the S&P 500 index. • The causal links are modelled using a regime switching model. • We do not find any lead-lag type Granger causality between the series. • The results show that oil futures price has regime prediction power for a sub-grouping of the S&P 500 stock index
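For contrast with the regime-switching tests above, the standard constant-parameter Granger causality check — the "lead-lag type" the paper fails to find — can be sketched with a simple F-test on nested autoregressions. The synthetic series, lag order of one, and coefficients are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.3 * y[t - 1] + 0.6 * x[t - 1] + rng.normal()  # x Granger-causes y

Y = y[1:]
X_r = np.column_stack([np.ones(n - 1), y[:-1]])            # restricted: own lag only
X_u = np.column_stack([np.ones(n - 1), y[:-1], x[:-1]])    # unrestricted: add lagged x

def rss(Xm):
    """Residual sum of squares of the OLS fit of Y on Xm."""
    b, *_ = np.linalg.lstsq(Xm, Y, rcond=None)
    e = Y - Xm @ b
    return e @ e

rss_r, rss_u = rss(X_r), rss(X_u)
# F-statistic for the single restriction (lagged x excluded):
F = (rss_r - rss_u) / (rss_u / (len(Y) - X_u.shape[1]))
print(F)   # a large F rejects non-causality from x to y
```

The MS-VAR approach in the paper generalizes this by letting the causal link itself switch with an unobserved Markov regime, which is why it can detect regime-prediction power even when this constant-parameter test finds nothing.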
Sex and Self-Control Theory: The Measures and Causal Model May Be Different
Higgins, George E.; Tewksbury, Richard
2006-01-01
This study examines the distribution differences across sexes in key measures of self-control theory and differences in a causal model. Using cross-sectional data from juveniles ("n" = 1,500), the study shows mean-level differences in many of the self-control, risky behavior, and delinquency measures. Structural equation modeling…
Hindsight Bias Doesn't Always Come Easy: Causal Models, Cognitive Effort, and Creeping Determinism
Nestler, Steffen; Blank, Hartmut; von Collani, Gernot
2008-01-01
Creeping determinism, a form of hindsight bias, refers to people's hindsight perceptions of events as being determined or inevitable. This article proposes, on the basis of a causal-model theory of creeping determinism, that the underlying processes are effortful, and hence creeping determinism should disappear when individuals lack the cognitive…
The impact of school leadership on school level factors: validation of a causal model
Krüger, M.L.; Witziers, B.; Sleegers, P.
2007-01-01
This study aims to contribute to a better understanding of the antecedents and effects of educational leadership, and of the influence of the principal's leadership on intervening and outcome variables. A path analysis was conducted to test and validate a causal model. The results show no direct or
Molenaar, P.C.M.
1987-01-01
Outlines a frequency domain analysis of the dynamic factor model and proposes a solution to the problem of constructing a causal filter of lagged factor loadings. The method is illustrated with applications to simulated and real multivariate time series. The latter applications involve topographic
Causal Modeling of Secondary Science Students' Intentions to Enroll in Physics.
Crawley, Frank E.; Black, Carolyn B.
1992-01-01
Reports a study using the causal modeling method to verify underlying causes of student interest in enrolling in physics as predicted by the theory of planned behavior. Families were identified as major referents in the social support system for physics enrollment. Course and extracurricular conflicts and fear of failure were primary beliefs…
Faes, Luca; Nollo, Giandomenico
2010-11-01
The Partial Directed Coherence (PDC) and its generalized formulation (gPDC) are popular tools for investigating, in the frequency domain, the concept of Granger causality among multivariate (MV) time series. PDC and gPDC are formalized in terms of the coefficients of an MV autoregressive (MVAR) model which describes only the lagged effects among the time series and forsakes instantaneous effects. However, instantaneous effects are known to affect linear parametric modeling, and are likely to occur in experimental time series. In this study, we investigate the impact on the assessment of frequency domain causality of excluding instantaneous effects from the model underlying PDC evaluation. Moreover, we propose the utilization of an extended MVAR model including both instantaneous and lagged effects. This model is used to assess PDC either in accordance with the definition of Granger causality when considering only lagged effects (iPDC), or with an extended form of causality, when we consider both instantaneous and lagged effects (ePDC). The approach is first evaluated on three theoretical examples of MVAR processes, which show that the presence of instantaneous correlations may produce misleading profiles of PDC and gPDC, while ePDC and iPDC derived from the extended model provide here a correct interpretation of extended and lagged causality. It is then applied to representative examples of cardiorespiratory and EEG MV time series. They suggest that ePDC and iPDC are better interpretable than PDC and gPDC in terms of the known cardiovascular and neural physiologies.
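The frequency-domain construction behind PDC can be sketched directly from known MVAR coefficients. The coefficient values below are illustrative assumptions (not from the paper); with coupling only from series 1 to series 2, PDC in the 1→2 direction is nonzero while PDC in the 2→1 direction vanishes:

```python
import numpy as np

# MVAR(1) coefficient matrix: y2 depends on past y1 and past y2,
# but y1 depends only on its own past (illustrative values).
A1 = np.array([[0.5, 0.0],
               [0.7, 0.3]])

freqs = np.linspace(0, 0.5, 128)              # normalized frequencies
pdc = np.zeros((2, 2, freqs.size))
for k, f in enumerate(freqs):
    # Abar(f) = I - A1 * exp(-i 2 pi f), the frequency response of the model
    Abar = np.eye(2) - A1 * np.exp(-2j * np.pi * f)
    for j in range(2):                        # source series j
        denom = np.sqrt(np.sum(np.abs(Abar[:, j]) ** 2))
        for i in range(2):                    # target series i
            pdc[i, j, k] = np.abs(Abar[i, j]) / denom

print(pdc[1, 0].max(), pdc[0, 1].max())       # PDC 1->2 vs PDC 2->1
```

The extended model in the paper augments the coefficient matrices with an instantaneous (lag-zero) term before forming the same ratio, which is what distinguishes ePDC and iPDC from the standard PDC computed here.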
Concurrency Models with Causality and Events as Psi-calculi
Directory of Open Access Journals (Sweden)
Håkon Normann
2014-10-01
Abstract: Psi-calculi are a parametric framework for nominal calculi, where standard calculi are found as instances, like the pi-calculus, or the cryptographic spi-calculus and applied-pi. Psi-calculi have an interleaving operational semantics, with a strong foundation on the theory of nominal sets and process algebras. Much of the expressive power of psi-calculi comes from their logical part, i.e., assertions, conditions, and entailment, which are left quite open, thus accommodating a wide range of logics. We are interested in how this expressiveness can deal with event-based models of concurrency. We thus take the popular prime event structures model and give an encoding into an instance of psi-calculi. We also take the recent and expressive model of Dynamic Condition Response Graphs (in which event structures are strictly included) and give an encoding into another corresponding instance of psi-calculi. The encodings that we achieve look rather natural and intuitive. Additional results about these encodings give us more confidence in their correctness.
Graphics metafile interface to ARAC emergency response models for remote workstation study
International Nuclear Information System (INIS)
Lawver, B.S.
1985-01-01
The Department of Energy's Atmospheric Release Advisory Capability (ARAC) models are executed on computers at a central computer center, with the output distributed to accident advisors in the field. The outputs of these atmospheric diffusion models are generated as contoured isopleths of concentrations. When these isopleths are overlaid with local geography, they become a useful tool for the accident-site advisor. ARAC has developed a workstation that is located at potential accident sites. The workstation allows the accident advisor to view color plots of the model results, scale those plots, and print black-and-white hardcopy of the model results. The graphics metafile, also known as a Virtual Device Metafile (VDM), allows the models to generate a single device-independent output file that is partitioned into geography, isopleths, and labeling information. The metafile is a very compact data storage technique that is output-device independent. The metafile frees the model from either generating output for all known graphic devices or being rerun for additional graphic devices. With the partitioned metafile, ARAC can transmit to the remote workstation the isopleths and labeling for each model. The geography database may not change and can be transmitted only when needed. This paper describes the important features of the remote workstation and how these features are supported by the device-independent graphics metafile.
Causal Analysis of Religious Violence, a Structural Equation Modeling Approach
Directory of Open Access Journals (Sweden)
M Munajat
2015-12-01
This study examines the causes of religious violence using a Structural Equation Modeling (SEM) approach. Previous quantitative research on social movements and political violence suggests at least three factors strongly suspected of causing collective violence, such as religious violence: (1) the more fundamentalist a person is, the more likely he or she is to approve the use of violent means; (2) the lower a person's trust in the government, the more likely he or she is to approve the use of violence; (3) in contrast to the second proposition, only those whose trust in the government is low but whose political engagement is high will approve the use of violent means. Based on data collected from 343 respondents among activists of the Front Pembela Islam, Muhammadiyah, and Nahdlatul Ulama, this study confirms that the more fundamentalist a person is, the more likely he or she is to approve violence, regardless of organizational affiliation. However, the study does not support a relationship between trust in the government and violence. Likewise, a relationship between violence and the interaction of trust in the government with political engagement cannot be established from the data in this study. The study therefore concludes that fundamentalism, as a form of religiosity, is a highly important factor in explaining religious violence.
Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.
Gopnik, Alison; Wellman, Henry M
2012-11-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
Nonintersecting string model and graphical approach: equivalence with a Potts model
International Nuclear Information System (INIS)
Perk, J.H.H.; Wu, F.Y.
1986-01-01
Using a graphical method the authors establish the exact equivalence of the partition function of a q-state nonintersecting string (NIS) model on an arbitrary planar, even-valenced lattice with that of a q^2-state Potts model on a relaxed lattice. The NIS model considered in this paper is one in which the vertex weights are expressible as sums of those of basic vertex types, and the resulting Potts model generally has multispin interactions. For the square and Kagome lattices this leads to the equivalence of a staggered NIS model with Potts models with anisotropic pair interactions, indicating that these NIS models have a first-order transition for q greater than 2. For the triangular lattice the NIS model turns out to be the five-vertex model of Wu and Lin and it relates to a Potts model with two- and three-site interactions. The most general model the authors discuss is an oriented NIS model which contains the six-vertex model and the NIS models of Stroganov and Schultz as special cases.
SpineCreator: a Graphical User Interface for the Creation of Layered Neural Models.
Cope, A J; Richmond, P; James, S S; Gurney, K; Allerton, D J
2017-01-01
There is a growing requirement in computational neuroscience for tools that permit collaborative model building, model sharing, combining existing models into a larger system (multi-scale model integration), and are able to simulate models using a variety of simulation engines and hardware platforms. Layered XML model specification formats solve many of these problems, however they are difficult to write and visualise without tools. Here we describe a new graphical software tool, SpineCreator, which facilitates the creation and visualisation of layered models of point spiking neurons or rate coded neurons without the need for programming. We demonstrate the tool through the reproduction and visualisation of published models and show simulation results using code generation interfaced directly into SpineCreator. As a unique application for the graphical creation of neural networks, SpineCreator represents an important step forward for neuronal modelling.
Nobuoki, Eshima; Minoru, Tabata; Geng, Zhi; Department of Medical Information Analysis, Faculty of Medicine, Oita Medical University; Department of Applied Mathematics, Faculty of Engineering, Kobe University; Department of Probability and Statistics, Peking University
2001-01-01
This paper discusses path analysis of categorical variables with logistic regression models. The total, direct and indirect effects in fully recursive causal systems are considered by using model parameters. These effects can be explained in terms of log odds ratios, uncertainty differences, and an inner product of explanatory variables and a response variable. A study on food choice of alligators is reanalysed as a numerical example to illustrate the present approach.
Model Verification and Validation Using Graphical Information Systems Tools
2013-07-31
Ocean flows, which are organized current systems, transport heat and salinity and cause water to pile up as a water surface...
A Local Poisson Graphical Model for inferring networks from sequencing data.
Allen, Genevera I; Liu, Zhandong
2013-09-01
Gaussian graphical models, a class of undirected graphs or Markov Networks, are often used to infer gene networks based on microarray expression data. Many scientists, however, have begun using high-throughput sequencing technologies such as RNA-sequencing or next generation sequencing to measure gene expression. As the resulting data consists of counts of sequencing reads for each gene, Gaussian graphical models are not optimal for this discrete data. In this paper, we propose a novel method for inferring gene networks from sequencing data: the Local Poisson Graphical Model. Our model assumes a Local Markov property where each variable conditional on all other variables is Poisson distributed. We develop a neighborhood selection algorithm to fit our model locally by performing a series of l1 penalized Poisson, or log-linear, regressions. This yields a fast parallel algorithm for estimating networks from next generation sequencing data. In simulations, we illustrate the effectiveness of our methods for recovering network structure from count data. A case study on breast cancer microRNAs (miRNAs), a novel application of graphical models, finds known regulators of breast cancer genes and discovers novel miRNA clusters and hubs that are targets for future research.
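The core of each "local" fit in the abstract — an l1-penalized Poisson (log-linear) regression of one gene's counts on all the others — can be sketched with a proximal-gradient (ISTA) solver. The simulated data, penalty, step size, and iteration count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 500, 10
X = rng.normal(size=(n, p))                  # expression of the other p genes
beta_true = np.zeros(p)
beta_true[:2] = [0.8, -0.5]                  # only two true neighbours
y = rng.poisson(np.exp(X @ beta_true))       # counts for the target gene

lam, step = 0.01, 1e-3
beta = np.zeros(p)

def objective(b):
    """Penalized Poisson negative log-likelihood (constants dropped)."""
    eta = X @ b
    return np.mean(np.exp(eta) - y * eta) + lam * np.abs(b).sum()

obj_start = objective(beta)
for _ in range(5000):
    grad = X.T @ (np.exp(X @ beta) - y) / n                 # NLL gradient
    beta = beta - step * grad                               # gradient step
    beta = np.sign(beta) * np.maximum(np.abs(beta) - step * lam, 0.0)  # soft-threshold

obj_end = objective(beta)
print(obj_start, obj_end)   # the penalized objective should decrease
```

In the full method this regression is repeated with each gene in turn as the response, and the nonzero coefficients across fits define the estimated network neighbourhoods — which is why the per-node fits parallelize naturally.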
Heckbert, Paul S
1994-01-01
Graphics Gems IV contains practical techniques for 2D and 3D modeling, animation, rendering, and image processing. The book presents articles on polygons and polyhedral; a mix of formulas, optimized algorithms, and tutorial information on the geometry of 2D, 3D, and n-D space; transformations; and parametric curves and surfaces. The text also includes articles on ray tracing; shading 3D models; and frame buffer techniques. Articles on image processing; algorithms for graphical layout; basic interpolation methods; and subroutine libraries for vector and matrix algebra are also demonstrated. Com
Directory of Open Access Journals (Sweden)
Paolo Vineis
2017-06-01
Abstract: In the last decades, Systems Biology (including cancer research) has been driven by technology, statistical modelling and bioinformatics. In this paper we try to bring biological and philosophical thinking back. We thus aim at making different traditions of thought compatible: (a) causality in epidemiology and in philosophical theorizing, notably the "sufficient-component-cause framework" and the "mark transmission" approach; (b) new acquisitions about disease pathogenesis, e.g. the "branched model" in cancer, and the role of biomarkers in this process; (c) the burgeoning of omics research, with a large number of "signals" and of associations that need to be interpreted. In the paper we first summarize the current views on carcinogenesis, and then explore the relevance of current philosophical interpretations of "cancer causes". We try to offer a unifying framework to incorporate biomarkers and omic data into causal models, referring to a position called "evidential pluralism". According to this view, causal reasoning is based both on "evidence of difference-making" (e.g. associations) and on "evidence of underlying biological mechanisms". We conceptualize the way scientists detect and trace signals in terms of information transmission, which is a generalization of the mark transmission theory developed by the philosopher Wesley Salmon. Our approach can help conceptualize how heterogeneous factors, such as micro- and macro-biological and psycho-social ones, are causally linked. This is important not only for understanding cancer etiology, but also for designing public health policies that target the right causal factors at the macro-level.
A Practical Probabilistic Graphical Modeling Tool for Weighing ...
Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities using a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainties in the lines of evidence, including spatial and temporal as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.
Yu, Yuanyuan; Li, Hongkai; Sun, Xiaoru; Su, Ping; Wang, Tingting; Liu, Yi; Yuan, Zhongshang; Liu, Yanxun; Xue, Fuzhong
2017-12-28
Confounders can produce spurious associations between exposure and outcome in observational studies. For the majority of epidemiologists, adjusting for confounders using a logistic regression model is the habitual method, though it has some problems in accuracy and precision. It is, therefore, important to highlight the problems of logistic regression and search for an alternative method. Four causal diagram models were defined to summarize confounding equivalence. Both theoretical proofs and simulation studies were performed to verify whether conditioning on different confounding equivalence sets had the same bias-reducing potential, and then to select the optimum adjusting strategy, in which the logistic regression model and the inverse probability weighting based marginal structural model (IPW-based-MSM) were compared. The "do-calculus" was used to calculate the true causal effect of exposure on outcome, then the bias and standard error were used to evaluate the performances of different strategies. Adjusting for different sets of confounding equivalence, as judged by identical Markov boundaries, produced different bias-reducing potential in the logistic regression model. For the sets satisfying G-admissibility, adjusting for the set including all the confounders reduced the equivalent bias to the one containing the parent nodes of the outcome, while the bias after adjusting for the parent nodes of exposure was not equivalent to them. In addition, all causal effect estimations through logistic regression were biased, although the estimation after adjusting for the parent nodes of exposure was nearest to the true causal effect. However, conditioning on different confounding equivalence sets had the same bias-reducing potential under the IPW-based-MSM. Compared with logistic regression, the IPW-based-MSM could obtain unbiased causal effect estimation when the adjusted confounders satisfied G-admissibility, and the optimal strategy was to adjust for the parent nodes of the outcome.
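The basic mechanics of the IPW comparison can be sketched with a single binary confounder: a naive exposure-outcome contrast absorbs the confounding, while reweighting by the inverse propensity recovers the true effect. This is a simulation sketch, not the paper's code; all numeric values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
C = rng.binomial(1, 0.5, n)                   # binary confounder
ps = np.where(C == 1, 0.8, 0.2)               # true propensity P(A=1 | C)
A = rng.binomial(1, ps)                       # exposure depends on C
Y = rng.binomial(1, 0.2 + 0.3 * A + 0.3 * C)  # true causal risk difference = 0.3

# Naive contrast: confounded, because C raises both P(A=1) and P(Y=1).
naive = Y[A == 1].mean() - Y[A == 0].mean()

# IPW contrast: weight each subject by the inverse probability of the
# exposure actually received (here using the known true propensity).
w = A / ps + (1 - A) / (1 - ps)
ipw = (np.sum(w * Y * A) / np.sum(w * A)
       - np.sum(w * Y * (1 - A)) / np.sum(w * (1 - A)))

print(round(naive, 3), round(ipw, 3))   # naive is biased upward; IPW is near 0.3
```

In practice the propensity must itself be estimated (e.g., by a model of exposure given the adjustment set), and the paper's point is that the choice of which confounding-equivalent set enters that model does not change the bias-reducing potential under the IPW-based-MSM, unlike under outcome-model adjustment via logistic regression.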
Structural identifiability of cyclic graphical models of biological networks with latent variables.
Wang, Yulin; Lu, Na; Miao, Hongyu
2016-06-13
Graphical models have long been used to describe biological networks for a variety of important tasks, such as the determination of key biological parameters, and the structure of a graphical model ultimately determines whether such unknown parameters can be unambiguously obtained from experimental observations (i.e., the identifiability problem). Limited by resources or technical capacities, complex biological networks are usually only partially observed in experiments, which introduces latent variables into the corresponding graphical models. A number of previous studies have tackled the parameter identifiability problem for graphical models such as linear structural equation models (SEMs), with or without latent variables. However, the limited resolution and efficiency of existing approaches call for further development of novel structural identifiability analysis algorithms. An efficient structural identifiability analysis algorithm is developed in this study for a broad range of network structures. The proposed method adopts Wright's path coefficient method to generate identifiability equations in the form of symbolic polynomials, and then converts these symbolic equations to binary matrices (called identifiability matrices). Several matrix operations are introduced for identifiability matrix reduction while maintaining system equivalency. Based on the reduced identifiability matrices, the structural identifiability of each parameter is determined. A number of benchmark models are used to verify the validity of the proposed approach. Finally, the network module for influenza A virus replication is employed as a real example to illustrate the application of the proposed approach in practice. The proposed approach can deal with cyclic networks with latent variables. Its key advantage is that it intentionally avoids symbolic computation and is thus highly efficient; the method is also capable of determining the identifiability of each individual parameter.
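The binary-matrix idea can be sketched roughly as follows. This is a hypothetical simplification, not the paper's algorithm: it keeps only the intuition that each row marks which unknown parameters appear in one identifiability equation, and that a row with a single unresolved parameter pins that parameter down so its column can be cleared and the scan repeated.

```python
# Hedged sketch of an "identifiability matrix" reduction: rows are
# equations, columns are parameters, entries mark appearance.

def identifiable_params(matrix, names):
    rows = [set(j for j, v in enumerate(r) if v) for r in matrix]
    solved = set()
    changed = True
    while changed:
        changed = False
        for r in rows:
            unknown = r - solved
            if len(unknown) == 1:      # one unresolved parameter: solvable
                solved |= unknown
                changed = True
    return sorted(names[j] for j in solved)

# Equations over parameters (a, b, c, d):
#   eq1 involves a only          -> a identifiable
#   eq2 involves a and b         -> b identifiable once a is known
#   eq3 involves c and d jointly -> neither can be separated
M = [[1, 0, 0, 0],
     [1, 1, 0, 0],
     [0, 0, 1, 1]]
print(identifiable_params(M, ["a", "b", "c", "d"]))  # ['a', 'b']
```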
Yue, Chen; Chen, Shaojie; Sair, Haris I; Airan, Raag; Caffo, Brian S
2015-09-01
Data reproducibility is a critical issue in all scientific experiments. In this manuscript, the problem of quantifying the reproducibility of graphical measurements is considered. The image intra-class correlation coefficient (I2C2) is generalized and the graphical intra-class correlation coefficient (GICC) is proposed for this purpose. The GICC is based on multivariate probit-linear mixed effect models. A Markov chain Monte Carlo EM (MCMC-EM) algorithm is used for estimating the GICC. Simulation results with varied settings are presented, and the method is applied to the KIRBY21 test-retest dataset.
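A scalar analogue of the quantity being generalized is the classical one-way intraclass correlation for test-retest data. The sketch below only illustrates the variance decomposition behind such indices; the GICC itself extends this to whole graphs via the probit-linear mixed model described in the abstract, and all numbers here are invented.

```python
# One-way intraclass correlation ICC(1,1) for test-retest measurements.

def icc_1_1(scans):
    """scans: list of per-subject measurement lists, equal length k."""
    n, k = len(scans), len(scans[0])
    grand = sum(map(sum, scans)) / (n * k)
    means = [sum(s) / k for s in scans]
    # Between-subject and within-subject mean squares.
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2 for s, m in zip(scans, means) for x in s) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Three subjects scanned twice; between-subject spread dominates the
# scan-rescan noise, so reproducibility is high.
print(round(icc_1_1([[1, 2], [5, 6], [9, 10]]), 3))  # 0.969
```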
Verschoor, M.; Jalba, A.C.
2012-01-01
Elastically deformable models have found applications in various areas, ranging from mechanical sciences and engineering to computer graphics. The method of finite elements has been the tool of choice for solving the underlying PDE when accuracy and stability of the computations are the primary concerns.
A Monthly Water-Balance Model Driven By a Graphical User Interface
McCabe, Gregory J.; Markstrom, Steven L.
2007-01-01
This report describes a monthly water-balance model driven by a graphical user interface, referred to as the Thornthwaite monthly water-balance program. Computations of monthly water-balance components of the hydrologic cycle are made for a specified location. The program can be used as a research tool, an assessment tool, and a tool for classroom instruction.
Scaling-up spatially-explicit ecological models using graphics processors
Koppel, Johan van de; Gupta, Rohit; Vuik, Cornelis
2011-01-01
How the properties of ecosystems relate to spatial scale is a prominent topic in current ecosystem research. Despite this, spatially explicit models typically include only a limited range of spatial scales, mostly because of computing limitations. Here, we describe the use of graphics processors to scale up spatially explicit ecological simulations.
Incorporating Solid Modeling and Team-Based Design into Freshman Engineering Graphics.
Buchal, Ralph O.
2001-01-01
Describes the integration of these topics through a major team-based design and computer aided design (CAD) modeling project in freshman engineering graphics at the University of Western Ontario. Involves n=250 students working in teams of four to design and document an original Lego toy. Includes 12 references. (Author/YDS)
Oishi, Makoto; Fukuda, Masafumi; Hiraishi, Tetsuya; Yajima, Naoki; Sato, Yosuke; Fujii, Yukihiko
2012-09-01
The purpose of this paper is to report on the authors' advanced presurgical interactive virtual simulation technique using a 3D computer graphics model for microvascular decompression (MVD) surgery. The authors performed interactive virtual simulation prior to surgery in 26 patients with trigeminal neuralgia or hemifacial spasm. The 3D computer graphics models for interactive virtual simulation were composed of the brainstem, cerebellum, cranial nerves, vessels, and skull, individually created by image analysis, including segmentation, surface rendering, and data fusion, for data collected by 3-T MRI and 64-row multidetector CT systems. Interactive virtual simulation was performed by employing novel computer-aided design software with manipulation of a haptic device to imitate the surgical procedures of bone drilling and retraction of the cerebellum. The findings were compared with intraoperative findings. In all patients, interactive virtual simulation provided detailed and realistic surgical perspectives, of sufficient quality, representing the lateral suboccipital route. The causes of trigeminal neuralgia or hemifacial spasm determined by observing 3D computer graphics models were concordant with those identified intraoperatively in 25 (96%) of 26 patients, a significantly higher rate than the 73% concordance (19 of 26 patients) obtained by review of 2D images alone. The interactive 3D computer graphics model provided a realistic environment for performing virtual simulations prior to MVD surgery and enabled us to ascertain complex microsurgical anatomy.
Iturria-Medina, Yasser; Carbonell, Félix M; Sotero, Roberto C; Chouinard-Decorte, Francois; Evans, Alan C
2017-05-15
Generative models focused on multifactorial causal mechanisms in brain disorders are scarce and generally based on limited data. Despite the biological importance of the multiple interacting processes, their effects remain poorly characterized from an integrative analytic perspective. Here, we propose a spatiotemporal multifactorial causal model (MCM) of brain (dis)organization and therapeutic intervention that accounts for local causal interactions, effects propagation via physical brain networks, cognitive alterations, and identification of optimum therapeutic interventions. In this article, we focus on describing the model and applying it at the population-based level for studying late onset Alzheimer's disease (LOAD). By interrelating six different neuroimaging modalities and cognitive measurements, this model accurately predicts spatiotemporal alterations in brain amyloid-β (Aβ) burden, glucose metabolism, vascular flow, resting state functional activity, structural properties, and cognitive integrity. The results suggest that a vascular dysregulation may be the most likely initial pathologic event leading to LOAD. Nevertheless, they also suggest that LOAD is not caused by a unique dominant biological factor (e.g. vascular or Aβ) but by the complex interplay among multiple relevant direct interactions. Furthermore, using theoretical control analysis of the identified population-based multifactorial causal network, we show the crucial advantage of using combinatorial over single-target treatments, explain why one-target Aβ based therapies might fail to improve clinical outcomes, and propose an efficiency ranking of possible LOAD interventions. Although still requiring further validation at the individual level, this work presents the first analytic framework for dynamic multifactorial brain (dis)organization that may explain both the pathologic evolution of progressive neurological disorders and the influence of multiple interventional strategies.
A Graphical Adversarial Risk Analysis Model for Oil and Gas Drilling Cybersecurity
Vieira, Aitor Couce; Houmb, Siv Hilde; Insua, David Rios
2014-01-01
Oil and gas drilling is based, increasingly, on operational technology, whose cybersecurity is complicated by several challenges. We propose a graphical model for cybersecurity risk assessment based on Adversarial Risk Analysis to face those challenges. We also provide an example of the model in the context of an offshore drilling rig. The proposed model provides a more formal and comprehensive analysis of risks, still using the standard business language based on decisions, risks, and value.
From patterns to causal understanding: Structural equation modeling (SEM) in soil ecology
Eisenhauer, Nico; Powell, Jeff R; Grace, James B.; Bowker, Matthew A.
2015-01-01
In this perspectives paper we highlight a heretofore underused statistical method in soil ecological research, structural equation modeling (SEM). SEM is commonly used in the general ecological literature to develop causal understanding from observational data, but has been more slowly adopted by soil ecologists. We provide some basic information on the many advantages and possibilities associated with using SEM and provide some examples of how SEM can be used by soil ecologists to shift focus from describing patterns to developing causal understanding and inspiring new types of experimental tests. SEM is a promising tool to aid the growth of soil ecology as a discipline, particularly by supporting research that is increasingly hypothesis-driven and interdisciplinary, thus shining light into the black box of interactions belowground.
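The path-analytic core of SEM can be made concrete with a minimal numerical example. This is not from the paper: variable names and data are hypothetical, and real SEM software fits all equations jointly with goodness-of-fit assessment, but for a simple causal chain each path coefficient reduces to a regression slope and the indirect effect is their product.

```python
# Minimal path analysis for a chain x -> y -> z on exact synthetic data.

def slope(xs, ys):
    """Ordinary least-squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 6.0, 8.0]   # y = 2x exactly
z = [1.0, 2.0, 3.0, 4.0]   # z = 0.5y exactly

b_xy = slope(x, y)         # path coefficient x -> y: 2.0
b_yz = slope(y, z)         # path coefficient y -> z: 0.5
print(b_xy * b_yz)         # implied indirect effect of x on z: 1.0
```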
Reininghaus, Ulrich; Depp, Colin A; Myin-Germeys, Inez
2016-03-01
Integrated models of psychotic disorders have posited a number of putative psychological mechanisms that may contribute to the development of psychotic symptoms, but it is only recently that a modest amount of experience sampling research has provided evidence on their role in daily life, outside the research laboratory. A number of methodological challenges remain in evaluating specificity of potential causal links between a given psychological mechanism and psychosis outcomes in a systematic fashion, capitalizing on longitudinal data to investigate temporal ordering. In this article, we argue for testing ecological interventionist causal models that draw on real world and real-time delivered, ecological momentary interventions for generating evidence on several causal criteria (association, time order, and direction/sole plausibility) under real-world conditions, while maximizing generalizability to social contexts and experiences in heterogeneous populations. Specifically, this approach tests whether ecological momentary interventions can (1) modify a putative mechanism and (2) produce changes in the mechanism that lead to sustainable changes in intended psychosis outcomes in individuals' daily lives. Future research using this approach will provide translational evidence on the active ingredients of mobile health and in-person interventions that promote sustained effectiveness of ecological momentary interventions and, thereby, contribute to ongoing efforts that seek to enhance effectiveness of psychological interventions under real-world conditions. © The Author 2015. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Modeling And Simulation As The Basis For Hybridity In The Graphic Discipline Learning/Teaching Area
Directory of Open Access Journals (Sweden)
Jana Žiljak Vujić
2009-01-01
Only some fifteen years have passed since the scientific graphics discipline was established. In the transition period from the College of Graphics, through «Integrated Graphic Technology Studies», to the contemporary Faculty of Graphic Arts at the University of Zagreb, three main periods of development can be noted: digital printing, computer prepress, and automatic procedures in postpress packaging production. Computer technology has enabled a change in the methodology of teaching and studying graphics technology at the level of secondary and higher education. The task has been set to create tools for simulating printing processes in order to master the program through a hybrid system consisting of methods that are separate from one another: learning with the help of digital models, and checking in the actual real system. We set up a hybrid project for teaching because the overall acquired knowledge is the result of completely different methods. The first method operates on the level of free programs, functioning without consequences: everything remains as a record in the knowledge database that can be analyzed, statistically processed, and repeated with new parameter values of the system being researched. The second method uses the actual real system, where the results prove the value of the new knowledge, which encourages and stimulates new cycles of hybrid behavior in mastering programs. This is the area where individual learning occurs. The hybrid method allows the possibility of studying actual situations on a computer model, proving them on an actual real model, and entering an area of learning that envisages future development.
DEFF Research Database (Denmark)
Spataru, Sergiu; Sera, Dezso; Kerekes, Tamas
2012-01-01
This paper presents a set of laboratory tools aimed at supporting students with various backgrounds (including no programming experience) in understanding photovoltaic array modelling and characterization techniques. A graphical user interface (GUI) has been developed in Matlab for modelling PV arrays and characterizing the effect of different types of parameters and operating conditions on the current-voltage and power-voltage curves. The GUI is supported by experimental investigation and validation on the PV module level, with the help of an indoor flash solar simulator.
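The current-voltage curves such a GUI displays typically come from the standard single-diode PV model. The sketch below uses that textbook equation with invented parameter values (photocurrent, saturation current, ideality factor, cell count, thermal voltage); it is not the authors' implementation, and series and shunt resistances are omitted for brevity.

```python
# Single-diode PV model: I = Iph - I0 * (exp(V / (n * Ns * Vt)) - 1).
import math

def iv_current(v, iph=8.0, i0=1e-9, n=1.3, ns=60, vt=0.02585):
    """Module current (A) at terminal voltage v (V); illustrative values."""
    return iph - i0 * (math.exp(v / (n * ns * vt)) - 1.0)

print(iv_current(0.0))  # short-circuit current equals the photocurrent: 8.0
# A point on the power-voltage curve (current drops as the knee approaches):
v = 30.0
print(round(v * iv_current(v), 2))
```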
Graphical models for simulation and control of robotic systems for waste handling
International Nuclear Information System (INIS)
Drotning, W.D.; Bennett, P.C.
1992-01-01
This paper discusses detailed geometric models which have been used within a graphical simulation environment to study transportation cask facility design and to perform design and analyses of robotic systems for handling of nuclear waste. The models form the basis for a robot control environment which provides safety, flexibility, and reliability for operations which span the spectrum from autonomous control to tasks requiring direct human intervention
Planning of O&M for Offshore Wind Turbines using Bayesian Graphical Models
DEFF Research Database (Denmark)
Nielsen, Jannie Jessen; Sørensen, John Dalsgaard
2010-01-01
The costs of operation and maintenance (O&M) for offshore wind turbines are large, and risk-based planning of O&M has the potential to reduce these costs. This paper presents how Bayesian graphical models can be used to establish a probabilistic damage model and include data from imperfect inspections and monitoring. The method offers efficient updating of the failure probability, which is necessary for risk-based decision making. An application example is presented to demonstrate the capabilities of the method.
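The key operation described, updating a damage probability from an imperfect inspection, is a direct Bayes update. The numbers below (prior, probability of detection, false-call rate) are made up for illustration; a full Bayesian graphical model chains many such updates over a deterioration process.

```python
# Bayes update of P(damage) from one imperfect inspection outcome.

def update(prior, pod=0.9, pfa=0.1, detected=False):
    """pod: P(detect | damage); pfa: P(false call | no damage)."""
    if detected:
        num = pod * prior
        den = pod * prior + pfa * (1 - prior)
    else:
        num = (1 - pod) * prior
        den = (1 - pod) * prior + (1 - pfa) * (1 - prior)
    return num / den

p = 0.05                                   # prior damage probability
p_after_clear = update(p, detected=False)  # a clean inspection lowers it
p_after_hit = update(p, detected=True)     # a detection raises it
print(round(p_after_clear, 4))
print(round(p_after_hit, 4))
```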
International Nuclear Information System (INIS)
Paćko, P; Bielak, T; Staszewski, W J; Uhl, T; Spencer, A B; Worden, K
2012-01-01
This paper demonstrates new parallel computation technology and an implementation for Lamb wave propagation modelling in complex structures. A graphical processing unit (GPU) and the compute unified device architecture (CUDA), available in low-cost graphics cards in standard PCs, are used for Lamb wave propagation numerical simulations. The local interaction simulation approach (LISA) wave propagation algorithm has been implemented as an example. Other algorithms suitable for parallel discretization can also be used in practice. The method is illustrated using examples related to damage detection. The results demonstrate good accuracy and effective computational performance of very large models. The wave propagation modelling presented in the paper can be used in many practical applications of science and engineering. (paper)
Modeling the Effect of Religion on Human Empathy Based on an Adaptive Temporal-Causal Network Model
van Ments, L.I.; Roelofsma, P.H.M.P.; Treur, J.
2018-01-01
Religion is a central aspect of many individuals' lives around the world, and its influence on human behaviour has been extensively studied from many different perspectives. The current study integrates a number of these perspectives into one adaptive temporal-causal network model describing the influence of religion on human empathy.
Directory of Open Access Journals (Sweden)
Kaustubh Supekar
2012-02-01
Cognitive skills undergo protracted developmental changes, resulting in proficiencies that are a hallmark of human cognition. One skill that develops over time is the ability to problem solve, which in turn relies on cognitive control and attention abilities. Here we use a novel multimodal neurocognitive network-based approach combining task-related fMRI, resting-state fMRI, and diffusion tensor imaging (DTI) to investigate the maturation of control processes underlying problem solving skills in 7-9 year-old children. Our analysis focused on two key neurocognitive networks implicated in a wide range of cognitive tasks including control: the insula-cingulate salience network, anchored in anterior insula (AI), ventrolateral prefrontal cortex, and anterior cingulate cortex, and the fronto-parietal central executive network, anchored in dorsolateral prefrontal cortex and posterior parietal cortex (PPC). We found that, by age 9, the AI node of the salience network is a major causal hub initiating control signals during problem solving. Critically, despite stronger AI activation, the strength of causal regulatory influences from AI to the PPC node of the central executive network was significantly weaker and contributed to lower levels of behavioral performance in children compared to adults. These results were validated using two different analytic methods for estimating causal interactions in fMRI data. In parallel, DTI-based tractography revealed weaker AI-PPC structural connectivity in children. Our findings point to a crucial role of AI connectivity, and its causal cross-network influences, in the maturation of dynamic top-down control signals underlying cognitive development. Overall, our study demonstrates how a unified neurocognitive network model, when combined with multimodal imaging, enhances our ability to generalize beyond individual task-activated foci and provides a common framework for elucidating key features of brain and cognitive development.
GRAPHICAL USER INTERFACE WITH APPLICATIONS IN SUSCEPTIBLE-INFECTIOUS-SUSCEPTIBLE MODELS.
Ilea, M; Turnea, M; Arotăriţei, D; Rotariu, Mariana; Popescu, Marilena
2015-01-01
The practical significance of understanding the dynamics and evolution of infectious diseases increases continuously in the contemporary world. The mathematical study of the dynamics of infectious diseases has a long history. By incorporating statistical methods and computer-based simulations in dynamic epidemiological models, modeling methods and theoretical analyses can be made more realistic and reliable, allowing a more detailed understanding of the rules governing epidemic spreading. To provide the basis for describing disease transmission, the population of a region is often divided into various compartments, and the model governing their relation is called the compartmental model. To present all of the information available, a graphical user interface provides icons and visual indicators. The graphical interface shown in this paper was built using MATLAB software ver. 7.6.0. MATLAB offers a wide range of techniques by which data can be displayed graphically, and the process of data viewing involves a series of operations; to achieve it, three separate files were made, one defining the mathematical model and two for the interface itself. Considering a fixed population, it is observed that the number of susceptible individuals diminishes along with an increase in the number of infectious individuals, so that in about ten days the numbers of infected and susceptible individuals reach the same value. If the epidemic is not controlled, it will continue for an indefinite period of time. By changing the global parameters specific to the SIS model, a more rapid increase of infectious individuals is noted. Using the graphical user interface shown in this paper helps achieve a much easier interaction with the computer, simplifying the structure of complex instructions by using icons and menus; in particular, programs and files are much easier to organize. Some numerical simulations are presented to illustrate the theoretical results.
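The dynamics behind such an interface are the standard SIS equations, dS/dt = -βSI/N + γI and dI/dt = βSI/N - γI. A minimal Euler-stepped sketch (parameter values are illustrative, not from the paper) shows the behavior the abstract describes: infections grow until the endemic equilibrium I* = N(1 - γ/β) is reached.

```python
# Discrete-time (Euler) integration of the SIS compartmental model.

def sis(beta, gamma, s0, i0, dt=0.01, steps=8000):
    s, i = float(s0), float(i0)
    n = s + i
    for _ in range(steps):
        new_inf = beta * s * i / n   # new infections per unit time
        rec = gamma * i              # recoveries return to susceptible
        s += dt * (-new_inf + rec)
        i += dt * (new_inf - rec)
    return s, i

s, i = sis(beta=0.5, gamma=0.2, s0=990, i0=10)
print(round(i))  # endemic equilibrium: I* = N * (1 - gamma/beta) = 600
```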
Toward a formalized account of attitudes: The Causal Attitude Network (CAN) model.
Dalege, Jonas; Borsboom, Denny; van Harreveld, Frenk; van den Berg, Helma; Conner, Mark; van der Maas, Han L J
2016-01-01
This article introduces the Causal Attitude Network (CAN) model, which conceptualizes attitudes as networks consisting of evaluative reactions and interactions between these reactions. Relevant evaluative reactions include beliefs, feelings, and behaviors toward the attitude object. Interactions between these reactions arise through direct causal influences (e.g., the belief that snakes are dangerous causes fear of snakes) and mechanisms that support evaluative consistency between related contents of evaluative reactions (e.g., people tend to align their belief that snakes are useful with their belief that snakes help maintain ecological balance). In the CAN model, the structure of attitude networks conforms to a small-world structure: evaluative reactions that are similar to each other form tight clusters, which are connected by a sparser set of "shortcuts" between them. We argue that the CAN model provides a realistic formalized measurement model of attitudes and therefore fills a crucial gap in the attitude literature. Furthermore, the CAN model provides testable predictions for the structure of attitudes and how they develop, remain stable, and change over time. Attitude strength is conceptualized in terms of the connectivity of attitude networks, and we show that this provides a parsimonious account of the differences between strong and weak attitudes. We discuss the CAN model in relation to possible extensions, implications for the assessment of attitudes, and possibilities for further study. (c) 2015 APA, all rights reserved.
Zhang, Xiao-Fei; Ou-Yang, Le; Yan, Hong
2017-08-15
Understanding how gene regulatory networks change under different cellular states is important for revealing insights into network dynamics. Gaussian graphical models, which assume that the data follow a joint normal distribution, have recently been used to infer differential networks. However, the distributions of omics data are in general non-normal. Furthermore, although much biological knowledge (or prior information) has been accumulated, most existing methods ignore this valuable prior information. Therefore, new statistical methods are needed to relax the normality assumption and make full use of prior information. We propose a new differential network analysis method to address the above challenges. Instead of using Gaussian graphical models, we employ a non-paranormal graphical model that can relax the normality assumption. We develop a principled model to take into account the following prior information: (i) a differential edge is less likely to exist between two genes that do not participate together in the same pathway; (ii) changes in the networks are driven by certain regulator genes that are perturbed across different cellular states; and (iii) the differential networks estimated from multi-view gene expression data likely share common structures. Simulation studies demonstrate that our method outperforms other graphical model-based algorithms. We apply our method to identify the differential networks between platinum-sensitive and platinum-resistant ovarian tumors, and the differential networks between the proneural and mesenchymal subtypes of glioblastoma. Hub nodes in the estimated differential networks rediscover known cancer-related regulator genes and contain interesting predictions. The source code is available at https://github.com/Zhangxf-ccnu/pDNA. Contact: szuouyl@gmail.com. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com
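The nonparanormal relaxation replaces Pearson correlation with rank-based estimates, which are invariant to monotone transforms of the data. The toy sketch below contrasts Spearman correlations of one gene pair across two "conditions" to flag a differential edge; the paper's estimator additionally builds precision matrices and encodes the prior knowledge listed above, and all data here are invented.

```python
# Rank-based (Spearman) association per condition; the difference flags
# a candidate differential edge. Assumes no tied values for simplicity.

def ranks(v):
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0.0] * len(v)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    m = (n - 1) / 2.0                       # mean rank
    num = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    den = sum((a - m) ** 2 for a in rx)     # same variance for both rank sets
    return num / den

cond1_g1 = [1, 2, 3, 4, 5]
cond1_g2 = [2, 4, 6, 8, 10]      # strongly coupled in condition 1
cond2_g1 = [1, 2, 3, 4, 5]
cond2_g2 = [3, 1, 5, 2, 4]       # coupling lost in condition 2

diff = spearman(cond1_g1, cond1_g2) - spearman(cond2_g1, cond2_g2)
print(round(diff, 2))            # 0.7: a large change suggests a differential edge
```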
From least squares to multilevel modeling: A graphical introduction to Bayesian inference
Loredo, Thomas J.
2016-01-01
This tutorial presentation will introduce some of the key ideas and techniques involved in applying Bayesian methods to problems in astrostatistics. The focus will be on the big picture: understanding the foundations (interpreting probability, Bayes's theorem, the law of total probability and marginalization), making connections to traditional methods (propagation of errors, least squares, chi-squared, maximum likelihood, Monte Carlo simulation), and highlighting problems where a Bayesian approach can be particularly powerful (Poisson processes, density estimation and curve fitting with measurement error). The "graphical" component of the title reflects an emphasis on pictorial representations of some of the math, but also on the use of graphical models (multilevel or hierarchical models) for analyzing complex data. Code for some examples from the talk will be available to participants, in Python and in the Stan probabilistic programming language.
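The "law of total probability and marginalization" step from the tutorial can be made concrete with a grid-based posterior. This toy example (not from the talk; all numbers invented) infers a coin's bias from observed flips, marginalizing over candidate biases to get the evidence.

```python
# Grid-based Bayesian inference for a coin bias with a flat prior.

grid = [i / 100 for i in range(101)]           # candidate biases
prior = [1 / 101] * 101                        # flat prior over the grid
heads, tails = 7, 3

def binom_like(p, h, t):
    """Likelihood up to a constant binomial coefficient."""
    return p ** h * (1 - p) ** t

unnorm = [binom_like(p, heads, tails) * pr for p, pr in zip(grid, prior)]
evidence = sum(unnorm)                         # marginalization over biases
post = [u / evidence for u in unnorm]          # Bayes's theorem, normalized

mean = sum(p * w for p, w in zip(grid, post))
print(round(mean, 3))  # close to the exact Beta(8, 4) posterior mean, 8/12
```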
Pecevski, Dejan; Buesing, Lars; Maass, Wolfgang
2011-12-01
An important open problem of computational neuroscience is the generic organization of computations in networks of neurons in the brain. We show here through rigorous theoretical analysis that inherent stochastic features of spiking neurons, in combination with simple nonlinear computational operations in specific network motifs and dendritic arbors, enable networks of spiking neurons to carry out probabilistic inference through sampling in general graphical models. In particular, it enables them to carry out probabilistic inference in Bayesian networks with converging arrows ("explaining away") and with undirected loops, that occur in many real-world tasks. Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are necessary ingredients of the underlying computational organization. We demonstrate through computer simulations that this approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models, yielding some of the most complex computations that have been carried out so far in networks of spiking neurons.
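"Explaining away", the converging-arrows inference the authors show spiking networks can perform by sampling, can be checked by exact enumeration in a tiny Bayesian network: two independent causes with one common effect under a noisy-OR gate. All probabilities below are illustrative, and enumeration stands in for the neural sampler.

```python
# Exact inference in a two-cause noisy-OR network, demonstrating
# "explaining away": learning one cause is active lowers belief in the other.

P_A, P_B = 0.3, 0.3   # prior probabilities of the two causes

def p_effect(a, b):
    # noisy-OR: each active cause independently triggers the effect w.p. 0.8
    return 1 - (1 - 0.8 * a) * (1 - 0.8 * b)

def posterior_A(given_b=None):
    """P(A=1 | effect observed [, B=given_b]) by enumeration."""
    num = den = 0.0
    for a in (0, 1):
        for b in (0, 1):
            if given_b is not None and b != given_b:
                continue
            pj = ((P_A if a else 1 - P_A)
                  * (P_B if b else 1 - P_B)
                  * p_effect(a, b))
            den += pj
            if a:
                num += pj
    return num / den

p1 = posterior_A()            # P(A=1 | effect)
p2 = posterior_A(given_b=1)   # P(A=1 | effect, B=1): lower, B explains it away
print(round(p1, 3), round(p2, 3))
```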
A semiparametric graphical modelling approach for large-scale equity selection.
Liu, Han; Mulvey, John; Zhao, Tianqi
2016-01-01
We propose a new stock selection strategy that exploits rebalancing returns and improves portfolio performance. To effectively harvest rebalancing gains, we apply ideas from elliptical-copula graphical modelling and stability inference to select stocks that are as independent as possible. The proposed elliptical-copula graphical model has a latent Gaussian representation; its structure can be effectively inferred using the regularized rank-based estimators. The resulting algorithm is computationally efficient and scales to large data-sets. To show the efficacy of the proposed method, we apply it to conduct equity selection based on a 16-year health care stock data-set and a large 34-year stock data-set. Empirical tests show that the proposed method is superior to alternative strategies including a principal component analysis-based approach and the classical Markowitz strategy based on the traditional buy-and-hold assumption.
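The latent Gaussian representation is what lets rank statistics stand in for correlations: for elliptical copulas, Kendall's tau maps to the latent correlation via rho = sin(pi * tau / 2), which is the kind of regularized rank-based estimate the abstract refers to. The data below are invented.

```python
# Kendall's tau and the elliptical-copula map to latent correlation.
import math

def kendall_tau(x, y):
    n, concordant, discordant = len(x), 0, 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.0, 3.0, 2.0, 5.0, 4.0]
tau = kendall_tau(x, y)                       # 8 concordant, 2 discordant: 0.6
rho = math.sin(math.pi * tau / 2)             # implied latent correlation
print(round(rho, 3))                          # 0.809
```

In the selection strategy, such pairwise estimates feed a graphical model whose sparsity pattern identifies stocks that are as close to independent as possible.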
Probabilistic Graphical Models for the Analysis and Synthesis of Musical Audio
2010-11-01
[Fragments from the thesis body and figure captions: a graphical model for the hierarchical Dirichlet process (HDP); the Chinese Restaurant Franchise (CRF) for three groups of eight observations, in which components are associated with observations indirectly through table assignments; and the conditional prior p(k_ji | k_j,-i, β, α) for a component assignment, which depends on the other assignments k_j,-i in the same song j and on the global component proportions β.]
Surles, M C; Richardson, J S; Richardson, D C; Brooks, F P
1994-02-01
We describe a new paradigm for modeling proteins in interactive computer graphics systems--continual maintenance of a physically valid representation, combined with direct user control and visualization. This is achieved by a fast algorithm for energy minimization, capable of real-time performance on all atoms of a small protein, plus graphically specified user tugs. The modeling system, called Sculpt, rigidly constrains bond lengths, bond angles, and planar groups (similar to existing interactive modeling programs), while it applies elastic restraints to minimize the potential energy due to torsions, hydrogen bonds, and van der Waals and electrostatic interactions (similar to existing batch minimization programs), and user-specified springs. The graphical interface can show bad and/or favorable contacts, and individual energy terms can be turned on or off to determine their effects and interactions. Sculpt finds a local minimum of the total energy that satisfies all the constraints using an augmented Lagrange-multiplier method; calculation time increases only linearly with the number of atoms because the matrix of constraint gradients is sparse and banded. On a 100-MHz MIPS R4000 processor (Silicon Graphics Indigo), Sculpt achieves 11 updates per second on a 20-residue fragment and 2 updates per second on an 80-residue protein, using all atoms except non-H-bonding hydrogens, and without electrostatic interactions. Applications of Sculpt are described: to reverse the direction of bundle packing in a designed 4-helix bundle protein, to fold up a 2-stranded beta-ribbon into an approximate beta-barrel, and to design the sequence and conformation of a 30-residue peptide that mimics one partner of a protein subunit interaction. Computer models that are both interactive and physically realistic (within the limitations of a given force field) have 2 significant advantages: (1) they make feasible the modeling of very large changes (such as needed for de novo design), and
A quantum causal discovery algorithm
Giarmatzi, Christina; Costa, Fabio
2018-03-01
Finding a causal model for a set of classical variables is now a well-established task—but what about the quantum equivalent? Even the notion of a quantum causal model is controversial. Here, we present a causal discovery algorithm for quantum systems. The input to the algorithm is a process matrix describing correlations between quantum events. Its output consists of different levels of information about the underlying causal model. Our algorithm determines whether the process is causally ordered by grouping the events into causally ordered non-signaling sets. It detects if all relevant common causes are included in the process, which we label Markovian, or alternatively if some causal relations are mediated through some external memory. For a Markovian process, it outputs a causal model, namely the causal relations and the corresponding mechanisms, represented as quantum states and channels. Our algorithm opens the route to more general quantum causal discovery methods.
Neural pathways in processing of sexual arousal: a dynamic causal modeling study.
Seok, J-W; Park, M-S; Sohn, J-H
2016-09-01
Three decades of research have investigated brain processing of visual sexual stimuli with neuroimaging methods. These studies have found that sexually arousing stimuli elicit activity in a broad neural network of cortical and subcortical brain areas known to be associated with cognitive, emotional, motivational, and physiological components. However, it is not completely understood how these neural systems integrate and modulate incoming information. Therefore, we identified cerebral areas whose activations were correlated with sexual arousal using event-related functional magnetic resonance imaging, and we used dynamic causal modeling to search for the effective connectivity of the sexual arousal processing network. Thirteen heterosexual males were scanned while they passively viewed alternating short trials of erotic and neutral pictures on a monitor. We created a subset of seven models based on our results and previous studies and selected a dominant connectivity model. Consequently, we suggest a dynamic causal model of the brain processes mediating the cognitive, emotional, motivational, and physiological factors of human male sexual arousal. These findings have significant implications for the neuropsychology of male sexuality.
Inventory of data bases, graphics packages, and models in Department of Energy laboratories
International Nuclear Information System (INIS)
Shriner, C.R.; Peck, L.J.
1978-11-01
A central inventory of energy-related environmental bibliographic and numeric data bases, graphics packages, integrated hardware/software systems, and models was established at Oak Ridge National Laboratory in an effort to make these resources at Department of Energy (DOE) laboratories better known and available to researchers and managers. This inventory will also serve to identify and avoid duplication among laboratories. The data were collected at each DOE laboratory, then sent to ORNL and merged into a single file. This document contains the data from the merged file. The data descriptions are organized under major data types: data bases, graphics packages, integrated hardware/software systems, and models. The data include descriptions of subject content, documentation, and contact persons. Also provided are computer data such as media on which the item is available, size of the item, computer on which the item executes, minimum hardware configuration necessary to execute the item, software language(s) and/or data base management system utilized, and character set used. For the models, additional data are provided to define the model more accurately. These data include a general statement of algorithms, computational methods, and theories used by the model; organizations currently using the model; the general application area of the model; sources of data utilized by the model; model validation methods, sensitivity analysis, and procedures; and general model classification. Data in this inventory will be available for on-line data retrieval on the DOE/RECON system
Energy Technology Data Exchange (ETDEWEB)
Salazar-Ferrer, P
1995-06-01
In complex industrial process control, causal reasoning appears as a major component of operators' cognitive tasks. It is tightly linked to diagnosis, prediction of normal and failure states, and explanation. This work provides a detailed review of the literature on causal reasoning. A synthesis is proposed as a model of causal reasoning in process control. This model integrates distinct approaches from cognitive science, especially qualitative physics, Bayesian networks, knowledge-based systems, and cognitive psychology. Our model defines a framework for the analysis of causal human errors in simulated naval nuclear power plant fault management. Through the methodological framework of critical incident analysis, we define a classification of errors and difficulties linked to causal reasoning. This classification is based on shallow characteristics of causal reasoning. As an origin of these errors, more elementary component activities in causal reasoning are identified. The applications cover the fields of functional specification for man-machine interfaces, operator support systems design, and nuclear safety. In addition, we integrate the model of causal reasoning into a model of cognitive tasks in process control. (authors). 106 refs., 49 figs., 8 tabs.
The Reactive-Causal Architecture: Introducing an Emotion Model along with Theories of Needs
Aydin, Ali Orhan; Orgun, Mehmet Ali
In the entertainment application area, one of the major aims is to develop believable agents. To achieve this aim, agents should be highly autonomous, situated, flexible, and able to display affect. The Reactive-Causal Architecture (ReCau) is proposed to simulate these core attributes. In its current form, ReCau cannot explain the effects of emotions on intelligent behaviour. This study aims to further improve the emotion model of ReCau so that it can explain these effects, allowing ReCau to display emotion and thereby support the development of believable agents.
Modeling the Effect of Religion on Human Empathy Based on an Adaptive Temporal-Causal Network Model
van Ments, L.I.; Roelofsma, P.H.M.P.; Treur, J.
2018-01-01
Religion is a central aspect of many individuals’ lives around the world, and its influence on human behaviour has been extensively studied from many different perspectives. The current study integrates a number of these perspectives into one adaptive temporal-causal network model describing the mental states involved, their mutual relations, and the adaptation of some of these relations over time due to learning. By first developing a conceptual representation of a network model based on lit...
THE CAUSAL ANALYSIS / DIAGNOSIS DECISION ...
CADDIS is an on-line decision support system that helps investigators in the regions, states and tribes find, access, organize, use and share information to produce causal evaluations in aquatic systems. It is based on the US EPA's Stressor Identification process which is a formal method for identifying causes of impairments in aquatic systems. CADDIS 2007 increases access to relevant information useful for causal analysis and provides methods and tools that practitioners can use to analyze their own data. The new Candidate Cause section provides overviews of commonly encountered causes of impairments to aquatic systems: metals, sediments, nutrients, flow alteration, temperature, ionic strength, and low dissolved oxygen. CADDIS includes new Conceptual Models that illustrate the relationships from sources to stressors to biological effects. An Interactive Conceptual Model for phosphorus links the diagram with supporting literature citations. The new Analyzing Data section helps practitioners analyze their data sets and interpret and use those results as evidence within the USEPA causal assessment process. Downloadable tools include a graphical user interface statistical package (CADStat), and programs for use with the freeware R statistical package, and a Microsoft Excel template. These tools can be used to quantify associations between causes and biological impairments using innovative methods such as species-sensitivity distributions, biological inferenc
Treinish, Lloyd A.; Gough, Michael L.; Wildenhain, W. David
1987-01-01
A capability was developed for rapidly producing visual representations of large, complex, multi-dimensional space and earth sciences data sets by implementing computer graphics modeling techniques on the Massively Parallel Processor (MPP), employing techniques recently developed for typically non-scientific applications. Such capabilities can provide a new and valuable tool for the understanding of complex scientific data, and a new application of parallel computing via the MPP. A prototype system with such capabilities was developed and integrated into the National Space Science Data Center's (NSSDC) Pilot Climate Data System (PCDS), a data-independent environment for computer-graphics data display, to provide easy access to users. While developing these capabilities, several problems had to be solved independently of the actual use of the MPP, all of which are outlined.
The relationship of family characteristics and bipolar disorder using causal-pie models.
Chen, Y-C; Kao, C-F; Lu, M-K; Yang, Y-K; Liao, S-C; Jang, F-L; Chen, W J; Lu, R-B; Kuo, P-H
2014-01-01
Many family characteristics have been reported to increase the risk of bipolar disorder (BPD). The development of BPD may be mediated through different pathways, involving diverse risk factor profiles. We evaluated the associations of family characteristics to build influential causal-pie models and estimate their contributions to the risk of developing BPD at the population level. We recruited 329 clinically diagnosed BPD patients and 202 healthy controls and collected information on parental psychopathology, the parent-child relationship, and conflict within the family. In addition to logistic regression models, we applied causal-pie models to identify pathways involving different family factors in BPD. The risk of BPD was significantly increased with parental depression, neurosis, anxiety, paternal substance use problems, and a poor relationship with parents. Having a depressed mother further predicted early onset of BPD. Additionally, a greater risk for BPD was observed with higher numbers of paternal/maternal psychopathologies. Three significant risk profiles were identified for BPD: paternal substance use problems (73.0%), maternal depression (17.6%), and a pathway through poor relationship with parents and conflict within the family (6.3%). Our findings demonstrate that different aspects of family characteristics exert negative impacts on bipolar illness, which can be utilized to target specific factors in designing and deploying efficient intervention programs. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
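One population-level summary often reported alongside causal-pie reasoning is the attributable fraction of cases acting through an exposure; a common case-control estimate is Miettinen's formula PAF = p_c (OR - 1) / OR, where p_c is the exposure prevalence among cases. The counts below are hypothetical, not the study's data:

```python
# Sketch: population attributable fraction from 2x2 case-control counts,
# using Miettinen's case-based formula. All counts are illustrative.
def attributable_fraction(cases_exp, cases_unexp, ctrl_exp, ctrl_unexp):
    odds_ratio = (cases_exp * ctrl_unexp) / (cases_unexp * ctrl_exp)
    p_c = cases_exp / (cases_exp + cases_unexp)  # exposure prevalence in cases
    return p_c * (odds_ratio - 1.0) / odds_ratio

# e.g. exposure = paternal substance use problems (hypothetical counts)
paf = attributable_fraction(cases_exp=120, cases_unexp=209,
                            ctrl_exp=40, ctrl_unexp=162)
print(round(paf, 3))
```

Under a sufficient-cause (causal-pie) view, such fractions for different exposures can overlap, since a single case may require several component causes acting together.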
IDAS, software support for mathematical models and map-based graphics
International Nuclear Information System (INIS)
Birnbaum, M.D.; Wecker, D.B.
1984-01-01
IDAS (Intermediate Dose Assessment System) was developed for the U.S. Nuclear Regulatory Commission as a hardware/software host for radiological models and display of map-based plume graphics at the Operations Center (HQ), regional incident response centers, and site emergency facilities. IDAS design goals acknowledged the likelihood of future changes in the suite of models and the composition of map features for analysis and graphical display. IDAS provides a generalized software support environment to programmers and users of modeling programs. A database manager process provides multi-user access control to all input and output data for modeling programs. A programmer-created data description file (schema) specifies data field names, data types, legal and recommended ranges, default values, preferred units of measurement, and ''help'' text. Subroutine calls to IDAS from a model program invoke a consistent user interface which can show any of the schema contents, convert units of measurement, and route data to multiple logical devices, including the database. A stand-alone data editor allows the user to read and write model data records without execution of a model. IDAS stores digitized map features in a 4-level naming hierarchy. A user can select the map icon, color, and whether to show a stored name tag, for each map feature. The user also selects image scale (zoom) within limits set by map digitization. The resulting image combines static map information, computed analytic modeling results, and the user's feature selections for display to decision-makers
Positioning graphical objects on computer screens: a three-phase model.
Pastel, Robert
2011-02-01
This experiment identifies and models phases during the positioning of graphical objects (called cursors in this article) on computer displays. The human computer-interaction community has traditionally used Fitts' law to model selection in graphical user interfaces, whereas human factors experiments have found the single-component Fitts' law inadequate to model positioning of real objects. Participants (N=145) repeatedly positioned variably sized square cursors within variably sized rectangular targets using computer mice. The times for the cursor to just touch the target, for the cursor to enter the target, and for participants to indicate positioning completion were observed. The positioning tolerances were varied from very precise and difficult to imprecise and easy. The time for the cursor to touch the target was proportional to the initial cursor-target distance. The time for the cursor to completely enter the target after touching was proportional to the logarithms of cursor size divided by target tolerances. The time for participants to indicate positioning after entering was inversely proportional to the tolerance. A three-phase model defined by regions--distant, proximate, and inside the target--was proposed and could model the positioning tasks. The three-phase model provides a framework for ergonomists to evaluate new positioning techniques and can explain their deficiencies. The model provides a means to analyze tasks and enhance interaction during positioning.
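The three-phase account can be written as an additive time model: a travel phase proportional to distance, an entry phase proportional to the logarithm of cursor size over tolerance, and a verification phase inversely proportional to tolerance. This is a minimal sketch with illustrative coefficients, not the paper's fitted values:

```python
# Sketch: three-phase positioning-time model (distant, proximate, inside).
# Coefficients a, b, c are invented for illustration.
import math

def positioning_time(distance, cursor_size, tolerance,
                     a=0.002, b=0.15, c=0.08):
    travel = a * distance                          # distant phase: ~ D
    entry = b * math.log(cursor_size / tolerance)  # proximate phase: ~ log(W/tol)
    verify = c / tolerance                         # inside phase: ~ 1/tol
    return travel + entry + verify

easy = positioning_time(distance=300, cursor_size=20, tolerance=10)
hard = positioning_time(distance=300, cursor_size=20, tolerance=1)
print(easy, hard)
```

Tightening the tolerance leaves the travel phase unchanged but lengthens both the entry and verification phases, which is the qualitative pattern the single-component Fitts' law cannot capture.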
Futures Business Models for an IoT Enabled Healthcare Sector: A Causal Layered Analysis Perspective
Directory of Open Access Journals (Sweden)
Julius Francis Gomes
2016-12-01
Full Text Available Purpose: To facilitate futures business research by proposing a novel way to combine business models as a conceptual tool with futures research techniques. Design: A futures perspective is adopted to foresight business models of the Internet of Things (IoT) enabled healthcare sector by using business models as a futures business research tool. In doing so, business models are coupled with one of the most prominent foresight methodologies, Causal Layered Analysis (CLA). Qualitative analysis provides deeper understanding of the phenomenon through the layers of CLA: litany, social causes, worldview, and myth. Findings: It is difficult to predict the far future for a technology-oriented sector like healthcare. This paper presents three scenarios for the short-, medium- and long-term future. Based on these scenarios we also present a set of business model elements for different future time frames. This paper shows a way to combine business models with CLA, a foresight methodology, in order to apply business models in futures business research. Besides offering early results for futures business research, this study proposes a conceptual space to work with individual business models for managerial stakeholders. Originality / Value: Much research on business models has offered conceptualization of the phenomenon, innovation through business models, and transformation of business models. However, the existing literature does not offer much on using the business model as a futures research tool. Enabled by futures thinking, we collected key business model elements and building blocks for the futures market and analyzed them through the CLA framework.
International Nuclear Information System (INIS)
Buonomano, V.; Engel, A.
1974-10-01
Some speculations are presented on a causal model that seems to provide a common conceptual foundation for Relativity, Gravitation, and Quantum Mechanics. The present approach is a unification of three theories. The first is the repulsive theory of gravitational forces first proposed by Lesage in the eighteenth century. The second is the Brownian Motion Theory of Quantum Mechanics, or Stochastic Mechanics, which treats the non-deterministic nature of Quantum Mechanics as being due to a Brownian motion of all objects, this Brownian motion being caused by the statistical variation in the graviton flux. These two theories are unified with the Causal Theory of Special Relativity. Within the present context, the time dilations (and other effects) of Relativity are explained by assuming that the rate of a clock is a function of the total number or intensity of gravitons and the average frequency or energy of the gravitons that the clock receives. The Special Theory would then be the special case of the General Theory in which the intensity is constant but the average frequency varies. In all of the above it is necessary to assume a particular model of the creation of the universe, namely the Big Bang Theory. This assumption gives us the existence of a preferred reference frame, the frame in which the Big Bang explosion was at rest. The above concepts of graviton distribution and real time dilations become meaningful by assuming the Big Bang Theory along with this preferred frame. An experimental test is proposed.
A Module for Graphical Display of Model Results with the CBP Toolbox
Energy Technology Data Exchange (ETDEWEB)
Smith, F. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)
2015-04-21
This report describes work performed by the Savannah River National Laboratory (SRNL) in fiscal year 2014 to add enhanced graphical capabilities to display model results in the Cementitious Barriers Project (CBP) Toolbox. Because Version 2.0 of the CBP Toolbox has just been released, the graphing enhancements described in this report have not yet been integrated into a new version of the Toolbox. Instead they have been tested using a standalone GoldSim model and, while they are substantially complete, may undergo further refinement before full implementation. Nevertheless, this report is issued to document the FY14 development efforts which will provide a basis for further development of the CBP Toolbox.
Manu, Patrick A; Ankrah, Nii A; Proverbs, David G; Suresh, Subashini
2012-09-01
Construction project features (CPFs) are organisational, physical and operational attributes that characterise construction projects. Although previous studies have examined the accident causal influence of CPFs, the multi-causal attribute of this causal phenomenon still remains elusive and thus requires further investigation. Aiming to shed light on this facet of the accident causal phenomenon of CPFs, this study examines the relevant literature and crystallises the attained insight into a graphical model of the multi-causal attribute, which is subsequently operationalised by a derived mathematical risk expression that offers a systematic approach for evaluating the potential of CPFs to cause harm and, consequently, their health and safety (H&S) risk implications. The graphical model and the risk expression put forth by the study thus advance current understanding of the accident causal phenomenon of CPFs, and they present an opportunity for project participants to manage the H&S risk associated with CPFs from the early stages of project procurement. Copyright © 2011 Elsevier Ltd. All rights reserved.
Sanchez, Julio
2003-01-01
Part I - Graphics Fundamentals PC GRAPHICS OVERVIEW History and Evolution Short History of PC Video PS/2 Video Systems SuperVGA Graphics Coprocessors and Accelerators Graphics Applications State-of-the-Art in PC Graphics 3D Application Programming Interfaces POLYGONAL MODELING Vector and Raster Data Coordinate Systems Modeling with Polygons IMAGE TRANSFORMATIONS Matrix-based Representations Matrix Arithmetic 3D Transformations PROGRAMMING MATRIX TRANSFORMATIONS Numeric Data in Matrix Form Array Processing PROJECTIONS AND RENDERING Perspective The Rendering Pipeline LIGHTING AND SHADING Lightin
Louie, Jacob; Shalaby, Amer; Habib, Khandker Nurul
2017-01-01
Most investigations of incident-related delay duration in the transportation context are restricted to highway traffic, with little attention given to delays due to transit service disruptions. Studies of transit-based delay duration are also considerably less comprehensive than their highway counterparts with respect to examining the effects of non-causal variables on the delay duration. However, delays due to incidents in public transit service can have serious consequences on the overall urban transportation system due to the pivotal and vital role of public transit. The ability to predict the durations of various types of transit system incidents is indispensable for better management and mitigation of service disruptions. This paper presents a detailed investigation on incident delay durations in Toronto's subway system over the year 2013, focusing on the effects of the incidents' location and time, the train-type involved, and the non-adherence to proper recovery procedures. Accelerated Failure Time (AFT) hazard models are estimated to investigate the relationship between these factors and the resulting delay duration. The empirical investigation reveals that incident types that impact both safety and operations simultaneously generally have longer expected delays than incident types that impact either safety or operations alone. Incidents at interchange stations are cleared faster than incidents at non-interchange stations. Incidents during peak periods have nearly the same delay durations as off-peak incidents. The estimated models are believed to be useful tools in predicting the relative magnitude of incident delay duration for better management of subway operations. Copyright © 2016 Elsevier Ltd. All rights reserved.
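The AFT framework used in the study models the logarithm of delay duration as a linear function of covariates, so that covariates accelerate or decelerate time to clearance multiplicatively. This is a minimal Weibull-AFT sketch with hypothetical covariates and coefficients, not the paper's estimates:

```python
# Sketch: Weibull accelerated failure time model, log(T) = x·beta + sigma*eps
# with Gumbel-distributed eps. Coefficients and covariates are illustrative.
import math
import numpy as np

beta = np.array([3.2,    # intercept: baseline log-delay (minutes)
                 0.6,    # incident impacts both safety and operations
                 -0.3])  # incident at an interchange station
sigma = 0.5

def expected_delay(x):
    """E[T] for a Weibull AFT: exp(x·beta) * Gamma(1 + sigma)."""
    return math.exp(float(np.dot(x, beta))) * math.gamma(1.0 + sigma)

base = expected_delay([1, 0, 0])
both = expected_delay([1, 1, 0])         # safety + operations impact
interchange = expected_delay([1, 1, 1])  # same incident, interchange station
print(base, both, interchange)
```

The signs mirror the study's qualitative findings: incidents affecting both safety and operations lengthen delays, while incidents at interchange stations clear faster.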
Modelling the effect of religion on human empathy based on an adaptive temporal–causal network model
van Ments, Laila; Roelofsma, Peter; Treur, Jan
2018-01-01
Background Religion is a central aspect of many individuals’ lives around the world, and its influence on human behaviour has been extensively studied from many different perspectives. Methods The current study integrates a number of these perspectives into one adaptive temporal–causal network model describing the mental states involved, their mutual relations, and the adaptation of some of these relations over time due to learning. Results By first developing a conceptual representation of a...
Directory of Open Access Journals (Sweden)
Prof. Patty K. Wongpakdee
2013-06-01
Full Text Available “Resurfacing Graphics” deals with the subject of unconventional design, with the purpose of engaging the viewer to experience graphics beyond paper’s passive surface. Unconventional designs serve to reinvigorate people whose senses are dulled by the typical printed graphics that bombard them each day. Today’s cutting-edge designers, illustrators and artists utilize graphics in a unique manner that allows for tactile interaction. Such works serve as valuable teaching models and encourage students to do the following: (1) investigate the trans-disciplines of art and technology; (2) appreciate that this approach can have a positive effect on the environment; (3) examine and research other approaches to design communications; and (4) utilize new mediums to stretch the boundaries of artistic endeavor. This paper examines how visual communicators are “Resurfacing Graphics” by using atypical surfaces and materials such as textile, wood, ceramics and even water. Such non-traditional transmissions of visual language serve to demonstrate students’ overreliance on paper as an outdated medium. With this exposure, students can become forward-thinking, eco-friendly, creative leaders by expanding their creative breadth and continuing the perpetual exploration for new ways to make their mark.
Robust Measurement via a Fused Latent and Graphical Item Response Theory Model.
Chen, Yunxiao; Li, Xiaoou; Liu, Jingchen; Ying, Zhiliang
2018-03-12
Item response theory (IRT) plays an important role in psychological and educational measurement. Unlike the classical testing theory, IRT models aggregate the item level information, yielding more accurate measurements. Most IRT models assume local independence, an assumption not likely to be satisfied in practice, especially when the number of items is large. Results in the literature and simulation studies in this paper reveal that misspecifying the local independence assumption may result in inaccurate measurements and differential item functioning. To provide more robust measurements, we propose an integrated approach by adding a graphical component to a multidimensional IRT model that can offset the effect of unknown local dependence. The new model contains a confirmatory latent variable component, which measures the targeted latent traits, and a graphical component, which captures the local dependence. An efficient proximal algorithm is proposed for the parameter estimation and structure learning of the local dependence. This approach can substantially improve the measurement, given no prior information on the local dependence structure. The model can be applied to measure both a unidimensional latent trait and multidimensional latent traits.
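The fused model combines an IRT part with a graphical part. Below is a minimal sketch of a conditional response probability with a 2PL-style main effect (discrimination a_j, difficulty b_j, latent trait theta) plus an Ising-style pairwise term capturing local dependence; all parameter values are illustrative assumptions, not the paper's estimator:

```python
# Sketch: conditional probability of a correct response in a fused
# 2PL + Ising-style model. S holds the local-dependence (graphical) weights.
import math

a = [1.2, 0.8, 1.0]    # item discriminations
b = [0.0, -0.5, 0.4]   # item difficulties
S = [[0.0, 0.9, 0.0],  # items 0 and 1 are locally dependent
     [0.9, 0.0, 0.0],
     [0.0, 0.0, 0.0]]

def p_item(j, theta, responses):
    """Conditional P(y_j = 1 | theta, other responses) in the fused model."""
    logit = a[j] * theta - b[j]
    logit += sum(S[j][k] * responses[k] for k in range(len(a)) if k != j)
    return 1.0 / (1.0 + math.exp(-logit))

theta = 0.5
p_alone = p_item(0, theta, [0, 0, 0])  # neighbor item 1 answered incorrectly
p_dep = p_item(0, theta, [0, 1, 0])    # neighbor item 1 answered correctly
print(p_alone, p_dep)
```

When S is zero the model reduces to ordinary 2PL with local independence; a nonzero S_jk shifts the response probability beyond what the latent trait alone implies, which is exactly the local dependence the graphical component is meant to absorb.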
Causal attributions of vineyard executives: A mental model study of vineyard management
Directory of Open Access Journals (Sweden)
Martin FG Schaffernicht
2017-12-01
Full Text Available This article contributes a reference set of causal attributions made by vineyard executives in Chile, where increasing costs and stagnating prices challenge the vineyards' profits. The investigation was motivated by the question of how executives interpret the industry's mid-term future and how they reflect on steering their companies. Based on in-depth interviews, causal maps were elaborated to represent the executives' mental models. These are represented as sequences of attributions connecting variables by causal links. It was found that some mental models guide policies intended to increase prices, whereas other models suggest taking prices as given and controlling costs. The collection of causal attributions of the vineyard executives (CAVE) has been made publicly available. As a result, CAVE can be used by other management scholars to elicit other executives' mental models and increase the available data base. Since such research will be cumulative, a minimum size for meaningful statistical analysis can be reached, opening up an avenue for improving the design of business policies. CAVE can also serve executives and consultants in constructing causal argumentations and business policies. Future research and development of supporting software are called for. Keywords: Mental models, Strategy, Business model
Kramers-Kronig relations and causality conditions for graphene in the framework of the Dirac model
Klimchitskaya, G. L.; Mostepanenko, V. M.
2018-04-01
We analyze the concept of causality for the conductivity of graphene described by the Dirac model. It is recalled that the condition of causality leads to the analyticity of the conductivity in the upper half-plane of complex frequencies and to the standard symmetry properties for its real and imaginary parts. This results in the Kramers-Kronig relations, whose explicit form depends on whether the conductivity has no pole at zero frequency (as in the case of zero temperature when the band gap of graphene is larger than twice the chemical potential) or has a pole (as in all other cases, specifically, at nonzero temperature). Through a direct analytic calculation it is shown that the real and imaginary parts of the graphene conductivity, found recently from first principles of thermal quantum field theory using the polarization tensor in (2+1)-dimensional space-time, satisfy the Kramers-Kronig relations precisely. In so doing, the values of two integrals in commonly used tables, which are also important for the wider area of dispersion relations in quantum field theory and elementary particle physics, are corrected. The obtained results are not only of fundamental theoretical character but can also be used as a guideline in testing the validity of different phenomenological approaches and in the interpretation of experimental data.
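For reference, the Kramers-Kronig relations for a causal response function relate the real and imaginary parts of the conductivity by principal-value integrals. In the standard textbook form (not the paper's specific graphene expressions), with the extra term arising when the conductivity has a simple pole σ(ω) ≈ iΔ/ω at zero frequency:

```latex
% Kramers-Kronig relations for a conductivity analytic in the upper
% half-plane; the Delta/omega term appears only in the pole case.
\mathrm{Re}\,\sigma(\omega)
  = \frac{1}{\pi}\,\mathrm{P}\!\int_{-\infty}^{\infty}
    \frac{\mathrm{Im}\,\sigma(\xi)}{\xi-\omega}\,d\xi ,
\qquad
\mathrm{Im}\,\sigma(\omega)
  = -\frac{1}{\pi}\,\mathrm{P}\!\int_{-\infty}^{\infty}
    \frac{\mathrm{Re}\,\sigma(\xi)}{\xi-\omega}\,d\xi
    + \frac{\Delta}{\omega} .
```

The symmetry properties Re σ(-ω) = Re σ(ω) and Im σ(-ω) = -Im σ(ω) mentioned in the abstract allow these to be rewritten as integrals over positive frequencies only.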
Repeated causal decision making.
Hagmayer, York; Meder, Björn
2013-01-01
Many of our decisions refer to actions that have a causal impact on the external environment. Such actions may not only allow for the mere learning of expected values or utilities but also for acquiring knowledge about the causal structure of our world. We used a repeated decision-making paradigm to examine what kind of knowledge people acquire in such situations and how they use their knowledge to adapt to changes in the decision context. Our studies show that decision makers' behavior is strongly contingent on their causal beliefs and that people exploit their causal knowledge to assess the consequences of changes in the decision problem. A high consistency between hypotheses about causal structure, causally expected values, and actual choices was observed. The experiments show that (a) existing causal hypotheses guide the interpretation of decision feedback, (b) consequences of decisions are used to revise existing causal beliefs, and (c) decision makers use the experienced feedback to induce a causal model of the choice situation even when they have no initial causal hypotheses, which (d) enables them to adapt their choices to changes of the decision problem. (PsycINFO Database Record (c) 2013 APA, all rights reserved).
Hoțoiu, Maria; Tavella, Federico; Treur, Jan
2018-01-01
This paper presents a computational network model for a person with a Borderline Personality Disorder. It was designed according to a Network-Oriented Modeling approach as a temporal-causal network based on neuropsychological background knowledge. Some example simulations are discussed. The model
Causal modeling of secondary science students' intentions to enroll in physics
Crawley, Frank E.; Black, Carolyn B.
The purpose of this study was to explore the utility of the theory of planned behavior model developed by social psychologists for understanding and predicting the behavioral intentions of secondary science students regarding enrolling in physics. In particular, the study used a three-stage causal model to investigate the links from external variables to behavioral, normative, and control beliefs; from beliefs to attitudes, subjective norm, and perceived behavioral control; and from attitudes, subjective norm, and perceived behavioral control to behavioral intentions. The causal modeling method was employed to verify the underlying causes of secondary science students' interest in enrolling in physics as predicted in the theory of planned behavior. Data were collected from secondary science students (N = 264) residing in a central Texas city who were enrolled in earth science (8th grade), biology (9th grade), physical science (10th grade), or chemistry (11th grade) courses. Cause-and-effect relationships were analyzed using path analysis to test the direct effects of model variables specified in the theory of planned behavior. Results of this study indicated that students' intention to enroll in a high school physics course was determined by their attitude toward enrollment and their degree of perceived behavioral control. Attitude, subjective norm, and perceived behavioral control were, in turn, formed as a result of specific beliefs that students held about enrolling in physics. Grade level and career goals were found to be instrumental in shaping students' attitude. Immediate family members were identified as major referents in the social support system for enrolling in physics. Course and extracurricular conflicts and the fear of failure were shown to be the primary beliefs obstructing students' perception of control over physics enrollment. Specific recommendations are offered to researchers and practitioners for strengthening secondary school students
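The path-analysis step used in such studies, estimating each structural equation by ordinary least squares, can be sketched on simulated data; the coefficients below are hypothetical and are not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulated antecedents of intention (hypothetical structural coefficients)
attitude = rng.normal(size=n)
norm = rng.normal(size=n)
control = rng.normal(size=n)
intention = 0.5 * attitude + 0.3 * norm + 0.4 * control + 0.5 * rng.normal(size=n)

# One stage of path analysis: OLS of the endogenous variable on its parents
X = np.column_stack([attitude, norm, control])
coef, *_ = np.linalg.lstsq(X, intention, rcond=None)
print(coef.round(2))  # direct path coefficients, close to (0.5, 0.3, 0.4)
```

A full three-stage model simply repeats this regression once per endogenous variable, following the arrows of the hypothesized causal diagram.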
Causal inference in biology networks with integrated belief propagation.
Chang, Rui; Karr, Jonathan R; Schadt, Eric E
2015-01-01
Inferring causal relationships among molecular and higher order phenotypes is a critical step in elucidating the complexity of living systems. Here we propose a novel method for inferring causality that is no longer constrained by the conditional dependency arguments that limit the ability of statistical causal inference methods to resolve causal relationships within sets of graphical models that are Markov equivalent. Our method utilizes Bayesian belief propagation to infer the responses of perturbation events on molecular traits given a hypothesized graph structure. A distance measure between the inferred response distribution and the observed data is defined to assess the 'fitness' of the hypothesized causal relationships. To test our algorithm, we infer causal relationships within equivalence classes of gene networks in which the form of the possible functional interactions is assumed to be nonlinear, given synthetic microarray and RNA sequencing data. We also apply our method to infer causality in a real metabolic network with v-structures and a feedback loop. We show that our method can recapitulate the causal structure and recover the feedback loop from steady-state data alone, which conventional methods cannot.
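The key idea, that perturbation responses can separate graphs which are Markov equivalent on observational data alone, can be illustrated with the smallest possible example: two variables with linear Gaussian relationships (all numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Ground truth X -> Y; the reverse model Y -> X is Markov equivalent
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)
b_xy = np.sum(x * y) / np.sum(x * x)    # fitted slope under X -> Y

# Perturbation experiment: clamp X at 3 (do(X = 3)) and record Y
y_do = 2.0 * 3.0 + rng.normal(size=n)

# X -> Y predicts the response mean shifts to b*3; Y -> X predicts no change
err_xy = abs(y_do.mean() - b_xy * 3.0)
err_yx = abs(y_do.mean() - y.mean())
print(err_xy, err_yx)  # the interventional data single out X -> Y
```

Scoring each hypothesized graph by the distance between its predicted and observed perturbation response, as the abstract describes, generalizes this comparison to larger networks.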
Goel, Narendra S.; Rozehnal, Ivan; Thompson, Richard L.
1991-01-01
A computer-graphics-based model, named DIANA, is presented for generation of objects of arbitrary shape and for calculating bidirectional reflectances and scattering from them, in the visible and infrared region. The computer generation is based on a modified Lindenmayer system approach which makes it possible to generate objects of arbitrary shapes and to simulate their growth, dynamics, and movement. Rendering techniques are used to display an object on a computer screen with appropriate shading and shadowing and to calculate the scattering and reflectance from the object. The technique is illustrated with scattering from canopies of simulated corn plants.
Application of dynamic uncertain causality graph in spacecraft fault diagnosis: Logic cycle
Yao, Quanying; Zhang, Qin; Liu, Peng; Yang, Ping; Zhu, Ma; Wang, Xiaochen
2017-04-01
Intelligent diagnosis systems are applied to fault diagnosis in spacecraft. The Dynamic Uncertain Causality Graph (DUCG) is a new probabilistic graphical model with many advantages. In the knowledge expression of spacecraft fault diagnosis, feedback among variables is frequently encountered, which may produce directed cyclic graphs (DCGs). Probabilistic graphical models (PGMs) such as the Bayesian network (BN) have been widely applied in uncertain causality representation and probabilistic reasoning, but a BN does not allow DCGs. In this paper, the DUCG is applied to fault diagnosis in spacecraft, introducing an inference algorithm for the DUCG that deals with feedback. The DUCG has been tested on 16 typical faults with 100% diagnosis accuracy.
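The reason standard BNs reject such models is that their factorization requires a directed acyclic graph, so feedback loops must first be detected. A minimal sketch of the usual depth-first-search check (node names are hypothetical):

```python
def has_directed_cycle(graph):
    """Depth-first search with a recursion stack (white/grey/black colouring)."""
    state = {v: 0 for v in graph}            # 0 = unvisited, 1 = on stack, 2 = done
    def visit(v):
        state[v] = 1
        for w in graph.get(v, []):
            if state.get(w, 0) == 1:         # back edge -> directed cycle
                return True
            if state.get(w, 0) == 0 and visit(w):
                return True
        state[v] = 2
        return False
    return any(state[v] == 0 and visit(v) for v in graph)

# A sensor/actuator feedback loop versus a plain DAG of fault evidence
feedback = {"sensor": ["controller"], "controller": ["actuator"], "actuator": ["sensor"]}
dag = {"fault": ["sensor", "alarm"], "sensor": ["alarm"], "alarm": []}
print(has_directed_cycle(feedback), has_directed_cycle(dag))  # True False
```

A BN engine must refuse the first graph outright; the DUCG's contribution is an inference scheme that can accommodate it.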
Anquez, Jérémie; Boubekeur, Tamy; Bibin, Lazar; Angelini, Elsa; Bloch, Isabelle
2009-01-01
Potential sanitary effects related to electromagnetic fields exposure raise public concerns, especially for fetuses during pregnancy. Human fetus exposure can only be assessed through simulated dosimetry studies, performed on anthropomorphic models of pregnant women. In this paper, we propose a new methodology to generate a set of detailed utero-fetal unit (UFU) 3D models during the first and third trimesters of pregnancy, based on segmented 3D ultrasound and MRI data. UFU models are built using recent geometry processing methods derived from mesh-based computer graphics techniques and embedded in a synthetic woman body. Nine pregnant woman models have been generated using this approach and validated by obstetricians, for anatomical accuracy and representativeness.
Three Cs in Measurement Models: Causal Indicators, Composite Indicators, and Covariates
Bollen, Kenneth A.; Bauldry, Shawn
2011-01-01
In the last two decades attention to causal (and formative) indicators has grown. Accompanying this growth has been the belief that we can classify indicators into two categories, effect (reflective) indicators and causal (formative) indicators. This paper argues that the dichotomous view is too simple. Instead, there are effect indicators and three types of variables on which a latent variable depends: causal indicators, composite (formative) indicators, and covariates (the “three Cs”). Caus...
Identifying abnormal connectivity in patients using Dynamic Causal Modelling of fMRI responses.
Directory of Open Access Journals (Sweden)
Mohamed L Seghier
2010-08-01
Full Text Available Functional imaging studies of brain damaged patients offer a unique opportunity to understand how sensori-motor and cognitive tasks can be carried out when parts of the neural system that support normal performance are no longer available. In addition to knowing which regions a patient activates, we also need to know how these regions interact with one another, and how these inter-regional interactions deviate from normal. Dynamic Causal Modelling (DCM) offers the opportunity to assess task-dependent interactions within a set of regions. Here we review its use in patients when the question of interest concerns the characterisation of abnormal connectivity for a given pathology. We describe the currently available implementations of DCM for fMRI responses, varying from the deterministic bilinear models with one-state equation to the stochastic nonlinear models with two-state equations. We also highlight the importance of the new Bayesian model selection and averaging tools that allow different plausible models to be compared at the single subject and group level. These procedures allow inferences to be made at different levels of model selection, from features (model families) to connectivity parameters. Following a critical review of previous DCM studies that investigated abnormal connectivity we propose a systematic procedure that will ensure more flexibility and efficiency when using DCM in patients. Finally, some practical and methodological issues crucial for interpreting or generalising DCM findings in patients are discussed.
Infertile individuals' marital relationship status, happiness, and mental health: a causal model.
Ahmadi Forooshany, Seyed Habiballah; Yazdkhasti, Fariba; Safari Hajataghaie, Saiede; Nasr Esfahani, Mohammad Hossein
2014-10-01
This study examined a causal model of the relation between marital relationship status, happiness, and mental health in infertile individuals. In this descriptive study, 155 subjects (men: 52 and women: 78), who had visited one of the infertility centers, voluntarily participated in a self-evaluation. The Golombok Rust Inventory of Marital Status, the Oxford Happiness Questionnaire, and the General Health Questionnaire were used as the instruments of the study. Data were analyzed with SPSS 17 and Amos 5 software using descriptive statistics, independent-sample t tests, and path analysis. Disregarding the gender factor, marital relationship status was directly related to happiness, and happiness was directly related to mental health; these relations were significant, and happiness had a mediator role in the relation between marital relationship status and mental health in infertile individuals. Also, considering the gender factor, only in infertile women could marital relationship status directly and indirectly affect happiness and mental health.
Ellis, George FR; Pabjan, Tadeusz
2013-01-01
Written by philosophers, cosmologists, and physicists, this collection of essays deals with causality, which is a core issue for both science and philosophy. Readers will learn about different types of causality in complex systems and about new perspectives on this issue based on physical and cosmological considerations. In addition, the book includes essays pertaining to the problem of causality in ancient Greek philosophy, and to the problem of God's relation to the causal structures of nature viewed in the light of contemporary physics and cosmology.
Pecevski, Dejan; Buesing, Lars; Maass, Wolfgang
2011-01-01
An important open problem of computational neuroscience is the generic organization of computations in networks of neurons in the brain. We show here through rigorous theoretical analysis that inherent stochastic features of spiking neurons, in combination with simple nonlinear computational operations in specific network motifs and dendritic arbors, enable networks of spiking neurons to carry out probabilistic inference through sampling in general graphical models. In particular, it enables them to carry out probabilistic inference in Bayesian networks with converging arrows (“explaining away”) and with undirected loops, that occur in many real-world tasks. Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are necessary ingredients of the underlying computational organization. We demonstrate through computer simulations that this approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models, yielding some of the most complex computations that have been carried out so far in networks of spiking neurons. PMID:22219717
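The "explaining away" pattern that the spiking networks are shown to handle can be reproduced by brute-force enumeration in a two-cause Bayesian network with converging arrows (all probabilities are illustrative):

```python
from itertools import product

# Two independent causes B and E converge on effect A (numbers illustrative)
p_b, p_e = 0.1, 0.1
p_a = {(0, 0): 0.01, (0, 1): 0.8, (1, 0): 0.8, (1, 1): 0.95}  # P(A=1 | B, E)

def joint(b, e, a):
    pb = p_b if b else 1 - p_b
    pe = p_e if e else 1 - p_e
    pa = p_a[(b, e)] if a else 1 - p_a[(b, e)]
    return pb * pe * pa

# P(B=1 | A=1): the effect raises belief in cause B
num = sum(joint(1, e, 1) for e in (0, 1))
den = sum(joint(b, e, 1) for b, e in product((0, 1), repeat=2))
p_b_given_a = num / den

# P(B=1 | A=1, E=1): observing the alternative cause "explains away" B
p_b_given_ae = joint(1, 1, 1) / (joint(0, 1, 1) + joint(1, 1, 1))
print(p_b_given_a, p_b_given_ae)  # belief in B drops once E is observed
```

A sampling-based network only needs to draw states from the posterior; the point of the paper is that stochastic spiking dynamics can realize exactly this kind of conditioned sampling.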
InteractiveROSETTA: a graphical user interface for the PyRosetta protein modeling suite.
Schenkelberg, Christian D; Bystroff, Christopher
2015-12-15
Modern biotechnical research is becoming increasingly reliant on computational structural modeling programs to develop novel solutions to scientific questions. Rosetta is one such protein modeling suite that has already demonstrated wide applicability to a number of diverse research projects. Unfortunately, Rosetta is largely a command-line-driven software package which restricts its use among non-computational researchers. Some graphical interfaces for Rosetta exist, but typically are not as sophisticated as commercial software. Here, we present InteractiveROSETTA, a graphical interface for the PyRosetta framework that presents easy-to-use controls for several of the most widely used Rosetta protocols alongside a sophisticated selection system utilizing PyMOL as a visualizer. InteractiveROSETTA is also capable of interacting with remote Rosetta servers, facilitating sophisticated protocols that are not accessible in PyRosetta or which require greater computational resources. InteractiveROSETTA is freely available at https://github.com/schenc3/InteractiveROSETTA/releases and relies upon a separate download of PyRosetta which is available at http://www.pyrosetta.org after obtaining a license (free for academic use). © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Causal Analysis After Haavelmo
Heckman, James; Pinto, Rodrigo
2014-01-01
Haavelmo's seminal 1943 and 1944 papers are the first rigorous treatment of causality. In them, he distinguished the definition of causal parameters from their identification. He showed that causal parameters are defined using hypothetical models that assign variation to some of the inputs determining outcomes while holding all other inputs fixed. He thus formalized and made operational Marshall's (1890) ceteris paribus analysis. We embed Haavelmo's framework into the recursive framework of Directed Acyclic Graphs (DAGs) used in one influential recent approach to causality (Pearl, 2000) and in the related literature on Bayesian nets (Lauritzen, 1996). We compare the simplicity of an analysis of causality based on Haavelmo's methodology with the complex and nonintuitive approach used in the causal literature of DAGs—the “do-calculus” of Pearl (2009). We discuss the severe limitations of DAGs and in particular of the do-calculus of Pearl in securing identification of economic models. We extend our framework to consider models for simultaneous causality, a central contribution of Haavelmo. In general cases, DAGs cannot be used to analyze models for simultaneous causality, but Haavelmo's approach naturally generalizes to cover them. PMID:25729123
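The contrast between conditioning on an input and Haavelmo-style hypothetical variation of it can be made concrete with a simulated confounded model; the back-door adjustment below, one point on which the DAG and Haavelmo frameworks agree, recovers the structural effect that the naive conditional estimate misses. All coefficients are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Confounded model: Z -> X, Z -> Y, X -> Y with true causal effect 1.0
z = rng.binomial(1, 0.5, n)
x = rng.binomial(1, 0.2 + 0.6 * z)
y = 1.0 * x + 2.0 * z + rng.normal(0, 0.5, n)

naive = y[x == 1].mean() - y[x == 0].mean()   # E[Y|X=1] - E[Y|X=0], biased

# Adjustment formula: E[Y | do(X=x)] = sum_z E[Y | x, z] P(z)
adjusted = sum(
    (y[(x == 1) & (z == v)].mean() - y[(x == 0) & (z == v)].mean()) * (z == v).mean()
    for v in (0, 1)
)
print(naive, adjusted)  # naive is inflated by the confounder; adjusted is near 1.0
```

Holding Z fixed while varying X is precisely the ceteris paribus variation Haavelmo formalized; in the DAG literature the same computation is licensed by the back-door criterion.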
Irvine, Kathryn M.; Miller, Scott; Al-Chokhachy, Robert K.; Archer, Erik; Roper, Brett B.; Kershner, Jeffrey L.
2015-01-01
Conceptual models are an integral facet of long-term monitoring programs. Proposed linkages between drivers, stressors, and ecological indicators are identified within the conceptual model of most mandated programs. We empirically evaluate a conceptual model developed for a regional aquatic and riparian monitoring program using causal models (i.e., Bayesian path analysis). We assess whether data gathered for regional status and trend estimation can also provide insights on why a stream may deviate from reference conditions. We target the hypothesized causal pathways for how anthropogenic drivers of road density, percent grazing, and percent forest within a catchment affect instream biological condition. We found instream temperature and fine sediments in arid sites, and only fine sediments in mesic sites, accounted for a significant portion of the maximum possible variation explainable in biological condition among managed sites. However, the biological significance of the direct effects of anthropogenic drivers on instream temperature and fine sediments was minimal or not detected. Consequently, there was weak to no biological support for causal pathways relating anthropogenic drivers to biological condition. With weak biological and statistical effect sizes, ignoring environmental contextual variables and covariates that explain natural heterogeneity would have resulted in no evidence of human impacts on biological integrity in some instances. For programs targeting the effects of anthropogenic activities, it is imperative to identify both land use practices and mechanisms that have led to degraded conditions (i.e., moving beyond simple status and trend estimation). Our empirical evaluation of the conceptual model underpinning the long-term monitoring program provided an opportunity for learning and, consequently, we discuss survey design elements that require modification to achieve question driven monitoring, a necessary step in the practice of
ModelMuse - A Graphical User Interface for MODFLOW-2005 and PHAST
Winston, Richard B.
2009-01-01
ModelMuse is a graphical user interface (GUI) for the U.S. Geological Survey (USGS) models MODFLOW-2005 and PHAST. This software package provides a GUI for creating the flow and transport input file for PHAST and the input files for MODFLOW-2005. In ModelMuse, the spatial data for the model is independent of the grid, and the temporal data is independent of the stress periods. Being able to input these data independently allows the user to redefine the spatial and temporal discretization at will. This report describes the basic concepts required to work with ModelMuse. These basic concepts include the model grid, data sets, formulas, objects, the method used to assign values to data sets, and model features. The ModelMuse main window has a top, front, and side view of the model that can be used for editing the model, and a 3-D view of the model that can be used to display properties of the model. ModelMuse has tools to generate and edit the model grid. It also has a variety of interpolation methods and geographic functions that can be used to help define the spatial variability of the model. ModelMuse can be used to execute both MODFLOW-2005 and PHAST and can also display the results of MODFLOW-2005 models. An example of using ModelMuse with MODFLOW-2005 is included in this report. Several additional examples are described in the help system for ModelMuse, which can be accessed from the Help menu.
Causal and causally separable processes
Oreshkov, Ognyan; Giarmatzi, Christina
2016-09-01
The idea that events are equipped with a partial causal order is central to our understanding of physics in the tested regimes: given two pointlike events A and B, either A is in the causal past of B, B is in the causal past of A, or A and B are space-like separated. Operationally, the meaning of these order relations corresponds to constraints on the possible correlations between experiments performed in the vicinities of the respective events: if A is in the causal past of B, an experimenter at A could signal to an experimenter at B but not the other way around, while if A and B are space-like separated, no signaling is possible in either direction. In the context of a concrete physical theory, the correlations compatible with a given causal configuration may obey further constraints. For instance, space-like correlations in quantum mechanics arise from local measurements on joint quantum states, while time-like correlations are established via quantum channels. Similarly to other variables, however, the causal order of a set of events could be random, and little is understood about the constraints that causality implies in this case. A main difficulty concerns the fact that the order of events can now generally depend on the operations performed at the locations of these events, since, for instance, an operation at A could influence the order in which B and C occur in A’s future. So far, no formal theory of causality compatible with such dynamical causal order has been developed. Apart from being of fundamental interest in the context of inferring causal relations, such a theory is imperative for understanding recent suggestions that the causal order of events in quantum mechanics can be indefinite. Here, we develop such a theory in the general multipartite case. Starting from a background-independent definition of causality, we derive an iteratively formulated canonical decomposition of multipartite causal correlations. For a fixed number of settings and
Gao, Xiangyun; Huang, Shupei; Sun, Xiaoqi; Hao, Xiaoqing; An, Feng
2018-03-01
Microscopic factors are the basis of macroscopic phenomena. We proposed a network analysis paradigm to study the macroscopic financial system from a microstructure perspective. We built the cointegration network model and the Granger causality network model based on econometrics and complex network theory and chose stock price time series of the real estate industry and its upstream and downstream industries as empirical sample data. Then, we analysed the cointegration network for understanding the steady long-term equilibrium relationships and analysed the Granger causality network for identifying the diffusion paths of the potential risks in the system. The results showed that the influence from a few key stocks can spread conveniently in the system. The cointegration network and Granger causality network are helpful to detect the diffusion path between the industries. We can also identify and intervene in the transmission medium to curb risk diffusion.
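A Granger-causality edge of the kind used to build such networks reduces to comparing restricted and unrestricted autoregressions. A minimal one-lag sketch on synthetic series (illustrative; not the paper's stock data, which would also require stationarity and significance testing):

```python
import numpy as np

rng = np.random.default_rng(7)
T = 2000
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):                 # x "Granger-causes" y, not vice versa
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

def rss(target, lags):
    """Residual sum of squares of an OLS fit of target[1:] on lagged series."""
    X = np.column_stack([s[:-1] for s in lags] + [np.ones(T - 1)])
    beta, *_ = np.linalg.lstsq(X, target[1:], rcond=None)
    r = target[1:] - X @ beta
    return r @ r

# Does adding the other series' lag shrink the prediction residuals?
gain_x_to_y = 1 - rss(y, [y, x]) / rss(y, [y])
gain_y_to_x = 1 - rss(x, [x, y]) / rss(x, [x])
print(gain_x_to_y, gain_y_to_x)  # large gain only in the true direction
```

Repeating this test over all ordered pairs of stocks, and keeping the significant directed edges, yields the Granger causality network analysed in the paper.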
Programs as Causal Models: Speculations on Mental Programs and Mental Representation
Chater, Nick; Oaksford, Mike
2013-01-01
Judea Pearl has argued that counterfactuals and causality are central to intelligence, whether natural or artificial, and has helped create a rich mathematical and computational framework for formally analyzing causality. Here, we draw out connections between these notions and various current issues in cognitive science, including the nature of…
Murrell, Paul
2005-01-01
R is revolutionizing the world of statistical computing. Powerful, flexible, and best of all free, R is now the program of choice for tens of thousands of statisticians. Destined to become an instant classic, R Graphics presents the first complete, authoritative exposition on the R graphical system. Paul Murrell, widely known as the leading expert on R graphics, has developed an in-depth resource that takes nothing for granted and helps both neophyte and seasoned users master the intricacies of R graphics. After an introductory overview of R graphics facilities, the presentation first focuses
Path generation algorithm for UML graphic modeling of aerospace test software
Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao
2018-03-01
Traditionally, aerospace software test engineers rely on their own experience and on communication with the software developers to describe the software under test and to write test cases by hand, which is time-consuming, inefficient, and prone to gaps. With the high-reliability model-based testing (MBT) tools developed by our company, a single modeling pass can automatically generate test case documents, which is efficient and accurate. Accurately describing a process with a UML model depends on the paths that can be reached. Existing path generation algorithms are either too simple, unable to combine branch paths and loops into complete paths, or too cumbersome, producing arrangements of paths that are meaningless and superfluous for aerospace software testing. Drawing on our experience with aerospace payload software, we developed a path generation algorithm tailored to UML graphical descriptions of aerospace test software.
REDUCED DATA FOR CURVE MODELING – APPLICATIONS IN GRAPHICS, COMPUTER VISION AND PHYSICS
Directory of Open Access Journals (Sweden)
Małgorzata Janik
2013-06-01
Full Text Available In this paper we consider the problem of modeling curves in Rn via interpolation without a priori specified interpolation knots. We discuss two approaches to estimate the missing knots for non-parametric data (i.e. a collection of points). The first approach (uniform evaluation) is based on a blind guess in which the knots are chosen uniformly. The second approach (cumulative chord parameterization) incorporates the geometry of the distribution of the data points. More precisely, the difference between consecutive knots, ti+1 - ti, is set equal to the Euclidean distance between the data points qi+1 and qi. The second method partially compensates for the loss of the information carried by the reduced data. We also present applications of the above schemes to fitting non-parametric data in computer graphics (light-source motion rendering), in computer vision (image segmentation) and in physics (high-velocity particle trajectory modeling). Though experiments are conducted for points in R2 and R3, the entire method is equally applicable in Rn.
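The cumulative chord construction is short enough to state directly in code; a minimal sketch:

```python
import numpy as np

def cumulative_chord_knots(points):
    """t_0 = 0 and t_{i+1} = t_i + ||q_{i+1} - q_i||  (Euclidean distance)."""
    q = np.asarray(points, dtype=float)
    steps = np.linalg.norm(np.diff(q, axis=0), axis=1)
    return np.concatenate([[0.0], np.cumsum(steps)])

# Unevenly spaced samples of a trajectory in R^2
pts = [(0, 0), (1, 0), (3, 0), (3, 4)]
print(cumulative_chord_knots(pts))   # [0. 1. 3. 7.]
```

A uniform evaluation would assign knots [0, 1, 2, 3] to the same four points; the cumulative chord knots instead stretch the parameter interval wherever consecutive samples are far apart, which is what lets the interpolant respect the geometry of the reduced data.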
Mencarini, Letizia; Vignoli, Daniele; Gottard, Anna
2015-03-01
This paper studies fertility intentions and their outcomes, analyzing the complete path leading to fertility behavior according to the social psychological model of the Theory of Planned Behavior (TPB). We move beyond existing research by using graphical models to obtain a precise understanding, and a formal description, of the developmental fertility decision-making process. Our findings yield new results for the Italian case which are empirically robust and theoretically coherent, adding important insights into the effectiveness of the TPB for fertility research. In line with the TPB, all of the intentions' primary antecedents are found to be determinants of the level of fertility intentions, but do not affect fertility outcomes, being pre-filtered by fertility intentions. Nevertheless, in contrast with the TPB, background factors are not fully mediated by the intentions' primary antecedents, directly influencing fertility intentions and even fertility behaviors. Copyright © 2014 Elsevier Ltd. All rights reserved.
A scale-free structure prior for graphical models with applications in functional genomics.
Directory of Open Access Journals (Sweden)
Paul Sheridan
Full Text Available The problem of reconstructing large-scale gene regulatory networks from gene expression data has garnered considerable attention in bioinformatics over the past decade, with the graphical modeling paradigm having emerged as a popular framework for inference. Analysis in a full Bayesian setting is contingent upon the assignment of a so-called structure prior: a probability distribution on networks, encoding a priori biological knowledge either in the form of supplemental data or high-level topological features. A key topological consideration is that a wide range of cellular networks are approximately scale-free, meaning that the fraction p(k) of nodes in a network with degree k is roughly described by a power law p(k) ∝ k^(-γ), with exponent γ typically between 2 and 3. The standard practice, however, is to utilize a random structure prior, which favors networks with binomially distributed degree distributions. In this paper, we introduce a scale-free structure prior for graphical models based on the formula for the probability of a network under a simple scale-free network model. Unlike the random structure prior, its scale-free counterpart requires a node labeling as a parameter. In order to use this prior for large-scale network inference, we design a novel Metropolis-Hastings sampler for graphical models that includes a node labeling as a state-space variable. In a simulation study, we demonstrate that the scale-free structure prior outperforms the random structure prior at recovering scale-free networks while retaining the ability to recover random networks. We then estimate a gene association network from gene expression data taken from a breast cancer tumor study, showing that the scale-free structure prior recovers hubs, including the previously unknown hub SLC39A6, which is a zinc transporter that has been implicated in the spread of breast cancer to the lymph nodes. Our analysis of the breast cancer expression data underscores the value of the scale-free structure prior.
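The qualitative effect of such a prior, rewarding hub-dominated degree sequences that a binomial (random-graph) prior penalizes, can be seen with a simple surrogate score; the formula below is an illustrative stand-in, not the paper's exact prior:

```python
import math

def scale_free_log_prior(degrees, gamma=2.5):
    """Unnormalised log-prior  sum_i -gamma * log(d_i + 1): a common surrogate
    for a power-law degree model (the +1 keeps isolated nodes finite)."""
    return sum(-gamma * math.log(d + 1) for d in degrees)

# Two 9-node graphs with the same number of edges (8)
star = [8] + [1] * 8           # hub-and-spoke degree sequence
chain = [1, 1] + [2] * 7       # path-graph degree sequence

print(scale_free_log_prior(star), scale_free_log_prior(chain))
```

Under this score the star (one hub, many leaves) beats the chain, even though both have the same edge count; a random structure prior, by contrast, is indifferent between them. This is exactly the bias that lets the scale-free prior recover hubs such as SLC39A6.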
Dynamic causal modeling of touch-evoked potentials in the rubber hand illusion.
Zeller, Daniel; Friston, Karl J; Classen, Joseph
2016-09-01
The neural substrate of bodily ownership can be disclosed by the rubber hand illusion (RHI); namely, the illusory self-attribution of an artificial hand that is induced by synchronous tactile stimulation of the subject's hand that is hidden from view. Previous studies have pointed to the premotor cortex (PMC) as a pivotal area in such illusions. To investigate the effective connectivity between - and within - sensory and premotor areas involved in bodily perceptions, we used dynamic causal modeling of touch-evoked responses in 13 healthy subjects. Each subject's right hand was stroked while viewing their own hand ("REAL"), or an artificial hand presented in an anatomically plausible ("CONGRUENT") or implausible ("INCONGRUENT") position. Bayesian model comparison revealed strong evidence for a differential involvement of the PMC in the generation of touch-evoked responses under the three conditions, confirming a crucial role of PMC in bodily self-attribution. In brief, the extrinsic (forward) connection from left occipital cortex to left PMC was stronger for CONGRUENT and INCONGRUENT as compared to REAL, reflecting the augmentation of bottom-up visual input when multisensory integration is challenged. Crucially, intrinsic connectivity in the primary somatosensory cortex (S1) was attenuated in the CONGRUENT condition, during the illusory percept. These findings support predictive coding models of the functional architecture of multisensory integration (and attenuation) in bodily perceptual experience. Copyright © 2016 Elsevier Inc. All rights reserved.
Connectivity-based neurofeedback: Dynamic causal modeling for real-time fMRI☆
Koush, Yury; Rosa, Maria Joao; Robineau, Fabien; Heinen, Klaartje; W. Rieger, Sebastian; Weiskopf, Nikolaus; Vuilleumier, Patrik; Van De Ville, Dimitri; Scharnowski, Frank
2013-01-01
Neurofeedback based on real-time fMRI is an emerging technique that can be used to train voluntary control of brain activity. Such brain training has been shown to lead to behavioral effects that are specific to the functional role of the targeted brain area. However, real-time fMRI-based neurofeedback has so far been limited mainly to training localized brain activity within a region of interest. Here, we overcome this limitation by presenting near real-time dynamic causal modeling in order to provide feedback information based on connectivity between brain areas rather than activity within a single brain area. Using a visual-spatial attention paradigm, we show that participants can voluntarily control a feedback signal that is based on the Bayesian model comparison between two predefined model alternatives, i.e. the connectivity between left visual cortex and left parietal cortex vs. the connectivity between right visual cortex and right parietal cortex. Our new approach thus allows for training voluntary control over specific functional brain networks. Because most mental functions and most neurological disorders are associated with network activity rather than with activity in a single brain region, this novel approach is an important methodological innovation that allows functionally relevant brain networks to be targeted more directly. PMID:23668967
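The feedback signal described above rests on Bayesian model comparison between two predefined connectivity models. A minimal sketch of that comparison, assuming equal model priors (the log-evidence values below are made up for illustration, not from the study):

```python
import math

def model_posterior(logev_a, logev_b):
    """Posterior probability of model A under equal priors, computed from
    the two models' log-evidences (a numerically stable softmax)."""
    m = max(logev_a, logev_b)
    ea, eb = math.exp(logev_a - m), math.exp(logev_b - m)
    return ea / (ea + eb)

# A log-evidence difference of 3 (conventionally "strong evidence")
# translates into a posterior of about 0.95 for the favored model:
print(round(model_posterior(-100.0, -103.0), 2))  # 0.95
```

In a neurofeedback setting, this posterior (or the log-evidence difference itself) is the kind of scalar that could be fed back to the participant in near real time.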
Model Selection and Accounting for Model Uncertainty in Graphical Models Using OCCAM’s Window
1991-07-22
A, smoking; B, strenuous mental work; C, strenuous physical work; D, systolic blood pressure; E, ratio of beta and alpha lipoproteins; F, family anamnesis of coronary heart disease. The models are shown in Figure 4; Table 1 lists the risk factors for coronary heart disease. The selected models include a link from smoking (A) to systolic blood pressure (D). There is decisive evidence in favour of the marginal independence of family anamnesis of coronary heart disease (F).
Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios
Banta, Edward R.
2014-01-01
Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler's expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable Document Format (PDF) file.
Benbenishty, Rami; Astor, Ron Avi; Roziner, Ilan; Wrabel, Stephani L.
2016-01-01
The present study explores the causal link between school climate, school violence, and a school's general academic performance over time using a school-level, cross-lagged panel autoregressive modeling design. We hypothesized that reductions in school violence and climate improvement would lead to schools' overall improved academic performance.…
Directory of Open Access Journals (Sweden)
Ämin Baumeler
2017-07-01
Full Text Available Computation models such as circuits describe sequences of computation steps that are carried out one after the other. In other words, algorithm design is traditionally subject to the restriction imposed by a fixed causal order. We address a novel computing paradigm beyond quantum computing, replacing this assumption by mere logical consistency: We study non-causal circuits, where a fixed time structure within a gate is locally assumed whilst the global causal structure between the gates is dropped. We present examples of logically consistent non-causal circuits outperforming all causal ones; they imply that suppressing loops entirely is more restrictive than just avoiding the contradictions they can give rise to. That fact is already known for correlations as well as for communication, and we here extend it to computation.
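The notion of "mere logical consistency" without a fixed causal order can be illustrated by checking which wire assignments of a looped circuit are fixed points of the gate map. A toy sketch (the gate functions are illustrative, not circuits from the paper):

```python
from itertools import product

def consistent_assignments(gates, n_bits):
    """Enumerate bit assignments that are fixed points of the gate map:
    an assignment is logically consistent if every wire reproduces its
    own value when all gates are applied."""
    return [bits for bits in product((0, 1), repeat=n_bits)
            if all(g(bits) == bits[i] for i, g in enumerate(gates))]

# A negation loop (b = NOT b) has no consistent assignment -- the classic
# contradiction -- while an identity loop (b = b) admits two:
print(consistent_assignments([lambda b: 1 - b[0]], 1))  # []
print(consistent_assignments([lambda b: b[0]], 1))      # [(0,), (1,)]
```

This is the distinction the abstract draws: forbidding loops outright would exclude the identity loop too, even though it gives rise to no contradiction.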
Balfer, Jenny; Bajorath, Jürgen
2014-09-22
Supervised machine learning models are widely used in chemoinformatics, especially for the prediction of new active compounds or targets of known actives. Bayesian classification methods are among the most popular machine learning approaches for the prediction of activity from chemical structure. Much work has focused on predicting structure-activity relationships (SARs) on the basis of experimental training data. By contrast, only a few efforts have thus far been made to rationalize the performance of Bayesian or other supervised machine learning models and better understand why they might succeed or fail. In this study, we introduce an intuitive approach for the visualization and graphical interpretation of naïve Bayesian classification models. Parameters derived during supervised learning are visualized and interactively analyzed to gain insights into model performance and identify features that determine predictions. The methodology is introduced in detail and applied to assess Bayesian modeling efforts and predictions on compound data sets of varying structural complexity. Different classification models and features determining their performance are characterized in detail. A prototypic implementation of the approach is provided.
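One way to see what such a visualization exposes is to tabulate the per-feature log-odds contributions of a naive Bayesian classifier. A minimal sketch with made-up likelihoods (the feature names and probabilities are hypothetical, not the paper's data):

```python
import math

def feature_log_odds(feature_probs):
    """Per-feature contributions of a binary naive Bayes model:
    log P(f | active) - log P(f | inactive) for each feature present in a
    compound. Positive values push the prediction toward 'active'; the
    ranked list is a simple textual stand-in for a graphical display."""
    return {f: math.log(pa) - math.log(pi)
            for f, (pa, pi) in feature_probs.items()}

# Hypothetical per-feature likelihoods (P(f | active), P(f | inactive)):
probs = {"ring_A": (0.8, 0.1), "halogen": (0.3, 0.3), "amide": (0.05, 0.4)}
contrib = feature_log_odds(probs)
# ring_A strongly supports activity, halogen is uninformative, amide
# argues against:
print(sorted(contrib, key=contrib.get, reverse=True))  # ['ring_A', 'halogen', 'amide']
```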
Glossiness of Colored Papers based on Computer Graphics Model and Its Measuring Method
Aida, Teizo
In the case of colored papers, the color of the surface strongly affects the gloss of the paper. A new glossiness measure for such colored papers is suggested in this paper. First, using achromatic and chromatic Munsell color chips, the author obtained experimental equations representing the relation between lightness V (or V and saturation C) and psychological glossiness Gph of these chips. Then, the author defined a new glossiness G for colored papers, based on the above-mentioned experimental equations for Gph and the Cook-Torrance reflection model, which is widely used in the field of Computer Graphics. This new glossiness is shown to be nearly proportional to the psychological glossiness Gph. The measuring system for the new glossiness G is furthermore described. The measuring time for one specimen is within 1 minute.
Shaded computer graphic techniques for visualizing and interpreting analytic fluid flow models
Parke, F. I.
1981-01-01
Mathematical models which predict the behavior of fluid flow in different experiments are simulated using digital computers. The simulations predict values of parameters of the fluid flow (pressure, temperature, and velocity vector) at many points in the fluid. Visualization of the spatial variation in the value of these parameters is important for comprehending and checking the generated data, for identifying the regions of interest in the flow, and for effectively communicating information about the flow to others. State-of-the-art imaging techniques developed in the field of three-dimensional shaded computer graphics are applied to the visualization of fluid flow. The use of an imaging technique known as 'SCAN' for visualizing fluid flow is studied and the results are presented.
uPy: a ubiquitous computer graphics Python API with Biological Modeling Applications
Autin, L.; Johnson, G.; Hake, J.; Olson, A.; Sanner, M.
2015-01-01
In this paper we describe uPy, an extension module for the Python programming language that provides a uniform abstraction of the APIs of several 3D computer graphics programs called hosts, including: Blender, Maya, Cinema4D, and DejaVu. A plugin written with uPy is a unique piece of code that will run in all uPy-supported hosts. We demonstrate the creation of complex plug-ins for molecular/cellular modeling and visualization and discuss how uPy can more generally simplify programming for many types of projects (not solely science applications) intended for multi-host distribution. uPy is available at http://upy.scripps.edu PMID:24806987
Comparison of two integration methods for dynamic causal modeling of electrophysiological data.
Lemaréchal, Jean-Didier; George, Nathalie; David, Olivier
2018-06-01
Dynamic causal modeling (DCM) is a methodological approach to study effective connectivity among brain regions. Based on a set of observations and a biophysical model of brain interactions, DCM uses a Bayesian framework to estimate the posterior distribution of the free parameters of the model (e.g. modulation of connectivity) and infer architectural properties of the most plausible model (i.e. model selection). When modeling electrophysiological event-related responses, the estimation of the model relies on the integration of the system of delay differential equations (DDEs) that describe the dynamics of the system. In this technical note, we compared two numerical schemes for the integration of DDEs. The first, and standard, scheme approximates the DDEs (more precisely, the state of the system with respect to conduction delays among brain regions) using ordinary differential equations (ODEs) and solves the resulting system with a fixed step size. The second scheme uses a dedicated DDE solver with adaptive step sizes to control error, making it theoretically more accurate. To highlight the effects of the approximation used by the first integration scheme on parameter estimation and Bayesian model selection, we performed simulations of local field potentials using first a simple model comprising 2 regions and second a more complex model comprising 6 regions. In these simulations, the second integration scheme served as the standard to which the first one was compared. Then, the performances of the two integration schemes were directly compared by fitting a public mismatch negativity EEG dataset with different models. The simulations revealed that the use of the standard DCM integration scheme was acceptable for Bayesian model selection but underestimated the connectivity parameters and did not allow an accurate estimation of conduction delays. Fitting to empirical data showed that the models systematically obtained an increased accuracy when using the second integration scheme.
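The difference between the two schemes comes down to how the delayed state is handled. A minimal fixed-step sketch for a scalar test DDE (not the DCM neural-mass equations) shows how strongly a conduction delay can change the dynamics, and hence why approximating it away can bias parameter estimates:

```python
import numpy as np

def euler_dde(tau, dt=0.01, T=8.0):
    """Fixed-step Euler for the scalar test DDE x'(t) = -x(t - tau),
    with constant history x(t <= 0) = 1. The delayed state is read back
    from the stored trajectory, tau/dt steps in the past."""
    n = int(T / dt)
    d = int(round(tau / dt))
    x = np.ones(n + 1)
    for k in range(n):
        delayed = x[k - d] if k >= d else 1.0
        x[k + 1] = x[k] + dt * (-delayed)
    return x

# With no delay the solution decays monotonically toward zero; with
# tau = 1.5 it overshoots below zero and oscillates -- behavior that a
# scheme which drops the delay cannot reproduce:
x0, x15 = euler_dde(0.0), euler_dde(1.5)
print(x0.min() > 0 and x15.min() < 0)  # True
```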
Infertile Individuals’ Marital Relationship Status, Happiness, and Mental Health: A Causal Model
Directory of Open Access Journals (Sweden)
Seyed Habiballah Ahmadi Forooshany
2014-11-01
Full Text Available Background: This study examined the causal model of the relation between marital relationship status, happiness, and mental health in infertile individuals. Materials and Methods: In this descriptive study, 155 subjects (men: 52 and women: 78), who had been visited in one of the infertility centers, voluntarily participated in a self-evaluation. The Golombok Rust Inventory of Marital Status, the Oxford Happiness Questionnaire, and the General Health Questionnaire were used as instruments of the study. Data were analyzed by SPSS17 and Amos 5 software using descriptive statistics, independent-sample t test, and path analysis. Results: Disregarding the gender factor, marital relationship status was directly related to happiness (p<0.05) and happiness was directly related to mental health (p<0.05). Also, the indirect relation between marital relationship status and mental health was significant (p<0.05). These results were confirmed in women participants, but in men participants only the direct relation between happiness and mental health was significant (p<0.05). Conclusion: Based on the goodness-of-fit indexes, happiness had a mediator role in the relation between marital relationship status and mental health in infertile individuals, disregarding the gender factor. Also, considering the gender factor, only in infertile women can marital relationship status directly and indirectly affect happiness and mental health.
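The path-analysis step above estimates an indirect (mediated) effect: the product of the path from predictor to mediator and the path from mediator to outcome. A minimal sketch of that computation on simulated data (variable names and effect sizes are illustrative, not the study's data):

```python
import numpy as np

def indirect_effect(x, m, y):
    """Basic mediation estimate via two ordinary-least-squares fits:
    a-path (x -> m) times b-path (m -> y, controlling for x)."""
    X1 = np.column_stack([np.ones_like(x), x])
    a = np.linalg.lstsq(X1, m, rcond=None)[0][1]
    X2 = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X2, y, rcond=None)[0][2]
    return a * b

rng = np.random.default_rng(0)
x = rng.normal(size=500)                         # e.g. relationship status
m = 0.6 * x + rng.normal(scale=0.5, size=500)    # mediator, e.g. happiness
y = 0.7 * m + rng.normal(scale=0.5, size=500)    # outcome, e.g. mental health
print(indirect_effect(x, m, y))  # close to the true value 0.6 * 0.7 = 0.42
```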
Gaussian Graphical Models Identify Networks of Dietary Intake in a German Adult Population.
Iqbal, Khalid; Buijsse, Brian; Wirth, Janine; Schulze, Matthias B; Floegel, Anna; Boeing, Heiner
2016-03-01
Data-reduction methods such as principal component analysis are often used to derive dietary patterns. However, such methods do not assess how foods are consumed in relation to each other. Gaussian graphical models (GGMs) are a set of novel methods that can address this issue. We sought to apply GGMs to derive sex-specific dietary intake networks representing consumption patterns in a German adult population. Dietary intake data from 10,780 men and 16,340 women of the European Prospective Investigation into Cancer and Nutrition (EPIC)-Potsdam cohort were cross-sectionally analyzed to construct dietary intake networks. Food intake for each participant was estimated using a 148-item food-frequency questionnaire that captured the intake of 49 food groups. GGMs were applied to log-transformed intakes (grams per day) of 49 food groups to construct sex-specific food networks. Semiparametric Gaussian copula graphical models (SGCGMs) were used to confirm GGM results. In men, GGMs identified 1 major dietary network that consisted of intakes of red meat, processed meat, cooked vegetables, sauces, potatoes, cabbage, poultry, legumes, mushrooms, soup, and whole-grain and refined breads. For women, a similar network was identified with the addition of fried potatoes. Other identified networks consisted of dairy products and sweet food groups. SGCGMs yielded results comparable to those of GGMs. GGMs are a powerful exploratory method that can be used to construct dietary networks representing dietary intake patterns that reveal how foods are consumed in relation to each other. GGMs indicated an apparent major role of red meat intake in a consumption pattern in the studied population. In the future, identified networks might be transformed into pattern scores for investigating their associations with health outcomes. © 2016 American Society for Nutrition.
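A GGM's conditional-independence structure is read off the precision (inverse covariance) matrix: two variables are connected in the network only if their partial correlation is nonzero. A minimal sketch on simulated data (the three-variable chain is illustrative, not the EPIC-Potsdam food groups):

```python
import numpy as np

def partial_correlations(data):
    """Gaussian graphical model sketch: partial correlations from the
    precision matrix. rho_ij = -p_ij / sqrt(p_ii * p_jj); a near-zero
    entry means conditional independence of i and j given the rest."""
    prec = np.linalg.inv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(prec))
    return -prec / np.outer(d, d) + 2 * np.eye(len(d))  # set diagonal to 1

# Chain x1 -> x2 -> x3: x1 and x3 are marginally correlated but
# conditionally independent given x2, so the GGM drops the 1-3 edge:
rng = np.random.default_rng(1)
x1 = rng.normal(size=2000)
x2 = x1 + rng.normal(scale=0.5, size=2000)
x3 = x2 + rng.normal(scale=0.5, size=2000)
pc = partial_correlations(np.column_stack([x1, x2, x3]))
print(abs(pc[0, 2]) < 0.1, pc[0, 1] > 0.5)  # True True
```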
Presenting the students’ academic achievement causal model based on goal orientation
Directory of Open Access Journals (Sweden)
EBRAHIM NASIRI
2017-10-01
Full Text Available Introduction: Several factors play a role in academic achievement, the individual's excellence, and the capability to perform the actions and tasks that the learner is in charge of in learning areas. The main goal of this study was to present an academic achievement causal model based on the dimensions of goal orientation and learning approaches among the students of Medical Science and Dentistry courses in Guilan University of Medical Sciences in 2013. Methods: This study is based on a cross-sectional model. The participants included 175 first- and second-year students of the Medical and Dentistry schools in Guilan University of Medical Sciences, selected by random cluster sampling [121 (69.1%) Medical Basic Science students and 54 (30.9%) Dentistry students]. The measurement tools included the Goal Orientation Scale of Bouffard, the Study Process Questionnaire of Biggs, and the students' Grade Point Average. The study data were analyzed using Pearson correlation coefficient and structural equation modeling. SPSS 14 and Amos were used to analyze the data. Results: The results indicated a significant relationship between goal orientation and learning strategies (P<0.05). In addition, the results revealed a significant relationship between learning strategies [deep learning (r=0.37, P<0.05) and surface learning (r=-0.21, P<0.05)] and academic achievement. The suggested research model fit the data. Conclusion: Results showed that the students' academic achievement model fits the experimental data, so it can be used in learning principles that lead to students' achievement in learning.
Mathematical structures for computer graphics
Janke, Steven J
2014-01-01
A comprehensive exploration of the mathematics behind the modeling and rendering of computer graphics scenes Mathematical Structures for Computer Graphics presents an accessible and intuitive approach to the mathematical ideas and techniques necessary for two- and three-dimensional computer graphics. Focusing on the significant mathematical results, the book establishes key algorithms used to build complex graphics scenes. Written for readers with various levels of mathematical background, the book develops a solid foundation for graphics techniques and fills in relevant grap
Higher-order ice-sheet modelling accelerated by multigrid on graphics cards
Brædstrup, Christian; Egholm, David
2013-04-01
Higher-order ice flow modelling is a very computer-intensive process, owing primarily to the nonlinear influence of the horizontal stress coupling. When applied to simulating long-term glacial landscape evolution, ice-sheet models must consider very long time series, while both high temporal and spatial resolution are needed to resolve small effects. Higher-order and full-Stokes models have therefore seen very limited usage in this field. However, recent advances in graphics card (GPU) technology for high-performance computing have proven extremely efficient in accelerating many large-scale scientific computations. The general-purpose GPU (GPGPU) technology is cheap, has a low power consumption, and fits into a normal desktop computer. It could therefore provide a powerful tool for many glaciologists working on ice flow models. Our current research focuses on utilising the GPU as a tool in ice-sheet and glacier modelling. To this end we have implemented the Integrated Second-Order Shallow Ice Approximation (iSOSIA) equations on the device using the finite difference method. To accelerate the computations, the GPU solver uses a nonlinear red-black Gauss-Seidel iterator coupled with a Full Approximation Scheme (FAS) multigrid setup to further aid convergence. The GPU finite difference implementation provides inherent parallelization that scales from hundreds to several thousands of cores on newer cards. We demonstrate the efficiency of the GPU multigrid solver using benchmark experiments.
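The red-black ordering mentioned above is what makes Gauss-Seidel parallelizable on a GPU: all "red" cells can be updated simultaneously, then all "black" ones. A minimal CPU sketch for a linear Poisson smoother (the paper's solver is nonlinear and wrapped in a FAS multigrid cycle; this shows only the update ordering):

```python
import numpy as np

def red_black_sweeps(u, f, h, sweeps=200):
    """Red-black Gauss-Seidel relaxation for the 2D Poisson problem
    -laplace(u) = f on a uniform grid of spacing h. Each half-sweep
    updates every cell of one checkerboard color at once, which is the
    ordering that maps onto thousands of parallel GPU threads."""
    mask = np.add.outer(np.arange(u.shape[0]), np.arange(u.shape[1])) % 2
    interior = np.zeros_like(u, dtype=bool)
    interior[1:-1, 1:-1] = True
    for _ in range(sweeps):
        for parity in (0, 1):
            upd = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                          np.roll(u, 1, 1) + np.roll(u, -1, 1) + h * h * f)
            cells = (mask == parity) & interior
            u[cells] = upd[cells]
    return u

# Smoothing a random initial guess with zero source and zero boundaries
# relaxes the iterate toward the exact solution u = 0:
rng = np.random.default_rng(0)
u = np.zeros((17, 17))
u[1:-1, 1:-1] = rng.normal(size=(15, 15))
red_black_sweeps(u, np.zeros_like(u), h=1.0, sweeps=300)
print(np.abs(u).max() < 1e-2)  # True
```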
Quantum-Assisted Learning of Hardware-Embedded Probabilistic Graphical Models
Benedetti, Marcello; Realpe-Gómez, John; Biswas, Rupak; Perdomo-Ortiz, Alejandro
2017-10-01
Mainstream machine-learning techniques such as deep learning and probabilistic programming rely heavily on sampling from generally intractable probability distributions. There is increasing interest in the potential advantages of using quantum computing technologies as sampling engines to speed up these tasks or to make them more effective. However, some pressing challenges in state-of-the-art quantum annealers have to be overcome before we can assess their actual performance. The sparse connectivity, resulting from the local interaction between quantum bits in physical hardware implementations, is considered the most severe limitation to the quality of constructing powerful generative unsupervised machine-learning models. Here, we use embedding techniques to add redundancy to data sets, allowing us to increase the modeling capacity of quantum annealers. We illustrate our findings by training hardware-embedded graphical models on a binarized data set of handwritten digits and two synthetic data sets in experiments with up to 940 quantum bits. Our model can be trained in quantum hardware without full knowledge of the effective parameters specifying the corresponding quantum Gibbs-like distribution; therefore, this approach avoids the need to infer the effective temperature at each iteration, speeding up learning; it also mitigates the effect of noise in the control parameters, making it robust to deviations from the reference Gibbs distribution. Our approach demonstrates the feasibility of using quantum annealers for implementing generative models, and it provides a suitable framework for benchmarking these quantum technologies on machine-learning-related tasks.
Optimal causal inference: estimating stored information and approximating causal architecture.
Still, Susanne; Crutchfield, James P; Ellison, Christopher J
2010-09-01
We introduce an approach to inferring the causal architecture of stochastic dynamical systems that extends rate-distortion theory to use causal shielding--a natural principle of learning. We study two distinct cases of causal inference: optimal causal filtering and optimal causal estimation. Filtering corresponds to the ideal case in which the probability distribution of measurement sequences is known, giving a principled method to approximate a system's causal structure at a desired level of representation. We show that in the limit in which a model-complexity constraint is relaxed, filtering finds the exact causal architecture of a stochastic dynamical system, known as the causal-state partition. From this, one can estimate the amount of historical information the process stores. More generally, causal filtering finds a graded model-complexity hierarchy of approximations to the causal architecture. Abrupt changes in the hierarchy, as a function of approximation, capture distinct scales of structural organization. For nonideal cases with finite data, we show how the correct number of the underlying causal states can be found by optimal causal estimation. A previously derived model-complexity control term allows us to correct for the effect of statistical fluctuations in probability estimates and thereby avoid overfitting.
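The causal-state partition can be illustrated on a toy process: histories that predict the same future belong to the same causal state. A sketch for the golden-mean process (a standard textbook example, not one analyzed in the abstract), using its known predictive distributions rather than estimates from data:

```python
from itertools import product

def causal_states(histories, next_dist):
    """Group histories by their next-symbol distribution: histories with
    identical predictive distributions fall in the same causal state."""
    states = {}
    for h in histories:
        states.setdefault(next_dist(h), []).append(h)
    return list(states.values())

# Golden-mean process: a 1 is always followed by 0; after a 0, emit 1
# with probability 1/2. Prediction depends only on the last symbol, so
# all allowed length-3 histories collapse into just two causal states.
def next_dist(h):
    return (1.0, 0.0) if h[-1] == 1 else (0.5, 0.5)  # (P(0), P(1))

histories = [h for h in product((0, 1), repeat=3)
             if (1, 1) not in zip(h, h[1:])]  # words without "11"
print(len(causal_states(histories, next_dist)))  # 2
```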
Repositioning the knee joint in human body FE models using a graphics-based technique.
Jani, Dhaval; Chawla, Anoop; Mukherjee, Sudipto; Goyal, Rahul; Vusirikala, Nataraju; Jayaraman, Suresh
2012-01-01
Human body finite element models (FE-HBMs) are available in standard occupant or pedestrian postures. There is a need to have FE-HBMs in the same posture as a crash victim or configured in varying postures. Developing FE models for all possible positions is not practically viable. The current work aims at obtaining a posture-specific human lower-extremity model by reconfiguring an existing one. A graphics-based technique was developed to reposition the lower extremity of an FE-HBM by specifying the flexion-extension angle. Elements of the model were segregated into rigid (bones) and deformable (soft tissue) components. The bones were rotated about the flexion-extension axis, followed by rotation about the longitudinal axis to capture the twisting of the tibia. The desired knee joint movement was thus achieved. Geometric heuristics were then used to reposition the skin, and a mapping defined over the space between the bones and the skin was used to regenerate the soft tissues. Mesh smoothing was then done to augment mesh quality. The developed method permits control over the kinematics of the joint and maintains the initial mesh quality of the model. For some critical areas (in the joint vicinity) where element distortion is large, mesh smoothing is done to improve mesh quality. Repositioning the model from 9 degrees of flexion to 90 degrees of flexion was demonstrated in just a few seconds, without subjective intervention. Because the mesh quality of the repositioned model was maintained at a predefined level (typically the level of a well-made model in the initial configuration), the model was suitable for subsequent simulations.
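The bone repositioning described above is a rigid rotation of node coordinates about the flexion-extension axis. A minimal sketch using Rodrigues' rotation formula (the axis, angle, and point are illustrative; the paper's method additionally handles the longitudinal tibia twist and soft-tissue regeneration):

```python
import numpy as np

def rotate_about_axis(points, axis, angle_deg):
    """Rotate an (n, 3) array of points about a unit axis through the
    origin using Rodrigues' formula: R = I + sin(t) K + (1 - cos(t)) K^2,
    where K is the cross-product matrix of the axis."""
    k = np.asarray(axis, float) / np.linalg.norm(axis)
    t = np.radians(angle_deg)
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    R = np.eye(3) + np.sin(t) * K + (1 - np.cos(t)) * (K @ K)
    return points @ R.T

# Flexing a point 90 degrees about the x (flexion-extension) axis moves
# it from the y axis onto the z axis:
p = np.array([[0.0, 1.0, 0.0]])
print(np.round(rotate_about_axis(p, (1, 0, 0), 90.0), 6))  # [[0. 0. 1.]]
```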
On modeling HIV and T cells in vivo: assessing causal estimators in vaccine trials.
Directory of Open Access Journals (Sweden)
W David Wick
2006-06-01
Full Text Available The first efficacy trials--named STEP--of a T cell vaccine against HIV/AIDS began in 2004. The unprecedented structure of these trials raised new modeling and statistical challenges. Is it plausible that memory T cells, as opposed to antibodies, can actually prevent infection? If they fail at prevention, to what extent can they ameliorate disease? And how do we estimate efficacy in a vaccine trial with two primary endpoints, one traditional, one entirely novel (viral load after infection, and where the latter may be influenced by selection bias due to the former? In preparation for the STEP trials, biostatisticians developed novel techniques for estimating a causal effect of a vaccine on viral load, while accounting for post-randomization selection bias. But these techniques have not been tested in biologically plausible scenarios. We introduce new stochastic models of T cell and HIV kinetics, making use of new estimates of the rate that cytotoxic T lymphocytes--CTLs; the so-called killer T cells--can kill HIV-infected cells. Based on these models, we make the surprising discovery that it is not entirely implausible that HIV-specific CTLs might prevent infection--as the designers explicitly acknowledged when they chose the endpoints of the STEP trials. By simulating thousands of trials, we demonstrate that the new statistical methods can correctly identify an efficacious vaccine, while protecting against a false conclusion that the vaccine exacerbates disease. In addition to uncovering a surprising immunological scenario, our results illustrate the utility of mechanistic modeling in biostatistics.
Network interactions underlying mirror feedback in stroke: A dynamic causal modeling study
Directory of Open Access Journals (Sweden)
Soha Saleh
2017-01-01
Full Text Available Mirror visual feedback (MVF) is potentially a powerful tool to facilitate recovery of disordered movement and stimulate activation of under-active brain areas due to stroke. The neural mechanisms underlying MVF have therefore been a focus of recent inquiry. Although it is known that sensorimotor areas can be activated via mirror feedback, the network interactions driving this effect remain unknown. The aim of the current study was to fill this gap by using dynamic causal modeling to test the interactions between regions in the frontal and parietal lobes that may be important for modulating the activation of the ipsilesional motor cortex during mirror visual feedback of unaffected hand movement in stroke patients. Our intent was to distinguish between two theoretical neural mechanisms that might mediate ipsilateral activation in response to mirror feedback: transfer of information between bilateral motor cortices versus recruitment of regions comprising an action observation network which in turn modulate the motor cortex. In an event-related fMRI design, fourteen chronic stroke subjects performed goal-directed finger flexion movements with their unaffected hand while observing real-time visual feedback of the corresponding (veridical) or opposite (mirror) hand in virtual reality. Among 30 plausible network models that were tested, the winning model revealed significant mirror feedback-based modulation of the ipsilesional motor cortex arising from the contralesional parietal cortex, in a region along the rostral extent of the intraparietal sulcus. No winning model was identified for the veridical feedback condition. We discuss our findings in the context of supporting the latter hypothesis, that mirror feedback-based activation of motor cortex may be attributed to engagement of a contralateral (contralesional) action observation network. These findings may have important implications for identifying putative cortical areas, which may be targeted with
Infertile Individuals’ Marital Relationship Status, Happiness, and Mental Health: A Causal Model
Ahmadi Forooshany, Seyed Habiballah; Yazdkhasti, Fariba; Safari Hajataghaie, Saiede; Nasr Esfahani, Mohammad Hossein
2014-01-01
Background This study examined the causal model of the relation between marital relationship status, happiness, and mental health in infertile individuals. Materials and Methods In this descriptive study, 155 subjects (men: 52 and women: 78), who had been visited in one of the infertility centers, voluntarily participated in a self-evaluation. The Golombok Rust Inventory of Marital Status, the Oxford Happiness Questionnaire, and the General Health Questionnaire were used as instruments of the study. Data were analyzed by SPSS17 and Amos 5 software using descriptive statistics, independent-sample t tests, and path analysis. Results Disregarding the gender factor, marital relationship status was directly related to happiness (p…), happiness was directly related to mental health (p…), and the relation between happiness and mental health was significant (p…). Happiness had a mediator role in the relation between marital relationship status and mental health in infertile individuals disregarding the gender factor. Also, considering the gender factor, only in infertile women can marital relationship status directly and indirectly affect happiness and mental health. PMID:25379161
Monk, J.; Zhu, Y.; Koons, P. O.; Segee, B. E.
2009-12-01
With the introduction of the G8X series of cards, nVidia released an architecture called CUDA; virtually all subsequent video cards have had CUDA support. With this new architecture, nVidia provided extensions for C/C++ that create an Application Programming Interface (API) allowing code to be executed on the GPU. Since then, the concept of the GPGPU (general-purpose graphics processing unit) has been growing: the GPU is very good at linear algebra and at running computations in parallel, so that power can be put to use in other applications. This is highly appealing in the area of geodynamic modeling, as multiple parallel solutions of the same differential equations at different points in space lead to a large speedup in simulation speed. Another benefit of CUDA is a programmatic method of transferring large amounts of data between the computer's main memory and the dedicated GPU memory located on the video card. In addition to computing and rendering on the video card, the CUDA framework allows for a large speedup in situations, such as with a tiled display wall, where the rendered pixels are to be displayed in a different location than where they are rendered. A CUDA extension for VirtualGL was developed, allowing faster readback at high resolutions. This paper examines several aspects of rendering OpenGL graphics on large displays using VirtualGL and VNC, and demonstrates how performance can be significantly improved when rendering on a tiled monitor wall. We present a CUDA-enhanced version of VirtualGL as well as the advantages of having multiple VNC servers, and discuss restrictions caused by readback and blitting rates and how they are affected by different sizes of the virtual displays being rendered.
Meznarich, R. A.; Shava, R. C.; Lightner, S. L.
2009-01-01
Engineering design graphics courses taught in colleges or universities should equip students preparing for employment with the basic occupational graphics skill competencies required by engineering and technology disciplines. Academic institutions should introduce and include topics that cover the newer and more efficient graphics…
Directory of Open Access Journals (Sweden)
Andres eOrtiz
2015-11-01
Full Text Available Alzheimer’s Disease (AD) is the most common neurodegenerative disease in elderly people. Its development has been shown to be closely related to changes in the brain connectivity network and in the brain activation patterns, along with structural changes caused by the neurodegenerative process. Methods to infer dependence between brain regions are usually derived from the analysis of covariance between activation levels in the different areas. However, these covariance-based methods are not able to estimate conditional independence between variables to factor out the influence of other regions. Conversely, models based on the inverse covariance, or precision matrix, such as Sparse Gaussian Graphical Models, allow revealing conditional independence between regions by estimating the covariance between two variables given the rest as constant. This paper uses Sparse Inverse Covariance Estimation (SICE) methods to learn undirected graphs in order to derive functional and structural connectivity patterns from Fludeoxyglucose (18F-FDG) Positron Emission Tomography (PET) data and segmented Magnetic Resonance images (MRI), drawn from the ADNI database, for Control, MCI (Mild Cognitive Impairment), and AD subjects. Sparse computation fits perfectly here, as brain regions usually only interact with a few other areas. The models clearly show different metabolic covariation patterns between subject groups, revealing the loss of strong connections in AD and MCI subjects when compared to Controls. Similarly, the variance between GM (Grey Matter) densities of different regions reveals different structural covariation patterns between the different groups. Thus, the different connectivity patterns for Controls and AD are used in this paper to select regions of interest in PET and GM images with discriminative power for early AD diagnosis. Finally, functional and structural models are combined to leverage the classification accuracy. The results obtained in this work show the usefulness
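The key property such precision-matrix methods exploit can be demonstrated in a few lines. The following is a toy sketch written for this summary (not the paper's SICE estimator): for a Gaussian chain X → Y → Z, the (X, Z) entry of the inverse covariance is exactly zero, encoding that X and Z are conditionally independent given Y even though their marginal covariance is nonzero.

```python
# Zero off-diagonal entries of the precision (inverse covariance) matrix
# correspond to conditional independence given all other variables.
# Toy chain: X ~ N(0,1), Y = X + e1, Z = Y + e2 (unit-variance noise),
# whose covariance matrix is known in closed form.

def inverse_3x3(m):
    """Invert a 3x3 matrix via the adjugate formula."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    cof = [
        [ (e*i - f*h), -(d*i - f*g),  (d*h - e*g)],
        [-(b*i - c*h),  (a*i - c*g), -(a*h - b*g)],
        [ (b*f - c*e), -(a*f - c*d),  (a*e - b*d)],
    ]
    # inverse = transpose of the cofactor matrix, divided by the determinant
    return [[cof[j][k] / det for j in range(3)] for k in range(3)]

# Covariance of (X, Y, Z) for the chain above
sigma = [[1, 1, 1],
         [1, 2, 2],
         [1, 2, 3]]

precision = inverse_3x3(sigma)
# Cov(X, Z) = 1 is nonzero, yet the precision entry for (X, Z) vanishes:
# X and Z are conditionally independent given Y.
print(precision[0][2])  # -> 0.0
```

Sparse estimation methods such as SICE work backwards from this property: they penalize the precision matrix towards having many exact zeros, so the surviving nonzero entries define the edges of the undirected graph.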
Causal modeling of self-concept, job satisfaction, and retention of nurses.
Cowin, Leanne S; Johnson, Maree; Craven, Rhonda G; Marsh, Herbert W
2008-10-01
The critical shortage of nurses experienced throughout the western world has prompted researchers to examine one major component of this complex problem - the impact of nurses' professional identity and job satisfaction on retention. A descriptive correlational design with a longitudinal element was used to examine a causal model of nurses' self-concept, job satisfaction, and retention plans in 2002. A random sample of 2000 registered nurses was selected from the state registering authority listing. A postal survey assessing multiple dimensions of nurses' self-concept (measured by the Nurse Self-Concept Questionnaire) and job satisfaction (measured by the Index of Work Satisfaction) was undertaken at Time 1 (n=528) and 8 months later at Time 2 (n=332), including retention plans (measured by the Nurse Retention Index). Using confirmatory factor analysis, correlation matrices, and path analysis, measurement and structural models were examined on matching pairs of data from T1 and T2 (total sample N=332). Nurses' self-concept was found to have a stronger association with nurses' retention plans (B=.45) than job satisfaction (B=.28). Aspects of pay and task were not significantly related to retention plans; however, professional status (r=.51) and, to a lesser extent, organizational policies (r=.27) were significant factors. Nurses' general self-concept was strongly related (r=.57) to retention plans. Strategies or interventions requiring implementation and evaluation include: counseling to improve nurses' general self-concept, education programs and competencies in health communication between health professionals, reporting of nurse-initiated programs with substantial patient benefit, nurse-friendly organizational policies, common health team learning opportunities, and autonomous practice models.
Causality and headache triggers
Turner, Dana P.; Smitherman, Todd A.; Martin, Vincent T.; Penzien, Donald B.; Houle, Timothy T.
2013-01-01
Objective The objective of this study was to explore the conditions necessary to assign causal status to headache triggers. Background The term "headache trigger" is commonly used to label any stimulus that is assumed to cause headaches. However, the assumptions required for determining whether a given stimulus in fact has a causal-type relationship in eliciting headaches have not been explicated. Methods Rubin's Causal Model is synthesized and applied to the context of headache causes. From this application, the conditions necessary to infer that one event (trigger) causes another (headache) are outlined using basic assumptions and examples from the relevant literature. Results Although many conditions must be satisfied for a causal attribution, three basic assumptions are identified for determining causality in headache triggers: 1) constancy of the sufferer; 2) constancy of the trigger effect; and 3) constancy of the trigger presentation. A valid evaluation of a potential trigger's effect can only be undertaken once these three basic assumptions are satisfied during formal or informal studies of headache triggers. Conclusions Evaluating these assumptions is extremely difficult or infeasible in clinical practice, and satisfying them during natural experimentation is unlikely. Researchers, practitioners, and headache sufferers are encouraged to avoid natural experimentation to determine the causal effects of headache triggers. Instead, formal experimental designs or retrospective diary studies using advanced statistical modeling techniques provide the best approaches to satisfy the required assumptions and inform causal statements about headache triggers. PMID:23534872
Kammerdiner, Alla; Xanthopoulos, Petros; Pardalos, Panos M.
2007-11-01
In this chapter, a potential problem with applying Granger causality based on simple vector autoregressive (VAR) modeling to EEG data is investigated. Although some initial studies tested whether the data support the stationarity assumption of VAR, the stability of the estimated model has rarely (if ever) been verified. In fact, in cases where the stability condition is violated, the process may exhibit random-walk-like behavior or even be explosive. The problem is illustrated by an example.
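The stability condition in question is easy to state and check for a fitted VAR(1) model: the process is stable (stationary) if and only if every eigenvalue of the coefficient matrix lies strictly inside the unit circle. A minimal illustrative check for the bivariate case (a generic sketch with made-up coefficient matrices, not the chapter's code):

```python
import cmath

def var1_stable(A):
    """Stability check for a bivariate VAR(1) model x_t = A x_{t-1} + e_t:
    stable iff both eigenvalues of A lie strictly inside the unit circle.
    Eigenvalues of the 2x2 matrix come from the characteristic polynomial
    lambda^2 - tr(A)*lambda + det(A) = 0 (possibly complex)."""
    a, b = A[0]
    c, d = A[1]
    tr, det = a + d, a*d - b*c
    disc = cmath.sqrt(tr*tr - 4*det)
    eigenvalues = [(tr + disc) / 2, (tr - disc) / 2]
    return all(abs(lam) < 1 for lam in eigenvalues)

print(var1_stable([[0.5, 0.2], [0.1, 0.4]]))  # -> True  (stationary)
print(var1_stable([[1.1, 0.0], [0.0, 0.5]]))  # -> False (explosive component)
```

For a fitted VAR(p), the same test is applied to the companion-form matrix; verifying it after estimation is exactly the step the chapter argues is routinely skipped.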
Pairwise graphical models for structural health monitoring with dense sensor arrays
Mohammadi Ghazi, Reza; Chen, Justin G.; Büyüköztürk, Oral
2017-09-01
Through advances in sensor technology and development of camera-based measurement techniques, it has become affordable to obtain high spatial resolution data from structures. Although measured datasets become more informative as the number of sensors increases, the spatial dependencies between sensor data increase at the same time. Therefore, appropriate data analysis techniques are needed to handle the inference problem in the presence of these dependencies. In this paper, we propose a novel approach that uses graphical models (GM) to account for the spatial dependencies between sensor measurements in dense sensor networks or arrays, in order to improve damage localization accuracy in structural health monitoring (SHM) applications. Because there are always unobserved damaged states in this application, the available information is insufficient for learning the GMs. To overcome this challenge, we propose an approximated model that uses the mutual information between sensor measurements to learn the GMs. The study is backed by experimental validation of the method on two test structures. The first is a three-story, two-bay steel model structure instrumented with MEMS accelerometers. The second experimental setup consists of a plate structure and a video camera to measure the displacement field of the plate. Our results show that considering the spatial dependencies via the proposed algorithm can significantly improve damage localization accuracy.
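The pairwise dependence score underlying such an approximation can be sketched with a plain empirical mutual-information estimator over discretized sensor readings (a generic illustration constructed for this note, not the authors' implementation):

```python
from math import log
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information (in nats) between two discrete sequences:
    a pairwise dependence measure one could use to decide which sensor pairs
    to connect when approximating a graphical model from data."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Toy discretized sensor readings: a perfectly dependent pair vs an
# independent pair.
a = [0, 1, 0, 1, 0, 1, 0, 1]
print(round(mutual_information(a, a), 3))                 # -> 0.693 (= ln 2)
print(round(mutual_information(a, [0, 0, 1, 1] * 2), 3))  # -> 0.0
```

In practice, continuous accelerometer or displacement signals would first be binned or quantized, and edges would be kept only for sensor pairs whose mutual information exceeds a threshold.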
Thompson, John
2009-01-01
Graphic storytelling is a medium that allows students to make and share stories, while developing their art communication skills. American comics today are more varied in genre, approach, and audience than ever before. When considering the impact of Japanese manga on the youth, graphic storytelling emerges as a powerful player in pop culture. In…
Skataric, Maja; Bose, Sandip; Zeroug, Smaine; Tilke, Peter
2017-02-01
It is not uncommon in the field of non-destructive evaluation that multiple measurements encompassing a variety of modalities are available for analysis and interpretation when determining the underlying states of nature of the materials or parts being tested. Despite, and sometimes due to, the richness of data, significant challenges arise in interpretation, manifested as ambiguities and inconsistencies due to various uncertain factors in the physical properties (inputs), environment, measurement device properties, human errors, and the measurement data (outputs). Most of these uncertainties cannot be described by rigorous mathematical means, and modeling all possibilities is usually infeasible for many real-time applications. In this work, we discuss an approach based on Hierarchical Bayesian Graphical Models (HBGM) for the improved interpretation of complex (multi-dimensional) problems with parametric uncertainties that lack usable physical models. In this setting, the input space of the physical properties is specified through prior distributions based on domain knowledge and expertise, which are represented as Gaussian mixtures to model the various possible scenarios of interest for non-destructive testing applications. Forward models are then used offline to generate the expected distribution of the proposed measurements, which are used to train a hierarchical Bayesian network. In Bayesian analysis, all model parameters are treated as random variables, and inference of the parameters is made on the basis of the posterior distribution given the observed data. Learned parameters of the posterior distribution obtained after training can therefore be used to build an efficient classifier for differentiating newly observed data in real time on the basis of pre-trained models. We illustrate the implementation of the HBGM approach with ultrasonic measurements used for cement evaluation of cased wells in the oil industry.
A graphical method for reducing and relating models in systems biology.
Gay, Steven; Soliman, Sylvain; Fages, François
2010-09-15
In Systems Biology, an increasing collection of models of various biological processes is currently being developed and made available in publicly accessible repositories, such as biomodels.net for instance, through common exchange formats such as SBML. To date, however, there is no general method to relate different models to each other by abstraction or reduction relationships, and this task is left to the modeler when re-using and coupling models. In mathematical biology, model reduction techniques have been studied for a long time, mainly in the case where a model exhibits different time scales, or different spatial phases, which can be analyzed separately. These techniques are, however, far too restrictive to be applied on a large scale in systems biology, and do not take into account abstractions other than time or phase decompositions. Our purpose here is to propose a general computational method for relating models together, by considering primarily the structure of the interactions and abstracting from their dynamics in a first step. We present a graph-theoretic formalism with node merge and delete operations, in which model reductions can be studied as graph matching problems. From this setting, we derive an algorithm for deciding whether there exists a reduction from one model to another, and evaluate it on the computation of the reduction relations between all SBML models of the biomodels.net repository. In particular, in the case of the numerous models of MAPK signalling, and of the circadian clock, biologically meaningful mappings between models of each class are automatically inferred from the structure of the interactions. We conclude on the generality of our graphical method, on its limits with respect to the representation of the structure of the interactions in SBML, and on some perspectives for dealing with the dynamics. The algorithms described in this article are implemented in the open-source software modeling platform BIOCHAM available at http
Li, Liang; Li, Baojuan; Bai, Yuanhan; Liu, Wenlei; Wang, Huaning; Leung, Hoi-Chung; Tian, Ping; Zhang, Linchuan; Guo, Fan; Cui, Long-Biao; Yin, Hong; Lu, Hongbing; Tan, Qingrong
2017-07-01
Understanding the neural basis underlying major depressive disorder (MDD) is essential for the diagnosis and treatment of this mental disorder. Aberrant activation and functional connectivity of the default mode network (DMN) have been consistently found in patients with MDD. It is not known whether effective connectivity within the DMN is altered in MDD. The primary objective of this study is to investigate the effective connectivity within the DMN during resting state in MDD patients before and after eight weeks of antidepressant treatment. We defined four regions of the DMN (medial frontal cortex, posterior cingulate cortex, left parietal cortex, and right parietal cortex) for each participant using a group independent component analysis. The coupling parameters reflecting the causal interactions among the DMN regions were estimated using spectral dynamic causal modeling (DCM). Twenty-seven MDD patients and 27 healthy controls were included in the statistical analysis. Our results showed declined influences from the left parietal cortex to other DMN regions in the pre-treatment patients as compared with healthy controls. After eight weeks of treatment, the influence from the right parietal cortex to the posterior cingulate cortex significantly decreased. These findings suggest that the reduced excitatory causal influence of the left parietal cortex is the key alteration of the DMN in patients with MDD, and that the disrupted causal influence the parietal cortex exerts on the posterior cingulate cortex is responsive to antidepressant treatment.
Morabia, Alfredo
2005-01-01
Epidemiological methods, which combine population thinking and group comparisons, can primarily identify causes of disease in populations. There is therefore a tension between our intuitive notion of a cause, which we want to be deterministic and invariant at the individual level, and the epidemiological notion of causes, which are invariant only at the population level. Epidemiologists have heretofore given a pragmatic solution to this tension. Causal inference in epidemiology consists in checking the logical coherence of a causality statement and determining whether what has been found grossly contradicts what we think we already know: how strong is the association? Is there a dose-response relationship? Does the cause precede the effect? Is the effect biologically plausible? Etc. This approach to causal inference can be traced back to the English philosophers David Hume and John Stuart Mill. On the other hand, the mode of establishing causality devised by Jakob Henle and Robert Koch, which has been fruitful in bacteriology, requires that in every instance the effect invariably follows the cause (e.g., inoculation of the Koch bacillus and tuberculosis). This is incompatible with epidemiological causality, which has to deal with probabilistic effects (e.g., smoking and lung cancer) and is therefore invariant only for the population.
Directory of Open Access Journals (Sweden)
Kateryna P. Osadcha
2017-12-01
Full Text Available The article is devoted to some aspects of forming future bachelors' graphic competence in computer sciences while teaching the fundamentals of working with three-dimensional modelling means. An analysis, classification, and systematization of three-dimensional modelling means are given. The aim of the research is to investigate the toolsets and classification of three-dimensional modelling means, and to correlate the skills being formed with those demanded in the labour market, in order to use them further in the process of forming graphic competence while training future bachelors in computer sciences. The peculiarities of the process of forming future bachelors' graphic competence in computer sciences are outlined by revealing, analyzing, and systematizing three-dimensional modelling means and types of three-dimensional graphics at the present stage of the development of information technologies. The result of the research is a software choice in three-dimensional modelling for the process of training future bachelors in computer sciences.
Directory of Open Access Journals (Sweden)
Antonio Luis Ampliato Briones
2014-10-01
Full Text Available This paper primarily reflects on the need to create graphical codes for producing images intended to communicate architecture. Each step of the drawing needs to be a deliberate process in which the proposed code highlights the relationship between architectural theory and graphic action. Our aim is not to draw the result of the architectural process but the design structure of the actual process; to draw as we design; to draw as we build. This analysis of the work of the Late Gothic architect Hernan Ruiz the Elder, from Cordoba, addresses two aspects: the historical and architectural investigation, and the graphical project for communication purposes.
Directory of Open Access Journals (Sweden)
Jinping Sun
2017-01-01
Full Text Available The multiple hypothesis tracker (MHT) is currently the preferred method for addressing the data association problem in multitarget tracking (MTT) applications. MHT seeks the most likely global hypothesis by enumerating all possible associations over time, which is equal to calculating the maximum a posteriori (MAP) estimate over the report data. Despite being a well-studied method, MHT remains challenging, mostly because of the computational complexity of data association. In this paper, we describe an efficient method for solving the data association problem using graphical model approaches. The proposed method uses the graph representation to model the global hypothesis formation and subsequently applies an efficient message passing algorithm to obtain the MAP solution. Specifically, the graph representation of the data association problem is formulated as a maximum weight independent set problem (MWISP), which translates the best global hypothesis formation into finding the maximum weight independent set on the graph. Then, a max-product belief propagation (MPBP) inference algorithm is applied to seek the most likely global hypotheses with the purpose of avoiding a brute force hypothesis enumeration procedure. The simulation results show that the proposed MPBP-MHT method can achieve better tracking performance than other algorithms in challenging tracking situations.
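The MWISP formulation the abstract describes can be made concrete with a tiny brute-force solver (illustrative only, with made-up weights; the paper's point is precisely that MPBP avoids this exhaustive enumeration): nodes are association hypotheses weighted by likelihood, edges join incompatible hypotheses (e.g., ones sharing a measurement), and the best global hypothesis is the heaviest conflict-free subset.

```python
from itertools import combinations

def max_weight_independent_set(weights, edges):
    """Brute-force maximum-weight independent set on a small graph.
    weights: dict node -> weight; edges: set of frozenset node pairs.
    Returns (best subset, best total weight)."""
    best, best_w = set(), 0.0
    nodes = list(weights)
    for r in range(len(nodes) + 1):
        for subset in combinations(nodes, r):
            # Skip subsets containing any pair of conflicting hypotheses
            if any(frozenset(p) in edges for p in combinations(subset, 2)):
                continue
            w = sum(weights[v] for v in subset)
            if w > best_w:
                best, best_w = set(subset), w
    return best, best_w

# Hypothetical example: hypotheses a and b conflict, and b and c conflict
weights = {"a": 2.0, "b": 3.0, "c": 2.5}
edges = {frozenset(("a", "b")), frozenset(("b", "c"))}
best_set, best_weight = max_weight_independent_set(weights, edges)
print(sorted(best_set), best_weight)  # -> ['a', 'c'] 4.5
```

The enumeration above is exponential in the number of hypotheses, which is why message-passing schemes such as max-product belief propagation are attractive for realistic tracking scenes.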
Mezlini, Aziz M; Goldenberg, Anna
2017-10-01
Discovering genetic mechanisms driving complex diseases is a hard problem. Existing methods often lack power to identify the set of responsible genes. Protein-protein interaction networks have been shown to boost power when detecting gene-disease associations. We introduce a Bayesian framework, Conflux, to find disease-associated genes from exome sequencing data using networks as a prior. There are two main advantages to using networks within a probabilistic graphical model. First, networks are noisy and incomplete, a substantial impediment to gene discovery. Incorporating networks into the structure of a probabilistic model for gene inference has less impact on the solution than relying on the noisy network structure directly. Second, using a Bayesian framework, we can keep track of the uncertainty of each gene being associated with the phenotype rather than returning a fixed list of genes. We first show that using networks clearly improves gene detection compared to individual gene testing. We then show consistently improved performance of Conflux compared to the state-of-the-art diffusion network-based method Hotnet2 and a variety of other network and variant aggregation methods, using randomly generated and literature-reported gene sets. We test Hotnet2 and Conflux on several network configurations to reveal biases and patterns of false positives and false negatives in each case. Our experiments show that our novel Bayesian framework Conflux incorporates many of the advantages of the current state-of-the-art methods, while offering more flexibility and improved power in many gene-disease association scenarios.
Configuring a Graphical User Interface for Managing Local HYSPLIT Model Runs Through AWIPS
Wheeler, Mark M.; Blottman, Peter F.; Sharp, David W.; Hoeth, Brian; VanSpeybroeck, Kurt M.
2009-01-01
Responding to incidents involving the release of harmful airborne pollutants is a continual challenge for Weather Forecast Offices in the National Weather Service. When such incidents occur, current protocol recommends forecaster-initiated requests of NOAA's Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model output through the National Centers for Environmental Prediction to obtain critical dispersion guidance. Individual requests are submitted manually through a secured web site, with desired multiple requests submitted in sequence, for the purpose of obtaining useful trajectory and concentration forecasts associated with the significant release of harmful chemical gases, radiation, wildfire smoke, etc., into the local atmosphere. To help manage local HYSPLIT runs for both routine and emergency use, a graphical user interface was designed for operational efficiency. The interface allows forecasters to quickly determine the current HYSPLIT configuration for the list of predefined sites (e.g., fixed sites and floating sites), and to make any necessary adjustments to key parameters such as Input Model, Number of Forecast Hours, etc. When using the interface, forecasters will obtain desired output more confidently and without the danger of corrupting essential configuration files.
Albouy, Philippe; Mattout, Jérémie; Sanchez, Gaëtan; Tillmann, Barbara; Caclin, Anne
2015-01-01
Congenital amusia is a neuro-developmental disorder that primarily manifests as a difficulty in the perception and memory of pitch-based materials, including music. Recent findings have shown that the amusic brain exhibits altered functioning of a fronto-temporal network during pitch perception and short-term memory. Within this network, during the encoding of melodies, a decreased right backward frontal-to-temporal connectivity was reported in amusia, along with an abnormal connectivity within and between auditory cortices. The present study investigated whether connectivity patterns between these regions were affected during the short-term memory retrieval of melodies. Amusics and controls had to indicate whether sequences of six tones that were presented in pairs were the same or different. When melodies were different only one tone changed in the second melody. Brain responses to the changed tone in "Different" trials and to its equivalent (original) tone in "Same" trials were compared between groups using Dynamic Causal Modeling (DCM). DCM results confirmed that congenital amusia is characterized by an altered effective connectivity within and between the two auditory cortices during sound processing. Furthermore, right temporal-to-frontal message passing was altered in comparison to controls, with notably an increase in "Same" trials. An additional analysis in control participants emphasized that the detection of an unexpected event in the typically functioning brain is supported by right fronto-temporal connections. The results can be interpreted in a predictive coding framework as reflecting an abnormal prediction error sent by temporal auditory regions towards frontal areas in the amusic brain.
Directory of Open Access Journals (Sweden)
Philippe eAlbouy
2015-02-01
Full Text Available Congenital amusia is a neuro-developmental disorder that primarily manifests as a difficulty in the perception and memory of pitch-based materials, including music. Recent findings have shown that the amusic brain exhibits altered functioning of a fronto-temporal network during pitch perception and memory. Within this network, during the encoding of melodies, a decreased right backward frontal-to-temporal connectivity was reported in amusia, along with an abnormal connectivity within and between auditory cortices. The present study investigated whether connectivity patterns between these regions were affected during the retrieval of melodies. Amusics and controls had to indicate whether sequences of six tones that were presented in pairs were the same or different. When melodies were different only one tone changed in the second melody. Brain responses to the changed tone in Different trials and to its equivalent (original) tone in Same trials were compared between groups using Dynamic Causal Modeling (DCM). DCM results confirmed that congenital amusia is characterized by an altered effective connectivity within and between the two auditory cortices during sound processing. Furthermore, right temporal-to-frontal message passing was altered in comparison to controls, with an increase in Same trials and a decrease in Different trials. An additional analysis in control participants emphasized that the detection of an unexpected event in the typically functioning brain is supported by right fronto-temporal connections. The results can be interpreted in a predictive coding framework as reflecting an abnormal prediction error sent by temporal auditory regions towards frontal areas in the amusic brain.
Directory of Open Access Journals (Sweden)
Liangsuo Ma
2015-01-01
Full Text Available Cocaine dependence is associated with increased impulsivity in humans. Both cocaine dependence and impulsive behavior are under the regulatory control of cortico-striatal networks. One behavioral laboratory measure of impulsivity is response inhibition (the ability to withhold a prepotent response), in which altered patterns of regional brain activation during executive tasks in service of normal performance are frequently found in cocaine dependent (CD) subjects studied with functional magnetic resonance imaging (fMRI). However, little is known about aberrations in specific directional neuronal connectivity in CD subjects. The present study employed fMRI-based dynamic causal modeling (DCM) to study the effective (directional) neuronal connectivity associated with response inhibition in CD subjects, elicited under performance of a Go/NoGo task with two levels of NoGo difficulty (Easy and Hard). The performance on the Go/NoGo task was not significantly different between CD subjects and controls. The DCM analysis revealed that prefrontal–striatal connectivity was modulated (influenced) during the NoGo conditions for both groups. The effective connectivity from the left (L) anterior cingulate cortex (ACC) to the L caudate was similarly modulated during the Easy NoGo condition for both groups. During the Hard NoGo condition in controls, the effective connectivity from the right (R) dorsolateral prefrontal cortex (DLPFC) to the L caudate became more positive, and the effective connectivity from the R ventrolateral prefrontal cortex (VLPFC) to the L caudate became more negative. In CD subjects, the effective connectivity from the L ACC to the L caudate became more negative during the Hard NoGo conditions. These results indicate that during Hard NoGo trials in CD subjects, the ACC rather than the DLPFC or VLPFC influenced the caudate during response inhibition.
Directory of Open Access Journals (Sweden)
John F Magnotti
2017-02-01
Full Text Available Audiovisual speech integration combines information from auditory speech (the talker's voice) and visual speech (the talker's mouth movements) to improve perceptual accuracy. However, if the auditory and visual speech emanate from different talkers, integration decreases accuracy. Therefore, a key step in audiovisual speech perception is deciding whether auditory and visual speech have the same source, a process known as causal inference. A well-known illusion, the McGurk Effect, consists of incongruent audiovisual syllables, such as auditory "ba" + visual "ga" (AbaVga), that are integrated to produce a fused percept ("da"). This illusion raises two fundamental questions: first, given the incongruence between the auditory and visual syllables in the McGurk stimulus, why are they integrated; and second, why does the McGurk effect not occur for other, very similar syllables (e.g., AgaVba)? We describe a simplified model of causal inference in multisensory speech perception (CIMS) that predicts the perception of arbitrary combinations of auditory and visual speech. We applied this model to behavioral data collected from 60 subjects perceiving both McGurk and non-McGurk incongruent speech stimuli. The CIMS model successfully predicted both the audiovisual integration observed for McGurk stimuli and the lack of integration observed for non-McGurk stimuli. An identical model without causal inference failed to accurately predict perception for either form of incongruent speech. The CIMS model uses causal inference to provide a computational framework for studying how the brain performs one of its most important tasks, integrating auditory and visual speech cues to allow us to communicate with others.
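The generic Gaussian causal-inference computation that CIMS-style models build on can be sketched as follows (a standard textbook formulation with assumed parameter names, not the paper's CIMS implementation): compare the likelihood of the two cues under a single shared source versus two independent sources, then form the posterior probability of a common cause.

```python
from math import exp, sqrt, pi

def p_common(xa, xv, va, vv, vp, prior_c1):
    """Posterior probability that an auditory cue xa and a visual cue xv
    arose from one shared source, in a Gaussian causal-inference model.
    va, vv: sensory noise variances; vp: prior variance of the source
    location (zero-mean prior); prior_c1: prior probability of one cause.
    Parameter names and values here are illustrative assumptions."""
    # Likelihood of (xa, xv) under a single shared source (closed form,
    # after integrating out the source location).
    denom = va*vv + va*vp + vv*vp
    like_c1 = exp(-0.5 * ((xa - xv)**2 * vp + xa**2 * vv + xv**2 * va)
                  / denom) / (2 * pi * sqrt(denom))
    # Likelihood under two independent sources: product of the marginals.
    def marginal(x, v):
        return exp(-0.5 * x*x / (v + vp)) / sqrt(2 * pi * (v + vp))
    like_c2 = marginal(xa, va) * marginal(xv, vv)
    return like_c1 * prior_c1 / (like_c1 * prior_c1
                                 + like_c2 * (1 - prior_c1))

# Nearby cues are judged to share a cause far more often than distant ones
near = p_common(0.0, 0.2, 1.0, 1.0, 10.0, 0.5)
far = p_common(0.0, 4.0, 1.0, 1.0, 10.0, 0.5)
print(near > far)  # -> True
```

This captures the qualitative pattern in the abstract: small audiovisual discrepancies (McGurk-like stimuli) still yield a high common-cause probability and hence integration, while larger discrepancies push the posterior towards separate causes and no integration.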
Construction of a graphic interface for a nuclear reactor modelling and simulation
International Nuclear Information System (INIS)
Cadrdenas C, Carlos Roberto; Riquelme R, Raul Antonio.
1995-01-01
A graphic interface is presented for real time transient analysis under reactivity insertion, reactor operators training, and the RECH-1 reactor licensing, using the Paret (Program for Analysis of Reactor Transients) computer code. 17 refs., 29 figs
GRAPHIC ADVERTISING, SPECIALIZED COMMUNICATIONS MODEL THROUGH SYMBOLS, WORDS, IMAGES
ADRONACHI Maria
2011-01-01
The aim of the paper is to identify the graphic advertising components: symbol, text, colour; to illustrate how they cooperate in order to create the advertising message; and to analyze the correlation product – advertising – consumer.
GRAPHIC ADVERTISING, SPECIALIZED COMMUNICATIONS MODEL THROUGH SYMBOLS, WORDS, IMAGES
Directory of Open Access Journals (Sweden)
ADRONACHI Maria
2011-06-01
Full Text Available The aim of the paper is to identify the graphic advertising components: symbol, text, colour; to illustrate how they cooperate in order to create the advertising message; and to analyze the correlation product – advertising – consumer.
A Model-Driven Approach to Graphical User Interface Runtime Adaptation
Criado, Javier; Vicente Chicote, Cristina; Iribarne, Luis; Padilla, Nicolás
2010-01-01
Graphical user interfaces play a key role in human-computer interaction, as they link the system with its end-users, allowing information exchange and improving communication. Nowadays, users increasingly demand applications with adaptive interfaces that dynamically evolve in response to their specific needs. Thus, providing graphical user interfaces with runtime adaptation capabilities is becoming more and more an important issue. To address this problem, this paper proposes a componen...
Fenton, N.; Neil, M.; Lagnado, D.; Marsh, W.; Yet, B.; Constantinou, A.
2016-01-01
We show that existing Bayesian network (BN) modelling techniques cannot capture the correct intuitive reasoning in the important case when a set of mutually exclusive events need to be modelled as separate nodes instead of states of a single node. A previously proposed ‘solution’, which introduces a simple constraint node that enforces mutual exclusivity, fails to preserve the prior probabilities of the events, while other proposed solutions involve major changes to the original model. We pro...
The Mediation Formula: A Guide to the Assessment of Causal Pathways in Nonlinear Models
2011-10-27
There, too, the restrictions imposed by the "principal-strata" framework lead to surrogacy criteria that are incompatible with the practical aims of... surrogacy (see Pearl (2011)). After all, if our aim is to uncover causal mechanisms, it is hard to accept the PSDE restriction that
Health and Wealth of Elderly Couples : Causality Tests Using Dynamic Panel Data Models
Michaud, P.C.; van Soest, A.H.O.
2004-01-01
A positive relationship between socio-economic status (SES) and health, the so-called "health-wealth gradient", is repeatedly found in most industrialized countries with similar levels of health care technology and economic welfare. This study analyzes causality from health to wealth (health
A Causal Model of Career Development and Quality of Life of College Students with Disabilities
Chun, Jina
2017-01-01
Researchers have assumed that social cognitive factors play significant roles in the career development of transition youth and young adults with disabilities and those without disabilities. However, research on the influence of the career decision-making process as a primary causal agent in one's psychosocial outcomes such as perceived level of…
Asakawa, Masami; Okano, Masao
2009-01-01
This study examined the factors influencing consumers' perception of online shopping and developed a causal model that explains how this perception affects their online-shopping behavior. We administered a questionnaire survey to 297 college students. By utilizing the answers to 13 questions pertaining to consumer perceptions, we conducted a factor analysis that identified the following three factors: "convenience", "anxiety regarding security" and "poor navigation". On the basis of this resu...
Determining species expansion and extinction possibilities using probabilistic and graphical models
Directory of Open Access Journals (Sweden)
Chaturvedi Rajesh
2015-03-01
Full Text Available Survival of plant species is governed by a number of functions. The participation of each function in species survival, and the impact of contrary behaviour of the species, vary from function to function. The probability of extinction of a species varies in all such scenarios and has to be calculated separately. Secondly, species follow different patterns of dispersal and localisation at different stages of the occupancy state of a site; therefore, scenarios of competition for resources under climatic shifts, leading to deterioration and loss of biodiversity and ultimately to extinction, need to be studied. Furthermore, the most likely deviations of species from climax community states need to be calculated before species become extinct due to sudden environmental disruption. Globally, various types of anthropogenic disturbance threaten the diversity of biological systems. The impact of these anthropogenic activities needs to be analysed to identify extinction patterns with respect to these activities. All the analyses mentioned above were pursued through probabilistic or graphical models in this study.
Analysis of impact of general-purpose graphics processor units in supersonic flow modeling
Emelyanov, V. N.; Karpenko, A. G.; Kozelkov, A. S.; Teterina, I. V.; Volkov, K. N.; Yalozo, A. V.
2017-06-01
Computational methods are widely used in the prediction of complex flowfields associated with off-normal situations in aerospace engineering. Modern graphics processing units (GPUs) provide architectures and new programming models that make it possible to harness their large processing power and to design computational fluid dynamics (CFD) simulations at both high performance and low cost. Possibilities of using GPUs for the simulation of external and internal flows on unstructured meshes are discussed. The finite volume method is applied to solve three-dimensional unsteady compressible Euler and Navier-Stokes equations on unstructured meshes with high-resolution numerical schemes. CUDA technology is used for the programming implementation of parallel computational algorithms. Solutions of some benchmark test cases on GPUs are reported, and the computed results are compared with experimental and computational data. Approaches to optimization of the CFD code related to the use of different types of memory are considered. The speedup of the GPU solution relative to the central processing unit (CPU) solution is measured. Performance measurements show that the numerical schemes developed achieve a 20-50x speedup on GPU hardware compared to the CPU reference implementation. The results obtained provide a promising perspective for designing a GPU-based software framework for applications in CFD.
Chen, Yonghong; Bressler, Steven L.; Knuth, Kevin H.; Truccolo, Wilson A.; Ding, Mingzhou
2006-06-01
In this article we consider the stochastic modeling of neurobiological time series from cognitive experiments. Our starting point is the variable-signal-plus-ongoing-activity model. From this model a differentially variable component analysis strategy is developed from a Bayesian perspective to estimate event-related signals on a single trial basis. After subtracting out the event-related signal from recorded single trial time series, the residual ongoing activity is treated as a piecewise stationary stochastic process and analyzed by an adaptive multivariate autoregressive modeling strategy which yields power, coherence, and Granger causality spectra. Results from applying these methods to local field potential recordings from monkeys performing cognitive tasks are presented.
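The Granger-causality step of the pipeline above can be illustrated in miniature: fit a lag-1 autoregression for y with and without the lagged other series and compare one-step prediction errors. This is a pure-Python time-domain sketch of the idea, not the adaptive multivariate spectral method used in the article; the series x and y are simulated for illustration:

```python
import math
import random

def granger_index(x, y):
    """Log ratio of restricted vs. full one-step prediction error of y.
    Positive values mean that lagged x improves prediction of y (lag-1 toy VAR)."""
    # Restricted model: y[t] ~ b * y[t-1] (one-regressor least squares)
    num = sum(y[t - 1] * y[t] for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    b = num / den
    rss_r = sum((y[t] - b * y[t - 1]) ** 2 for t in range(1, len(y)))
    # Full model: y[t] ~ b1 * y[t-1] + b2 * x[t-1] (2x2 normal equations)
    s11 = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    s12 = sum(y[t - 1] * x[t - 1] for t in range(1, len(y)))
    s22 = sum(x[t - 1] ** 2 for t in range(1, len(y)))
    c1 = sum(y[t - 1] * y[t] for t in range(1, len(y)))
    c2 = sum(x[t - 1] * y[t] for t in range(1, len(y)))
    det = s11 * s22 - s12 * s12
    b1 = (s22 * c1 - s12 * c2) / det
    b2 = (s11 * c2 - s12 * c1) / det
    rss_f = sum((y[t] - b1 * y[t - 1] - b2 * x[t - 1]) ** 2
                for t in range(1, len(y)))
    return math.log(rss_r / rss_f)

# Simulate x driving y with a one-sample delay: the index is asymmetric.
random.seed(1)
x = [random.gauss(0, 1) for _ in range(500)]
y = [0.0] + [0.8 * x[t - 1] + 0.2 * random.gauss(0, 1) for t in range(1, 500)]
assert granger_index(x, y) > granger_index(y, x)
```

The asymmetry of the index is what makes the measure directional: influence from x to y shows up as a large index in one direction and a near-zero index in the other.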
Glassner, Andrew S
1993-01-01
"The GRAPHICS GEMS Series" was started in 1990 by Andrew Glassner. The vision and purpose of the Series was - and still is - to provide tips, techniques, and algorithms for graphics programmers. All of the gems are written by programmers who work in the field and are motivated by a common desire to share interesting ideas and tools with their colleagues. Each volume provides a new set of innovative solutions to a variety of programming problems.
DEFF Research Database (Denmark)
Bergstrøm-Nielsen, Carl
1992-01-01
Textbook to be used along with training in the practice of graphic notation. Describes method; exercises; bibliography; collection of examples. If you can read Danish, please refer to that edition, which is far more up to date.
Causal inference in econometrics
Kreinovich, Vladik; Sriboonchitta, Songsak
2016-01-01
This book is devoted to the analysis of causal inference which is one of the most difficult tasks in data analysis: when two phenomena are observed to be related, it is often difficult to decide whether one of them causally influences the other one, or whether these two phenomena have a common cause. This analysis is the main focus of this volume. To get a good understanding of the causal inference, it is important to have models of economic phenomena which are as accurate as possible. Because of this need, this volume also contains papers that use non-traditional economic models, such as fuzzy models and models obtained by using neural networks and data mining techniques. It also contains papers that apply different econometric models to analyze real-life economic dependencies.
1990-01-01
A mathematician, David R. Hedgley, Jr., developed a computer program that considers whether a line in a graphic model of a three-dimensional object should or should not be visible. Known as the Hidden Line Computer Code, the program automatically removes superfluous lines and displays an object from a specific viewpoint, just as the human eye would see it. An example of how one company uses the program is the experience of Birdair, which specializes in production of fabric skylights and stadium covers. The fabric, called SHEERFILL, is a Teflon-coated fiberglass material developed in cooperation with DuPont Company. SHEERFILL glazed structures are either tension structures or air-supported tension structures. Both are formed by patterned fabric sheets supported by a steel or aluminum frame or cable network. Birdair uses the Hidden Line Computer Code to illustrate a prospective structure to an architect or owner. The program generates a three-dimensional perspective with the hidden lines removed. This program is still used by Birdair and continues to be commercially available to the public.
van Ments, Laila; Roelofsma, Peter; Treur, Jan
2018-01-01
Religion is a central aspect of many individuals' lives around the world, and its influence on human behaviour has been extensively studied from many different perspectives. The current study integrates a number of these perspectives into one adaptive temporal-causal network model describing the mental states involved, their mutual relations, and the adaptation of some of these relations over time due to learning. By first developing a conceptual representation of a network model based on the literature, and then formalizing this model into a numerical representation, simulations can be done for almost any kind of religion and person, showing different behaviours for persons with different religious backgrounds and characters. The focus was mainly on the influence of religion on human empathy and dis-empathy, a topic very relevant today. The developed model could be valuable for many uses, involving support for a better understanding, and even prediction, of the behaviour of religious individuals. It is illustrated for a number of different scenarios based on different characteristics of the persons and of the religion.
Directory of Open Access Journals (Sweden)
Marinela eCapanu
2015-05-01
Full Text Available Identifying the small number of rare causal variants contributing to disease has been a major focus of investigation in recent years, but represents a formidable statistical challenge due to the rare frequencies with which these variants are observed. In this commentary we draw attention to a formal statistical framework, namely hierarchical modeling, to combine functional genomic annotations with sequencing data with the objective of enhancing our ability to identify rare causal variants. Using simulations we show that in all configurations studied, the hierarchical modeling approach has superior discriminatory ability compared to a recently proposed aggregate measure of deleteriousness, the Combined Annotation-Dependent Depletion (CADD) score, supporting our premise that aggregate functional genomic measures can more accurately identify causal variants when used in conjunction with sequencing data through a hierarchical modeling approach.
Fuertes Casals, Alba; Casals Casanova, Miquel; Gangolells Solanellas, Marta; Forcada Matheu, Núria; Macarulla Martí, Marcel; Roca Ramon, Xavier
2013-01-01
Despite the increasing efforts made by the construction sector to reduce the environmental impact of its processes, construction sites are still a major source of pollution and adverse impacts on the environment. This paper aims to improve the understanding of construction-related environmental impacts by identifying on-site causal factors and associated immediate circumstances during construction processes for residential building projects. Based on the literature and focus g...
Causal compositional models in valuation-based systems with examples in specific theories
Czech Academy of Sciences Publication Activity Database
Jiroušek, Radim; Shenoy, P. P.
2016-01-01
Roč. 72, č. 1 (2016), s. 95-112 ISSN 0888-613X Grant - others:GA ČR(CZ) GA15-00215S Institutional support: RVO:67985556 Keywords : operator of composition * causality * belief function Subject RIV: AH - Economics OBOR OECD: Economic Theory Impact factor: 2.845, year: 2016 http://library.utia.cas.cz/separaty/2017/MTR/jirousek-0481260.pdf
Komperda, Regis
The purpose of this dissertation is to test a model of relationships among factors characterizing aspects of a student-centered constructivist learning environment and student outcomes of satisfaction and academic achievement in introductory undergraduate chemistry courses. Constructivism was chosen as the theoretical foundation for this research because of its widespread use in chemical education research and practice. In a constructivist learning environment the role of the teacher shifts from delivering content towards facilitating active student engagement in activities that encourage individual knowledge construction through discussion and application of content. Constructivist approaches to teaching introductory chemistry courses have been adopted by some instructors as a way to improve student outcomes, but little research has been done on the causal relationships among particular aspects of the learning environment and student outcomes. This makes it difficult for classroom teachers to know which aspects of a constructivist teaching approach are critical to adopt and which may be modified to better suit a particular learning environment while still improving student outcomes. To investigate a model of these relationships, a survey designed to measure student perceptions of three factors characterizing a constructivist learning environment in online courses was adapted for use in face-to-face chemistry courses. These three factors, teaching presence, social presence, and cognitive presence, were measured using a slightly modified version of the Community of Inquiry (CoI) instrument. The student outcomes investigated in this research were satisfaction and academic achievement, as measured by standardized American Chemical Society (ACS) exam scores and course grades. Structural equation modeling (SEM) was used to statistically model relationships among the three presence factors and student outcome variables for 391 students enrolled in six sections of a
Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.
2009-01-01
The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.
TU-D-209-03: Alignment of the Patient Graphic Model Using Fluoroscopic Images for Skin Dose Mapping
Energy Technology Data Exchange (ETDEWEB)
Oines, A; Oines, A; Kilian-Meneghin, J; Karthikeyan, B; Rudin, S; Bednarek, D [University at Buffalo (SUNY) School of Med., Buffalo, NY (United States)
2016-06-15
Purpose: The Dose Tracking System (DTS) was developed to provide real-time feedback of skin dose and dose rate during interventional fluoroscopic procedures. A color map on a 3D graphic of the patient represents the cumulative dose distribution on the skin. Automated image correlation algorithms are described which use the fluoroscopic procedure images to align and scale the patient graphic for more accurate dose mapping. Methods: Currently, the DTS employs manual patient graphic selection and alignment. To improve the accuracy of dose mapping and automate the software, various methods are explored to extract information about the beam location and patient morphology from the procedure images. To match patient anatomy with a reference projection image, preprocessing is first used, including edge enhancement, edge detection, and contour detection. Template matching algorithms from OpenCV are then employed to find the location of the beam. Once a match is found, the reference graphic is scaled and rotated to fit the patient, using image registration correlation functions in MATLAB. The algorithm runs correlation functions for all points and maps all correlation confidences to a surface map. The highest point of correlation is used for alignment and scaling. The transformation data is saved for later model scaling. Results: Anatomic recognition is used to find matching features between model and image, and image registration correlation provides for alignment and scaling at any rotation angle with less than one-second runtime, at noise levels in excess of 150% of those found in normal procedures. Conclusion: The algorithm provides the necessary scaling and alignment tools to improve the accuracy of dose distribution mapping on the patient graphic with the DTS. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.
TU-D-209-03: Alignment of the Patient Graphic Model Using Fluoroscopic Images for Skin Dose Mapping
International Nuclear Information System (INIS)
Oines, A; Oines, A; Kilian-Meneghin, J; Karthikeyan, B; Rudin, S; Bednarek, D
2016-01-01
Purpose: The Dose Tracking System (DTS) was developed to provide real-time feedback of skin dose and dose rate during interventional fluoroscopic procedures. A color map on a 3D graphic of the patient represents the cumulative dose distribution on the skin. Automated image correlation algorithms are described which use the fluoroscopic procedure images to align and scale the patient graphic for more accurate dose mapping. Methods: Currently, the DTS employs manual patient graphic selection and alignment. To improve the accuracy of dose mapping and automate the software, various methods are explored to extract information about the beam location and patient morphology from the procedure images. To match patient anatomy with a reference projection image, preprocessing is first used, including edge enhancement, edge detection, and contour detection. Template matching algorithms from OpenCV are then employed to find the location of the beam. Once a match is found, the reference graphic is scaled and rotated to fit the patient, using image registration correlation functions in MATLAB. The algorithm runs correlation functions for all points and maps all correlation confidences to a surface map. The highest point of correlation is used for alignment and scaling. The transformation data is saved for later model scaling. Results: Anatomic recognition is used to find matching features between model and image, and image registration correlation provides for alignment and scaling at any rotation angle with less than one-second runtime, at noise levels in excess of 150% of those found in normal procedures. Conclusion: The algorithm provides the necessary scaling and alignment tools to improve the accuracy of dose distribution mapping on the patient graphic with the DTS. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.
Probabilistic graphical models to deal with age estimation of living persons.
Sironi, Emanuele; Gallidabino, Matteo; Weyermann, Céline; Taroni, Franco
2016-03-01
Due to the rise of criminal, civil and administrative judicial situations involving people lacking valid identity documents, age estimation of living persons has become an important operational procedure for numerous forensic and medicolegal services worldwide. The chronological age of a given person is generally estimated from the observed degree of maturity of some selected physical attributes by means of statistical methods. However, their application in the forensic framework suffers from some conceptual and practical drawbacks, as recently claimed in the specialised literature. The aim of this paper is therefore to offer an alternative solution for overcoming these limits, by reiterating the utility of a probabilistic Bayesian approach for age estimation. This approach allows one to deal in a transparent way with the uncertainty surrounding the age estimation process and to produce all the relevant information in the form of posterior probability distribution about the chronological age of the person under investigation. Furthermore, this probability distribution can also be used for evaluating in a coherent way the possibility that the examined individual is younger or older than a given legal age threshold having a particular legal interest. The main novelty introduced by this work is the development of a probabilistic graphical model, i.e. a Bayesian network, for dealing with the problem at hand. The use of this kind of probabilistic tool can significantly facilitate the application of the proposed methodology: examples are presented based on data related to the ossification status of the medial clavicular epiphysis. The reliability and the advantages of this probabilistic tool are presented and discussed.
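The posterior-distribution reasoning described above can be sketched as a minimal two-node Bayesian network (chronological age → observed maturity stage). The probability tables below are made-up for illustration; they are not the calibrated clavicular ossification data:

```python
def age_posterior(stage, age_prior, stage_given_age):
    """Posterior P(age | observed stage) by Bayes' rule over discrete ages."""
    unnorm = {a: p * stage_given_age[a][stage] for a, p in age_prior.items()}
    z = sum(unnorm.values())
    return {a: v / z for a, v in unnorm.items()}

# Hypothetical tables: three age groups, three maturity stages (0, 1, 2).
age_prior = {16: 0.3, 18: 0.4, 21: 0.3}
stage_given_age = {
    16: [0.70, 0.25, 0.05],
    18: [0.30, 0.50, 0.20],
    21: [0.05, 0.25, 0.70],
}
post = age_posterior(2, age_prior, stage_given_age)  # fully mature epiphysis
# Probability that the person is at or above a legal threshold of 18:
p_adult = sum(p for a, p in post.items() if a >= 18)
```

The same posterior distribution answers both questions the paper raises: a point summary of chronological age, and the coherent probability of exceeding a legally relevant age threshold.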
The effectiveness of an interactive 3-dimensional computer graphics model for medical education.
Battulga, Bayanmunkh; Konishi, Takeshi; Tamura, Yoko; Moriguchi, Hiroki
2012-07-09
Medical students often have difficulty achieving a conceptual understanding of 3-dimensional (3D) anatomy, such as bone alignment, muscles, and complex movements, from 2-dimensional (2D) images. To this end, animated and interactive 3-dimensional computer graphics (3DCG) can provide better visual information to users. In medical fields, research on the advantages of 3DCG in medical education is relatively new. To determine the educational effectiveness of interactive 3DCG. We divided 100 participants (27 men, mean (SD) age 17.9 (0.6) years, and 73 women, mean (SD) age 18.1 (1.1) years) from the Health Sciences University of Mongolia (HSUM) into 3DCG (n = 50) and textbook-only (control) (n = 50) groups. The control group used a textbook and 2D images, while the 3DCG group was trained to use the interactive 3DCG shoulder model in addition to a textbook. We conducted a questionnaire survey via an encrypted satellite network between HSUM and Tokushima University. The questionnaire was scored on a 5-point Likert scale from strongly disagree (score 1) to strongly agree (score 5). Interactive 3DCG was effective in undergraduate medical education. Specifically, there was a significant difference in mean (SD) scores between the 3DCG and control groups in their response to questionnaire items regarding content (4.26 (0.69) vs 3.85 (0.68), P = .001) and teaching methods (4.33 (0.65) vs 3.74 (0.79), P < .001), but no significant difference in the Web category. Participants also provided meaningful comments on the advantages of interactive 3DCG. Interactive 3DCG materials have positive effects on medical education when properly integrated into conventional education. In particular, our results suggest that interactive 3DCG is more efficient than textbooks alone in medical education and can motivate students to understand complex anatomical structures.
Javan, Ramin; Zeman, Merissa N
2018-02-01
In the context of medical three-dimensional (3D) printing, in addition to 3D reconstruction from cross-sectional imaging, graphic design plays a role in developing and/or enhancing 3D-printed models. A custom prototype modular 3D model of the liver was graphically designed depicting segmental anatomy of the parenchyma containing color-coded hepatic vasculature and biliary tree. Subsequently, 3D printing was performed using transparent resin for the surface of the liver and polyamide material to develop hollow internal structures that allow for passage of catheters and wires. A number of concepts were incorporated into the model. A representative mass with surrounding feeding arterial supply was embedded to demonstrate tumor embolization. A straight narrow hollow tract connecting the mass to the surface of the liver, displaying the path of a biopsy device's needle, and the concept of needle "throw" length was designed. A connection between the middle hepatic and right portal veins was created to demonstrate transjugular intrahepatic portosystemic shunt (TIPS) placement. A hollow amorphous structure representing an abscess was created to allow the demonstration of drainage catheter placement with the formation of pigtail tip. Percutaneous biliary drain and cholecystostomy tube placement were also represented. The skills of graphic designers may be utilized in creating highly customized 3D-printed models. A model was developed for the demonstration and simulation of multiple hepatobiliary interventions, for training purposes, patient counseling and consenting, and as a prototype for future development of a functioning interventional phantom.
Julianto, E. A.; Suntoro, W. A.; Dewi, W. S.; Partoyo
2018-03-01
Climate change has been reported to exacerbate land resource degradation, including soil fertility decline. Using an appropriately validated soil fertility evaluation model could reduce the risk of climate change effects on plant cultivation. This study aims to assess the validity of soil fertility evaluation models using a graphical approach. The models evaluated were the Indonesian Soil Research Center (PPT) version model, the FAO Unesco version model, and the Kyuma version model. Each model was then correlated with rice production (dry grain weight/GKP). The goodness of fit of each model, together with the coefficient of determination (R2), can be tested to evaluate the quality and validity of a model. This research used the EViews 9 program with a graphical approach. The results yielded three curves, namely actual, fitted, and residual curves. If the actual and fitted curves are widely apart or irregular, the quality of the model is not good, or there are many other factors that are still not included in the model (large residual), and vice versa. Conversely, if the actual and fitted curves show exactly the same shape, all relevant factors have already been included in the model. Modification of the standard soil fertility evaluation models can improve the quality and validity of a model.
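The actual-versus-fitted comparison described here reduces to residuals and the coefficient of determination. A minimal sketch with illustrative numbers (not the rice-yield data from the study):

```python
def r_squared(actual, fitted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - f) ** 2 for a, f in zip(actual, fitted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1.0 - ss_res / ss_tot

actual = [5.1, 6.0, 6.8, 7.5]   # e.g. observed dry grain weight (hypothetical)
fitted = [5.0, 6.2, 6.7, 7.4]   # model predictions (hypothetical)
residuals = [a - f for a, f in zip(actual, fitted)]
# Small, patternless residuals (actual and fitted curves lying close together)
# indicate a well-specified model; large residuals suggest missing factors.
```

An R2 near 1 with small residuals corresponds to the "curves show exactly the same shape" case in the abstract; a low R2 corresponds to the widely separated, irregular case.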
Directory of Open Access Journals (Sweden)
Brook Weld Muller
2014-12-01
Full Text Available This essay describes strategic approaches to graphic representation associated with critical environmental engagement and that build from the idea of works of architecture as stitches in the ecological fabric of the city. It focuses on the building up of partial or fragmented graphics in order to describe inclusive, open-ended possibilities for making architecture that marry rich experience and responsive performance. An aphoristic approach to crafting drawings involves complex layering, conscious absence and the embracing of tension. A self-critical attitude toward the generation of imagery characterized by the notion of ‘loose precision’ may lead to more transformative and environmentally responsive architectures.
Graphic displays on PCs of gaseous diffusion models of radionuclide releases to the atmosphere
International Nuclear Information System (INIS)
Campo Ortega, E. del
1993-01-01
The well-known MESOI program has been modified and improved to adapt it to a PC/AT with a VGA colour monitor. Without losing any of its powerful capabilities for calculating the transport, diffusion, deposition and decay of gaseous radioactive effluents discharged to the atmosphere, it has been enhanced to allow graphic viewing of concentrations, wind speed and direction, and puff locations in colour, all on a background map of the site. The background covers a 75 x 75 km square and has a graphic grid density of 421 x 421 pixels. This means that effluent concentration is represented approximately every 170 metres in the 'clouded' area. Among the modifications and enhancements made, the following are of particular interest: 1. A new subroutine called NUBE has been added, which calculates the distribution of effluent concentration of activity in a grid of 421 x 421 pixels. 2. Several subroutines have been added to obtain graphic displays and printouts of the cloud, wind field and puff locations. 3. Graphic display of the geographic plane of the area surrounding the effluent release point. 4. Off-line preparation of the meteorological and topographical data files necessary for program execution. (author)
Reacting to Graphic Horror: A Model of Empathy and Emotional Behavior.
Tamborini, Ron; And Others
1990-01-01
Studies viewer response to graphic horror films. Reports that undergraduate mass communication students viewed clips from two horror films and a scientific television program. Concludes that people who score high on measures for wandering imagination, fictional involvement, humanistic orientation, and emotional contagion tend to find horror films…
Directory of Open Access Journals (Sweden)
A. V. Krasnyuk
2008-03-01
Full Text Available Three-dimensional design possibilities of the AutoCAD system for performing graphic tasks are presented in the article. On the basis of the studies conducted, the features of application of a computer-aided design system are noted, and methods are offered that considerably reduce the number of errors made when producing drawings.
Kirk, David
1994-01-01
This sequel to Graphics Gems (Academic Press, 1990) and Graphics Gems II (Academic Press, 1991) is a practical collection of computer graphics programming tools and techniques. Graphics Gems III contains a larger percentage of gems related to modeling and rendering, particularly lighting and shading. This new edition also covers image processing, numerical and programming techniques, modeling and transformations, 2D and 3D geometry and algorithms, ray tracing and radiosity, rendering, and more clever new tools and tricks for graphics programming. Volume III also includes a
Causal inference based on counterfactuals
Directory of Open Access Journals (Sweden)
Höfler M
2005-09-01
Full Text Available Abstract Background The counterfactual or potential outcome model has become increasingly standard for causal inference in epidemiological and medical studies. Discussion This paper provides an overview of the counterfactual and related approaches. A variety of conceptual as well as practical issues when estimating causal effects are reviewed. These include causal interactions, imperfect experiments, adjustment for confounding, time-varying exposures, competing risks and the probability of causation. It is argued that the counterfactual model of causal effects captures the main aspects of causality in health sciences and relates to many statistical procedures. Summary Counterfactuals are the basis of causal inference in medicine and epidemiology. Nevertheless, the estimation of counterfactual differences poses several difficulties, primarily in observational studies. These problems, however, reflect fundamental barriers only when learning from observations, and this does not invalidate the counterfactual concept.
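The adjustment-for-confounding idea summarized above can be made concrete with a toy potential-outcomes simulation (the data-generating numbers here are invented for illustration, not from the paper): the naive treated-vs-untreated contrast is biased by a common cause, while standardization over the confounder recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical data: binary confounder Z drives both treatment X and outcome Y.
Z = rng.binomial(1, 0.4, n)
X = rng.binomial(1, 0.2 + 0.6 * Z)            # treated more often when Z = 1
Y = rng.binomial(1, 0.1 + 0.2 * X + 0.3 * Z)  # true causal effect of X is +0.2

# Naive contrast is confounded by Z.
naive = Y[X == 1].mean() - Y[X == 0].mean()

# Standardization (back-door adjustment): average the Z-stratum contrasts
# weighted by the marginal distribution of Z.
ate = sum(
    (Y[(X == 1) & (Z == z)].mean() - Y[(X == 0) & (Z == z)].mean()) * (Z == z).mean()
    for z in (0, 1)
)

print(round(naive, 2), round(ate, 2))  # naive is inflated; ate recovers ≈ 0.2
```

The same arithmetic underlies the "adjustment for confounding" discussed in the abstract; in observational data the hard part is knowing that Z is the right variable to adjust for.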
Directory of Open Access Journals (Sweden)
Ingkasuwan Papapit
2012-08-01
Full Text Available Abstract Background Starch serves as a temporal storage of carbohydrates in plant leaves during day/night cycles. To study transcriptional regulatory modules of this dynamic metabolic process, we conducted gene regulation network analysis based on small-sample inference of a graphical Gaussian model (GGM). Results Time-series significance analysis was applied to Arabidopsis leaf transcriptome data to obtain a set of genes that are highly regulated under a diurnal cycle. A total of 1,480 diurnally regulated genes included 21 starch metabolic enzymes, 6 clock-associated genes, and 106 transcription factors (TF). A starch-clock-TF gene regulation network comprising 117 nodes and 266 edges was constructed by GGM from these 133 significant genes that are potentially related to the diurnal control of starch metabolism. From this network, we found that β-amylase 3 (b-amy3: At4g17090), which participates in starch degradation in the chloroplast, is the most frequently connected gene (a hub gene). The robustness of the gene-to-gene regulatory network was further analyzed by TF binding site prediction and by evaluating global co-expression of TFs and target starch metabolic enzymes. As a result, two TFs, indeterminate domain 5 (AtIDD5: At2g02070) and constans-like (COL: At2g21320), were identified as positive regulators of starch synthase 4 (SS4: At4g18240). The inference model of AtIDD5-dependent positive regulation of SS4 gene expression was experimentally supported by decreased SS4 mRNA accumulation in Atidd5 mutant plants during the light period of both short and long day conditions. COL was also shown to positively control SS4 mRNA accumulation. Furthermore, the knockout of AtIDD5 and COL led to deformation of chloroplasts and their contained starch granules. This deformity also affected the number of starch granules per chloroplast, which increased significantly in both knockout mutant lines. Conclusions In this study, we utilized a systematic approach of microarray
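The core of the GGM machinery used in this record can be illustrated at toy scale (the three-gene chain and the 0.1 edge threshold below are mine; the paper's small-sample inference additionally relies on shrinkage estimation of the covariance): edges are drawn where the partial correlation, read off the precision matrix, is non-negligible, which prunes indirect associations.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Hypothetical regulatory chain: g0 -> g1 -> g2. g0 and g2 are marginally
# correlated but conditionally independent given g1, so a GGM should drop
# the direct 0-2 edge.
g0 = rng.normal(size=n)
g1 = 0.8 * g0 + rng.normal(size=n)
g2 = 0.8 * g1 + rng.normal(size=n)
X = np.column_stack([g0, g1, g2])

# Partial correlations come from the precision (inverse covariance) matrix:
# pcor_ij = -P_ij / sqrt(P_ii * P_jj).
P = np.linalg.inv(np.cov(X, rowvar=False))
d = np.sqrt(np.diag(P))
pcor = -P / np.outer(d, d)
np.fill_diagonal(pcor, 1.0)

edges = [(i, j) for i in range(3) for j in range(i + 1, 3) if abs(pcor[i, j]) > 0.1]
print(edges)  # the indirect 0-2 association is pruned; only 0-1 and 1-2 remain
```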
Magezi, David A
2015-01-01
Linear mixed-effects models (LMMs) are increasingly being used for data analysis in cognitive neuroscience and experimental psychology, where within-participant designs are common. The current article provides an introductory review of the use of LMMs for within-participant data analysis and describes a free, simple, graphical user interface (LMMgui). LMMgui uses the package lme4 (Bates et al., 2014a,b) in the statistical environment R (R Core Team).
Sinha, Shriprakash
2016-12-01
Simulation studies in systems biology involving computational experiments on Wnt signaling pathways abound in the literature but often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition who intend to work on modeling of the pathway. This paucity might happen due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational modeling of the Wnt signaling pathway in a human colorectal cancer dataset using static Bayesian network models is provided. The walkthrough might aid biologists/informaticians in understanding the design of computational experiments that is interleaved with exposition of the Matlab code and causal models from the Bayesian network toolbox. The manuscript elucidates the coding contents of the advance article by Sinha (Integr. Biol. 6:1034-1048, 2014) and takes the reader in a step-by-step process of how (a) the collection and the transformation of the available biological information from literature is done, (b) the integration of the heterogeneous data and prior biological knowledge in the network is achieved, (c) the simulation study is designed, (d) the hypothesis regarding a biological phenomenon is transformed into a computational framework, and (e) results and inferences drawn using d-connectivity/separability are reported. The manuscript finally ends with a programming assignment to help the readers get hands-on experience of a perturbation project. Description of Matlab files is made available under GNU GPL v3 license at the Google code project on https://code.google.com/p/static-bn-for-wnt-signaling-pathway and https://sites.google.com/site/shriprakashsinha/shriprakashsinha/projects/static-bn-for-wnt-signaling-pathway. Latest updates can be found in the latter website.
Di, Xin; Biswal, Bharat B
2014-02-01
The default mode network (DMN) is part of the brain structure that shows higher neural activity and energy consumption when one is at rest. The key regions of the default mode network are highly interconnected, as conveyed by both white matter fiber tracing and the synchrony of resting-state functional magnetic resonance imaging (fMRI) signals. However, the causal information flow within the default mode network is still poorly understood. The current study used dynamic causal modeling (DCM) on a resting-state fMRI data set to identify the network structure underlying the default mode network. The endogenous brain fluctuations were explicitly modeled by Fourier series at the low frequency band of 0.01-0.08 Hz, and those Fourier series were set as driving inputs of the DCM models. Model comparison procedures favored a model wherein the medial prefrontal cortex (MPFC) sends information to the posterior cingulate cortex (PCC) and the bilateral inferior parietal lobule sends information to both the PCC and MPFC. Further analyses provide evidence that the endogenous connectivity might be higher in the right hemisphere than in the left hemisphere. These data provide insight into the functions of each node in the DMN and also validate the usage of DCM on resting-state fMRI data. © 2013.
Henderson, Nicole L; Dressler, William W
2017-12-01
This study examines the knowledge individuals use to make judgments about persons with substance use disorder. First, we show that there is a cultural model of addiction causality that is both shared and contested. Second, we examine how individuals' understanding of that model is associated with stigma attribution. Research was conducted among undergraduate students at the University of Alabama. College students in the 18-25 age range are especially at risk for developing substance use disorder, and they are, perhaps more than any other population group, intensely targeted by drug education. The elicited cultural model includes different types of causes distributed across five distinct themes: Biological, Self-Medication, Familial, Social, and Hedonistic. Though there was cultural consensus among respondents overall, residual agreement analysis showed that the cultural model of addiction causality is a multicentric domain. Two centers of the model, the moral and the medical, were discovered. Differing adherence to these centers is associated with the level of stigma attributed towards individuals with substance use disorder. The results suggest that current approaches to substance use education could contribute to stigma attribution, which may or may not be inadvertent. The significance of these results for both theory and the treatment of addiction are discussed.
DEFF Research Database (Denmark)
Bergstrøm-Nielsen, Carl
2010-01-01
Graphic notation is taught to music therapy students at Aalborg University in both simple and elaborate forms. This is a method of depicting music visually, and notations may serve as memory aids, as aids for analysis and reflection, and for communication purposes such as supervision or within...
Covariation in Natural Causal Induction.
Cheng, Patricia W.; Novick, Laura R.
1991-01-01
Biases and models usually offered by cognitive and social psychology and by philosophy to explain causal induction are evaluated with respect to focal sets (contextually determined sets of events over which covariation is computed). A probabilistic contrast model is proposed as underlying covariation computation in natural causal induction. (SLD)
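The probabilistic contrast proposed in this record reduces to a difference of conditional probabilities computed within the focal set; a minimal sketch with invented counts:

```python
# Probabilistic contrast for a candidate cause C and effect E within a focal
# set: deltaP = P(E | C) - P(E | not C). The counts below are hypothetical.
def delta_p(e_and_c, c, e_and_not_c, not_c):
    """Contrast computed from event counts within the focal set."""
    return e_and_c / c - e_and_not_c / not_c

# Example: 16 of 20 plants bloomed with fertilizer; 4 of 20 bloomed without.
contrast = delta_p(16, 20, 4, 20)
print(round(contrast, 2))  # 0.6: fertilizer covaries positively with blooming
```

The model's key move is that the focal set, not the universe of all events, fixes which cases enter these conditional probabilities.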
Lu, T W; O'Connor, J J
1996-01-01
A computer graphics-based model of the knee ligaments in the sagittal plane was developed for the simulation and visualization of the shape changes and fibre recruitment process of the ligaments during motion under unloaded and loaded conditions. The cruciate and collateral ligaments were modelled as ordered arrays of fibres which link attachment areas on the tibia and femur. Fibres slacken and tighten as the ligament attachment areas on the bones rotate and translate relative to each other. A four-bar linkage, composed of the femur, tibia and selected isometric fibres of the two cruciates, was used to determine the motion of the femur relative to the tibia during passive (unloaded) movement. Fibres were assumed to slacken in a Euler buckling mode when the distances between their attachments are less than chosen reference lengths. The ligament shape changes and buckling patterns are demonstrated with computer graphics. When the tibia is translated anteriorly or posteriorly relative to the femur by muscle forces and external loads, some ligament fibres tighten and are recruited progressively to transmit increasing shear forces. The shape changes and fibre recruitment patterns predicted by the model compare well qualitatively with experimental results reported in the literature. The computer graphics approach provides insight into the micro behaviour of the knee ligaments. It may help to explain ligament injury mechanisms and provide useful information to guide the design of ligament replacements.
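The fibre slack/taut rule described above can be sketched in a few lines (the 2D geometry, spring constant, and reference lengths below are invented for illustration; the model's actual fibre arrays and four-bar kinematics are considerably more elaborate): a fibre transmits force only when the distance between its attachment points exceeds its reference length.

```python
import numpy as np

# Toy version of the fibre-recruitment rule: a fibre is taut (and carries a
# linear-spring force) only when stretched beyond its reference length;
# slack fibres carry no force, mimicking Euler buckling.
def fibre_forces(tibia_pts, femur_pts, ref_lengths, k=100.0):
    """Force in each fibre given attachment coordinates (N x 2 arrays)."""
    lengths = np.linalg.norm(femur_pts - tibia_pts, axis=1)
    stretch = lengths - ref_lengths
    return np.where(stretch > 0, k * stretch, 0.0)

# Hypothetical attachment areas for a three-fibre ligament.
tibia = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
femur = np.array([[0.0, 3.0], [1.0, 3.5], [2.0, 2.5]])
ref = np.array([3.2, 3.2, 3.2])

print(fibre_forces(tibia, femur, ref))  # only the middle fibre is taut
```

Translating one bone relative to the other re-evaluates the same rule, which is how fibres are "recruited progressively" as shear load increases.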
Lepley, Lindsey K; McKeon, Patrick O; Fitzpatrick, Shane G; Beckemeyer, Catherine L; Uhl, Timothy L; Butterfield, Timothy A
2016-10-01
The mechanisms that contribute to the development of chronic ankle instability are not understood. Investigators have developed a hypothetical model in which neuromuscular alterations that stem from damaged ankle ligaments are thought to affect periarticular and proximal muscle activity. However, the retrospective nature of these studies does not allow a causal link to be established. To assess temporal alterations in the activity of 2 periarticular muscles of the rat ankle and 2 proximal muscles of the rat hind limb after an ankle sprain. Controlled laboratory study. Laboratory. Five healthy adult male Long Evans rats (age = 16 weeks, mass = 400.0 ± 13.5 g). Indwelling fine-wire electromyography (EMG) electrodes were implanted surgically into the biceps femoris, medial gastrocnemius, vastus lateralis, and tibialis anterior muscles of the rats. We recorded baseline EMG measurements while the rats walked on a motor-driven treadmill and then induced a closed lateral ankle sprain by overextending the lateral ankle ligaments. After ankle sprain, the rats were placed on the treadmill every 24 hours for 7 days, and we recorded postsprain EMG data. Onset time of muscle activity, phase duration, sample entropy, and minimal detectable change (MDC) were assessed and compared with baseline using 2-tailed dependent t tests. Compared with baseline, delayed onset time of muscle activity was exhibited in the biceps femoris (baseline = -16.7 ± 54.0 milliseconds [ms]) on day 0 (5.2 ± 64.1 ms; t(4) = -4.655, P = .043) and tibialis anterior (baseline = 307.0 ± 64.2 ms) muscles on day 3 (362.5 ± 55.9 ms; t(4) = -5.427, P = .03) and day 6 (357.3 ± 39.6 ms; t(4) = -3.802, P = .02). Longer phase durations were observed for the vastus lateralis (baseline = 321.9 ± 92.6 ms) on day 3 (401.3 ± 101.2 ms; t(3) = -4.001, P = .03), day 4 (404.1 ± 93.0 ms; t(3) = -3.320, P = .048), and day 5 (364.6 ± 105.2 ms; t(3) = -3.963, P = .03) and for the tibialis anterior (baseline = 103.9 ± 16.4 ms
Directory of Open Access Journals (Sweden)
Vadim Leonidovich Ushakov
2016-10-01
Full Text Available The purpose of this paper was to study causal relationships between left and right hippocampal regions (LHIP and RHIP, respectively) within the default mode network (DMN), as represented by its key structures: the medial prefrontal cortex (MPFC), the posterior cingulate cortex (PCC), and the inferior parietal cortex of the left (LIPC) and right (RIPC) hemispheres. Furthermore, we were interested in testing the stability of the connectivity patterns when adding or deleting regions of interest. The functional magnetic resonance imaging (fMRI) data from a group of 30 healthy right-handed subjects in the resting state were collected and a connectivity analysis was performed. To model the effective connectivity, we used spectral Dynamic Causal Modeling (DCM). Three DCM analyses were completed. Two of them modeled interaction between five nodes that included four DMN key structures in addition to either LHIP or RHIP. The last DCM analysis modeled interactions between four nodes whereby one of the main DMN structures, PCC, was excluded from the analysis. The results of all DCM analyses indicated a high level of stability in the computational method: those parts of the winning models that included the key DMN structures demonstrated causal relations known from recent research. However, we discovered new results as well. First of all, we found a pronounced asymmetry in LHIP and RHIP connections. LHIP demonstrated a high involvement in DMN activity with preponderant information outflow to all other DMN regions. Causal interactions of LHIP were bidirectional only in the case of LIPC. On the contrary, RHIP was primarily affected by inputs from LIPC, RIPC and LHIP without influencing these or other DMN key structures. For the first time, an inhibitory link was found from MPFC to LIPC, which may indicate the subjects’ effort to maintain a resting state. Functional connectivity data echoed these results, though they also showed links not reflected in the patterns of
Perception in statistical graphics
VanderPlas, Susan Ruth
There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.
Sharaev, Maksim G; Zavyalova, Viktoria V; Ushakov, Vadim L; Kartashov, Sergey I; Velichkovsky, Boris M
2016-01-01
The Default Mode Network (DMN) is a brain system that mediates internal modes of cognitive activity, showing higher neural activation when one is at rest. Nowadays, there is a lot of interest in assessing functional interactions between its key regions, but in the majority of studies only the association of Blood-oxygen-level dependent (BOLD) activation patterns is measured, so it is impossible to identify causal influences. There are some studies of causal interactions (i.e., effective connectivity), however often with inconsistent results. The aim of the current work is to find a stable pattern of connectivity between four DMN key regions: the medial prefrontal cortex (mPFC), the posterior cingulate cortex (PCC), and the left and right intraparietal cortex (LIPC and RIPC). For this purpose functional magnetic resonance imaging (fMRI) data from 30 healthy subjects (1000 time points from each one) was acquired and spectral dynamic causal modeling (DCM) on resting-state fMRI data was performed. The endogenous brain fluctuations were explicitly modeled by a Discrete Cosine Set at the low frequency band of 0.0078-0.1 Hz. The best model at the group level is the one where connections from both bilateral IPC to mPFC and PCC are significant and symmetrical in strength (p < 0.05). Connections between mPFC and PCC are bidirectional, significant in the group and weaker than connections originating from bilateral IPC. In general, all connections from LIPC/RIPC to other DMN regions are much stronger. One can assume that these regions have a driving role within the DMN. Our results replicate some data from earlier works on effective connectivity within the DMN as well as provide new insights on internal DMN relationships and the brain's functioning at resting state.
Directory of Open Access Journals (Sweden)
Maksim Sharaev
2016-02-01
Full Text Available The Default Mode Network (DMN) is a brain system that mediates internal modes of cognitive activity, showing higher neural activation when one is at rest. Nowadays, there is a lot of interest in assessing functional interactions between its key regions, but in the majority of studies only the association of BOLD (Blood-oxygen-level dependent) activation patterns is measured, so it is impossible to identify causal influences. There are some studies of causal interactions (i.e. effective connectivity), however often with inconsistent results. The aim of the current work is to find a stable pattern of connectivity between four DMN key regions: the medial prefrontal cortex (mPFC), the posterior cingulate cortex (PCC), and the left and right intraparietal cortex (LIPC and RIPC). For this purpose fMRI (functional magnetic resonance imaging) data from 30 healthy subjects (1000 time points from each one) was acquired and spectral dynamic causal modeling (DCM) on resting-state fMRI data was performed. The endogenous brain fluctuations were explicitly modeled by a Discrete Cosine Set at the low frequency band of 0.0078–0.1 Hz. The best model at the group level is the one where connections from both bilateral IPC to mPFC and PCC are significant and symmetrical in strength (p<0.05). Connections between mPFC and PCC are bidirectional, significant in the group and weaker than connections originating from bilateral IPC. In general, all connections from LIPC/RIPC to other DMN regions are much stronger. One can assume that these regions have a driving role within the DMN. Our results replicate some data from earlier works on effective connectivity within the DMN as well as provide new insights on internal DMN relationships and the brain’s functioning at resting state.
Directory of Open Access Journals (Sweden)
Ibrahim Delibalta
2017-01-01
Full Text Available We provide a causal inference framework to model the effects of machine learning algorithms on user preferences. We then use this mathematical model to prove that the overall system can be tuned to alter those preferences in a desired manner. A user can be an online shopper or a social media user, exposed to digital interventions produced by machine learning algorithms. A user preference can be anything from inclination towards a product to a political party affiliation. Our framework uses a state-space model to represent user preferences as latent system parameters which can only be observed indirectly via online user actions such as a purchase activity or social media status updates, shares, blogs, or tweets. Based on these observations, machine learning algorithms produce digital interventions such as targeted advertisements or tweets. We model the effects of these interventions through a causal feedback loop, which alters the corresponding preferences of the user. We then introduce algorithms in order to estimate and later tune the user preferences to a particular desired form. We demonstrate the effectiveness of our algorithms through experiments in different scenarios.
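A minimal sketch of the kind of state-space feedback loop this abstract describes (all dynamics, gains, and noise levels below are assumptions of mine, not the authors' model): a latent preference evolves linearly, is observed only through noisy user actions, and interventions computed from an estimate of the state steer it toward a target.

```python
import numpy as np

rng = np.random.default_rng(2)

# Latent preference theta is never observed directly; only noisy actions y are.
theta, target = 0.0, 1.0
a, b = 0.95, 0.10   # state persistence and intervention gain (assumed values)
theta_hat = 0.0     # running estimate built from observations

history = []
for t in range(200):
    y = theta + rng.normal(scale=0.3)               # observed user action
    theta_hat += 0.2 * (y - theta_hat)              # exponential-smoothing estimate
    u = np.clip((target - a * theta_hat) / b, -1, 1)  # bounded intervention
    theta = a * theta + b * u + rng.normal(scale=0.02)  # causal feedback loop
    history.append(theta)

print(round(float(np.mean(history[-50:])), 2))  # settles near the 1.0 target
```

The point mirrors the abstract's claim: because the intervention enters the state equation, tuning it alters the latent preference, even though that preference is only observed indirectly.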
Zigzagging causality EPR model: answer to Vigier and coworkers and to Sutherland
Energy Technology Data Exchange (ETDEWEB)
de Beauregard, O.C.
1987-08-01
The concept of propagation in time of Vigier and co-workers (V et al.) implies the idea of a supertime; it is thus alien to most Minkowskian pictures and certainly to the author's. From this stems much of V et al.'s misunderstanding of his position. In steady motion of a classical fluid nobody thinks that momentum conservation is violated, or that momentum is shot upstream without cause because of the suction from the sinks. Similarly with momentum-energy in spacetime and the acceptance of an advanced causality. As for the CT invariance of the Feynman propagator, the causality asymmetry it entails is factlike, not lawlike. The geometrical counterpart of the symmetry between prediction and retrodiction and between retarded and advanced waves, as expressed in the alternative expressions
Modelling the rand and commodity prices: A Granger causality and cointegration analysis
Directory of Open Access Journals (Sweden)
Xolani Ndlovu
2014-11-01
Full Text Available This paper examines the ‘commodity currency’ hypothesis of the Rand, that is, the postulate that the currency moves in line with commodity prices, and analyses the associated causality using nominal data between 1996 and 2010. We address both the short run and long run relationship between commodity prices and exchange rates. We find that while the levels of the series of both assets are difference stationary, they are not cointegrated. Further, we find the two variables are negatively related, with strong and significant causality running from commodity prices to the exchange rate and not vice versa, implying exogeneity in the determination of commodity prices with respect to the nominal exchange rate. The strength of the relationship is significantly weaker than other OECD commodity currencies. We surmise that the relationship is dynamic over time owing to the portfolio-rebalance argument and the Commodity Terms of Trade (CTT) effect and, in the absence of an error correction mechanism, this disconnect may be prolonged. For commodity and currency market participants, this implies that while futures and forward commodity prices may be useful leading indicators of future currency movements, the price risk management strategies may need to be recalibrated over time.
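Granger causality of the kind tested in this record can be sketched with a lag-1 F-test built from two nested OLS regressions (the synthetic series and single-lag choice below are mine; the paper works with nominal Rand and commodity-price data and additionally tests for cointegration):

```python
import numpy as np

rng = np.random.default_rng(3)
T = 400

# Synthetic stand-in for the paper's finding: commodity prices c lead the
# exchange rate r with a negative sign, with no feedback from r to c.
c = np.zeros(T)
r = np.zeros(T)
for t in range(1, T):
    c[t] = 0.5 * c[t - 1] + rng.normal()
    r[t] = 0.5 * r[t - 1] - 0.4 * c[t - 1] + rng.normal()

def rss(y, X):
    """Residual sum of squares from an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    res = y - X @ beta
    return res @ res

def granger_f(y, x):
    """Lag-1 Granger F-statistic for 'x Granger-causes y'."""
    yt, y1, x1 = y[1:], y[:-1], x[:-1]
    ones = np.ones_like(yt)
    restricted = rss(yt, np.column_stack([ones, y1]))      # y on its own past
    full = rss(yt, np.column_stack([ones, y1, x1]))        # ... plus x's past
    return (restricted - full) / (full / (len(yt) - 3))

print(granger_f(r, c) > granger_f(c, r))  # causality runs from c to r only
```

A large F in one direction and a small one in the other is the one-way causality pattern the abstract reports; in practice lag length is chosen by information criteria rather than fixed at one.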
On the zigzagging causality EPR model: Answer to Vigier and coworkers and to Sutherland
Costa de Beauregard, O.
1987-08-01
The concept of “propagation in time” of Vigier and co-workers (V et al.) implies the idea of a supertime; it is thus alien to most Minkowskian pictures and certainly to mine. From this stems much of V et al.'s misunderstandings of my position. In steady motion of a classical fluid nobody thinks that “momentum conservation is violated,” or that “momentum is shot upstream without cause” because of the suction from the sinks! Similarly with momentum-energy in space-time and the acceptance of an advanced causality. As for the CT invariance of the Feynman propagator, the causality asymmetry it entails is factlike, not lawlike. The geometrical counterpart of the symmetry between prediction and retrodiction and between retarded and advanced waves, as expressed in the three alternative expressions for a transition amplitude between a preparation |A> and a measurement |B>, is CPT-invariant, not PT-invariant. These three expressions respectively illustrate the collapse, the retrocollapse, and the symmetric collapse-and-retrocollapse concepts. As for Sutherland's argument, what it “falsifies” is not my retrocausation concept but the hidden-variables assumption he has unwittingly made.
Nigam, Ravi; Schlosser, Ralf W; Lloyd, Lyle L
2006-09-01
Matrix strategies employing parts of speech arranged in systematic language matrices and milieu language teaching strategies have been successfully used to teach word combining skills to children who have cognitive disabilities and some functional speech. The present study investigated the acquisition and generalized production of two-term semantic relationships in a new population using new types of symbols. Three children with cognitive disabilities and little or no functional speech were taught to combine graphic symbols. The matrix strategy and the mand-model procedure were used concomitantly as intervention procedures. A multiple probe design across sets of action-object combinations with generalization probes of untrained combinations was used to teach the production of graphic symbol combinations. Results indicated that two of the three children learned the early syntactic-semantic rule of combining action-object symbols and demonstrated generalization to untrained action-object combinations and generalization across trainers. The results and future directions for research are discussed.
International Nuclear Information System (INIS)
Fukui, Hirokazu; Yoshida, Michio; Yamaura, Kazuho
2000-01-01
For those who run an organization, it is critical to identify the causal relationship between the organization's characteristics and the safety-checking actions of its staff, in order to effectively implement activities for promoting safety. In this research, a causal model of the safety-checking action was developed and the factors affecting it were studied. A questionnaire survey, which covered safety awareness, attitude toward safety, safety culture and other topics, was conducted at three nuclear power plants, and eight factors were extracted by means of factor analysis of the questionnaire items. The extracted eight interrelated factors were as follows: work norm, supervisory action, interest in training, recognition of importance, safety-checking action, the subject of safety, knowledge/skills, and the attitude of the organization. Among them, the seven factors other than recognition of importance were defined as latent variables and a causal model of safety-checking action was constructed. By means of covariance structure analysis, it was found that three factors: the attitude of the organization, supervisory action and the subject of safety, have a significant effect on the safety-checking action. Moreover, it was also found that workplaces in which these three factors are highly regarded form a social environment in which safety-checking action is fully supported by the workplace as a whole, while workplaces in which these three factors are poorly regarded do not; in the latter, safety-checking action tends to depend strongly upon the knowledge or skills of individuals. On top of this, it was noted that the attitude of the organization and supervisory action are important factors that serve as the first trigger affecting the formation of the organizational climate for safety. (author)
Energy Technology Data Exchange (ETDEWEB)
Fukui, Hirokazu [Institute of Nuclear Safety System Inc., Seika, Kyoto (Japan); Yoshida, Michio; Yamaura, Kazuho [Japan Institute for Group Dynamics, Fukuoka (Japan)
2000-09-01
For those who run an organization, it is critical to identify the causal relationship between the organization's characteristics and the safety-checking actions of its staff, in order to effectively implement activities for promoting safety. In this research, a causal model of the safety-checking action was developed and the factors affecting it were studied. A questionnaire survey, which covered safety awareness, attitude toward safety, safety culture and other topics, was conducted at three nuclear power plants, and eight factors were extracted by means of factor analysis of the questionnaire items. The extracted eight interrelated factors were as follows: work norm, supervisory action, interest in training, recognition of importance, safety-checking action, the subject of safety, knowledge/skills, and the attitude of the organization. Among them, the seven factors other than recognition of importance were defined as latent variables and a causal model of safety-checking action was constructed. By means of covariance structure analysis, it was found that three factors: the attitude of the organization, supervisory action and the subject of safety, have a significant effect on the safety-checking action. Moreover, it was also found that workplaces in which these three factors are highly regarded form a social environment in which safety-checking action is fully supported by the workplace as a whole, while workplaces in which these three factors are poorly regarded do not; in the latter, safety-checking action tends to depend strongly upon the knowledge or skills of individuals. On top of this, it was noted that the attitude of the organization and supervisory action are important factors that serve as the first trigger affecting the formation of the organizational climate for safety. (author)
Causality Statistical Perspectives and Applications
Berzuini, Carlo; Bernardinelli, Luisa
2012-01-01
A state-of-the-art volume on statistical causality. Causality: Statistical Perspectives and Applications presents a wide-ranging collection of seminal contributions by renowned experts in the field, providing a thorough treatment of all aspects of statistical causality. It covers the various formalisms in current use, methods for applying them to specific problems, and the special requirements of a range of examples from medicine, biology and economics to political science. This book: Provides a clear account and comparison of formal languages, concepts and models for statistical causality. Addr
Structural Equations and Causal Explanations: Some Challenges for Causal SEM
Markus, Keith A.
2010-01-01
One common application of structural equation modeling (SEM) involves expressing and empirically investigating causal explanations. Nonetheless, several aspects of causal explanation that have an impact on behavioral science methodology remain poorly understood. It remains unclear whether applications of SEM should attempt to provide complete…
Explaining through causal mechanisms
Biesbroek, Robbert; Dupuis, Johann; Wellstead, Adam
2017-01-01
This paper synthesizes and builds on recent critiques of the resilience literature, namely that the field has largely been unsuccessful in capturing the complexity of governance processes, in particular cause-effect relationships. We demonstrate that the absence of a causal model is reflected in the
Mejía-Aranguré, Juan Manuel
Acute leukemias have enormous morphological, cytogenetic and molecular heterogeneity, as well as genetic polymorphisms associated with susceptibility. Every leukemia presents causal factors associated with the development of the disease. In particular, when three factors are present together, they result in the development of acute leukemia: susceptibility, environmental exposure, and a period that, in this model, has been called the period of vulnerability. This framework shows how the concepts of molecular epidemiology have established a reference from which it is more feasible to identify the environmental factors associated with the development of leukemia in children. The arguments then show that only susceptible children are likely to develop leukemia once exposed to an environmental factor; if the child is not susceptible to leukemia, additional exposure does not lead to the disease. Moreover, this exposure should occur during a time window when hematopoietic cells and their environment are more vulnerable to such interaction, causing the development of leukemia. This model seeks to predict the time when the leukemia develops and attempts to give a context in which the causality of childhood leukemia should be studied. This information can help reduce the risk of a child developing leukemia. Copyright © 2016 Hospital Infantil de México Federico Gómez. Published by Masson Doyma México S.A. All rights reserved.
Directory of Open Access Journals (Sweden)
Thanyatorn Amornkitpinyo
2015-02-01
The objective of this study is to design a framework for a causal relationship model of the Information and Communication Technology (ICT) skills that affect the Technology Acceptance Process (TAP) for undergraduate students in the 21st century. This research uses correlational analysis. The research methodology is divided into two sections: the first synthesizes a conceptual framework for the causal relationship model, and the second proposes the design concept framework of the model. The research findings are as follows: (1) the exogenous latent variables included in the causal relationship model are basic ICT skills and self-efficacy; (2) the mediating latent variables are drawn from the TAM model and include three components: perceived usefulness, perceived ease of use, and attitudes; (3) the outcome latent variable of the causal relationship model is behavioural intention.
Roy, S. G.; Koons, P. O.; Gerbi, C. C.; Capps, D. K.; Tucker, G. E.; Rogers, Z. A.
2014-12-01
Sophisticated numerical tools exist for modeling geomorphic processes and linking them to tectonic and climatic systems, but they are often seen as inaccessible for users with an exploratory level of interest. We have improved the accessibility of landscape evolution models by producing a simple graphical user interface (GUI) that takes advantage of the Channel-Hillslope Integrated Landscape Development (CHILD) model. Model access is flexible: the user can edit values for basic geomorphic, tectonic, and climate parameters, or obtain greater control by defining the spatiotemporal distributions of those parameters. Users can make educated predictions by choosing their own parametric values for the governing equations and interpreting the results immediately through model graphics. This method of modeling allows users to iteratively build their understanding through experimentation. Use of this GUI is intended for inquiry and discovery-based learning activities. We discuss a number of examples of how the GUI can be used at the upper high school, introductory university, and advanced university levels. Effective teaching modules initially focus on an inquiry-based example guided by the instructor. As students become familiar with the GUI and the CHILD model, the class can shift to more student-centered exploration and experimentation. To make model interpretations more robust, digital elevation models can be imported and direct comparisons can be made between CHILD model results and natural topography. The GUI is available online through the University of Maine's Earth and Climate Sciences website, through the Community Surface Dynamics Modeling System (CSDMS) model repository, or by contacting the corresponding author.
Paradoxical Behavior of Granger Causality
Witt, Annette; Battaglia, Demian; Gail, Alexander
2013-03-01
Granger causality is a standard tool for the description of directed interaction of network components and is popular in many scientific fields including econometrics, neuroscience and climate science. For time series that can be modeled as bivariate auto-regressive processes, we analytically derive an expression for spectrally decomposed Granger causality (SDGC) and show that this quantity depends on only two of the four groups of model parameters. We then present examples of such processes whose SDGC exposes paradoxical behavior, in the sense that causality is high for frequency ranges with low spectral power. To avoid misinterpretations of Granger causality analysis, we propose complementing it with partial spectral analysis. Our findings are illustrated by an example from brain electrophysiology. Finally, we draw implications for the conventional definition of Granger causality. Bernstein Center for Computational Neuroscience Goettingen
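The comparison at the heart of Granger causality, whether the history of one series reduces the prediction error of another, can be sketched with a plain least-squares fit on a simulated bivariate AR(1) process (a minimal numpy illustration with assumed coefficients, not the paper's analytical SDGC derivation; all names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 5000

# Simulate a bivariate AR(1) process in which x drives y but not vice versa.
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

def ols_resid_var(target, regressors):
    """Residual variance of an OLS fit of target on regressors (+ intercept)."""
    X = np.column_stack([np.ones(len(target))] + regressors)
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return (target - X @ beta).var()

# Restricted model: predict y[t] from its own past only.
var_r = ols_resid_var(y[1:], [y[:-1]])
# Unrestricted model: also include the past of x.
var_u = ols_resid_var(y[1:], [y[:-1], x[:-1]])
gc_x_to_y = np.log(var_r / var_u)   # Granger causality index x -> y

# Reverse direction for comparison.
var_r2 = ols_resid_var(x[1:], [x[:-1]])
var_u2 = ols_resid_var(x[1:], [x[:-1], y[:-1]])
gc_y_to_x = np.log(var_r2 / var_u2)

print(gc_x_to_y > gc_y_to_x)  # expect True: x Granger-causes y, not vice versa
```

Here the index is the log ratio of restricted to unrestricted residual variances; the spectral decomposition discussed in the abstract refines this single number into frequency bands, which is where the paradoxical behavior appears.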
Directory of Open Access Journals (Sweden)
Filippo Trentini
2015-03-01
Background. In public health, one debated issue relates to the consequences of improper self-management in health care. Theoretical models proposed in health communication theory highlight how components such as general literacy and specific knowledge of the disease can be very important for effective action in the healthcare system. Methods. This paper investigates the consistency of the Health Empowerment Model by means of both a graphical models approach, which is "data driven", and a Structural Equation Modeling (SEM) approach, which is instead "theory driven", showing the different information patterns that can be revealed in a health care research context. The analyzed dataset provides data on the relationship between the Health Empowerment Model constructs and behavioral and health status in 263 chronic low back pain (cLBP) patients. We used the graphical models approach to evaluate the dependence structure in a "blind" way, learning the structure from the data. Results. The estimated dependence structure confirms the link design assumed in the SEM approach directly by the researchers, thus validating the hypotheses that generated the Health Empowerment Model constructs. Conclusions. This model comparison helps in avoiding confirmation bias. For Structural Equation Modeling we used the SPSS AMOS 21 software; graphical modeling algorithms were implemented in the R software environment.
Schnitzer, Mireille E.; Lok, Judith J.; Gruber, Susan
2015-01-01
This paper investigates the appropriateness of the integration of flexible propensity score modeling (nonparametric or machine learning approaches) in semiparametric models for the estimation of a causal quantity, such as the mean outcome under treatment. We begin with an overview of some of the issues involved in knowledge-based and statistical variable selection in causal inference and the potential pitfalls of automated selection based on the fit of the propensity score. Using a simple example, we directly show the consequences of adjusting for pure causes of the exposure when using inverse probability of treatment weighting (IPTW). Such variables are likely to be selected when using a naive approach to model selection for the propensity score. We describe how the method of Collaborative Targeted minimum loss-based estimation (C-TMLE; van der Laan and Gruber, 2010) capitalizes on the collaborative double robustness property of semiparametric efficient estimators to select covariates for the propensity score based on the error in the conditional outcome model. Finally, we compare several approaches to automated variable selection in low- and high-dimensional settings through a simulation study. From this simulation study, we conclude that using IPTW with flexible prediction for the propensity score can result in inferior estimation, while Targeted minimum loss-based estimation and C-TMLE may benefit from flexible prediction and remain robust to the presence of variables that are highly correlated with treatment. However, in our study, standard influence function-based methods for the variance underestimated the standard errors, resulting in poor coverage under certain data-generating scenarios. PMID:26226129
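The IPTW estimand discussed above can be illustrated with a small simulation in which the propensity score is known by construction (a hedged sketch with hypothetical coefficients: in practice the propensity must be estimated, which is exactly where the variable-selection issues described in the abstract arise):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Confounder C affects both treatment assignment and outcome.
C = rng.normal(size=n)
propensity = 1.0 / (1.0 + np.exp(-C))          # true P(A=1 | C)
A = rng.binomial(1, propensity)                 # treatment indicator
Y = 2.0 * A + 1.5 * C + rng.normal(size=n)      # true E[Y | do(A=1)] = 2.0

# Naive treated-group mean is confounded by C.
naive = Y[A == 1].mean()

# IPTW: weight treated subjects by the inverse of their propensity score
# to recover the mean outcome under treatment for the whole population.
iptw = np.mean(A * Y / propensity)

print(round(naive, 2), round(iptw, 2))  # naive is biased upward; iptw is near 2.0
```

Replacing the true `propensity` with a fitted model reproduces the paper's setting; adjusting for pure causes of the exposure inflates the variance of exactly these weights.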
Computer graphics and research projects
International Nuclear Information System (INIS)
Ingtrakul, P.
1994-01-01
This report was prepared as an account of scientific visualization tools and application tools for scientists and engineers. It provides a set of tools to create pictures and to interact with them in natural ways. It applies many techniques of computer graphics and computer animation through a number of full-color presentations, such as computer-animated commercials, 3D computer graphics, dynamic and environmental simulations, scientific modeling and visualization, physically based modeling, and behavioral, skeletal, dynamics, and particle animation. It takes an in-depth look at the original hardware and the limitations of existing PC graphics adapters concerning system performance, especially with graphics-intensive application programs and user interfaces
Directory of Open Access Journals (Sweden)
A.N. Khomchenko
2016-08-01
The paper considers the problem of bi-cubic interpolation on finite elements of the serendipity family. Using cognitive-graphical analysis, the rigid model of Ergatoudis, Irons and Zenkevich (1968) is compared with alternative models obtained by three methods: direct geometric design, weighted averaging of the basis polynomials, and systematic generation of bases (an advanced Taylor procedure). The emphasis is placed on the phenomenon of "gravitational repulsion" (the Zenkevich paradox). The causes of physically inadequate spectra of nodal loads on higher-order serendipity elements are investigated. Soft modeling allows one to build many serendipity elements of bicubic interpolation, without even needing to know the exact form of the rigid model. Different interpretations of the integral characteristics of the basis polynomials are offered: geometrical, physical, and probabilistic. A soft model in the theory of interpolation of functions of two variables means a model amenable to change through the choice of basis. Such changes are excluded in the family of Lagrangian finite elements of higher orders (hard modeling). Standard models of the serendipity family (Zenkevich) were also rigid. It was found that the "responsibility" for the rigidity of the serendipity model rests on ruled surfaces of zero Gaussian curvature (conoids) that predominate in the basis set. Cognitive portraits of the zero lines of standard serendipity surfaces suggested that, in order to "soften" the serendipity pattern, conoids should be replaced by surfaces of alternating Gaussian curvature. The article presents alternative (soft) bases for serendipity models. The work is devoted to solving scientific and technological problems aimed at the creation, dissemination and use of cognitive computer graphics in teaching and learning. The results are of interest to students of the specialties "Computer Science and Information Technologies", "System Analysis", and "Software Engineering", as well as…
Wang, Wei; Nelson, Suchitra; Albert, Jeffrey M
2013-10-30
Mediators are intermediate variables in the causal pathway between an exposure and an outcome. Mediation analysis investigates the extent to which exposure effects occur through these variables, thus revealing causal mechanisms. In this paper, we consider the estimation of the mediation effect when the outcome is binary and multiple mediators of different types exist. We give a precise definition of the total mediation effect as well as decomposed mediation effects through individual or sets of mediators using the potential outcomes framework. We formulate a model of joint distribution (probit-normal) using continuous latent variables for any binary mediators to account for correlations among multiple mediators. A mediation formula approach is proposed to estimate the total mediation effect and decomposed mediation effects based on this parametric model. Estimation of mediation effects through individual or subsets of mediators requires an assumption involving the joint distribution of multiple counterfactuals. We conduct a simulation study that demonstrates low bias of mediation effect estimators for two-mediator models with various combinations of mediator types. The results also show that the power to detect a nonzero total mediation effect increases as the correlation coefficient between two mediators increases, whereas power for individual mediation effects reaches a maximum when the mediators are uncorrelated. We illustrate our approach by applying it to a retrospective cohort study of dental caries in adolescents with low and high socioeconomic status. Sensitivity analysis is performed to assess the robustness of conclusions regarding mediation effects when the assumption of no unmeasured mediator-outcome confounders is violated. Copyright © 2013 John Wiley & Sons, Ltd.
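For intuition, the decomposition into direct and indirect effects can be sketched in the simplest linear, single-mediator case (a generic illustration with hypothetical coefficients, not the paper's probit-normal model for binary outcomes and multiple mediators):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Exposure X affects mediator M, and both affect outcome Y (all linear).
X = rng.binomial(1, 0.5, size=n)
M = 1.0 * X + rng.normal(size=n)             # a-path: effect of X on M
Y = 0.5 * X + 2.0 * M + rng.normal(size=n)   # b-path: effect of M on Y; 0.5 is the direct effect

def ols(y, cols):
    """OLS coefficients of y on an intercept plus the given columns."""
    Z = np.column_stack([np.ones(len(y))] + cols)
    return np.linalg.lstsq(Z, y, rcond=None)[0]

a = ols(M, [X])[1]           # X -> M
b = ols(Y, [X, M])[2]        # M -> Y, adjusting for X
total = ols(Y, [X])[1]       # total effect of X on Y
direct = ols(Y, [X, M])[1]   # direct effect, adjusting for M

indirect = a * b             # product method
print(round(indirect, 2), round(total - direct, 2))  # both approx 2.0
```

In this linear setting the product method (a times b) and the difference method (total minus direct) estimate the same quantity; the mediation formula approach of the paper generalizes this decomposition to binary outcomes and correlated mediators, where the two no longer coincide so simply.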
Falasca, N W; D'Ascenzo, S; Di Domenico, A; Onofrj, M; Tommasi, L; Laeng, B; Franciotti, R
2015-04-01
Magnetoencephalography was recorded during a matching-to-sample plus cueing paradigm, in which participants judged the occurrence of changes in either categorical (CAT) or coordinate (COO) spatial relations. Previously, parietal and frontal lobes were identified as key areas in processing spatial relations, and it was shown that each hemisphere was differently involved and modulated by the scope of the attention window (e.g. a large or small cue). In this study, Granger analysis highlighted the patterns of causality among the involved brain areas: the direction of information transfer ran from the frontal to the visual cortex in the right hemisphere, whereas it ran in the opposite direction in the left side. Thus, the right frontal area seems to exert top-down influence, supporting the idea that, in this task, top-down signals are selectively related to the right side. Additionally, for a CAT change preceded by a small cue, the right frontal gyrus was not involved in the information transfer, indicating a selective specialization of the left hemisphere for this condition. The present findings strengthen the conclusion that there is a remarkable hemispheric specialization for spatial relation processing and illustrate the complex interactions between the lateralized parts of the neural network. Moreover, they illustrate how focusing attention over large or small regions of the visual field engages these lateralized networks differently, particularly in the frontal regions of each hemisphere, consistent with the theory that spatial relation judgements rely on a fronto-parietal network in the left hemisphere for categorical relations and in the right hemisphere for coordinate spatial processing. © 2015 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
Made Tirta, I.; Anggraeni, Dian
2018-04-01
Statistical models have developed rapidly in various directions to accommodate various types of data. Data collected from longitudinal, repeated-measures, or clustered designs (whether continuous, binary, count, or ordinal) are likely to be correlated. Therefore, statistical models for independent responses, such as the Generalized Linear Model (GLM) and the Generalized Additive Model (GAM), are not appropriate. Several models are available for correlated responses, including GEEs (Generalized Estimating Equations) for marginal models, and various mixed-effect models such as GLMM (Generalized Linear Mixed Models) and HGLM (Hierarchical Generalized Linear Models) for subject-specific models. These models are available in the free open-source software R, but they can only be accessed through a command-line interface (using scripts). On the other hand, most practical researchers rely heavily on menu-based or graphical user interfaces (GUIs). Using the Shiny framework, we develop a standard pull-down-menu Web GUI that unifies most models for correlated responses. The Web GUI accommodates almost all needed features. It enables users to carry out and compare various models for repeated-measures data (GEE, GLMM, HGLM, GEE for nominal responses) much more easily through online menus. This paper discusses the features of the Web GUI and illustrates their use. In general, we find that GEE, GLMM and HGLM give very close results.
Souza, W.R.
1999-01-01
This report documents a graphical display post-processor (SutraPlot) for the U.S. Geological Survey Saturated-Unsaturated flow and solute or energy TRAnsport simulation model SUTRA, Version 2D3D.1. This version of SutraPlot is an upgrade to SutraPlot for the 2D-only SUTRA model (Souza, 1987). It has been modified to add 3D functionality, a graphical user interface (GUI), and enhanced graphic output options. Graphical options for 2D SUTRA (2-dimension) simulations include: drawing the 2D finite-element mesh, mesh boundary, and velocity vectors; plots of contours for pressure, saturation, concentration, and temperature within the model region; 2D finite-element based gridding and interpolation; and 2D gridded data export files. Graphical options for 3D SUTRA (3-dimension) simulations include: drawing the 3D finite-element mesh; plots of contours for pressure, saturation, concentration, and temperature in 2D sections of the 3D model region; 3D finite-element based gridding and interpolation; drawing selected regions of velocity vectors (projected on principal coordinate planes); and 3D gridded data export files. Installation instructions and a description of all graphic options are presented. A sample SUTRA problem is described and three step-by-step SutraPlot applications are provided. In addition, the methodology and numerical algorithms for the 2D and 3D finite-element based gridding and interpolation, developed for SutraPlot, are described.
DEFF Research Database (Denmark)
Garcia Clavero, Ana Belén; Madsen, A.; Vigre, Håkan
2012-01-01
…exposure to Campylobacter. In this presentation we focus on the development of a computerized decision support system to aid management decisions on Campylobacter vaccination of commercial broilers. Broilers should be vaccinated against Campylobacter in the first 2 weeks of age. Therefore, the decision about vaccination usually needs to be made before Campylobacter is introduced into the flock. In fact, there is uncertainty regarding the introduction of Campylobacter into the flock that needs to be taken into account in the decision-making process. Probabilistic Graphical Models (PGMs) integrate… epidemiological and economic factors (cost-reward functions) have been included in the models. The final outcome of the models is presented as probabilities of the expected level of Campylobacter and in financial terms, influenced by the decision on vaccination. For example, if the best decision seems to be to vaccinate…
International Nuclear Information System (INIS)
Bass, L.; Wynholds, H.W.; Porterfield, W.R.
1975-01-01
Described is an operational system that enables the user, through an intelligent graphics terminal, to construct, modify, analyze, and store fault trees. With this system, complex engineering designs can be analyzed. This paper discusses the system and its capabilities. Included is a brief discussion of fault tree analysis, which represents an aspect of reliability and safety modeling
Causality and Time in Historical Institutionalism
DEFF Research Database (Denmark)
Mahoney, James; Mohamedali, Khairunnisa; Nguyen, Christoph
2016-01-01
This chapter explores the dual concern with causality and time in historical institutionalism using a graphical approach. The analysis focuses on three concepts that are central to this field: critical junctures, gradual change, and path dependence. The analysis makes explicit and formal the logic…
Behavioural Pattern of Causality Parameter of Autoregressive ...
African Journals Online (AJOL)
In this paper, a causal form of the Autoregressive Moving Average process ARMA(p, q) of various orders is considered, and the behaviour of the causality parameter of the ARMA model is investigated. It is deduced that the behaviour of the causality parameter ψi depends on the positive and negative values of the autoregressive parameter φ and the moving…
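The causality parameter ψ of an ARMA model is the sequence of MA(∞) weights in the causal representation X_t = Σ_j ψ_j ε_{t-j}; the standard recursion for these weights can be sketched as follows (a generic textbook computation with assumed coefficients, not the paper's specific cases):

```python
import numpy as np

def psi_weights(phi, theta, n):
    """First n causal MA(infinity) weights psi_j of an ARMA(p, q) process,
    X_t = sum_j psi_j * eps_{t-j}, via the standard recursion
    psi_j = theta_j + sum_{k=1..min(j,p)} phi_k * psi_{j-k}, with psi_0 = 1."""
    phi = np.asarray(phi, dtype=float)
    theta = np.asarray(theta, dtype=float)
    psi = np.zeros(n)
    psi[0] = 1.0
    for j in range(1, n):
        psi[j] = theta[j - 1] if j - 1 < len(theta) else 0.0
        for k in range(1, min(j, len(phi)) + 1):
            psi[j] += phi[k - 1] * psi[j - k]
    return psi

# ARMA(1, 1) with phi = 0.6, theta = 0.3:
# closed form psi_j = phi**(j-1) * (phi + theta) for j >= 1.
psi = psi_weights([0.6], [0.3], 6)
print(np.round(psi, 4))  # [1.  0.9  0.54  0.324  0.1944  0.1166]
```

For ARMA(1, 1) this reproduces the closed form ψ_j = φ^(j-1)(φ + θ) for j ≥ 1; the sign of φ controls whether the weights decay monotonically or alternate in sign, which is the behaviour investigated in the paper.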
Representing Personal Determinants in Causal Structures.
Bandura, Albert
1984-01-01
Responds to Staddon's critique of the author's earlier article and addresses issues raised by Staddon's (1984) alternative models of causality. The author argues that it is not the formalizability of causal processes that is the issue but whether cognitive determinants of behavior are reducible to past stimulus inputs in causal structures.…
Polverino, Pierpaolo; Frisk, Erik; Jung, Daniel; Krysander, Mattias; Pianese, Cesare
2017-07-01
The present paper proposes an advanced approach for Polymer Electrolyte Membrane Fuel Cell (PEMFC) system fault detection and isolation through a model-based diagnostic algorithm. The considered algorithm is developed upon a lumped parameter model simulating a whole PEMFC system oriented towards automotive applications. This model is inspired by other models available in the literature, with further attention to stack thermal dynamics and water management. The developed model is analysed by means of Structural Analysis to identify the correlations among the involved physical variables, the defined equations, and a set of faults which may occur in the system (related to both auxiliary component malfunctions and stack degradation phenomena). Residual generators are designed by means of Causal Computation analysis, and the maximum theoretical fault isolability achievable with a minimal number of installed sensors is investigated. The achieved results prove the capability of the algorithm to theoretically detect and isolate almost all faults using only stack voltage and temperature sensors, with significant advantages from an industrial point of view. The effective fault isolability is proved through fault simulations at a specific fault magnitude with an advanced residual evaluation technique, to consider quantitative residual deviations from normal conditions and achieve univocal fault isolation.
Connell, Ellery
2011-01-01
Helping graphic designers expand their 2D skills into the 3D space. The trend in graphic design is towards 3D, with the demand for motion graphics, animation, photorealism, and interactivity rapidly increasing. And with the meteoric rise of iPads, smartphones, and other interactive devices, the design landscape is changing faster than ever. 2D digital artists who need a quick and efficient way to join this brave new world will want 3D for Graphic Designers. Readers get hands-on basic training in working in the 3D space, including product design, industrial design and visualization, modeling, ani…
Chang, C.; Li, M.; Yeh, G.
2010-12-01
The BIOGEOCHEM numerical model (Yeh and Fang, 2002; Fang et al., 2003) was developed in FORTRAN for simulating reaction-based geochemical and biochemical processes with mixed equilibrium and kinetic reactions in batch systems. A complete suite of reactions, including aqueous complexation, adsorption/desorption, ion exchange, redox, precipitation/dissolution, acid-base reactions, and microbially mediated reactions, is embodied in this unique modeling tool. Any reaction can be treated as a fast/equilibrium or slow/kinetic reaction. An equilibrium reaction is modeled with an implicit finite rate governed by a mass action equilibrium equation or by a user-specified algebraic equation. A kinetic reaction is modeled with an explicit finite rate with an elementary rate, microbially mediated enzymatic kinetics, or a user-specified rate equation. None of the existing models has encompassed this wide array of scopes. To ease the input/output learning curve for the unique features of BIOGEOCHEM, an interactive graphical user interface was developed with the Microsoft Visual Studio and .NET tools. Several robust, user-friendly features, such as pop-up help windows, typo warning messages, and on-screen input hints, were implemented. All input data can be viewed in real time and are automatically made to conform to the input file format of BIOGEOCHEM. A post-processor for graphic visualization of simulated results was also embedded for immediate demonstrations. By following the data input windows step by step, error-free BIOGEOCHEM input files can be created even by users with little prior experience in FORTRAN. With this user-friendly interface, the time and effort needed to conduct simulations with BIOGEOCHEM can be greatly reduced.
Marvel, Skylar W; To, Kimberly; Grimm, Fabian A; Wright, Fred A; Rusyn, Ivan; Reif, David M
2018-03-05
Drawing integrated conclusions from diverse source data requires synthesis across multiple types of information. The ToxPi (Toxicological Prioritization Index) is an analytical framework that was developed to enable integration of multiple sources of evidence by transforming data into integrated, visual profiles. Methodological improvements have advanced ToxPi and expanded its applicability, necessitating a new, consolidated software platform to provide functionality while preserving flexibility for future updates. We detail the implementation of a new graphical user interface for ToxPi that provides interactive visualization, analysis, reporting, and portability. The interface is deployed as a stand-alone, platform-independent Java application, with a modular design to accommodate inclusion of future analytics. The new ToxPi interface introduces several features, from flexible data import formats (including legacy formats that permit backward compatibility) to similarity-based clustering to options for high-resolution graphical output. We present the new ToxPi interface for dynamic exploration, visualization, and sharing of integrated data models. The ToxPi interface is freely available as a single compressed download that includes the main Java executable, all libraries, example data files, and a complete user manual from http://toxpi.org .
Hattori, Masasi; Oaksford, Mike
2007-01-01
In this article, 41 models of covariation detection from 2 × 2 contingency tables were evaluated against past data in the literature and against data from new experiments. A new model was also included based on a limiting case of the normative phi-coefficient under an extreme rarity assumption, which has been shown to be an important factor in…
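The phi coefficient for a 2 × 2 contingency table, whose limiting case under rarity the article builds on, is straightforward to compute (the standard textbook formula, not code from the article):

```python
import math

def phi_coefficient(a, b, c, d):
    """Phi coefficient for a 2x2 contingency table
        [[a, b],
         [c, d]]
    where a counts joint presence and d counts joint absence.
    Equivalent to Pearson's r on the two binary variables."""
    num = a * d - b * c
    den = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return num / den

# A table with a strong positive association between row and column events:
print(round(phi_coefficient(40, 10, 10, 40), 2))  # 0.6
```

Under the rarity assumption discussed in the article, the cell for joint absence (d) dominates the table, which is what motivates the limiting-case model.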
On the detection of effective marketing instruments and causality in VAR models
Horváth, C.; Otter, P.W.
2000-01-01
Dynamic multivariate models are becoming increasingly popular for analyzing the behavior of competitive marketing environments. Takada and Bass (1998), Dekimpe, Hanssens and Silva-Risso (1999), and Dekimpe and Hanssens (1999) recommend using Vector Autoregressive (VAR) models because they provide…
Berger, Joseph B.; Milem, Jeffrey F.
1999-01-01
This study refined and applied an integrated model of undergraduate persistence (accounting for both behavioral and perceptual components) to examine first-year retention at a private, highly selective research university. Results suggest that including behaviorally based measures of involvement improves the model's explanatory power concerning…
Classical planning and causal implicatures
DEFF Research Database (Denmark)
Blackburn, Patrick Rowan; Benotti, Luciana
In this paper we motivate and describe a dialogue manager (called Frolog) which uses classical planning to infer causal implicatures. A causal implicature is a type of Gricean relation implicature, a highly context-dependent form of inference. As we shall see, causal implicatures are important… to generate clarification requests; as a result, we can model task-oriented dialogue as an interactive process locally structured by negotiation of the underlying task. We give several examples of Frolog-human dialogue, discuss the limitations imposed by the classical planning paradigm, and indicate…
An Empirical Study of Efficiency and Accuracy of Probabilistic Graphical Models
DEFF Research Database (Denmark)
Nielsen, Jens Dalgaard; Jaeger, Manfred
2006-01-01
In this paper we compare Naïve Bayes (NB) models, general Bayes Net (BN) models and Probabilistic Decision Graph (PDG) models w.r.t. accuracy and efficiency. As the basis for our analysis we use graphs of size vs. likelihood that show the theoretical capabilities of the models. We also measure…
Choosing an optimal model for failure data analysis by graphical approach
International Nuclear Information System (INIS)
Zhang, Tieling; Dwight, Richard
2013-01-01
Many models have been developed to fit a given set of failure data, involving combinations of multiple Weibull distributions, modifications of the Weibull distribution, or extensions of those modified forms. Applying these models to a given data set usually starts by plotting the data on Weibull probability paper (WPP). Two or more models may be appropriate for one typical shape of the fitted plot, while a single model may fit plots of different shapes. Hence a problem arises: how to choose an optimal model for a given data set, and how to fit it. Addressing this issue is the motivation of this paper. The paper summarizes the characteristics of Weibull-related models with more than three parameters, including sectional models involving two or three Weibull distributions, the competing-risk model, and the mixed Weibull model. The models discussed here are appropriate for data whose WPP plots are concave, convex, S-shaped, or inversely S-shaped. A method for model selection is then proposed, based on the shapes of the fitted plots, and the main procedure for parameter estimation of the models is described accordingly. In addition, the range of the data plots on WPP is highlighted from a practical point of view. This is important to note, because mathematical analysis of a model that neglects the applicable range of its plot will introduce discrepancies or large errors in model selection and parameter estimation.
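The WPP construction underlying the abstract can be sketched directly: for ordered failure times, plot ln t against ln(−ln(1 − F)) with F from median ranks; a roughly straight plot supports a single two-parameter Weibull, while concave, convex or S-shaped plots point to the multi-Weibull models discussed. The data below are simulated, not a real failure set.

```python
import numpy as np

# Weibull probability paper (WPP) coordinates for simulated failure times.
rng = np.random.default_rng(2)
t = np.sort(rng.weibull(2.0, 100) * 10.0)     # true shape 2, scale 10
i = np.arange(1, t.size + 1)
F = (i - 0.3) / (t.size + 0.4)                # Bernard's median-rank estimate
x, y = np.log(t), np.log(-np.log(1.0 - F))

# On WPP, y = beta * (x - ln eta): the fitted slope estimates the shape
# parameter beta and the intercept gives the scale parameter eta.
beta_hat, intercept = np.polyfit(x, y, 1)
eta_hat = np.exp(-intercept / beta_hat)
```

Departures of the (x, y) points from this fitted line are exactly the plot shapes the paper uses to select among sectional, competing-risk and mixed Weibull models.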
Directory of Open Access Journals (Sweden)
Paula Medina Maçaira
The Brazilian electricity energy matrix is essentially formed by hydraulic sources, which currently account for 70% of the installed capacity. One of the most important characteristics of a generation system with hydro predominance is the strong dependence on inflow regimes. Nowadays, the Brazilian power sector uses the PAR(p) model to generate scenarios for hydrological inflows. This approach does not consider any exogenous information that may affect hydrological regimes. The main objective of this paper is to infer the influence of climatic events on water inflows as a way to improve the model's performance. The proposed model is called "causal PAR(p)" and considers exogenous variables, such as El Niño and sunspots, to generate scenarios for some Brazilian reservoirs. The results show that the error measures decrease by approximately 3%. This improvement indicates that including climate variables in modeling and simulating the inflow time series is a valid exercise and should be taken into consideration.
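The "causal PAR(p)" idea can be illustrated with a periodic AR(1) in which each period's regression also includes an exogenous climate index. Everything below is synthetic: the index is a stand-in for a real El Niño or sunspot series, and the model structure is only a sketch of the approach, not the authors' implementation.

```python
import numpy as np

# Periodic AR(1) with an exogenous regressor: coefficients vary by period m.
rng = np.random.default_rng(3)
periods, years = 12, 60
nino = rng.normal(size=years * periods)       # synthetic climate index
flow = np.zeros(years * periods)
for t in range(1, flow.size):
    flow[t] = 0.6 * flow[t - 1] + 0.4 * nino[t] + rng.normal(scale=0.1)

# One regression per period m: flow_t ~ flow_{t-1} + nino_t (no intercept,
# since the synthetic series is mean-zero).
coefs = []
for m in range(periods):
    idx = np.arange(m, flow.size, periods)
    idx = idx[idx > 0]
    X = np.column_stack([flow[idx - 1], nino[idx]])
    coefs.append(np.linalg.lstsq(X, flow[idx], rcond=None)[0])
coefs = np.array(coefs)      # columns: phi_m (lag), gamma_m (exogenous)
```

A nonzero estimated gamma_m is the numerical counterpart of the abstract's claim that exogenous climate information improves inflow scenario generation.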
Causal Loop-based Modeling on System Dynamics for Risk Communication
Energy Technology Data Exchange (ETDEWEB)
Lee, Chang Ju [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Kang, Kyung Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2009-10-15
A national policy should rest on public confidence, which requires analyzing the public's recognition of, and attitudes toward, life safety, since the public has quite distinctive risk-perception characteristics. To achieve effective public consensus on a national policy issue such as nuclear power, a risk communication (hereafter RiCom) process must be utilized. However, domestic research models of the RiCom process do not provide a practical guideline, because most of them remain superficial and stick to administrative aspects. Moreover, most current models have not been verified and validated for effective application to diverse stakeholders. This study focuses on the public's dynamic mechanisms through system dynamics modeling, basically utilizing the causal loop diagram (CLD) and stock-flow diagram (SFD), which are regarded as critical techniques for decision making in many industrial RiCom models.
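The stock-and-flow mechanism that CLD/SFD models formalize can be sketched numerically: a stock accumulates the difference between an inflow and an outflow, integrated over time. The toy "public trust" model and every parameter below are assumptions for illustration, not values from the study.

```python
# Minimal stock-and-flow sketch: a "public trust" stock with an inflow
# driven by communication effort (with diminishing returns) and a
# balancing outflow (trust decays). Euler integration, illustrative only.
def simulate(trust0=0.2, effort=0.5, decay=0.1, gain=0.3, dt=0.25, steps=200):
    trust = trust0
    for _ in range(steps):
        inflow = gain * effort * (1.0 - trust)   # reinforcing effect saturates
        outflow = decay * trust                  # balancing loop
        trust += dt * (inflow - outflow)
    return trust

steady = simulate()   # settles where inflow equals outflow
```

Setting inflow equal to outflow gives the equilibrium gain·effort·(1 − T) = decay·T, i.e. T = 0.6 for these illustrative parameters; running the loop confirms the stock converges there.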
Causal Loop-based Modeling on System Dynamics for Risk Communication
International Nuclear Information System (INIS)
Lee, Chang Ju; Kang, Kyung Min
2009-01-01
New Product Development and Innovation in the Maquiladora Industry: A Causal Model
Directory of Open Access Journals (Sweden)
Jorge Luis García-Alcaraz
2016-07-01
Companies seek to stand out from their competitors and react to competitive threats. Making a difference means doing things differently in order to create a product that other companies cannot provide, which can be achieved through an innovation process. Using a structural equation model, this article analyses the current situation of Mexican maquiladora companies, which face the constant challenge of product innovation. The model associates three success factors for new product development (product, organization, and production process characteristics) as independent latent variables with benefits gained by customers and companies (dependent latent variables). Results show that, in the Mexican maquiladora sector, organizational characteristics and production process characteristics explain only 31% of the variability (R² = 0.31), and it seems necessary to integrate other aspects. The relationship between customer benefits and company benefits explains 58% of the variability, the largest proportion in the model (R² = 0.58).
Directory of Open Access Journals (Sweden)
Antonio Belmonte
2018-04-01
This paper presents a new algorithm for the resolution of over-constrained lumped process systems, where the partial differential equations of a continuous time-and-space model of the system are reduced to ordinary differential equations with a finite number of parameters, and where the model equations outnumber the unknown model variables. Our proposal is aimed at the study and improvement of the algorithm proposed by Hangos-Szerkenyi-Tuza. The new algorithm improves the computational cost and solves some of the internal problems of the aforementioned algorithm in its original formulation. It is based on parameter relaxation that can be modified easily, and it retains the necessary information about the lumped process system to reduce the time cost after changes are introduced during system formulation. It also allows adjustment of system formulations that change their differential index between simulations.
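Over-constraint simply means more equations than unknowns. As a generic illustration only (this is not the Hangos-Szerkenyi-Tuza algorithm or the authors' relaxation scheme), a least-squares solve of an overdetermined linearized system shows how a slightly inconsistent extra equation is reconciled:

```python
import numpy as np

# 3 equations, 2 unknowns; the third equation is slightly inconsistent
# with the first two, so no exact solution exists.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.1])
# lstsq minimizes ||A z - b||^2 and reports the residual sum of squares.
z, residual, rank, _ = np.linalg.lstsq(A, b, rcond=None)
```

The nonzero residual quantifies how much the over-constrained system had to be relaxed, which is the kind of bookkeeping a relaxation-based resolution algorithm must manage systematically.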
Possel, Patrick; Seemann, Simone; Ahrens, Stefanie; Hautzinger, Martin
2006-01-01
In Dodge's model of "social information processing" depression is the result of a linear sequence of five stages of information processing ("Annu Rev Psychol" 44: 559-584, 1993). These stages follow a person's reaction to situational stimuli, such that each stage of information processing mediates the relationship between earlier and later stages.…
Causal Client Models in Selecting Effective Interventions: A Cognitive Mapping Study
de Kwaadsteniet, Leontien; Hagmayer, York; Krol, Nicole P. C. M.; Witteman, Cilia L. M.
2010-01-01
An important reason to choose an intervention to treat psychological problems of clients is the expectation that the intervention will be effective in alleviating the problems. The authors investigated whether clinicians base their ratings of the effectiveness of interventions on models that they construct representing the factors causing and…
Directory of Open Access Journals (Sweden)
Hilary Zelko
BACKGROUND: Although scientific innovation has been a long-standing topic of interest for historians, philosophers and cognitive scientists, few studies in biomedical research have examined, from researchers' perspectives, how high impact publications are developed and why they are consistently produced by a small group of researchers. Our objective was therefore to interview a group of researchers with a track record of high impact publications to explore what mechanisms they believe contribute to the generation of high impact publications. METHODOLOGY/PRINCIPAL FINDINGS: Researchers were located in universities all over the globe and interviews were conducted by phone. All interviews were transcribed using standard qualitative methods. A Grounded Theory approach was used to code each transcript, later aggregating concepts and categories into an overarching explanatory model. The model was then translated into a System Dynamics mathematical model to represent its structure and behavior. Five emerging themes were found in our study. First, researchers used heuristics or rules of thumb that came naturally to them. Second, these heuristics were reinforced by positive feedback from their peers and mentors. Third, good communication skills allowed researchers to provide feedback to their peers, thus closing a positive feedback loop. Fourth, researchers exhibited a number of psychological attributes, such as curiosity or open-mindedness, that constantly motivated them, even when faced with discouraging situations. Fifth, the system is dominated by randomness and serendipity and is far from a linear and predictable environment. Some researchers, however, took advantage of this randomness by incorporating mechanisms that would allow them to benefit from random findings. The aggregation of these themes into a policy model represented the overall expected behavior of publications and their impact achieved by high impact researchers. CONCLUSIONS: The proposed
Directory of Open Access Journals (Sweden)
Arup Kumar Baksi
2012-08-01
Information technology induced communications (ICTs) have revolutionized the operational aspects of the service sector and have triggered a perceptual shift in service quality, as rapid dis-intermediation has changed how consumers access services. ICT-enabled services have further stimulated the perception of automated service quality, with renewed dimensions and subsequent significance in influencing consumers' behavioural outcomes. Customer Relationship Management (CRM) has emerged as an offshoot of this technological breakthrough, as it ensures service encapsulation by integrating people, process and technology. This paper attempts to explore the relationship between automated service quality and its behavioural consequences under a relatively novel business philosophy, CRM. The study was conducted on the largest public sector bank of India, State Bank of India (SBI), at Kolkata, which successfully completed its decade-long operational automation in 2008. The study used structural equation modeling (SEM) to justify the proposed model construct and causal loop diagramming (CLD) to depict the negative and positive linkages between the variables.
Discrete causal theory emergent spacetime and the causal metric hypothesis
Dribus, Benjamin F
2017-01-01
This book evaluates and suggests potentially critical improvements to causal set theory, one of the best-motivated approaches to the outstanding problems of fundamental physics. Spacetime structure is of central importance to physics beyond general relativity and the standard model. The causal metric hypothesis treats causal relations as the basis of this structure. The book develops the consequences of this hypothesis under the assumption of a fundamental scale, with smooth spacetime geometry viewed as emergent. This approach resembles causal set theory, but differs in important ways; for example, the relative viewpoint, emphasizing relations between pairs of events, and relationships between pairs of histories, is central. The book culminates in a dynamical law for quantum spacetime, derived via generalized path summation.
From meta-omics to causality: experimental models for human microbiome research.
Fritz, Joëlle V; Desai, Mahesh S; Shah, Pranjul; Schneider, Jochen G; Wilmes, Paul
2013-05-03
Large-scale 'meta-omic' projects are greatly advancing our knowledge of the human microbiome and its specific role in governing health and disease states. A myriad of ongoing studies aim at identifying links between microbial community disequilibria (dysbiosis) and human diseases. However, due to the inherent complexity and heterogeneity of the human microbiome, cross-sectional, case-control and longitudinal studies may not have enough statistical power to allow causation to be deduced from patterns of association between variables in high-resolution omic datasets. Therefore, to move beyond reliance on the empirical method, experiments are critical. For these, robust experimental models are required that allow the systematic manipulation of variables to test the multitude of hypotheses, which arise from high-throughput molecular studies. Particularly promising in this respect are microfluidics-based in vitro co-culture systems, which allow high-throughput first-pass experiments aimed at proving cause-and-effect relationships prior to testing of hypotheses in animal models. This review focuses on widely used in vivo, in vitro, ex vivo and in silico approaches to study host-microbial community interactions. Such systems, either used in isolation or in a combinatory experimental approach, will allow systematic investigations of the impact of microbes on the health and disease of the human host. All the currently available models present pros and cons, which are described and discussed. Moreover, suggestions are made on how to develop future experimental models that not only allow the study of host-microbiota interactions but are also amenable to high-throughput experimentation.
Towards the Accuracy of Cybernetic Strategy Planning Models: Causal Proof and Function Approximation
Directory of Open Access Journals (Sweden)
Christian A. Hillbrand
2003-04-01
All kinds of strategic tasks within an enterprise require a deep understanding of its critical key success factors and their interrelations, as well as an in-depth analysis of relevant environmental influences. Due to the openness of the underlying system, there seems to be an indefinite number of unknown variables influencing strategic goals. Cybernetic or systemic planning techniques try to overcome this intricacy by modeling the most important cause-and-effect relations within such a system. Although it seems obvious that there are specific influences between business variables, it is mostly impossible to identify the functional dependencies underlying such relations. Hence simulation or evaluation techniques based on such hypothetically assumed models deliver inaccurate results or fail completely. This paper addresses the need for accurate strategy planning models and proposes an approach to prove their cause-and-effect relations by empirical evidence. On this foundation, an approach for approximating the underlying cause-and-effect function by means of Artificial Neural Networks is developed.
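The function-approximation step can be illustrated with a tiny feed-forward network: one tanh hidden layer trained by plain batch gradient descent on synthetic data. The "unknown" cause-and-effect function and all hyperparameters below are assumptions for the sketch, not the paper's architecture.

```python
import numpy as np

# Approximate an unknown cause-and-effect relation y = f(x) with a small
# neural network (1 input, 16 tanh hidden units, 1 output).
rng = np.random.default_rng(4)
X = rng.uniform(-1, 1, (256, 1))
y = np.sin(2 * X)                                # hidden causal relation

W1 = rng.normal(scale=0.5, size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
lr, n = 0.1, len(X)
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)                     # forward pass
    err = H @ W2 + b2 - y                        # dLoss/dpred for 0.5*MSE
    gW2, gb2 = H.T @ err / n, err.mean(0)        # gradients, output layer
    dH = (err @ W2.T) * (1 - H ** 2)             # backprop through tanh
    gW1, gb1 = X.T @ dH / n, dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
```

After training, the network's mean squared error is a small fraction of the target's variance, which is the sense in which an ANN can stand in for a functional dependency that cannot be written down analytically.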
α-Decomposition for estimating parameters in common cause failure modeling based on causal inference
International Nuclear Information System (INIS)
Zheng, Xiaoyu; Yamaguchi, Akira; Takata, Takashi
2013-01-01
The traditional α-factor model has focused on the occurrence frequencies of common cause failure (CCF) events. Global α-factors in the α-factor model are defined as fractions of failure probability for particular groups of components. However, there are unknown uncertainties in CCF parameter estimation owing to the scarcity of available failure data. Joint distributions of CCF parameters are actually determined by a set of possible causes, which are characterized by CCF-triggering abilities and occurrence frequencies. In the present paper, the process of α-decomposition (the Kelly-CCF method) is developed to learn about sources of uncertainty in CCF parameter estimation. Moreover, it aims to evaluate the CCF risk significance of different causes, expressed as decomposed α-factors. First, a hybrid Bayesian network is adopted to reveal the relationship between potential causes and failures. Second, because potential causes differ in their occurrence frequencies and in their abilities to trigger dependent or independent failures, a regression model is provided and proved by conditional probability: global α-factors are expressed in terms of explanatory variables (causes' occurrence frequencies) and parameters (decomposed α-factors). Finally, an example is provided to illustrate the process of hierarchical Bayesian inference for the α-decomposition process. This study shows that the α-decomposition method can integrate failure information from the cause, component and system levels. It can parameterize the CCF risk significance of possible causes and can update the probability distributions of global α-factors. Moreover, it provides a reliable way to evaluate uncertainty sources and reduce uncertainty in probabilistic risk assessment. It is recommended to build databases including CCF parameters and the corresponding causes' occurrence frequencies for each targeted system.
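The global α-factors at the root of this method have a simple empirical form: α_k is the fraction of failure events in which exactly k of the m components fail together. The event counts below are illustrative, not real CCF data, and the Dirichlet posterior is only a minimal stand-in for the hierarchical Bayesian treatment the paper develops.

```python
import numpy as np

# Global alpha-factors for a group of m = 3 components.
m = 3
n = np.array([100, 8, 2])        # n[k-1] events with exactly k failed units
alpha = n / n.sum()              # point estimates alpha_1, alpha_2, alpha_3

# A minimal Bayesian version: with a Dirichlet(1, 1, 1) prior on
# (alpha_1, ..., alpha_m), the posterior mean after observing n is:
alpha_post = (n + 1) / (n.sum() + m)
```

The α-decomposition step then goes further, expressing these global fractions as functions of individual causes' occurrence frequencies, so that updating the cause data updates the α-factors.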
Wang, Wei; Nelson, Suchitra; Albert, Jeffrey M.
2013-01-01
Mediators are intermediate variables in the causal pathway between an exposure and an outcome. Mediation analysis investigates the extent to which exposure effects occur through these variables, thus revealing causal mechanisms. In this paper, we consider the estimation of the mediation effect when the outcome is binary and multiple mediators of different types exist. We give a precise definition of the total mediation effect as well as decomposed mediation effects through individual or sets ...
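The decomposition of an exposure effect into direct and mediated parts is easiest to see in the linear, single-mediator case, which is only a toy version of the binary-outcome, multiple-mediator setting the paper treats: with M = aA + e and Y = cA + bM + e', the indirect effect is a·b, the direct effect is c, and the total effect is c + a·b.

```python
import numpy as np

# Toy mediation decomposition (linear case): estimate a from M ~ A and
# (c, b) from Y ~ A + M, then recombine. Simulated data, true values
# a = 0.5, b = 0.8, c = 0.3, so total effect = 0.3 + 0.4 = 0.7.
a, b, c = 0.5, 0.8, 0.3
rng = np.random.default_rng(5)
n = 20000
A = rng.integers(0, 2, n).astype(float)       # binary exposure
M = a * A + rng.normal(size=n)                # mediator
Y = c * A + b * M + rng.normal(size=n)        # outcome

a_hat = np.polyfit(A, M, 1)[0]                # slope of M on A
X = np.column_stack([np.ones(n), A, M])
_, c_hat, b_hat = np.linalg.lstsq(X, Y, rcond=None)[0]
indirect, direct = a_hat * b_hat, c_hat
total = direct + indirect                     # should be near 0.7
```

For binary outcomes and multiple mediators this simple product-of-coefficients logic no longer applies directly, which is exactly the gap the paper's definitions address.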
Anacleto, Osvaldo; Queen, Catriona; Albers, Casper J.
Traffic flow data are routinely collected for many networks worldwide. These invariably large data sets can be used as part of a traffic management system, for which good traffic flow forecasting models are crucial. The linear multiregression dynamic model (LMDM) has been shown to be promising for
International Nuclear Information System (INIS)
Killip, Ian Richmond
2002-01-01
This thesis investigates the range, distribution and causes of high radon levels in dwellings in the Brighton area of Southeast England. Indoor radon levels were measured in more than 1000 homes. The results show that high radon levels can arise in an area previously considered to offer low radon potential from local geological sources. Climate and building-related factors were found to significantly affect radon levels in dwellings. Multiple regression was used to determine the influence of the various factors on indoor radon levels, and an empirical model was developed to predict indoor radon levels. The radon hazard, independent of building-related effects, was determined for each surveyed location by adjusting the radon measurement to that expected on the ground floor of a 'model' dwelling. This standardised set of radon levels was entered into a geographical information system (GIS) and related to surface geology. The geometric mean radon level for each lithological unit was plotted to produce a radon hazard map for the area. The highest radon levels were found to be associated with the youngest Chalk Formations, particularly where they meet overlying Tertiary deposits, and with Clay-with-Flints Quaternary deposits in the area. The results were also converted to the radon activity equivalent to that expected from the NRPB's standard dual-detector dwelling survey method and analysed by lognormal modelling to estimate the proportion of dwellings likely to exceed the UK Action Level of 200 Bq/m³ for each lithological unit. The likely percentages of dwellings affected by radon thus obtained were mapped to lithological boundaries to produce a radon potential map. The radon hazard map and the empirical radon model facilitate the prediction of radon levels in dwellings of comparable construction and above similar geology, and should further the understanding of the behaviour of radon gas in buildings to allow indoor radon concentrations to be controlled. The radon
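The lognormal modelling step mentioned above has a compact closed form: if indoor radon is lognormal with geometric mean GM and geometric standard deviation GSD, the fraction of dwellings above an action level L is a normal tail probability in log space. The GM and GSD values below are illustrative assumptions, not the thesis's fitted values.

```python
import math

# Fraction of dwellings exceeding an action level under a lognormal fit:
# ln(radon) ~ Normal(ln GM, (ln GSD)^2), so P(X > L) = 1 - Phi(z) with
# z = (ln L - ln GM) / ln GSD.
def fraction_exceeding(gm, gsd, action_level=200.0):
    z = (math.log(action_level) - math.log(gm)) / math.log(gsd)
    return 0.5 * math.erfc(z / math.sqrt(2.0))

frac = fraction_exceeding(gm=50.0, gsd=2.5)   # per-lithology GM/GSD would go here
```

Applying this per lithological unit yields exactly the "percentage of dwellings likely to exceed 200 Bq/m³" that the radon potential map displays.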
Printing--Graphic Arts--Graphic Communications
Hauenstein, A. Dean
1975-01-01
Recently, "graphic arts" has shifted from printing skills to a conceptual approach of production processes. "Graphic communications" must embrace the total system of communication through graphic media, to serve broad career education purposes; students taught concepts and principles can be flexible and adaptive. The author…
Causes in the construction of causal law: A psycho-ecological model.
Young, Gerald
2010-01-01
The article presents an integrated psycho-ecological model of the construction of law, with implications for practice in law and mental health. The model is based on a series of concentric circles, each representing a layer of influence on the construction of law. The circle furthest from the center represents the influence of culture, society and industry in particular, and the circle at the center represents the case at hand, for example, an individual complainant or a mass action. The article begins by arguing that basic terms relating to cause need clarification, and that work is needed to disambiguate the concepts involved. After dealing with these issues, the article examines science and mental health. Is the scientific evidence presented by the expert sufficiently reliable and valid to meet admissibility standards of good, as opposed to poor or junk, science? Is the research undertaken for court or presented to court biased, with hidden factors such as links to industry? Are individual evaluations conducted with biased science serving to justify partial conclusions? The dangers of powerful influences on the construction of law are highlighted, for example, those related to malingering by an individual complainant and to the insurance industry protecting its financial interests at the expense of genuinely injured patients. In conclusion, suggestions for empirical research are offered. Copyright 2009 Elsevier Ltd. All rights reserved.
Wang, Wei; Albert, Jeffrey M
2017-08-01
An important problem within the social, behavioral, and health sciences is how to partition an exposure effect (e.g. treatment or risk factor) among specific pathway effects and to quantify the importance of each pathway. Mediation analysis based on the potential outcomes framework is an important tool to address this problem and we consider the estimation of mediation effects for the proportional hazards model in this paper. We give precise definitions of the total effect, natural indirect effect, and natural direct effect in terms of the survival probability, hazard function, and restricted mean survival time within the standard two-stage mediation framework. To estimate the mediation effects on different scales, we propose a mediation formula approach in which simple parametric models (fractional polynomials or restricted cubic splines) are utilized to approximate the baseline log cumulative hazard function. Simulation study results demonstrate low bias of the mediation effect estimators and close-to-nominal coverage probability of the confidence intervals for a wide range of complex hazard shapes. We apply this method to the Jackson Heart Study data and conduct sensitivity analysis to assess the impact on the mediation effects inference when the no unmeasured mediator-outcome confounding assumption is violated.
Nuclear reactors; graphical symbols
International Nuclear Information System (INIS)
1987-11-01
This standard contains graphical symbols that indicate the type of nuclear reactor and are used in designing graphical and technical presentations. Distinguishing features of nuclear reactors are laid down in graphical symbols. (orig.)
Kuchinke, Wolfgang; Ohmann, Christian; Verheij, Robert A; van Veen, Evert-Ben; Arvanitis, Theodoros N; Taweel, Adel; Delaney, Brendan C
2014-12-01
To develop a model describing core concepts and principles of data flow, data privacy and confidentiality, in a simple and flexible way, using concise process descriptions and a diagrammatic notation applied to research workflow processes. The model should help to generate robust data privacy frameworks for research done with patient data. Based on an exploration of EU legal requirements for data protection and privacy, data access policies, and existing privacy frameworks of research projects, basic concepts and common processes were extracted, described and incorporated into a model with a formal graphical representation and a standardised notation. The Unified Modelling Language (UML) notation was enriched with workflow and custom symbols to enable the representation of extended data flow requirements, data privacy and data security requirements, and privacy enhancing techniques (PET), and to allow privacy threat analysis for research scenarios. Our model is built upon the concept of three privacy zones (Care Zone, Non-care Zone and Research Zone) containing databases and data transformation operators, such as data linkers and privacy filters. Using these model components, a risk gradient for moving data from a zone of high risk for patient identification to a zone of low risk can be described. The model was applied to the analysis of data flows in several general clinical research use cases and two research scenarios from the TRANSFoRm project (e.g., finding patients for clinical research and linkage of databases). The model was validated by representing research done with the NIVEL Primary Care Database in the Netherlands. The model allows analysis of data privacy and confidentiality issues for research with patient data in a structured way and provides a framework to specify a privacy compliant data flow, to communicate privacy requirements and to identify weak points for an adequate implementation of data privacy. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Shimizu, Takahiro
2014-01-01
The author developed a new scale aimed at measuring beliefs about "hypnotic states" and investigated the influence of such beliefs and attitudes on hypnotic responses in a large sample of Japanese undergraduate students. Exploratory factor analysis of this new questionnaire examining beliefs about hypnotic states yielded four factors: Dissociative or Depersonalized Experience, Loss of Self-Control, Therapeutic Expectation, and Arousing Extraordinary Ability. The results of structural equation modeling showed that Therapeutic Expectation and Arousing Extraordinary Ability influenced hypnotizability through attitudes toward hypnosis, while also directly affecting subjective experiences without mediating attitudes. Present findings suggest that it is more effective to enhance therapeutic expectations than to correct misconceptions about hypnotic states in modification of patients' beliefs before initiating treatment.
Frisch, Mathias
2014-01-01
Much has been written on the role of causal notions and causal reasoning in the so-called 'special sciences' and in common sense. But does causal reasoning also play a role in physics? Mathias Frisch argues that, contrary to what influential philosophical arguments purport to show, the answer is yes. Time-asymmetric causal structures are as integral a part of the representational toolkit of physics as a theory's dynamical equations. Frisch develops his argument partly through a critique of anti-causal arguments and partly through a detailed examination of actual examples of causal notions in physics, including causal principles invoked in linear response theory and in representations of radiation phenomena. Offering a new perspective on the nature of scientific theories and causal reasoning, this book will be of interest to professional philosophers, graduate students, and anyone interested in the role of causal thinking in science.
Jones, Robert
2010-03-01
There are a wide range of views on causality. To some (e.g. Karl Popper) causality is superfluous. Bertrand Russell said "In advanced science the word cause never occurs. Causality is a relic of a bygone age." At the other extreme, Rafael Sorkin and L. Bombelli suggest that space and time do not exist but are only an approximation to a reality that is simply a discrete ordered set, a "causal set." For them causality IS reality. Others, like Judea Pearl and Nancy Cartwright, are seeking to build a complex fundamental theory of causality (Causality, Cambridge Univ. Press, 2000). Or perhaps a theory of causality is simply the theory of functions. This is more or less my take on causality.
Probabilistic Graphical Models for the Analysis and Synthesis of Musical Audio
Hoffmann, Matthew Douglas
Content-based Music Information Retrieval (MIR) systems seek to automatically extract meaningful information from musical audio signals. This thesis applies new and existing generative probabilistic models to several content-based MIR tasks: timbral similarity estimation, semantic annotation and retrieval, and latent source discovery and separation. In order to estimate how similar two songs sound to one another, we employ a Hierarchical Dirichlet Process (HDP) mixture model to discover a shared representation of the distribution of timbres in each song. Comparing songs under this shared representation yields better query-by-example retrieval quality and scalability than previous approaches. To predict what tags are likely to apply to a song (e.g., "rap," "happy," or "driving music"), we develop the Codeword Bernoulli Average (CBA) model, a simple and fast mixture-of-experts model. Despite its simplicity, CBA performs at least as well as state-of-the-art approaches at automatically annotating songs and finding the songs in a database to which a given tag most applies. Finally, we address the problem of latent source discovery and separation by developing two Bayesian nonparametric models, the Shift-Invariant HDP and Gamma Process NMF. These models allow us to discover what sounds (e.g. bass drums, guitar chords, etc.) are present in a song or set of songs and to isolate or suppress individual sources. These models' ability to decide how many latent sources are necessary to model the data is particularly valuable in this application, since it is impossible to guess a priori how many sounds will appear in a given song or set of songs. Once they have been fit to data, probabilistic models can also be used to drive the synthesis of new musical audio, both for creative purposes and to qualitatively diagnose what information a model does and does not capture. We also adapt the SIHDP model to create new versions of input audio with arbitrary sample sets, for example, to create
Directory of Open Access Journals (Sweden)
Idil Kokal
Studies investigating joint actions have suggested a central role for the putative mirror neuron system (pMNS) because of the close link between perception and action provided by these brain regions [1], [2], [3]. In contrast, our previous functional magnetic resonance imaging (fMRI) experiment demonstrated that the BOLD response of the pMNS does not suggest that it directly integrates observed and executed actions during joint actions [4]. To test whether the pMNS might contribute indirectly to the integration process by sending information to the brain areas responsible for this integration (the integration network), here we used Granger causality mapping (GCM) [5]. We explored the directional information flow between the anterior sites of the pMNS and previously identified integrative brain regions. We found that the left BA44 sent more information than it received to both the integration network (left thalamus, right middle occipital gyrus and cerebellum) and more posterior nodes of the pMNS (BA2). Thus, during joint actions, two anatomically separate networks seem effectively connected, and the information flow is predominantly from anterior to posterior areas of the brain. These findings suggest that the pMNS is involved indirectly in joint actions by transforming observed and executed actions into a common code, and is part of a generative model that could predict the future somatosensory and visual consequences of observed and executed actions in order to overcome otherwise inevitable neural delays.
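The statistic behind Granger causality mapping can be sketched in the bivariate case: x "Granger-causes" y if lagged x improves the prediction of y beyond y's own lags, assessed by comparing restricted and full autoregressions. The data below are synthetic, with one lag for brevity; GCM applies the same logic voxel-wise to fMRI time series.

```python
import numpy as np

# Bivariate Granger-causality sketch on simulated series where x drives y.
rng = np.random.default_rng(6)
T = 500
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.3 * y[t - 1] + 0.5 * x[t - 1] + rng.normal(scale=0.5)

Y = y[1:]
restricted = np.column_stack([np.ones(T - 1), y[:-1]])          # y's own lag
full = np.column_stack([restricted, x[:-1]])                    # plus x's lag
rss_r = np.sum((Y - restricted @ np.linalg.lstsq(restricted, Y, rcond=None)[0]) ** 2)
rss_f = np.sum((Y - full @ np.linalg.lstsq(full, Y, rcond=None)[0]) ** 2)
F = (rss_r - rss_f) / (rss_f / (T - 1 - 3))   # F statistic, 1 added lag
```

A large F indicates that past x carries predictive information about y, which is the directional "information flow" reported between BA44 and the integration network.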
Chen, Mei-Chih; Chang, Kaowen
2014-11-06
Many city governments choose to supply more developable land and transportation infrastructure with the hope of attracting people and businesses to their cities. However, like those in Taiwan, major cities worldwide suffer from traffic congestion. This study applies the system thinking logic of the causal loop diagram (CLD) model in the System Dynamics (SD) approach to analyze the issue of traffic congestion and other issues related to roads and land development in Taiwan's cities. Comparing the characteristics of development trends with yearbook data for 2002 to 2013 for all of Taiwan's cities, this study explores the developing phenomenon of unlimited city sprawl and identifies the cause and effect relationships in the characteristics of development trends in traffic congestion, high-density population aggregation in cities, land development, and green land disappearance resulting from city sprawl. This study provides conclusions for Taiwan's cities' sustainability and development (S&D). When developing S&D policies and making decisions about city planning and land use management, governments should take a holistic view of carrying capacity, assisted by system thinking, to expose the prejudices in favor of the unlimited development resulting from city sprawl.
Kim, Hui Taek; Ahn, Tae Young; Jang, Jae Hoon; Kim, Kang Hee; Lee, Sung Jae; Jung, Duk Young
2017-03-01
Three-dimensional (3D) computed tomography imaging is now being used to generate 3D models for planning orthopaedic surgery, but the process remains time consuming and expensive. For chronic radial head dislocation, we have designed a graphic overlay approach that employs selected 3D computer images and widely available software to simplify the process of osteotomy site selection. We studied 5 patients (2 traumatic and 3 congenital) with unilateral radial head dislocation. These patients were treated with surgery based on traditional radiographs, but they also had full sets of 3D CT imaging done both before and after their surgery: these 3D CT images form the basis for this study. From the 3D CT images, 3 sets of 3D-printed bone models were generated for each patient: 2 copies of the preoperative condition, and 1 copy of the postoperative condition. One set of the preoperative models was then actually osteotomized and fixed in the manner suggested by our graphic technique. Arcs of rotation of the 3 sets of 3D-printed bone models were then compared. Arcs of rotation of the 3 groups of bone models were significantly different, with the models osteotomized according to our graphic technique having the widest arcs. For chronic radial head dislocation, our graphic overlay approach simplifies the selection of the osteotomy site(s). Three-dimensional-printed bone models suggest that this approach could improve range of motion of the forearm in actual surgical practice. Level IV-therapeutic study.
Regression to Causality: Regression-style presentation influences causal attribution
DEFF Research Database (Denmark)
Bordacconi, Mats Joe; Larsen, Martin Vinæs
2014-01-01
of equivalent results presented as either regression models or as a test of two sample means. Our experiment shows that the subjects who were presented with results as estimates from a regression model were more inclined to interpret these results causally. Our experiment implies that scholars using regression...... models – one of the primary vehicles for analyzing statistical results in political science – encourage causal interpretation. Specifically, we demonstrate that presenting observational results in a regression model, rather than as a simple comparison of means, makes causal interpretation of the results...... more likely. Our experiment drew on a sample of 235 university students from three different social science degree programs (political science, sociology and economics), all of whom had received substantial training in statistics. The subjects were asked to compare and evaluate the validity...
Directory of Open Access Journals (Sweden)
Cristina Puente Águeda
2011-10-01
Full Text Available Causality is a fundamental notion in every field of science. Since the times of Aristotle, causal relationships have been a matter of study as a way to generate knowledge and provide explanations. In this paper I review the notion of causality through different scientific areas such as physics, biology, engineering, etc. In the scientific area, causality is usually seen as a precise relation: the same cause always provokes the same effect. But in the everyday world, the links between cause and effect are frequently imprecise or imperfect in nature. Fuzzy logic offers an adequate framework for dealing with imperfect causality, so a few notions of fuzzy causality are introduced.
A graphical interface based model for wind turbine drive train dynamics
Energy Technology Data Exchange (ETDEWEB)
Manwell, J.F.; McGowan, J.G.; Abdulwahid, U.; Rogers, A. [Univ. of Massachusetts, Amherst, MA (United States); McNiff, B. [McNiff Light Industry, Blue Hill, ME (United States)
1996-12-31
This paper presents a summary of a wind turbine drive train dynamics code that has been under development at the University of Massachusetts, under National Renewable Energy Laboratory (NREL) support. The code is intended to be used to assist in the proper design and selection of drive train components. This work summarizes the development of the equations of motion for the model, and discusses the method of solution. In addition, a number of comparisons with analytical solutions and experimental field data are given. The summary includes conclusions and suggestions for future work on the model. 13 refs., 10 figs.
Directory of Open Access Journals (Sweden)
Giovanni M. Marchetti
2006-02-01
Full Text Available We describe some functions in the R package ggm to derive from a given Markov model, represented by a directed acyclic graph, different types of graphs induced after marginalizing over and conditioning on some of the variables. The package has a few basic functions that find the essential graph, the induced concentration and covariance graphs, and several types of chain graphs implied by the directed acyclic graph (DAG after grouping and reordering the variables. These functions can be useful to explore the impact of latent variables or of selection effects on a chosen data generating model.
Directory of Open Access Journals (Sweden)
Basabdatta Sen Bhattacharya
2016-11-01
Full Text Available Experimental studies on the Lateral Geniculate Nucleus (LGN) of mammals and rodents show that the inhibitory interneurons (IN) receive around 47.1% of their afferents from the retinal spiking neurons, and constitute around 20-25% of the LGN cell population. However, there is a definite gap in knowledge about the role and impact of IN on thalamocortical dynamics in both experimental and model-based research. We use a neural mass computational model of the LGN with three neural populations, viz. IN, thalamocortical relay (TCR), and thalamic reticular nucleus (TRN), to study the causality of IN on LGN oscillations and state-transitions. The synaptic information transmission in the model is implemented with kinetic modelling, facilitating the linking of low-level cellular attributes with high-level population dynamics. The model is parameterised and tuned to simulate both Local Field Potential (LFP) of LGN and electroencephalogram (EEG) of visual cortex in an awake resting state with eyes closed and dominant frequency within the alpha (8-13 Hz) band. The results show that: First, the response of the TRN is suppressed in the presence of IN in the circuit; disconnecting the IN from the circuit effects a dramatic change in the model output, displaying high amplitude synchronous oscillations within the alpha band in both TCR and TRN. These observations conform to experimental reports implicating the IN as the primary inhibitory modulator of LGN dynamics in a cognitive state, and that reduced cognition is achieved by suppressing the TRN response. Second, the model validates steady state visually evoked potential response in humans corresponding to periodic input stimuli; however, when the IN is disconnected from the circuit, the output power spectra do not reflect the input frequency. This agrees with experimental reports underpinning the role of IN in efficient retino-geniculate information transmission. Third, a smooth transition from alpha to theta band is
A comparison of algorithms for inference and learning in probabilistic graphical models.
Frey, Brendan J; Jojic, Nebojsa
2005-09-01
Research into methods for reasoning under uncertainty is currently one of the most exciting areas of artificial intelligence, largely because it has recently become possible to record, store, and process large amounts of data. While impressive achievements have been made in pattern classification problems such as handwritten character recognition, face detection, speaker identification, and prediction of gene function, it is even more exciting that researchers are on the verge of introducing systems that can perform large-scale combinatorial analyses of data, decomposing the data into interacting components. For example, computational methods for automatic scene analysis are now emerging in the computer vision community. These methods decompose an input image into its constituent objects, lighting conditions, motion patterns, etc. Two of the main challenges are finding effective representations and models in specific applications and finding efficient algorithms for inference and learning in these models. In this paper, we advocate the use of graph-based probability models and their associated inference and learning algorithms. We review exact techniques and various approximate, computationally efficient techniques, including iterated conditional modes, the expectation maximization (EM) algorithm, Gibbs sampling, the mean field method, variational techniques, structured variational techniques and the sum-product algorithm ("loopy" belief propagation). We describe how each technique can be applied in a vision model of multiple, occluding objects and contrast the behaviors and performances of the techniques using a unifying cost function, free energy.
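The sum-product algorithm reviewed above is easy to sketch in its exact, tree-structured form. The toy below (chain length, unary potentials, and pairwise coupling are invented for illustration, not taken from the paper) computes exact marginals on a three-node binary chain; the "loopy" variant simply iterates the same message updates on graphs with cycles:

```python
# Sum-product (belief propagation) on a 3-node chain of binary variables.
# Exact on trees; potentials below are illustrative only.

def sum_product_chain(unary, pairwise):
    """Marginals for a chain MRF via forward/backward message passing."""
    n = len(unary)
    # forward messages: fwd[i][x] = message arriving at node i from the left
    fwd = [[1.0, 1.0] for _ in range(n)]
    for i in range(1, n):
        for x in (0, 1):
            fwd[i][x] = sum(fwd[i - 1][y] * unary[i - 1][y] * pairwise[y][x]
                            for y in (0, 1))
    # backward messages: bwd[i][x] = message arriving at node i from the right
    bwd = [[1.0, 1.0] for _ in range(n)]
    for i in range(n - 2, -1, -1):
        for x in (0, 1):
            bwd[i][x] = sum(bwd[i + 1][y] * unary[i + 1][y] * pairwise[x][y]
                            for y in (0, 1))
    # marginal at node i is proportional to fwd * unary * bwd
    marginals = []
    for i in range(n):
        unnorm = [fwd[i][x] * unary[i][x] * bwd[i][x] for x in (0, 1)]
        z = sum(unnorm)
        marginals.append([u / z for u in unnorm])
    return marginals

unary = [[0.9, 0.1], [0.5, 0.5], [0.5, 0.5]]   # per-node evidence
pairwise = [[0.8, 0.2], [0.2, 0.8]]            # smoothness coupling
m = sum_product_chain(unary, pairwise)
```

Each message costs O(k^2) for k states, so the whole chain is linear in its length, which is exactly the efficiency argument for graph-based inference over brute-force enumeration.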
PIM Pedagogy: Toward a Loosely Unified Model for Teaching and Studying Comics and Graphic Novels
Carter, James B.
2015-01-01
The article debuts and explains "PIM" pedagogy, a construct for teaching comics at the secondary- and post-secondary levels and for deep reading/studying comics. The PIM model for considering comics is actually based in major precepts of education studies, namely constructivist foundations of learning, and loosely unifies constructs…
Phillips, D. T.; Manseur, B.; Foster, J. W.
1982-01-01
Alternate definitions of system failure create complex analysis for which analytic solutions are available only for simple, special cases. The GRASP methodology is a computer simulation approach for solving all classes of problems in which both failure and repair events are modeled according to the probability laws of the individual components of the system.
2017-08-01
The system used for its GPU computing capability during the experiment has Nvidia Tesla K40 GPU accelerators containing 32 GPU nodes consisting of 1024 cores. CUDA is a parallel computing platform and application programming interface (API) model that was created and designed by Nvidia to give direct access to the GPU.
Formal Model for Data Dependency Analysis between Controls and Actions of a Graphical User Interface
Directory of Open Access Journals (Sweden)
SKVORC, D.
2012-02-01
Full Text Available End-user development is an emerging computer science discipline that provides programming paradigms, techniques, and tools suitable for users not trained in software engineering. One of the techniques that allow ordinary computer users to develop their own applications without the need to learn a classic programming language is a GUI-level programming based on programming-by-demonstration. To build wizard-based tools that assist users in application development and to verify the correctness of user programs, a computer-supported method for GUI-level data dependency analysis is necessary. Therefore, formal model for GUI representation is needed. In this paper, we present a finite state machine for modeling the data dependencies between GUI controls and GUI actions. Furthermore, we present an algorithm for automatic construction of finite state machine for arbitrary GUI application. We show that proposed state aggregation scheme successfully manages state explosion in state machine construction algorithm, which makes the model applicable for applications with complex GUIs.
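A finite state machine over GUI controls and actions can be sketched minimally as follows (the control names, the action, and the read-set bookkeeping are invented for illustration; the paper's formal model and its state-aggregation scheme are considerably richer):

```python
# Minimal finite-state-machine sketch of GUI control/action dependencies.
# The state is the set of control values; firing an action transitions the
# state, and the recorded read-set gives a crude data-dependency relation.

class GuiStateMachine:
    def __init__(self, initial):
        self.state = dict(initial)   # control name -> current value
        self.transitions = {}        # action name -> state-transition function
        self.reads = {}              # action name -> controls it reads

    def register(self, action, reads, effect):
        """Register an action with the controls it reads and its effect."""
        self.transitions[action] = effect
        self.reads[action] = set(reads)

    def fire(self, action):
        """Apply the action's transition function to the current state."""
        self.state = self.transitions[action](dict(self.state))
        return self.state

    def depends_on(self, action, control):
        """Data dependency: does the action read the given control?"""
        return control in self.reads[action]

# Hypothetical two-control GUI: typing a name enables the submit button.
gui = GuiStateMachine({"name": "", "submit_enabled": False})
gui.register("type_name", reads={"name"},
             effect=lambda s: {**s, "name": "Ada", "submit_enabled": True})
gui.fire("type_name")
```

State aggregation, as the paper argues, is what keeps such machines tractable: without it, the state space grows with the product of all control value domains.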
Overview of the Graphical User Interface for the GERM Code (GCR Event-Based Risk Model)
Kim, Myung-Hee; Cucinotta, Francis A.
2010-01-01
The descriptions of biophysical events from heavy ions are of interest in radiobiology, cancer therapy, and space exploration. The biophysical description of the passage of heavy ions in tissue and shielding materials is best described by a stochastic approach that includes both ion track structure and nuclear interactions. A new computer model called the GCR Event-based Risk Model (GERM) code was developed for the description of biophysical events from heavy ion beams at the NASA Space Radiation Laboratory (NSRL). The GERM code calculates basic physical and biophysical quantities of high-energy protons and heavy ions that have been studied at NSRL for the purpose of simulating space radiobiological effects. For mono-energetic beams, the code evaluates the linear-energy transfer (LET), range (R), and absorption in tissue equivalent material for a given Charge (Z), Mass Number (A) and kinetic energy (E) of an ion. In addition, a set of biophysical properties are evaluated such as the Poisson distribution of ion or delta-ray hits for a specified cellular area, cell survival curves, and mutation and tumor probabilities. The GERM code also calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle. The contributions from primary ion and nuclear secondaries are evaluated. The GERM code accounts for the major nuclear interaction processes of importance for describing heavy ion beams, including nuclear fragmentation, elastic scattering, and knockout-cascade processes by using the quantum multiple scattering fragmentation (QMSFRG) model. The QMSFRG model has been shown to be in excellent agreement with available experimental data for nuclear fragmentation cross sections, and has been used by the GERM code for application to thick target experiments. The GERM code provides scientists participating in NSRL experiments with the data needed for the interpretation of their
Energy Technology Data Exchange (ETDEWEB)
Fierro, Andrew, E-mail: andrew.fierro@ttu.edu; Dickens, James; Neuber, Andreas [Center for Pulsed Power and Power Electronics, Department of Electrical and Computer Engineering, Texas Tech University, Lubbock, Texas 79409 (United States)
2014-12-15
A 3-dimensional particle-in-cell/Monte Carlo collision simulation that is fully implemented on a graphics processing unit (GPU) is described and used to determine low-temperature plasma characteristics at high reduced electric field, E/n, in nitrogen gas. Details of implementation on the GPU using the NVIDIA Compute Unified Device Architecture framework are discussed with respect to efficient code execution. The software is capable of tracking around 10 × 10^6 particles with dynamic weighting and a total mesh size larger than 10^8 cells. Verification of the simulation is performed by comparing the electron energy distribution function and plasma transport parameters to known Boltzmann Equation (BE) solvers. Under the assumption of a uniform electric field and neglecting the build-up of positive ion space charge, the simulation agrees well with the BE solvers. The model is utilized to calculate plasma characteristics of a pulsed, parallel plate discharge. A photoionization model provides the simulation with additional electrons after the initial seeded electron density has drifted towards the anode. Comparison of the performance benefits between the GPU-implementation versus a CPU-implementation is considered, and a speed-up factor of 13 for a 3D relaxation Poisson solver is obtained. Furthermore, a factor 60 speed-up is realized for parallelization of the electron processes.
Wilderjans, Tom F; Ceulemans, Eva; Van Mechelen, Iven; Depril, Dirk
2011-03-01
In many areas of psychology, one is interested in disclosing the underlying structural mechanisms that generated an object-by-variable data set. Often, based on theoretical or empirical arguments, it may be expected that these underlying mechanisms imply that the objects are grouped into clusters that are allowed to overlap (i.e., an object may belong to more than one cluster). In such cases, analyzing the data with Mirkin's additive profile clustering model may be appropriate. In this model: (1) each object may belong to no, one or several clusters, (2) there is a specific variable profile associated with each cluster, and (3) the scores of the objects on the variables can be reconstructed by adding the cluster-specific variable profiles of the clusters the object in question belongs to. Until now, however, no software program has been publicly available to perform an additive profile clustering analysis. For this purpose, in this article, the ADPROCLUS program, steered by a graphical user interface, is presented. We further illustrate its use by means of the analysis of a patient-by-symptom data matrix.
LIMO EEG: a toolbox for hierarchical LInear MOdeling of ElectroEncephaloGraphic data.
Pernet, Cyril R; Chauveau, Nicolas; Gaspar, Carl; Rousselet, Guillaume A
2011-01-01
Magnetic- and electric-evoked brain responses have traditionally been analyzed by comparing the peaks or mean amplitudes of signals from selected channels and averaged across trials. More recently, tools have been developed to investigate single trial response variability (e.g., EEGLAB) and to test differences between averaged evoked responses over the entire scalp and time dimensions (e.g., SPM, Fieldtrip). LIMO EEG is a Matlab toolbox (EEGLAB compatible) to analyse evoked responses over all space and time dimensions, while accounting for single trial variability using a simple hierarchical linear modelling of the data. In addition, LIMO EEG provides robust parametric tests, therefore providing a new and complementary tool in the analysis of neural evoked responses.
Nelson, D. P.
1981-01-01
A graphical presentation of the aerodynamic data acquired during coannular nozzle performance wind tunnel tests is given. The graphical data consist of plots of nozzle gross thrust coefficient, fan nozzle discharge coefficient, and primary nozzle discharge coefficient. Normalized model component static pressure distributions are presented as a function of primary total pressure, fan total pressure, and ambient static pressure for selected operating conditions. In addition, the supersonic cruise configuration data include plots of nozzle efficiency and secondary-to-fan total pressure pumping characteristics. Supersonic and subsonic cruise data are given.
Kriston, Levente; Melchior, Hanne; Hergert, Anika; Bergelt, Corinna; Watzke, Birgit; Schulz, Holger; von Wolff, Alessa
2011-01-01
The aim of our study was to develop a graphical tool that can be used in addition to standard statistical criteria to support decisions on the number of classes in explorative categorical latent variable modeling for rehabilitation research. Data from two rehabilitation research projects were used. In the first study, a latent profile analysis was…
Directory of Open Access Journals (Sweden)
Shonin M.Yu.
2017-02-01
Full Text Available The work is devoted to creating a mathematical model for solving problems on the use of raw materials and the drawing up of a diet. The authors solve these problems numerically by means of a graphical method.
Derringer, Cory; Rottman, Benjamin Margolin
2018-05-01
Four experiments tested how people learn cause-effect relations when there are many possible causes of an effect. When there are many cues, even if all the cues together strongly predict the effect, the bivariate relation between each individual cue and the effect can be weak, which can make it difficult to detect the influence of each cue. We hypothesized that when detecting the influence of a cue, in addition to learning from the states of the cues and effect (e.g., a cue is present and the effect is present), which is hypothesized by multiple existing theories of learning, participants would also learn from transitions - how the cues and effect change over time (e.g., a cue turns on and the effect turns on). We found that participants were better able to identify positive and negative cues in an environment in which only one cue changed from one trial to the next, compared to multiple cues changing (Experiments 1A, 1B). Within a single learning sequence, participants were also more likely to update their beliefs about causal strength when one cue changed at a time ('one-change transitions') than when multiple cues changed simultaneously (Experiment 2). Furthermore, learning was impaired when the trials were grouped by the state of the effect (Experiment 3) or when the trials were grouped by the state of a cue (Experiment 4), both of which reduce the number of one-change transitions. We developed a modification of the Rescorla-Wagner algorithm to model this 'Informative Transitions' learning process. Copyright © 2018 Elsevier Inc. All rights reserved.
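The baseline the authors modify is the classic Rescorla-Wagner update, in which only the cues present on a trial are adjusted, in proportion to the prediction error. A minimal sketch (the learning rate and trial sequence here are invented for illustration, not the authors' parameters):

```python
# Classic Rescorla-Wagner learning: V[c] += alpha * (outcome - prediction)
# for each cue c present on a trial, where the prediction is the summed
# associative strength of the present cues.

def rescorla_wagner(trials, alpha=0.3, n_cues=2):
    """trials: list of (cues_present, outcome); cues_present is a list of cue indices."""
    V = [0.0] * n_cues                      # associative strength per cue
    for cues, outcome in trials:
        prediction = sum(V[c] for c in cues)
        error = outcome - prediction        # prediction error drives learning
        for c in cues:                      # only cues present on the trial update
            V[c] += alpha * error
    return V

# cue 0 always paired with the outcome; cue 1 only seen without it
trials = [([0], 1.0)] * 20 + [([1], 0.0)] * 20
V = rescorla_wagner(trials)
```

Note that this baseline reacts only to the states of cues on each trial; the authors' "Informative Transitions" modification additionally weights trial-to-trial changes, which this sketch deliberately omits.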
Miranda, Diogo Julien; Chao, Lung Wen
2018-03-01
Preliminary studies suggest the need of a global vision in academic reform, leading to education re-invention. This would include problem-based education using transversal topics, developing of thinking skills, social interaction, and information-processing skills. We aimed to develop a new educational model in health with modular components to be broadcast and applied as a tele-education course. We developed a systematic model based on a "Skills and Goals Matrix" to adapt scientific contents on fictional screenplays, three-dimensional (3D) computer graphics of the human body, and interactive documentaries. We selected 13 topics based on youth vulnerabilities in Brazil to be disseminated through a television show with 15 episodes. We developed scientific content for each theme, naturally inserting it into screenplays, together with 3D sequences and interactive documentaries. The modular structure was then adapted to a distance-learning course. The television show was broadcast on national television for two consecutive years to an estimated audience of 30 million homes, and ever since on an Internet Protocol Television (IPTV) channel. It was also reorganized as a tele-education course for 2 years, reaching 1,180 subscriptions from all 27 Brazilian states, resulting in 240 graduates. Positive results indicate the feasibility, acceptability, and effectiveness of a model of modular entertainment audio-visual productions using health and education integrated concepts. This structure also allowed the model to be interconnected with other sources and applied as tele-education course, educating, informing, and stimulating the behavior change. Future works should reinforce this joint structure of telehealth, communication, and education.
WE-E-BRE-05: Ensemble of Graphical Models for Predicting Radiation Pneumonitis Risk
Energy Technology Data Exchange (ETDEWEB)
Lee, S; Ybarra, N; Jeyaseelan, K; El Naqa, I [McGill University, Montreal, Quebec (Canada); Faria, S; Kopek, N [Montreal General Hospital, Montreal, Quebec (Canada)
2014-06-15
Purpose: We propose a prior knowledge-based approach to construct an interaction graph of biological and dosimetric radiation pneumonitis (RP) covariates for the purpose of developing a RP risk classifier. Methods: We recruited 59 NSCLC patients who received curative radiotherapy with minimum 6 month follow-up. 16 RP events were observed (CTCAE grade ≥2). Blood serum was collected from every patient before (pre-RT) and during RT (mid-RT). From each sample the concentrations of the following five candidate biomarkers were taken as covariates: alpha-2-macroglobulin (α2M), angiotensin converting enzyme (ACE), transforming growth factor β (TGF-β), interleukin-6 (IL-6), and osteopontin (OPN). Dose-volumetric parameters were also included as covariates. The number of biological and dosimetric covariates was reduced by a variable selection scheme implemented by L1-regularized logistic regression (LASSO). Posterior probability distribution of interaction graphs between the selected variables was estimated from the data under the literature-based prior knowledge to weight more heavily the graphs that contain the expected associations. A graph ensemble was formed by averaging the most probable graphs weighted by their posterior, creating a Bayesian Network (BN)-based RP risk classifier. Results: The LASSO selected the following 7 RP covariates: (1) pre-RT concentration level of α2M, (2) α2M level mid-RT/pre-RT, (3) pre-RT IL6 level, (4) IL6 level mid-RT/pre-RT, (5) ACE mid-RT/pre-RT, (6) PTV volume, and (7) mean lung dose (MLD). The ensemble BN model achieved the maximum sensitivity/specificity of 81%/84% and outperformed univariate dosimetric predictors as shown by larger AUC values (0.78∼0.81) compared with MLD (0.61), V20 (0.65) and V30 (0.70). The ensembles obtained by incorporating the prior knowledge improved classification performance for the ensemble size 5∼50. Conclusion: We demonstrated a probabilistic ensemble method to detect robust associations between
Putting a cap on causality violations in causal dynamical triangulations
International Nuclear Information System (INIS)
Ambjoern, Jan; Loll, Renate; Westra, Willem; Zohren, Stefan
2007-01-01
The formalism of causal dynamical triangulations (CDT) provides us with a non-perturbatively defined model of quantum gravity, where the sum over histories includes only causal space-time histories. Path integrals of CDT and their continuum limits have been studied in two, three and four dimensions. Here we investigate a generalization of the two-dimensional CDT model, where the causality constraint is partially lifted by introducing branching points with a weight g_s, and demonstrate that the system can be solved analytically in the genus-zero sector. The solution is analytic in a neighborhood around weight g_s = 0 and cannot be analytically continued to g_s = ∞, where the branching is entirely geometric and where one would formally recover standard Euclidean two-dimensional quantum gravity defined via dynamical triangulations or Liouville theory.
Stereoscopic 3D graphics generation
Li, Zhi; Liu, Jianping; Zan, Y.
1997-05-01
Stereoscopic display technology is one of the key techniques in areas such as simulation, multimedia, entertainment, and virtual reality. Moreover, stereoscopic 3D graphics generation is an important part of a stereoscopic 3D display system. In this paper, we first describe the principle of stereoscopic display and summarize some methods to generate stereoscopic 3D graphics. Secondly, to overcome the problems of user-defined-model methods (such as inconvenience and long modification cycles), we put forward a method based on vector graphics file definitions. This allows us to design more directly, modify models simply and easily, generate graphics more conveniently, and make full use of the graphics accelerator card. Finally, we discuss how to speed up the generation.
ANCIENT SHIPYARD ON TURKEY’S DANA ISLAND: ITS 3D MODELLING WITH PHOTOGRAMMETRY AND COMPUTER GRAPHICS
Directory of Open Access Journals (Sweden)
A. Denker
2018-05-01
Full Text Available Although a small island 2 km off the southern coast of Turkey, Dana Island offers a rich history which is likely to shed light upon the Dark Ages. Starting from 2015, our archaeological team discovered 274 shipsheds/slipways there through continuing coastal and underwater excavations. This discovery places Dana Island among the biggest shipyards of antiquity. The slipways varied in dimensions suitable for vessels of different sizes, from small boats to large warships. Historical sources suggest that the name of the island may stem from Yadnana, Yadana or Adana, which was mentioned in an Assyrian tablet of the 8th century BC as an island in the vicinity of Cyprus. Archaeological evidence exists that shows Dana Island had played a significant role in seamanship activities in the Levant starting from Neolithic times. A substantial part of the naval campaigns must have involved Dana Island, which used to be the biggest shipyard/naval base of the Eastern Mediterranean. A 3D model of the island has been made by using photogrammetry and computer graphics methods, and simulations were executed to check the hypotheses related to the involvement of Dana Island in the major sea battles of antiquity, such as the Sea Battle of Lade in 495 BC.
Directory of Open Access Journals (Sweden)
K S Mwitondi
2013-05-01
Full Text Available Differences in modelling techniques and model performance assessments typically impinge on the quality of knowledge extraction from data. We propose an algorithm for determining optimal patterns in data by separately training and testing three decision tree models in the Pima Indians Diabetes and the Bupa Liver Disorders datasets. Model performance is assessed using ROC curves and the Youden Index. Moving differences between sequential fitted parameters are then extracted, and their respective probability density estimations are used to track their variability using an iterative graphical data visualisation technique developed for this purpose. Our results show that the proposed strategy separates the groups more robustly than the plain ROC/Youden approach, eliminates obscurity, and minimizes over-fitting. Further, the algorithm can easily be understood by non-specialists and demonstrates multi-disciplinary compliance.
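The Youden Index used above for performance assessment is simply J = sensitivity + specificity - 1, maximized over candidate thresholds on the classifier's scores. A small sketch (the scores and labels are made-up illustration data, not drawn from the Pima or Bupa datasets):

```python
# Youden's J statistic over candidate thresholds: J = sens + spec - 1.
# Each distinct score is tried as a "predict positive if score >= t" cutoff.

def youden_best_threshold(scores, labels):
    """Return (best J, threshold) for binary classifier scores and 0/1 labels."""
    best = (-1.0, None)
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        fn = sum(1 for s, y in zip(scores, labels) if s < t and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < t and y == 0)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 0)
        sens = tp / (tp + fn)       # true positive rate
        spec = tn / (tn + fp)       # true negative rate
        j = sens + spec - 1.0
        if j > best[0]:
            best = (j, t)
    return best

scores = [0.1, 0.4, 0.35, 0.8, 0.9, 0.7]
labels = [0, 0, 1, 1, 1, 0]
```

Geometrically, J is the maximum vertical distance between the ROC curve and the chance diagonal, which is why the article pairs it with ROC analysis.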
Kim, Seehyung
2005-01-01
This research develops and tests a model of the service unit ownership and control patterns used by international service companies. The main purpose of this study is to investigate trivariate causal relationships among environmental factors, organizational structure, and perceived performance in the internationalization process of service firms. A service firm operating in foreign soil has a choice of three general entry mode strategies offering different degrees of ownership and control of ...
Causal imprinting in causal structure learning.
Taylor, Eric G; Ahn, Woo-Kyoung
2012-11-01
Suppose one observes a correlation between two events, B and C, and infers that B causes C. Later one discovers that event A explains away the correlation between B and C. Normatively, one should now dismiss or weaken the belief that B causes C. Nonetheless, participants in the current study who observed a positive contingency between B and C followed by evidence that B and C were independent given A, persisted in believing that B causes C. The authors term this difficulty in revising initially learned causal structures "causal imprinting." Throughout four experiments, causal imprinting was obtained using multiple dependent measures and control conditions. A Bayesian analysis showed that causal imprinting may be normative under some conditions, but causal imprinting also occurred in the current study when it was clearly non-normative. It is suggested that causal imprinting occurs due to the influence of prior knowledge on how reasoners interpret later evidence. Consistent with this view, when participants first viewed the evidence showing that B and C are independent given A, later evidence with only B and C did not lead to the belief that B causes C. Copyright © 2012 Elsevier Inc. All rights reserved.
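The study's design hinges on B and C being marginally dependent yet independent given A. A minimal enumeration sketch, with hypothetical probabilities (not the study's stimuli), showing how a variable A that accounts for the B–C correlation produces exactly this pattern:

```python
from itertools import product

# Hypothetical structure: A -> B and A -> C, so A accounts for the
# correlation between B and C.
p_a = {1: 0.5, 0: 0.5}
p_b_given_a = {1: 0.9, 0: 0.1}   # P(B=1 | A=a)
p_c_given_a = {1: 0.9, 0: 0.1}   # P(C=1 | A=a)

def joint(a, b, c):
    pb = p_b_given_a[a] if b else 1 - p_b_given_a[a]
    pc = p_c_given_a[a] if c else 1 - p_c_given_a[a]
    return p_a[a] * pb * pc

def prob(**fixed):
    """Marginal probability of the fixed assignment, by enumeration."""
    return sum(joint(a, b, c) for a, b, c in product([0, 1], repeat=3)
               if all(dict(a=a, b=b, c=c)[k] == v for k, v in fixed.items()))

# Marginally, B and C are dependent...
p_b = prob(b=1)                            # 0.5
p_b_given_c = prob(b=1, c=1) / prob(c=1)   # 0.82 > 0.5
# ...but given A they are independent, so C adds nothing about B.
p_b_given_ac = prob(a=1, b=1, c=1) / prob(a=1, c=1)  # 0.9 = P(B=1|A=1)
```

Normatively, once the conditional independence is observed, the inferred B→C link should be dropped; causal imprinting is the failure to do so.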
Bose, Eliezer; Hravnak, Marilyn; Sereika, Susan M
Patients undergoing continuous vital sign monitoring (heart rate [HR], respiratory rate [RR], pulse oximetry [SpO2]) in real time display interrelated vital sign changes during situations of physiological stress. Patterns in this physiological cross-talk could portend impending cardiorespiratory instability (CRI). Vector autoregressive (VAR) modeling with Granger causality tests is one of the most flexible ways to elucidate underlying causal mechanisms in time series data. The purpose of this article is to illustrate the development of patient-specific VAR models using vital sign time series data in a sample of acutely ill, monitored, step-down unit patients and determine their Granger causal dynamics prior to onset of an incident CRI. CRI was defined as vital signs beyond stipulated normality thresholds (HR = 40-140/minute, RR = 8-36/minute, SpO2 < 85%) and persisting for 3 minutes within a 5-minute moving window (60% of the duration of the window). A 6-hour time segment prior to onset of first CRI was chosen for time series modeling in 20 patients using a six-step procedure: (a) the uniform time series for each vital sign was assessed for stationarity, (b) appropriate lag was determined using a lag-length selection criteria, (c) the VAR model was constructed, (d) residual autocorrelation was assessed with the Lagrange Multiplier test, (e) stability of the VAR system was checked, and (f) Granger causality was evaluated in the final stable model. The primary cause of incident CRI was low SpO2 (60% of cases), followed by out-of-range RR (30%) and HR (10%). Granger causality testing revealed that change in RR caused change in HR (21%; i.e., RR changed before HR changed) more often than change in HR causing change in RR (15%). Similarly, changes in RR caused changes in SpO2 (15%) more often than changes in SpO2 caused changes in RR (9%). For HR and SpO2, changes in HR causing changes in SpO2 and changes in SpO2 causing changes in HR occurred with equal frequency (18%). Within this sample of acutely ill patients who experienced a CRI event, VAR modeling indicated that RR changes
Bose, Eliezer; Hravnak, Marilyn; Sereika, Susan M.
2016-01-01
Background Patients undergoing continuous vital sign monitoring (heart rate [HR], respiratory rate [RR], pulse oximetry [SpO2]) in real time display inter-related vital sign changes during situations of physiologic stress. Patterns in this physiological cross-talk could portend impending cardiorespiratory instability (CRI). Vector autoregressive (VAR) modeling with Granger causality tests is one of the most flexible ways to elucidate underlying causal mechanisms in time series data. Purpose The purpose of this article is to illustrate development of patient-specific VAR models using vital sign time series (VSTS) data in a sample of acutely ill, monitored, step-down unit (SDU) patients, and determine their Granger causal dynamics prior to onset of an incident CRI. Approach CRI was defined as vital signs beyond stipulated normality thresholds (HR = 40–140/minute, RR = 8–36/minute, SpO2 < 85%) and persisting for 3 minutes within a 5-minute moving window (60% of the duration of the window). A 6-hour time segment prior to onset of first CRI was chosen for time series modeling in 20 patients using a six-step procedure: (a) the uniform time series for each vital sign was assessed for stationarity; (b) appropriate lag was determined using a lag-length selection criteria; (c) the VAR model was constructed; (d) residual autocorrelation was assessed with the Lagrange Multiplier test; (e) stability of the VAR system was checked; and (f) Granger causality was evaluated in the final stable model. Results The primary cause of incident CRI was low SpO2 (60% of cases), followed by out-of-range RR (30%) and HR (10%). Granger causality testing revealed that change in RR caused change in HR (21%) (i.e., RR changed before HR changed) more often than change in HR causing change in RR (15%). Similarly, changes in RR caused changes in SpO2 (15%) more often than changes in SpO2 caused changes in RR (9%). For HR and SpO2, changes in HR causing changes in SpO2 and changes in SpO2 causing
Hierarchical organisation of causal graphs
International Nuclear Information System (INIS)
Dziopa, P.
1993-01-01
This paper deals with the design of a supervision system using a hierarchy of models formed by graphs, in which the variables are the nodes and the causal relations between the variables are the arcs. To obtain a representation of the variables' evolutions which contains only the relevant features of their real evolutions, the causal relations are completed with qualitative transfer functions (QTFs) which roughly reproduce the behaviour of classical transfer functions. Major improvements have been made in the building of the hierarchical organization. First, the basic variables of the uppermost level and the causal relations between them are chosen. The next graph is built by adding intermediary variables to the upper graph. When the undermost graph has been built, the transfer function parameters corresponding to its causal relations are identified. The second task consists in propagating the information upwards from the undermost graph to the uppermost one. A fusion procedure for the causal relations has been designed to compute the QTFs relevant for each level. This procedure aims to reduce the number of parameters needed to represent an evolution at a high level of abstraction. These techniques have been applied to the hierarchical modelling of a nuclear process. (authors). 8 refs., 12 figs
A General Approach to Causal Mediation Analysis
Imai, Kosuke; Keele, Luke; Tingley, Dustin
2010-01-01
Traditionally in the social sciences, causal mediation analysis has been formulated, understood, and implemented within the framework of linear structural equation models. We argue and demonstrate that this is problematic for 3 reasons: the lack of a general definition of causal mediation effects independent of a particular statistical model, the…
Causal Indicators Can Help to Interpret Factors
Bentler, Peter M.
2016-01-01
The latent factor in a causal indicator model is no more than the latent factor of the factor part of the model. However, if the causal indicator variables are well-understood and help to improve the prediction of individuals' factor scores, they can help to interpret the meaning of the latent factor. Aguirre-Urreta, Rönkkö, and Marakas (2016)…
International Nuclear Information System (INIS)
Yang, Jin Seok
1993-04-01
This book describes the basics of graphics and the understanding and realization of graphic file formats. The first part deals with graphic data, the storage and compression of graphic data, programming topics such as assembly, the stack, and the compiling and linking of programs, and practice and debugging. The next part covers graphic file formats such as the MacPaint file, GEM/IMG file, PCX file, GIF file and TIFF file, hardware considerations such as mono screen drivers and high-speed color screen drivers, the basic concept of dithering, and conversion between formats.
Bayesian networks improve causal environmental ...
Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, conventional weight of evidence approaches ignore sources of uncertainty that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improve the ability of models to incorporate strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on value
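The probabilistic integration of lines of evidence that the abstract describes reduces, in the smallest case, to Bayes' rule on a two-evidence diagnostic network. A hand-rolled sketch with hypothetical probabilities (a stressor S with two conditionally independent lines of evidence E1 and E2):

```python
# Hypothetical diagnostic network: stressor S -> evidence E1, S -> E2.
p_s = 0.2                      # prior that the stressor is present
p_e1 = {1: 0.8, 0: 0.3}        # P(E1 observed | S=s)
p_e2 = {1: 0.7, 0: 0.4}        # P(E2 observed | S=s)

def posterior(e1, e2):
    """P(S=1 | E1=e1, E2=e2) by enumeration over S."""
    def lik(s):
        l1 = p_e1[s] if e1 else 1 - p_e1[s]
        l2 = p_e2[s] if e2 else 1 - p_e2[s]
        return l1 * l2
    num = p_s * lik(1)
    return num / (num + (1 - p_s) * lik(0))

# Concordant lines of evidence strengthen the causal diagnosis more
# than a mixed pattern does:
print(posterior(1, 0))   # one positive, one negative line -> 0.25
print(posterior(1, 1))   # two concordant positive lines   -> ~0.54
```

Real ecological assessments use larger networks, but the mechanism for weighing evidence is the same conditional-probability calculus.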
McGeown, Laura; Davis, Ron
2018-02-15
The social modeling of eating effect refers to the consistently demonstrated phenomenon that individuals tend to match their quantity of food intake to that of their eating companion. The current study sought to explore whether activity within the mirror neuron system (MNS) mediates the social modeling of eating effect as a function of EEG frontal asymmetry and body mass index (BMI). Under the guise of rating empathy, 93 female undergraduates viewed a female video confederate "incidentally" consume either a low or high intake of chips while electroencephalogram (EEG) activity was recorded. Subsequent ad libitum chip consumption was quantified. A first- and second-stage dual moderation model revealed that frontal asymmetry and BMI moderated an indirect effect of model consumption on participants' food consumption as mediated by MNS activity at electrode site C3, a3b3 = -0.718, SE = 0.365, 95% CI [-1.632, -0.161]. Left frontal asymmetry was associated with greater mu activity and a positive association between model and participant chip consumption, while right frontal asymmetry was associated with less mu activity and a negative association between model and participant consumption. Across all levels of frontal asymmetry, the effect was only significant among those with a BMI at the 50th percentile or lower. Thus, among leaner individuals, the MNS was demonstrated to mediate social modeling of eating, as moderated by frontal asymmetry. These findings are integrated within the normative account of social modeling of eating. It is proposed that the normative framework may benefit from consideration of both conscious and unconscious operation of intake norms. Copyright © 2017 Elsevier B.V. All rights reserved.
Introductive remarks on causal inference
Directory of Open Access Journals (Sweden)
Silvana A. Romio
2013-05-01
Full Text Available One of the more challenging issues in epidemiological research is being able to provide an unbiased estimate of the causal exposure-disease effect, to assess the possible etiological mechanisms and the implications for public health. A major source of bias is confounding, which can spuriously create or mask the causal relationship. In the last ten years, methodological research has been developed to better define the concept of causation in epidemiology, and some important achievements have resulted in new statistical models. In this review, we aim to show how a technique well known to statisticians, i.e. standardization, can be seen as a method to estimate causal effects, equivalent under certain conditions to the inverse probability of treatment weighting procedure.
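The equivalence the review points to can be checked numerically on a toy dataset with a binary confounder L, treatment A and outcome Y (illustrative simulated data, not from the review): with a saturated propensity model, the standardized mean and the IPTW mean coincide exactly.

```python
import numpy as np

# Toy data: confounder L, treatment A, binary outcome Y (illustrative).
rng = np.random.default_rng(1)
n = 10_000
L = rng.binomial(1, 0.4, n)
A = rng.binomial(1, 0.2 + 0.5 * L)           # treatment depends on L
Y = rng.binomial(1, 0.1 + 0.3 * A + 0.2 * L)

def standardized_mean(a):
    """Standardization: sum_l E[Y | A=a, L=l] P(L=l)."""
    return sum(Y[(A == a) & (L == l)].mean() * (L == l).mean()
               for l in (0, 1))

def iptw_mean(a):
    """IPTW: E[ 1{A=a} Y / P(A=a | L) ] with a saturated propensity model."""
    p_a_given_l = np.array([(A[L == l] == a).mean() for l in (0, 1)])
    w = (A == a) / p_a_given_l[L]
    return np.mean(w * Y)

for a in (0, 1):
    print(standardized_mean(a), iptw_mean(a))  # equal up to rounding
```

With empirical (saturated) propensities the two estimators are algebraically identical; they diverge only when the propensity model is parametric and misspecified, which is where the "under certain conditions" caveat bites.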
A complete graphical criterion for the adjustment formula in mediation analysis.
Shpitser, Ilya; VanderWeele, Tyler J
2011-03-04
Various assumptions have been used in the literature to identify natural direct and indirect effects in mediation analysis. These effects are of interest because they allow for effect decomposition of a total effect into a direct and indirect effect even in the presence of interactions or non-linear models. In this paper, we consider the relation and interpretation of various identification assumptions in terms of causal diagrams interpreted as a set of non-parametric structural equations. We show that for such causal diagrams, two sets of assumptions for identification that have been described in the literature are in fact equivalent in the sense that if either set of assumptions holds for all models inducing a particular causal diagram, then the other set of assumptions will also hold for all models inducing that diagram. We moreover build on prior work concerning a complete graphical identification criterion for covariate adjustment for total effects to provide a complete graphical criterion for using covariate adjustment to identify natural direct and indirect effects. Finally, we show that this criterion is equivalent to the two sets of independence assumptions used previously for mediation analysis.
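Once the graphical criterion licenses covariate adjustment, natural direct and indirect effects are computed with the standard mediation formula. A minimal numerical sketch (hypothetical parameters, binary treatment and mediator, no treatment-mediator interaction and no unmeasured confounding assumed; not taken from the paper):

```python
# Hypothetical identified model:
#   E[Y | a, m] = b0 + b1*a + b2*m,   P(M=1 | A=a) as given below.
b0, b1, b2 = 0.2, 0.15, 0.30
p_m1 = {0: 0.3, 1: 0.7}          # P(M=1 | A=a)

def e_y(a, m):
    return b0 + b1 * a + b2 * m

def p_m(m, a):
    return p_m1[a] if m else 1 - p_m1[a]

def nde(a1=1, a0=0):
    """Natural direct effect: mediator held at its A=a0 distribution."""
    return sum((e_y(a1, m) - e_y(a0, m)) * p_m(m, a0) for m in (0, 1))

def nie(a1=1, a0=0):
    """Natural indirect effect: only the mediator distribution shifts."""
    return sum(e_y(a0, m) * (p_m(m, a1) - p_m(m, a0)) for m in (0, 1))

total = sum(e_y(1, m) * p_m(m, 1) for m in (0, 1)) \
      - sum(e_y(0, m) * p_m(m, 0) for m in (0, 1))
print(nde(), nie(), total)   # 0.15, 0.12, 0.27: TE = NDE + NIE here
```

The exact decomposition TE = NDE + NIE holds here because the outcome model has no treatment-mediator interaction; with an interaction the decomposition takes the cross-world form discussed in the mediation literature.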
National Oceanic and Atmospheric Administration, Department of Commerce — Forecast turbulence hazards identified by the Graphical Turbulence Guidance algorithm. The Graphical Turbulence Guidance product depicts mid-level and upper-level...
Graphical Turbulence Guidance - Composite
National Oceanic and Atmospheric Administration, Department of Commerce — Forecast turbulence hazards identified by the Graphical Turbulence Guidance algorithm. The Graphical Turbulence Guidance product depicts mid-level and upper-level...
International Nuclear Information System (INIS)
Wang, C.C.; Booth, A.W.; Chen, Y.M.; Botlo, M.
1993-06-01
At the Superconducting Super Collider Laboratory (SSCL) a tool called DAQSIM has been developed to study the behavior of Data Acquisition (DAQ) systems. This paper reports on and discusses the graphics used in DAQSIM. DAQSIM graphics includes graphical user interface (GUI), animation, debugging, and control facilities. DAQSIM graphics not only provides a convenient DAQ simulation environment; it also serves as an efficient manager in simulation development and verification.
Louis, Linda
2013-01-01
This article reports on the most recent phase of an ongoing research program that examines the artistic graphic representational behavior and paintings of children between the ages of four and seven. The goal of this research program is to articulate a contemporary account of artistic growth and to illuminate how young children's changing…
Campbell's and Rubin's Perspectives on Causal Inference
West, Stephen G.; Thoemmes, Felix
2010-01-01
Donald Campbell's approach to causal inference (D. T. Campbell, 1957; W. R. Shadish, T. D. Cook, & D. T. Campbell, 2002) is widely used in psychology and education, whereas Donald Rubin's causal model (P. W. Holland, 1986; D. B. Rubin, 1974, 2005) is widely used in economics, statistics, medicine, and public health. Campbell's approach focuses on…
Sinha, Shriprakash
2016-01-01
Simulation study in systems biology involving computational experiments dealing with Wnt signaling pathways abound in literature but often lack a pedagogical perspective that might ease the understanding of beginner students and researchers in transition, who intend to work on the modeling of the pathway. This paucity might happen due to restrictive business policies which enforce an unwanted embargo on the sharing of important scientific knowledge. A tutorial introduction to computational mo...
Kim, Jane Paik; Roberts, Laura Weiss
Empirical ethics inquiry works from the notion that stakeholder perspectives are necessary for gauging the ethical acceptability of human studies and assuring that research aligns with societal expectations. Although common, studies involving different populations often entail comparisons of trends that problematize the interpretation of results. Using graphical model selection - a technique aimed at transcending limitations of conventional methods - this report presents data on the ethics of clinical research with two objectives: (1) to display the patterns of views held by ill and healthy individuals in clinical research as a test of the study's original hypothesis and (2) to introduce graphical model selection as a key analytic tool for ethics research. In this IRB-approved, NIH-funded project, data were collected from 60 mentally ill and 43 physically ill clinical research protocol volunteers, 47 healthy protocol-consented participants, and 29 healthy individuals without research protocol experience. Respondents were queried on the ethical acceptability of research involving people with mental and physical illness (i.e., cancer, HIV, depression, schizophrenia, and post-traumatic stress disorder) and non-illness related sources of vulnerability (e.g., age, class, gender, ethnicity). Using a statistical algorithm, we selected graphical models to display interrelationships among responses to questions. Both mentally and physically ill protocol volunteers revealed a high degree of connectivity among ethically-salient perspectives. Healthy participants, irrespective of research protocol experience, revealed patterns of views that were not highly connected. Between ill and healthy protocol participants, the pattern of views is vastly different. Experience with illness was tied to dense connectivity, whereas healthy individuals expressed views with sparse connections. In offering a nuanced perspective on the interrelation of ethically relevant responses, graphical
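The report does not name its selection algorithm, but a widely used approach to selecting a graphical model from multivariate responses is the graphical lasso, which estimates a sparse precision matrix whose nonzero off-diagonal entries define the graph's edges (the "connectivity" the abstract describes). A sketch on synthetic stand-in data, not the study's survey responses:

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

# Synthetic stand-in for questionnaire responses: three items driven by
# a shared latent factor, plus one independent item.
rng = np.random.default_rng(0)
latent = rng.normal(size=(300, 1))
items = np.hstack([latent + 0.5 * rng.normal(size=(300, 3)),
                   rng.normal(size=(300, 1))])

# Cross-validated graphical lasso selects the sparsity level itself.
model = GraphicalLassoCV().fit(items)
precision = model.precision_            # sparse inverse covariance
# Edges of the selected graphical model = nonzero off-diagonal entries.
edges = [(i, j) for i in range(4) for j in range(i + 1, 4)
         if abs(precision[i, j]) > 1e-6]
print(edges)   # dense connections expected among the first three items
```

On real data of this kind, a densely connected precision matrix among the ill participants' responses versus a sparse one among healthy participants would reproduce the contrast the report describes.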
International Nuclear Information System (INIS)
Scott, W.A.; Turner, R.M.; McCammon, R.B.
1981-01-01
Integrated logic circuits were described as a means of formally representing genetic-geologic models for estimating undiscovered uranium resources. The logic circuits are logical combinations of selected geologic characteristics judged to be associated with particular types of uranium deposits. Each combination takes on a value which corresponds to the combined presence, absence, or "don't know" states of the selected characteristics within a specified geographic cell. Within each cell, the output of the logic circuit is taken as a measure of the favorability of occurrence of an undiscovered deposit of the type being considered. In this way, geological, geochemical, and geophysical data are incorporated explicitly into potential uranium resource estimates. The present report describes how integrated logic circuits are constructed by use of a computer graphics program. A user's guide is also included
Arvo, James
1991-01-01
Graphics Gems II is a collection of articles shared by a diverse group of people that reflect ideas and approaches in graphics programming which can benefit other computer graphics programmers.This volume presents techniques for doing well-known graphics operations faster or easier. The book contains chapters devoted to topics on two-dimensional and three-dimensional geometry and algorithms, image processing, frame buffer techniques, and ray tracing techniques. The radiosity approach, matrix techniques, and numerical and programming techniques are likewise discussed.Graphics artists and comput
Repeated Causal Decision Making
Hagmayer, York; Meder, Bjorn
2013-01-01
Many of our decisions refer to actions that have a causal impact on the external environment. Such actions may not only allow for the mere learning of expected values or utilities but also for acquiring knowledge about the causal structure of our world. We used a repeated decision-making paradigm to examine what kind of knowledge people acquire in…
International Nuclear Information System (INIS)
Novello, M.; Salim, J.M.; Torres, J.; Oliveira, H.P. de
1989-01-01
A set of spatially homogeneous and isotropic cosmological geometries generated by a class of non-perfect fluids is investigated. The irreversibility of this system is studied in the context of causal thermodynamics, which provides a useful mechanism for ensuring that the causal principle is not violated. (author) [pt
Causality in Classical Electrodynamics
Savage, Craig
2012-01-01
Causality in electrodynamics is a subject of some confusion, especially regarding the application of Faraday's law and the Ampere-Maxwell law. This has led to the suggestion that we should not teach students that electric and magnetic fields can cause each other, but rather focus on charges and currents as the causal agents. In this paper I argue…
Causality in Europeanization Research
DEFF Research Database (Denmark)
Lynggaard, Kennet
2012-01-01
Discourse analysis as a methodology is perhaps not readily associated with substantive causality claims. At the same time the study of discourses is very much the study of conceptions of causal relations among a set, or sets, of agents. Within Europeanization research we have seen endeavours to develop discursive institutional analytical frameworks and something that comes close to the formulation of hypotheses on the effects of European Union (EU) policies and institutions on domestic change. Even if these efforts so far do not necessarily amount to substantive theories or claims of causality, it suggests that discourse analysis and the study of causality are by no means opposites. The study of Europeanization discourses may even be seen as an essential step in the move towards claims of causality in Europeanization research. This chapter deals with the question of how we may move from the study...
Directory of Open Access Journals (Sweden)
Thomas eWidlok
2014-11-01
Full Text Available Cognitive Scientists interested in causal cognition increasingly search for evidence from non-WEIRD people but find only very few cross-cultural studies that specifically target causal cognition. This article suggests how information about causality can be retrieved from ethnographic monographs, specifically from ethnographies that discuss agency and concepts of time. Many apparent cultural differences with regard to causal cognition dissolve when cultural extensions of agency and personhood to non-humans are taken into account. At the same time considerable variability remains when we include notions of time, linearity and sequence. The article focuses on ethnographic case studies from Africa but provides a more general perspective on the role of ethnography in research on the diversity and universality of causal cognition.
Graphical programming at Sandia National Laboratories
International Nuclear Information System (INIS)
McDonald, M.J.; Palmquist, R.D.; Desjarlais, L.
1993-09-01
Sandia has developed an advanced operational control system approach, called Graphical Programming, to design, program, and operate robotic systems. The Graphical Programming approach produces robot systems that are faster to develop and use, safer in operation, and cheaper overall than alternative teleoperation or autonomous robot control systems. Graphical Programming also provides an efficient and easy-to-use interface to traditional robot systems for use in setup and programming tasks. This paper provides an overview of the Graphical Programming approach and lists key features of Graphical Programming systems. Graphical Programming uses 3-D visualization and simulation software with intuitive operator interfaces for the programming and control of complex robotic systems. Graphical Programming Supervisor software modules allow an operator to command and simulate complex tasks in a graphic preview mode and, when acceptable, command the actual robots and monitor their motions with the graphic system. Graphical Programming Supervisors maintain registration with the real world and allow the robot to perform tasks that cannot be accurately represented with models alone by using a combination of model and sensor-based control
Salimi, Parisa; Hamedi, Mohsen; Jamshidi, Nima; Vismeh, Milad
2017-04-01
Diabetes and its associated complications are recognized as among the most challenging medical conditions, threatening more than 29 million people in the USA alone. Forecasts suggest that more than half a billion people worldwide will suffer from the disease by 2030. Among all diabetic complications, diabetic foot ulcer (DFU) has attracted much scientific investigation aimed at better management of this disease. In this paper, a system thinking methodology is adopted to investigate the dynamic nature of ulceration. The causal loop diagram is utilized as a tool to illustrate the well-researched relations and interrelations between causes of the DFU. The result of the clustering causality evaluation suggests a vicious loop that relates external trauma to callus. Consequently, a hypothesis is presented which localizes development of foot ulceration considering the distribution of normal and shear stress. It specifies that normal and tangential forces, as the main representatives of external trauma, play the most important role in foot ulceration. The evaluation of this hypothesis suggests the significance of information on both normal and shear stress for managing DFU. The results also discuss how these two act at different locations on the foot, such as the metatarsal head, heel and hallux. The findings of this study can facilitate tackling the complexity of the DFU problem and the search for constructive mitigation measures. Moreover, they lead to a more promising methodology for managing DFU, including better prognosis, the design of prostheses and insoles for DFU, and patient-care recommendations. Copyright © 2017 Elsevier Ltd. All rights reserved.
mediation: R Package for Causal Mediation Analysis
Directory of Open Access Journals (Sweden)
Dustin Tingley
2014-09-01
Full Text Available In this paper, we describe the R package mediation for conducting causal mediation analysis in applied empirical research. In many scientific disciplines, the goal of researchers is not only estimating causal effects of a treatment but also understanding the process in which the treatment causally affects the outcome. Causal mediation analysis is frequently used to assess potential causal mechanisms. The mediation package implements a comprehensive suite of statistical tools for conducting such an analysis. The package is organized into two distinct approaches. Using the model-based approach, researchers can estimate causal mediation effects and conduct sensitivity analysis under the standard research design. Furthermore, the design-based approach provides several analysis tools that are applicable under different experimental designs. This approach requires weaker assumptions than the model-based approach. We also implement a statistical method for dealing with multiple (causally dependent) mediators, which are often encountered in practice. Finally, the package also offers a methodology for assessing causal mediation in the presence of treatment noncompliance, a common problem in randomized trials.
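The package itself is in R, but the model-based estimand it computes, the average causal mediation effect (ACME), can be sketched in Python for the simplest linear case, where it reduces to the product of the mediator-model and outcome-model coefficients (simulated data and illustrative coefficients, not the package's interface):

```python
import numpy as np

# Simulate a simple mediation model:
#   M = 0.6*T + noise,  Y = 0.5*M + 0.3*T + noise  =>  ACME = 0.6*0.5 = 0.3
rng = np.random.default_rng(42)
n = 50_000
T = rng.binomial(1, 0.5, n).astype(float)
M = 0.6 * T + rng.normal(size=n)
Y = 0.5 * M + 0.3 * T + rng.normal(size=n)

def ols(X, y):
    """Least-squares fit with intercept; returns coefficient vector."""
    X = np.column_stack([np.ones(len(y))] + list(X))
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Fit mediator and outcome models, as in the model-based approach.
a = ols([T], M)[1]              # effect of T on M
b = ols([M, T], Y)[1]           # effect of M on Y, holding T fixed
acme = a * b                    # average causal mediation effect
print(round(acme, 2))           # ~0.30
```

The mediation package generalizes this beyond linear models by simulating potential mediator values under each treatment and averaging the resulting counterfactual outcomes, which is why it can also deliver the sensitivity analyses the abstract mentions.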
Directory of Open Access Journals (Sweden)
Rozilene Maria Cota Aroeira
2017-05-01
Full Text Available Abstract Introduction: Biomedical studies involve complex anatomical structures, which require specific methodology to generate their geometric models. The middle segment of the thoracic spine (T5-T10) is the site of the highest incidence of vertebral deformity in adolescents. Traditionally, its geometries are derived from computed tomography or magnetic resonance imaging data. However, this approach may restrict certain studies. The study aimed to generate a 3D geometric model of the T5-T10 thoracic spine segment from graphical images, and to create a mesh for finite element studies. Methods A 3D geometric model of T5-T10 was generated using two anatomical images of the T6 vertebra (side and top). The geometric model was created in Autodesk® Maya® 3D 2013, and the mesh was processed in HiperMesh and MeshMixer (v11.0.544, Autodesk). Results The T5-T10 thoracic segment model is presented with its passive components, bones, intervertebral discs and the flavum, intertransverse and supraspinous ligaments, in different views, as well as the volumetric mesh. Conclusion The 3D geometric model generated from graphical images is suitable for application in non-patient-specific finite element model studies or, with restrictions, as an alternative to models obtained from computed tomography or magnetic resonance imaging. This model may be useful for biomechanical studies related to the middle thoracic spine, the most vulnerable site for vertebral deformations.
User-Extensible Graphics Using Abstract Structure,
1987-08-01
Contents include: the Algol68 model of the graphical abstract structure; the creation of a PictureDefinition; the making of a picture from a PictureDefinition; ... data together with the operations that can be performed on that data.
Causal mapping of emotion networks in the human brain: Framework and initial findings.
Dubois, Julien; Oya, Hiroyuki; Tyszka, J Michael; Howard, Matthew; Eberhardt, Frederick; Adolphs, Ralph
2017-11-13
Emotions involve many cortical and subcortical regions, prominently including the amygdala. It remains unknown how these multiple network components interact, and it remains unknown how they cause the behavioral, autonomic, and experiential effects of emotions. Here we describe a framework for combining a novel technique, concurrent electrical stimulation with fMRI (es-fMRI), together with a novel analysis, inferring causal structure from fMRI data (causal discovery). We outline a research program for investigating human emotion with these new tools, and provide initial findings from two large resting-state datasets as well as case studies in neurosurgical patients with electrical stimulation of the amygdala. The overarching goal is to use causal discovery methods on fMRI data to infer causal graphical models of how brain regions interact, and then to further constrain these models with direct stimulation of specific brain regions and concurrent fMRI. We conclude by discussing limitations and future extensions. The approach could yield anatomical hypotheses about brain connectivity, motivate rational strategies for treating mood disorders with deep brain stimulation, and could be extended to animal studies that use combined optogenetic fMRI. Copyright © 2017 Elsevier Ltd. All rights reserved.
Identity, causality, and pronoun ambiguity.
Sagi, Eyal; Rips, Lance J
2014-10-01
This article looks at the way people determine the antecedent of a pronoun in sentence pairs, such as: Albert invited Ron to dinner. He spent hours cleaning the house. The experiment reported here is motivated by the idea that such judgments depend on reasoning about identity (e.g., the identity of the he who cleaned the house). Because the identity of an individual over time depends on the causal-historical path connecting the stages of the individual, the correct antecedent will also depend on causal connections. The experiment varied how likely it is that the event of the first sentence (e.g., the invitation) would cause the event of the second (the house cleaning) for each of the two individuals (the likelihood that if Albert invited Ron to dinner, this would cause Albert to clean the house, versus cause Ron to clean the house). Decisions about the antecedent followed causal likelihood. A mathematical model of causal identity accounted for most of the key aspects of the data from the individual sentence pairs. Copyright © 2014 Cognitive Science Society, Inc.
Keegan, Ronan M; McNicholas, Stuart J; Thomas, Jens M H; Simpkin, Adam J; Simkovic, Felix; Uski, Ville; Ballard, Charles C; Winn, Martyn D; Wilson, Keith S; Rigden, Daniel J
2018-03-01
Increasing sophistication in molecular-replacement (MR) software and the rapid expansion of the PDB in recent years have allowed the technique to become the dominant method for determining the phases of a target structure in macromolecular X-ray crystallography. In addition, improvements in bioinformatic techniques for finding suitable homologous structures for use as MR search models, combined with developments in refinement and model-building techniques, have pushed the applicability of MR to lower sequence identities and made weak MR solutions more amenable to refinement and improvement. MrBUMP is a CCP4 pipeline which automates all stages of the MR procedure. Its scope covers everything from the sourcing and preparation of suitable search models right through to rebuilding of the positioned search model. Recent improvements to the pipeline include the adoption of more sensitive bioinformatic tools for sourcing search models, enhanced model-preparation techniques including better ensembling of homologues, and the use of phase improvement and model building on the resulting solution. The pipeline has also been deployed as an online service through CCP4 online, which allows its users to exploit large bioinformatic databases and coarse-grained parallelism to speed up the determination of a possible solution. Finally, the molecular-graphics application CCP4mg has been combined with MrBUMP to provide an interactive visual aid to the user during the process of selecting and manipulating search models for use in MR. Here, these developments in MrBUMP are described with a case study to explore how some of the enhancements to the pipeline and to CCP4mg can help to solve a difficult case.
D'Ariano, Giacomo Mauro
2018-07-13
Causality has never gained the status of a 'law' or 'principle' in physics. Some recent literature has even popularized the false idea that causality is a notion that should be banned from theory. Such a misconception relies on an alleged universality of the reversibility of the laws of physics, based either on the determinism of classical theory, or on the multiverse interpretation of quantum theory, in both cases motivated by mere interpretational requirements for realism of the theory. Here, I will show that a properly defined unambiguous notion of causality is a theorem of quantum theory, which is also a falsifiable proposition of the theory. Such a notion of causality appeared in the literature within the framework of operational probabilistic theories. It is a genuinely theoretical notion, corresponding to establishing a definite partial order among events, in the same way as we do by using the future causal cone on Minkowski space. The notion of causality is logically completely independent of the misidentified concept of 'determinism', and, being a consequence of quantum theory, is ubiquitous in physics. In addition, as classical theory can be regarded as a restriction of quantum theory, causality holds also in the classical case, although the determinism of the theory trivializes it. I then conclude by arguing that causality naturally establishes an arrow of time. This implies that the scenario of the 'block Universe' and the connected 'past hypothesis' are incompatible with causality, and thus with quantum theory: they are both doomed to remain mere interpretations and, as such, are not falsifiable, similar to the hypothesis of 'super-determinism'. This article is part of a discussion meeting issue 'Foundations of quantum mechanics and their impact on contemporary society'. © 2018 The Author(s).
Deterministic Graphical Games Revisited
DEFF Research Database (Denmark)
Andersson, Daniel; Hansen, Kristoffer Arnsfelt; Miltersen, Peter Bro
2008-01-01
We revisit the deterministic graphical games of Washburn. A deterministic graphical game can be described as a simple stochastic game (a notion due to Anne Condon), except that we allow arbitrary real payoffs but disallow moves of chance. We study the complexity of solving deterministic graphical games and obtain an almost-linear time comparison-based algorithm for computing an equilibrium of such a game. The existence of a linear time comparison-based algorithm remains an open problem.
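Washburn's games allow cycles in the game graph, which is what makes the algorithmic question above nontrivial; for the acyclic special case, though, the game value follows from plain backward induction. A hedged sketch of that special case (node names and payoffs are invented, and this is not the almost-linear algorithm of the paper):

```python
def game_value(succ, owner, payoff, node):
    """Value of a two-player zero-sum game on a DAG by backward induction.
    succ: node -> list of successor nodes; owner: node -> 1 (maximizer) or
    2 (minimizer); payoff: terminal node -> real payoff (arbitrary reals,
    no moves of chance, matching the setting described above)."""
    if node in payoff:  # terminal node: its payoff is the value
        return payoff[node]
    vals = [game_value(succ, owner, payoff, s) for s in succ[node]]
    return max(vals) if owner[node] == 1 else min(vals)

# Tiny example: the maximizer moves at 'a', the minimizer at 'b' and 'c'.
succ = {"a": ["b", "c"], "b": ["t1", "t2"], "c": ["t3", "t4"]}
owner = {"a": 1, "b": 2, "c": 2}
payoff = {"t1": 3.0, "t2": 1.0, "t3": 2.0, "t4": 5.0}
value = game_value(succ, owner, payoff, "a")  # minimizer caps 'b' at 1, 'c' at 2
```

With cycles, backward induction no longer terminates and one needs the retrograde-analysis machinery the paper studies.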
International Nuclear Information System (INIS)
Allensworth, J.A.
1984-04-01
EASI (Estimate of Adversary Sequence Interruption) is an analytical technique for measuring the effectiveness of physical protection systems. EASI Graphics is a computer graphics extension of EASI which provides a capability for performing sensitivity and trade-off analyses of the parameters of a physical protection system. This document reports on the implementation of Version II of EASI Graphics and illustrates its application with some examples. 5 references, 15 figures, 6 tables
The computer graphics metafile
Henderson, LR; Shepherd, B; Arnold, D B
1990-01-01
The Computer Graphics Metafile deals with the Computer Graphics Metafile (CGM) standard and covers topics ranging from the structure and contents of a metafile to CGM functionality, metafile elements, and real-world applications of CGM. Binary Encoding, Character Encoding, application profiles, and implementations are also discussed. This book is comprised of 18 chapters divided into five sections and begins with an overview of the CGM standard and how it can meet some of the requirements for storage of graphical data within a graphics system or application environment. The reader is then intr
The computer graphics interface
Steinbrugge Chauveau, Karla; Niles Reed, Theodore; Shepherd, B
2014-01-01
The Computer Graphics Interface provides a concise discussion of computer graphics interface (CGI) standards. The title is comprised of seven chapters that cover the concepts of the CGI standard. Figures and examples are also included. The first chapter provides a general overview of CGI; this chapter covers graphics standards, functional specifications, and syntactic interfaces. Next, the book discusses the basic concepts of CGI, such as inquiry, profiles, and registration. The third chapter covers the CGI concepts and functions, while the fourth chapter deals with the concept of graphic obje
van Dijk, Marjolein J A M; Claassen, Tom; Suwartono, Christiany; van der Veld, William M; van der Heijden, Paul T; Hendriks, Marc P H
Since the publication of the WAIS-IV in the U.S. in 2008, efforts have been made to explore the structural validity by applying factor analysis to various samples. This study aims to achieve a more fine-grained understanding of the structure of the Dutch language version of the WAIS-IV (WAIS-IV-NL) by applying an alternative analysis based on causal modeling in addition to confirmatory factor analysis (CFA). The Bayesian Constraint-based Causal Discovery (BCCD) algorithm learns underlying network structures directly from data and assesses more complex structures than is possible with factor analysis. WAIS-IV-NL profiles of two clinical samples of 202 patients (i.e. patients with temporal lobe epilepsy and a mixed psychiatric outpatient group) were analyzed and contrasted with a matched control group (N = 202) selected from the Dutch standardization sample of the WAIS-IV-NL to investigate internal structure by means of CFA and BCCD. With CFA, the four-factor structure as proposed by Wechsler demonstrates acceptable fit in all three subsamples. However, BCCD revealed three consistent clusters (verbal comprehension, visual processing, and processing speed) in all three subsamples. The combination of Arithmetic and Digit Span as a coherent working memory factor could not be verified, and Matrix Reasoning appeared to be isolated. With BCCD, some discrepancies from the proposed four-factor structure are exemplified. Furthermore, these results fit the CHC theory of intelligence more closely. Consistent clustering patterns indicate these results are robust. The structural causal discovery approach may be helpful in better interpreting existing tests, developing new tests, and refining diagnostic instruments.
Energy Technology Data Exchange (ETDEWEB)
Hunke, Elizabeth Clare [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Urrego Blanco, Jorge Rolando [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Urban, Nathan Mark [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2018-02-12
Coupled climate models have a large number of input parameters that can affect output uncertainty. We conducted a sensitivity analysis of sea ice properties and Arctic-related climate variables to 5 parameters in the HiLAT climate model: air-ocean turbulent exchange parameter (C), conversion of water vapor to clouds (cldfrc_rhminl) and of ice crystals to snow (micro_mg_dcs), snow thermal conductivity (ksno), and maximum snow grain size (rsnw_mlt). We used an elementary effect (EE) approach to rank their importance for output uncertainty. EE is an extension of one-at-a-time sensitivity analyses, but it is more efficient in sampling multi-dimensional parameter spaces. We looked for emerging relationships among climate variables across the model ensemble, and used causal discovery algorithms to establish potential pathways for those relationships.
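The elementary-effect idea described above can be sketched in a few lines. This is a hedged, simplified one-at-a-time variant (not the full Morris trajectory design, and not the HiLAT configuration); the toy model and all parameter roles are invented for illustration:

```python
import numpy as np

def elementary_effects(f, k, r=20, delta=0.5, seed=0):
    """Mean absolute elementary effect (a mu*-style screening statistic).
    For r random base points in [0, 1-delta]^k, perturb one input at a
    time by delta and record |f(x + delta*e_i) - f(x)| / delta."""
    rng = np.random.default_rng(seed)
    ee = np.zeros((r, k))
    for t in range(r):
        x = rng.uniform(0.0, 1.0 - delta, size=k)
        fx = f(x)
        for i in range(k):
            xp = x.copy()
            xp[i] += delta
            ee[t, i] = abs(f(xp) - fx) / delta
    return ee.mean(axis=0)

# Toy model: strong linear dependence on x0, weak nonlinear on x1, none on x2.
model = lambda x: 10.0 * x[0] + 0.5 * x[1] ** 2
mu_star = elementary_effects(model, k=3)  # ranks x0 >> x1 > x2
```

The full Morris method reuses points along trajectories to cut the number of model runs from r·2k to r·(k+1), which matters when each "run" is a climate simulation.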
How to Be Causal: Time, Spacetime and Spectra
Kinsler, Paul
2011-01-01
I explain a simple definition of causality in widespread use, and indicate how it links to the Kramers-Kronig relations. The specification of causality in terms of temporal differential equations then shows us the way to write down dynamical models so that their causal nature (in the sense used here) should be obvious to all. To extend existing…
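The link to the Kramers-Kronig relations mentioned above can be stated compactly. For a linear response function $\chi(\omega)$ whose time-domain kernel vanishes for $t<0$ (the causality condition), the real and imaginary parts form a Hilbert-transform pair; this is the standard textbook result, reproduced here for reference:

$$
\operatorname{Re}\chi(\omega) = \frac{1}{\pi}\,\mathcal{P}\!\int_{-\infty}^{\infty} \frac{\operatorname{Im}\chi(\omega')}{\omega'-\omega}\,d\omega',
\qquad
\operatorname{Im}\chi(\omega) = -\frac{1}{\pi}\,\mathcal{P}\!\int_{-\infty}^{\infty} \frac{\operatorname{Re}\chi(\omega')}{\omega'-\omega}\,d\omega',
$$

where $\mathcal{P}$ denotes the Cauchy principal value. Measuring absorption ($\operatorname{Im}\chi$) thus determines dispersion ($\operatorname{Re}\chi$), purely as a consequence of causality.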
International Nuclear Information System (INIS)
Maund, J.B.
1979-01-01
Although the existence of tachyons is not ruled out by special relativity, it appears that causal paradoxes will arise if there are tachyons. The usual solutions to these paradoxes employ some form of the reinterpretation principle. In this paper it is argued, first, that the principle is incoherent; second, that even if it is not, some causal paradoxes remain; and third, that the most plausible 'solution', which appeals to boundary conditions of the universe, will conflict with special relativity.
Dynamics and causality constraints
International Nuclear Information System (INIS)
Sousa, Manoelito M. de
2001-04-01
The physical meaning and the geometrical interpretation of causality implementation in classical field theories are discussed. Causality constraints in field theory are kinematical constraints dynamically implemented via solutions of the field equation; in the limit of zero distance from the field sources, however, part of these constraints carries a dynamical content that resolves old problems of classical electrodynamics, with deep implications for the nature of physical interactions. (author)
Efficient nonparametric estimation of causal mediation effects
Chan, K. C. G.; Imai, K.; Yam, S. C. P.; Zhang, Z.
2016-01-01
An essential goal of program evaluation and scientific research is the investigation of causal mechanisms. Over the past several decades, causal mediation analysis has been used in medical and social sciences to decompose the treatment effect into the natural direct and indirect effects. However, all of the existing mediation analysis methods rely on parametric modeling assumptions in one way or another, typically requiring researchers to specify multiple regression models involving the treat...
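The decomposition referred to above is standard and can be stated in potential-outcomes notation, with $Y(t, m)$ the outcome under treatment $t$ and mediator value $m$, and $M(t)$ the mediator under treatment $t$:

$$
\tau \;=\; \mathbb{E}\bigl[Y(1, M(1)) - Y(0, M(0))\bigr]
\;=\; \underbrace{\mathbb{E}\bigl[Y(1, M(1)) - Y(1, M(0))\bigr]}_{\text{natural indirect effect}}
\;+\; \underbrace{\mathbb{E}\bigl[Y(1, M(0)) - Y(0, M(0))\bigr]}_{\text{natural direct effect}}.
$$

The indirect term isolates the change transmitted through the mediator while treatment is held fixed; the direct term isolates the change in treatment while the mediator is held at its untreated value. Estimating either nonparametrically is the challenge the abstract addresses, since $Y(1, M(0))$ is never observed for any unit.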
Directory of Open Access Journals (Sweden)
Inken Rothkirch
Writer's cramp (WC) is a focal task-specific dystonia characterized by sustained or intermittent muscle contractions while writing, particularly with the dominant hand. Since structural lesions rarely cause WC, it has been assumed that the disease might be caused by a functional maladaptation within the sensory-motor system. Therefore, our objective was to examine the differences between patients suffering from WC and a healthy control (HC) group with regard to the effective connectivity that describes causal influences one brain region exerts over another within the motor network. The effective connectivity within a network including contralateral motor cortex (M1), supplementary motor area (SMA), globus pallidus (GP), putamen (PU) and ipsilateral cerebellum (CB) was investigated using dynamic causal modeling (DCM) for fMRI. Eight connectivity models of functional motor systems were compared. Fifteen WC patients and 18 age-matched HC performed a sequential, five-element finger-tapping task with the non-dominant and non-affected left hand within a 3 T MRI-scanner as quickly and accurately as possible. The task was conducted in a fixed block design repeated 15 times and included 30 s of tapping followed by 30 s of rest. DCM identified the same model in WC and HC as superior for reflecting basal ganglia and cerebellar motor circuits of healthy subjects. The M1-PU, as well as M1-CB connectivity, was more strongly influenced by tapping in WC, but the intracortical M1-SMA connection was more facilitating in controls. Inhibiting influences originating from GP to M1 were stronger in controls compared to WC patients, whereas the facilitating influences that PU exerts over CB, and CB over M1, were weaker. Although the same model structure explains the given data best, DCM confirms previous research demonstrating a malfunction in effective connectivity intracortically (M1-SMA) and in the cortico-basal ganglia circuitry in WC. In addition, DCM analysis
Interactive Graphic Journalism
Schlichting, Laura
2016-01-01
This paper examines graphic journalism (GJ) in a transmedial context, and argues that transmedial graphic journalism (TMGJ) is an important and fruitful new form of visual storytelling that will re-invigorate the field of journalism, as it steadily tests out and plays with new media,
Mathematics for computer graphics
Vince, John
2006-01-01
Helps you understand the mathematical ideas used in computer animation, virtual reality, CAD, and other areas of computer graphics. This work also helps you to rediscover the mathematical techniques required to solve problems and design computer programs for computer graphic applications
Graphic Communications. Curriculum Guide.
North Dakota State Board for Vocational Education, Bismarck.
This guide provides the basic foundation to develop a one-semester course based on the cluster concept, graphic communications. One of a set of six guides for an industrial arts curriculum at the junior high school level, it suggests exploratory experiences designed to (1) develop an awareness and understanding of the drafting and graphic arts…
Padula, Amy M.; Mortimer, Kathleen; Hubbard, Alan; Lurmann, Frederick; Jerrett, Michael; Tager, Ira B.
2012-01-01
Traffic-related air pollution is recognized as an important contributor to health problems. Epidemiologic analyses suggest that prenatal exposure to traffic-related air pollutants may be associated with adverse birth outcomes; however, there is insufficient evidence to conclude that the relation is causal. The Study of Air Pollution, Genetics and Early Life Events comprises all births to women living in 4 counties in California's San Joaquin Valley during the years 2000–2006. The probability of low birth weight among full-term infants in the population was estimated using machine learning and targeted maximum likelihood estimation for each quartile of traffic exposure during pregnancy. If everyone lived near high-volume freeways (approximated as the fourth quartile of traffic density), the estimated probability of term low birth weight would be 2.27% (95% confidence interval: 2.16, 2.38) as compared with 2.02% (95% confidence interval: 1.90, 2.12) if everyone lived near smaller local roads (first quartile of traffic density). Assessment of potentially causal associations, in the absence of arbitrary model assumptions applied to the data, should result in relatively unbiased estimates. The current results support findings from previous studies that prenatal exposure to traffic-related air pollution may adversely affect birth weight among full-term infants. PMID:23045474
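The population-level "what if everyone were exposed" estimates described above can be illustrated with a hedged g-computation (plug-in) sketch. This is not the paper's machine-learning/targeted-maximum-likelihood estimator; it uses a deliberately simple linear probability model on simulated data, and every variable name and effect size below is invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
confounder = rng.normal(size=n)  # e.g. a neighborhood-level risk factor
# Exposure (living near high traffic, "Q4") depends on the confounder.
traffic_q4 = (rng.normal(size=n) + confounder > 0.5).astype(float)
# True risk of term low birth weight rises with exposure and confounder.
p_true = 0.02 + 0.01 * traffic_q4 + 0.005 * confounder
y = rng.binomial(1, np.clip(p_true, 0.0, 1.0))

# g-computation: fit an outcome model, then average predictions with the
# exposure set to the same value for everyone (standardization).
X = np.column_stack([np.ones(n), traffic_q4, confounder])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
X1 = X.copy(); X1[:, 1] = 1.0   # counterfactual: everyone exposed
X0 = X.copy(); X0[:, 1] = 0.0   # counterfactual: everyone unexposed
risk_if_exposed = (X1 @ beta).mean()
risk_if_unexposed = (X0 @ beta).mean()
```

The paper's approach replaces the parametric outcome model with flexible machine learning plus a targeting step, which is what lets it avoid "arbitrary model assumptions"; the standardization logic, however, is the same.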
Hagmayer, York; Engelmann, Neele
2014-01-01
Cognitive psychological research focuses on causal learning and reasoning, while cognitive anthropological and social science research tends to focus on systems of beliefs. Our aim was to explore how these two types of research can inform each other. Cognitive psychological theories (causal model theory and causal Bayes nets) were used to derive predictions for systems of causal beliefs. These predictions were then applied to lay theories of depression as a specific test case. A systematic lite...
Using Graphic Organizers in Intercultural Education
Ciascai, Liliana
2009-01-01
Graphic organizers are instruments of representation, illustration and modeling of information. In the educational practice they are used for building, and systematization of knowledge. Graphic organizers are instruments that addressed mostly visual learning style, but their use is beneficial to all learners. In this paper we illustrate the use of…
A frequency domain subspace algorithm for mixed causal, anti-causal LTI systems
Fraanje, Rufus; Verhaegen, Michel; Verdult, Vincent; Pintelon, Rik
2003-01-01
The paper extends the subspace identification method for estimating state-space models from frequency response function (FRF) samples, proposed by McKelvey et al. (1996), to mixed causal/anti-causal systems, and shows that other frequency domain subspace algorithms can be extended similarly. The method
Causal structure of analogue spacetimes
International Nuclear Information System (INIS)
Barcelo, Carlos; Liberati, Stefano; Sonego, Sebastiano; Visser, Matt
2004-01-01
The so-called 'analogue models of general relativity' provide a number of specific physical systems, well outside the traditional realm of general relativity, that nevertheless are well-described by the differential geometry of curved spacetime. Specifically, the propagation of perturbations in these condensed matter systems is described by 'effective metrics' that carry with them notions of 'causal structure' as determined by an exchange of quasi-particles. These quasi-particle-induced causal structures serve as specific examples of what can be done in the presence of a Lorentzian metric without having recourse to the Einstein equations of general relativity. (After all, the underlying analogue model is governed by its own specific physics, not necessarily by the Einstein equations.) In this paper we take a careful look at what can be said about the causal structure of analogue spacetimes, focusing on those containing quasi-particle horizons, both with a view to seeing what is different from standard general relativity, and what the similarities might be. For definiteness, and because the physics is particularly simple to understand, we will phrase much of the discussion in terms of acoustic disturbances in moving fluids, where the underlying physics is ordinary fluid mechanics, governed by the equations of traditional hydrodynamics, and the relevant quasi-particles are the phonons. It must however be emphasized that this choice of example is only for the sake of pedagogical simplicity and that our considerations apply generically to wide classes of analogue spacetimes
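For reference, the acoustic effective metric alluded to above is standard (Unruh's result, as popularized in the analogue-gravity literature): for sound waves in a barotropic, irrotational flow with density $\rho$, sound speed $c$ and flow velocity $\mathbf{v}$, the phonons propagate along null geodesics of

$$
ds^2 \;=\; \frac{\rho}{c}\Bigl[-\bigl(c^2 - v^2\bigr)\,dt^2 \;-\; 2\,\mathbf{v}\cdot d\mathbf{x}\,dt \;+\; d\mathbf{x}\cdot d\mathbf{x}\Bigr].
$$

An acoustic horizon forms where the inward flow speed $|\mathbf{v}|$ exceeds $c$, so the $dt^2$ coefficient changes sign: sound can no longer escape, mirroring the causal structure of a black-hole horizon without any use of the Einstein equations.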