WorldWideScience

Sample records for making inferences based

  1. Making Type Inference Practical

Schwartzbach, Michael I.; Oxhøj, Nicholas; Palsberg, Jens

    1992-01-01

We present the implementation of a type inference algorithm for untyped object-oriented programs with inheritance, assignments, and late binding. The algorithm significantly improves our previous one, presented at OOPSLA'91, since it can handle collection classes, such as List, in a useful way. … The complexity has been dramatically improved, from exponential time to low polynomial time. The implementation uses the techniques of incremental graph construction and constraint template instantiation to avoid representing intermediate results, doing superfluous work, and recomputing type information. Experiments indicate that the implementation type checks as much as 100 lines per second. This results in a mature product, on which a number of tools can be based, for example a safety tool, an image compression tool, a code optimization tool, and an annotation tool. This may make type inference for object…
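The constraint-graph technique this record describes can be illustrated in miniature: attach a set of possible classes to each program variable, let assignments induce subset constraints, and propagate to a fixpoint. The sketch below is an invented toy, not the authors' OOPSLA implementation.

```python
def infer_types(constraints, initial):
    """Toy constraint-based type inference.
    constraints: list of (src, dst) meaning types(src) must be a subset of types(dst),
                 e.g. induced by the assignment dst := src.
    initial: dict var -> set of classes seeded from literals / new-expressions."""
    types = {v: set(s) for v, s in initial.items()}
    changed = True
    while changed:                          # propagate until no set grows
        changed = False
        for src, dst in constraints:
            before = len(types.setdefault(dst, set()))
            types[dst] |= types.get(src, set())
            if len(types[dst]) != before:
                changed = True
    return types

# x := 1; y := "a"; y := x; z := y
types = infer_types(
    constraints=[("x", "y"), ("y", "z")],
    initial={"x": {"Int"}, "y": {"String"}},
)
# types["z"] ends up as {"Int", "String"}
```

The real algorithm avoids materializing intermediate sets via incremental graph construction; the fixpoint loop above is only the semantic core.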

  2. The development of adaptive decision making: Recognition-based inference in children and adolescents.

    Horn, Sebastian S; Ruggeri, Azzurra; Pachur, Thorsten

    2016-09-01

Judgments about objects in the world are often based on probabilistic information (or cues). A frugal judgment strategy that utilizes memory (i.e., the ability to discriminate between known and unknown objects) as a cue for inference is the recognition heuristic (RH). The usefulness of the RH depends on the structure of the environment, particularly the predictive power (validity) of recognition. Little is known about developmental differences in use of the RH. In this study, the authors examined (a) to what extent children and adolescents recruit the RH when making judgments, and (b) around what age adaptive use of the RH emerges. Primary schoolchildren (M = 9 years), younger adolescents (M = 12 years), and older adolescents (M = 17 years) made comparative judgments in task environments with either high or low recognition validity. Reliance on the RH was measured with a hierarchical multinomial model. Results indicated that primary schoolchildren already made systematic use of the RH. However, only older adolescents adaptively adjusted their strategy use between environments and were better able to discriminate between situations in which the RH led to correct versus incorrect inferences. These findings suggest that the use of simple heuristics does not progress unidirectionally across development but strongly depends on the task environment, in line with the perspective of ecological rationality. Moreover, adaptive heuristic inference seems to require experience and a developed base of domain knowledge.
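The recognition heuristic and its validity, as described above, reduce to a very small decision rule. A minimal sketch (function names invented):

```python
def recognition_heuristic(recognized_a, recognized_b, fallback="guess"):
    """RH: if exactly one of two objects is recognized, infer that the
    recognized one scores higher on the criterion; otherwise fall back
    to knowledge-based comparison or guessing."""
    if recognized_a and not recognized_b:
        return "A"
    if recognized_b and not recognized_a:
        return "B"
    return fallback

def recognition_validity(pairs):
    """Validity of recognition as a cue: the proportion of applicable pairs
    (exactly one object recognized) in which the recognized object has the
    higher criterion value. pairs: list of (criterion_recognized,
    criterion_unrecognized)."""
    return sum(r > u for r, u in pairs) / len(pairs)
```

High-validity environments are those where `recognition_validity` is well above 0.5; the study's point is that only older adolescents modulated their reliance on the rule according to that quantity.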

  3. The Development of Adaptive Decision Making: Recognition-Based Inference in Children and Adolescents

    Horn, Sebastian S.; Ruggeri, Azzurra; Pachur, Thorsten

    2016-01-01

    Judgments about objects in the world are often based on probabilistic information (or cues). A frugal judgment strategy that utilizes memory (i.e., the ability to discriminate between known and unknown objects) as a cue for inference is the recognition heuristic (RH). The usefulness of the RH depends on the structure of the environment,…

  4. The importance of learning when making inferences

    Jorg Rieskamp

    2008-03-01

The assumption that people possess a repertoire of strategies to solve the inference problems they face has been made repeatedly. The experimental findings of two previous studies on strategy selection are reexamined from a learning perspective, which argues that people learn to select strategies for making probabilistic inferences. This learning process is modeled with the strategy selection learning (SSL) theory, which assumes that people develop subjective expectancies for the strategies they have. They select strategies proportional to their expectancies, which are updated on the basis of experience. For the study by Newell, Weston, and Shanks (2003), it can be shown that people did not anticipate the success of a strategy from the beginning of the experiment. Instead, the behavior observed at the end of the experiment was the result of a learning process that can be described by the SSL theory. For the second study, by Bröder and Schiffer (2006), the SSL theory is able to provide an explanation for why participants only slowly adapted to new environments in a dynamic inference situation. The reanalysis of the previous studies illustrates the importance of learning for probabilistic inferences.
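The SSL mechanism sketched in this abstract (expectancy-proportional selection, payoff-based updating) can be written down compactly. The class below is a simplified, illustrative reading, not the theory's full parameterization:

```python
import random

class SSL:
    """Minimal sketch of strategy selection learning: each strategy carries
    a subjective expectancy; strategies are chosen with probability
    proportional to expectancy, and the applied strategy's expectancy is
    updated by the payoff it produced."""

    def __init__(self, strategies, initial=1.0):
        self.q = {s: initial for s in strategies}

    def choose(self, rng=random):
        """Sample a strategy with probability proportional to its expectancy."""
        r = rng.uniform(0, sum(self.q.values()))
        for s, v in self.q.items():
            r -= v
            if r <= 0:
                return s
        return s  # numerical edge case: return the last strategy

    def update(self, strategy, payoff):
        """Reinforce the applied strategy; keep expectancies positive."""
        self.q[strategy] = max(self.q[strategy] + payoff, 1e-6)
```

With repeated feedback, expectancies drift toward the strategy that pays off in the current environment, which is the slow adaptation the reanalysis attributes to participants.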

  5. Making Inferences in Adulthood: Falling Leaves Mean It's Fall.

    Zandi, Taher; Gregory, Monica E.

    1988-01-01

    Assessed age differences in making inferences from prose. Older adults correctly answered mean of 10 questions related to implicit information and 8 related to explicit information. Young adults answered mean of 7 implicit and 12 explicit information questions. In spite of poorer recall of factual details, older subjects made inferences to greater…

  6. Memory-Based Simple Heuristics as Attribute Substitution: Competitive Tests of Binary Choice Inference Models

    Honda, Hidehito; Matsuka, Toshihiko; Ueda, Kazuhiro

    2017-01-01

    Some researchers on binary choice inference have argued that people make inferences based on simple heuristics, such as recognition, fluency, or familiarity. Others have argued that people make inferences based on available knowledge. To examine the boundary between heuristic and knowledge usage, we examine binary choice inference processes in…

  7. Training Inference Making Skills Using a Situation Model Approach Improves Reading Comprehension

Lisanne Bos

    2016-02-01

This study aimed to enhance third and fourth graders’ text comprehension at the situation model level. Therefore, we tested a reading strategy training developed to target inference making skills, which are widely considered to be pivotal to situation model construction. The training was grounded in contemporary literature on situation model-based inference making and addressed the source of an inference (text-based versus knowledge-based), its type (necessary versus unnecessary for (re-)establishing coherence), and its depth (making single lexical inferences versus combining multiple lexical inferences), as well as the type of searching strategy (forward versus backward). Results indicated that, compared to a control group (n = 51), children who followed the experimental training (n = 67) improved their inference making skills supportive to situation model construction. Importantly, our training also resulted in increased levels of general reading comprehension and motivation. In sum, this study showed that a ‘level of text representation’ approach can provide a useful framework to teach inference making skills to third and fourth graders.

  8. Generative Inferences Based on Learned Relations

    Chen, Dawn; Lu, Hongjing; Holyoak, Keith J.

    2017-01-01

    A key property of relational representations is their "generativity": From partial descriptions of relations between entities, additional inferences can be drawn about other entities. A major theoretical challenge is to demonstrate how the capacity to make generative inferences could arise as a result of learning relations from…

  9. Making inference from wildlife collision data: inferring predator absence from prey strikes

    Peter Caley

    2017-02-01

Wildlife collision data are ubiquitous, though challenging for making ecological inference due to typically irreducible uncertainty relating to the sampling process. We illustrate a new approach that is useful for generating inference from predator data arising from wildlife collisions. By simply conditioning on a second prey species sampled via the same collision process, and by using a biologically realistic numerical response function, we can produce a coherent numerical response relationship between predator and prey. This relationship can then be used to make inference on the population size of the predator species, including the probability of extinction. The statistical conditioning enables us to account for unmeasured variation in factors influencing the runway strike incidence for individual airports and to enable valid comparisons. A practical application of the approach for testing hypotheses about the distribution and abundance of a predator species is illustrated using the hypothesized red fox incursion into Tasmania, Australia. We estimate that conditional on the numerical response between fox and lagomorph runway strikes on mainland Australia, the predictive probability of observing no runway strikes of foxes in Tasmania after observing 15 lagomorph strikes is 0.001. We conclude there is enough evidence to safely reject the null hypothesis that there is a widespread red fox population in Tasmania at a population density consistent with prey availability. The method is novel and has potential wider application.

  10. Making inference from wildlife collision data: inferring predator absence from prey strikes.

    Caley, Peter; Hosack, Geoffrey R; Barry, Simon C

    2017-01-01

Wildlife collision data are ubiquitous, though challenging for making ecological inference due to typically irreducible uncertainty relating to the sampling process. We illustrate a new approach that is useful for generating inference from predator data arising from wildlife collisions. By simply conditioning on a second prey species sampled via the same collision process, and by using a biologically realistic numerical response function, we can produce a coherent numerical response relationship between predator and prey. This relationship can then be used to make inference on the population size of the predator species, including the probability of extinction. The statistical conditioning enables us to account for unmeasured variation in factors influencing the runway strike incidence for individual airports and to enable valid comparisons. A practical application of the approach for testing hypotheses about the distribution and abundance of a predator species is illustrated using the hypothesized red fox incursion into Tasmania, Australia. We estimate that conditional on the numerical response between fox and lagomorph runway strikes on mainland Australia, the predictive probability of observing no runway strikes of foxes in Tasmania after observing 15 lagomorph strikes is 0.001. We conclude there is enough evidence to safely reject the null hypothesis that there is a widespread red fox population in Tasmania at a population density consistent with prey availability. The method is novel and has potential wider application.
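The predictive calculation has the flavor of the sketch below: if mainland data imply an expected ratio of predator to prey strikes, the chance of seeing zero predator strikes follows from a count model. The Poisson form and the ratio value here are invented for illustration; the paper's fitted numerical response is more elaborate.

```python
import math

def prob_zero_strikes(prey_strikes, predator_per_prey_ratio):
    """Predictive probability of zero predator strikes, assuming predator
    strikes are Poisson with mean proportional to the observed prey strikes.
    The ratio value used below is a made-up illustration, not the paper's
    estimate."""
    lam = predator_per_prey_ratio * prey_strikes   # expected predator strikes
    return math.exp(-lam)

# 15 lagomorph strikes and a hypothetical fox:lagomorph strike ratio of 0.46
p_no_fox = prob_zero_strikes(15, 0.46)   # on the order of 0.001
```

Under this kind of model, a predictive probability near 0.001 is what licenses rejecting a widespread-population hypothesis.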

  11. Research designs and making causal inferences from health care studies.

    Flannelly, Kevin J; Jankowski, Katherine R B

    2014-01-01

    This article summarizes the major types of research designs used in healthcare research, including experimental, quasi-experimental, and observational studies. Observational studies are divided into survey studies (descriptive and correlational studies), case-studies and analytic studies, the last of which are commonly used in epidemiology: case-control, retrospective cohort, and prospective cohort studies. Similarities and differences among the research designs are described and the relative strength of evidence they provide is discussed. Emphasis is placed on five criteria for drawing causal inferences that are derived from the writings of the philosopher John Stuart Mill, especially his methods or canons. The application of the criteria to experimentation is explained. Particular attention is given to the degree to which different designs meet the five criteria for making causal inferences. Examples of specific studies that have used various designs in chaplaincy research are provided.

  12. Causal inference based on counterfactuals

    Höfler M

    2005-09-01

Background: The counterfactual or potential outcome model has become increasingly standard for causal inference in epidemiological and medical studies. Discussion: This paper provides an overview of the counterfactual and related approaches. A variety of conceptual as well as practical issues when estimating causal effects are reviewed. These include causal interactions, imperfect experiments, adjustment for confounding, time-varying exposures, competing risks and the probability of causation. It is argued that the counterfactual model of causal effects captures the main aspects of causality in health sciences and relates to many statistical procedures. Summary: Counterfactuals are the basis of causal inference in medicine and epidemiology. Nevertheless, the estimation of counterfactual differences poses several difficulties, primarily in observational studies. These problems, however, reflect fundamental barriers only when learning from observations, and this does not invalidate the counterfactual concept.
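In the potential-outcome notation the paper reviews, each unit carries two counterfactual outcomes, of which only one is observed; under randomization the group-mean difference estimates the average causal effect. A minimal numeric sketch (data invented):

```python
# Potential-outcome sketch: each unit has outcomes Y(1) and Y(0), but a
# study observes only one of them per unit. In a randomized experiment the
# difference in group means estimates the average treatment effect
# E[Y(1) - Y(0)]. The numbers below are invented.

def mean(xs):
    return sum(xs) / len(xs)

treated   = [7.1, 6.8, 7.4, 6.9]   # observed Y(1) in the treatment group
untreated = [5.9, 6.2, 6.0, 6.3]   # observed Y(0) in the control group

ate_estimate = mean(treated) - mean(untreated)   # roughly 0.95
```

In observational data the same subtraction is biased unless confounding is adjusted for, which is exactly the difficulty the Summary points to.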

  13. Inference

    Møller, Jesper

    2010-01-01

Chapter 9: This contribution concerns statistical inference for parametric models used in stochastic geometry, based on quick and simple simulation-free procedures as well as more comprehensive methods based on a maximum likelihood or Bayesian approach combined with Markov chain Monte Carlo (MCMC) techniques. Due to space limitations the focus is on spatial point processes.

  14. Children Use Nonverbal Cues to Make Inferences About Social Power

    Brey, Elizabeth; Shutts, Kristin

    2016-01-01

    Four studies (N=192) tested whether young children use nonverbal information to make inferences about differences in social power. Five- and 6-year-old children were able to determine which of two adults was “in charge” in dynamic videotaped conversations (Study 1) and in static photographs (Study 4) using only nonverbal cues. Younger children (3–4 years) were not successful in Study 1 or Study 4. Removing irrelevant linguistic information from conversations did not improve the performance of 3–4-year-old children (Study 3), but including relevant linguistic cues did (Study 2). Thus, at least by 5 years of age, children show sensitivity to some of the same nonverbal cues adults use to determine other people’s social roles. PMID:25521913

  15. Perceptual Decision-Making as Probabilistic Inference by Neural Sampling.

    Haefner, Ralf M; Berkes, Pietro; Fiser, József

    2016-05-04

We address two main challenges facing systems neuroscience today: understanding the nature and function of cortical feedback between sensory areas and of correlated variability. Starting from the old idea of perception as probabilistic inference, we show how to use knowledge of the psychophysical task to make testable predictions for the influence of feedback signals on early sensory representations. Applying our framework to a two-alternative forced choice task paradigm, we can explain multiple empirical findings that have been hard to account for by the traditional feedforward model of sensory processing, including the task dependence of neural response correlations and the diverging time courses of choice probabilities and psychophysical kernels. Our model makes new predictions and characterizes a component of correlated variability that represents task-related information rather than performance-degrading noise. It demonstrates a normative way to integrate sensory and cognitive components into physiologically testable models of perceptual decision-making.

  16. Statistical inference based on divergence measures

    Pardo, Leandro

    2005-01-01

The idea of using functionals of Information Theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data with emphasis on problems in contingency tables and loglinear models using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, prese...
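A concrete member of the divergence-based family the book studies is the Cressie-Read power-divergence statistic, which contains the Pearson and likelihood-ratio statistics as special cases. A small sketch (illustrative, not the book's notation):

```python
import math

def power_divergence(obs, exp, lam):
    """Cressie-Read power-divergence statistic between observed and expected
    counts. lam = 1 recovers Pearson's X^2; lam -> 0 recovers the
    likelihood-ratio statistic G^2 (taken here as the limit)."""
    if abs(lam) < 1e-12:   # G^2 limit
        return 2.0 * sum(o * math.log(o / e) for o, e in zip(obs, exp) if o > 0)
    return (2.0 / (lam * (lam + 1.0))) * sum(
        o * ((o / e) ** lam - 1.0) for o, e in zip(obs, exp))

observed, expected = [30, 14, 6], [25, 15, 10]
x2 = power_divergence(observed, expected, 1.0)   # Pearson chi-square
g2 = power_divergence(observed, expected, 0.0)   # likelihood-ratio statistic
```

Varying the exponent smoothly interpolates between the classical test statistics, which is the unifying point of the divergence-measure approach.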

  17. Inference

    Møller, Jesper

(This text, written by Jesper Møller, Aalborg University, is submitted for the collection ‘Stochastic Geometry: Highlights, Interactions and New Perspectives', edited by Wilfrid S. Kendall and Ilya Molchanov, to be published by Clarendon Press, Oxford, and planned to appear as Section 4.1 with the title ‘Inference'.) This contribution concerns statistical inference for parametric models used in stochastic geometry and based on quick and simple simulation-free procedures as well as more comprehensive methods using Markov chain Monte Carlo (MCMC) simulations. Due to space limitations the focus…

  18. Reasoning in explanation-based decision making.

    Pennington, N; Hastie, R

    1993-01-01

    A general theory of explanation-based decision making is outlined and the multiple roles of inference processes in the theory are indicated. A typology of formal and informal inference forms, originally proposed by Collins (1978a, 1978b), is introduced as an appropriate framework to represent inferences that occur in the overarching explanation-based process. Results from the analysis of verbal reports of decision processes are presented to demonstrate the centrality and systematic character of reasoning in a representative legal decision-making task.

  19. Spatial Inference Based on Geometric Proportional Analogies

    Mullally, Emma-Claire; O'Donoghue, Diarmuid P.

    2006-01-01

    We describe an instance-based reasoning solution to a variety of spatial reasoning problems. The solution centers on identifying an isomorphic mapping between labelled graphs that represent some problem data and a known solution instance. We describe a number of spatial reasoning problems that are solved by generating non-deductive inferences, integrating topology with area (and other) features. We report the accuracy of our algorithm on different categories of spatial reasoning tasks from th...

  20. Abductive Inference using Array-Based Logic

    Frisvad, Jeppe Revall; Falster, Peter; Møller, Gert L.

The notion of abduction has found its usage within a wide variety of AI fields. Computing abductive solutions has, however, shown to be highly intractable in logic programming. To avoid this intractability we present a new approach to logic-based abduction; through the geometrical view of data employed in array-based logic we embrace abduction in a simple structural operation. We argue that a theory of abduction on this form allows for an implementation which, at runtime, can perform abductive inference quite efficiently on arbitrary rules of logic representing knowledge of finite domains.
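The structural view of abduction over finite domains can be caricatured by brute-force enumeration: keep exactly the assignments that satisfy the rules and make the observation true. This is, of course, the inefficient baseline that compact array representations improve on; the names and rules below are invented.

```python
from itertools import product

def abduce(variables, rules, observation):
    """Brute-force abduction over Boolean domains: return every assignment
    (world) that satisfies all rules and makes the observation true."""
    explanations = []
    for values in product([False, True], repeat=len(variables)):
        world = dict(zip(variables, values))
        if all(rule(world) for rule in rules) and observation(world):
            explanations.append(world)
    return explanations

# Rules: rain -> wet, sprinkler -> wet. Observation: the grass is wet.
rules = [lambda w: (not w["rain"]) or w["wet"],
         lambda w: (not w["sprinkler"]) or w["wet"]]
worlds = abduce(["rain", "sprinkler", "wet"], rules, lambda w: w["wet"])
# Four consistent worlds remain, all with wet = True.
```

Array-based logic stores the solution space as a multidimensional array, so this "filter the worlds" step becomes a single structural operation rather than an exponential enumeration.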

  1. Making inferences about patterns in wildlife tourism activities in Scotland using social media

    Mancini, Francesca

    2016-01-01

The PowerPoint file contains the slides of my talk for the BES 2016 annual meeting. The title of the talk is: "Making inferences about patterns in wildlife tourism activities in Scotland using social media".

  2. Structural Inference in the Art of Violin Making.

    Morse-Fortier, Leonard Joseph

The "secrets" of success of early Italian violins have long been sought. Among their many efforts to reproduce the results of Stradivari, Guarneri, and Amati, luthiers have attempted to order and match natural resonant frequencies in the free violin plates. This tap-tone plate tuning technique is simply an eigenvalue extraction scheme. In the final stages of carving, the violin maker complements considerable intuitive knowledge of violin plate structure and of modal attributes with tap-tone frequency estimates to better understand plate structure and to inform decisions about plate carving and completeness. Examining the modal attributes of violin plates, this work develops and incorporates an impulse-response scheme for modal inference, measures resonant frequencies and modeshapes for a pair of violin plates, and presents modeshapes through a unique computer visualization scheme developed specifically for this purpose. The work explores, through simple examples, questions of how plate modal attributes reflect underlying structure, and questions about the so-called evolution of modeshapes and frequencies through assembly of the violin. Separately, the work develops computer code for a carved, anisotropic, plate/shell finite element. Solutions are found to the static displacement and free-vibration eigenvalue problems for an orthotropic plate, and used to verify element accuracy. Finally, a violin back plate is modelled with full consideration of plate thickness and arching. Model estimates for modal attributes compare very well against experimentally acquired values. Finally, the modal synthesis technique is applied to predicting the modal attributes of the violin top plate with ribs attached from those of the top plate alone, and with an estimate of rib mass and stiffness. This last analysis serves to verify the modal synthesis method, and to quantify its limits of applicability in attempting to solve problems with severe structural modification. Conclusions…

  3. Likelihood-Based Inference of B Cell Clonal Families.

    Duncan K Ralph

    2016-10-01

The human immune system depends on a highly diverse collection of antibody-making B cells. B cell receptor sequence diversity is generated by a random recombination process called "rearrangement", which forms progenitor B cells, followed by a Darwinian process of lineage diversification and selection called "affinity maturation." The resulting receptors can be sequenced in high throughput for research and diagnostics. Such a collection of sequences contains a mixture of various lineages, each of which may be quite numerous, or may consist of only a single member. As a step to understanding the process and result of this diversification, one may wish to reconstruct lineage membership, i.e. to cluster sampled sequences according to which came from the same rearrangement events. We call this clustering problem "clonal family inference." In this paper we describe and validate a likelihood-based framework for clonal family inference based on a multi-hidden Markov Model (multi-HMM) framework for B cell receptor sequences. We describe an agglomerative algorithm to find a maximum likelihood clustering, two approximate algorithms with various trade-offs of speed versus accuracy, and a third, fast algorithm for finding specific lineages. We show that under simulation these algorithms greatly improve upon existing clonal family inference methods, and that they also give significantly different clusters than previous methods when applied to two real data sets.

  4. Inference-based procedural modeling of solids

    Biggers, Keith

    2011-11-01

As virtual environments become larger and more complex, there is an increasing need for more automated construction algorithms to support the development process. We present an approach for modeling solids by combining prior examples with a simple sketch. Our algorithm uses an inference-based approach to incrementally fit patches together in a consistent fashion to define the boundary of an object. This algorithm samples and extracts surface patches from input models, and develops a Petri net structure that describes the relationship between patches along an imposed parameterization. Then, given a new parameterized line or curve, we use the Petri net to logically fit patches together in a manner consistent with the input model. This allows us to easily construct objects of varying sizes and configurations using arbitrary articulation, repetition, and interchanging of parts. The result of our process is a solid model representation of the constructed object that can be integrated into a simulation-based environment.

  5. Using statistical inference for decision making in best estimate analyses

    Sermer, P.; Weaver, K.; Hoppe, F.; Olive, C.; Quach, D.

    2008-01-01

    For broad classes of safety analysis problems, one needs to make decisions when faced with randomly varying quantities which are also subject to errors. The means for doing this involves a statistical approach which takes into account the nature of the physical problems, and the statistical constraints they impose. We describe the methodology for doing this which has been developed at Nuclear Safety Solutions, and we draw some comparisons to other methods which are commonly used in Canada and internationally. Our methodology has the advantages of being robust and accurate and compares favourably to other best estimate methods. (author)

  6. Causal Inference for Cross-Modal Action Selection: A Computational Study in a Decision Making Framework.

    Daemi, Mehdi; Harris, Laurence R; Crawford, J Douglas

    2016-01-01

Animals try to make sense of sensory information from multiple modalities by categorizing it into perceptions of individual or multiple external objects or internal concepts. For example, the brain constructs sensory, spatial representations of the locations of visual and auditory stimuli in the visual and auditory cortices based on retinal and cochlear stimulations. Currently, it is not known how the brain compares the temporal and spatial features of these sensory representations to decide whether they originate from the same or separate sources in space. Here, we propose a computational model of how the brain might solve such a task. We reduce the visual and auditory information to time-varying, finite-dimensional signals. We introduce controlled, leaky integrators as working memory that retains the sensory information for the limited time-course of task implementation. We propose our model within an evidence-based, decision-making framework, where the alternative plan units are saliency maps of space. A spatiotemporal similarity measure, computed directly from the unimodal signals, is suggested as the criterion to infer common or separate causes. We provide simulations that (1) validate our model against behavioral, experimental results in tasks where the participants were asked to report common or separate causes for cross-modal stimuli presented with arbitrary spatial and temporal disparities, (2) predict the behavior in novel experiments where stimuli have different combinations of spatial, temporal, and reliability features, and (3) illustrate the dynamics of the proposed internal system. These results confirm our spatiotemporal similarity measure as a viable criterion for causal inference, and our decision-making framework as a viable mechanism for target selection, which may be used by the brain in cross-modal situations. Further, we suggest that a similar approach can be extended to other cognitive problems where working memory is a limiting factor, such…
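The role of a spatiotemporal similarity criterion in this kind of causal inference can be sketched as follows; the RMS measure and threshold are stand-ins for the model's actual criterion:

```python
def infer_cause(visual, auditory, threshold=1.0):
    """Decide 'common' vs 'separate' cause from a spatiotemporal similarity
    criterion: here, the root-mean-square distance between two equally
    sampled position signals. Measure and threshold are illustrative."""
    n = len(visual)
    rms = (sum((v - a) ** 2 for v, a in zip(visual, auditory)) / n) ** 0.5
    return "common" if rms < threshold else "separate"

# Small spatial/temporal disparity -> the signals are attributed to one source.
verdict = infer_cause([0.0, 0.1, 0.2], [0.05, 0.12, 0.21])
```

Large disparities push the measure past the threshold and the stimuli are bound to separate sources, mirroring the behavioral reports the model is validated against.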

  7. Contingency inferences driven by base rates: Valid by sampling

    Florian Kutzner

    2011-04-01

Fiedler et al. (2009) reviewed evidence for the utilization of a contingency inference strategy termed pseudocontingencies (PCs). In PCs, the more frequent levels (and, by implication, the less frequent levels) are assumed to be associated. PCs have been obtained using a wide range of task settings and dependent measures. Yet, the readiness with which decision makers rely on PCs is poorly understood. A computer simulation explored two potential sources of subjective validity of PCs. First, PCs are shown to perform above chance level when the task is to infer the sign of moderate to strong population contingencies from a sample of observations. Second, contingency inferences based on PCs and inferences based on cell frequencies are shown to partially agree across samples. Intriguingly, this criterion and convergent validity are by-products of random sampling error, highlighting the inductive nature of contingency inferences.
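The simulation logic described (PC sign inference from marginal skews, compared against cell-based inference, on samples from a known population contingency) can be reproduced in miniature; the cell probabilities and sample size below are invented:

```python
import random

def sample_table(p11, p10, p01, p00, n, rng):
    """Draw n observations from a 2x2 joint distribution; return cell counts."""
    counts = {"11": 0, "10": 0, "01": 0, "00": 0}
    cells, probs = ["11", "10", "01", "00"], [p11, p10, p01, p00]
    for _ in range(n):
        r, acc = rng.random(), 0.0
        for cell, p in zip(cells, probs):
            acc += p
            if r < acc:
                break
        counts[cell] += 1
    return counts

def pc_sign(c):
    """Pseudocontingency: infer a positive contingency iff both marginals
    are skewed toward the same level."""
    skew_x = c["11"] + c["10"] - c["01"] - c["00"]
    skew_y = c["11"] + c["01"] - c["10"] - c["00"]
    return 1 if skew_x * skew_y > 0 else -1

def cell_sign(c):
    """Genuine cell-based inference: sign of the cross-product difference."""
    return 1 if c["11"] * c["00"] > c["10"] * c["01"] else -1

rng = random.Random(1)
pop = (0.55, 0.15, 0.10, 0.20)   # positive contingency, both marginals skewed
samples = [sample_table(*pop, n=25, rng=rng) for _ in range(500)]
pc_correct = sum(pc_sign(c) == 1 for c in samples) / len(samples)
agreement = sum(pc_sign(c) == cell_sign(c) for c in samples) / len(samples)
```

In skewed environments like this one, the PC rule infers the correct sign well above chance and largely agrees with the cell-based strategy, which is the paper's "validity by sampling" point.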

  8. Detective Questions: A Strategy for Improving Inference-Making in Children With Mild Disabilities

    Jiménez-Fernández, Gracia

    2015-01-01

    One of the most frequent problems in reading comprehension is the difficulty in making inferences from the text, especially for students with mild disabilities (i.e., children with learning disabilities or with high-functioning autism). It is essential, therefore, that educators include the teaching of reading strategies to improve their students'…

  9. Surrogate based approaches to parameter inference in ocean models

    Knio, Omar

    2016-01-06

    This talk discusses the inference of physical parameters using model surrogates. Attention is focused on the use of sampling schemes to build suitable representations of the dependence of the model response on uncertain input data. Non-intrusive spectral projections and regularized regressions are used for this purpose. A Bayesian inference formalism is then applied to update the uncertain inputs based on available measurements or observations. To perform the update, we consider two alternative approaches, based on the application of Markov Chain Monte Carlo methods or of adjoint-based optimization techniques. We outline the implementation of these techniques to infer dependence of wind drag, bottom drag, and internal mixing coefficients.

  10. Surrogate based approaches to parameter inference in ocean models

    Knio, Omar

    2016-01-01

    This talk discusses the inference of physical parameters using model surrogates. Attention is focused on the use of sampling schemes to build suitable representations of the dependence of the model response on uncertain input data. Non-intrusive spectral projections and regularized regressions are used for this purpose. A Bayesian inference formalism is then applied to update the uncertain inputs based on available measurements or observations. To perform the update, we consider two alternative approaches, based on the application of Markov Chain Monte Carlo methods or of adjoint-based optimization techniques. We outline the implementation of these techniques to infer dependence of wind drag, bottom drag, and internal mixing coefficients.
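The surrogate-plus-MCMC workflow outlined in this abstract can be sketched end to end: a cheap stand-in response surface plays the role of the fitted surrogate, and a Metropolis chain updates a drag-like parameter from one noisy observation. Everything numeric here is invented.

```python
import math, random

def surrogate(c):
    """Cheap polynomial stand-in for the expensive model response."""
    return 2.0 * c + 0.5 * c * c

def log_post(c, y_obs, sigma=0.1):
    """Log-posterior: uniform prior on [0, 2], Gaussian likelihood around
    the surrogate prediction."""
    if not 0.0 <= c <= 2.0:
        return -math.inf
    r = y_obs - surrogate(c)
    return -0.5 * (r / sigma) ** 2

def metropolis(y_obs, n_steps=5000, step=0.1, seed=0):
    """Random-walk Metropolis chain over the surrogate-based posterior."""
    rng = random.Random(seed)
    c, lp = 1.0, log_post(1.0, y_obs)
    chain = []
    for _ in range(n_steps):
        cand = c + rng.gauss(0.0, step)
        lp_cand = log_post(cand, y_obs)
        if math.log(rng.random() + 1e-300) < lp_cand - lp:
            c, lp = cand, lp_cand
        chain.append(c)
    return chain

chain = metropolis(y_obs=surrogate(0.8))        # synthetic observation, truth 0.8
posterior_mean = sum(chain[1000:]) / len(chain[1000:])
```

Because each posterior evaluation only touches the surrogate, the chain is cheap to run; the adjoint-based alternative mentioned in the talk would instead optimize the same posterior directly.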

  11. Application of fuzzy inference system to increase efficiency of management decision-making in agricultural enterprises

    Balanovskаya, Tetiana Ivanovna; Boretska, Zoreslava Petrovna

    2014-01-01

Theoretical and methodological issues, as well as practical recommendations for improving management decision-making in agricultural enterprises to increase their competitiveness, are developed in the article. A simulation example of a quality management system for agricultural products based on the theory of fuzzy sets and fuzzy logic is proposed...
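A fuzzy inference system of the kind proposed can be sketched with two Mamdani-style rules for product quality; the membership functions and rules below are invented for illustration, not taken from the article:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_quality(defect_rate, freshness):
    """Two Mamdani-style rules (invented):
      1. IF defects low AND freshness high THEN quality high
      2. IF defects high OR freshness low THEN quality low
    Defuzzified as the rule-strength-weighted average of the output
    levels (high = 1.0, low = 0.0)."""
    low_defects = tri(defect_rate, -0.1, 0.0, 0.5)
    high_defects = tri(defect_rate, 0.2, 1.0, 1.1)
    high_fresh = tri(freshness, 0.5, 1.0, 1.5)
    low_fresh = tri(freshness, -0.5, 0.0, 0.5)
    w_high = min(low_defects, high_fresh)    # fuzzy AND -> min
    w_low = max(high_defects, low_fresh)     # fuzzy OR  -> max
    if w_high + w_low == 0.0:
        return 0.5                           # no rule fires: neutral output
    return (w_high * 1.0 + w_low * 0.0) / (w_high + w_low)

score = fuzzy_quality(defect_rate=0.1, freshness=0.9)   # near 1.0: high quality
```

The appeal for management decisions is that expert judgments ("defects are low") map directly onto rules, while the defuzzified score still supports a crisp accept/reject decision.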

  12. IELTS ACADEMIC READING ACHIEVEMENT: THE CONTRIBUTION OF INFERENCE-MAKING AND EVALUATION OF ARGUMENTS

    Afsaneh Ghanizadeh; Azam Vahidian Pour; Akram Hosseini

    2017-01-01

    The pivotal undertaking of education today is to endow individuals with the capacity to think flexibly, reason rationally, and keep an open mind when evaluating and interpreting situations. In line with studies demonstrating a positive relationship between higher-order thinking skills and academic achievement, this study examined the impact of two subcomponents of critical thinking, i.e., inference-making and evaluation of arguments, on academic IELT...

  13. An inferentialist perspective on the coordination of actions and reasons involved in making a statistical inference

    Bakker, Arthur; Ben-Zvi, Dani; Makar, Katie

    2017-12-01

    To understand how statistical and other types of reasoning are coordinated with actions to reduce uncertainty, we conducted a case study in vocational education that involved statistical hypothesis testing. We analyzed an intern's research project in a hospital laboratory in which reducing uncertainties was crucial to make a valid statistical inference. In his project, the intern, Sam, investigated whether patients' blood could be sent through pneumatic post without influencing the measurement of particular blood components. We asked, in the process of making a statistical inference, how are reasons and actions coordinated to reduce uncertainty? For the analysis, we used the semantic theory of inferentialism, specifically, the concept of webs of reasons and actions—complexes of interconnected reasons for facts and actions; these reasons include premises and conclusions, inferential relations, implications, motives for action, and utility of tools for specific purposes in a particular context. Analysis of interviews with Sam, his supervisor and teacher as well as video data of Sam in the classroom showed that many of Sam's actions aimed to reduce variability, rule out errors, and thus reduce uncertainties so as to arrive at a valid inference. Interestingly, the decisive factor was not the outcome of a t test but of the reference change value, a clinical chemical measure of analytic and biological variability. With insights from this case study, we expect that students can be better supported in connecting statistics with context and in dealing with uncertainty.

  14. Preverbal Infants Infer Third-Party Social Relationships Based on Language

    Liberman, Zoe; Woodward, Amanda L.; Kinzler, Katherine D.

    2017-01-01

    Language provides rich social information about its speakers. For instance, adults and children make inferences about a speaker's social identity, geographic origins, and group membership based on her language and accent. Although infants prefer speakers of familiar languages (Kinzler, Dupoux, & Spelke, 2007), little is known about the…

  15. Assessing school-aged children's inference-making: the effect of story test format in listening comprehension.

    Freed, Jenny; Cain, Kate

    2017-01-01

    Comprehension is critical for classroom learning and educational success. Inferences are integral to good comprehension: successful comprehension requires the listener to generate local coherence inferences, which involve integrating information between clauses, and global coherence inferences, which involve integrating textual information with background knowledge to infer motivations, themes, etc. A central priority for the diagnosis of comprehension difficulties and our understanding of why these difficulties arise is the development of valid assessment instruments. We explored typically developing children's ability to make local and global coherence inferences using a novel assessment of listening comprehension. The aims were to determine whether children were more likely to make the target inferences when these were asked during story presentation versus after presentation of the story, and whether there were any age differences between conditions. Children in Years 3 (n = 29) and 5 (n = 31) listened to short stories presented either in a segmented format, in which questions to assess local and global coherence inferences were asked at specific points during story presentation, or in a whole format, when all the questions were asked after the story had been presented. There was developmental progression between age groups for both types of inference question. Children also scored higher on the global coherence inference questions than the local coherence inference questions. There was a benefit of the segmented format for younger children, particularly for the local inference questions. The results suggest that children are more likely to make target inferences if prompted during presentation of the story, and that this format is particularly facilitative for younger children and for local coherence inferences. 
This has implications for the design of comprehension assessments as well as for supporting children with comprehension difficulties in the classroom.

  16. Compression-based inference on graph data

    Bloem, P.; van den Bosch, A.; Heskes, T.; van Leeuwen, D.

    2013-01-01

    We investigate the use of compression-based learning on graph data. General purpose compressors operate on bitstrings or other sequential representations. A single graph can be represented sequentially in many ways, which may influence the performance of sequential compressors. Using Normalized
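The record above appears to build on the Normalized Compression Distance (NCD). A minimal sketch of the idea, using zlib as the general-purpose compressor and toy edge-list strings as sequential graph encodings (the encodings are invented for the example):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance approximated with zlib:
    (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy sequential encodings of graphs: two identical edge lists
# versus one with relabeled edges.
g1 = b"0-1 0-2 1-2 2-3 3-4 " * 20
g2 = b"0-1 0-2 1-2 2-3 3-4 " * 20
g3 = b"5-9 1-7 2-8 0-6 3-4 " * 20

similar = ncd(g1, g2)    # near 0: the concatenation compresses almost as well
different = ncd(g1, g3)  # larger: the parts share little structure
```

The choice of sequential encoding matters, which is exactly the sensitivity the abstract points to.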

  17. Statistical inference based on latent ability estimates

    Hoijtink, H.J.A.; Boomsma, A.

    The quality of approximations to first and second order moments (e.g., statistics like means, variances, regression coefficients) based on latent ability estimates is discussed. The ability estimates are obtained using either the Rasch or the two-parameter logistic model. Straightforward use
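For reference, a minimal sketch of the Rasch model mentioned above, with a maximum-likelihood ability estimate obtained by Newton-Raphson. The item difficulties and response pattern are invented for the example:

```python
import math

def p_correct(theta, b):
    """Rasch model: probability of a correct response given ability theta
    and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def ml_ability(responses, difficulties, iters=50):
    """Maximum-likelihood ability estimate via Newton-Raphson.
    Assumes a mixed response pattern (not all correct or all incorrect)."""
    theta = 0.0
    for _ in range(iters):
        # Score function: observed minus expected number correct.
        grad = sum(x - p_correct(theta, b)
                   for x, b in zip(responses, difficulties))
        # Fisher information: sum of p(1 - p) over items.
        info = sum(p_correct(theta, b) * (1.0 - p_correct(theta, b))
                   for b in difficulties)
        theta += grad / info
    return theta

# Three correct answers out of four items of increasing difficulty.
theta_hat = ml_ability([1, 1, 1, 0], [-1.0, -0.5, 0.5, 1.0])
```

Downstream statistics computed from such point estimates inherit their estimation error, which is the issue the record above examines.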

  18. Individual based population inference using tagging data

    Pedersen, Martin Wæver; Thygesen, Uffe Høgsbro; Baktoft, Henrik

    A hierarchical framework for simultaneous analysis of multiple related individual datasets is presented. The approach is very similar to mixed effects modelling as known from statistical theory. The model used at the individual level is, in principle, irrelevant as long as a maximum likelihood...... estimate and its uncertainty (Hessian) can be computed. The individual model used in this text is a hidden Markov model. A simulation study concerning a two-dimensional biased random walk is examined to verify the consistency of the hierarchical estimation framework. In addition, a study based on acoustic...... telemetry data from pike illustrates how the framework can identify individuals that deviate from the remaining population....

  19. Access Control Based on Trail Inference

    ALBARELO, P. C.

    2015-06-01

    Professionals are constantly seeking qualification and consequently increasing their knowledge in their area of expertise. Thus, it is interesting to develop a computer system that knows its users and their work history. Using this information, even in the case of professional role change, the system could allow the renewed authorization for activities, based on previously authorized use. This article proposes a model for user access control that is embedded in a context-aware environment. The model applies the concept of trails to manage access control, recording activities usage in contexts and applying this history as a criterion to grant new accesses. Despite the fact that previous related research works consider contexts, none of them uses the concept of trails. Hence, the main contribution of this work is the use of a new access control criterion, namely, the history of previous accesses (trails). A prototype was implemented and applied in an evaluation based on scenarios. The results demonstrate the feasibility of the proposal, allowing for access control systems to use an alternative way to support access rights.

  20. SPEEDY: An Eclipse-based IDE for invariant inference

    David R. Cok

    2014-04-01

    SPEEDY is an Eclipse-based IDE for exploring techniques that assist users in generating correct specifications, particularly including invariant inference algorithms and tools. It integrates with several back-end tools that propose invariants and will incorporate published algorithms for inferring object and loop invariants. Though the architecture is language-neutral, current SPEEDY targets C programs. Building and using SPEEDY has confirmed earlier experience demonstrating the importance of showing and editing specifications in the IDEs that developers customarily use, automating as much of the production and checking of specifications as possible, and showing counterexample information directly in the source code editing environment. As in previous work, automation of specification checking is provided by back-end SMT solvers. However, reducing the effort demanded of software developers using formal methods also requires a GUI design that guides users in writing, reviewing, and correcting specifications and automates specification inference.

  1. Likelihood-based inference for clustered line transect data

    Waagepetersen, Rasmus; Schweder, Tore

    2006-01-01

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...
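As background to the likelihood-based approach in the record above, here is a sketch of the simplest line-transect likelihood component: a half-normal detection function fitted to perpendicular sighting distances by maximum likelihood. This is the standard textbook ingredient, not the paper's shot-noise Cox process model, and the distances are invented:

```python
import math

def halfnormal_loglik(sigma, distances):
    """Log-likelihood of perpendicular sighting distances under a half-normal
    detection model, f(y) = sqrt(2/pi) / sigma * exp(-y^2 / (2 sigma^2))."""
    n = len(distances)
    return (n * (0.5 * math.log(2.0 / math.pi) - math.log(sigma))
            - sum(y * y for y in distances) / (2.0 * sigma * sigma))

def mle_sigma(distances):
    """Closed-form maximum-likelihood estimate: sigma^2 = mean of y^2
    (set the derivative of the log-likelihood in sigma to zero)."""
    return math.sqrt(sum(y * y for y in distances) / len(distances))

distances = [0.1, 0.2, 0.3]   # perpendicular distances, arbitrary units
sigma_hat = mle_sigma(distances)
```

Clustering in the animal population does not change this detection-function fit, but it inflates the variance of the resulting density estimate, which is what the Cox process modeling addresses.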

  2. Likelihood-based inference for clustered line transect data

    Waagepetersen, Rasmus Plenge; Schweder, Tore

    The uncertainty in estimation of spatial animal density from line transect surveys depends on the degree of spatial clustering in the animal population. To quantify the clustering we model line transect data as independent thinnings of spatial shot-noise Cox processes. Likelihood-based inference...

  3. Likelihood based inference for partially observed renewal processes

    van Lieshout, Maria Nicolette Margaretha

    2016-01-01

    This paper is concerned with inference for renewal processes on the real line that are observed in a broken interval. For such processes, the classic history-based approach cannot be used. Instead, we adapt tools from sequential spatial point process theory to propose a Monte Carlo maximum

  4. Bayesian inference and decision theory - A framework for decision making in natural resource management

    Dorazio, R.M.; Johnson, F.A.

    2003-01-01

    Bayesian inference and decision theory may be used in the solution of relatively complex problems of natural resource management, owing to recent advances in statistical theory and computing. In particular, Markov chain Monte Carlo algorithms provide a computational framework for fitting models of adequate complexity and for evaluating the expected consequences of alternative management actions. We illustrate these features using an example based on management of waterfowl habitat.

  5. Functional networks inference from rule-based machine learning models.

    Lazzarini, Nicola; Widera, Paweł; Williamson, Stuart; Heer, Rakesh; Krasnogor, Natalio; Bacardit, Jaume

    2016-01-01

    Functional networks play an important role in the analysis of biological processes and systems. The inference of these networks from high-throughput (-omics) data is an area of intense research. So far, the similarity-based inference paradigm (e.g. gene co-expression) has been the most popular approach. It assumes a functional relationship between genes which are expressed at similar levels across different samples. An alternative to this paradigm is the inference of relationships from the structure of machine learning models. These models are able to capture complex relationships between variables, that often are different/complementary to the similarity-based methods. We propose a protocol to infer functional networks from machine learning models, called FuNeL. It assumes, that genes used together within a rule-based machine learning model to classify the samples, might also be functionally related at a biological level. The protocol is first tested on synthetic datasets and then evaluated on a test suite of 8 real-world datasets related to human cancer. The networks inferred from the real-world data are compared against gene co-expression networks of equal size, generated with 3 different methods. The comparison is performed from two different points of view. We analyse the enriched biological terms in the set of network nodes and the relationships between known disease-associated genes in a context of the network topology. The comparison confirms both the biological relevance and the complementary character of the knowledge captured by the FuNeL networks in relation to similarity-based methods and demonstrates its potential to identify known disease associations as core elements of the network. Finally, using a prostate cancer dataset as a case study, we confirm that the biological knowledge captured by our method is relevant to the disease and consistent with the specialised literature and with an independent dataset not used in the inference process. The

  6. Decision-making when data and inferences are not conclusive: risk-benefit and acceptable regret approach.

    Hozo, Iztok; Schell, Michael J; Djulbegovic, Benjamin

    2008-07-01

    The absolute truth in research is unobtainable, as no evidence or research hypothesis is ever 100% conclusive. Therefore, all data and inferences can in principle be considered as "inconclusive." Scientific inference and decision-making need to take into account errors, which are unavoidable in the research enterprise. The errors can occur at the level of conclusions that aim to discern the truthfulness of research hypothesis based on the accuracy of research evidence and hypothesis, and decisions, the goal of which is to enable optimal decision-making under present and specific circumstances. To optimize the chance of both correct conclusions and correct decisions, the synthesis of all major statistical approaches to clinical research is needed. The integration of these approaches (frequentist, Bayesian, and decision-analytic) can be accomplished through formal risk:benefit (R:B) analysis. This chapter illustrates the rational choice of a research hypothesis using R:B analysis based on decision-theoretic expected utility theory framework and the concept of "acceptable regret" to calculate the threshold probability of the "truth" above which the benefit of accepting a research hypothesis outweighs its risks.
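The threshold idea in the record above can be made concrete with the standard decision-analytic threshold form, in which the probability of the hypothesis being true must exceed harm/(harm + benefit) before acceptance pays off in expectation. The utility values below are invented; a full acceptable-regret analysis would refine the threshold further:

```python
def threshold_probability(benefit, harm):
    """Probability of the hypothesis being true above which accepting it
    yields higher expected utility than rejecting it:
    p* = harm / (harm + benefit)."""
    return harm / (harm + benefit)

def accept_hypothesis(p_true, benefit, harm):
    """Accept the research hypothesis when its probability of truth
    clears the risk:benefit threshold."""
    return p_true > threshold_probability(benefit, harm)

# With benefits four times the harms, acting is justified above p* = 0.2.
p_star = threshold_probability(4.0, 1.0)
```

The point of the framework is that "inconclusive" evidence can still support a rational decision, provided the estimated probability of truth is compared against an explicit threshold rather than an arbitrary significance level.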

  7. Bootstrap-Based Inference for Cube Root Consistent Estimators

    Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi

    This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known...... to be inconsistent. Our method restores consistency of the nonparametric bootstrap by altering the shape of the criterion function defining the estimator whose distribution we seek to approximate. This modification leads to a generic and easy-to-implement resampling method for inference that is conceptually distinct...... from other available distributional approximations based on some form of modified bootstrap. We offer simulation evidence showcasing the performance of our inference method in finite samples. An extension of our methodology to general M-estimation problems is also discussed....

  8. Bootstrap-based Support of HGT Inferred by Maximum Parsimony

    Nakhleh Luay

    2010-05-01

    Background: Maximum parsimony is one of the most commonly used criteria for reconstructing phylogenetic trees. Recently, Nakhleh and co-workers extended this criterion to enable reconstruction of phylogenetic networks, and demonstrated its application to detecting reticulate evolutionary relationships. However, one of the major problems with this extension has been that it favors more complex evolutionary relationships over simpler ones, thus having the potential for overestimating the amount of reticulation in the data. An ad hoc solution to this problem that has been used entails inspecting the improvement in the parsimony length as more reticulation events are added to the model, and stopping when the improvement is below a certain threshold. Results: In this paper, we address this problem in a more systematic way, by proposing a nonparametric bootstrap-based measure of support of inferred reticulation events, and using it to determine the number of those events, as well as their placements. A number of samples is generated from the given sequence alignment, and reticulation events are inferred based on each sample. Finally, the support of each reticulation event is quantified based on the inferences made over all samples. Conclusions: We have implemented our method in the NEPAL software tool (available publicly at http://bioinfo.cs.rice.edu/), and studied its performance on both biological and simulated data sets. While our studies show very promising results, they also highlight issues that are inherently challenging when applying the maximum parsimony criterion to detect reticulate evolution.

  9. Bootstrap-based support of HGT inferred by maximum parsimony.

    Park, Hyun Jung; Jin, Guohua; Nakhleh, Luay

    2010-05-05

    Maximum parsimony is one of the most commonly used criteria for reconstructing phylogenetic trees. Recently, Nakhleh and co-workers extended this criterion to enable reconstruction of phylogenetic networks, and demonstrated its application to detecting reticulate evolutionary relationships. However, one of the major problems with this extension has been that it favors more complex evolutionary relationships over simpler ones, thus having the potential for overestimating the amount of reticulation in the data. An ad hoc solution to this problem that has been used entails inspecting the improvement in the parsimony length as more reticulation events are added to the model, and stopping when the improvement is below a certain threshold. In this paper, we address this problem in a more systematic way, by proposing a nonparametric bootstrap-based measure of support of inferred reticulation events, and using it to determine the number of those events, as well as their placements. A number of samples is generated from the given sequence alignment, and reticulation events are inferred based on each sample. Finally, the support of each reticulation event is quantified based on the inferences made over all samples. We have implemented our method in the NEPAL software tool (available publicly at http://bioinfo.cs.rice.edu/), and studied its performance on both biological and simulated data sets. While our studies show very promising results, they also highlight issues that are inherently challenging when applying the maximum parsimony criterion to detect reticulate evolution.
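The resample-and-count scheme described above can be sketched generically: draw bootstrap replicates of the alignment columns, re-run the inference on each, and report the fraction of replicates in which each event recurs. The inference function below is a toy stand-in (real use would re-run the parsimony network search on each replicate):

```python
import random

random.seed(1)

def bootstrap_support(alignment_columns, infer_events, n_samples=200):
    """Support for each inferred event = fraction of bootstrap replicates
    (columns resampled with replacement) in which the event recurs."""
    counts = {}
    n = len(alignment_columns)
    for _ in range(n_samples):
        sample = [random.choice(alignment_columns) for _ in range(n)]
        for event in infer_events(sample):
            counts[event] = counts.get(event, 0) + 1
    return {e: c / n_samples for e, c in counts.items()}

# Toy stand-in for HGT inference: report an event when a majority of the
# (here binary) columns support it.
def toy_infer(columns):
    return {"HGT(A->B)"} if sum(columns) > len(columns) / 2 else set()

support = bootstrap_support([1, 1, 1, 1, 0, 0], toy_infer)
```

Events with low support across replicates are the ones the method flags as likely artifacts of the parsimony criterion's bias toward extra reticulations.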

  10. Algorithms for MDC-Based Multi-locus Phylogeny Inference

    Yu, Yun; Warnow, Tandy; Nakhleh, Luay

    One of the criteria for inferring a species tree from a collection of gene trees, when gene tree incongruence is assumed to be due to incomplete lineage sorting (ILS), is minimize deep coalescence, or MDC. Exact algorithms for inferring the species tree from rooted, binary trees under MDC were recently introduced. Nevertheless, in phylogenetic analyses of biological data sets, estimated gene trees may differ from true gene trees, be incompletely resolved, and not necessarily rooted. In this paper, we propose new MDC formulations for the cases where the gene trees are unrooted/binary, rooted/non-binary, and unrooted/non-binary. Further, we prove structural theorems that allow us to extend the algorithms for the rooted/binary gene tree case to these cases in a straightforward manner. Finally, we study the performance of these methods in coalescent-based computer simulations.

  11. EEG Based Inference of Spatio-Temporal Brain Dynamics

    Hansen, Sofie Therese

    Electroencephalography (EEG) provides a measure of brain activity and has improved our understanding of the brain immensely. However, there is still much to be learned and the full potential of EEG is yet to be realized. In this thesis we suggest to improve the information gain of EEG using three...... different approaches; 1) by recovery of the EEG sources, 2) by representing and inferring the propagation path of EEG sources, and 3) by combining EEG with functional magnetic resonance imaging (fMRI). The common goal of the methods, and thus of this thesis, is to improve the spatial dimension of EEG...... recovery ability. The forward problem describes the propagation of neuronal activity in the brain to the EEG electrodes on the scalp. The geometry and conductivity of the head layers are normally required to model this path. We propose a framework for inferring forward models which is based on the EEG...

  12. Generating inferences from knowledge structures based on general automata

    Koenig, E C

    1983-01-01

    The author shows that the model for knowledge structures for computers based on general automata accommodates procedures for establishing inferences. Algorithms are presented which generate inferences as output of a computer when its sentence input names appropriate knowledge elements contained in an associated knowledge structure already stored in the memory of the computer. The inferences are found to have either a single graph tuple or more than one graph tuple of associated knowledge. Six algorithms pertain to a single graph tuple and a seventh pertains to more than one graph tuple of associated knowledge. A named term is either the automaton, environment, auxiliary receptor, principal receptor, auxiliary effector, or principal effector. The algorithm pertaining to more than one graph tuple requires that the input sentence names the automaton, transformation response, and environment of one of the tuples of associated knowledge in a sequence of tuples. Interaction with the computer may be either in a conversation or examination mode. The algorithms are illustrated by an example. 13 references.

  13. Inference-Based Surface Reconstruction of Cluttered Environments

    Biggers, K.

    2012-08-01

    We present an inference-based surface reconstruction algorithm that is capable of identifying objects of interest among a cluttered scene, and reconstructing solid model representations even in the presence of occluded surfaces. Our proposed approach incorporates a predictive modeling framework that uses a set of user-provided models for prior knowledge, and applies this knowledge to the iterative identification and construction process. Our approach uses a local to global construction process guided by rules for fitting high-quality surface patches obtained from these prior models. We demonstrate the application of this algorithm on several example data sets containing heavy clutter and occlusion. © 2012 IEEE.

  14. Pre-Service Mathematics Teachers' Use of Probability Models in Making Informal Inferences about a Chance Game

    Kazak, Sibel; Pratt, Dave

    2017-01-01

    This study considers probability models as tools for both making informal statistical inferences and building stronger conceptual connections between data and chance topics in teaching statistics. In this paper, we aim to explore pre-service mathematics teachers' use of probability models for a chance game, where the sum of two dice matters in…

  15. Making Inferences: Comprehension of Physical Causality, Intentionality, and Emotions in Discourse by High-Functioning Older Children, Adolescents, and Adults with Autism

    Bodner, Kimberly E.; Engelhardt, Christopher R.; Minshew, Nancy J.; Williams, Diane L.

    2015-01-01

    Studies investigating inferential reasoning in autism spectrum disorder (ASD) have focused on the ability to make socially-related inferences or inferences more generally. Important variables for intervention planning such as whether inferences depend on physical experiences or the nature of social information have received less consideration. A…

  16. Exemplar-based inference in multi-attribute decision making

    Linnea Karlsson

    2008-03-01

    Several studies propose that exemplar retrieval contributes to multi-attribute decisions. The authors have proposed a process theory enabling a priori predictions of what cognitive representations people use as input to their judgment process (Sigma, for "summation"; P. Juslin, L. Karlsson, and H. Olsson, 2008). According to Sigma, exemplar retrieval is a back-up system used when the task does not allow for additive and linear abstraction and integration of cue-criterion knowledge (e.g., when the task is non-additive). An important question is to what extent such shifts occur spontaneously as part of automatic procedures, such as error-minimization with the Delta rule, or whether they are controlled strategy shifts contingent on the ability to identify a sufficiently successful judgment strategy. In this article, data are reviewed that demonstrate a shift between exemplar memory and cue abstraction, as well as data where the expected shift does not occur. In contrast to a common assumption of previous models, these results suggest a controlled and contingent strategy shift.
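A minimal sketch of the exemplar-retrieval mechanism discussed above: the judgment for a probe is a similarity-weighted average of the criterion values of stored exemplars, with similarity decaying in the number of mismatching cues. The cue patterns, criterion values, and decay parameter are invented for the example and simplify the models in this literature:

```python
def similarity(probe, exemplar, s=0.5):
    """Similarity as s raised to the number of mismatching binary cues
    (a common simplification of exemplar-model similarity functions)."""
    mismatches = sum(a != b for a, b in zip(probe, exemplar))
    return s ** mismatches

def exemplar_judgment(probe, exemplars):
    """Judgment = similarity-weighted average of stored criterion values."""
    weights = [similarity(probe, cues) for cues, _ in exemplars]
    values = [criterion for _, criterion in exemplars]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# Stored exemplars: (cue profile, known criterion value).
stored = [((1, 1, 0), 10.0), ((1, 0, 0), 8.0), ((0, 0, 1), 2.0)]
estimate = exemplar_judgment((1, 1, 0), stored)
```

Unlike cue abstraction, this mechanism needs no additive cue-criterion rules, which is why it can serve as a back-up when the task environment is non-additive.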

  17. Bayesian Inference for Signal-Based Seismic Monitoring

    Moore, D.

    2015-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software, discarding significant information present in the original recorded signal. SIG-VISA (Signal-based Vertically Integrated Seismic Analysis) is a system for global seismic monitoring through Bayesian inference on seismic signals. By modeling signals directly, our forward model is able to incorporate a rich representation of the physics underlying the signal generation process, including source mechanisms, wave propagation, and station response. This allows inference in the model to recover the qualitative behavior of recent geophysical methods including waveform matching and double-differencing, all as part of a unified Bayesian monitoring system that simultaneously detects and locates events from a global network of stations. We demonstrate recent progress in scaling up SIG-VISA to efficiently process the data stream of global signals recorded by the International Monitoring System (IMS), including comparisons against existing processing methods that show increased sensitivity from our signal-based model and in particular the ability to locate events (including aftershock sequences that can tax analyst processing) precisely from waveform correlation effects. We also provide a Bayesian analysis of an alleged low-magnitude event near the DPRK test site in May 2010 [1] [2], investigating whether such an event could plausibly be detected through automated processing in a signal-based monitoring system. [1] Zhang, Miao and Wen, Lianxing. "Seismological Evidence for a Low-Yield Nuclear Test on 12 May 2010 in North Korea". Seismological Research Letters, January/February 2015. [2] Richards, Paul. "A Seismic Event in North Korea on 12 May 2010". CTBTO SnT 2015 oral presentation, video at https://video-archive.ctbto.org/index.php/kmc/preview/partner_id/103/uiconf_id/4421629/entry_id/0_ymmtpps0/delivery/http

  18. Adaptive neuro-fuzzy inference system based automatic generation control

    Hosseini, S.H.; Etemadi, A.H. [Department of Electrical Engineering, Sharif University of Technology, Tehran (Iran)

    2008-07-15

    Fixed gain controllers for automatic generation control are designed at nominal operating conditions and fail to provide best control performance over a wide range of operating conditions. So, to keep system performance near its optimum, it is desirable to track the operating conditions and use updated parameters to compute control gains. A control scheme based on an adaptive neuro-fuzzy inference system (ANFIS), which is trained by the results of off-line studies obtained using particle swarm optimization, is proposed in this paper to optimize and update control gains in real-time according to load variations. Also, frequency relaxation is implemented using ANFIS. The efficiency of the proposed method is demonstrated via simulations. Compliance of the proposed method with NERC control performance standard is verified. (author)
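To make the fuzzy-inference ingredient concrete, here is a toy two-rule Sugeno-style gain scheduler: membership functions map the frequency deviation to "small" and "large", and the control gain is the weighted average of the rule outputs. The membership breakpoints and gain values are invented; the paper's ANFIS learns such parameters from off-line optimization results rather than fixing them by hand:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def control_gain(freq_dev):
    """Weighted-average (Sugeno-style) defuzzification over two toy rules:
    small |deviation| -> low gain, large |deviation| -> high gain.
    Valid for |freq_dev| <= 1.0 with these breakpoints."""
    small = tri(abs(freq_dev), -0.1, 0.0, 0.5)
    large = tri(abs(freq_dev), 0.0, 0.5, 1.1)
    low_gain, high_gain = 0.2, 1.0
    return (small * low_gain + large * high_gain) / (small + large)

g_nominal = control_gain(0.0)   # pure "small" rule fires -> low gain
g_large = control_gain(0.5)     # pure "large" rule fires -> high gain
```

Between the breakpoints both rules fire partially, so the gain interpolates smoothly instead of switching abruptly as a fixed-gain scheme would.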

  19. Higher-level fusion for military operations based on abductive inference: proof of principle

    Pantaleev, Aleksandar V.; Josephson, John

    2006-04-01

    The ability of contemporary military commanders to estimate and understand complicated situations already suffers from information overload, and the situation can only grow worse. We describe a prototype application that uses abductive inferencing to fuse information from multiple sensors to evaluate the evidence for higher-level hypotheses that are close to the levels of abstraction needed for decision making (approximately JDL levels 2 and 3). Abductive inference (abduction, inference to the best explanation) is a pattern of reasoning that occurs naturally in diverse settings such as medical diagnosis, criminal investigations, scientific theory formation, and military intelligence analysis. Because abduction is part of common-sense reasoning, implementations of it can produce reasoning traces that are very human understandable. Automated abductive inferencing can be deployed to augment human reasoning, taking advantage of computation to process large amounts of information, and to bypass limits to human attention and short-term memory. We illustrate the workings of the prototype system by describing an example of its use for small-unit military operations in an urban setting. Knowledge was encoded as it might be captured prior to engagement from a standard military decision making process (MDMP) and analysis of commander's priority intelligence requirements (PIR). The system is able to reasonably estimate the evidence for higher-level hypotheses based on information from multiple sensors. Its inference processes can be examined closely to verify correctness. Decision makers can override conclusions at any level and changes will propagate appropriately.

  20. A Multiobjective Fuzzy Inference System based Deployment Strategy for a Distributed Mobile Sensor Network

    Amol P. Bhondekar

    2010-03-01

    Sensor deployment scheme highly governs the effectiveness of distributed wireless sensor network. Issues such as energy conservation and clustering make the deployment problem much more complex. A multiobjective Fuzzy Inference System based strategy for mobile sensor deployment is presented in this paper. This strategy gives a synergistic combination of energy capacity, clustering and peer-to-peer deployment. Performance of our strategy is evaluated in terms of coverage, uniformity, speed and clustering. Our algorithm is compared against a modified distributed self-spreading algorithm to exhibit better performance.

  1. How to Make Correct Predictions in False Belief Tasks without Attributing False Beliefs: An Analysis of Alternative Inferences and How to Avoid Them

    Ricardo Augusto Perera

    2018-04-01

    The use of new paradigms of false belief tasks (FBT) has made it possible to lower the age at which children pass the test from 4 years in the standard version to only 15 months, or even a striking 6 months in the nonverbal modification. These results are often taken as evidence that infants already possess an, at least implicit, theory of mind (ToM). We criticize this inferential leap on the grounds that inferring a ToM from predictive success on a false belief task requires assuming, as a premise, that belief reasoning is a necessary condition for correct action prediction. We argue that the FBT does not satisfactorily constrain the predictive means, leaving room for belief-independent inferences (which may rely on the attribution of non-representational mental states, or on behavioral patterns that dispense with any reference to other minds). These heuristics, when applied to the FBT, can achieve the same predictive success as a belief-based inference, because the information provided by the test stimulus allows the recognition of particular situations that can be subsumed under their 'laws'. Instead of solving this issue by designing a single experimentum crucis that would render the use of non-representational inferences unfeasible, we suggest applying a set of tests in which, although each test individually can support inferences dissociated from a ToM, only an inference that makes use of false beliefs is able to correctly predict all the outcomes.

  2. Indirect Inference for Stochastic Differential Equations Based on Moment Expansions

    Ballesio, Marco; Tempone, Raul; Vilanova, Pedro

    2016-01-01

    We provide an indirect inference method to estimate the parameters of time-homogeneous scalar diffusion and jump diffusion processes. We obtain a system of ODEs that approximates the time evolution of the first two moments of the process by applying a second-order Taylor expansion of the SDE's infinitesimal generator in Dynkin's formula.

  3. Neural Fuzzy Inference System-Based Weather Prediction Model and Its Precipitation Predicting Experiment

    Jing Lu

    2014-11-01

    We propose a weather prediction model based on a neural network and fuzzy inference system (NFIS-WPM), and apply it to predict daily fuzzy precipitation given meteorological premises for testing. The model consists of two parts: the first is the “fuzzy rule-based neural network”, which simulates sequential relations among fuzzy sets using an artificial neural network; the second is the “neural fuzzy inference system”, which builds on the first part but can learn new fuzzy rules from previous ones according to the algorithm we propose. NFIS-WPM (High Pro) and NFIS-WPM (Ave) are improved versions of this model. The need for accurate weather prediction is apparent when considering its benefits; however, the excessive pursuit of accuracy makes some of the “accurate” prediction results meaningless, and numerical prediction models are often complex and time-consuming. By adapting this novel model to a precipitation prediction problem, we make the predicted precipitation more accurate and the prediction method simpler than a complex numerical forecasting model that would occupy large computational resources, be time-consuming and have a low predictive accuracy rate. Accordingly, we also achieve more accurate predictions than traditional artificial neural networks with low predictive accuracy.

  4. Inferring regulatory networks from expression data using tree-based methods.

    Vân Anh Huynh-Thu

    2010-09-01

    One of the pressing open problems of computational systems biology is the elucidation of the topology of genetic regulatory networks (GRNs) using high-throughput genomic data, in particular microarray gene expression data. The Dialogue for Reverse Engineering Assessments and Methods (DREAM) challenge aims to evaluate the success of GRN inference algorithms on benchmarks of simulated data. In this article, we present GENIE3, a new algorithm for the inference of GRNs that was the best performer in the DREAM4 In Silico Multifactorial challenge. GENIE3 decomposes the prediction of a regulatory network between p genes into p different regression problems. In each regression problem, the expression pattern of one gene (the target gene) is predicted from the expression patterns of all the other genes (the input genes), using the tree-based ensemble methods Random Forests or Extra-Trees. The importance of an input gene in the prediction of the target gene's expression pattern is taken as an indication of a putative regulatory link. Putative regulatory links are then aggregated over all genes to provide a ranking of interactions, from which the whole network is reconstructed. In addition to performing well on the simulated data of the DREAM4 In Silico Multifactorial challenge, we show that GENIE3 compares favorably with existing algorithms in deciphering the genetic regulatory network of Escherichia coli. It does not make any assumption about the nature of gene regulation, can deal with combinatorial and non-linear interactions, produces directed GRNs, and is fast and scalable. In conclusion, we propose a new algorithm for GRN inference that performs well on both synthetic and real gene expression data. The algorithm, based on feature selection with tree-based ensemble methods, is simple and generic, making it adaptable to other types of genomic data and interactions.
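
    The regression decomposition described in this abstract can be sketched in a few lines with scikit-learn. The data below are synthetic (one planted regulatory link, invented sample sizes); the actual GENIE3 package differs in details such as importance normalization and tree settings.

```python
# Sketch of GENIE3-style inference: one random-forest regression per target
# gene, with feature importances read off as putative regulatory links.
# Data and network here are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
expr = rng.normal(size=(200, 5))          # 200 samples x 5 genes
expr[:, 1] += 2.0 * expr[:, 0]            # planted link: gene 0 -> gene 1

n_genes = expr.shape[1]
scores = np.zeros((n_genes, n_genes))     # scores[i, j]: evidence for i -> j
for j in range(n_genes):
    inputs = [i for i in range(n_genes) if i != j]
    rf = RandomForestRegressor(n_estimators=100, random_state=0)
    rf.fit(expr[:, inputs], expr[:, j])
    scores[inputs, j] = rf.feature_importances_

# Aggregate: rank all putative links by importance score.
links = sorted(((scores[i, j], i, j) for i in range(n_genes)
                for j in range(n_genes) if i != j), reverse=True)
print(links[0])   # strongest inferred link involves genes 0 and 1
```

    Note that the importance scores are symmetric in practice for a two-gene dependency; directing the edges requires the aggregation and post-processing steps described in the paper.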

  5. An algebra-based method for inferring gene regulatory networks.

    Vera-Licona, Paola; Jarrah, Abdul; Garcia-Puente, Luis David; McGee, John; Laubenbacher, Reinhard

    2014-03-26

    The inference of gene regulatory networks (GRNs) from experimental observations is at the heart of systems biology. This includes the inference of both the network topology and its dynamics. While there are many algorithms available to infer the network topology from experimental data, less emphasis has been placed on methods that infer network dynamics. Furthermore, since the network inference problem is typically underdetermined, it is essential to have the option of incorporating into the inference process, prior knowledge about the network, along with an effective description of the search space of dynamic models. Finally, it is also important to have an understanding of how a given inference method is affected by experimental and other noise in the data used. This paper contains a novel inference algorithm using the algebraic framework of Boolean polynomial dynamical systems (BPDS), meeting all these requirements. The algorithm takes as input time series data, including those from network perturbations, such as knock-out mutant strains and RNAi experiments. It allows for the incorporation of prior biological knowledge while being robust to significant levels of noise in the data used for inference. It uses an evolutionary algorithm for local optimization with an encoding of the mathematical models as BPDS. The BPDS framework allows an effective representation of the search space for algebraic dynamic models that improves computational performance. The algorithm is validated with both simulated and experimental microarray expression profile data. Robustness to noise is tested using a published mathematical model of the segment polarity gene network in Drosophila melanogaster. 
Benchmarking of the algorithm is done by comparison with a spectrum of state-of-the-art network inference methods on data from the synthetic IRMA network to demonstrate that our method has good precision and recall for the network reconstruction task, while also predicting several of the
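
    A Boolean polynomial dynamical system of the kind used as the model class here can be written directly as polynomials over F2 (AND is multiplication, XOR is addition, NOT x is 1 + x). The three update rules below are invented for illustration and are not the segment polarity model from the paper.

```python
# Minimal Boolean polynomial dynamical system (BPDS) over F2 with
# illustrative update rules; iterating it produces the kind of time series
# the inference algorithm takes as input.

def step(x):
    x1, x2, x3 = x
    f1 = x1 * x2 * 0 + x2 * x3 % 2      # x1' = x2 AND x3
    f2 = (1 + x1) % 2                   # x2' = NOT x1
    f3 = (x1 + x2) % 2                  # x3' = x1 XOR x2
    return (f1, f2, f3)

# Iterate from an initial state and collect the trajectory.
state = (1, 0, 1)
traj = [state]
for _ in range(6):
    state = step(state)
    traj.append(state)
print(traj)   # the system settles into a periodic orbit
```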

  6. Modeling and knowledge acquisition processes using case-based inference

    Ameneh Khadivar

    2017-03-01

    The acquisition and presentation of organizational process knowledge has been considered by many knowledge management researchers. In this research, a model for process knowledge acquisition and presentation is presented using a case-based reasoning approach. The validity of the presented model was evaluated by an expert panel. Software was then developed based on the presented model and implemented at Eghtesad Novin Bank of Iran. There, following the stages of the presented model, the knowledge-intensive processes were first identified; the process knowledge was then stored in a knowledge base in a problem/solution/consequent format. Knowledge retrieval was based on nearest-neighbour similarity. To validate the implemented system, its results were compared with the decisions of the process experts.
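
    The nearest-neighbour retrieval step can be sketched as follows. The case base, feature names and solutions are invented for illustration; the paper's system stores richer problem/solution/consequent records.

```python
# Sketch of nearest-neighbour case retrieval for case-based reasoning:
# the stored case whose problem features are closest (Euclidean distance)
# to the query supplies the proposed solution.
import math

case_base = [  # (problem features, solution) -- invented examples
    ({"duration": 5, "team_size": 3}, "reuse checklist A"),
    ({"duration": 20, "team_size": 10}, "escalate to committee"),
    ({"duration": 8, "team_size": 4}, "reuse checklist B"),
]

def distance(a, b):
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))

def retrieve(query):
    return min(case_base, key=lambda case: distance(case[0], query))

best = retrieve({"duration": 6, "team_size": 3})
print(best[1])   # solution of the nearest stored case
```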

  7. Indirect Inference for Stochastic Differential Equations Based on Moment Expansions

    Ballesio, Marco

    2016-01-06

    We provide an indirect inference method to estimate the parameters of time-homogeneous scalar diffusion and jump diffusion processes. We obtain a system of ODEs that approximates the time evolution of the first two moments of the process by applying a second-order Taylor expansion of the SDE's infinitesimal generator in Dynkin's formula. This method allows a simple and efficient procedure to infer the parameters of such stochastic processes from data, by maximizing the likelihood of an approximating Gaussian process described by the two moment equations. Finally, we perform numerical experiments on two datasets arising from organic and inorganic fouling deposition phenomena.
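
    The moment-ODE idea can be sketched for the Ornstein-Uhlenbeck process dX = theta*(mu - X) dt + sigma dW, a case where the ODEs for the mean and variance are exact (the paper's Taylor-expansion construction handles general scalar drift and jumps). All parameter values below are invented.

```python
# Sketch of the moment-expansion step: integrate ODEs for the mean m and
# variance v of an OU process, dm/dt = theta*(mu - m),
# dv/dt = -2*theta*v + sigma**2, then build a Gaussian likelihood from (m, v).
import math

def evolve_moments(m, v, theta, mu, sigma, dt, n_steps):
    for _ in range(n_steps):            # forward Euler on the moment ODEs
        m += theta * (mu - m) * dt
        v += (-2.0 * theta * v + sigma**2) * dt
    return m, v

def gaussian_loglik(x, m, v):
    return -0.5 * (math.log(2 * math.pi * v) + (x - m) ** 2 / v)

m, v = evolve_moments(m=0.0, v=0.0, theta=1.0, mu=2.0, sigma=0.5,
                      dt=0.001, n_steps=5000)
print(m, v)   # approaches mu and sigma**2 / (2*theta) as t grows
print(gaussian_loglik(2.1, m, v))
```

    In the full method, this log-likelihood would be summed over all observation times and maximized over (theta, mu, sigma).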

  8. RISK MANAGEMENT AUTOMATION OF SOFTWARE PROJECTS BASED ON FUZZY INFERENCE

    T. M. Zubkova

    2015-09-01

    A review of existing fuzzy inference algorithms in applied problems shows the suitability of an intelligent method for risk management of software projects. Information sources in the management of software projects are analyzed, and major and minor risks are highlighted. The most critical parameters for estimating the occurrence of adverse situations are singled out (project duration, frequency of changes in customer requirements, work deadlines, developers' experience in such projects, and others). A qualitative fuzzy description method based on fuzzy logic has been developed for the analysis of these parameters. Evaluation of possible situations and formation of the knowledge base rely on a survey of experts. The main limitations of existing automated systems are identified with respect to their applicability to risk management in software design. This theoretical groundwork set the stage for a software system that automates the risk management process for software projects. The developed system automates fuzzy inference in the following stages: formation of the rule base of the fuzzy inference system, fuzzification of input variables, aggregation of sub-conditions, activation and accumulation of conclusions for fuzzy production rules, and defuzzification of variables. The result of automating risk management in software design is a quantitative and qualitative risk assessment together with expert advice for risk minimization. The practical significance of the work lies in the performance improvement of software projects made possible by the developed automated system.
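
    The fuzzy inference stages listed above (fuzzification, rule activation, accumulation of conclusions, defuzzification) can be sketched as a toy Mamdani-style system for a single risk score. The membership functions, the single input and the rules are invented examples, not the system from the paper.

```python
# Toy Mamdani fuzzy inference: one input (deadline pressure, 0..10),
# two rules, centroid defuzzification of the accumulated output set.
import numpy as np

def tri(x, a, b, c):                      # triangular membership function
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def infer_risk(deadline_pressure):
    # Fuzzification of the input
    low  = tri(deadline_pressure, 0, 2, 5)
    high = tri(deadline_pressure, 4, 8, 10)

    y = np.linspace(0, 10, 101)           # output universe: risk level
    # Activation: clip each rule's conclusion set by its firing strength
    risk_low  = np.minimum(low,  tri(y, 0, 2, 5))    # IF pressure low THEN risk low
    risk_high = np.minimum(high, tri(y, 5, 8, 10))   # IF pressure high THEN risk high
    agg = np.maximum(risk_low, risk_high)            # accumulation of conclusions
    return float((y * agg).sum() / agg.sum())        # centroid defuzzification

print(infer_risk(7.0))   # high pressure maps to a high risk score
print(infer_risk(1.0))   # low pressure maps to a low risk score
```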

  9. Making predictions in a changing world: inference, uncertainty, and learning.

    O'Reilly, Jill X

    2013-01-01

    To function effectively, brains need to make predictions about their environment based on past experience, i.e., they need to learn about their environment. The algorithms by which learning occurs are of interest to neuroscientists, both in their own right (because they exist in the brain) and as a tool to model participants' incomplete knowledge of task parameters and hence to better understand their behavior. This review focuses on a particular challenge for learning algorithms: how to match the rate at which they learn to the rate of change in the environment, so that they use as much observed data as possible whilst disregarding irrelevant, old observations. To do this, algorithms must evaluate whether the environment is changing. We discuss the concepts of likelihood, priors and transition functions, and how these relate to change detection. We review expected and estimation uncertainty, and how these relate to change detection and learning rate. Finally, we consider the neural correlates of uncertainty and learning. We argue that the neural correlates of uncertainty bear a resemblance to neural systems that are active when agents actively explore their environments, suggesting that the mechanisms by which the learning rate is set may be subject to top-down control (when agents actively seek new information) as well as bottom-up control (by observations that imply change in the environment).

  10. Risk-Based Predictive Maintenance for Safety-Critical Systems by Using Probabilistic Inference

    Tianhua Xu

    2013-01-01

    Risk-based maintenance (RBM) aims to improve maintenance planning and decision making by reducing the probability and consequences of equipment failure. A new predictive maintenance strategy that integrates a dynamic evolution model and risk assessment is proposed, which can be used to calculate the optimal maintenance time under minimal cost and safety constraints. The dynamic evolution model provides quantified risks by using probabilistic inference with bucket elimination and gives the prospective degradation trend of a complex system. Based on the degradation trend, an optimal maintenance time can be determined by minimizing the expected maintenance cost per time unit. The effectiveness of the proposed method is validated and demonstrated on a collision accident of high-speed trains with obstacles, in the presence of safety and cost constraints.
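
    The "minimize expected cost per time unit" step can be sketched with a classical age-replacement cost rate under an assumed Weibull degradation model. All costs and Weibull parameters below are invented; the paper derives its degradation trend from a probabilistic dynamic model instead.

```python
# Sketch: choose a preventive-maintenance time T minimizing the expected
# cost per unit time, C(T) = (c_p*R(T) + c_f*(1-R(T))) / integral_0^T R(t)dt,
# for an assumed Weibull reliability R(t). Parameters are illustrative.
import numpy as np

C_PREVENTIVE, C_FAILURE = 1.0, 10.0       # failure repair is 10x as costly
SHAPE, SCALE = 2.5, 100.0                 # assumed Weibull parameters

def reliability(t):
    return np.exp(-(t / SCALE) ** SHAPE)

def cost_rate(T, n=2000):
    t = np.linspace(0.0, T, n)
    r = reliability(t)
    expected_cycle_length = ((r[:-1] + r[1:]) / 2.0 * np.diff(t)).sum()
    expected_cycle_cost = C_PREVENTIVE * r[-1] + C_FAILURE * (1.0 - r[-1])
    return expected_cycle_cost / expected_cycle_length

candidates = np.linspace(10, 300, 291)
best_T = candidates[np.argmin([cost_rate(T) for T in candidates])]
print(best_T)   # interior optimum: maintain well before the mean life
```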

  11. Distributional Inference

    Kroese, A.H.; van der Meulen, E.A.; Poortema, Klaas; Schaafsma, W.

    1995-01-01

    The making of statistical inferences in distributional form is conceptually complicated because the epistemic 'probabilities' assigned are mixtures of fact and fiction. In this respect they are essentially different from 'physical' or 'frequency-theoretic' probabilities. The distributional form is

  12. Decision-Making Based on Emotional Images

    Katahira, Kentaro; Fujimura, Tomomi; Okanoya, Kazuo; Okada, Masato

    2011-01-01

    The emotional outcome of a choice affects subsequent decision making. While the relationship between decision making and emotion has attracted attention, studies on emotion and decision making have been independently developed. In this study, we investigated how the emotional valence of pictures, which was stochastically contingent on participants’ choices, influenced subsequent decision making. In contrast to traditional value-based decision-making studies that used money or food as a reward...

  14. Statistical inference for extended or shortened phase II studies based on Simon's two-stage designs.

    Zhao, Junjun; Yu, Menggang; Feng, Xi-Ping

    2015-06-07

    Simon's two-stage designs are popular choices for conducting phase II clinical trials, especially in oncology, to reduce the number of patients placed on ineffective experimental therapies. Recently, Koyama and Chen (2008) discussed how to conduct proper inference for such studies, having found that inference procedures used with Simon's designs almost always ignore the actual sampling plan; in particular, they proposed an inference method for studies whose actual second-stage sample sizes differ from the planned ones. We consider an alternative inference method based on the likelihood ratio. In particular, we order permissible sample paths under Simon's two-stage designs by their corresponding conditional likelihood. In this way, we can calculate p-values using the common definition: the probability of obtaining a test statistic value at least as extreme as that observed under the null hypothesis. In addition to providing inference for scenarios where Koyama and Chen's method can be difficult to apply, the resulting estimate appears to have certain advantages in terms of inference properties in many numerical simulations: it generally led to smaller biases and narrower confidence intervals while maintaining similar coverage. We also illustrate the two methods in a real data setting. Since reported p-values, point estimates and confidence intervals for the response rate are not usually adjusted for the design's adaptiveness, proper statistical inference procedures should be used.
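
    The path-ordering idea can be sketched by enumerating all permissible sample paths of a Simon design and summing null probabilities of paths at least as extreme as the observed one. The design (n1=10, r1=2, n2=19), null rate p0=0.2 and ordering statistic (a likelihood ratio evaluated at the MLE) are illustrative choices, not a reimplementation of the authors' exact procedure.

```python
# Sketch of an exact, likelihood-ratio-ordered p-value for a Simon
# two-stage design. A path is (x1, None) if the trial stops at stage 1
# (x1 <= r1) or (x1, x2) if it continues; values here are illustrative.
from math import comb

def path_prob(x1, x2, p, n1, r1, n2):
    prob1 = comb(n1, x1) * p**x1 * (1 - p)**(n1 - x1)
    if x1 <= r1:                       # trial stops after stage 1
        return prob1 if x2 is None else 0.0
    return prob1 * comb(n2, x2) * p**x2 * (1 - p)**(n2 - x2)

def lr_stat(x1, x2, p0, n1, r1, n2):
    n = n1 if x2 is None else n1 + n2
    x = x1 if x2 is None else x1 + x2
    p_hat = x / n                      # MLE of the response rate on this path
    return path_prob(x1, x2, p_hat, n1, r1, n2) / path_prob(x1, x2, p0, n1, r1, n2)

def p_value(obs, p0, n1, r1, n2):
    paths = [(x1, None) for x1 in range(r1 + 1)]
    paths += [(x1, x2) for x1 in range(r1 + 1, n1 + 1) for x2 in range(n2 + 1)]
    t_obs = lr_stat(*obs, p0, n1, r1, n2)
    return sum(path_prob(x1, x2, p0, n1, r1, n2)
               for (x1, x2) in paths
               if lr_stat(x1, x2, p0, n1, r1, n2) >= t_obs)

print(p_value((5, 8), p0=0.2, n1=10, r1=2, n2=19))   # small p: 13/29 responses
```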

  15. Cycle-Based Cluster Variational Method for Direct and Inverse Inference

    Furtlehner, Cyril; Decelle, Aurélien

    2016-08-01

    Large scale inference problems of practical interest can often be addressed with the help of Markov random fields. This requires solving two related problems: the first is to find offline the parameters of the MRF from empirical data (the inverse problem); the second (the direct problem) is to set up the inference algorithm to make it as precise, robust and efficient as possible. In this work we address both the direct and inverse problems with mean-field methods of statistical physics, going beyond the Bethe approximation and the associated belief propagation algorithm. We elaborate on the idea that loop corrections to belief propagation can be dealt with in a systematic way on pairwise Markov random fields, by using the elements of a cycle basis to define regions in a generalized belief propagation setting. For the direct problem, the region graph is specified so as to avoid feedback loops as much as possible by selecting a minimal cycle basis. Following this line we are led to propose a two-level algorithm, in which a belief propagation algorithm is run alternately at the level of each cycle and at the inter-region level. Next we observe that the inverse problem can be addressed region by region independently, with one small inverse problem per region to be solved. It turns out that each elementary inverse problem on the loop geometry can be solved efficiently. In particular, in the random Ising context we propose two complementary methods based respectively on fixed-point equations and on a one-parameter log-likelihood function minimization. Numerical experiments confirm the effectiveness of this approach for both direct and inverse MRF inference. Heterogeneous problems of size up to 10^5 are addressed in a reasonable computational time, notably with better convergence properties than ordinary belief propagation.
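
    As a baseline for the loop corrections discussed above, ordinary sum-product belief propagation on a pairwise binary MRF can be sketched on a single 4-node cycle (the simplest graph where BP is "loopy"). The couplings and fields are invented; the paper's cycle-based regions generalize exactly this message-passing scheme.

```python
# Loopy belief propagation on a 4-node Ising cycle with uniform coupling J
# and field h. Messages m_{i->j}(x_j) are iterated to a fixed point and
# combined into approximate single-node marginals (beliefs).
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]        # one cycle
J, h = 0.5, 0.2
pair = np.exp(J * np.outer([-1, 1], [-1, 1]))   # pairwise factor psi(x_i, x_j)
unary = np.exp(h * np.array([-1, 1]))           # unary factor psi(x_i)

messages = {(i, j): np.ones(2) for a, b in edges for i, j in [(a, b), (b, a)]}

for _ in range(100):                            # parallel message updates
    new = {}
    for (i, j) in messages:
        incoming = np.ones(2)
        for (k, l) in messages:                 # product over N(i) \ {j}
            if l == i and k != j:
                incoming *= messages[(k, l)]
        m = pair.T @ (unary * incoming)         # sum over x_i
        new[(i, j)] = m / m.sum()
    messages = new

def belief(i):
    b = unary.copy()
    for (k, l) in messages:
        if l == i:
            b *= messages[(k, l)]
    return b / b.sum()

print(belief(0))   # approximate marginal; positive h biases x = +1
```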

  16. Illustration interface of accident progression in PWR by quick inference based on multilevel flow models

    Yoshikawa, H.; Ouyang, J.; Niwa, Y.

    2006-01-01

    In this paper, a new accident inference method is proposed using a goal- and function-oriented modeling method called the Multilevel Flow Model, focusing on explaining the causal-consequence relations and the objective of automatic actions during an accident at a nuclear power plant. Users can easily grasp how the various plant parameters will behave and how the various safety facilities will be activated sequentially to cope with the accident until the plant settles into a safe state, i.e., the shutdown state. The applicability of the developed method was validated by conducting an internet-based 'view' experiment with voluntary respondents; in the future, the interface design will be further elaborated and instruction contents added to make it a usable CAI system. (authors)

  17. Sensitivity analysis practices: Strategies for model-based inference

    Saltelli, Andrea; Ratto, Marco; Tarantola, Stefano; Campolongo, Francesca

    2006-01-01

    Fourteen years after Science's review of sensitivity analysis (SA) methods in 1989 (System analysis at molecular scale, by H. Rabitz) we searched Science Online to identify and then review all recent articles having 'sensitivity analysis' as a keyword. In spite of the considerable developments which have taken place in this discipline, of the good practices which have emerged, and of existing guidelines for SA issued on both sides of the Atlantic, we could not find in our review anything other than very primitive SA tools, based on 'one-factor-at-a-time' (OAT) approaches. In the context of model corroboration or falsification, we demonstrate that this use of OAT methods is illicit and unjustified, unless the model under analysis is proved to be linear. We show that available good practices, such as variance-based measures and others, are able to overcome OAT shortcomings and are easy to implement. These methods also allow the concept of factor importance to be defined rigorously, making the factor-importance ranking univocal. We analyse the requirements of SA in the context of modelling, present best available practices on the basis of an elementary model, and point the reader to available recipes for a rigorous SA.
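
    A variance-based measure of the kind recommended here, the first-order Sobol index S_i = Var(E[Y|X_i]) / Var(Y), can be sketched with a brute-force double-loop estimator on an invented non-additive test model; production SA would use more efficient sampling schemes.

```python
# Brute-force estimate of first-order Sobol sensitivity indices for a toy
# model with uniform inputs on [-1, 1]. The interaction term x1*x2 is the
# kind of effect that OAT screening cannot attribute correctly.
import numpy as np

rng = np.random.default_rng(1)

def model(x1, x2, x3):
    return x1 + 0.5 * x2 + 0.1 * x3 + x1 * x2

def first_order_index(which, n_outer=500, n_inner=500):
    X = rng.uniform(-1, 1, size=(n_outer, 3))
    cond_means = np.empty(n_outer)
    for k in range(n_outer):
        Z = rng.uniform(-1, 1, size=(n_inner, 3))
        Z[:, which] = X[k, which]            # fix one factor, vary the rest
        cond_means[k] = model(*Z.T).mean()   # estimate E[Y | X_i]
    total_var = model(*rng.uniform(-1, 1, size=(20000, 3)).T).var()
    return cond_means.var() / total_var      # S_i = Var(E[Y|X_i]) / Var(Y)

S = [first_order_index(i) for i in range(3)]
print(S)   # x1 dominates, x3 is negligible; the sum stays below 1
```

    Because of the x1*x2 interaction, the first-order indices sum to less than one; the remainder is captured by total-effect indices.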

  19. Connectivity inference from neural recording data: Challenges, mathematical bases and research directions.

    Magrans de Abril, Ildefons; Yoshimoto, Junichiro; Doya, Kenji

    2018-06-01

    This article presents a review of computational methods for connectivity inference from neural activity data derived from multi-electrode recordings or fluorescence imaging. We first identify biophysical and technical challenges in connectivity inference along the data processing pipeline. We then review connectivity inference methods based on two major mathematical foundations, namely descriptive model-free approaches and generative model-based approaches. We investigate representative studies in both categories and clarify which challenges have been addressed by which method. We further identify critical open issues and possible research directions. Copyright © 2018 The Author(s). Published by Elsevier Ltd. All rights reserved.
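
    A minimal member of the descriptive model-free family is lagged cross-correlation between binned spike trains. The sketch below uses synthetic data in which neuron 0 drives neuron 1 at a one-bin delay; bin counts, rates and coupling strength are all invented.

```python
# Toy model-free connectivity inference: the lag at which the
# cross-correlation between two binned spike trains peaks suggests a
# directed interaction (here, 0 -> 1 with a one-bin delay).
import numpy as np

rng = np.random.default_rng(2)
n_bins = 5000
s0 = (rng.random(n_bins) < 0.1).astype(float)          # presynaptic train
noise = (rng.random(n_bins) < 0.02).astype(float)
s1 = np.clip(np.roll(s0, 1) * (rng.random(n_bins) < 0.8) + noise, 0, 1)

def lagged_corr(a, b, lag):
    if lag > 0:
        a, b = a[:-lag], b[lag:]
    return np.corrcoef(a, b)[0, 1]

corrs = {lag: lagged_corr(s0, s1, lag) for lag in range(4)}
print(corrs)   # clear peak at lag 1, near zero elsewhere
```

    Real data require the corrections reviewed in the article (common input, nonstationarity, imaging noise); raw correlation alone conflates direct and indirect connectivity.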

  20. Statistical Methods for Population Genetic Inference Based on Low-Depth Sequencing Data from Modern and Ancient DNA

    Korneliussen, Thorfinn Sand

    Due to recent advances in DNA sequencing technology, genomic data are being generated at an unprecedented rate and we are gaining access to entire genomes at the population level. The technology does not, however, give direct access to the genetic variation, and the many levels of preprocessing required before inferences can be made from the data introduce multiple levels of uncertainty, especially for low-depth data. Methods that take this inherent uncertainty into account are therefore needed to make robust inferences in the downstream analysis of such data. This poses a problem for a range of key summary statistics within population genetics, where existing methods are based on the assumption that the true genotypes are known. Motivated by this, I present: 1) a new method for the estimation of relatedness between pairs of individuals, 2) a new method for estimating...

  1. Fooled by First Impressions? Reexamining the Diagnostic Value of Appearance-Based Inferences

    2010-01-01

    We often form opinions about the characteristics of others from single, static samples of their appearance: the very first thing we see when, or even before, we meet them. These inferences occur spontaneously and rapidly, and can impact decisions in a variety of important domains. A crucial question, then, is whether appearance-based inferences are accurate. Using a naturalistic data set of more than 1 million appearance-based judgments obtained from a popular website (Study...

  2. Design of UAV robust autopilot based on adaptive neuro-fuzzy inference system

    Mohand Achour Touat

    2008-04-01

    This paper is devoted to the application of adaptive neuro-fuzzy inference systems to the robust control of UAV longitudinal motion. The adaptive neuro-fuzzy inference system model needs to be trained on input/output data, which were obtained from the modeling of a ”crisp” robust control system. The synthesis of this system is based on the separation theorem, which defines the structure and parameters of the LQG-optimal controller, followed by robust optimization of this controller based on a genetic algorithm. Such a design procedure can define the rule base and the parameters of the fuzzification and defuzzification algorithms of the adaptive neuro-fuzzy inference system controller, ensuring the robust properties of the control system. Simulation of the closed-loop control system of UAV longitudinal motion with the adaptive neuro-fuzzy inference system controller demonstrates the high efficiency of the proposed design procedure.

  3. Rule-based decision making model

    Sirola, Miki

    1998-01-01

    A rule-based decision making model is designed in the G2 environment. A theoretical and methodological frame for the model is composed and motivated. The rule-based decision making model is based on object-oriented modelling, knowledge engineering and decision theory. The idea of a safety objective tree is utilized, and advanced rule-based methodologies are applied. A general decision making model, the 'decision element', is constructed. The strategy planning of the decision element is based on, e.g., value theory and utility theory. A hypothetical process model is built to give input data to the decision element. The basic principle of the object model in decision making is division into tasks. Probability models are used in characterizing component availabilities, and Bayes' theorem is used to recalculate the probability figures when new information is obtained. The model includes simple learning features to save the solution path. A decision analytic interpretation is given to the decision making process. (author)
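
    The Bayes-theorem step described above, recalculating a component's failure probability when a new observation arrives, can be sketched in a few lines. The prior and the likelihood values are illustrative numbers, not figures from the paper.

```python
# One Bayes update: posterior P(failed | alarm) from a prior failure
# probability and the alarm likelihoods under each hypothesis.
def bayes_update(prior_failed, p_alarm_given_failed, p_alarm_given_ok):
    evidence = (p_alarm_given_failed * prior_failed
                + p_alarm_given_ok * (1.0 - prior_failed))
    return p_alarm_given_failed * prior_failed / evidence

posterior = bayes_update(prior_failed=0.05,
                         p_alarm_given_failed=0.9,
                         p_alarm_given_ok=0.1)
print(round(posterior, 3))  # → 0.321
```

    Even with a fairly reliable alarm, the low prior keeps the posterior well below certainty, which is why the model recalculates rather than replaces its probability figures.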

  4. Polynomial Chaos–Based Bayesian Inference of K-Profile Parameterization in a General Circulation Model of the Tropical Pacific

    Sraj, Ihab; Zedler, Sarah E.; Knio, Omar; Jackson, Charles S.; Hoteit, Ibrahim

    2016-01-01

    The authors present a polynomial chaos (PC)-based Bayesian inference method for quantifying the uncertainties of the K-profile parameterization (KPP) within the MIT general circulation model (MITgcm) of the tropical Pacific. The inference

  5. FUZZY INFERENCE BASED LEAK ESTIMATION IN WATER PIPELINES SYSTEM

    N. Lavanya

    2015-01-01

    Pipeline networks are the most widely used mode for transporting fluids and gases around the world. Leakage in a pipeline causes harmful effects when the flowing fluid or gas is hazardous; hence leak detection is essential to avoid or minimize such undesirable effects. This paper presents leak detection by spectral analysis methods in a laboratory pipeline system. Transients in the pressure signal are created by opening and closing the exit valve. These pressure variations are captured and the power spectrum is obtained using the Fast Fourier Transform (FFT) method and the Filter Diagonalization Method (FDM). Leaks at various positions are simulated and located using these methods, and the results are compared. To determine the quantity of leak, a 2 × 1 fuzzy inference system is created using the upstream and downstream pressures as inputs and the leak size as the output. Thus complete leak detection, localization and quantification are carried out using only the pressure variations in the pipeline.
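
    The FFT-based spectral step can be sketched on a simulated pressure transient; the decaying oscillation below stands in for the wave created by closing the exit valve, and all signal parameters are invented (the FDM part of the paper is not sketched here).

```python
# Power-spectrum estimation of a simulated pressure transient with NumPy's
# real FFT; the dominant spectral peak is read off from the spectrum.
import numpy as np

fs = 1000.0                                   # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(3)
# Decaying 25 Hz oscillation plus measurement noise
signal = np.exp(-t) * np.sin(2 * np.pi * 25 * t) + 0.05 * rng.normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]     # skip the DC bin
print(peak)   # dominant frequency, close to 25 Hz
```

    In the leak-location methods the positions of such spectral peaks, which depend on the pipe's reflection geometry, carry the location information.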

  6. The Bayes linear approach to inference and decision-making for a reliability programme

    Goldstein, Michael; Bedford, Tim

    2007-01-01

    In reliability modelling it is conventional to build sophisticated models of the probabilistic behaviour of the component lifetimes in a system in order to deduce information about the probabilistic behaviour of the system lifetime. Decision modelling of the reliability programme therefore requires, a priori, an even more sophisticated set of models in order to capture the evidence the decision maker believes may be obtained from different types of data acquisition. Bayes linear analysis is a methodology that uses expectation rather than probability as the fundamental expression of uncertainty. By working only with expected values, a simpler level of modelling is needed compared to full probability models. In this paper we consider the Bayes linear approach to the estimation of the mean time to failure (MTTF) of a component. The model built takes account of the variance in our estimate of the MTTF, based on a variety of sources of information.
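
    The core Bayes linear adjustment needs only first- and second-order moments: the adjusted expectation is E[X] + Cov(X,D)/Var(D) * (D - E[D]), with variance Var(X) - Cov(X,D)^2/Var(D). The sketch below applies it to a prior MTTF belief updated by one observed statistic; all the moment values are invented for illustration.

```python
# Bayes linear adjustment of a prior MTTF estimate X by an observed
# statistic D (e.g. a sample mean of lifetimes). No full probability
# model is needed, only expectations, variances and a covariance.
def bayes_linear_adjust(e_x, var_x, e_d, var_d, cov_xd, observed_d):
    adjusted_mean = e_x + cov_xd / var_d * (observed_d - e_d)
    adjusted_var = var_x - cov_xd**2 / var_d
    return adjusted_mean, adjusted_var

mean, var = bayes_linear_adjust(e_x=1000.0, var_x=200.0**2,
                                e_d=1000.0, var_d=300.0**2,
                                cov_xd=200.0**2, observed_d=1150.0)
print(mean, var)   # belief shifts toward the data; variance shrinks
```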

  8. Cultural effects on the association between election outcomes and face-based trait inferences.

    Lin, Chujun; Adolphs, Ralph; Alvarez, R Michael

    2017-01-01

    How competent a politician looks, as assessed in the laboratory, is correlated with whether the politician wins in real elections. This finding has led many to investigate whether the association between candidate appearances and election outcomes transcends cultures. However, these studies have largely focused on European countries and Caucasian candidates. To the best of our knowledge, there are only four cross-cultural studies that have directly investigated how face-based trait inferences correlate with election outcomes across Caucasian and Asian cultures. These prior studies have provided some initial evidence regarding cultural differences, but methodological problems and inconsistent findings have complicated our understanding of how culture mediates the effects of candidate appearances on election outcomes. Additionally, these four past studies have focused on positive traits, with a relative neglect of negative traits, resulting in an incomplete picture of how culture may impact a broader range of trait inferences. To study Caucasian-Asian cultural effects with a more balanced experimental design, and to explore a more complete profile of traits, here we compared how Caucasian and Korean participants' inferences of positive and negative traits correlated with U.S. and Korean election outcomes. Contrary to previous reports, we found that inferences of competence (made by participants from both cultures) correlated with both U.S. and Korean election outcomes. Inferences of open-mindedness and threat, two traits neglected in previous cross-cultural studies, were correlated with Korean but not U.S. election outcomes. This differential effect was found in trait judgments made by both Caucasian and Korean participants. Interestingly, the faster the participants made face-based trait inferences, the more strongly those inferences were correlated with real election outcomes. These findings provide new insights into cultural effects and the difficult question of

  9. Cultural effects on the association between election outcomes and face-based trait inferences

    Adolphs, Ralph; Alvarez, R. Michael

    2017-01-01

    How competent a politician looks, as assessed in the laboratory, is correlated with whether the politician wins in real elections. This finding has led many to investigate whether the association between candidate appearances and election outcomes transcends cultures. However, these studies have largely focused on European countries and Caucasian candidates. To the best of our knowledge, there are only four cross-cultural studies that have directly investigated how face-based trait inferences correlate with election outcomes across Caucasian and Asian cultures. These prior studies have provided some initial evidence regarding cultural differences, but methodological problems and inconsistent findings have complicated our understanding of how culture mediates the effects of candidate appearances on election outcomes. Additionally, these four past studies have focused on positive traits, with a relative neglect of negative traits, resulting in an incomplete picture of how culture may impact a broader range of trait inferences. To study Caucasian-Asian cultural effects with a more balanced experimental design, and to explore a more complete profile of traits, here we compared how Caucasian and Korean participants’ inferences of positive and negative traits correlated with U.S. and Korean election outcomes. Contrary to previous reports, we found that inferences of competence (made by participants from both cultures) correlated with both U.S. and Korean election outcomes. Inferences of open-mindedness and threat, two traits neglected in previous cross-cultural studies, were correlated with Korean but not U.S. election outcomes. This differential effect was found in trait judgments made by both Caucasian and Korean participants. Interestingly, the faster the participants made face-based trait inferences, the more strongly those inferences were correlated with real election outcomes. These findings provide new insights into cultural effects and the difficult question of

  10. How does aging affect recognition-based inference? A hierarchical Bayesian modeling approach.

    Horn, Sebastian S; Pachur, Thorsten; Mata, Rui

    2015-01-01

    The recognition heuristic (RH) is a simple strategy for probabilistic inference according to which recognized objects are judged to score higher on a criterion than unrecognized objects. In this article, a hierarchical Bayesian extension of the multinomial r-model is applied to measure use of the RH on the individual participant level and to re-evaluate differences between younger and older adults' strategy reliance across environments. Further, it is explored how individual r-model parameters relate to alternative measures of the use of recognition and other knowledge, such as adherence rates and indices from signal-detection theory (SDT). Both younger and older adults used the RH substantially more often in an environment with high than low recognition validity, reflecting adaptivity in strategy use across environments. In extension of previous analyses (based on adherence rates), hierarchical modeling revealed that in an environment with low recognition validity, (a) older adults had a stronger tendency than younger adults to rely on the RH and (b) variability in RH use between individuals was larger than in an environment with high recognition validity; variability did not differ between age groups. Further, the r-model parameters correlated moderately with an SDT measure expressing how well people can discriminate cases where the RH leads to a correct vs. incorrect inference; this suggests that the r-model and the SDT measures may offer complementary insights into the use of recognition in decision making. In conclusion, younger and older adults are largely adaptive in their application of the RH, but cognitive aging may be associated with an increased tendency to rely on this strategy. Copyright © 2014 Elsevier B.V. All rights reserved.
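As a concrete illustration of the strategy the abstract describes, the RH decision rule for a two-alternative comparison can be sketched in a few lines. This is a minimal sketch: the object names and the recognition set are invented, and what happens when recognition does not discriminate is deliberately left open.

```python
# Minimal sketch of the recognition heuristic (RH) for two-alternative
# comparisons ("which object scores higher on the criterion?").
# The recognition set and object names below are invented for illustration.

def rh_choice(a, b, recognized):
    """Pick the recognized object; return None when the RH cannot discriminate."""
    if a in recognized and b not in recognized:
        return a
    if b in recognized and a not in recognized:
        return b
    return None  # both or neither recognized: fall back on knowledge or guessing

recognized = {"Berlin", "Munich"}
print(rh_choice("Berlin", "Paderborn", recognized))  # Berlin
print(rh_choice("Berlin", "Munich", recognized))     # None
```

In the modeling framework above, the r parameter of the multinomial r-model can be read as the probability of actually following this rule in the cases where it applies.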

  11. Inference of time-delayed gene regulatory networks based on dynamic Bayesian network hybrid learning method.

    Yu, Bin; Xu, Jia-Meng; Li, Shan; Chen, Cheng; Chen, Rui-Xin; Wang, Lei; Zhang, Yan; Wang, Ming-Hui

    2017-10-06

    Research on gene regulatory networks (GRNs) reveals complex life phenomena from the perspective of gene interaction and is an important field in systems biology. Traditional Bayesian networks have a high computational complexity, and the network structure scoring model has a single feature. Information-based approaches cannot identify the direction of regulation. To make up for the shortcomings of the above methods, this paper presents a novel hybrid learning method (DBNCS) based on dynamic Bayesian networks (DBNs) to construct multiple time-delayed GRNs for the first time, combining the comprehensive score (CS) with the DBN model. The DBNCS algorithm first uses the CMI2NI (conditional mutual inclusive information-based network inference) algorithm to learn network structure profiles, i.e., to construct the search space. Redundant regulations are then removed with a recursive optimization (RO) algorithm, thereby reducing the false positive rate. Next, the network structure profiles are decomposed into a set of cliques without loss, which significantly reduces the computational complexity. Finally, the DBN model is used to identify the direction of gene regulation within the cliques and to search for the optimal network structure. The performance of the DBNCS algorithm is evaluated on benchmark GRN datasets from the DREAM challenge as well as the SOS DNA repair network in Escherichia coli, and compared with other state-of-the-art methods. The experimental results show the soundness of the algorithm design and the outstanding performance of the method.
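The information-theoretic step underlying approaches like the CMI2NI stage of DBNCS scores candidate gene-gene dependencies from expression data. A simplified, hypothetical stand-in using plain pairwise mutual information on discretized expression profiles illustrates the general idea; it is not the CMI2NI algorithm itself, and, as the abstract notes for information-based approaches, it cannot identify the direction of regulation.

```python
# Simplified stand-in for information-based network-structure scoring:
# rank candidate gene pairs by pairwise mutual information estimated from
# binned expression profiles. The simulated "genes" are invented examples.
import numpy as np

def mutual_information(x, y, bins=5):
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                      # joint distribution
    px = pxy.sum(axis=1, keepdims=True)            # marginals
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                                   # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
g1 = rng.normal(size=200)
g2 = g1 + 0.1 * rng.normal(size=200)   # strongly coupled with g1
g3 = rng.normal(size=200)              # independent of g1
assert mutual_information(g1, g2) > mutual_information(g1, g3)
```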

  12. I know why you voted for Trump: (Over)inferring motives based on choice.

    Barasz, Kate; Kim, Tami; Evangelidis, Ioannis

    2018-05-10

    People often speculate about why others make the choices they do. This paper investigates how such inferences are formed as a function of what is chosen. Specifically, when observers encounter someone else's choice (e.g., of political candidate), they use the chosen option's attribute values (e.g., a candidate's specific stance on a policy issue) to infer the importance of that attribute (e.g., the policy issue) to the decision-maker. Consequently, when a chosen option has an attribute whose value is extreme (e.g., an extreme policy stance), observers infer, sometimes incorrectly, that this attribute disproportionately motivated the decision-maker's choice. Seven studies demonstrate how observers use an attribute's value to infer its weight (the value-weight heuristic) and identify the role of perceived diagnosticity: more extreme attribute values give observers the subjective sense that they know more about a decision-maker's preferences and, in turn, increase the attribute's perceived importance. The paper explores how this heuristic can produce erroneous inferences and influence broader beliefs about decision-makers. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Implementing a novel movement-based approach to inferring parturition and neonate caribou calf survival.

    Maegwin Bonar

    mortality dates from both methods were similar to the observed distribution derived from VHF-collared calves. Both methods underestimated herd-wide calf survival based on VHF-collared calves; however, a combination of the individual- and population-based methods produced herd-wide survival estimates similar to estimates generated from collared calves. The limitations we experienced when applying the DeMars model could result from shortcomings in our data that violate model assumptions. However, despite the differences in our caribou systems, with proper validation techniques the framework in the DeMars model is sufficient to make inferences on parturition and calf mortality.

  14. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference.

    Kim, Jung-Jae; Rebholz-Schuhmann, Dietrich

    2011-10-06

    The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.
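The deduction of implicit events from explicit ones can be sketched as a small forward-chaining loop over inference rules. The single rule and the event triples below are invented examples for illustration, not the system's actual rule set.

```python
# Hedged sketch of rule-based event inference: deduce implicit regulation
# events from explicitly extracted ones by forward chaining on a domain rule.
# The rule and the event triples are illustrative assumptions.

explicit_events = {
    ("TF_A", "activates", "gene_B"),
    ("gene_B", "encodes", "protein_B"),
}

def infer(events):
    """Apply the domain rule repeatedly until no new events are produced."""
    inferred = set(events)
    changed = True
    while changed:
        changed = False
        for (x, r1, y) in list(inferred):
            for (y2, r2, z) in list(inferred):
                # Rule: activating a gene implicitly regulates its product.
                if r1 == "activates" and y == y2 and r2 == "encodes":
                    new = (x, "regulates", z)
                    if new not in inferred:
                        inferred.add(new)
                        changed = True
    return inferred

result = infer(explicit_events)
assert ("TF_A", "regulates", "protein_B") in result
```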

  15. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference

    Kim Jung-jae

    2011-10-01

    Full Text Available Abstract Background The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. Results We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Conclusions Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.

  16. Inferring Trust Relationships in Web-Based Social Networks

    Golbeck, Jennifer; Hendler, James

    2006-01-01

    The growth of web-based social networking and the properties of those networks have created great potential for producing intelligent software that integrates a user's social network and preferences...

  17. Decision-making based on emotional images.

    Katahira, Kentaro; Fujimura, Tomomi; Okanoya, Kazuo; Okada, Masato

    2011-01-01

    The emotional outcome of a choice affects subsequent decision making. While the relationship between decision making and emotion has attracted attention, studies on emotion and decision making have been independently developed. In this study, we investigated how the emotional valence of pictures, which was stochastically contingent on participants' choices, influenced subsequent decision making. In contrast to traditional value-based decision-making studies that used money or food as a reward, the "reward value" of the decision outcome, which guided the update of value for each choice, is unknown beforehand. To estimate the reward value of emotional pictures from participants' choice data, we used reinforcement learning models that have successfully been used in previous studies for modeling value-based decision making. Consequently, we found that the estimated reward value was asymmetric between positive and negative pictures. The negative reward value of negative pictures (relative to neutral pictures) was larger in magnitude than the positive reward value of positive pictures. This asymmetry was not observed in valence for an individual picture, which was rated by the participants regarding the emotion experienced upon viewing it. These results suggest that there may be a difference between experienced emotion and the effect of the experienced emotion on subsequent behavior. Our experimental and computational paradigm provides a novel way for quantifying how and what aspects of emotional events affect human behavior. The present study is a first step toward relating a large amount of knowledge in emotion science and in taking computational approaches to value-based decision making.
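The reinforcement-learning account used in the study can be sketched with a standard delta-rule value update and softmax choice. The learning rate, inverse temperature, and the asymmetric "reward values" assigned to picture categories below are illustrative assumptions, not the paper's fitted estimates.

```python
# Sketch of a delta-rule (Rescorla-Wagner) value update with softmax choice,
# with asymmetric subjective reward values for emotional pictures.
# alpha, beta, and the reward values are illustrative assumptions.
import math
import random

alpha, beta = 0.3, 2.0            # learning rate, inverse temperature
reward_value = {"positive": 0.5, "negative": -0.8}  # asymmetric magnitudes
Q = [0.0, 0.0]                    # learned values of two choice options

random.seed(1)
for _ in range(200):
    p0 = 1.0 / (1.0 + math.exp(-beta * (Q[0] - Q[1])))  # softmax, 2 options
    choice = 0 if random.random() < p0 else 1
    # In this toy task, option 0 always yields a positive picture and
    # option 1 a negative one, so the learned values separate over trials.
    outcome = "positive" if choice == 0 else "negative"
    Q[choice] += alpha * (reward_value[outcome] - Q[choice])

# Q[0] rises toward +0.5 while Q[1] falls toward -0.8.
```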

  18. Gene regulatory network inference by point-based Gaussian approximation filters incorporating the prior information.

    Jia, Bin; Wang, Xiaodong

    2013-12-17

    The extended Kalman filter (EKF) has been applied to inferring gene regulatory networks. However, it is well known that the EKF becomes less accurate when the system exhibits high nonlinearity. In addition, certain prior information about the gene regulatory network exists in practice, and no systematic approach has been developed to incorporate such prior information into the Kalman-type filter for inferring the structure of the gene regulatory network. In this paper, an inference framework based on point-based Gaussian approximation filters that can exploit the prior information is developed to solve the gene regulatory network inference problem. Different point-based Gaussian approximation filters, including the unscented Kalman filter (UKF), the third-degree cubature Kalman filter (CKF3), and the fifth-degree cubature Kalman filter (CKF5) are employed. Several types of network prior information, including the existing network structure information, sparsity assumption, and the range constraint of parameters, are considered, and the corresponding filters incorporating the prior information are developed. Experiments on a synthetic network of eight genes and the yeast protein synthesis network of five genes are carried out to demonstrate the performance of the proposed framework. The results show that the proposed methods provide more accurate inference results than existing methods, such as the EKF and the traditional UKF.
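The point-based Gaussian approximation filters mentioned here (UKF, CKF3, CKF5) all propagate a small set of deterministically chosen, weighted points instead of linearizing as the EKF does. A minimal sketch of the unscented sigma-point construction shows the principle; the dimensions and scaling parameters are arbitrary choices for illustration.

```python
# Sketch of the sigma-point construction underlying point-based Gaussian
# approximation filters such as the UKF. Scaling parameters are illustrative.
import numpy as np

def unscented_points(mu, P, alpha=1e-1, kappa=0.0):
    n = len(mu)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)          # matrix square root
    pts = [mu] + [mu + S[:, i] for i in range(n)] + [mu - S[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))  # mean weights
    wm[0] = lam / (n + lam)
    return np.array(pts), wm

mu = np.array([1.0, -2.0])
P = np.array([[0.5, 0.1], [0.1, 0.3]])
pts, wm = unscented_points(mu, P)
# The weighted sigma points reproduce the mean and covariance exactly:
assert np.allclose(wm @ pts, mu)
assert np.allclose((pts - mu).T @ (wm[:, None] * (pts - mu)), P)
```

In a filter, each sigma point would be pushed through the nonlinear dynamics and the transformed points re-combined with these weights to approximate the predicted Gaussian.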

  19. Statistical inference for remote sensing-based estimates of net deforestation

    Ronald E. McRoberts; Brian F. Walters

    2012-01-01

    Statistical inference requires expression of an estimate in probabilistic terms, usually in the form of a confidence interval. An approach to constructing confidence intervals for remote sensing-based estimates of net deforestation is illustrated. The approach is based on post-classification methods using two independent forest/non-forest classifications because...
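For intuition, a normal-approximation confidence interval for net change expressed as the difference of two independently estimated proportions might look like the following. All numbers are invented, and the paper's post-classification estimators are more involved than this sketch.

```python
# Hedged sketch: 95% normal-approximation CI for net deforestation estimated
# as the difference of two area proportions with independent standard errors.
# All numbers are illustrative assumptions, not the paper's data.
import math

p_loss, se_loss = 0.042, 0.006   # estimated forest-loss proportion and its SE
p_gain, se_gain = 0.015, 0.004   # estimated forest-gain proportion and its SE

net = p_loss - p_gain
se_net = math.sqrt(se_loss**2 + se_gain**2)      # independence assumption
lo, hi = net - 1.96 * se_net, net + 1.96 * se_net
print(f"net deforestation = {net:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```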

  20. Identification of Fuzzy Inference Systems by Means of a Multiobjective Opposition-Based Space Search Algorithm

    Wei Huang

    2013-01-01

    Full Text Available We introduce a new category of fuzzy inference systems with the aid of a multiobjective opposition-based space search algorithm (MOSSA). The proposed MOSSA is essentially a multiobjective space search algorithm improved by using an opposition-based learning that employs a so-called opposite numbers mechanism to speed up the convergence of the optimization algorithm. In the identification of fuzzy inference system, the MOSSA is exploited to carry out the parametric identification of the fuzzy model as well as to realize its structural identification. Experimental results demonstrate the effectiveness of the proposed fuzzy models.

  1. New developments of a knowledge based system (VEG) for inferring vegetation characteristics

    Kimes, D. S.; Harrison, P. A.; Harrison, P. R.

    1992-01-01

    An extraction technique for inferring physical and biological surface properties of vegetation using nadir and/or directional reflectance data as input has been developed. A knowledge-based system (VEG) accepts spectral data of an unknown target as input, determines the best strategy for inferring the desired vegetation characteristic, applies the strategy to the target data, and provides a rigorous estimate of the accuracy of the inference. Progress in developing the system is presented. VEG combines methods from remote sensing and artificial intelligence, and integrates input spectral measurements with diverse knowledge bases. VEG has been developed to (1) infer spectral hemispherical reflectance from any combination of nadir and/or off-nadir view angles; (2) test and develop new extraction techniques on an internal spectral database; (3) browse, plot, or analyze directional reflectance data in the system's spectral database; (4) discriminate between user-defined vegetation classes using spectral and directional reflectance relationships; and (5) infer unknown view angles from known view angles (known as view angle extension).

  2. Strategies for memory-based decision making : Modeling behavioral and neural signatures within a cognitive architecture

    Fechner, Hanna B; Pachur, Thorsten; Schooler, Lael J; Mehlhorn, Katja; Battal, Ceren; Volz, Kirsten G; Borst, Jelmer P.

    2016-01-01

    How do people use memories to make inferences about real-world objects? We tested three strategies based on predicted patterns of response times and blood-oxygen-level-dependent (BOLD) responses: one strategy that relies solely on recognition memory, a second that retrieves additional knowledge, and

  3. Data-based Non-Markovian Model Inference

    Ghil, Michael

    2015-04-01

    This talk concentrates on obtaining stable and efficient data-based models for simulation and prediction in the geosciences and life sciences. The proposed model derivation relies on using a multivariate time series of partial observations from a large-dimensional system, and the resulting low-order models are compared with the optimal closures predicted by the non-Markovian Mori-Zwanzig formalism of statistical physics. Multilayer stochastic models (MSMs) are introduced as both a very broad generalization and a time-continuous limit of existing multilevel, regression-based approaches to data-based closure, in particular of empirical model reduction (EMR). We show that the multilayer structure of MSMs can provide a natural Markov approximation to the generalized Langevin equation (GLE) of the Mori-Zwanzig formalism. A simple correlation-based stopping criterion for an EMR-MSM model is derived to assess how well it approximates the GLE solution. Sufficient conditions are given for the nonlinear cross-interactions between the constitutive layers of a given MSM to guarantee the existence of a global random attractor. This existence ensures that no blow-up can occur for a very broad class of MSM applications. The EMR-MSM methodology is first applied to a conceptual, nonlinear, stochastic climate model of coupled slow and fast variables, in which only slow variables are observed. The resulting reduced model with energy-conserving nonlinearities captures the main statistical features of the slow variables, even when there is no formal scale separation and the fast variables are quite energetic. Second, an MSM is shown to successfully reproduce the statistics of a partially observed, generalized Lotka-Volterra model of population dynamics in its chaotic regime. The positivity constraint on the solutions' components replaces here the quadratic-energy-preserving constraint of fluid-flow problems and it successfully prevents blow-up. This work is based on a close

  4. Sensor-based activity recognition using extended belief rule-based inference methodology.

    Calzada, A; Liu, J; Nugent, C D; Wang, H; Martinez, L

    2014-01-01

    The recently developed extended belief rule-based inference methodology (RIMER+) recognizes the need of modeling different types of information and uncertainty that usually coexist in real environments. A home setting with sensors located in different rooms and on different appliances can be considered as a particularly relevant example of such an environment, which brings a range of challenges for sensor-based activity recognition. Although RIMER+ has been designed as a generic decision model that could be applied in a wide range of situations, this paper discusses how this methodology can be adapted to recognize human activities using binary sensors within smart environments. An evaluation of RIMER+ against other state-of-the-art classifiers in terms of accuracy, efficiency and applicability yielded favorable results, especially in situations of input data incompleteness; it demonstrates the potential of this methodology and lays the basis for further research on the topic.

  5. Likelihood-Based Inference in Nonlinear Error-Correction Models

    Kristensen, Dennis; Rahbæk, Anders

    We consider a class of vector nonlinear error correction models where the transfer function (or loadings) of the stationary relationships is nonlinear. This includes in particular the smooth transition models. A general representation theorem is given which establishes the dynamic properties...... and a linear trend in general. Gaussian likelihood-based estimators are considered for the long-run cointegration parameters and the short-run parameters. Asymptotic theory is provided for these, and it is discussed to what extent asymptotic normality and mixed normality can be found. A simulation study......

  6. Effect of age on variability in the production of text-based global inferences.

    Williams, Lynne J; Dunlop, Joseph P; Abdi, Hervé

    2012-01-01

    As we age, our differences in cognitive skills become more visible, an effect especially true for memory and problem solving skills (i.e., fluid intelligence). However, by contrast with fluid intelligence, few studies have examined variability in measures that rely on one's world knowledge (i.e., crystallized intelligence). The current study investigated whether age increased the variability in text-based global inference generation, a measure of crystallized intelligence. Global inference generation requires the integration of textual information and world knowledge and can be expressed as a gist or lesson. Variability in generating two global inferences for a single text was examined in young-old (62 to 69 years), middle-old (70 to 76 years) and old-old (77 to 94 years) adults. The older two groups showed greater variability, with the middle elderly group being most variable. These findings suggest that variability may be a characteristic of both fluid and crystallized intelligence in aging.

  7. Effect of age on variability in the production of text-based global inferences.

    Lynne J Williams

    Full Text Available As we age, our differences in cognitive skills become more visible, an effect especially true for memory and problem solving skills (i.e., fluid intelligence). However, by contrast with fluid intelligence, few studies have examined variability in measures that rely on one's world knowledge (i.e., crystallized intelligence). The current study investigated whether age increased the variability in text-based global inference generation, a measure of crystallized intelligence. Global inference generation requires the integration of textual information and world knowledge and can be expressed as a gist or lesson. Variability in generating two global inferences for a single text was examined in young-old (62 to 69 years), middle-old (70 to 76 years) and old-old (77 to 94 years) adults. The older two groups showed greater variability, with the middle elderly group being most variable. These findings suggest that variability may be a characteristic of both fluid and crystallized intelligence in aging.

  8. The Effect of Simulation-Based Learning on Prospective Teachers' Inference Skills in Teaching Probability

    Koparan, Timur; Yilmaz, Gül Kaleli

    2015-01-01

    The effect of simulation-based probability teaching on the prospective teachers' inference skills has been examined with this research. In line with this purpose, it has been aimed to examine the design, implementation and efficiency of a learning environment for experimental probability. Activities were built on modeling, simulation and the…

  9. Poisson-Based Inference for Perturbation Models in Adaptive Spelling Training

    Baschera, Gian-Marco; Gross, Markus

    2010-01-01

    We present an inference algorithm for perturbation models based on Poisson regression. The algorithm is designed to handle unclassified input with multiple errors described by independent mal-rules. This knowledge representation provides an intelligent tutoring system with local and global information about a student, such as error classification…

  10. Decision making based on emotional images

    Kentaro eKatahira

    2011-10-01

    Full Text Available The emotional outcome of a choice affects subsequent decision making. While the relationship between decision making and emotion has attracted attention, studies on emotion and decision making have been independently developed. In this study, we investigated how the emotional valence of pictures, which was stochastically contingent on participants’ choices, influenced subsequent decision making. In contrast to traditional value-based decision-making studies that used money or food as a reward, the reward value of the decision outcome, which guided the update of value for each choice, is unknown beforehand. To estimate the reward value of emotional pictures from participants’ choice data, we used reinforcement learning models that have successfully been used in previous studies for modeling value-based decision making. Consequently, we found that the estimated reward value was asymmetric between positive and negative pictures. The negative reward value of negative pictures (relative to neutral pictures) was larger in magnitude than the positive reward value of positive pictures. This asymmetry was not observed in valence for an individual picture, which was rated by the participants regarding the emotion experienced upon viewing it. These results suggest that there may be a difference between experienced emotion and the effect of the experienced emotion on subsequent behavior. Our experimental and computational paradigm provides a novel way for quantifying how and what aspects of emotional events affect human behavior. The present study is a first step toward relating a large amount of knowledge in emotion science and in taking computational approaches to value-based decision making.

  11. Continuous Record Laplace-based Inference about the Break Date in Structural Change Models

    Casini, Alessandro; Perron, Pierre

    2018-01-01

    Building upon the continuous record asymptotic framework recently introduced by Casini and Perron (2017a) for inference in structural change models, we propose a Laplace-based (Quasi-Bayes) procedure for the construction of the estimate and confidence set for the date of a structural change. The procedure relies on a Laplace-type estimator defined by an integration-based rather than an optimization-based method. A transformation of the least-squares criterion function is evaluated in order to ...

  12. qPR: An adaptive partial-report procedure based on Bayesian inference.

    Baek, Jongsoo; Lesmes, Luis Andres; Lu, Zhong-Lin

    2016-08-01

    Iconic memory is best assessed with the partial report procedure in which an array of letters appears briefly on the screen and a poststimulus cue directs the observer to report the identity of the cued letter(s). Typically, 6-8 cue delays or 600-800 trials are tested to measure the iconic memory decay function. Here we develop a quick partial report, or qPR, procedure based on a Bayesian adaptive framework to estimate the iconic memory decay function with much reduced testing time. The iconic memory decay function is characterized by an exponential function and a joint probability distribution of its three parameters. Starting with a prior of the parameters, the method selects the stimulus to maximize the expected information gain in the next test trial. It then updates the posterior probability distribution of the parameters based on the observer's response using Bayesian inference. The procedure is reiterated until either the total number of trials or the precision of the parameter estimates reaches a certain criterion. Simulation studies showed that only 100 trials were necessary to reach an average absolute bias of 0.026 and a precision of 0.070 (both in terms of probability correct). A psychophysical validation experiment showed that estimates of the iconic memory decay function obtained with 100 qPR trials exhibited good precision (the half width of the 68.2% credible interval = 0.055) and excellent agreement with those obtained with 1,600 trials of the conventional method of constant stimuli procedure (RMSE = 0.063). Quick partial-report relieves the data collection burden in characterizing iconic memory and makes it possible to assess iconic memory in clinical populations.
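The Bayesian core of such an adaptive procedure can be sketched with a coarse parameter grid: an exponential decay function generates trial likelihoods, and the posterior over its parameters is renormalized after each response. The grids, the floor parameter, and the two example trials below are illustrative assumptions, not the qPR implementation.

```python
# Grid-based sketch of the Bayesian update behind an adaptive partial-report
# procedure: an exponential iconic-memory decay function and a posterior over
# its parameters, updated after each trial. All values are illustrative.
import numpy as np

a0s = np.linspace(0.5, 1.0, 11)      # initial availability grid
taus = np.linspace(0.05, 1.0, 20)    # decay time constant grid (s)
floor = 0.3                          # assumed asymptotic performance

A0, TAU = np.meshgrid(a0s, taus, indexing="ij")
posterior = np.ones_like(A0) / A0.size          # uniform prior

def p_correct(delay):
    """Probability correct at a given cue delay, per parameter combination."""
    return floor + (A0 - floor) * np.exp(-delay / TAU)

def update(delay, correct):
    """Bayes rule: multiply by the trial likelihood, then renormalize."""
    global posterior
    like = p_correct(delay) if correct else 1.0 - p_correct(delay)
    posterior = posterior * like
    posterior /= posterior.sum()

update(0.1, correct=True)    # hypothetical trial outcomes
update(0.6, correct=False)
assert np.isclose(posterior.sum(), 1.0)
```

In the full procedure, the next cue delay would be chosen to maximize the expected information gain under this posterior, which is what shortens testing time.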

  13. Augmentation of Explicit Spatial Configurations by Knowledge-Based Inference on Geometric Fields

    Dan Tappan

    2009-04-01

    A spatial configuration of a rudimentary, static, real-world scene with known objects (animals) and properties (positions and orientations) contains a wealth of syntactic and semantic spatial information that can contribute to a computational understanding far beyond what its quantitative details alone convey. This work presents an approach that (1) quantitatively represents what a configuration explicitly states, (2) integrates this information with implicit, commonsense background knowledge of its objects and properties, (3) infers additional, contextually appropriate, commonsense spatial information from and about their interrelationships, and (4) augments the original representation with this combined information. A semantic network represents explicit, quantitative information in a configuration. An inheritance-based knowledge base of relevant concepts supplies implicit, qualitative background knowledge to support semantic interpretation. Together, these structures provide a simple, nondeductive, constraint-based, geometric logical formalism to infer substantial implicit knowledge for intrinsic and deictic frames of spatial reference.

  14. Staged decision making based on probabilistic forecasting

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One decision-support technique is the cost-loss approach, which determines whether or not to issue a warning or implement mitigation measures (a risk-based method). Under the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. The cost-loss method is not widely used, because it motivates decisions based only on economic values and is relatively static (no reasoning, a yes/no decision). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method better applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty, from known uncertainty to deep uncertainty. Based on the types of uncertainty, concepts for dealing with these situations were analysed and possibly applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences all at once, the idea is that actions and decisions are cut up into smaller pieces, and the decision to implement is finally made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead-time there is in
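The cost-loss warning rule described above reduces to a one-line comparison; a minimal sketch (the monetary figures are purely illustrative):

```python
def issue_warning(cost, loss_avoided, flood_probability):
    """Risk-based cost-loss rule: act when the cost of responding, relative
    to the damage the response would avert, does not exceed the forecast
    probability of the flood event."""
    return cost / loss_avoided <= flood_probability

# Example: a response costing 10k that averts 100k of damage is worthwhile
# whenever the forecast probability is at least 0.1.
warn_high = issue_warning(10_000, 100_000, 0.25)   # warn
warn_low = issue_warning(10_000, 100_000, 0.05)    # do not warn
```

The staged-decision idea in the abstract amounts to re-running this comparison at each stage with updated probabilities and smaller incremental costs, rather than committing to one big decision up front.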

  15. An empirical Bayesian approach for model-based inference of cellular signaling networks

    Klinke David J

    2009-11-01

    Background: A common challenge in systems biology is to infer mechanistic descriptions of a biological process given limited observations of the biological system. Mathematical models are frequently used to represent a belief about the causal relationships among proteins within a signaling network. Bayesian methods provide an attractive framework for inferring the validity of those beliefs in the context of the available data. However, efficient sampling of high-dimensional parameter space and appropriate convergence criteria provide barriers to implementing an empirical Bayesian approach. The objective of this study was to apply an adaptive Markov chain Monte Carlo technique to a typical study of cellular signaling pathways. Results: As an illustrative example, a kinetic model for the early signaling events associated with the epidermal growth factor (EGF) signaling network was calibrated against dynamic measurements observed in primary rat hepatocytes. A convergence criterion, based upon the Gelman-Rubin potential scale reduction factor, was applied to the model predictions. The posterior distributions of the parameters exhibited complicated structure, including significant covariance between specific parameters and a broad range of variance among the parameters. The model predictions, in contrast, were narrowly distributed and were used to identify areas of agreement among a collection of experimental studies. Conclusion: In summary, an empirical Bayesian approach was developed for inferring the confidence that one can place in a particular model that describes signal transduction mechanisms and for inferring inconsistencies in experimental measurements.
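The Gelman-Rubin convergence criterion mentioned above compares within-chain and between-chain variance across parallel MCMC chains. A minimal sketch (not the authors' implementation; the chain data here are synthetic):

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor (PSRF) for one scalar quantity.
    `chains` is an (m, n) array: m chains of n samples each. Values near
    1 indicate the chains have mixed; values well above 1 do not."""
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    W = chains.var(axis=1, ddof=1).mean()          # within-chain variance
    B = n * chain_means.var(ddof=1)                # between-chain variance
    var_hat = (n - 1) / n * W + B / n              # pooled variance estimate
    return np.sqrt(var_hat / W)

rng = np.random.default_rng(0)
mixed = rng.normal(0.0, 1.0, size=(4, 1000))       # four well-mixed chains
split = mixed + np.array([[0.0], [0.0], [5.0], [5.0]])  # two chains offset
```

In practice a threshold such as PSRF < 1.1 is used as the stopping criterion; the `split` example shows how unmixed chains inflate the factor.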

  16. Making judgments about ability: the role of implicit theories of ability in moderating inferences from temporal and social comparison information.

    Butler, R

    2000-05-01

    Two studies examined the novel proposal that implicit theories of intelligence (C. S. Dweck & E. L. Leggett, 1988) moderate both the effects of performance trends on ability inferences and the perceived diagnosticity of temporal versus normative feedback. Results from 613 adolescents and 42 teachers confirmed that entity theorists perceived initial outcome as more diagnostic and inferred higher ability in another (Study 1) and in the self (Study 2) in a declining outcome condition; incremental theorists perceived last outcome as more diagnostic and inferred higher ability in an ascending condition. Experimental induction of beliefs about ability had similar effects. As predicted, self-appraisal was affected more by temporal feedback among incremental theorists and by normative feedback among entity theorists. Results help resolve prior mixed findings regarding order effects and responses to temporal and normative evaluation.

  17. Indexing the Environmental Quality Performance Based on A Fuzzy Inference Approach

    Iswari, Lizda

    2018-03-01

    Environmental performance strongly relates to the quality of human life. In Indonesia, this performance is quantified through the Environmental Quality Index (EQI), which consists of three indicators: a river quality index, an air quality index, and land cover coverage. Currently, the instrument's data are processed by averaging and weighting each index to represent the EQI at the provincial level. However, we found that EQI interpretations may contain some uncertainties and cover a range of circumstances possibly less appropriate for processing under a common statistical approach. In this research, we aim to manage the indicators of the EQI with a more intuitive computation technique and to make some inferences related to environmental performance in 33 provinces of Indonesia. The research was conducted in the three stages of a Mamdani Fuzzy Inference System (MAFIS): fuzzification, data inference, and defuzzification. The data input consists of 10 environmental parameters, and the output is an index of Environmental Quality Performance (EQP). The approach was applied to the 2015 environmental condition data set, with results quantified on a scale of 0 to 100: 10 provinces performed well with an EQP above 80, dominated by provinces in the eastern part of Indonesia; 22 provinces had an EQP between 50 and 80; and one province on Java Island had an EQP below 20. This research shows that environmental quality performance can be quantified without eliminating the nature of the data set, while simultaneously showing environmental behavior along with its spatial pattern distribution.
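A Mamdani fuzzy inference step of the kind described (fuzzification, min-implication inference, max aggregation, centroid defuzzification) can be sketched for a single input. The membership breakpoints below are illustrative, not the paper's actual rule base:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def mamdani_eqp(river_index):
    """Toy Mamdani step: one input (river quality, 0-100), one output
    (EQP, 0-100), two rules. All breakpoints are made up for illustration."""
    xs = np.linspace(0.0, 100.0, 1001)             # output universe
    # Fuzzification: degrees of 'poor' and 'good' river quality
    poor = tri(river_index, -1.0, 0.0, 60.0)
    good = tri(river_index, 40.0, 100.0, 101.0)
    # Rules with min implication, aggregated by max:
    #   IF river is poor THEN EQP is low; IF river is good THEN EQP is high
    out = np.maximum(np.minimum(poor, tri(xs, 0.0, 20.0, 50.0)),
                     np.minimum(good, tri(xs, 50.0, 80.0, 100.0)))
    # Centroid defuzzification back to a crisp 0-100 score
    return float(np.sum(xs * out) / np.sum(out))
```

The full system in the paper does the same with 10 inputs and a larger rule base.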

  18. EMR-based medical knowledge representation and inference via Markov random fields and distributed representation learning.

    Zhao, Chao; Jiang, Jingchi; Guan, Yi; Guo, Xitong; He, Bin

    2018-05-01

    Electronic medical records (EMRs) contain medical knowledge that can be used for clinical decision support (CDS). Our objective is to develop a general system that can extract and represent knowledge contained in EMRs to support three CDS tasks-test recommendation, initial diagnosis, and treatment plan recommendation-given the condition of a patient. We extracted four kinds of medical entities from records and constructed an EMR-based medical knowledge network (EMKN), in which nodes are entities and edges reflect their co-occurrence in a record. Three bipartite subgraphs (bigraphs) were extracted from the EMKN, one to support each task. One part of the bigraph was the given condition (e.g., symptoms), and the other was the condition to be inferred (e.g., diseases). Each bigraph was regarded as a Markov random field (MRF) to support the inference. We proposed three graph-based energy functions and three likelihood-based energy functions. Two of these functions are based on knowledge representation learning and can provide distributed representations of medical entities. Two EMR datasets and three metrics were utilized to evaluate the performance. As a whole, the evaluation results indicate that the proposed system outperformed the baseline methods. The distributed representation of medical entities does reflect similarity relationships with respect to knowledge level. Combining EMKN and MRF is an effective approach for general medical knowledge representation and inference. Different tasks, however, require individually designed energy functions. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Assessment of statistical education in Indonesia: Preliminary results and initiation to simulation-based inference

    Saputra, K. V. I.; Cahyadi, L.; Sembiring, U. A.

    2018-01-01

    In this paper, we assess our traditional elementary statistics education and introduce elementary statistics with simulation-based inference. To assess our statistics class, we adapt the well-known CAOS (Comprehensive Assessment of Outcomes in Statistics) test, which serves as an external measure of students' basic statistical literacy. This test is generally accepted as a measure of statistical literacy. We also introduce a new teaching method in the elementary statistics class. Unlike the traditional elementary statistics course, we introduce a simulation-based inference method for conducting hypothesis tests. The literature has shown that this new teaching method works very well in increasing students' understanding of statistics.

  20. Dopamine and Effort-Based Decision Making

    Irma Triasih Kurniawan

    2011-06-01

    Motivational theories of choice focus on the influence of goal values and strength of reinforcement to explain behavior. By contrast, relatively little is known concerning how the cost of an action, such as effort expended, contributes to a decision to act. Effort-based decision making addresses how we make an action choice based on an integration of action and goal values. Here we review behavioral and neurobiological data regarding the representation of effort as action cost, and how this impacts decision making. Although organisms expend effort to obtain a desired reward, there is a striking sensitivity to the amount of effort required, such that the net preference for an action decreases as effort cost increases. We discuss the contribution of the neurotransmitter dopamine (DA) towards overcoming response costs and in enhancing an animal's motivation towards effortful actions. We also consider the contribution of brain structures, including the basal ganglia (BG) and anterior cingulate cortex (ACC), in the internal generation of action involving a translation of reward expectation into effortful action.

  1. Inferring nonlinear gene regulatory networks from gene expression data based on distance correlation.

    Xiaobo Guo

    Nonlinear dependence is common in the regulation mechanisms of gene regulatory networks (GRNs). It is vital to properly measure or test nonlinear dependence in real data for reconstructing GRNs and understanding the complex regulatory mechanisms within the cellular system. A recently developed measure called the distance correlation (DC) has been shown to be powerful and computationally efficient for detecting nonlinear dependence in many situations. In this work, we incorporate the DC into inferring GRNs from gene expression data without any underlying distribution assumptions. We propose three DC-based GRN inference algorithms: CLR-DC, MRNET-DC and REL-DC, and then compare them with mutual information (MI)-based algorithms by analyzing two simulated data sets (benchmark GRNs from the DREAM challenge and GRNs generated by the SynTReN network generator) and an experimentally determined SOS DNA repair network in Escherichia coli. According to both the receiver operating characteristic (ROC) curve and the precision-recall (PR) curve, our proposed algorithms significantly outperform the MI-based algorithms in GRN inference.
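The distance correlation used by these algorithms is computed from doubly centered pairwise distance matrices. A minimal NumPy sketch (the data are synthetic, chosen so a purely nonlinear dependence is visible to DC but invisible to Pearson correlation):

```python
import numpy as np

def distance_correlation(x, y):
    """Szekely's distance correlation between two 1-D samples. It is zero
    (in the population) iff X and Y are independent, unlike Pearson's r,
    which only detects linear dependence."""
    x = np.asarray(x, float)[:, None]
    y = np.asarray(y, float)[:, None]

    def centered(z):
        d = np.abs(z - z.T)                        # pairwise distance matrix
        return d - d.mean(0) - d.mean(1)[:, None] + d.mean()

    A, B = centered(x), centered(y)
    dcov2 = (A * B).mean()                         # squared distance covariance
    dvar = np.sqrt((A * A).mean() * (B * B).mean())
    return np.sqrt(max(dcov2, 0.0) / dvar) if dvar > 0 else 0.0

t = np.linspace(-1.0, 1.0, 200)
nonlinear = distance_correlation(t, t ** 2)        # strong nonlinear link
pearson = abs(np.corrcoef(t, t ** 2)[0, 1])        # Pearson misses it
```

In the CLR-DC, MRNET-DC and REL-DC variants, this pairwise score simply replaces MI as the dependence measure fed into the existing network-inference schemes.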

  2. Human disease MiRNA inference by combining target information based on heterogeneous manifolds.

    Ding, Pingjian; Luo, Jiawei; Liang, Cheng; Xiao, Qiu; Cao, Buwen

    2018-04-01

    The emergence of network medicine has provided great insight into the identification of disease-related molecules, which could help with the development of personalized medicine. However, the state-of-the-art methods could neither simultaneously consider target information and the known miRNA-disease associations nor effectively explore novel gene-disease associations as a by-product during the process of inferring disease-related miRNAs. Computational methods incorporating multiple sources of information offer more opportunities to infer disease-related molecules, including miRNAs and genes in heterogeneous networks at a system level. In this study, we developed a novel algorithm, named inference of Disease-related MiRNAs based on Heterogeneous Manifold (DMHM), to accurately and efficiently identify miRNA-disease associations by integrating multi-omics data. Graph-based regularization was utilized to obtain a smooth function on the data manifold, which constitutes the main principle of DMHM. The novelty of this framework lies in the relatedness between diseases and miRNAs, which are measured via heterogeneous manifolds on heterogeneous networks integrating target information. To demonstrate the effectiveness of DMHM, we conducted comprehensive experiments based on HMDD datasets and compared DMHM with six state-of-the-art methods. Experimental results indicated that DMHM significantly outperformed the other six methods under fivefold cross validation and de novo prediction tests. Case studies have further confirmed the practical usefulness of DMHM. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Inferring late-Holocene climate in the Ecuadorian Andes using a chironomid-based temperature inference model

    Matthews-Bird, Frazer; Brooks, Stephen J.; Holden, Philip B.; Montoya, Encarni; Gosling, William D.

    2016-06-01

    Presented here is the first chironomid calibration data set for tropical South America. Surface sediments were collected from 59 lakes across Bolivia (15 lakes), Peru (32 lakes), and Ecuador (12 lakes) between 2004 and 2013 over an altitudinal gradient from 150 m above sea level (a.s.l) to 4655 m a.s.l, between 0-17° S and 64-78° W. The study sites cover a mean annual temperature (MAT) gradient of 25 °C. In total, 55 chironomid taxa were identified in the 59 calibration data set lakes. When used as a single explanatory variable, MAT explains 12.9 % of the variance (λ1/λ2 = 1.431). Two inference models were developed using weighted averaging (WA) and Bayesian methods. The best-performing model using conventional statistical methods was a WA (inverse) model (R2jack = 0.890; RMSEPjack = 2.404 °C, RMSEP - root mean squared error of prediction; mean biasjack = -0.017 °C; max biasjack = 4.665 °C). The Bayesian method produced a model with R2jack = 0.909, RMSEPjack = 2.373 °C, mean biasjack = 0.598 °C, and max biasjack = 3.158 °C. Both models were used to infer past temperatures from a ca. 3000-year record from the tropical Andes of Ecuador, Laguna Pindo. Inferred temperatures fluctuated around modern-day conditions but showed significant departures at certain intervals (ca. 1600 cal yr BP; ca. 3000-2500 cal yr BP). Both methods (WA and Bayesian) showed similar patterns of temperature variability; however, the magnitude of fluctuations differed. In general the WA method was more variable and often underestimated Holocene temperatures (by ca. -7 ± 2.5 °C relative to the modern period). The Bayesian method provided temperature anomaly estimates for cool periods that lay within the expected range of the Holocene (ca. -3 ± 3.4 °C). The error associated with both reconstructions is consistent with a constant temperature of 20 °C for the past 3000 years. We would caution, however, against an over-interpretation at this stage. The reconstruction can only
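The weighted-averaging (WA) inference used above amounts to two abundance-weighted means: taxon temperature optima are estimated from the calibration lakes, then a fossil sample's temperature is inferred from its taxon abundances. A toy sketch (abundances and temperatures invented; the inverse deshrinking regression used in practice is omitted):

```python
import numpy as np

# Toy calibration set: relative abundances of 3 chironomid taxa
# (rows: lakes) and the observed mean annual temperature of each lake.
abund = np.array([[0.7, 0.2, 0.1],
                  [0.4, 0.4, 0.2],
                  [0.1, 0.3, 0.6],
                  [0.0, 0.2, 0.8]])
mat = np.array([22.0, 16.0, 9.0, 5.0])             # degrees C

# WA optima: abundance-weighted mean temperature for each taxon
optima = abund.T @ mat / abund.sum(axis=0)

def wa_infer(sample):
    """Infer temperature from a fossil sample's taxon abundances as the
    abundance-weighted mean of the taxon optima."""
    sample = np.asarray(sample, float)
    return float(sample @ optima / sample.sum())
```

A sample dominated by warm-water taxa therefore reconstructs warmer than one dominated by cold-water taxa; the Bayesian model in the paper replaces these point estimates with full posterior distributions.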

  4. Fuzzy logic inference-based Pavement Friction Management and real-time slippery warning systems: A proof of concept study.

    Najafi, Shahriar; Flintsch, Gerardo W; Khaleghian, Seyedmeysam

    2016-05-01

    Minimizing roadway crashes and fatalities is one of the primary objectives of highway engineers, and can be achieved in part through appropriate maintenance practices. Maintaining an appropriate level of friction is a crucial maintenance practice, due to the effect it has on roadway safety. This paper presents a fuzzy logic inference system that predicts the rate of vehicle crashes based on traffic level, speed limit, and surface friction. Mamdani and Sugeno fuzzy controllers were used to develop the model. The application of the proposed fuzzy control system in a real-time slippery road warning system is demonstrated as a proof of concept. The results of this study provide a decision support model for highway agencies to monitor their network's friction and make appropriate judgments to correct deficiencies based on crash risk. Furthermore, this model can be implemented in the connected vehicle environment to warn drivers of potentially slippery locations. Published by Elsevier Ltd.

  5. Supervisory Adaptive Network-Based Fuzzy Inference System (SANFIS Design for Empirical Test of Mobile Robot

    Yi-Jen Mon

    2012-10-01

    A supervisory Adaptive Network-based Fuzzy Inference System (SANFIS) is proposed for the empirical control of a mobile robot. This controller includes an ANFIS controller and a supervisory controller. The ANFIS controller is tuned off-line by an adaptive fuzzy inference system; the supervisory controller is designed to compensate for the approximation error between the ANFIS controller and the ideal controller, and to drive the trajectory of the system onto a specified surface (called the sliding surface or switching surface) while keeping the trajectory on this switching surface continuously to guarantee system stability. The SANFIS controller achieves favourable empirical control performance when driving the mobile robot along a square path. Practical experimental results demonstrate that the proposed SANFIS achieves better control performance than an ANFIS controller alone for empirical control of the mobile robot.

  6. In silico model-based inference: a contemporary approach for hypothesis testing in network biology.

    Klinke, David J

    2014-01-01

    Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics. © 2014 American Institute of Chemical Engineers.

  7. Inferring topologies via driving-based generalized synchronization of two-layer networks

    Wang, Yingfei; Wu, Xiaoqun; Feng, Hui; Lu, Jun-an; Xu, Yuhua

    2016-05-01

    The interaction topology among the constituents of a complex network plays a crucial role in the network’s evolutionary mechanisms and functional behaviors. However, some network topologies are usually unknown or uncertain. Meanwhile, coupling delays are ubiquitous in various man-made and natural networks. Hence, it is necessary to gain knowledge of the whole or partial topology of a complex dynamical network by taking into consideration communication delay. In this paper, topology identification of complex dynamical networks is investigated via generalized synchronization of a two-layer network. Particularly, based on the LaSalle-type invariance principle of stochastic differential delay equations, an adaptive control technique is proposed by constructing an auxiliary layer and designing proper control input and updating laws so that the unknown topology can be recovered upon successful generalized synchronization. Numerical simulations are provided to illustrate the effectiveness of the proposed method. The technique provides a certain theoretical basis for topology inference of complex networks. In particular, when the considered network is composed of systems with high-dimension or complicated dynamics, a simpler response layer can be constructed, which is conducive to circuit design. Moreover, it is practical to take into consideration perturbations caused by control input. Finally, the method is applicable to infer topology of a subnetwork embedded within a complex system and locate hidden sources. We hope the results can provide basic insight into further research endeavors on understanding practical and economical topology inference of networks.

  8. PIA: An Intuitive Protein Inference Engine with a Web-Based User Interface.

    Uszkoreit, Julian; Maerkens, Alexandra; Perez-Riverol, Yasset; Meyer, Helmut E; Marcus, Katrin; Stephan, Christian; Kohlbacher, Oliver; Eisenacher, Martin

    2015-07-02

    Protein inference connects the peptide spectrum matches (PSMs) obtained from database search engines back to proteins, which are typically at the heart of most proteomics studies. Different search engines yield different PSMs and thus different protein lists. Analysis of results from one or multiple search engines is often hampered by different data exchange formats and lack of convenient and intuitive user interfaces. We present PIA, a flexible software suite for combining PSMs from different search engine runs and turning these into consistent results. PIA can be integrated into proteomics data analysis workflows in several ways. A user-friendly graphical user interface can be run either locally or (e.g., for larger core facilities) from a central server. For automated data processing, stand-alone tools are available. PIA implements several established protein inference algorithms and can combine results from different search engines seamlessly. On several benchmark data sets, we show that PIA can identify a larger number of proteins at the same protein FDR when compared to that using inference based on a single search engine. PIA supports the majority of established search engines and data in the mzIdentML standard format. It is implemented in Java and freely available at https://github.com/mpc-bioinformatics/pia.

  9. Everyday conversation requires cognitive inference: neural bases of comprehending implicated meanings in conversations.

    Jang, Gijeong; Yoon, Shin-ae; Lee, Sung-Eun; Park, Haeil; Kim, Joohan; Ko, Jeong Hoon; Park, Hae-Jeong

    2013-11-01

    In ordinary conversations, literal meanings of an utterance are often quite different from implicated meanings, and inference about implicated meanings is essentially required for successful comprehension of the speaker's utterances. Inference of implicated meanings is based on the listener's assumption that the conversational partner says only relevant matters, according to the maxim of relevance in Grice's theory of conversational implicature. To investigate the neural correlates of comprehending implicated meanings under the maxim of relevance, a total of 23 participants underwent an fMRI task with a series of conversational pairs, each consisting of a question and an answer. The experimental paradigm was composed of three conditions: explicit answers, moderately implicit answers, and highly implicit answers. Participants were asked to decide whether the answer to the Yes/No question meant 'Yes' or 'No'. Longer reaction time was required for the highly implicit answers than for the moderately implicit answers, without affecting accuracy. The fMRI results show that the left anterior temporal lobe, left angular gyrus, and left posterior middle temporal gyrus had stronger activation in both moderately and highly implicit conditions than in the explicit condition. Comprehension of highly implicit answers produced increased activation in additional regions, including the left inferior frontal gyrus, left medial prefrontal cortex, left posterior cingulate cortex and right anterior temporal lobe. The activation results indicate involvement of these regions in the inference process that builds coherence between literally irrelevant but pragmatically associated utterances under the maxim of relevance. In particular, the left anterior temporal lobe showed high sensitivity to the level of implicitness, with increased activation for highly versus moderately implicit conditions, which implies its central role in inference such as semantic integration. The right

  10. Inference-Based Similarity Search in Randomized Montgomery Domains for Privacy-Preserving Biometric Identification.

    Wang, Yi; Wan, Jianwu; Guo, Jun; Cheung, Yiu-Ming; Yuen, Pong C

    2017-07-14

    Similarity search is essential to many important applications and often involves searching at scale on high-dimensional data based on their similarity to a query. In biometric applications, recent vulnerability studies have shown that adversarial machine learning can compromise biometric recognition systems by exploiting the biometric similarity information. Existing methods for biometric privacy protection are in general based on pairwise matching of secured biometric templates and have inherent limitations in search efficiency and scalability. In this paper, we propose an inference-based framework for privacy-preserving similarity search in Hamming space. Our approach builds on an obfuscated distance measure that can conceal Hamming distance in a dynamic interval. Such a mechanism enables us to systematically design statistically reliable methods for retrieving most likely candidates without knowing the exact distance values. We further propose to apply Montgomery multiplication for generating search indexes that can withstand adversarial similarity analysis, and show that information leakage in randomized Montgomery domains can be made negligibly small. Our experiments on public biometric datasets demonstrate that the inference-based approach can achieve a search accuracy close to the best performance possible with secure computation methods, but the associated cost is reduced by orders of magnitude compared to cryptographic primitives.

  11. A Visual Analysis Approach for Inferring Personal Job and Housing Locations Based on Public Bicycle Data

    Xiaoying Shi

    2017-07-01

    Information concerning the home and workplace of residents is the basis of analyzing the urban job-housing spatial relationship. Traditional methods conduct time-consuming user surveys to obtain personal job and housing location information. Some new methods define rules to detect personal places based on human mobility data. However, because the travel patterns of residents are variable, simple rule-based methods are unable to generalize highly changing and complex travel modes. In this paper, we propose a visual analysis approach to assist the analyzer in inferring personal job and housing locations interactively based on public bicycle data. All users are first clustered to find potential commuting users. Then, several visual views are designed to find the key candidate stations for a specific user, and the visited temporal pattern of stations and the user’s hire behavior are analyzed, which helps with the inference of station semantic meanings. Finally, a number of users’ job and housing locations are detected by the analyzer and visualized. Our approach can manage the complex and diverse cycling habits of users. The effectiveness of the approach is shown through case studies based on a real-world public bicycle dataset.

  12. A bayesian inference-based detection mechanism to defend medical smartphone networks against insider attacks

    Meng, Weizhi; Li, Wenjuan; Xiang, Yang

    2017-01-01

    and experience for both patients and healthcare workers, and the underlying network architecture to support such devices is also referred to as medical smartphone networks (MSNs). MSNs, similar to other networks, are subject to a wide range of attacks (e.g. leakage of sensitive patient information by a malicious insider). In this work, we focus on MSNs and present a compact but efficient trust-based approach using Bayesian inference to identify malicious nodes in such an environment. We then demonstrate the effectiveness of our approach in detecting malicious nodes by evaluating the deployment of our proposed

  13. Optimal inverse magnetorheological damper modeling using shuffled frog-leaping algorithm–based adaptive neuro-fuzzy inference system approach

    Xiufang Lin

    2016-08-01

    Magnetorheological dampers have become prominent semi-active control devices for vibration mitigation of structures which are subjected to severe loads. However, the damping force cannot be controlled directly due to the inherent nonlinear characteristics of the magnetorheological dampers. Therefore, for fully exploiting the capabilities of the magnetorheological dampers, one of the challenging aspects is to develop an accurate inverse model which can appropriately predict the input voltage to control the damping force. In this article, a hybrid modeling strategy combining shuffled frog-leaping algorithm and adaptive-network-based fuzzy inference system is proposed to model the inverse dynamic characteristics of the magnetorheological dampers for improving the modeling accuracy. The shuffled frog-leaping algorithm is employed to optimize the premise parameters of the adaptive-network-based fuzzy inference system while the consequent parameters are tuned by a least-squares estimation method, here known as the shuffled frog-leaping algorithm-based adaptive-network-based fuzzy inference system approach. To evaluate the effectiveness of the proposed approach, the inverse modeling results based on the shuffled frog-leaping algorithm-based adaptive-network-based fuzzy inference system approach are compared with those based on the adaptive-network-based fuzzy inference system and genetic algorithm–based adaptive-network-based fuzzy inference system approaches. Analysis of variance test is carried out to statistically compare the performance of the proposed methods and the results demonstrate that the shuffled frog-leaping algorithm-based adaptive-network-based fuzzy inference system strategy outperforms the other two methods in terms of modeling (training) accuracy and checking accuracy.

  14. Enabling Participatory Decision Making Through Web-Based GIS

    Sheng, Grant; Yam, Kevin; Hassas, Aranak

    2001-01-01

    obligations to fellow human beings, animals, and the environment. People perceive that information travels essentially one way in these processes and that the voices of the community and its members are not heard. Consequently, they feel excluded from the actual decision-making process and even from being able to participate meaningfully in it. Recent advances in informatics and geomatics technology, such as the Internet, web-based software, and geographic information systems (GIS), have made it possible to address these issues more effectively. We believe that the combined features of two software packages developed at the York Centre for Applied Sustainability can facilitate access to information, provide a virtual forum for discussion and debate, and make it possible for individuals to participate in the decision-making process and to infer people's values from their choice criteria selection.

  15. Bayesian inference of earthquake parameters from buoy data using a polynomial chaos-based surrogate

    Giraldi, Loic

    2017-04-07

    This work addresses the estimation of the parameters of an earthquake model from the resulting tsunami, with an application to the Chile 2010 event. We are particularly interested in the Bayesian inference of the location, the orientation, and the slip of an Okada-based model of the earthquake ocean floor displacement. The tsunami numerical model is based on the GeoClaw software, while the observational data are provided by a single DART® buoy. We propose in this paper a methodology based on polynomial chaos expansion to construct a surrogate model of the wave height at the buoy location. A correlated noise model is first proposed to represent the discrepancy between the computational model and the data. This step is necessary, as a classical independent Gaussian noise is shown to be unsuitable for modeling the error and to prevent convergence of the Markov chain Monte Carlo sampler. Second, the polynomial chaos model is improved to handle the variability of the arrival time of the wave, using a preconditioned non-intrusive spectral method. Finally, the construction of a reduced model dedicated to Bayesian inference is proposed. Numerical results are presented and discussed.
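    The surrogate-plus-sampler pattern can be illustrated on a one-parameter toy problem: project a cheap stand-in forward model onto a Legendre (polynomial chaos) basis, then run a random-walk Metropolis sampler against the surrogate instead of the expensive solver. Everything here (the forward model, the prior, the noise level) is an illustrative assumption, and the paper's correlated-noise model and arrival-time preconditioning are omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for the expensive forward model (e.g., a tsunami solver).
def forward(theta):
    return np.sin(1.5 * theta) + 0.5 * theta

# --- Polynomial chaos surrogate: Legendre projection for theta ~ U(-1, 1) ---
nodes, weights = np.polynomial.legendre.leggauss(12)   # Gauss-Legendre rule
Y = forward(nodes)
deg = 6
coeffs = [np.sum(weights * Y * np.polynomial.legendre.Legendre.basis(k)(nodes))
          / (2.0 / (2 * k + 1))                        # ||P_k||^2 on [-1, 1]
          for k in range(deg + 1)]

def surrogate(theta):
    return sum(c * np.polynomial.legendre.Legendre.basis(k)(theta)
               for k, c in enumerate(coeffs))

# --- Metropolis sampling of the posterior, using only the cheap surrogate ---
theta_true, sigma = 0.4, 0.05
data = forward(theta_true) + rng.normal(0.0, sigma)

def log_post(theta):
    if abs(theta) > 1.0:
        return -np.inf                                  # uniform prior support
    return -0.5 * ((data - surrogate(theta)) / sigma) ** 2

samples, cur = [], 0.0
for _ in range(5000):
    prop = cur + rng.normal(0.0, 0.1)                   # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(cur):
        cur = prop
    samples.append(cur)
posterior = np.array(samples[1000:])
```

    The payoff is that each Markov chain step evaluates a polynomial rather than the full PDE solve, which is what makes long chains affordable.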

  16. Bayesian inference for heterogeneous caprock permeability based on above zone pressure monitoring

    Namhata, Argha; Small, Mitchell J.; Dilmore, Robert M.; Nakles, David V.; King, Seth

    2017-02-01

    The presence of faults/fractures or highly permeable zones in the primary sealing caprock of a CO2 storage reservoir can result in leakage of CO2. Monitoring of leakage requires the capability to detect and resolve the onset, location, and volume of leakage in a systematic and timely manner, and pressure-based monitoring possesses such capabilities. This study demonstrates a basis for monitoring network design based on the characterization of CO2 leakage scenarios through an assessment of the integrity and permeability of the caprock inferred from above-zone pressure measurements. Four representative heterogeneous fractured seal types are characterized, with seal permeability ranging from highly permeable to impermeable. Based on Bayesian classification theory, the probability of each fractured-caprock scenario given above-zone pressure measurements with measurement error is inferred. The sensitivity to injection rate and caprock thickness is also evaluated, and the probability of proper classification is calculated. The time required to distinguish between above-zone pressure outcomes and the associated leakage scenarios is also computed.
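    The Bayesian classification step can be illustrated compactly: given a predicted above-zone pressure signature for each seal scenario and a Gaussian measurement-error model, Bayes' rule yields the probability of each scenario from a single measurement. All numbers below are illustrative assumptions, not values from the study.

```python
import numpy as np

# Four hypothetical seal scenarios with their predicted above-zone pressure
# anomalies (MPa) at the monitoring time; values are illustrative only.
scenarios = {"impermeable": 0.00, "low-perm": 0.05,
             "moderate": 0.20, "highly permeable": 0.60}
sigma = 0.05                        # assumed measurement-error std. dev. (MPa)
prior = {s: 0.25 for s in scenarios}  # equal prior over scenarios

def posterior(measured):
    """P(scenario | measurement) via Bayes' rule with Gaussian error."""
    like = {s: np.exp(-0.5 * ((measured - mu) / sigma) ** 2)
            for s, mu in scenarios.items()}
    z = sum(prior[s] * like[s] for s in scenarios)
    return {s: prior[s] * like[s] / z for s in scenarios}

post = posterior(0.18)    # one observed pressure anomaly of 0.18 MPa
```

    Repeating this over a time series of measurements (multiplying likelihoods) is what drives the paper's question of how long it takes to distinguish the scenarios.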

  17. Risk Mapping of Cutaneous Leishmaniasis via a Fuzzy C Means-based Neuro-Fuzzy Inference System

    P. Akhavan

    2014-10-01

    Finding pathogenic factors and how they spread in the environment has recently become a global concern. Cutaneous Leishmaniasis (CL), caused by Leishmania parasites, is a parasitic disease transmitted to humans by phlebotomine sand fly vectors. Studies show that economic situation, cultural issues, and environmental and ecological conditions can affect the prevalence of this disease. In this study, data mining is used to predict the CL prevalence rate and obtain a risk map, based on the environmental parameters that affect CL and using a Neuro-Fuzzy system. The learning capacity of neural networks on one hand and the reasoning power of fuzzy systems on the other make Neuro-Fuzzy systems very efficient. To predict the CL prevalence rate, an adaptive Neuro-Fuzzy inference system was applied whose initial membership functions were determined by fuzzy C-means clustering. Given the high incidence of CL in Ilam province, the counties of Ilam, Mehran, and Dehloran were examined and evaluated. The CL prevalence rate was predicted for 2012 from maps of the relevant environmental and topographic properties, including temperature, moisture, annual rainfall, vegetation, and elevation. Results indicate that the model with the fuzzy C-means clustering structure achieves acceptable RMSE values on both training and checking data, supporting our analyses. Using the proposed data mining technology, the pattern of spatial disease distribution and vulnerable areas become identifiable, and the map can be used by public health experts and decision makers as a useful tool in management and optimal decision-making.
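    The fuzzy C-means step that supplies the initial membership functions can be sketched from scratch; the implementation and data below are generic illustrations, not the authors' GIS inputs. Each resulting cluster centre would seed one Gaussian membership function, with its width taken from the cluster spread.

```python
import numpy as np

rng = np.random.default_rng(2)

def fuzzy_c_means(X, c=3, m=2.0, iters=100):
    """Plain fuzzy C-means: alternate centre and membership updates.
    Returns cluster centres and the fuzzy membership matrix U (n x c)."""
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        um = U ** m
        centres = (um.T @ X) / um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U = d ** (-2.0 / (m - 1.0))          # standard FCM membership update
        U /= U.sum(axis=1, keepdims=True)
    return centres, U

# Three synthetic "environmental feature" clusters; the data are illustrative.
true = np.array([[0.0, 0.0], [5.0, 5.0], [10.0, 0.0]])
X = np.vstack([t + rng.normal(0.0, 0.5, (50, 2)) for t in true])
centres, U = fuzzy_c_means(X)
```

    Using the soft memberships (rather than hard k-means assignments) to place the initial Gaussians is the design choice the abstract credits for the improved model precision.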

  18. Risk Mapping of Cutaneous Leishmaniasis via a Fuzzy C Means-based Neuro-Fuzzy Inference System

    Akhavan, P.; Karimi, M.; Pahlavani, P.

    2014-10-01

    Finding pathogenic factors and how they spread in the environment has recently become a global concern. Cutaneous Leishmaniasis (CL), caused by Leishmania parasites, is a parasitic disease transmitted to humans by phlebotomine sand fly vectors. Studies show that economic situation, cultural issues, and environmental and ecological conditions can affect the prevalence of this disease. In this study, data mining is used to predict the CL prevalence rate and obtain a risk map, based on the environmental parameters that affect CL and using a Neuro-Fuzzy system. The learning capacity of neural networks on one hand and the reasoning power of fuzzy systems on the other make Neuro-Fuzzy systems very efficient. To predict the CL prevalence rate, an adaptive Neuro-Fuzzy inference system was applied whose initial membership functions were determined by fuzzy C-means clustering. Given the high incidence of CL in Ilam province, the counties of Ilam, Mehran, and Dehloran were examined and evaluated. The CL prevalence rate was predicted for 2012 from maps of the relevant environmental and topographic properties, including temperature, moisture, annual rainfall, vegetation, and elevation. Results indicate that the model with the fuzzy C-means clustering structure achieves acceptable RMSE values on both training and checking data, supporting our analyses. Using the proposed data mining technology, the pattern of spatial disease distribution and vulnerable areas become identifiable, and the map can be used by public health experts and decision makers as a useful tool in management and optimal decision-making.

  19. Problem solving and inference mechanisms

    Furukawa, K; Nakajima, R; Yonezawa, A; Goto, S; Aoyama, A

    1982-01-01

    The heart of the fifth generation computer will be powerful mechanisms for problem solving and inference. A deduction-oriented language is to be designed, which will form the core of the whole computing system. The language is based on predicate logic with the extended features of structuring facilities, meta structures and relational data base interfaces. Parallel computation mechanisms and specialized hardware architectures are being investigated to make possible efficient realization of the language features. The project includes research into an intelligent programming system, a knowledge representation language and system, and a meta inference system to be built on the core. 30 references.

  20. Limitations of a metabolic network-based reverse ecology method for inferring host-pathogen interactions.

    Takemoto, Kazuhiro; Aie, Kazuki

    2017-05-25

    Host-pathogen interactions are important in a wide range of research fields. Given the importance of metabolic crosstalk between hosts and pathogens, a metabolic network-based reverse ecology method was proposed to infer these interactions. However, the validity of this method remains unclear because of the various explanations presented and the influence of potentially confounding factors that have thus far been neglected. We re-evaluated the importance of the reverse ecology method for evaluating host-pathogen interactions while statistically controlling for confounding effects using oxygen requirement, genome, metabolic network, and phylogeny data. Our data analyses showed that host-pathogen interactions were more strongly influenced by genome size, primary network parameters (e.g., number of edges), oxygen requirement, and phylogeny than by the reverse ecology-based measures. These results indicate the limitations of the reverse ecology method; however, they do not discount the importance of adopting reverse ecology approaches altogether. Rather, we highlight the need for developing more suitable methods for inferring host-pathogen interactions and conducting more careful examinations of the relationships between metabolic networks and host-pathogen interactions.

  1. Value-Based Standards Guide Sexism Inferences for Self and Others.

    Mitamura, Chelsea; Erickson, Lynnsey; Devine, Patricia G

    2017-09-01

    People often disagree about what constitutes sexism, and these disagreements can be both socially and legally consequential. It is unclear, however, why or how people come to different conclusions about whether something or someone is sexist. Previous research on judgments about sexism has focused on the perceiver's gender and attitudes, but neither of these variables identifies comparative standards that people use to determine whether any given behavior (or person) is sexist. Extending Devine and colleagues' values framework (Devine, Monteith, Zuwerink, & Elliot, 1991; Plant & Devine, 1998), we argue that, when evaluating others' behavior, perceivers rely on the morally-prescriptive values that guide their own behavior toward women. In a series of 3 studies we demonstrate that (1) people's personal standards for sexism in their own and others' behavior are each related to their values regarding sexism, (2) these values predict how much behavioral evidence people need to infer sexism, and (3) people with stringent, but not lenient, value-based standards get angry and try to regulate a sexist perpetrator's behavior to reduce sexism. Furthermore, these personal values are related to all outcomes in the present work above and beyond other person characteristics previously used to predict sexism inferences. We discuss the implications of differing value-based standards for explaining and reconciling disputes over what constitutes sexist behavior.

  2. Using a virtual reality in the inference based treatment of compulsive hoarding

    Marie-Eve St-Pierre-Delorme

    2016-07-01

    The present study evaluated the efficacy of adding a virtual reality (VR) component to the treatment of compulsive hoarding (CH) following inference-based therapy. Participants were randomly assigned to either an experimental or a control condition, with seven participants in each. Five one-hour sessions were administered weekly. The level of clutter in the bedroom tended to diminish more in the experimental group than in the control group, F(2, 24) = 2.28, p = .10. In addition, the results demonstrated that both groups were immersed and present in the environment. The results on post-treatment measures of CH (Saving Inventory-Revised, Saving Cognitions Inventory, and Clutter Image Rating scale) demonstrate the efficacy of inference-based therapy in terms of symptom reduction. Overall, these results suggest that the creation of a virtual environment may be effective in the treatment of CH by helping compulsive hoarders take action over their clutter.

  3. Determination of Indonesian palm-oil-based bioenergy sustainability indicators using fuzzy inference system

    Arkeman, Y.; Rizkyanti, R. A.; Hambali, E.

    2017-05-01

    The development of Indonesian palm-oil-based bioenergy faces an international challenge regarding sustainability, as indicated by the establishment of standards on sustainable bioenergy. Currently, Indonesia's sustainability standards are limited to palm-oil cultivation, while other standards do not fit the real conditions of Indonesian palm-oil-based bioenergy. Indonesia therefore requires its own sustainability indicators for palm-oil-based bioenergy to gain recognition and ease of marketing. The sustainability indicators were determined in three stages: preliminary analysis, indicator assessment (using a fuzzy inference system), and system validation. The Global Bioenergy Partnership (GBEP) standard was used for the assessment because it is general in scope, internationally accepted, and balances environmental, economic, and social aspects. Results show that the fuzzy inference system method yields 21 sustainability indicators, and the system developed has an accuracy of 85%.

  4. Structure-based inference of molecular functions of proteins of unknown function from Berkeley Structural Genomics Center

    Kim, Sung-Hou; Shin, Dong Hae; Hou, Jingtong; Chandonia, John-Marc; Das, Debanu; Choi, In-Geol; Kim, Rosalind; Kim, Sung-Hou

    2007-09-02

    Advances in sequence genomics have resulted in the accumulation of a huge number of protein sequences derived from genome sequences. However, the functions of a large portion of these proteins cannot be inferred with current methods of detecting sequence homology to proteins of known function. Three-dimensional structure can have an important impact in providing inference of molecular function (physical and chemical function) of a protein of unknown function. Structural genomics centers worldwide have been determining many 3-D structures of proteins of unknown function, and their possible molecular functions have been inferred from those structures. Combined with bioinformatics and enzymatic assay tools, the successful acceleration of protein structure determination through high-throughput pipelines enables the rapid functional annotation of a large fraction of hypothetical proteins. We present a brief summary of the process we used at the Berkeley Structural Genomics Center to infer molecular functions of proteins of unknown function.

  5. Travel Time Estimation Using Freeway Point Detector Data Based on Evolving Fuzzy Neural Inference System.

    Tang, Jinjun; Zou, Yajie; Ash, John; Zhang, Shen; Liu, Fang; Wang, Yinhai

    2016-01-01

    Travel time is an important measurement used to evaluate the extent of congestion within road networks. This paper presents a new method to estimate the travel time based on an evolving fuzzy neural inference system. The input variables in the system are traffic flow data (volume, occupancy, and speed) collected from loop detectors located at points both upstream and downstream of a given link, and the output variable is the link travel time. A first order Takagi-Sugeno fuzzy rule set is used to complete the inference. For training the evolving fuzzy neural network (EFNN), two learning processes are proposed: (1) a K-means method is employed to partition input samples into different clusters, and a Gaussian fuzzy membership function is designed for each cluster to measure the membership degree of samples to the cluster centers. As the number of input samples increases, the cluster centers are modified and membership functions are also updated; (2) a weighted recursive least squares estimator is used to optimize the parameters of the linear functions in the Takagi-Sugeno type fuzzy rules. Testing datasets consisting of actual and simulated data are used to test the proposed method. Three common criteria including mean absolute error (MAE), root mean square error (RMSE), and mean absolute relative error (MARE) are utilized to evaluate the estimation performance. Estimation results demonstrate the accuracy and effectiveness of the EFNN method through comparison with existing methods including: multiple linear regression (MLR), instantaneous model (IM), linear model (LM), neural network (NN), and cumulative plots (CP).
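    The weighted recursive least-squares estimator of learning process (2) can be sketched in a few lines. This is a generic RLS update applied to a toy linear stream standing in for one rule's consequent; the data, the forgetting factor, and the initialization are illustrative assumptions, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(4)

def rls_update(theta, P, phi, target, lam=1.0):
    """One step of (exponentially weighted) recursive least squares:
    gain k = P phi / (lam + phi' P phi);  theta <- theta + k * error;
    P <- (P - k phi' P) / lam.  lam < 1 forgets old samples faster."""
    Pphi = P @ phi
    k = Pphi / (lam + phi @ Pphi)
    theta = theta + k * (target - phi @ theta)
    P = (P - np.outer(k, Pphi)) / lam
    return theta, P

# Stream of noisy observations of y = 2.0 * x - 1.0, a stand-in for one
# Takagi-Sugeno consequent y = a*x + b fitted from weighted samples.
theta = np.zeros(2)
P = np.eye(2) * 1e3            # large initial covariance: weak prior
for _ in range(200):
    xi = rng.uniform(0.0, 1.0)
    phi = np.array([xi, 1.0])  # regressor [x, 1]
    target = 2.0 * xi - 1.0 + rng.normal(0.0, 0.01)
    theta, P = rls_update(theta, P, phi, target)
```

    In the EFNN setting the regressor would be the rule-weighted inputs, so the same update refines all consequent coefficients as new loop-detector samples arrive.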

  6. Statistical inference for the lifetime performance index based on generalised order statistics from exponential distribution

    Vali Ahmadi, Mohammad; Doostparast, Mahdi; Ahmadi, Jafar

    2015-04-01

    In manufacturing industries, the lifetime of an item is usually characterised by a random variable X and considered to be satisfactory if X exceeds a given lower lifetime limit L. The probability of a satisfactory item is then ηL := P(X ≥ L), called the conforming rate. In industrial companies, however, the lifetime performance index, proposed by Montgomery and denoted by CL, is widely used as a process capability index instead of the conforming rate. Assuming a parametric model for the random variable X, we show that there is a connection between the conforming rate and the lifetime performance index. Consequently, the statistical inferences about ηL and CL are equivalent. Hence, we restrict ourselves to statistical inference for CL based on generalised order statistics, which contain several ordered data models such as usual order statistics, progressively Type-II censored data and records. Various point and interval estimators for the parameter CL are obtained and optimal critical regions for the hypothesis testing problems concerning CL are proposed. Finally, two real datasets on the lifetimes of insulating fluid and ball bearings, due to Nelson (1982) and Caroni (2002), respectively, and a simulated sample are analysed.
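    For the exponential distribution the stated connection is explicit: with mean θ, the conforming rate is ηL = P(X ≥ L) = exp(−L/θ), while Montgomery's index is CL = (μ − L)/σ = 1 − L/θ (since μ = σ = θ), so ηL = exp(CL − 1). A minimal numerical sketch, using ordinary i.i.d. sampling rather than generalised order statistics and purely illustrative values:

```python
import numpy as np

rng = np.random.default_rng(3)

theta_true, L = 10.0, 2.0               # illustrative mean lifetime and limit
sample = rng.exponential(theta_true, 500)

theta_hat = sample.mean()               # MLE of the exponential mean
C_L = 1.0 - L / theta_hat               # lifetime performance index (mu - L)/sigma
eta_L = np.exp(C_L - 1.0)               # conforming rate P(X >= L) = exp(C_L - 1)
```

    Because the map from CL to ηL is monotone, any estimator or test for CL translates directly into one for the conforming rate, which is the equivalence the abstract exploits.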

  7. A statistical method for lung tumor segmentation uncertainty in PET images based on user inference.

    Zheng, Chaojie; Wang, Xiuying; Feng, Dagan

    2015-01-01

    PET has been widely accepted as an effective imaging modality for lung tumor diagnosis and treatment. However, standard criteria for delineating tumor boundaries from PET have yet to be developed, largely due to the relatively low quality of PET images, uncertain tumor boundary definition, and the variety of tumor characteristics. In this paper, we propose a statistical solution to segmentation uncertainty on the basis of user inference. We first define the uncertainty segmentation band on the basis of a segmentation probability map constructed from the Random Walks (RW) algorithm; then, based on the extracted features of the user inference, we use Principal Component Analysis (PCA) to formulate the statistical model for labeling the uncertainty band. We validated our method on 10 lung PET-CT phantom studies from the public RIDER collections [1] and 16 clinical PET studies in which tumors were manually delineated by two experienced radiologists. The methods were validated using the Dice similarity coefficient (DSC) to measure spatial volume overlap. Our method achieved an average DSC of 0.878 ± 0.078 on the phantom studies and 0.835 ± 0.039 on the clinical studies.
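    The validation metric is simple to state: for binary masks A and B, DSC = 2|A∩B|/(|A|+|B|), equal to 1 for identical masks and 0 for disjoint ones. A small self-contained sketch with toy masks (not the RIDER data):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks: 2|A∩B|/(|A|+|B|)."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

a = np.zeros((10, 10), bool); a[2:7, 2:7] = True   # 25-voxel square
b = np.zeros((10, 10), bool); b[3:8, 3:8] = True   # 25 voxels, shifted; overlap 16
```

    Here dice(a, b) = 2·16/(25+25) = 0.64, the kind of volume-overlap score reported in the abstract.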

  8. Using adaptive network based fuzzy inference system to forecast regional electricity loads

    Ying, L.-C.; Pan, M.-C.

    2008-01-01

    Since accurate regional load forecasting is very important for improvement of the management performance of the electric industry, various regional load forecasting methods have been developed. The purpose of this study is to apply the adaptive network based fuzzy inference system (ANFIS) model to forecast the regional electricity loads in Taiwan and demonstrate the forecasting performance of this model. Based on the mean absolute percentage errors and statistical results, we can see that the ANFIS model has better forecasting performance than the regression model, artificial neural network (ANN) model, support vector machines with genetic algorithms (SVMG) model, recurrent support vector machines with genetic algorithms (RSVMG) model and hybrid ellipsoidal fuzzy systems for time series forecasting (HEFST) model. Thus, the ANFIS model is a promising alternative for forecasting regional electricity loads.
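    The comparison criterion, mean absolute percentage error, can be sketched in a few lines; the load values below are illustrative, not the Taiwanese data:

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    actual = np.asarray(actual, float)
    forecast = np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Illustrative regional loads (GWh) and one model's forecasts.
loads = np.array([100.0, 110.0, 120.0])
preds = np.array([ 98.0, 113.0, 117.0])
```

    Ranking the candidate models by this score on held-out data is how the study concludes that ANFIS outperforms the regression, ANN, SVMG, RSVMG, and HEFST alternatives.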

  9. Inference for multivariate regression model based on multiply imputed synthetic data generated via posterior predictive sampling

    Moura, Ricardo; Sinha, Bimal; Coelho, Carlos A.

    2017-06-01

    The recent popularity of synthetic data as a statistical disclosure control technique has led to several methods for generating and analyzing such data, but these almost always rely on asymptotic distributions and are consequently inadequate for small-sample datasets. A likelihood-based exact inference procedure is therefore derived for the matrix of regression coefficients of the multivariate regression model, for multiply imputed synthetic data generated via posterior predictive sampling. Because it is based on exact distributions, this procedure may be used even with small-sample datasets. Simulation studies compare the results obtained from the proposed exact inferential procedure with those obtained from an adaptation of Reiter's combination rule to multiply imputed synthetic datasets, and an application to the 2000 Current Population Survey is discussed.

  10. Using adaptive network based fuzzy inference system to forecast regional electricity loads

    Ying, Li-Chih [Department of Marketing Management, Central Taiwan University of Science and Technology, 11, Pu-tzu Lane, Peitun, Taichung City 406 (China); Pan, Mei-Chiu [Graduate Institute of Management Sciences, Nanhua University, 32, Chung Keng Li, Dalin, Chiayi 622 (China)

    2008-02-15

    Since accurate regional load forecasting is very important for improvement of the management performance of the electric industry, various regional load forecasting methods have been developed. The purpose of this study is to apply the adaptive network based fuzzy inference system (ANFIS) model to forecast the regional electricity loads in Taiwan and demonstrate the forecasting performance of this model. Based on the mean absolute percentage errors and statistical results, we can see that the ANFIS model has better forecasting performance than the regression model, artificial neural network (ANN) model, support vector machines with genetic algorithms (SVMG) model, recurrent support vector machines with genetic algorithms (RSVMG) model and hybrid ellipsoidal fuzzy systems for time series forecasting (HEFST) model. Thus, the ANFIS model is a promising alternative for forecasting regional electricity loads.

  11. Top-down attention based on object representation and incremental memory for knowledge building and inference.

    Kim, Bumhwi; Ban, Sang-Woo; Lee, Minho

    2013-10-01

    Humans can efficiently perceive arbitrary visual objects based on an incremental learning mechanism with selective attention. This paper proposes a new task specific top-down attention model to locate a target object based on its form and color representation along with a bottom-up saliency based on relativity of primitive visual features and some memory modules. In the proposed model top-down bias signals corresponding to the target form and color features are generated, which draw the preferential attention to the desired object by the proposed selective attention model in concomitance with the bottom-up saliency process. The object form and color representation and memory modules have an incremental learning mechanism together with a proper object feature representation scheme. The proposed model includes a Growing Fuzzy Topology Adaptive Resonance Theory (GFTART) network which plays two important roles in object color and form biased attention; one is to incrementally learn and memorize color and form features of various objects, and the other is to generate a top-down bias signal to localize a target object by focusing on the candidate local areas. Moreover, the GFTART network can be utilized for knowledge inference which enables the perception of new unknown objects on the basis of the object form and color features stored in the memory during training. Experimental results show that the proposed model is successful in focusing on the specified target objects, in addition to the incremental representation and memorization of various objects in natural scenes. In addition, the proposed model properly infers new unknown objects based on the form and color features of previously trained objects. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Automatic segmentation of coronary angiograms based on fuzzy inferring and probabilistic tracking

    Shoujun Zhou

    2010-08-01

    Background: Segmentation of the coronary angiogram is important in computer-assisted artery motion analysis or reconstruction of 3D vascular structures from a single-plane or biplane angiographic system. Developing fully automated and accurate vessel segmentation algorithms is highly challenging, especially when extracting vascular structures with large variations in image intensities and noise, as well as with variable cross-sections or vascular lesions. Methods: This paper presents a novel tracking method for automatic segmentation of the coronary artery tree in X-ray angiographic images, based on probabilistic vessel tracking and fuzzy structure pattern inferring. The method is composed of two main steps: preprocessing and tracking. In preprocessing, multiscale Gabor filtering and Hessian matrix analysis were used to enhance and extract vessel features from the original angiographic image, producing a vessel feature map as well as a vessel direction map. In tracking, a seed point was first automatically detected by analyzing the vessel feature map. Subsequently, two operators, a probabilistic tracking operator (PTO) and a vessel structure pattern detector (SPD), worked together from the detected seed point to extract vessel segments or branches one at a time. The local structure pattern was inferred by a multi-feature fuzzy inferring function employed in the SPD. The identified structure pattern, such as a crossing or bifurcation, was used to control the tracking process, for example, to keep tracking the current segment or start tracking a new one, depending on the detected pattern. Results: By appropriate integration of these advanced preprocessing and tracking steps, our tracking algorithm is able to extract both vessel axis lines and edge points, as well as measure the arterial diameters in various complicated cases. For example, it can walk across gaps along the longitudinal vessel direction, manage varying vessel

  13. An adaptive network-based fuzzy inference system for short-term natural gas demand estimation: Uncertain and complex environments

    Azadeh, A.; Asadzadeh, S.M.; Ghanbari, A.

    2010-01-01

    Accurate short-term natural gas (NG) demand estimation and forecasting is vital for the policy- and decision-making process in the energy sector, and conventional methods may not provide accurate results. This paper presents an adaptive network-based fuzzy inference system (ANFIS) for estimating NG demand. Standard input variables are used: the day of the week, the demand of the same day in the previous year, the demand of the day before, and the demand of two days before. The proposed ANFIS approach is equipped with pre-processing and post-processing steps: input data are pre-processed (scaled), and output data are post-processed (returned to the original scale). The superiority and applicability of the ANFIS approach are shown for Iranian NG consumption from 22/12/2007 to 30/6/2008. Results show that ANFIS provides more accurate results than an artificial neural network (ANN) and a conventional time series approach. The results of this study provide policy makers with an appropriate tool for more accurate predictions of future short-term NG demand, because the proposed approach can handle the non-linearity, complexity, and uncertainty that may exist in actual data sets due to erratic responses and measurement errors.
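    The pre-/post-processing described above amounts to mapping inputs into a fixed range and inverting the map on the model's outputs; a minimal sketch with illustrative numbers (not the Iranian consumption data):

```python
import numpy as np

def scale(x, lo, hi):
    """Pre-process: map raw demand values onto [0, 1] using the training range."""
    return (x - lo) / (hi - lo)

def unscale(y, lo, hi):
    """Post-process: return scaled model output to its original scale."""
    return y * (hi - lo) + lo

# Illustrative daily NG demand values and an assumed training range.
demand = np.array([400.0, 520.0, 610.0])
lo, hi = 300.0, 700.0
scaled = scale(demand, lo, hi)        # what would be fed to the ANFIS
restored = unscale(scaled, lo, hi)    # model output back in original units
```

    Keeping the scaling bounds fixed to the training range ensures the forecasts can be restored to physical units without distortion.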

  14. Model-based and design-based inference goals frame how to account for neighborhood clustering in studies of health in overlapping context types.

    Lovasi, Gina S; Fink, David S; Mooney, Stephen J; Link, Bruce G

    2017-12-01

    Accounting for non-independence in health research often warrants attention. Particularly, the availability of geographic information systems data has increased the ease with which studies can add measures of the local "neighborhood" even if participant recruitment was through other contexts, such as schools or clinics. We highlight a tension between two perspectives that is often present, but particularly salient when more than one type of potentially health-relevant context is indexed (e.g., both neighborhood and school). On the one hand, a model-based perspective emphasizes the processes producing outcome variation, and observed data are used to make inference about that process. On the other hand, a design-based perspective emphasizes inference to a well-defined finite population, and is commonly invoked by those using complex survey samples or those with responsibility for the health of local residents. These two perspectives have divergent implications when deciding whether clustering must be accounted for analytically and how to select among candidate cluster definitions, though the perspectives are by no means monolithic. There are tensions within each perspective as well as between perspectives. We aim to provide insight into these perspectives and their implications for population health researchers. We focus on the crucial step of deciding which cluster definition or definitions to use at the analysis stage, as this has consequences for all subsequent analytic and interpretational challenges with potentially clustered data.

  15. Model-based and design-based inference goals frame how to account for neighborhood clustering in studies of health in overlapping context types

    Gina S. Lovasi

    2017-12-01

    Accounting for non-independence in health research often warrants attention. Particularly, the availability of geographic information systems data has increased the ease with which studies can add measures of the local “neighborhood” even if participant recruitment was through other contexts, such as schools or clinics. We highlight a tension between two perspectives that is often present, but particularly salient when more than one type of potentially health-relevant context is indexed (e.g., both neighborhood and school). On the one hand, a model-based perspective emphasizes the processes producing outcome variation, and observed data are used to make inference about that process. On the other hand, a design-based perspective emphasizes inference to a well-defined finite population, and is commonly invoked by those using complex survey samples or those with responsibility for the health of local residents. These two perspectives have divergent implications when deciding whether clustering must be accounted for analytically and how to select among candidate cluster definitions, though the perspectives are by no means monolithic. There are tensions within each perspective as well as between perspectives. We aim to provide insight into these perspectives and their implications for population health researchers. We focus on the crucial step of deciding which cluster definition or definitions to use at the analysis stage, as this has consequences for all subsequent analytic and interpretational challenges with potentially clustered data.

  16. Principles of risk-based decision making

    United States. Coast Guard. Risk based decision-making guidelines

    2001-01-01

    ... original content in order to make this product more generically applicable and less Coast Guard specific. Risk assessment and risk management are important topics in industry and government. Because of limited resources and increasing demands for services, most organizations simply cannot continue business as usual. Even if resources are not dec...

  17. Static security-based available transfer capability using adaptive neuro fuzzy inference system

    Venkaiah, C.; Vinod Kumar, D.M.

    2010-07-01

    In a deregulated power system, power transactions between a seller and a buyer can only be scheduled when there is sufficient available transfer capability (ATC). Internet-based, open access same-time information systems (OASIS) provide market participants with ATC information that is continuously updated in real time. Static security-based ATC can be computed for the base case system as well as for the critical line outages of the system. Since critical line outages are based on static security analysis, the computation of static security based ATC using conventional methods is both tedious and time consuming. In this study, static security-based ATC was computed for real-time applications using 3 artificial intelligent methods notably the back propagation algorithm (BPA), the radial basis function (RBF) neural network, and the adaptive neuro fuzzy inference system (ANFIS). An IEEE 24-bus reliability test system (RTS) and 75-bus practical system were used to test these 3 different intelligent methods. The results were compared with the conventional full alternating current (AC) load flow method for different transactions.

  19. Whatever the cost? Information integration in memory-based inferences depends on cognitive effort.

    Hilbig, Benjamin E; Michalkiewicz, Martha; Castela, Marta; Pohl, Rüdiger F; Erdfelder, Edgar

    2015-05-01

    One of the most prominent models of probabilistic inferences from memory is the simple recognition heuristic (RH). The RH theory assumes that judgments are based on recognition in isolation, such that other information is ignored. However, some prior research has shown that available knowledge is not generally ignored. In line with the notion of adaptive strategy selection--and, thus, a trade-off between accuracy and effort--we hypothesized that information integration crucially depends on how easily accessible information beyond recognition is, how much confidence decision makers have in this information, and how (cognitively) costly it is to acquire it. In three experiments, we thus manipulated (a) the availability of information beyond recognition, (b) the subjective usefulness of this information, and (c) the cognitive costs associated with acquiring this information. In line with the predictions, we found that RH use decreased substantially, the more easily and confidently information beyond recognition could be integrated, and increased substantially with increasing cognitive costs.
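
    As a point of reference for the manipulations described above, the pure RH decision rule can be sketched in a few lines (the objects and recognition set are illustrative, not the study's materials):

```python
import random

def rh_choice(a, b, recognized):
    """Recognition heuristic: if exactly one of two objects is
    recognized, infer that it scores higher on the criterion and
    ignore all other knowledge; otherwise guess."""
    ra, rb = a in recognized, b in recognized
    if ra and not rb:
        return a
    if rb and not ra:
        return b
    return random.choice([a, b])  # recognition does not discriminate

recognized = {"Berlin", "Hamburg"}
print(rh_choice("Berlin", "Obscuretown", recognized))
```

    The experiments summarized above manipulate exactly what this sketch ignores: the availability, usefulness, and retrieval cost of knowledge beyond the recognition test.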

  20. Design of a biped locomotion controller based on adaptive neuro-fuzzy inference systems

    Shieh, M-Y; Chang, K-H [Department of E. E., Southern Taiwan University, 1 Nantai St., YungKang City, Tainan County 71005, Taiwan (China); Lia, Y-S [Executive Director Office, ITRI, Southern Taiwan Innovation Park, Tainan County, Taiwan (China)], E-mail: myshieh@mail.stut.edu.tw

    2008-02-15

    This paper proposes a method for the design of a biped locomotion controller based on the ANFIS (Adaptive Neuro-Fuzzy Inference System) inverse learning model. In the model developed here, an integrated ANFIS structure is trained to function as the system identifier for the modeling of the inverse dynamics of a biped robot. The parameters resulting from the modeling process are duplicated and integrated as those of the biped locomotion controller to provide favorable control action. As the simulation results show, the proposed controller is able to generate a stable walking cycle for a biped robot. Moreover, the experimental results demonstrate that the performance of the proposed controller is satisfactory under conditions when the robot stands in different postures or moves on a rugged surface.

  2. Velocity-based movement modeling for individual and population level inference.

    Ephraim M Hanks

    Full Text Available Understanding animal movement and resource selection provides important information about the ecology of the animal, but an animal's movement and behavior are not typically constant in time. We present a velocity-based approach for modeling animal movement in space and time that allows for temporal heterogeneity in an animal's response to the environment, allows for temporal irregularity in telemetry data, and accounts for the uncertainty in the location information. Population-level inference on movement patterns and resource selection can then be made through cluster analysis of the parameters related to movement and behavior. We illustrate this approach through a study of northern fur seal (Callorhinus ursinus) movement in the Bering Sea, Alaska, USA. Results show sex differentiation, with female northern fur seals exhibiting stronger response to environmental variables.

  3. BPhyOG: An interactive server for genome-wide inference of bacterial phylogenies based on overlapping genes

    Lin Kui

    2007-07-01

    Full Text Available Abstract Background Overlapping genes (OGs) in bacterial genomes are pairs of adjacent genes of which the coding sequences overlap partly or entirely. With the rapid accumulation of sequence data, many OGs in bacterial genomes have now been identified. Indeed, these might prove a consistent feature across all microbial genomes. Our previous work suggests that OGs can be considered as robust markers at the whole genome level for the construction of phylogenies. An online, interactive web server for inferring phylogenies is needed for biologists to analyze phylogenetic relationships among a set of bacterial genomes of interest. Description BPhyOG is an online interactive server for reconstructing the phylogenies of completely sequenced bacterial genomes on the basis of their shared overlapping genes. It provides two tree-reconstruction methods: Neighbor Joining (NJ) and Unweighted Pair-Group Method using Arithmetic averages (UPGMA). Users can apply the desired method to generate phylogenetic trees, which are based on an evolutionary distance matrix for the selected genomes. The distance between two genomes is defined by the normalized number of their shared OG pairs. BPhyOG also allows users to browse the OGs that were used to infer the phylogenetic relationships. It provides detailed annotation for each OG pair and the features of the component genes through hyperlinks. Users can also retrieve each of the homologous OG pairs that have been determined among 177 genomes. It is a useful tool for analyzing the tree of life and overlapping genes from a genomic standpoint. Conclusion BPhyOG is a useful interactive web server for genome-wide inference of any potential evolutionary relationship among the genomes selected by users. It currently includes 177 completely sequenced bacterial genomes containing 79,855 OG pairs, the annotation and homologous OG pairs of which are integrated comprehensively. The reliability of phylogenies complemented by
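
    The genome distance the abstract describes (based on the normalized number of shared OG pairs) is straightforward to compute; a minimal sketch, where normalizing by the smaller OG set is my assumption rather than the server's documented choice:

```python
def og_distance(ogs_a, ogs_b):
    """Distance between two genomes from their sets of overlapping-gene
    (OG) pairs: the more shared pairs, the smaller the distance.
    Normalizing by the smaller set size is an assumed choice here."""
    denom = min(len(ogs_a), len(ogs_b))
    if denom == 0:
        return 1.0
    return 1.0 - len(ogs_a & ogs_b) / denom

# Hypothetical OG pairs per genome, identified by gene-name tuples
genome_a = {("dnaA", "dnaN"), ("recF", "gyrB"), ("rpoB", "rpoC")}
genome_b = {("dnaA", "dnaN"), ("recF", "gyrB"), ("secA", "secY")}
print(og_distance(genome_a, genome_b))
```

    A matrix of such pairwise distances can then be handed to standard NJ or UPGMA implementations (UPGMA corresponds to average-linkage hierarchical clustering).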

  4. Travel Time Estimation Using Freeway Point Detector Data Based on Evolving Fuzzy Neural Inference System.

    Jinjun Tang

    Full Text Available Travel time is an important measurement used to evaluate the extent of congestion within road networks. This paper presents a new method to estimate the travel time based on an evolving fuzzy neural inference system. The input variables in the system are traffic flow data (volume, occupancy, and speed) collected from loop detectors located at points both upstream and downstream of a given link, and the output variable is the link travel time. A first order Takagi-Sugeno fuzzy rule set is used to complete the inference. For training the evolving fuzzy neural network (EFNN), two learning processes are proposed: (1) a K-means method is employed to partition input samples into different clusters, and a Gaussian fuzzy membership function is designed for each cluster to measure the membership degree of samples to the cluster centers. As the number of input samples increases, the cluster centers are modified and membership functions are also updated; (2) a weighted recursive least squares estimator is used to optimize the parameters of the linear functions in the Takagi-Sugeno type fuzzy rules. Testing datasets consisting of actual and simulated data are used to test the proposed method. Three common criteria including mean absolute error (MAE), root mean square error (RMSE), and mean absolute relative error (MARE) are utilized to evaluate the estimation performance. Estimation results demonstrate the accuracy and effectiveness of the EFNN method through comparison with existing methods including: multiple linear regression (MLR), instantaneous model (IM), linear model (LM), neural network (NN), and cumulative plots (CP).
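
    The inference step, a first-order Takagi-Sugeno rule set with Gaussian memberships, reduces to a firing-strength-weighted average of per-rule linear outputs; a sketch with two invented traffic regimes (the coefficients are illustrative, not fitted values from the paper):

```python
import math

def gauss(x, center, sigma):
    """Gaussian membership degree of x in a cluster."""
    return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

def ts_first_order(x, rules):
    """First-order Takagi-Sugeno inference: each rule is
    (center, sigma, (a, b)) meaning IF x is near center THEN y = a*x + b;
    the output is the membership-weighted average of rule outputs."""
    num = den = 0.0
    for center, sigma, (a, b) in rules:
        w = gauss(x, center, sigma)
        num += w * (a * x + b)
        den += w
    return num / den

# Hypothetical rules mapping detector speed (km/h) to link travel time (s)
rules = [(20.0, 10.0, (-2.0, 180.0)),   # congested regime
         (80.0, 15.0, (-0.5, 80.0))]    # free-flow regime
print(ts_first_order(50.0, rules))
```

    In the EFNN described above, the centers and sigmas would come from the K-means step and the linear coefficients from the weighted recursive least squares step.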

  5. Enabling Participatory Decision Making Through Web-Based GIS

    Sheng, Grant; Yam, Kevin; Hassas, Aranak [York Univ., Toronto, ON (Canada). Faculty of Environmental Studies

    2001-07-01

    , morals, ethical behaviour, our relationship with, and obligations to fellow human beings, animals, and the environment. People perceive that information travels essentially one way in the processes and the voices of the community and its members are not heard. Subsequently, they feel excluded from the actual decision making process and even from being able to participate meaningfully in the process. Recent advances in informatics and geomatics technology, such as the Internet, web-based software and geographic information systems (GIS), have made it possible to address these issues more effectively. We believe that the combined features of two software packages developed at the York Centre for Applied Sustainability can facilitate access to information, provide a virtual forum for discussion and debate, make it possible for individuals to participate in the decision making process, and infer people's values from their choice criteria selection.

  6. Permutation-based inference for the AUC: A unified approach for continuous and discontinuous data.

    Pauly, Markus; Asendorf, Thomas; Konietschke, Frank

    2016-11-01

    We investigate rank-based studentized permutation methods for the nonparametric Behrens-Fisher problem, that is, inference methods for the area under the ROC curve. We hereby prove that the studentized permutation distribution of the Brunner-Munzel rank statistic is asymptotically standard normal, even under the alternative. Thus, incidentally providing the hitherto missing theoretical foundation for the Neubert and Brunner studentized permutation test. In particular, we do not only show its consistency, but also that confidence intervals for the underlying treatment effects can be computed by inverting this permutation test. In addition, we derive permutation-based range-preserving confidence intervals. Extensive simulation studies show that the permutation-based confidence intervals appear to maintain the preassigned coverage probability quite accurately (even for rather small sample sizes). For a convenient application of the proposed methods, a freely available software package for the statistical software R has been developed. A real data example illustrates the application. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
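
    A bare-bones illustration of the permutation idea (note this sketch omits the studentization of the Brunner-Munzel statistic, which is exactly what the paper shows is needed for validity under the alternative):

```python
import random

def auc_estimate(x, y):
    """Nonparametric effect P(X < Y) + 0.5 * P(X = Y), i.e., the AUC."""
    wins = sum((xi < yi) + 0.5 * (xi == yi) for xi in x for yi in y)
    return wins / (len(x) * len(y))

def permutation_pvalue(x, y, n_perm=2000, seed=1):
    """Two-sided label-permutation p-value for H0: AUC = 0.5."""
    rng = random.Random(seed)
    pooled = list(x) + list(y)
    observed = abs(auc_estimate(x, y) - 0.5)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        px, py = pooled[:len(x)], pooled[len(x):]
        if abs(auc_estimate(px, py) - 0.5) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)   # add-one correction

print(permutation_pvalue([1, 2, 3, 4, 5], [10, 11, 12, 13, 14], n_perm=500))
```

    The studentized version replaces the raw AUC estimate with the Brunner-Munzel statistic divided by its estimated standard error before comparing observed and permuted values.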

  7. Molecular phylogeny of Toxoplasmatinae: comparison between inferences based on mitochondrial and apicoplast genetic sequences

    Michelle Klein Sercundes

    2016-03-01

    Full Text Available Abstract Phylogenies within Toxoplasmatinae have been widely investigated with different molecular markers. Here, we studied molecular phylogenies of the Toxoplasmatinae subfamily based on apicoplast and mitochondrial genes. Partial sequences of apicoplast genes coding for caseinolytic protease (clpC) and the beta subunit of RNA polymerase (rpoB), and of the mitochondrial gene coding for cytochrome B (cytB), were analyzed. Laboratory-adapted strains of the closely related parasites Sarcocystis falcatula and Sarcocystis neurona were investigated, along with Neospora caninum, Neospora hughesi, Toxoplasma gondii (strains RH, CTG and PTG), Besnoitia akodoni, Hammondia hammondi and two genetically divergent lineages of Hammondia heydorni. The molecular analysis based on organellar genes did not clearly differentiate between N. caninum and N. hughesi, but the two lineages of H. heydorni were confirmed. Slight differences between the strains of S. falcatula and S. neurona were encountered in all markers. In conclusion, congruent phylogenies were inferred from the three different genes and they might be used for screening undescribed sarcocystid parasites in order to ascertain their phylogenetic relationships with organisms of the family Sarcocystidae. The evolutionary studies based on organellar genes confirm that the genus Hammondia is paraphyletic. The primers used for amplification of clpC and rpoB were able to amplify genetic sequences of organisms of the genus Sarcocystis and organisms of the subfamily Toxoplasmatinae as well.

  8. Trust-Based Leadership in the Making

    Bentzen, Tina Øllgaard; Jagd, Søren

    In spite of the popularity of trust-based leadership in consultancy and popular management writings, empirical research on the transformation from traditional top-down to trust-based leadership is still limited. In this paper we study the implementation of a trust-based leadership reform in the City of Copenhagen taking place since 2012. We focus on understanding the trust dynamics in this major transformation. We show that the implementation of trust-based leadership should be seen as an emergent process involving a variety of actors within the organization. The case study reveals that the Trust Reform ... indicates that the implementation of a radical management reform involves a complex interplay of trust relations between actors at multiple levels of the organization.

  9. Likelihood Inference of Nonlinear Models Based on a Class of Flexible Skewed Distributions

    Xuedong Chen

    2014-01-01

    Full Text Available This paper deals with the issue of likelihood inference for nonlinear models with a flexible skew-t-normal (FSTN) distribution, which is proposed within a general framework of flexible skew-symmetric (FSS) distributions by combining with the skew-t-normal (STN) distribution. In comparison with common skewed distributions such as the skew normal (SN) and skew-t (ST), as well as scale mixtures of skew normal (SMSN), the FSTN distribution can accommodate more flexibility and robustness in the presence of skewed, heavy-tailed, especially multimodal outcomes. However, for this distribution, the usual approach of maximum likelihood estimation based on the EM algorithm becomes unavailable, and an alternative way is to return to the original Newton-Raphson type method. In order to improve the estimation as well as the procedures for confidence estimation and hypothesis testing for the parameters of interest, a modified Newton-Raphson iterative algorithm is presented in this paper, based on the profile likelihood for nonlinear regression models with the FSTN distribution, and, then, confidence intervals and hypothesis tests are also developed. Furthermore, a real example and a simulation are conducted to demonstrate the usefulness and superiority of our approach.

  10. Research on NDT Technology in Inference of Steel Member Strength Based on Macro/Micro Model

    Beidou Ding

    2017-01-01

    Full Text Available In consideration of the correlations among hardness, chemical composition, grain size, and strength of carbon steel, a new nondestructive testing (NDT) technology for inferring carbon steel strength was explored. First, the hardness test, chemical composition analysis, and metallographic analysis of 162 low-carbon steel samples were conducted. Second, the following works were carried out: (1) the quantitative relationship between steel Leeb hardness and carbon steel strength was studied on the basis of regression analysis of the experimental data; (2) the influences of chemical composition and grain size on the tension properties of carbon steel were analyzed on the basis of stepwise regression analysis, and the quantitative relationship of conventional compositions and grain size with steel strength was obtained; (3) according to macro and/or micro factors such as hardness, chemical composition, and grain size of carbon steel, a fitting formula for steel strength was established based on the multiple linear regression (MLR) method. The above relationships and the MLR-based fitting formula can be used to estimate steel strength with no damage to the structure in engineering practice.
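
    Step (1), the hardness-strength regression, is ordinary least squares; a one-predictor sketch with made-up data points (the paper's final formula combines hardness, composition, and grain size via MLR):

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = slope * x + intercept (closed form)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Illustrative (not measured) Leeb hardness vs. tensile strength (MPa) pairs
hardness = [350.0, 380.0, 410.0, 440.0]
strength = [420.0, 455.0, 500.0, 530.0]
slope, intercept = fit_linear(hardness, strength)
print(slope, intercept)
```

    The stepwise MLR in steps (2)-(3) generalizes this to several predictors, adding or dropping composition and grain-size terms by their statistical significance.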

  11. An IPC-based Prolog design pattern for integrating backward chaining inference into applications or embedded systems

    Li Guoqi

    2014-12-01

    Full Text Available Prolog is one of the most important candidates for building expert systems and AI-related programs and has potential applications in embedded systems. However, Prolog is not suitable for developing many kinds of components, such as data acquisition and task scheduling, which are also crucial. To make the best use of its advantages and bypass its disadvantages, it is attractive to integrate Prolog with programs developed in other languages. In this paper, an IPC-based method is used to integrate backward chaining inference implemented in Prolog into applications or embedded systems. A Prolog design pattern is derived from the method for reuse, whose principle and definition are provided in detail. Additionally, the design pattern is applied to a target system, which is free software, to verify its feasibility. The detailed implementation of the application is given to clarify the design pattern. The design pattern can be further applied to a wide range of applications and embedded systems, and the method described in this paper can also be adopted for other logic programming languages.

  12. Making Work-Based Learning Work

    Cahill, Charlotte

    2016-01-01

    Americans seeking employment often face a conundrum: relevant work experience is a prerequisite for many jobs, but it is difficult to gain the required experience without being in the workplace. Work-based learning--activities that occur in workplaces through which youth and adults gain the knowledge, skills, and experience needed for entry or…

  13. Inference and the Introductory Statistics Course

    Pfannkuch, Maxine; Regan, Matt; Wild, Chris; Budgett, Stephanie; Forbes, Sharleen; Harraway, John; Parsonage, Ross

    2011-01-01

    This article sets out some of the rationale and arguments for making major changes to the teaching and learning of statistical inference in introductory courses at our universities by changing from a norm-based, mathematical approach to more conceptually accessible computer-based approaches. The core problem of the inferential argument with its…

  14. Shape Distributions of Nonlinear Dynamical Systems for Video-Based Inference.

    Venkataraman, Vinay; Turaga, Pavan

    2016-12-01

    This paper presents a shape-theoretic framework for dynamical analysis of nonlinear dynamical systems which appear frequently in several video-based inference tasks. Traditional approaches to dynamical modeling have included linear and nonlinear methods with their respective drawbacks. A novel approach we propose is the use of descriptors of the shape of the dynamical attractor as a feature representation of the nature of the dynamics. The proposed framework has two main advantages over traditional approaches: a) the representation of the dynamical system is derived directly from the observational data, without any inherent assumptions, and b) the proposed features show stability under different time-series lengths where traditional dynamical invariants fail. We illustrate our idea using nonlinear dynamical models such as the Lorenz and Rossler systems, where our feature representations (shape distributions) support our hypothesis that the local shape of the reconstructed phase space can be used as a discriminative feature. Our experimental analyses on these models also indicate that the proposed framework shows stability for different time-series lengths, which is useful when the available number of samples is small or variable. The specific applications of interest in this paper are: 1) activity recognition using motion capture and RGBD sensors, 2) activity quality assessment for applications in stroke rehabilitation, and 3) dynamical scene classification. We provide experimental validation through action and gesture recognition experiments on motion capture and Kinect datasets. In all these scenarios, we show experimental evidence of the favorable properties of the proposed representation.
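
    The phase space referred to above is typically reconstructed from a scalar time series by time-delay embedding before any shape descriptor is computed; a minimal sketch (the embedding dimension and lag are illustrative choices):

```python
def delay_embed(series, dim, tau):
    """Takens-style time-delay embedding: map a scalar series to points
    (s[i], s[i + tau], ..., s[i + (dim - 1) * tau]) in dim dimensions."""
    n = len(series) - (dim - 1) * tau
    return [tuple(series[i + j * tau] for j in range(dim)) for i in range(n)]

points = delay_embed([0, 1, 2, 3, 4, 5], dim=3, tau=1)
print(points)
```

    Shape descriptors of the attractor, such as the distributions discussed in the abstract, are then computed over these reconstructed points.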

  15. Characteristics of SiC neutron sensor spectrum unfolding process based on Bayesian inference

    Cetnar, Jerzy; Krolikowski, Igor [Faculty of Energy and Fuels AGH - University of Science and Technology, Al. Mickiewicza 30, 30-059 Krakow (Poland); Ottaviani, L. [IM2NP, UMR CNRS 7334, Aix-Marseille University, Case 231 -13397 Marseille Cedex 20 (France); Lyoussi, A. [CEA, DEN, DER, Instrumentation Sensors and Dosimetry Laboratory, Cadarache, F-13108 St-Paul-Lez-Durance (France)

    2015-07-01

    This paper deals with SiC detector signal interpretation in neutron radiation measurements in mixed neutron-gamma radiation fields, known as the detector inverse problem or spectrum unfolding, which aims at finding a representation of the primary radiation based on the measured detector signals. In our novel methodology we resort to a Bayesian inference approach. In the developed procedure the resultant spectrum is unfolded from the detector channel readings, and the estimated neutron fluence in a group structure is obtained with its statistical characteristics, comprising the standard deviation and correlation matrix. In the paper we present results of the unfolding process for the case of a D-T neutron source in a neutron-moderating environment. Discussions of the statistical properties of the obtained results are presented, as well as of the physical meaning of the obtained correlation matrix of the estimated group fluence. The presented work has been carried out within the I-SMART project, which is part of the KIC InnoEnergy R and D program. (authors)

  16. Ontology-Based High-Level Context Inference for Human Behavior Identification

    Claudia Villalonga

    2016-09-01

    Full Text Available Recent years have witnessed a huge progress in the automatic identification of individual primitives of human behavior, such as activities or locations. However, the complex nature of human behavior demands more abstract contextual information for its analysis. This work presents an ontology-based method that combines low-level primitives of behavior, namely activity, locations and emotions, unprecedented to date, to intelligently derive more meaningful high-level context information. The paper contributes with a new open ontology describing both low-level and high-level context information, as well as their relationships. Furthermore, a framework building on the developed ontology and reasoning models is presented and evaluated. The proposed method proves to be robust while identifying high-level contexts even in the event of erroneously-detected low-level contexts. Despite reasonable inference times being obtained for a relevant set of users and instances, additional work is required to scale to long-term scenarios with a large number of users.

  18. Population-based statistical inference for temporal sequence of somatic mutations in cancer genomes.

    Rhee, Je-Keun; Kim, Tae-Min

    2018-04-20

    It is well recognized that accumulation of somatic mutations in cancer genomes plays a role in carcinogenesis; however, the temporal sequence and evolutionary relationship of somatic mutations remain largely unknown. In this study, we built a population-based statistical framework to infer the temporal sequence of acquisition of somatic mutations. Using the model, we analyzed the mutation profiles of 1954 tumor specimens across eight tumor types. As a result, we identified tumor type-specific directed networks composed of 2-15 cancer-related genes (nodes) and their mutational orders (edges). The most common ancestors identified in pairwise comparison of somatic mutations were TP53 mutations in breast, head/neck, and lung cancers. The known relationship of KRAS to TP53 mutations in colorectal cancers was identified, as well as potential ancestors of TP53 mutation such as NOTCH1, EGFR, and PTEN mutations in head/neck, lung and endometrial cancers, respectively. We also identified apoptosis-related genes enriched with ancestor mutations in lung cancers and a relationship between APC hotspot mutations and TP53 mutations in colorectal cancers. While evolutionary analysis of cancers has focused on clonal versus subclonal mutations identified in individual genomes, our analysis aims to further discriminate ancestor versus descendant mutations in population-scale mutation profiles that may help select cancer drivers with clinical relevance.

  19. Strategies for memory-based decision making: Modeling behavioral and neural signatures within a cognitive architecture.

    Fechner, Hanna B; Pachur, Thorsten; Schooler, Lael J; Mehlhorn, Katja; Battal, Ceren; Volz, Kirsten G; Borst, Jelmer P

    2016-12-01

    How do people use memories to make inferences about real-world objects? We tested three strategies based on predicted patterns of response times and blood-oxygen-level-dependent (BOLD) responses: one strategy that relies solely on recognition memory, a second that retrieves additional knowledge, and a third, lexicographic (i.e., sequential) strategy, that considers knowledge conditionally on the evidence obtained from recognition memory. We implemented the strategies as computational models within the Adaptive Control of Thought-Rational (ACT-R) cognitive architecture, which allowed us to derive behavioral and neural predictions that we then compared to the results of a functional magnetic resonance imaging (fMRI) study in which participants inferred which of two cities is larger. Overall, versions of the lexicographic strategy, according to which knowledge about many but not all alternatives is searched, provided the best account of the joint patterns of response times and BOLD responses. These results provide insights into the interplay between recognition and additional knowledge in memory, hinting at an adaptive use of these two sources of information in decision making. The results highlight the usefulness of implementing models of decision making within a cognitive architecture to derive predictions on the behavioral and neural level. Copyright © 2016 Elsevier B.V. All rights reserved.
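
    The lexicographic strategy that provided the best account can be sketched as a sequential stopping rule (the city names and the binary knowledge encoding are illustrative, not the study's stimuli):

```python
def lexicographic_infer(a, b, recognized, knowledge):
    """Consult recognition first; only if it does not discriminate,
    retrieve further knowledge about the alternatives. Returns the
    inferred larger city, or None when a guess would be required."""
    ra, rb = a in recognized, b in recognized
    if ra != rb:                    # cue 1: recognition discriminates
        return a if ra else b
    ka, kb = knowledge.get(a), knowledge.get(b)
    if ka is not None and kb is not None and ka != kb:
        return a if ka > kb else b  # cue 2: retrieved knowledge
    return None                     # no cue discriminates: guess

recognized = {"Paris", "Lyon"}
knowledge = {"Paris": 1, "Lyon": 0}   # e.g., "has over a million inhabitants"
print(lexicographic_infer("Paris", "Lyon", recognized, knowledge))
```

    The ACT-R implementations in the paper additionally predict response times and BOLD responses from the number of memory retrievals each branch requires, which this sketch does not model.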

  20. Ancestry inference using principal component analysis and spatial analysis: a distance-based analysis to account for population substructure.

    Byun, Jinyoung; Han, Younghun; Gorlov, Ivan P; Busam, Jonathan A; Seldin, Michael F; Amos, Christopher I

    2017-10-16

    Accurate inference of genetic ancestry is of fundamental interest to many biomedical, forensic, and anthropological research areas. Genetic ancestry memberships may relate to genetic disease risks. In a genome association study, failing to account for differences in genetic ancestry between cases and controls may also lead to false-positive results. Although a number of strategies for inferring and taking into account the confounding effects of genetic ancestry are available, applying them to large studies (tens of thousands of samples) is challenging. The goal of this study is to develop an approach for inferring the genetic ancestry of samples with unknown ancestry among closely related populations and to provide accurate estimates of ancestry for application to large-scale studies. In this study we developed a novel distance-based approach, Ancestry Inference using Principal component analysis and Spatial analysis (AIPS), that incorporates an Inverse Distance Weighted (IDW) interpolation method from spatial analysis to assign individuals to population memberships. We demonstrate the benefits of AIPS in analyzing population substructure, specifically in relation to the four most commonly used tools EIGENSTRAT, STRUCTURE, fastSTRUCTURE, and ADMIXTURE, using genotype data from various intra-European panels and European-Americans. While the aforementioned commonly used tools performed poorly in inferring ancestry from a large number of subpopulations, AIPS accurately distinguished variations between and within subpopulations. Our results show that AIPS can be applied to large-scale data sets to discriminate the modest variability among intra-continental populations as well as to characterize inter-continental variation. The method we developed will protect against spurious associations when mapping the genetic basis of a disease. Our approach is a more accurate and computationally efficient method for inferring genetic ancestry in large-scale genetic studies.
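
    The IDW assignment idea at the core of AIPS can be sketched as follows (the PC coordinates, population labels, and power parameter are illustrative; the published method's exact weighting may differ):

```python
def idw_assign(query, references, power=2.0):
    """Assign a sample to a population by inverse-distance weighting of
    labeled reference individuals in principal-component space: each
    reference votes for its label with weight 1 / distance**power."""
    weights = {}
    for coords, label in references:
        d2 = sum((q - c) ** 2 for q, c in zip(query, coords))
        if d2 == 0:
            return label                      # exact coordinate match
        weights[label] = weights.get(label, 0.0) + d2 ** (-power / 2)
    return max(weights, key=weights.get)

# Hypothetical reference individuals as (PC1, PC2) coordinates with labels
refs = [((0.0, 0.0), "popA"), ((0.1, 0.0), "popA"), ((1.0, 1.0), "popB")]
print(idw_assign((0.05, 0.02), refs))
```

    In practice the reference coordinates would come from projecting genotypes onto principal components computed from panels of known ancestry.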

  1. Gaussian process-based Bayesian nonparametric inference of population size trajectories from gene genealogies.

    Palacios, Julia A; Minin, Vladimir N

    2013-03-01

    Changes in population size influence genetic diversity of the population and, as a result, leave a signature of these changes in individual genomes in the population. We are interested in the inverse problem of reconstructing past population dynamics from genomic data. We start with a standard framework based on the coalescent, a stochastic process that generates genealogies connecting randomly sampled individuals from the population of interest. These genealogies serve as the glue between the population demographic history and genomic sequences. It turns out that only the times of genealogical lineage coalescences contain information about population size dynamics. Viewing these coalescent times as a point process, estimating population size trajectories is equivalent to estimating a conditional intensity of this point process. Therefore, our inverse problem is similar to estimating an inhomogeneous Poisson process intensity function. We demonstrate how recent advances in Gaussian process-based nonparametric inference for Poisson processes can be extended to Bayesian nonparametric estimation of population size dynamics under the coalescent. We compare our Gaussian process (GP) approach to one of the state-of-the-art Gaussian Markov random field (GMRF) methods for estimating population trajectories. Using simulated data, we demonstrate that our method has better accuracy and precision. Next, we analyze two genealogies reconstructed from real sequences of hepatitis C and human Influenza A viruses. In both cases, we recover more of the believed features of the viral demographic histories than the GMRF approach does. We also find that our GP method produces more reasonable uncertainty estimates than the GMRF method. Copyright © 2013, The International Biometric Society.
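    The GP machinery itself is beyond a short snippet, but the relationship the abstract describes, that coalescent waiting times encode population size, can be illustrated with the classic skyline estimator, a much simpler precursor of such methods. Times here are in coalescent units and the data are made up:

```python
def classic_skyline(coal_times, n):
    """Classic skyline estimator: while k lineages remain, the expected
    waiting time to the next coalescence is N / (k*(k-1)/2) in coalescent
    units, so each inter-coalescent interval yields N_hat = dt * k*(k-1)/2."""
    estimates = []
    prev, k = 0.0, n
    for t in coal_times:            # coalescence times, increasing
        estimates.append((t - prev) * k * (k - 1) / 2.0)
        prev, k = t, k - 1
    return estimates

# 4 sampled lineages -> 3 coalescent events (made-up times, constant N)
nhat = classic_skyline([0.1, 0.3, 0.9], n=4)
```

    The GP approach of the paper effectively smooths and regularizes these noisy per-interval point estimates into a continuous trajectory with uncertainty bands.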

  2. Coalescent-based species tree inference from gene tree topologies under incomplete lineage sorting by maximum likelihood.

    Wu, Yufeng

    2012-03-01

    Incomplete lineage sorting can cause incongruence between the phylogenetic history of genes (the gene tree) and that of the species (the species tree), which can complicate the inference of phylogenies. In this article, I present a new coalescent-based algorithm for species tree inference with maximum likelihood. I first describe an improved method for computing the probability of a gene tree topology given a species tree, which is much faster than an existing algorithm by Degnan and Salter (2005). Based on this method, I develop a practical algorithm that takes a set of gene tree topologies and infers species trees with maximum likelihood. This algorithm searches for the best species tree by starting from initial species trees and performing heuristic search to obtain better trees with higher likelihood. This algorithm, called STELLS (which stands for Species Tree InfErence with Likelihood for Lineage Sorting), has been implemented in a program that is downloadable from the author's web page. The simulation results show that the STELLS algorithm is more accurate than an existing maximum likelihood method for many datasets, especially when there is noise in gene trees. I also show that the STELLS algorithm is efficient and can be applied to real biological datasets. © 2011 The Author. Evolution© 2011 The Society for the Study of Evolution.

  3. Inferring causal molecular networks: empirical assessment through a community-based effort.

    Hill, Steven M; Heiser, Laura M; Cokelaer, Thomas; Unger, Michael; Nesser, Nicole K; Carlin, Daniel E; Zhang, Yang; Sokolov, Artem; Paull, Evan O; Wong, Chris K; Graim, Kiley; Bivol, Adrian; Wang, Haizhou; Zhu, Fan; Afsari, Bahman; Danilova, Ludmila V; Favorov, Alexander V; Lee, Wai Shing; Taylor, Dane; Hu, Chenyue W; Long, Byron L; Noren, David P; Bisberg, Alexander J; Mills, Gordon B; Gray, Joe W; Kellen, Michael; Norman, Thea; Friend, Stephen; Qutub, Amina A; Fertig, Elana J; Guan, Yuanfang; Song, Mingzhou; Stuart, Joshua M; Spellman, Paul T; Koeppl, Heinz; Stolovitzky, Gustavo; Saez-Rodriguez, Julio; Mukherjee, Sach

    2016-04-01

    It remains unclear whether causal, rather than merely correlational, relationships in molecular networks can be inferred in complex biological settings. Here we describe the HPN-DREAM network inference challenge, which focused on learning causal influences in signaling networks. We used phosphoprotein data from cancer cell lines as well as in silico data from a nonlinear dynamical model. Using the phosphoprotein data, we scored more than 2,000 networks submitted by challenge participants. The networks spanned 32 biological contexts and were scored in terms of causal validity with respect to unseen interventional data. A number of approaches were effective, and incorporating known biology was generally advantageous. Additional sub-challenges considered time-course prediction and visualization. Our results suggest that learning causal relationships may be feasible in complex settings such as disease states. Furthermore, our scoring approach provides a practical way to empirically assess inferred molecular networks in a causal sense.

  4. A human error probability estimate methodology based on fuzzy inference and expert judgment on nuclear plants

    Nascimento, C.S. do; Mesquita, R.N. de

    2009-01-01

    Recent studies point to human error as an important factor in many industrial and nuclear accidents: Three Mile Island (1979), Bhopal (1984), Chernobyl and Challenger (1986) are classical examples. The human contribution to these accidents may be better understood and analyzed using Human Reliability Analysis (HRA), which has become an essential part of Probabilistic Safety Analysis (PSA) of nuclear plants. Both HRA and PSA depend on Human Error Probability (HEP) estimates for quantitative analysis. These probabilities are strongly affected by Performance Shaping Factors (PSF), which directly influence human behavior and thus shape HEPs according to specific environmental conditions and the individual characteristics of the persons responsible for the actions. This PSF dependence creates a serious data-availability problem, since the few existing databases are either too generic or too specific. Besides this, most nuclear plants do not keep historical records of human error occurrences. Therefore, in order to overcome this data shortage, a methodology based on fuzzy inference and expert judgment was employed in this paper to determine human error occurrence probabilities and to evaluate the PSFs affecting actions performed by operators in a nuclear power plant (the IEA-R1 nuclear reactor). The obtained HEP values were compared with reference data from the current literature in order to demonstrate the coherence and validity of the approach. This comparison leads to the conclusion that the results of this work can be employed in both HRA and PSA, enabling efficient assessment of potential improvements in plant safety conditions, operational procedures and local working conditions (author)
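    As a rough illustration of how a fuzzy inference system can turn qualitative PSF ratings into an HEP estimate, here is a minimal zero-order Sugeno sketch. The membership functions, the rule base, and the consequent HEP values are invented for illustration and are not taken from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_hep(stress, experience):
    """Zero-order Sugeno sketch: each rule maps fuzzified PSF levels (both
    rated on [0, 1]) to a representative HEP, combined by a firing-strength
    weighted average."""
    low_s, high_s = tri(stress, -0.5, 0.0, 0.5), tri(stress, 0.5, 1.0, 1.5)
    low_e, high_e = tri(experience, -0.5, 0.0, 0.5), tri(experience, 0.5, 1.0, 1.5)
    # rule base: (firing strength, consequent HEP) -- illustrative values
    rules = [
        (min(low_s, high_e), 1e-4),   # calm, experienced -> very low HEP
        (min(low_s, low_e), 1e-3),
        (min(high_s, high_e), 1e-2),
        (min(high_s, low_e), 1e-1),   # stressed, novice -> high HEP
    ]
    num = sum(w * hep for w, hep in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else None

hep_novice = fuzzy_hep(stress=0.9, experience=0.2)
hep_expert = fuzzy_hep(stress=0.1, experience=0.9)
```

    In a real HRA application the rule base and consequents would be elicited from expert judgment, as the paper describes.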

  5. Inference of Cell Mechanics in Heterogeneous Epithelial Tissue Based on Multivariate Clone Shape Quantification

    Tsuboi, Alice; Umetsu, Daiki; Kuranaga, Erina; Fujimoto, Koichi

    2017-01-01

    Cell populations in multicellular organisms show genetic and non-genetic heterogeneity, even in undifferentiated tissues of multipotent cells during development and tumorigenesis. The heterogeneity causes differences in mechanical properties, such as cell bond tension or adhesion, at the cell–cell interface, which determine the shape of clonal population boundaries via cell sorting or mixing. The boundary shape can alter the degree of cell–cell contact and thus influence the physiological consequences of sorting or mixing at the boundary (e.g., tumor suppression or progression), suggesting that cell mechanics could help clarify the physiology of heterogeneous tissues. While precise inference of the mechanical tension loaded at each cell–cell contact has been extensively developed, there has been little progress on how to distinguish population-boundary geometries and identify their causes in heterogeneous tissues. We developed a pipeline combining multivariate analysis of clone shape with tissue mechanical simulations. We examined clones with four different genotypes within Drosophila wing imaginal discs: wild-type, tartan (trn) overexpression, hibris (hbs) overexpression, and Eph RNAi. Although the clones were previously known to exhibit smoothed or convoluted morphologies, their mechanical properties were unknown. By applying a multivariate analysis to multiple criteria used to quantify the clone shapes based on individual cell shapes, we found the optimal criteria to distinguish not only among the four genotypes, but also non-genetic from genetic heterogeneity. The efficient segregation of clone shape enabled us to quantitatively compare experimental data with tissue mechanical simulations. As a result, we identified the mechanical basis contributing to the clone shapes of the distinct genotypes. The present pipeline will promote the understanding of the functions of mechanical interactions in heterogeneous tissue in a non-invasive manner.

  6. A review and comparison of Bayesian and likelihood-based inferences in beta regression and zero-or-one-inflated beta regression.

    Liu, Fang; Eugenio, Evercita C

    2018-04-01

    Beta regression is an increasingly popular statistical technique in medical research for modeling outcomes that assume values in (0, 1), such as proportions and patient-reported outcomes. When outcomes take values in the intervals [0,1), (0,1], or [0,1], zero-or-one-inflated beta (zoib) regression can be used. We provide a thorough review of beta regression and zoib regression in their modeling, inferential, and computational aspects via the likelihood-based and Bayesian approaches. We demonstrate via simulation studies the statistical and practical importance of correctly modeling the inflation at zero/one rather than replacing such values ad hoc with values close to zero/one; the latter approach can lead to biased estimates and invalid inferences. We show via simulation studies that the likelihood-based approach is in general computationally faster than the MCMC algorithms used in Bayesian inference, but runs the risk of non-convergence, large biases, and sensitivity to starting values in the optimization algorithm, especially with clustered/correlated data, data with sparse inflation at zero and one, and data that warrant regularization of the likelihood. The disadvantages of the regular likelihood-based approach make the Bayesian approach an attractive alternative in these cases. Software packages and tools for fitting beta and zoib regressions in both the likelihood-based and Bayesian frameworks are also reviewed.
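    For readers new to beta modeling, the following sketch shows the beta log-likelihood that the likelihood-based approach maximizes, seeded with method-of-moments estimates. The data are illustrative, there are no covariates, and no inflation at zero/one is handled:

```python
import math

def beta_moments(xs):
    """Method-of-moments fit of Beta(a, b) to outcomes in (0, 1); a cheap
    starting point before a full likelihood or MCMC fit."""
    n = len(xs)
    m = sum(xs) / n
    v = sum((x - m) ** 2 for x in xs) / (n - 1)
    common = m * (1.0 - m) / v - 1.0
    return m * common, (1.0 - m) * common

def beta_loglik(xs, a, b):
    """Beta log-likelihood, the objective of the likelihood-based approach."""
    c = math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
    return sum(c + (a - 1.0) * math.log(x) + (b - 1.0) * math.log(1.0 - x)
               for x in xs)

props = [0.2, 0.3, 0.25, 0.4, 0.35, 0.3]   # illustrative proportions
a0, b0 = beta_moments(props)
```

    In beta regression proper, the mean is linked to covariates (typically via a logit link) rather than held constant as here.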

  7. A Game-Chain-Based Approach for Decision Making

    An, Tingyu; Watanabe, tsunami

    2009-01-01

    Nowadays, with the rapid development of the information society, decision-making problems become more and more complicated, especially in large-scale systems such as infrastructure, environmental and industrial fields, where decisions are usually accompanied by psychological competition between the involved parties in a complicated, uncertain and dynamic situation. From a holistic system perspective, a specific decision-making method, described as game-chain-based decision making, has been proposed in...

  8. Event Completion: Event Based Inferences Distort Memory in a Matter of Seconds

    Strickland, Brent; Keil, Frank

    2011-01-01

    We present novel evidence that implicit causal inferences distort memory for events only seconds after viewing. Adults watched videos of someone launching (or throwing) an object. However, the videos omitted the moment of contact (or release). Subjects falsely reported seeing the moment of contact when it was implied by subsequent footage but did…

  9. Multimodel inference and adaptive management

    Rehme, S.E.; Powell, L.A.; Allen, Craig R.

    2011-01-01

    Ecology is an inherently complex science coping with correlated variables, nonlinear interactions and multiple scales of pattern and process, making it difficult for experiments to result in clear, strong inference. Natural resource managers, policy makers, and stakeholders rely on science to provide timely and accurate management recommendations. However, the time necessary to untangle the complexities of interactions within ecosystems is often far greater than the time available to make management decisions. One method of coping with this problem is multimodel inference. Multimodel inference assesses uncertainty by calculating likelihoods among multiple competing hypotheses, but multimodel inference results are often equivocal. Despite this, there may be pressure for ecologists to provide management recommendations regardless of the strength of their study’s inference. We reviewed papers in the Journal of Wildlife Management (JWM) and the journal Conservation Biology (CB) to quantify the prevalence of multimodel inference approaches, the resulting inference (weak versus strong), and how authors dealt with the uncertainty. Thirty-eight percent and 14%, respectively, of articles in the JWM and CB used multimodel inference approaches. Strong inference was rarely observed, with only 7% of JWM and 20% of CB articles resulting in strong inference. We found the majority of weak inference papers in both journals (59%) gave specific management recommendations. Model selection uncertainty was ignored in most recommendations for management. We suggest that adaptive management is an ideal method to resolve uncertainty when research results in weak inference.
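    A standard way to carry out the multimodel inference the authors review is via Akaike weights; weights spread across several models, rather than concentrated on one, are what the paper calls weak inference. A minimal sketch with made-up AIC scores:

```python
import math

def akaike_weights(aic_values):
    """Convert AIC scores of competing models into Akaike weights: the
    relative likelihood of each model, given the data and the model set."""
    best = min(aic_values)
    rel = [math.exp(-0.5 * (a - best)) for a in aic_values]
    total = sum(rel)
    return [r / total for r in rel]

# Three candidate models; a spread-out weight vector signals weak inference
weights = akaike_weights([100.0, 101.0, 110.0])
```

    Here the top model carries only about 62% of the weight, so a management recommendation based on it alone would ignore substantial model selection uncertainty, which is exactly the problem the review highlights.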

  10. Response of multiferroic composites inferred from a fast-Fourier-transform-based numerical scheme

    Brenner, Renald; Bravo-Castillero, Julián

    2010-01-01

    The effective response and the local fields within periodic magneto-electric multiferroic composites are investigated by means of a numerical scheme based on fast Fourier transforms. This computational framework relies on the iterative resolution of coupled series expansions for the magnetic, electric and strain fields. By using an augmented Lagrangian formulation, a simple and robust procedure which makes use of the uncoupled Green operators for the elastic, electrostatics and magnetostatics problems is proposed. Its accuracy is assessed in the cases of laminated and fibrous two-phase composites for which analytical solutions exist

  11. Algorithms for MDC-based multi-locus phylogeny inference: beyond rooted binary gene trees on single alleles.

    Yu, Yun; Warnow, Tandy; Nakhleh, Luay

    2011-11-01

    One of the criteria for inferring a species tree from a collection of gene trees, when gene tree incongruence is assumed to be due to incomplete lineage sorting (ILS), is Minimize Deep Coalescence (MDC). Exact algorithms for inferring the species tree from rooted, binary trees under MDC were recently introduced. Nevertheless, in phylogenetic analyses of biological data sets, estimated gene trees may differ from true gene trees, be incompletely resolved, and not necessarily rooted. In this article, we propose new MDC formulations for the cases where the gene trees are unrooted/binary, rooted/non-binary, and unrooted/non-binary. Further, we prove structural theorems that allow us to extend the algorithms for the rooted/binary gene tree case to these cases in a straightforward manner. In addition, we devise MDC-based algorithms for cases when multiple alleles per species may be sampled. We study the performance of these methods in coalescent-based computer simulations.

  12. Inferring motion and location using WLAN RSSI

    Muthukrishnan, Kavitha; van der Zwaag, B.J.; Havinga, Paul J.M.; Fuller, R.; Koutsoukos, X.

    2009-01-01

    We present novel algorithms to infer movement by making use of inherent fluctuations in the received signal strengths from existing WLAN infrastructure. We evaluate the performance of the presented algorithms based on classification metrics such as recall and precision using annotated traces
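    A minimal sketch of the idea: movement stirs up multipath fading, so the spread of RSSI samples within a short window separates "moving" from "still". The threshold and the sample windows below are invented, not taken from the paper:

```python
def infer_motion(rssi_window, threshold=4.0):
    """Classify a window of RSSI samples (dBm) as 'moving' or 'still' from
    the sample standard deviation: motion increases multipath fading, so
    received signal strength fluctuates more when the device moves."""
    n = len(rssi_window)
    mean = sum(rssi_window) / n
    var = sum((x - mean) ** 2 for x in rssi_window) / n
    return "moving" if var ** 0.5 > threshold else "still"

still = infer_motion([-60, -61, -60, -59, -60, -61])     # stable signal
moving = infer_motion([-60, -72, -55, -68, -50, -75])    # large swings
```

    In practice the threshold would be calibrated against annotated traces, as the authors' recall/precision evaluation implies.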

  13. Base oils and methods for making the same

    Ohler, Nicholas; Fisher, Karl; Tirmizi, Shakeel

    2018-01-09

    Provided herein are isoparaffins derived from hydrocarbon terpenes such as myrcene, ocimene and farnesene, and methods for making the same. In certain variations, the isoparaffins have utility as lubricant base stocks.

  14. Protocol-based care: the standardisation of decision-making?

    Rycroft-Malone, Jo; Fontenla, Marina; Seers, Kate; Bick, Debra

    2009-05-01

    To explore how protocol-based care affects clinical decision-making. In the context of evidence-based practice, protocol-based care is a mechanism for facilitating the standardisation of care and streamlining decision-making through rationalising the information with which to make judgements and ultimately decisions. However, whether protocol-based care does, in the reality of practice, standardise decision-making is unknown. This paper reports on a study that explored the impact of protocol-based care on nurses' decision-making. Theoretically informed by realistic evaluation and the promoting action on research implementation in health services framework, a case study design using ethnographic methods was used. Two sites were purposively sampled; a diabetic and endocrine unit and a cardiac medical unit. Within each site, data collection included observation, postobservation semi-structured interviews with staff and patients, field notes, feedback sessions and document review. Data were inductively and thematically analysed. Decisions made by nurses in both sites were varied according to many different and interacting factors. While several standardised care approaches were available for use, in reality, a variety of information sources informed decision-making. The primary approach to knowledge exchange and acquisition was person-to-person; decision-making was a social activity. Rarely were standardised care approaches obviously referred to; nurses described following a mental flowchart, not necessarily linked to a particular guideline or protocol. When standardised care approaches were used, it was reported that they were used flexibly and particularised. While the logic of protocol-based care is algorithmic, in the reality of clinical practice, other sources of information supported nurses' decision-making process. This has significant implications for the political goal of standardisation. The successful implementation and judicious use of tools such as

  15. Pedestrian Detection Based on Adaptive Selection of Visible Light or Far-Infrared Light Camera Image by Fuzzy Inference System and Convolutional Neural Network-Based Verification.

    Kang, Jin Kyu; Hong, Hyung Gil; Park, Kang Ryoung

    2017-07-08

    A number of studies have been conducted to enhance the pedestrian detection accuracy of intelligent surveillance systems. However, detecting pedestrians under outdoor conditions is a challenging problem due to the varying lighting, shadows, and occlusions. In recent times, a growing number of studies have been performed on visible light camera-based pedestrian detection systems using a convolutional neural network (CNN) in order to make the pedestrian detection process more resilient to such conditions. However, visible light cameras still cannot detect pedestrians during nighttime, and are easily affected by shadows and lighting. There are many studies on CNN-based pedestrian detection through the use of far-infrared (FIR) light cameras (i.e., thermal cameras) to address such difficulties. However, when the solar radiation increases and the background temperature reaches the same level as the body temperature, it remains difficult for the FIR light camera to detect pedestrians due to the insignificant difference between the pedestrian and non-pedestrian features within the images. Researchers have been trying to solve this issue by inputting both the visible light and the FIR camera images into the CNN as the input. This, however, takes a longer time to process, and makes the system structure more complex as the CNN needs to process both camera images. This research adaptively selects a more appropriate candidate between two pedestrian images from visible light and FIR cameras based on a fuzzy inference system (FIS), and the selected candidate is verified with a CNN. Three types of databases were tested, taking into account various environmental factors using visible light and FIR cameras. The results showed that the proposed method performs better than the previously reported methods.

  16. Entropic Inference

    Caticha, Ariel

    2011-03-01

    In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEnt and Bayes' rule, and therefore unifies the two themes of these workshops—the Maximum Entropy and the Bayesian methods—into a single general inference scheme.
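    The central quantity in ME updating is the logarithmic relative entropy of a candidate posterior p with respect to the prior q; updating maximizes it subject to the constraints imposed by new information. A minimal numeric sketch for discrete distributions:

```python
import math

def relative_entropy(p, q):
    """Relative entropy S[p|q] = -sum_i p_i * log(p_i / q_i). It is zero
    only when p equals the prior q and negative otherwise, so the
    unconstrained maximum is 'do not update'; constraints force p away
    from q by the least amount."""
    return -sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0.0)

prior = [0.5, 0.5]
s_same = relative_entropy(prior, prior)        # no new information
s_moved = relative_entropy([0.7, 0.3], prior)  # a constrained update
```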

  17. Health decision making: lynchpin of evidence-based practice.

    Spring, Bonnie

    2008-01-01

    Health decision making is both the lynchpin and the least developed aspect of evidence-based practice. The evidence-based practice process requires integrating the evidence with consideration of practical resources and patient preferences and doing so via a process that is genuinely collaborative. Yet, the literature is largely silent about how to accomplish integrative, shared decision making. Implications for evidence-based practice are discussed for 2 theories of clinician decision making (expected utility and fuzzy trace) and 2 theories of patient health decision making (transtheoretical model and reasoned action). Three suggestions are offered. First, it would be advantageous to have theory-based algorithms that weight and integrate the 3 data strands (evidence, resources, preferences) in different decisional contexts. Second, patients, not providers, make the decisions of greatest impact on public health, and those decisions are behavioral. Consequently, theory explicating how provider-patient collaboration can influence patient lifestyle decisions made miles from the provider's office is greatly needed. Third, although the preponderance of data on complex decisions supports a computational approach, such an approach to evidence-based practice is too impractical to be widely applied at present. More troublesomely, until patients come to trust decisions made computationally more than they trust their providers' intuitions, patient adherence will remain problematic. A good theory of integrative, collaborative health decision making remains needed.

  18. Role of Temporal Diversity in Inferring Social Ties Based on Spatio-Temporal Data

    Desai, Deshana; Nisar, Harsh; Bhardawaj, Rishab

    2016-01-01

    The last two decades have seen a tremendous surge in research on social networks and their implications. The studies include inferring social relationships, which in turn have been used for targeted advertising, recommendations, and search customization. However, the offline experiences of humans, the conversations with people and face-to-face interactions that govern our lives, have received less attention. We introduce the DAIICT Spatio-Temporal Network (DSSN), a spatiotemporal data...

  19. Probabilistic inference with noisy-threshold models based on a CP tensor decomposition

    Vomlel, Jiří; Tichavský, Petr

    2014-01-01

    Vol. 55, No. 4 (2014), pp. 1072-1092. ISSN 0888-613X. R&D Projects: GA ČR GA13-20012S; GA ČR GA102/09/1278. Institutional support: RVO:67985556. Keywords: Bayesian networks * Probabilistic inference * Candecomp-Parafac tensor decomposition * Symmetric tensor rank. Subject RIV: JD - Computer Applications, Robotics. Impact factor: 2.451, year: 2014. http://library.utia.cas.cz/separaty/2014/MTR/vomlel-0427059.pdf

  20. Logical inference and evaluation

    Perey, F.G.

    1981-01-01

    Most methodologies of evaluation currently used are based upon the theory of statistical inference. It is generally perceived that this theory is not capable of dealing satisfactorily with what are called systematic errors. Theories of logical inference should be capable of treating all of the information available, including that not involving frequency data. A theory of logical inference is presented as an extension of deductive logic via the concept of plausibility and the application of group theory. Some conclusions, based upon the application of this theory to evaluation of data, are also given

  1. Entropic Inference

    Caticha, Ariel

    2010-01-01

    In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEn...

  2. A novel mutual information-based Boolean network inference method from time-series gene expression data.

    Shohag Barman

    Inferring a gene regulatory network from time-series gene expression data in systems biology is a challenging problem. Many methods have been suggested, most of which have a scalability limitation due to the combinatorial cost of searching a regulatory set of genes. In addition, they have focused on the accurate inference of a network structure only. Therefore, there is a pressing need to develop a network inference method to search regulatory genes efficiently and to predict the network dynamics accurately. In this study, we employed a Boolean network model with a restricted update rule scheme to capture coarse-grained dynamics, and propose a novel mutual information-based Boolean network inference (MIBNI) method. Given time-series gene expression data as an input, the method first identifies a set of initial regulatory genes using mutual information-based feature selection, and then improves the dynamics prediction accuracy by iteratively swapping a pair of genes between sets of the selected regulatory genes and the other genes. Through extensive simulations with artificial datasets, MIBNI showed consistently better performance than six well-known existing methods, REVEAL, Best-Fit, RelNet, CST, CLR, and BIBN, in terms of both structural and dynamics prediction accuracy. We further tested the proposed method with two real gene expression datasets for an Escherichia coli gene regulatory network and a fission yeast cell cycle network, and also observed better results using MIBNI compared to the six other methods. Taken together, MIBNI is a promising tool for predicting both the structure and the dynamics of a gene regulatory network.
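    The mutual-information ranking step that MIBNI-style methods rely on can be sketched for binarized expression series; the toy series below are illustrative:

```python
import math
from collections import Counter

def mutual_info(x, y):
    """Mutual information (in bits) between two binary sequences; feature
    selection of this kind ranks candidate regulators of a target gene."""
    n = len(x)
    pxy, px, py = Counter(zip(x, y)), Counter(x), Counter(y)
    return sum((c / n) * math.log2(c * n / (px[a] * py[b]))
               for (a, b), c in pxy.items())

# Toy binarized expression: one candidate matches the target, one is noise
target   = [0, 1, 0, 1, 1, 0, 1, 0]
reg_good = [0, 1, 0, 1, 1, 0, 1, 0]
reg_bad  = [0, 0, 1, 1, 0, 0, 1, 1]
mi_good = mutual_info(reg_good, target)
mi_bad = mutual_info(reg_bad, target)
```

    In the full method, the regulators selected this way seed an iterative swap search that also optimizes dynamics prediction, not just structure.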

  3. SU-E-T-144: Bayesian Inference of Local Relapse Data Using a Poisson-Based Tumour Control Probability Model

    La Russa, D [The Ottawa Hospital Cancer Centre, Ottawa, ON (Canada)

    2015-06-15

    Purpose: The purpose of this project is to develop a robust method of parameter estimation for a Poisson-based TCP model using Bayesian inference. Methods: Bayesian inference was performed using the PyMC3 probabilistic programming framework written in Python. A Poisson-based TCP regression model that accounts for clonogen proliferation was fit to observed rates of local relapse as a function of equivalent dose in 2 Gy fractions for a population of 623 stage-I non-small-cell lung cancer patients. The Slice Markov Chain Monte Carlo sampling algorithm was used to sample the posterior distributions, and was initiated using the maximum of the posterior distributions found by optimization. The calculation of TCP with each sample step required integration over the free parameter α, which was performed using an adaptive 24-point Gauss-Legendre quadrature. Convergence was verified via inspection of the trace plot and posterior distribution for each of the fit parameters, as well as with comparisons of the most probable parameter values with their respective maximum likelihood estimates. Results: Posterior distributions for α, the standard deviation of α (σ), the average tumour cell-doubling time (Td), and the repopulation delay time (Tk), were generated assuming α/β = 10 Gy and a fixed clonogen density of 10^7 cm^-3. Posterior predictive plots generated from samples from these posterior distributions are in excellent agreement with the observed rates of local relapse used in the Bayesian inference. The most probable values of the model parameters also agree well with maximum likelihood estimates. Conclusion: A robust method of performing Bayesian inference of TCP data using a complex TCP model has been established.
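    A minimal sketch of the kind of Poisson TCP curve being fitted, using linear-quadratic cell survival in 2 Gy fractions. The alpha value and clonogen number are illustrative, and the proliferation terms (Td, Tk) of the paper's model are omitted:

```python
import math

def poisson_tcp(eqd2, alpha=0.3, ab_ratio=10.0, clonogens=1e7):
    """Poisson TCP without the proliferation terms of the full model:
    TCP = exp(-N * SF), with linear-quadratic survival after total dose D
    delivered in d = 2 Gy fractions, SF = exp(-alpha*D*(1 + d/(alpha/beta)))."""
    sf = math.exp(-alpha * eqd2 * (1.0 + 2.0 / ab_ratio))
    return math.exp(-clonogens * sf)

tcp_low = poisson_tcp(40.0)    # EQD2 of 40 Gy
tcp_high = poisson_tcp(70.0)   # EQD2 of 70 Gy
```

    The Bayesian analysis in the paper places posteriors over alpha (and its population spread) rather than fixing it, which is why each TCP evaluation requires integrating over alpha.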

  4. A new Bayesian Inference-based Phase Associator for Earthquake Early Warning

    Meier, Men-Andrin; Heaton, Thomas; Clinton, John; Wiemer, Stefan

    2013-04-01

    State-of-the-art network-based Earthquake Early Warning (EEW) systems can provide warnings for large magnitude 7+ earthquakes. Although regions in the direct vicinity of the epicenter will not receive warnings prior to damaging shaking, real-time event characterization is available before the destructive S-wave arrival across much of the strongly affected region. In contrast, in the case of the more frequent medium-size events, such as the devastating 1994 Mw6.7 Northridge, California, earthquake, providing timely warning to the smaller damage zone is more difficult. For such events the "blind zone" of current systems (e.g. the CISN ShakeAlert system in California) is similar in size to the area over which severe damage occurs. We propose a faster and more robust Bayesian inference-based event associator that, in contrast to the current standard associators (e.g. Earthworm Binder), is tailored to EEW and exploits information other than phase arrival times alone. In particular, the associator potentially allows for reliable automated event association with as little as two observations, which, compared to the ShakeAlert system, would speed up the real-time characterizations by about ten seconds and thus reduce the blind zone area by up to 80%. We compile an extensive data set of regional and teleseismic earthquake and noise waveforms spanning a wide range of earthquake magnitudes and tectonic regimes. We pass these waveforms through a causal real-time filterbank with passband filters between 0.1 and 50Hz, and, updating every second from the event detection, extract the maximum amplitudes in each frequency band. Using this dataset, we define distributions of amplitude maxima in each passband as a function of epicentral distance and magnitude. For the real-time data, we pass incoming broadband and strong motion waveforms through the same filterbank and extract an evolving set of maximum amplitudes in each passband. We use the maximum amplitude distributions to check

  5. Risk-Based Decision Making for Deterioration Processes Using POMDP

    Nielsen, Jannie Sønderkær; Sørensen, John Dalsgaard

    2015-01-01

    This paper proposes a method for risk-based decision making for maintenance of deteriorating components, based on the partially observable Markov decision process (POMDP). Unlike most methods, the decision policies do not need to be stationary and can vary according to seasons and near the end...

  6. An analysis of line-drawings based upon automatically inferred grammar and its application to chest x-ray images

    Nakayama, Akira; Yoshida, Yuuji; Fukumura, Teruo

    1984-01-01

    Grammar inference can be used as an image-structure analysis technique. Applied to naturally obtained images, however, this approach faces several problems, since no practical grammatical technique for two-dimensional images has been established. The authors developed a technique that solves these problems, primarily for the automated structure analysis of naturally obtained images. The first half of this paper describes the automatic inference of a line-drawing generation grammar and line-drawing analysis based on that inference. The second half of the paper reports on an actual analysis. The proposed technique extracts object line drawings from line drawings containing noise. Its effectiveness was evaluated on the example of extracting rib center lines from thinned-line chest X-ray images of practical scale and complexity. In this example, the total number of characteristic points (ends, branch points and intersections) composing the line drawings was 377 per image, and the total number of line segments composing the line drawings was 566 on average per image. The extraction ratio was 86.6%, which seems reasonable given the complexity of the input line drawings. Further, the result was compared with the rib center lines identified by the automatic screening system AISCR-V3, as a representative of conventional processing, and was satisfactory considering the versatility of this method. (Wakatsuki, Y.)

  7. Inference for the Sharpe Ratio Using a Likelihood-Based Approach

    Ying Liu

    2012-01-01

    The Sharpe ratio is a prominent risk-adjusted performance measure used by practitioners. Statistical testing of this ratio using its asymptotic distribution has lagged behind its use. In this paper, highly accurate likelihood analysis is applied for inference on the Sharpe ratio. Both the one- and two-sample problems are considered. The methodology has O(n^(-3/2)) distributional accuracy and can be implemented using any parametric return distribution structure. Simulations are provided to demonstrate the method's superior accuracy over existing methods used for testing in the literature.
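The paper's higher-order likelihood method is not reproduced here, but the first-order asymptotic test it improves upon can be sketched. The delta-method standard error sqrt((1 + SR^2/2)/n) assumes iid normal returns.

```python
import math

def sharpe_ratio(returns, rf=0.0):
    """Sample Sharpe ratio with an unbiased variance estimate."""
    n = len(returns)
    mean = sum(returns) / n
    var = sum((r - mean) ** 2 for r in returns) / (n - 1)
    return (mean - rf) / math.sqrt(var)

def sharpe_z_test(returns, sr0=0.0, rf=0.0):
    """First-order asymptotic z-statistic for H0: SR = sr0 (iid normal returns)."""
    n = len(returns)
    sr = sharpe_ratio(returns, rf)
    se = math.sqrt((1 + 0.5 * sr ** 2) / n)  # delta-method standard error
    return sr, (sr - sr0) / se

# Toy monthly returns, purely illustrative.
rets = [0.02, -0.01, 0.03, 0.01, 0.00, 0.02, -0.02, 0.04, 0.01, 0.02]
sr, z = sharpe_z_test(rets)
```

The O(n^(-1/2)) accuracy of this z-test in small samples is exactly what motivates the O(n^(-3/2)) likelihood-based approach of the paper.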

  8. MIRA: An R package for DNA methylation-based inference of regulatory activity.

    Lawson, John T; Tomazou, Eleni M; Bock, Christoph; Sheffield, Nathan C

    2018-03-01

    DNA methylation contains information about the regulatory state of the cell. MIRA aggregates genome-scale DNA methylation data into a DNA methylation profile for independent region sets with shared biological annotation. Using this profile, MIRA infers and scores the collective regulatory activity for each region set. MIRA facilitates regulatory analysis in situations where classical regulatory assays would be difficult and allows public sources of open chromatin and protein binding regions to be leveraged for novel insight into the regulatory state of DNA methylation datasets. R package available on Bioconductor: http://bioconductor.org/packages/release/bioc/html/MIRA.html. nsheffield@virginia.edu.

  9. Perceptual inference.

    Aggelopoulos, Nikolaos C

    2015-08-01

    Perceptual inference refers to the ability to infer sensory stimuli from predictions that result from internal neural representations built through prior experience. Methods of Bayesian statistical inference and decision theory model cognition adequately by using error sensing either in guiding action or in "generative" models that predict the sensory information. In this framework, perception can be seen as a process qualitatively distinct from sensation, a process of information evaluation using previously acquired and stored representations (memories) that is guided by sensory feedback. The stored representations can be utilised as internal models of sensory stimuli enabling long term associations, for example in operant conditioning. Evidence for perceptual inference is contributed by such phenomena as the cortical co-localisation of object perception with object memory, the response invariance in the responses of some neurons to variations in the stimulus, as well as from situations in which perception can be dissociated from sensation. In the context of perceptual inference, sensory areas of the cerebral cortex that have been facilitated by a priming signal may be regarded as comparators in a closed feedback loop, similar to the better known motor reflexes in the sensorimotor system. The adult cerebral cortex can be regarded as similar to a servomechanism, in using sensory feedback to correct internal models, producing predictions of the outside world on the basis of past experience. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Effort-Based Decision-Making in Schizophrenia.

    Culbreth, Adam J; Moran, Erin K; Barch, Deanna M

    2018-08-01

    Motivational impairment has long been associated with schizophrenia but the underlying mechanisms are not clearly understood. Recently, a small but growing literature has suggested that aberrant effort-based decision-making may be a potential contributory mechanism for motivational impairments in psychosis. Specifically, multiple reports have consistently demonstrated that individuals with schizophrenia are less willing than healthy controls to expend effort to obtain rewards. Further, this effort-based decision-making deficit has been shown to correlate with severity of negative symptoms and level of functioning, in many but not all studies. In the current review, we summarize this literature and discuss several factors that may underlie aberrant effort-based decision-making in schizophrenia.

  11. NIMROD: a program for inference via a normal approximation of the posterior in models with random effects based on ordinary differential equations.

    Prague, Mélanie; Commenges, Daniel; Guedj, Jérémie; Drylewicz, Julia; Thiébaut, Rodolphe

    2013-08-01

    Models based on ordinary differential equations (ODE) are widespread tools for describing dynamical systems. In biomedical sciences, data from each subject can be sparse, making it difficult to precisely estimate individual parameters by standard non-linear regression, but information can often be gained from between-subjects variability. This makes it natural to use mixed-effects models to estimate population parameters. Although the maximum likelihood approach is a valuable option, identifiability issues favour Bayesian approaches, which can incorporate prior knowledge in a flexible way. However, the combination of difficulties coming from the ODE system and from the presence of random effects raises a major numerical challenge. Computations can be simplified by making a normal approximation of the posterior to find the maximum of the posterior distribution (MAP). Here we present the NIMROD program (normal approximation inference in models with random effects based on ordinary differential equations) devoted to MAP estimation in ODE models. We describe the specific implemented features, such as convergence criteria and an approximation of the leave-one-out cross-validation to assess the model's quality of fit. First, in pharmacokinetics models, we evaluate the properties of this algorithm and compare it with the FOCE and MCMC algorithms in simulations. Then, we illustrate the use of NIMROD on Amprenavir pharmacokinetics data from the PUZZLE clinical trial in HIV-infected patients. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  12. Promotion of technical harmonisation on risk-based decision making

    Kirchsteiger, Christian; Cojazzi, Giacomo

    2000-01-01

    The EC-JRC International Workshop on Promotion of Technical Harmonisation on Risk-Based Decision Making, held at Stresa and Ispra, Italy, 22-25 May 2000, was an experts meeting to discuss the possible need of developing an internationally accepted generic 'standard' for risk-based decision making. This paper briefly describes the workshop background, its organisation and summarises its main results and conclusions; it reflects the personal opinions of the authors and in no way commits the European Commission. (author)

  13. A Genome-Scale Investigation of How Sequence, Function, and Tree-Based Gene Properties Influence Phylogenetic Inference.

    Shen, Xing-Xing; Salichos, Leonidas; Rokas, Antonis

    2016-09-02

    Molecular phylogenetic inference is inherently dependent on choices in both methodology and data. Many insightful studies have shown how choices in methodology, such as the model of sequence evolution or optimality criterion used, can strongly influence inference. In contrast, much less is known about the impact of choices in the properties of the data, typically genes, on phylogenetic inference. We investigated the relationships between 52 gene properties (24 sequence-based, 19 function-based, and 9 tree-based) with each other and with three measures of phylogenetic signal in two assembled data sets of 2,832 yeast and 2,002 mammalian genes. We found that most gene properties, such as evolutionary rate (measured through the percent average of pairwise identity across taxa) and total tree length, were highly correlated with each other. Similarly, several gene properties, such as gene alignment length, Guanine-Cytosine content, and the proportion of tree distance on internal branches divided by relative composition variability (treeness/RCV), were strongly correlated with phylogenetic signal. Analysis of partial correlations between gene properties and phylogenetic signal, in which gene evolutionary rate and alignment length were simultaneously controlled, showed similar patterns of correlations, albeit weaker in strength. Examination of the relative importance of each gene property on phylogenetic signal identified gene alignment length, along with the number of parsimony-informative sites and variable sites, as the most important predictors. Interestingly, the subsets of gene properties that optimally predicted phylogenetic signal differed considerably across our three phylogenetic measures and two data sets; however, gene alignment length and RCV were consistently included as predictors of all three phylogenetic measures in both yeasts and mammals. These results suggest that a handful of sequence-based gene properties are reliable predictors of phylogenetic signal.
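The partial-correlation analysis described above (relating a gene property to phylogenetic signal while controlling for a confounder such as evolutionary rate) follows the standard first-order partial correlation formula. A minimal sketch with invented toy vectors, not the study's data:

```python
import math

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

def partial_corr(x, y, z):
    """Correlation of x and y with the linear effect of z removed."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz**2) * (1 - ryz**2))

# Hypothetical per-gene values (purely illustrative):
length = [500, 800, 1200, 300, 950, 700]  # alignment length
signal = [0.4, 0.55, 0.7, 0.3, 0.6, 0.5]  # phylogenetic signal measure
rate   = [1.2, 0.9, 0.7, 1.5, 0.8, 1.0]   # evolutionary rate
r_raw  = pearson(length, signal)
r_part = partial_corr(length, signal, rate)
```

As in the study, the raw and partial correlations share the same sign but the partial correlation is attenuated once the confounding property is controlled.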

  14. CALIPSO-Inferred Aerosol Direct Radiative Effects: Bias Estimates Using Ground-Based Raman Lidars

    Thorsen, Tyler; Fu, Qiang

    2016-01-01

    Observational constraints on the change in the radiative energy budget caused by the presence of aerosols, i.e. the aerosol direct radiative effect (DRE), have recently been made using observations from the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite (CALIPSO). CALIPSO observations have the potential to provide improved global estimates of aerosol DRE compared to passive sensor-derived estimates due to CALIPSO's ability to perform vertically-resolved aerosol retrievals over all surface types and over cloud. In this study we estimate the uncertainties in CALIPSO-inferred aerosol DRE using multiple years of observations from the Atmospheric Radiation Measurement (ARM) program's Raman lidars (RL) at midlatitude and tropical sites. Examined are assumptions about the ratio of extinction-to-backscatter (i.e. the lidar ratio) made by the CALIPSO retrievals, which are needed to retrieve the aerosol extinction profile. The lidar ratio is shown to introduce minimal error in the mean aerosol DRE at the top-of-atmosphere and surface. It is also shown that CALIPSO is unable to detect all radiatively-significant aerosol, resulting in an underestimate in the magnitude of the aerosol DRE by 30-50%. Therefore, global estimates of the aerosol DRE inferred from CALIPSO observations are likely too weak.

  15. Landslide Fissure Inference Assessment by ANFIS and Logistic Regression Using UAS-Based Photogrammetry

    Ozgun Akcay

    2015-10-01

    Unmanned Aerial Systems (UAS) are now capable of gathering high-resolution data; landslides can therefore be explored in detail at larger scales. In this research, 132 aerial photographs were captured, and 85,456 features were detected and matched automatically using UAS photogrammetry. The root mean square (RMS) values of the image coordinates of the Ground Control Points (GCPs) varied from 0.521 to 2.293 pixels, whereas the maximum RMS value of the automatically matched features was calculated as 2.921 pixels. Using the 3D point cloud acquired by aerial photogrammetry, raster datasets of the aspect, the slope, and maximally stable extremal regions (MSER, detecting visual uniformity) were defined as three variables in order to infer fissure structures on the landslide surface. In this research, an Adaptive Neuro-Fuzzy Inference System (ANFIS) and a Logistic Regression (LR) were implemented using training datasets to infer fissure data appropriately. The accuracy of the predictive models was evaluated by drawing receiver operating characteristic (ROC) curves and by calculating the area under the ROC curve (AUC). The experiments showed that high-resolution imagery is an indispensable data source for modeling and validating landslide fissures appropriately.
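A minimal sketch of the logistic-regression branch of this workflow, with synthetic two-feature data standing in for the raster-derived variables (e.g. slope and a texture score) and AUC computed via the rank-sum formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the fissure data: two features per pixel,
# label 1 = fissure, 0 = background. Purely illustrative.
n = 400
X1 = rng.normal([1.0, 1.0], 0.8, size=(n // 2, 2))    # fissure-like samples
X0 = rng.normal([-1.0, -1.0], 0.8, size=(n // 2, 2))  # background samples
X = np.vstack([X1, X0])
y = np.r_[np.ones(n // 2), np.zeros(n // 2)]

# Logistic regression fitted by plain gradient descent on the NLL.
Xb = np.c_[X, np.ones(n)]  # add intercept column
w = np.zeros(3)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    w -= 0.1 * Xb.T @ (p - y) / n

scores = 1.0 / (1.0 + np.exp(-Xb @ w))

def roc_auc(y, s):
    """AUC via the rank-sum (Mann-Whitney) formulation."""
    order = np.argsort(s)
    ranks = np.empty(len(s))
    ranks[order] = np.arange(1, len(s) + 1)
    n1 = int(y.sum())
    n0 = len(y) - n1
    return (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

auc = roc_auc(y, scores)
```

The same AUC routine would score the ANFIS predictions, allowing the two models to be compared on identical held-out data.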

  16. Episodic memories predict adaptive value-based decision-making

    Murty, Vishnu; FeldmanHall, Oriel; Hunter, Lindsay E.; Phelps, Elizabeth A; Davachi, Lila

    2016-01-01

    Prior research illustrates that memory can guide value-based decision-making. For example, previous work has implicated both working memory and procedural memory (i.e., reinforcement learning) in guiding choice. However, other types of memories, such as episodic memory, may also influence decision-making. Here we test the role of episodic memory, specifically item versus associative memory, in supporting value-based choice. Participants completed a task where they first learned the value associated with trial-unique lotteries. After a short delay, they completed a decision-making task where they could choose to re-engage with previously encountered lotteries or with new, never-before-seen lotteries. Finally, participants completed a surprise memory test for the lotteries and their associated values. Results indicate that participants chose to re-engage more often with lotteries that had resulted in high versus low rewards. Critically, participants not only formed detailed, associative memories for the reward values coupled with individual lotteries, but also exhibited adaptive decision-making only when they had intact associative memory. We further found that the relationship between adaptive choice and associative memory generalized to more complex, ecologically valid choice behavior, such as social decision-making. However, individuals more strongly encoded experiences of social violations, such as being treated unfairly, suggesting a bias in how individuals form associative memories within social contexts. Together, these findings provide an important integration of the episodic memory and decision-making literatures to better understand key mechanisms supporting adaptive behavior. PMID:26999046

  17. Inference as Prediction

    Watson, Jane

    2007-01-01

    Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…

  18. Machine health prognostics using the Bayesian-inference-based probabilistic indication and high-order particle filtering framework

    Yu, Jianbo

    2015-12-01

    Prognostics is an efficient means of achieving zero-downtime performance, maximum productivity and proactive maintenance of machines. Prognostics aims to assess and predict the time evolution of machine health degradation so that machine failures can be predicted and prevented. A novel prognostics system is developed based on the data-model-fusion scheme using the Bayesian inference-based self-organizing map (SOM) and an integration of logistic regression (LR) and high-order particle filtering (HOPF). In this prognostics system, a baseline SOM is constructed to model the data distribution space of a healthy machine under the assumption that predictable fault patterns are not available. A Bayesian inference-based probability (BIP) derived from the baseline SOM is developed as a quantitative indication of machine health degradation. The BIP offers a failure probability for the monitored machine and has an intuitive interpretation in terms of the health degradation state. Based on historic BIPs, the constructed LR model and its modeling noise constitute a high-order Markov process (HOMP) describing machine health propagation. HOPF is used to solve the HOMP estimation in order to predict the evolution of machine health in the form of a probability density function (PDF). An on-line model update scheme is developed to adapt the Markov process to machine health dynamics quickly. Experimental results on a bearing test-bed illustrate the potential of the proposed system as an effective and simple tool for machine health prognostics.
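The particle-filtering component can be illustrated with a plain bootstrap particle filter tracking a drifting scalar health index; the degradation model, noise levels, and observation model below are invented for illustration, not the paper's HOMP formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Bootstrap particle filter for a scalar health index that drifts upward
# toward failure. All model constants are illustrative.
def propagate(x):
    """One degradation step: deterministic drift plus process noise."""
    return x + 0.05 + 0.02 * rng.standard_normal(x.shape)

def likelihood(x, z, sigma=0.1):
    """Gaussian observation likelihood of indicator z given health x."""
    return np.exp(-0.5 * ((z - x) / sigma) ** 2)

n_p = 1000
particles = 0.05 * rng.standard_normal(n_p)  # initial health near 0

truth, estimates = 0.0, []
for step in range(20):
    truth += 0.05                             # true (hidden) degradation
    z = truth + 0.1 * rng.standard_normal()   # noisy health indicator
    particles = propagate(particles)
    w = likelihood(particles, z)
    w /= w.sum()
    idx = rng.choice(n_p, size=n_p, p=w)      # multinomial resampling
    particles = particles[idx]
    estimates.append(particles.mean())

final_err = abs(estimates[-1] - truth)
```

Propagating the particle cloud forward without further measurements yields the predictive PDF of health, from which a remaining-useful-life estimate can be read off against a failure threshold.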

  19. Influence of branding on preference-based decision making.

    Philiastides, Marios G; Ratcliff, Roger

    2013-07-01

    Branding has become one of the most important determinants of consumer choices. Intriguingly, the psychological mechanisms of how branding influences decision making remain elusive. In the research reported here, we used a preference-based decision-making task and computational modeling to identify which internal components of processing are affected by branding. We found that a process of noisy temporal integration of subjective value information can model preference-based choices reliably and that branding biases are explained by changes in the rate of the integration process itself. This result suggests that branding information and subjective preference are integrated into a single source of evidence in the decision-making process, thereby altering choice behavior.
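The "noisy temporal integration of subjective value information" corresponds to a drift-diffusion process in which branding shifts the drift rate. A minimal simulation (all parameter values are arbitrary, not fitted to the study's data):

```python
import numpy as np

rng = np.random.default_rng(2)

def ddm_choice(drift, threshold=1.0, dt=0.01, noise=1.0, max_t=10.0):
    """Simulate one noisy-integration trial; return (choice, decision time).
    Choice 1 = upper boundary (e.g. the branded option)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold and t < max_t:
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return (1 if x >= threshold else 0), t

def p_upper(drift, n=500):
    """Proportion of trials terminating at the upper boundary."""
    return sum(ddm_choice(drift)[0] for _ in range(n)) / n

p_neutral = p_upper(0.0)  # no branding bias: choices near 50/50
p_branded = p_upper(0.8)  # higher drift rate biases choices upward
```

Shifting only the drift rate (rather than the starting point or threshold) reproduces the paper's finding that branding changes the rate of evidence integration itself.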

  20. IMAGINE: Interstellar MAGnetic field INference Engine

    Steininger, Theo

    2018-03-01

    IMAGINE (Interstellar MAGnetic field INference Engine) performs inference on generic parametric models of the Galaxy. The modular open source framework uses highly optimized tools and technology such as the MultiNest sampler (ascl:1109.006) and the information field theory framework NIFTy (ascl:1302.013) to create an instance of the Milky Way based on a set of parameters for physical observables, using Bayesian statistics to judge the mismatch between measured data and model prediction. The flexibility of the IMAGINE framework allows for simple refitting for newly available data sets and makes state-of-the-art Bayesian methods easily accessible particularly for random components of the Galactic magnetic field.

  1. A dual justification for science-based policy-making

    Pedersen, David Budtz

    2014-01-01

    Science-based policy-making has grown ever more important in recent years, in parallel with the dramatic increase in the complexity and uncertainty of the ways in which science and technology interact with society and economy at the national, regional and global level. Installing a proper framewo...

  2. Data-Based Decision Making in Teams: Enablers and Barriers

    Bolhuis, Erik; Schildkamp, Kim; Voogt, Joke

    2016-01-01

    Data use is becoming more important in higher education. In this case study, a team of teachers from a teacher education college was supported in data-based decision making by means of the data team procedure. This data team studied the reasons why students drop out. A team's success depends in part on whether the team is able to develop and apply…

  3. Multicriteria decision-making method based on a cosine similarity ...

    The cosine similarity measure is often used in information retrieval, citation analysis, and automatic classification. However, it scarcely deals with trapezoidal fuzzy information and multicriteria decision-making problems. For this purpose, a cosine similarity measure between trapezoidal fuzzy numbers is proposed based on ...
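One common way to define a cosine similarity between trapezoidal fuzzy numbers is to treat each number (a1, a2, a3, a4) as a 4-dimensional vector; the cited paper's exact definition may differ in detail.

```python
import math

def cosine_similarity_trapezoidal(A, B):
    """Cosine similarity between trapezoidal fuzzy numbers A = (a1,a2,a3,a4)
    and B = (b1,b2,b3,b4), treating each as a 4-dimensional vector."""
    dot = sum(a * b for a, b in zip(A, B))
    na = math.sqrt(sum(a * a for a in A))
    nb = math.sqrt(sum(b * b for b in B))
    return dot / (na * nb)

def rank_alternatives(alternatives, ideal):
    """Score each alternative by similarity of its fuzzy rating to the ideal."""
    return sorted(alternatives.items(),
                  key=lambda kv: cosine_similarity_trapezoidal(kv[1], ideal),
                  reverse=True)

# Hypothetical ratings for three alternatives against an ideal rating.
ideal = (0.7, 0.8, 0.9, 1.0)
alts = {"A1": (0.6, 0.7, 0.8, 0.9),
        "A2": (0.2, 0.3, 0.4, 0.5),
        "A3": (0.7, 0.8, 0.9, 1.0)}
ranking = rank_alternatives(alts, ideal)
best = ranking[0][0]
```

In the multicriteria setting, each alternative's rating on each criterion is a trapezoidal fuzzy number, and the alternative whose aggregated rating is most similar to the ideal is selected.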

  4. Applications of decision theory to test-based decision making

    van der Linden, Willem J.

    1987-01-01

    The use of Bayesian decision theory to solve problems in test-based decision making is discussed. Four basic decision problems are distinguished: (1) selection; (2) mastery; (3) placement; and (4) classification, the situation where each treatment has its own criterion. Each type of decision can be

  5. Making Instructional Decisions Based on Data: What, How, and Why

    Mokhtari, Kouider; Rosemary, Catherine A.; Edwards, Patricia A.

    2007-01-01

    A carefully coordinated literacy assessment and instruction framework implemented school-wide can support school teams in making sense of various types of data for instructional planning. Instruction that is data based and goal driven sets the stage for continuous reading and writing improvement. (Contains 2 figures.)

  6. School-Based Decision-Making: The Canadian Perspective.

    Peters, Frank

    1997-01-01

    In Canada, school-based decision making is a political expedient to co-opt public support for public education at the same time as financial resources to schools are being curtailed. School councils are advisory in nature and have no statutory position in either school or school-system decisions. (17 references) (MLF)

  7. ERC analysis: web-based inference of gene function via evolutionary rate covariation.

    Wolfe, Nicholas W; Clark, Nathan L

    2015-12-01

    The recent explosion of comparative genomics data presents an unprecedented opportunity to construct gene networks via the evolutionary rate covariation (ERC) signature. ERC is used to identify genes that experienced similar evolutionary histories, and thereby draws functional associations between them. The ERC Analysis website allows researchers to exploit genome-wide datasets to infer novel genes in any biological function and to explore deep evolutionary connections between distinct pathways and complexes. The website provides five analytical methods, graphical output, statistical support and access to an increasing number of taxonomic groups. Analyses and data at http://csb.pitt.edu/erc_analysis/ nclark@pitt.edu. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  8. Automatic fuzzy inference system development for marker-based watershed segmentation

    Gonzalez, M A; Meschino, G J; Ballarin, V L

    2007-01-01

    Texture image segmentation is a constant challenge in digital image processing. The partition of an image into regions that allow the experienced observer to obtain the necessary information can be done using a Mathematical Morphology tool called the Watershed Transform. This transform is able to distinguish extremely complex objects and is easily adaptable to various kinds of images. The success of the Watershed Transform depends essentially on the existence of unequivocal markers for each of the objects of interest. The standard methods for marker detection are highly specific and complex when objects presenting great variability of shape, size and texture are processed. This paper proposes the automatic generation of a fuzzy inference system for marker detection, using object selection done by the expert. This method allows applying the Watershed Transform to biomedical images with different kinds of texture. The results show that the proposed method is an effective tool for the application of the Watershed Transform

  9. Using ontological inference and hierarchical matchmaking to overcome semantic heterogeneity in remote sensing-based biodiversity monitoring

    Nieland, Simon; Kleinschmit, Birgit; Förster, Michael

    2015-05-01

    Ontology-based applications hold promise in improving spatial data interoperability. In this work we use remote sensing-based biodiversity information and apply semantic formalisation and ontological inference to show improvements in data interoperability/comparability. The proposed methodology includes an observation-based, "bottom-up" engineering approach for remote sensing applications and gives a practical example of semantic mediation of geospatial products. We apply the methodology to three different nomenclatures used for remote sensing-based classification of two heathland nature conservation areas in Belgium and Germany. We analysed sensor nomenclatures with respect to their semantic formalisation and their bio-geographical differences. The results indicate that a hierarchical and transparent nomenclature is far more important for transferability than the sensor or study area. The inclusion of additional information, not necessarily belonging to a vegetation class description, is a key factor for the future success of using semantics for interoperability in remote sensing.

  10. Forecasting building energy consumption with hybrid genetic algorithm-hierarchical adaptive network-based fuzzy inference system

    Li, Kangji [Institute of Cyber-Systems and Control, Zhejiang University, Hangzhou 310027 (China); School of Electricity Information Engineering, Jiangsu University, Zhenjiang 212013 (China); Su, Hongye [Institute of Cyber-Systems and Control, Zhejiang University, Hangzhou 310027 (China)

    2010-11-15

    There are several ways to forecast building energy consumption, varying from simple regression to models based on physical principles. In this paper, a new method, namely the hybrid genetic algorithm-hierarchical adaptive network-based fuzzy inference system (GA-HANFIS) model, is developed. In this model, a hierarchical structure decreases the rule base dimension. Both clustering and rule base parameters are optimized by GAs and neural networks (NNs). The model is applied to predict a hotel's daily air conditioning consumption over a period of 3 months. The results obtained by the proposed model are presented and compared with a regular NN method, indicating that the GA-HANFIS model performs better than NNs in terms of forecasting accuracy. (author)

  11. Make

    Frauenfelder, Mark

    2012-01-01

    The first magazine devoted entirely to do-it-yourself technology projects presents its 29th quarterly edition for people who like to tweak, disassemble, recreate, and invent cool new uses for technology. MAKE Volume 29 takes bio-hacking to a new level. Get introduced to DIY tracking devices before they hit the consumer electronics marketplace. Learn how to build an EKG machine to study your heartbeat, and put together a DIY bio lab to study athletic motion using consumer grade hardware.

  12. Interpatch foraging in honeybees-rational decision making at secondary hubs based upon time and motivation.

    Najera, Daniel A; McCullough, Erin L; Jander, Rudolf

    2012-11-01

    For honeybees, Apis mellifera, the hive has been well known to function as a primary decision-making hub, a place from which foragers decide among various directions, distances, and times of day to forage efficiently. Whether foraging honeybees can make similarly complex navigational decisions from locations away from the hive is unknown. To examine whether or not such secondary decision-making hubs exist, we trained bees to forage at four different locations. Specifically, we trained honeybees to first forage to a distal site "CT" 100 m away from the hive; if food was present, they fed and then chose to go home. If food was not present, the honeybees were trained to forage to three auxiliary sites, each at a different time of the day: A in the morning, B at noon, and C in the afternoon. The foragers learned to check site CT for food first and then efficiently depart to the correct location based upon the time of day if there was no food at site CT. Thus, the honeybees were able to cognitively map motivation, time, and five different locations (Hive, CT, A, B, and C) in two spatial dimensions; these are the contents of the cognitive map used by the honeybees here. While at site CT, we verified that the honeybees could choose between 4 different directions (to A, B, C, and the Hive) and thus label it as a secondary decision-making hub. The observed decision making uncovered here is inferred to constitute genuine logical operations, involving a branched structure, based upon the premises of motivational state, and spatiotemporal knowledge.

  13. Estimating uncertainty of inference for validation

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code are an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield (10^13-10^14 neutron) output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  14. Polynomial Chaos–Based Bayesian Inference of K-Profile Parameterization in a General Circulation Model of the Tropical Pacific

    Sraj, Ihab

    2016-08-26

    The authors present a polynomial chaos (PC)-based Bayesian inference method for quantifying the uncertainties of the K-profile parameterization (KPP) within the MIT general circulation model (MITgcm) of the tropical Pacific. The inference of the uncertain parameters is based on a Markov chain Monte Carlo (MCMC) scheme that utilizes a newly formulated test statistic taking into account the different components representing the structures of turbulent mixing on both daily and seasonal time scales in addition to the data quality, and filters for the effects of parameter perturbations over those as a result of changes in the wind. To avoid the prohibitive computational cost of integrating the MITgcm model at each MCMC iteration, a surrogate model for the test statistic using the PC method is built. Because of the noise in the model predictions, a basis-pursuit-denoising (BPDN) compressed sensing approach is employed to determine the PC coefficients of a representative surrogate model. The PC surrogate is then used to evaluate the test statistic in the MCMC step for sampling the posterior of the uncertain parameters. Results of the posteriors indicate good agreement with the default values for two parameters of the KPP model, namely the critical bulk and gradient Richardson numbers; while the posteriors of the remaining parameters were barely informative. © 2016 American Meteorological Society.
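The surrogate idea (fit a cheap polynomial expansion to a handful of expensive model runs, then evaluate the expansion inside MCMC instead of the model) can be sketched in one dimension with a Legendre basis, which is the natural PC basis for a uniformly distributed parameter. The "expensive model" below is an invented stand-in, not MITgcm.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(3)

# Stand-in for the expensive model: a smooth test statistic of one
# uncertain parameter theta, scaled to [-1, 1].
def expensive_stat(theta):
    return np.exp(0.5 * theta) + 0.3 * theta ** 2

# Build a Legendre polynomial-chaos surrogate from a few "model runs".
train_theta = np.linspace(-1, 1, 9)
train_vals = expensive_stat(train_theta)
coeffs = legendre.legfit(train_theta, train_vals, deg=4)

def surrogate(theta):
    """Cheap evaluation of the fitted PC expansion."""
    return legendre.legval(theta, coeffs)

# The surrogate would replace the model inside the MCMC loop; here we
# simply check its accuracy on held-out parameter samples.
test_theta = rng.uniform(-1, 1, 100)
max_err = np.max(np.abs(surrogate(test_theta) - expensive_stat(test_theta)))
```

The paper determines the coefficients by basis-pursuit denoising rather than least squares to cope with noisy model predictions; plain `legfit` is used here only to keep the sketch self-contained.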

  15. Intelligent PID controller based on ant system algorithm and fuzzy inference and its application to bionic artificial leg

    谭冠政; 曾庆冬; 李文斌

    2004-01-01

    A design method for intelligent proportional-integral-derivative (PID) controllers was proposed based on the ant system algorithm and fuzzy inference. This kind of controller is called a Fuzzy-ant system PID controller. It consists of an off-line part and an on-line part. In the off-line part, for a given control system with a PID controller, by taking the overshoot, settling time and steady-state error of the system's unit step response as the performance indexes and by using the ant system algorithm, a group of optimal PID parameters K_p*, T_i* and T_d* can be obtained, which are used as the initial values for the on-line tuning of the PID parameters. In the on-line part, based on K_p*, T_i* and T_d* and according to the current system error e and its time derivative, a specific program is written, which is used to optimize and adjust the PID parameters on-line through a fuzzy inference mechanism to ensure that the system response has optimal transient and steady-state performance. This kind of intelligent PID controller can be used to control the motor of the intelligent bionic artificial leg designed by the authors. The results of a computer simulation experiment show that the controller has less overshoot and a shorter settling time.
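    The two-part scheme (off-line optimization, then on-line fuzzy adjustment) can be caricatured in a few lines. This is only a sketch under invented assumptions: the ant-system search is replaced by fixed "optimized" gains, the fuzzy mechanism is reduced to one triangular membership rule on the error, and the plant is a first-order lag.

```python
# Triangular membership: 1 at e = 0, falling to 0 for |e| >= width.
def membership_small(e, width=0.5):
    return max(0.0, 1.0 - abs(e) / width)

# One fuzzy-style rule: the larger the error, the more Kp is raised above
# its (assumed) off-line optimized value Kp*; near zero error, Kp -> Kp*.
def fuzzy_adjust(kp_star, e):
    big = 1.0 - membership_small(e)
    return kp_star * (1.0 + 0.5 * big)

def simulate(kp_star=2.0, ki=1.0, kd=0.1, dt=0.01, steps=2000):
    # First-order plant dy/dt = -y + u tracking a unit step setpoint.
    y, integral, prev_e = 0.0, 0.0, 1.0
    for _ in range(steps):
        e = 1.0 - y
        de = (e - prev_e) / dt
        kp = fuzzy_adjust(kp_star, e)
        integral += e * dt
        u = kp * e + ki * integral + kd * de
        y += dt * (-y + u)
        prev_e = e
    return y

final = simulate()
print(final)  # settles near the setpoint 1.0
```

    A real implementation would also adjust T_i and T_d through fuzzy rules on e and de; only the proportional channel is shown here.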

  16. Modeling of a 5-cell direct methanol fuel cell using adaptive-network-based fuzzy inference systems

    Wang, Rongrong; Qi, Liang; Xie, Xiaofeng; Ding, Qingqing; Li, Chunwen; Ma, ChenChi M.

    The methanol concentration, temperature and current were taken as inputs and the cell voltage as the output, and the performance of a direct methanol fuel cell (DMFC) was modeled by adaptive-network-based fuzzy inference systems (ANFIS). Artificial neural network (ANN) and polynomial-based models were selected for comparison with the ANFIS with respect to quality and accuracy. Based on the ANFIS model obtained, the characteristics of the DMFC were studied. The results show that temperature and methanol concentration greatly affect the performance of the DMFC. Within a restricted current range, the methanol concentration does not greatly affect the stack voltage. In order to obtain higher fuel utilization efficiency, the methanol concentration and temperature should be adjusted according to the load on the system.

  17. Modeling of a 5-cell direct methanol fuel cell using adaptive-network-based fuzzy inference systems

    Wang, Rongrong; Li, Chunwen [Department of Automation, Tsinghua University, Beijing 100084 (China); Qi, Liang; Xie, Xiaofeng [Institute of Nuclear and New Energy Technology, Tsinghua University, Beijing 100084 (China); Ding, Qingqing [Department of Electrical Engineering, Tsinghua University, Beijing 100084 (China); Ma, ChenChi M. [National Tsing Hua University, Hsinchu 300 (China)

    2008-12-01

    The methanol concentration, temperature and current were taken as inputs and the cell voltage as the output, and the performance of a direct methanol fuel cell (DMFC) was modeled by adaptive-network-based fuzzy inference systems (ANFIS). Artificial neural network (ANN) and polynomial-based models were selected for comparison with the ANFIS with respect to quality and accuracy. Based on the ANFIS model obtained, the characteristics of the DMFC were studied. The results show that temperature and methanol concentration greatly affect the performance of the DMFC. Within a restricted current range, the methanol concentration does not greatly affect the stack voltage. In order to obtain higher fuel utilization efficiency, the methanol concentration and temperature should be adjusted according to the load on the system. (author)

  18. SWPhylo - A Novel Tool for Phylogenomic Inferences by Comparison of Oligonucleotide Patterns and Integration of Genome-Based and Gene-Based Phylogenetic Trees.

    Yu, Xiaoyu; Reva, Oleg N

    2018-01-01

    Modern phylogenetic studies may benefit from the analysis of complete genome sequences of various microorganisms. Evolutionary inferences based on genome-scale analysis are believed to be more accurate than the gene-based alternative. However, the computational complexity of current phylogenomic procedures, the inappropriateness of standard phylogenetic tools for processing genome-wide data, and the lack of reliable substitution models suited to alignment-free phylogenomic approaches deter microbiologists from using these opportunities. For example, the super-matrix and super-tree approaches of phylogenomics use multiple integrated genomic loci or individual gene-based trees to infer an overall consensus tree. However, these approaches potentially multiply errors of gene annotation and sequence alignment, not to mention the computational complexity and laboriousness of the methods. In this article, we demonstrate that the annotation- and alignment-free comparison of genome-wide tetranucleotide frequencies, termed oligonucleotide usage patterns (OUPs), allows fast and reliable inference of phylogenetic trees. These were congruent with the corresponding whole-genome super-matrix trees in terms of tree topology when compared with other known approaches, including 16S ribosomal RNA and GyrA protein sequence comparison, complete genome-based MAUVE, and CVTree methods. A Web-based program to perform the alignment-free OUP-based phylogenomic inferences was implemented at http://swphylo.bi.up.ac.za/. The applicability of the tool was tested on different taxa from subspecies to intergeneric levels. Distinguishing between closely related taxonomic units can be reinforced by providing the program with alignments of marker protein sequences, e.g., GyrA.
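    The core OUP computation is easy to sketch: count tetranucleotide frequencies and compare genomes by the distance between frequency vectors. The sequences below are toy strings, not real genomes, and the Euclidean distance is just one plausible choice of metric.

```python
from itertools import product

TETRAS = [''.join(p) for p in product('ACGT', repeat=4)]  # all 256 words

def oup(seq):
    # Tetranucleotide frequency vector (the oligonucleotide usage pattern)
    counts = dict.fromkeys(TETRAS, 0)
    for i in range(len(seq) - 3):
        w = seq[i:i + 4]
        if w in counts:
            counts[w] += 1
    total = max(1, sum(counts.values()))
    return [counts[t] / total for t in TETRAS]

def dist(p, q):
    # Euclidean distance between two usage patterns
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

a = 'ACGTACGTACGTACGT' * 20
b = 'ACGTACGAACGTACGT' * 20   # a close variant of a
c = 'GGGGCCCCTTTTAAAA' * 20   # compositionally very different

d_ab, d_ac = dist(oup(a), oup(b)), dist(oup(a), oup(c))
print(d_ab < d_ac)   # similar sequences yield closer patterns
```

    A tree-building step (e.g., neighbor joining on the resulting distance matrix) would follow in a real pipeline.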

  19. SWPhylo – A Novel Tool for Phylogenomic Inferences by Comparison of Oligonucleotide Patterns and Integration of Genome-Based and Gene-Based Phylogenetic Trees

    Yu, Xiaoyu; Reva, Oleg N

    2018-01-01

    Modern phylogenetic studies may benefit from the analysis of complete genome sequences of various microorganisms. Evolutionary inferences based on genome-scale analysis are believed to be more accurate than the gene-based alternative. However, the computational complexity of current phylogenomic procedures, the inappropriateness of standard phylogenetic tools for processing genome-wide data, and the lack of reliable substitution models suited to alignment-free phylogenomic approaches deter microbiologists from using these opportunities. For example, the super-matrix and super-tree approaches of phylogenomics use multiple integrated genomic loci or individual gene-based trees to infer an overall consensus tree. However, these approaches potentially multiply errors of gene annotation and sequence alignment, not to mention the computational complexity and laboriousness of the methods. In this article, we demonstrate that the annotation- and alignment-free comparison of genome-wide tetranucleotide frequencies, termed oligonucleotide usage patterns (OUPs), allows fast and reliable inference of phylogenetic trees. These were congruent with the corresponding whole-genome super-matrix trees in terms of tree topology when compared with other known approaches, including 16S ribosomal RNA and GyrA protein sequence comparison, complete genome-based MAUVE, and CVTree methods. A Web-based program to perform the alignment-free OUP-based phylogenomic inferences was implemented at http://swphylo.bi.up.ac.za/. The applicability of the tool was tested on different taxa from subspecies to intergeneric levels. Distinguishing between closely related taxonomic units can be reinforced by providing the program with alignments of marker protein sequences, e.g., GyrA. PMID:29511354

  20. Decision Making under Uncertainty: A Neural Model based on Partially Observable Markov Decision Processes

    Rajesh P N Rao

    2010-11-01

    Full Text Available A fundamental problem faced by animals is learning to select actions based on noisy sensory information and incomplete knowledge of the world. It has been suggested that the brain engages in Bayesian inference during perception, but how such probabilistic representations are used to select actions has remained unclear. Here we propose a neural model of action selection and decision making based on the theory of partially observable Markov decision processes (POMDPs). Actions are selected based not on a single optimal estimate of state but on the posterior distribution over states (the belief state). We show how such a model provides a unified framework for explaining experimental results in decision making that involve both information gathering and overt actions. The model utilizes temporal difference (TD) learning for maximizing expected reward. The resulting neural architecture posits an active role for the neocortex in belief computation while ascribing a role to the basal ganglia in belief representation, value computation, and action selection. When applied to the random dots motion discrimination task, model neurons representing belief exhibit responses similar to those of LIP neurons in primate neocortex. The appropriate threshold for switching from information gathering to overt actions emerges naturally during reward maximization. Additionally, the time course of reward prediction error in the model shares similarities with dopaminergic responses in the basal ganglia during the random dots task. For tasks with a deadline, the model learns a decision making strategy that changes with elapsed time, predicting a collapsing decision threshold consistent with some experimental studies. The model provides a new framework for understanding neural decision making and suggests an important role for interactions between the neocortex and the basal ganglia in learning the mapping between probabilistic sensory representations and actions that maximize
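    The belief-state mechanism at the heart of the model can be sketched for a two-alternative task: noisy observations update a posterior over the hidden motion direction via Bayes' rule, and an overt action is triggered only when the belief crosses a threshold. The observation reliability and the threshold below are invented.

```python
# Two hidden states: dots moving right (R) or left (L).
# Each noisy observation reports the true direction with prob. p_correct.
def update(belief_right, obs, p_correct=0.6):
    like_r = p_correct if obs == 'R' else 1 - p_correct
    like_l = (1 - p_correct) if obs == 'R' else p_correct
    post = like_r * belief_right
    return post / (post + like_l * (1 - belief_right))

belief = 0.5                                # flat prior over directions
for obs in ['R', 'R', 'L', 'R', 'R']:       # a short evidence stream
    belief = update(belief, obs)

# Information gathering continues until the belief clears a threshold.
action = 'choose_right' if belief > 0.9 else 'keep_sampling'
print(round(belief, 3), action)
```

    In the full model the threshold is not fixed but emerges from reward maximization, and it collapses as a deadline approaches.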

  1. Sustainable development based energy policy making frameworks, a critical review

    Meyar-Naimi, H.; Vaez-Zadeh, S.

    2012-01-01

    This paper, as a first step, presents an overview of the origination and formulation of the sustainable development (SD) concept and the related policy making frameworks. The frameworks include Pressure–State–Response (PSR), Driving Force–State–Response (DSR), Driving Force–Pressure–State–Impact–Response (DPSIR), Driving Force–Pressure–State–Effect–Action (DPSEA) and Driving Force–Pressure–State–Exposure–Effect–Action (DPSEEA). In this regard, 40 case studies using the reviewed frameworks, reported during 1994–2011, are surveyed, and their application areas and application intensities are investigated. It is concluded that PSR has the highest application intensity and DPSEA and DPSEEA the lowest. Moreover, using the Analytical Hierarchy Process (AHP) with a set of criteria, it is shown that PSR and DPSIR have the highest and lowest priorities, respectively. Finally, the shortcomings of the frameworks' applications are discussed. The paper is helpful in selecting appropriate policy making frameworks and presents some hints for future research in the area toward developing more comprehensive models, especially for sustainable electric energy policy making. - Highlights: ► The origination and formulation of the sustainable development (SD) concept is reviewed. ► SD based frameworks (PSR, DSR, DPSIR, DPSEA and DPSEEA) are also reviewed. ► The frameworks' application areas and intensities in recent years are investigated. ► Finally, the SD concept and the SD based frameworks are criticized. ► It will be helpful for developing more comprehensive energy policy making models.

  2. An efficient Bayesian inference approach to inverse problems based on an adaptive sparse grid collocation method

    Ma Xiang; Zabaras, Nicholas

    2009-01-01

    A new approach to modeling inverse problems using a Bayesian inference method is introduced. The Bayesian approach considers the unknown parameters as random variables and seeks the probabilistic distribution of the unknowns. By introducing the concept of the stochastic prior state space to the Bayesian formulation, we reformulate the deterministic forward problem as a stochastic one. The adaptive hierarchical sparse grid collocation (ASGC) method is used for constructing an interpolant to the solution of the forward model in this prior space, which is large enough to capture all the variability/uncertainty in the posterior distribution of the unknown parameters. This solution can be considered as a function of the random unknowns and serves as a stochastic surrogate model for the likelihood calculation. A hierarchical Bayesian formulation is used to derive the posterior probability density function (PPDF). The spatial model is represented as a convolution of a smooth kernel and a Markov random field. The state space of the PPDF is explored using Markov chain Monte Carlo algorithms to obtain statistics of the unknowns. The likelihood calculation is performed by directly sampling the approximate stochastic solution obtained through the ASGC method. The technique is assessed on two nonlinear inverse problems: source inversion and permeability estimation in flow through porous media.
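    The role of the surrogate inside the MCMC loop can be sketched with a deliberately trivial example: a cheap stand-in replaces the forward model in a Metropolis sampler. The forward model, prior bounds, noise level and data here are all invented; the paper's sparse grid interpolant and spatial prior are omitted.

```python
import math
import random

random.seed(0)
true_theta = 1.0
# Synthetic noisy observations of the (identity) forward model
data = [true_theta + random.gauss(0, 0.2) for _ in range(50)]

def surrogate_forward(theta):
    # Cheap stand-in for the interpolated forward-model solution
    return theta

def log_post(theta):
    if not -5 < theta < 5:                 # flat prior on (-5, 5)
        return -math.inf
    pred = surrogate_forward(theta)
    return -sum((d - pred) ** 2 for d in data) / (2 * 0.2 ** 2)

# Metropolis random walk: the surrogate makes each iteration cheap.
theta, samples = 0.0, []
for _ in range(5000):
    prop = theta + random.gauss(0, 0.3)
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)

post_mean = sum(samples[1000:]) / len(samples[1000:])
print(post_mean)   # posterior mean near the true parameter
```

    The expensive part of a real run is building the interpolant once, up front; the sampler then never touches the full forward model.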

  3. Inference of Functionally-Relevant N-acetyltransferase Residues Based on Statistical Correlations.

    Neuwald, Andrew F; Altschul, Stephen F

    2016-12-01

    Over evolutionary time, members of a superfamily of homologous proteins sharing a common structural core diverge into subgroups filling various functional niches. At the sequence level, such divergence appears as correlations that arise from residue patterns distinct to each subgroup. Such a superfamily may be viewed as a population of sequences corresponding to a complex, high-dimensional probability distribution. Here we model this distribution as hierarchical interrelated hidden Markov models (hiHMMs), which describe these sequence correlations implicitly. By characterizing such correlations one may hope to obtain information regarding functionally-relevant properties that have thus far evaded detection. To do so, we infer a hiHMM distribution from sequence data using Bayes' theorem and Markov chain Monte Carlo (MCMC) sampling, which is widely recognized as the most effective approach for characterizing a complex, high dimensional distribution. Other routines then map correlated residue patterns to available structures with a view to hypothesis generation. When applied to N-acetyltransferases, this reveals sequence and structural features indicative of functionally important, yet generally unknown biochemical properties. Even for sets of proteins for which nothing is known beyond unannotated sequences and structures, this can lead to helpful insights. We describe, for example, a putative coenzyme-A-induced-fit substrate binding mechanism mediated by arginine residue switching between salt bridge and π-π stacking interactions. A suite of programs implementing this approach is available (psed.igs.umaryland.edu).

  4. Inference of Functionally-Relevant N-acetyltransferase Residues Based on Statistical Correlations.

    Andrew F Neuwald

    2016-12-01

    Full Text Available Over evolutionary time, members of a superfamily of homologous proteins sharing a common structural core diverge into subgroups filling various functional niches. At the sequence level, such divergence appears as correlations that arise from residue patterns distinct to each subgroup. Such a superfamily may be viewed as a population of sequences corresponding to a complex, high-dimensional probability distribution. Here we model this distribution as hierarchical interrelated hidden Markov models (hiHMMs), which describe these sequence correlations implicitly. By characterizing such correlations one may hope to obtain information regarding functionally-relevant properties that have thus far evaded detection. To do so, we infer a hiHMM distribution from sequence data using Bayes' theorem and Markov chain Monte Carlo (MCMC) sampling, which is widely recognized as the most effective approach for characterizing a complex, high dimensional distribution. Other routines then map correlated residue patterns to available structures with a view to hypothesis generation. When applied to N-acetyltransferases, this reveals sequence and structural features indicative of functionally important, yet generally unknown biochemical properties. Even for sets of proteins for which nothing is known beyond unannotated sequences and structures, this can lead to helpful insights. We describe, for example, a putative coenzyme-A-induced-fit substrate binding mechanism mediated by arginine residue switching between salt bridge and π-π stacking interactions. A suite of programs implementing this approach is available (psed.igs.umaryland.edu).

  5. Inference of median difference based on the Box-Cox model in randomized clinical trials.

    Maruo, K; Isogawa, N; Gosho, M

    2015-05-10

    In randomized clinical trials, many medical and biological measurements are not normally distributed and are often skewed. The Box-Cox transformation is a powerful procedure for comparing two treatment groups for skewed continuous variables in terms of a statistical test. However, it is difficult to directly estimate and interpret the location difference between the two groups on the original scale of the measurement. We propose a helpful method that infers the difference of the treatment effect on the original scale in a more easily interpretable form. We also provide statistical analysis packages that consistently include an estimate of the treatment effect, covariance adjustments, standard errors, and statistical hypothesis tests. The simulation study that focuses on randomized parallel group clinical trials with two treatment groups indicates that the performance of the proposed method is equivalent to or better than that of the existing non-parametric approaches in terms of the type-I error rate and power. We illustrate our method with cluster of differentiation 4 data in an acquired immune deficiency syndrome clinical trial. Copyright © 2015 John Wiley & Sons, Ltd.
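    The back-transformation idea can be sketched as follows, with λ fixed at 0 (the log case) and synthetic log-normal data; the paper's estimation of λ, covariate adjustment and standard errors are omitted.

```python
import math
import random

random.seed(1)
# Skewed (log-normal) samples; group B is shifted up on the log scale.
group_a = [math.exp(random.gauss(1.0, 0.5)) for _ in range(500)]
group_b = [math.exp(random.gauss(1.4, 0.5)) for _ in range(500)]

def boxcox(x, lam=0.0):
    return math.log(x) if lam == 0 else (x ** lam - 1) / lam

def inv_boxcox(z, lam=0.0):
    return math.exp(z) if lam == 0 else (lam * z + 1) ** (1 / lam)

def mean(v):
    return sum(v) / len(v)

za = [boxcox(x) for x in group_a]
zb = [boxcox(x) for x in group_b]

# Because the transform is monotone, the mean on the (near-normal)
# transformed scale back-transforms to the median on the original scale,
# so the difference below is a median difference in original units.
med_a, med_b = inv_boxcox(mean(za)), inv_boxcox(mean(zb))
diff = med_b - med_a
print(round(diff, 2))
```

    Reporting the effect as a median difference on the original measurement scale is what makes the Box-Cox analysis interpretable to clinicians.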

  6. UAV Controller Based on Adaptive Neuro-Fuzzy Inference System and PID

    Ali Moltajaei Farid

    2013-01-01

    Full Text Available Combining a neural network with a fuzzy system results in a hybrid neuro-fuzzy system (ANFIS), capable of reasoning and learning in an uncertain and imprecise environment. In this paper, an adaptive neuro-fuzzy inference system (ANFIS) is employed to control an unmanned aerial vehicle (UAV). First, the autopilot structure is defined, and then the ANFIS controller is applied to control the UAV's lateral position. The results of the ANFIS and PID lateral controllers are compared, showing that the two controllers have similar results. The ANFIS controller is capable of adaptation in nonlinear conditions, while the PID controller has to be tuned to preserve proper control in some conditions. The simulation results were generated in Matlab using the Aerosim Aeronautical Simulation Block Set, which provides a complete set of tools for the development of six degree-of-freedom models. A nonlinear Aerosonde unmanned aerial vehicle model with the ANFIS controller is simulated to verify the capability of the system. Moreover, the results are validated by the FlightGear flight simulator.

  7. Stereotyping to infer group membership creates plausible deniability for prejudice-based aggression.

    Cox, William T L; Devine, Patricia G

    2014-02-01

    In the present study, participants administered painful electric shocks to an unseen male opponent who was either explicitly labeled as gay or stereotypically implied to be gay. Identifying the opponent with a gay-stereotypic attribute produced a situation in which the target's group status was privately inferred but plausibly deniable to others. To test the plausible deniability hypothesis, we examined aggression levels as a function of internal (personal) and external (social) motivation to respond without prejudice. Whether plausible deniability was present or absent, participants high in internal motivation aggressed at low levels, and participants low in both internal and external motivation aggressed at high levels. The behavior of participants low in internal and high in external motivation, however, depended on experimental condition. They aggressed at low levels when observers could plausibly attribute their behavior to prejudice and aggressed at high levels when the situation granted plausible deniability. This work has implications for both obstacles to and potential avenues for prejudice-reduction efforts.

  8. RCK: accurate and efficient inference of sequence- and structure-based protein-RNA binding models from RNAcompete data.

    Orenstein, Yaron; Wang, Yuhao; Berger, Bonnie

    2016-06-15

    Protein-RNA interactions, which play vital roles in many processes, are mediated through both RNA sequence and structure. CLIP-based methods, which measure protein-RNA binding in vivo, suffer from experimental noise and systematic biases, whereas in vitro experiments capture a clearer signal of protein-RNA binding. Among them, RNAcompete provides binding affinities of a specific protein to more than 240,000 unstructured RNA probes in one experiment. The computational challenge is to infer RNA structure- and sequence-based binding models from these data. The state-of-the-art in sequence models, DeepBind, does not model structural preferences. RNAcontext models both sequence and structure preferences, but is outperformed by GraphProt. Unfortunately, GraphProt cannot detect structural preferences from RNAcompete data due to the unstructured nature of the data, as noted by its developers, nor can it be tractably run on the full RNAcompete dataset. We develop RCK, an efficient, scalable algorithm that infers both sequence and structure preferences based on a new k-mer based model. Remarkably, even though RNAcompete data are designed to be unstructured, RCK can still learn structural preferences from them. RCK significantly outperforms both RNAcontext and DeepBind in in vitro binding prediction for 244 RNAcompete experiments. Moreover, RCK is also faster and uses less memory, which enables scalability. While currently on par with existing methods in in vivo binding prediction in a small-scale test, we demonstrate that RCK will increasingly benefit from experimentally measured RNA structure profiles as compared to computationally predicted ones. By running RCK on the entire RNAcompete dataset, we generate and provide as a resource a set of protein-RNA structure-based models on an unprecedented scale. Software and models are freely available at http://rck.csail.mit.edu/ (contact: bab@mit.edu). Supplementary data are available at Bioinformatics online. © The Author 2016. Published by
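    The flavor of a k-mer based sequence model can be sketched in a few lines: a sequence is scored by summing weights over its k-mers. The weights below are invented, and RCK's structural component (k-mer weights conditioned on predicted structure profiles) is omitted.

```python
K = 3
# Hypothetical learned k-mer weights for one protein (not real RCK output)
weights = {'GCA': 1.2, 'CAU': 0.9, 'AUG': 0.4}

def score(seq):
    # Sum the weight of every k-mer window; unknown k-mers contribute 0.
    return sum(weights.get(seq[i:i + K], 0.0)
               for i in range(len(seq) - K + 1))

print(score('GGCAUG'))   # windows: GGC, GCA, CAU, AUG
```

    In the real model the weight table is learned from the RNAcompete affinities, and each k-mer carries a vector of weights, one per structural context.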

  9. Clinical data warehousing for evidence based decision making.

    Narra, Lekha; Sahama, Tony; Stapleton, Peta

    2015-01-01

    Large volumes of heterogeneous health data silos pose a big challenge when exploring for information to allow evidence based decision making and to ensure quality outcomes. In this paper, we present a proof of concept for adopting data warehousing technology to aggregate and analyse disparate health data in order to understand the impact of various lifestyle factors on obesity. We present a practical model for data warehousing with a detailed explanation, which can be adopted similarly for studying various other health issues.

  10. Knowledge based decision making: perspective on natural gas production

    Ydstie, B. Erik; Stuland, Kjetil M.

    2009-07-01

    Conclusions (drawn by the author): Decarbonization of energy sources - from coal to renewables. Natural gas is abundantly available - Norway is the no. 3 exporter. Natural gas is important as a hydrogen source for chemicals, for electricity, and for end-consumer usage (heating etc.). There is large potential for the application of model based decision making: where and when to install platforms and drill wells; how to operate platforms and pipeline systems; how to operate and optimize chemical production; and optimization of electricity generation systems. (author)

  11. A Model of Decision-Making Based on Critical Thinking

    Uluçınar, Ufuk; Aypay, Ahmet

    2016-01-01

    The aim of this study is to examine the causal relationships between high school students' inquisitiveness, open-mindedness, causal thinking, and rational and intuitive decision-making dispositions through an assumed model based on research data. This study was designed in correlational model. Confirmatory factor analysis and path analysis, which are structural equation modelling applications, were used to explain these relationships. The participants were 404 students studying in five high s...

  12. A new and accurate fault location algorithm for combined transmission lines using Adaptive Network-Based Fuzzy Inference System

    Sadeh, Javad; Afradi, Hamid [Electrical Engineering Department, Faculty of Engineering, Ferdowsi University of Mashhad, P.O. Box: 91775-1111, Mashhad (Iran)

    2009-11-15

    This paper presents a new and accurate algorithm for locating faults on a combined overhead transmission line with an underground power cable using an Adaptive Network-Based Fuzzy Inference System (ANFIS). The proposed method uses 10 ANFIS networks and consists of 3 stages: fault type classification, faulty section detection and exact fault location. In the first part, an ANFIS is used to determine the fault type, applying four inputs, i.e., the fundamental components of the three phase currents and the zero sequence current. Another ANFIS network is used to detect the faulty section, i.e., whether the fault is on the overhead line or on the underground cable. The other eight ANFIS networks are utilized to pinpoint the faults (two for each fault type). Four inputs, i.e., the dc component of the current, the fundamental frequency components of the voltage and current, and the angle between them, are used to train the neuro-fuzzy inference systems in order to accurately locate faults on each part of the combined line. The proposed method is evaluated under different fault conditions such as different fault locations, different fault inception angles and different fault resistances. Simulation results confirm that the proposed method can be used as an efficient means for accurate fault location on combined transmission lines. (author)

  13. α-Decomposition for estimating parameters in common cause failure modeling based on causal inference

    Zheng, Xiaoyu; Yamaguchi, Akira; Takata, Takashi

    2013-01-01

    The traditional α-factor model has focused on the occurrence frequencies of common cause failure (CCF) events. Global α-factors in the α-factor model are defined as fractions of failure probability for particular groups of components. However, there are unknown uncertainties in CCF parameter estimation owing to the scarcity of available failure data. Joint distributions of CCF parameters are actually determined by a set of possible causes, which are characterized by their CCF-triggering abilities and occurrence frequencies. In the present paper, the process of α-decomposition (Kelly-CCF method) is developed to learn about sources of uncertainty in CCF parameter estimation. Moreover, it aims to evaluate the CCF risk significance of different causes, expressed as decomposed α-factors. Firstly, a Hybrid Bayesian Network is adopted to reveal the relationship between potential causes and failures. Secondly, because all potential causes have different occurrence frequencies and different abilities to trigger dependent or independent failures, a regression model is provided and proved by conditional probability. Global α-factors are expressed in terms of explanatory variables (causes' occurrence frequencies) and parameters (decomposed α-factors). Finally, an example is provided to illustrate the process of hierarchical Bayesian inference for the α-decomposition process. This study shows that the α-decomposition method can integrate failure information from the cause, component and system levels. It can parameterize the CCF risk significance of possible causes and can update the probability distributions of global α-factors. Besides, it provides a reliable way to evaluate uncertainty sources and reduce the uncertainty in probabilistic risk assessment. It is recommended to build databases including CCF parameters and the corresponding occurrence frequency of each cause for each targeted system.
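    The basic α-factor bookkeeping can be sketched with invented counts: n_k is the number of observed failure events involving exactly k components in a group of three, and the global α-factors are the event-multiplicity fractions. The cause-level decomposition is indicated only schematically, with assumed frequencies and coupling strengths.

```python
# Hypothetical event counts by failure multiplicity in a 3-component group
n = {1: 90, 2: 8, 3: 2}
total = sum(n.values())
alpha = {k: n_k / total for k, n_k in n.items()}   # global alpha-factors
print(alpha)

# Schematic alpha-decomposition: each cause contributes to a global factor
# in proportion to its occurrence frequency times its coupling strength
# (both values assumed purely for illustration).
causes = {'maintenance_error': (0.6, 0.15), 'design_flaw': (0.4, 0.35)}
contribution = {c: f * s for c, (f, s) in causes.items()}
print(contribution)
```

    In the paper the decomposed factors are inferred hierarchically with Bayesian updating rather than read off from point values as here.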

  14. Error threshold inference from Global Precipitation Measurement (GPM) satellite rainfall data and interpolated ground-based rainfall measurements in Metro Manila

    Ampil, L. J. Y.; Yao, J. G.; Lagrosas, N.; Lorenzo, G. R. H.; Simpas, J.

    2017-12-01

    The Global Precipitation Measurement (GPM) mission is a group of satellites that provides global observations of precipitation. Satellite-based observations act as an alternative when ground-based measurements are inadequate or unavailable. Data provided by satellites must, however, be validated for the data to be reliable and used effectively. In this study, the Integrated Multisatellite Retrievals for GPM (IMERG) Final Run v3 half-hourly product is validated by comparison against interpolated ground measurements derived from sixteen ground stations in Metro Manila. The area considered in this study is the region 14.4°-14.8° latitude and 120.9°-121.2° longitude, subdivided into twelve 0.1° × 0.1° grid squares. Satellite data from June 1 to August 31, 2014, aggregated to 1-day temporal resolution, are used in this study. The satellite data are also compared directly to measurements from individual ground stations, in contrast to the comparison against interpolated measurements, to determine the effect of the interpolation. The comparisons are quantified by a fractional root-mean-square error (F-RMSE) between the two datasets. The results show that interpolation reduces errors relative to using raw station data, except on days with very small amounts of rainfall. F-RMSE reaches extreme values of up to 654 without a rainfall threshold. A rainfall threshold is inferred to remove extreme error values and make the distribution of F-RMSE more consistent. Results show that the rainfall threshold varies slightly per month. The threshold for June is inferred to be 0.5 mm, reducing the maximum F-RMSE to 9.78, while the threshold for July and August is inferred to be 0.1 mm, reducing the maximum F-RMSE to 4.8 and 10.7, respectively. The maximum F-RMSE is reduced further as the threshold is increased. Maximum F-RMSE is reduced to 3.06 when a rainfall threshold of 10 mm is applied over the entire duration of JJA. These results indicate that
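    The metric itself is simple to sketch. The daily values below are invented, and the normalization used here (RMSE divided by mean observed rainfall, computed over days above the threshold) is an assumed form of the F-RMSE.

```python
# Invented daily rainfall pairs (mm/day)
ground =    [0.05, 0.5, 3.0, 12.0, 0.0, 7.5]   # interpolated gauge values
satellite = [0.90, 0.7, 2.5, 10.0, 0.2, 8.0]   # satellite estimates

def f_rmse(g, s, threshold=0.0):
    # Keep only days where observed rainfall exceeds the threshold,
    # then express the RMSE as a fraction of the mean observed rainfall.
    pairs = [(gi, si) for gi, si in zip(g, s) if gi > threshold]
    mse = sum((si - gi) ** 2 for gi, si in pairs) / len(pairs)
    mean_g = sum(gi for gi, _ in pairs) / len(pairs)
    return (mse ** 0.5) / mean_g

no_thresh = f_rmse(ground, satellite)
with_thresh = f_rmse(ground, satellite, threshold=0.1)
print(no_thresh > with_thresh)
```

    Raising the threshold removes the near-zero-rain days whose large relative errors inflate the fractional metric, which is the effect the study exploits.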

  15. Inference Instruction to Support Reading Comprehension for Elementary Students with Learning Disabilities

    Hall, Colby; Barnes, Marcia A.

    2017-01-01

    Making inferences during reading is a critical standards-based skill and is important for reading comprehension. This article supports the improvement of reading comprehension for students with learning disabilities (LD) in upper elementary grades by reviewing what is currently known about inference instruction for students with LD and providing…

  16. Towards a better understanding of the legibility bias in performance assessments: the case of gender-based inferences.

    Greifeneder, Rainer; Zelt, Sarah; Seele, Tim; Bottenberg, Konstantin; Alt, Alexander

    2012-09-01

    Handwriting legibility systematically biases evaluations in that highly legible handwriting results in more positive evaluations than less legible handwriting. Because performance assessments in educational contexts are not only based on computerized or multiple choice tests but often include the evaluation of handwritten work samples, understanding the causes of this bias is critical. This research was designed to replicate and extend the legibility bias in two tightly controlled experiments and to explore whether gender-based inferences contribute to its occurrence. A total of 132 students from a German university participated in one pre-test and two independent experiments. Participants were asked to read and evaluate several handwritten essays varying in content quality. Each essay was presented to some participants in highly legible handwriting and to other participants in less legible handwriting. In addition, the assignment of legibility to participant group was reversed from essay to essay, resulting in a mixed-factor design. The legibility bias was replicated in both experiments. Results suggest that gender-based inferences do not account for its occurrence. Rather it appears that fluency from legibility exerts a biasing impact on evaluations of content and author abilities. The legibility bias was shown to be genuine and strong. By refuting a series of alternative explanations, this research contributes to a better understanding of what underlies the legibility bias. The present research may inform those who grade on what to focus and thus help to better allocate cognitive resources when trying to reduce this important source of error. ©2011 The British Psychological Society.

  17. Permutation based decision making under fuzzy environment using Tabu search

    Mahdi Bashiri

    2012-04-01

Full Text Available One of the techniques used for Multiple Criteria Decision Making (MCDM) is the permutation method. In its classical form, the permutation method assumes that the weights and decision-matrix components are crisp. However, when group decision making is under consideration and decision makers cannot agree on a crisp value for the weights and decision-matrix components, fuzzy numbers should be used. In this article, the fuzzy permutation technique for MCDM problems is explained. The main deficiency of the permutation method is its long computational time, so a Tabu Search (TS) based algorithm is proposed to reduce it. A numerical example illustrates the proposed approach clearly. Then, some benchmark instances extracted from the literature are solved by the proposed TS. The analyses of the results show the proper performance of the proposed method.
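In outline, the permutation method scores every candidate ranking of the alternatives by the net weight of the pairwise criterion comparisons that agree with it. The sketch below is a hedged illustration (the centroid defuzzification of triangular fuzzy numbers and all names are assumptions of this example, not the paper's code); the exhaustive enumeration over `permutations` is exactly the factorial cost that the paper's Tabu Search is designed to avoid.

```python
from itertools import permutations

def defuzz(tfn):
    """Centroid of a triangular fuzzy number (a, b, c)."""
    a, b, c = tfn
    return (a + b + c) / 3.0

def permutation_rank(decision, weights):
    """decision[alt][crit] and weights[crit] are triangular fuzzy numbers.
    Returns the ranking (permutation of alternatives) with the highest
    net concordance, found by exhaustive enumeration."""
    scores = [[defuzz(x) for x in row] for row in decision]
    w = [defuzz(x) for x in weights]
    n_alt, n_crit = len(scores), len(w)
    best, best_net = None, float("-inf")
    for perm in permutations(range(n_alt)):
        net = 0.0
        # For every ordered pair (i ranked before j), add the weights of
        # criteria agreeing with that order and subtract disagreeing ones.
        for pos_i in range(n_alt):
            for pos_j in range(pos_i + 1, n_alt):
                i, j = perm[pos_i], perm[pos_j]
                for k in range(n_crit):
                    if scores[i][k] > scores[j][k]:
                        net += w[k]
                    elif scores[i][k] < scores[j][k]:
                        net -= w[k]
        if net > best_net:
            best, best_net = perm, net
    return best, best_net
```

For n alternatives this loop visits n! rankings, so beyond roughly ten alternatives a metaheuristic search over the permutation space, such as Tabu Search, becomes necessary.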

  18. Statistical inference

    Rohatgi, Vijay K

    2003-01-01

    Unified treatment of probability and statistics examines and analyzes the relationship between the two fields, exploring inferential issues. Numerous problems, examples, and diagrams--some with solutions--plus clear-cut, highlighted summaries of results. Advanced undergraduate to graduate level. Contents: 1. Introduction. 2. Probability Model. 3. Probability Distributions. 4. Introduction to Statistical Inference. 5. More on Mathematical Expectation. 6. Some Discrete Models. 7. Some Continuous Models. 8. Functions of Random Variables and Random Vectors. 9. Large-Sample Theory. 10. General Meth

  19. The implementation of two stages clustering (k-means clustering and adaptive neuro fuzzy inference system) for prediction of medicine need based on medical data

    Husein, A. M.; Harahap, M.; Aisyah, S.; Purba, W.; Muhazir, A.

    2018-03-01

Medication planning aims to obtain the types and amounts of medicine that match needs, and to avoid medicine shortages, based on patterns of disease. Medicine planning still relies on the ability and experience of leadership: it takes a long time and considerable skill, definite disease data are difficult to obtain, good record keeping and reporting are required, and dependence on the budget means planning often goes poorly, leading to frequent shortages and excesses of medicines. In this research, we propose the Adaptive Neuro Fuzzy Inference System (ANFIS) method to predict medication needs in 2016 and 2017 based on medical data in 2015 and 2016 from two hospital sources. The framework of analysis uses two approaches. The first phase implements ANFIS on a data source directly, while the second approach also uses ANFIS, but after clustering with the K-Means algorithm; for both approaches, Root Mean Square Error (RMSE) values are calculated for training and testing. The testing results show that the proposed method achieves better prediction rates in both quantitative and qualitative evaluation compared with existing systems; moreover, applying the K-Means algorithm before ANFIS affects the duration of the training process and provides significantly better classification accuracy than ANFIS without clustering.
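The two-stage scheme, clustering the inputs first and then fitting one local predictor per cluster, can be sketched as follows. This is a minimal stand-in in which an ordinary linear least-squares fit plays the role of the per-cluster ANFIS; the data and function names are invented for the example.

```python
import numpy as np

def kmeans(X, k, iters=100):
    """Plain Lloyd k-means with greedy farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(-1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        new = np.array([X[labels == j].mean(0) if np.any(labels == j) else centers[j]
                        for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers, labels

def fit_two_stage(X, y, k=2):
    """Stage 1: cluster the inputs; stage 2: fit one local model per cluster.
    A linear least-squares fit stands in here for the per-cluster ANFIS."""
    centers, labels = kmeans(X, k)
    models = []
    for j in range(k):
        mask = labels == j
        A = np.hstack([X[mask], np.ones((mask.sum(), 1))])
        coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        models.append(coef)
    return centers, models

def predict(X, centers, models):
    """Route each sample to its nearest cluster's local model."""
    labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
    A = np.hstack([X, np.ones((len(X), 1))])
    return np.array([A[i] @ models[labels[i]] for i in range(len(X))])
```

The clustering step only changes *which* samples each local model sees, which is why it mainly affects training time and local fit quality rather than the model family itself.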

  20. Evaluation of a new neutron energy spectrum unfolding code based on an Adaptive Neuro-Fuzzy Inference System (ANFIS).

    Hosseini, Seyed Abolfazl; Esmaili Paeen Afrakoti, Iman

    2018-01-17

The purpose of the present study was to reconstruct the energy spectrum of a poly-energetic neutron source using an algorithm developed based on an Adaptive Neuro-Fuzzy Inference System (ANFIS). ANFIS is a kind of artificial neural network based on the Takagi-Sugeno fuzzy inference system. The ANFIS algorithm uses the advantages of both fuzzy inference systems and artificial neural networks to improve the effectiveness of algorithms in various applications such as modeling, control and classification. The neutron pulse height distributions used as input data in the training procedure for the ANFIS algorithm were obtained from the simulations performed by the MCNPX-ESUT computational code (MCNPX-Energy engineering of Sharif University of Technology). Taking into account the normalization condition of each energy spectrum, 4300 neutron energy spectra were generated randomly. (The value in each bin was generated randomly, and finally a normalization of each generated energy spectrum was performed.) The randomly generated neutron energy spectra were considered as output data of the developed ANFIS computational code in the training step. To calculate the neutron energy spectrum using conventional methods, an inverse problem with an approximately singular response matrix (with the determinant of the matrix close to zero) should be solved. The solution of the inverse problem using the conventional methods unfolds the neutron energy spectrum with low accuracy. Application of iterative algorithms in the solution of such a problem, or utilizing intelligent algorithms (in which there is no need to solve the problem), is usually preferred for unfolding of the energy spectrum. Therefore, the main reason for development of intelligent algorithms like ANFIS for unfolding of neutron energy spectra is to avoid solving the inverse problem. In the present study, the unfolded neutron energy spectra of 252Cf and 241Am-9Be neutron sources using the developed computational code were

  1. Deep Learning for Population Genetic Inference.

    Sheehan, Sara; Song, Yun S

    2016-03-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.
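The likelihood-free recipe — draw parameters from a prior, push each draw through the simulator, reduce the simulated data to summary statistics, and train a network to map summaries back to parameters — can be sketched with a toy simulator. Everything here (an exponential-scale parameter, three summaries, a single hidden layer trained by plain gradient descent) is an illustrative assumption, far smaller than the deep architecture of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=200):
    """Draw a data set from the simulator (here: Exponential(scale=theta))
    and reduce it to summary statistics, as in likelihood-free inference."""
    x = rng.exponential(theta, n)
    return np.array([x.mean(), x.std(), np.median(x)])

# Training set: parameters drawn from the prior, summaries from the simulator.
thetas = rng.uniform(0.5, 5.0, 2000)
S = np.array([simulate(t) for t in thetas])
y = thetas[:, None]

# One-hidden-layer network trained with gradient descent on mean squared error.
W1 = rng.normal(0, 0.1, (3, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, (16, 1)); b2 = np.zeros(1)

lr, losses = 1e-3, []
for epoch in range(500):
    h = np.tanh(S @ W1 + b1)           # hidden features of the summaries
    pred = h @ W2 + b2                 # predicted parameter
    err = pred - y
    losses.append(float((err ** 2).mean()))
    # backpropagation through the single hidden layer
    gW2 = h.T @ err / len(S); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = S.T @ dh / len(S); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1
```

At no point is the likelihood of the data evaluated; the network learns the summaries-to-parameters map purely from simulated pairs, which is the essence of the framework described in the abstract.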

  2. Deep Learning for Population Genetic Inference.

    Sara Sheehan

    2016-03-01

Full Text Available Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.

  3. Deep Learning for Population Genetic Inference

    Sheehan, Sara; Song, Yun S.

    2016-01-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme. PMID:27018908

  4. Managing Operational Risk Related to Microfinance Lending Process using Fuzzy Inference System based on the FMEA Method: Moroccan Case Study

    Alaoui Youssef Lamrani

    2017-12-01

Full Text Available Managing operational risk efficiently is a critical factor for microfinance institutions (MFIs) to obtain a financial and social return. The purpose of this paper is to identify, assess and prioritize the root causes of failure within the microfinance lending process (MLP), especially in Moroccan microfinance institutions. Considering the limitations of the traditional failure mode and effect analysis (FMEA) method in assessing and classifying risks, the methodology adopted in this study focuses on developing a fuzzy logic inference system (FLIS) based on FMEA. This approach can take into account the subjectivity of risk indicators and the insufficiency of statistical data. The results show that Moroccan MFIs need to focus more on customer relationship management and give more importance to staff training, client screening and business analysis.

  5. Forecasting short-term power prices in the Ontario Electricity Market (OEM) with a fuzzy logic based inference system

    Arciniegas, Alvaro I.; Arciniegas Rueda, Ismael E.

    2008-01-01

The Ontario Electricity Market (OEM), which opened in May 2002, is relatively new and is still under change. In addition, the bidding strategies of the participants are such that the relationships between price and fundamentals are non-linear and dynamic. The lack of market maturity and the high complexity hinder the use of traditional statistical methodologies (e.g., regression analysis) for price forecasting. Therefore, a flexible model is needed to achieve good forecasting in the OEM. This paper uses a Takagi-Sugeno-Kang (TSK) fuzzy inference system in forecasting the one-day-ahead real-time peak price of the OEM. The forecasting results of TSK are compared with those obtained by traditional statistical and neural network based forecasting. The comparison suggests that TSK has considerable value in forecasting the one-day-ahead peak price in the OEM. (author)
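A first-order TSK system computes a firing strength for each rule from the input memberships and returns the strength-weighted average of the rules' linear consequents. The rules and numbers below are invented purely to show the mechanics, not the paper's fitted model.

```python
import numpy as np

def gauss(x, c, s):
    """Gaussian membership function centred at c with width s."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def tsk_predict(x, rules):
    """First-order Takagi-Sugeno-Kang inference: each rule is
    (centre, width, a, b) with a Gaussian membership on the input and a
    linear consequent y = a*x + b. The output is the firing-strength-weighted
    average of the consequents."""
    w = np.array([gauss(x, c, s) for (c, s, a, b) in rules])
    y = np.array([a * x + b for (c, s, a, b) in rules])
    return float((w * y).sum() / w.sum())

# Two hypothetical rules for a one-day-ahead peak-price forecast:
#   IF load is LOW  THEN price = 0.8*load + 5
#   IF load is HIGH THEN price = 1.5*load - 10
rules = [(20.0, 5.0, 0.8, 5.0),
         (40.0, 5.0, 1.5, -10.0)]
```

Between the rule centres the forecast blends smoothly from one local linear model to the other, which is what lets a TSK system capture the non-linear, regime-like price behaviour the abstract describes.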

  6. Staff background paper on performance-based rate making

    Fraser, J.; Brownell, B.

    1998-10-01

An alternative to the traditional cost of service (COS) regulation for electric utilities in British Columbia has been proposed. The alternative to pure COS regulation is performance-based rate making (PBR). PBR partially decouples a utility's rates from its costs and ties utility profits to performance relative to specific benchmarks. The motivation underlying PBR is that, ideally, it provides incentives for utilities to cost-effectively achieve pre-defined goals. This report describes the design of PBR mechanisms, base rate PBR formulas, and base rate PBR in other jurisdictions including New York, California, Maine and New Jersey. The report also describes gas procurement PBR in other jurisdictions, as well as the British Columbia Utilities Commission's own experience with PBR. In general, PBR has the potential to provide resource efficiency, allocative efficiency, support for introduction of new services, and reduced regulatory administrative costs. 15 refs., 4 tabs

  7. A formal model of interpersonal inference

    Michael eMoutoussis

    2014-03-01

Full Text Available Introduction: We propose that active Bayesian inference – a general framework for decision-making – can equally be applied to interpersonal exchanges. Social cognition, however, entails special challenges. We address these challenges through a novel formulation of a formal model and demonstrate its psychological significance. Method: We review relevant literature, especially with regards to interpersonal representations, formulate a mathematical model and present a simulation study. The model accommodates normative models from utility theory and places them within the broader setting of Bayesian inference. Crucially, we endow people's prior beliefs, into which utilities are absorbed, with preferences of self and others. The simulation illustrates the model's dynamics and furnishes elementary predictions of the theory. Results: 1. Because beliefs about self and others inform both the desirability and plausibility of outcomes, in this framework interpersonal representations become beliefs that have to be actively inferred. This inference, akin to 'mentalising' in the psychological literature, is based upon the outcomes of interpersonal exchanges. 2. We show how some well-known social-psychological phenomena (e.g. self-serving biases) can be explained in terms of active interpersonal inference. 3. Mentalising naturally entails Bayesian updating of how people value social outcomes. Crucially this includes inference about one's own qualities and preferences. Conclusion: We inaugurate a Bayes optimal framework for modelling intersubject variability in mentalising during interpersonal exchanges. Here, interpersonal representations are endowed with explicit functional and affective properties. We suggest the active inference framework lends itself to the study of psychiatric conditions where mentalising is distorted.
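The belief-updating ("mentalising") component can be illustrated with a minimal sketch: a Bayesian update of a distribution over a partner's preference type from observed exchange outcomes. The likelihood table and the types are invented for the example; the paper's active-inference model additionally couples such beliefs to action selection.

```python
import numpy as np

def update_beliefs(prior, outcomes, likelihood):
    """Bayesian 'mentalising' in miniature: update a belief distribution over
    a partner's preference type from observed exchange outcomes.
    likelihood[t, o] = p(outcome o | partner type t). All numbers are
    illustrative assumptions, not the paper's parameterization."""
    post = np.array(prior, dtype=float)
    for o in outcomes:
        post = post * likelihood[:, o]   # Bayes rule: prior times likelihood
        post = post / post.sum()         # renormalize
    return post
```

Repeated cooperative outcomes concentrate belief on the "cooperative partner" type; in the full model these posteriors in turn change which exchanges the agent expects and prefers.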

  8. Agent-Based Modeling of Consumer Decision making Process Based on Power Distance and Personality

    Roozmand, O.; Ghasem-Aghaee, N.; Hofstede, G.J.; Nematbakhsh, M.A.; Baraani, A.; Verwaart, T.

    2011-01-01

    Simulating consumer decision making processes involves different disciplines such as: sociology, social psychology, marketing, and computer science. In this paper, we propose an agent-based conceptual and computational model of consumer decision-making based on culture, personality and human needs.

  9. The relationships within the Chaitophorinae and Drepanosiphinae (Hemiptera, Aphididae) inferred from molecular-based phylogeny and comprehensive morphological data

    Wieczorek, Karina; Lachowska-Cierlik, Dorota; Kajtoch, Łukasz; Kanturski, Mariusz

    2017-01-01

The Chaitophorinae is a bionomically diverse Holarctic subfamily of Aphididae. The current classification includes two tribes: the Chaitophorini, associated with deciduous trees and shrubs, and the Siphini, which feed on monocotyledonous plants. We present the first phylogenetic hypothesis for the subfamily, based on molecular and morphological datasets. Molecular analyses were based on the mitochondrial gene cytochrome oxidase subunit I (COI) and the nuclear gene elongation factor-1α (EF-1α). Phylogenetic inferences were obtained individually on each of the genes and on joined alignments using Bayesian inference (BI) and Maximum likelihood (ML). In phylogenetic trees reconstructed on the basis of nuclear and mitochondrial genes as well as a morphological dataset, the monophyly of Siphini and the genus Chaitophorus was supported. Periphyllus forms lineages independent from Chaitophorus and Siphini. Within this genus two clades comprising European and Asiatic species, respectively, were indicated. Concerning relationships within the subfamily, EF-1α and joined COI and EF-1α analyses strongly support the hypothesis that Chaitophorini do not form a monophyletic clade. Periphyllus is a sister group to a clade containing Chaitophorus and Siphini. The Asiatic unit of Periphyllus also includes Trichaitophorus koyaensis. The analysis of the morphological dataset under equally weighted parsimony also supports the view that Chaitophorini is an artificial taxon, as Lambersaphis pruinosae and Pseudopterocomma hughi, both traditionally included in the Chaitophorini, formed independent lineages. COI analyses support consistent groups within the subfamily, but relationships between groups are poorly resolved. These analyses were extended to include the species of the closely related and phylogenetically unstudied subfamily Drepanosiphinae, which produced congruent results. The genera Drepanosiphum and Drepanaphis are monophyletic and sister. The position of Yamatocallis tokyoensis differs in the

  10. Data mining in forecasting PVT correlations of crude oil systems based on Type1 fuzzy logic inference systems

    El-Sebakhy, Emad A.

    2009-09-01

Pressure-volume-temperature properties are very important in reservoir engineering computations. There are many empirical approaches for predicting various PVT properties based on empirical correlations and statistical regression models. In the last decade, researchers utilized neural networks to develop more accurate PVT correlations. These achievements of neural networks open the door for data mining techniques to play a major role in the oil and gas industry. Unfortunately, the developed neural network correlations are often limited, and global correlations are usually less accurate compared to local correlations. Recently, adaptive neuro-fuzzy inference systems have been proposed as a new intelligence framework for both prediction and classification based on a fuzzy clustering optimization criterion and ranking. This paper proposes neuro-fuzzy inference systems for estimating PVT properties of crude oil systems. This new framework is an efficient hybrid intelligence machine learning scheme for modeling the kind of uncertainty associated with vagueness and imprecision. We briefly describe the learning steps and the use of the Takagi-Sugeno-Kang model and the Gustafson-Kessel clustering algorithm with K detected clusters from the given database. This approach has featured in a wide range of medical, power control system, and business journals, often with promising results. A comparative study is carried out to compare the performance of this new framework with the most popular modeling techniques, such as neural networks, nonlinear regression, and the empirical correlation algorithms. The results show that the performance of neuro-fuzzy systems is accurate and reliable, and outperforms most of the existing forecasting techniques. Future work can be achieved by using neuro-fuzzy systems for clustering 3D seismic data, identification of lithofacies types, and other reservoir characterization tasks.

  11. An Intelligent Fleet Condition-Based Maintenance Decision Making Method Based on Multi-Agent

    Bo Sun; Qiang Feng; Songjie Li

    2012-01-01

    According to the demand for condition-based maintenance online decision making among a mission oriented fleet, an intelligent maintenance decision making method based on Multi-agent and heuristic rules is proposed. The process of condition-based maintenance within an aircraft fleet (each containing one or more Line Replaceable Modules) based on multiple maintenance thresholds is analyzed. Then the process is abstracted into a Multi-Agent Model, a 2-layer model structure containing host negoti...

  12. A canonical correlation analysis-based dynamic bayesian network prior to infer gene regulatory networks from multiple types of biological data.

    Baur, Brittany; Bozdag, Serdar

    2015-04-01

One of the challenging and important computational problems in systems biology is to infer gene regulatory networks (GRNs) of biological systems. Several methods that exploit gene expression data have been developed to tackle this problem. In this study, we propose the use of copy number and DNA methylation data to infer GRNs. We developed an algorithm that scores regulatory interactions between genes based on canonical correlation analysis. In this algorithm, copy number or DNA methylation variables are treated as potential regulator variables, and expression variables are treated as potential target variables. We first validated that the canonical correlation analysis method is able to infer true interactions with high accuracy. We showed that the use of DNA methylation or copy number datasets leads to improved inference over steady-state expression. Our results also showed that epigenetic and structural information could be used to infer the directionality of regulatory interactions. Additional improvements in GRN inference can be gleaned from incorporating the result in an informative prior in a dynamic Bayesian algorithm. This is the first study that incorporates copy number and DNA methylation into an informative prior in a dynamic Bayesian framework. By closely examining top-scoring interactions with different sources of epigenetic or structural information, we also identified potential novel regulatory interactions.
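A hedged sketch of the scoring step: for each (regulator, target) pair, compute the leading canonical correlation between the two variable blocks (with single columns this reduces to the absolute Pearson correlation). The whitening-plus-SVD route below is one standard way to compute CCA; the data in the usage example are synthetic, not the study's.

```python
import numpy as np

def cca_first_corr(X, Y, eps=1e-9):
    """Leading canonical correlation between column blocks X and Y
    (samples x variables), via whitening and SVD."""
    X = X - X.mean(0)
    Y = Y - Y.mean(0)
    Sxx = X.T @ X / len(X) + eps * np.eye(X.shape[1])
    Syy = Y.T @ Y / len(Y) + eps * np.eye(Y.shape[1])
    Sxy = X.T @ Y / len(X)
    def inv_sqrt(S):
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T
    M = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    return float(np.linalg.svd(M, compute_uv=False)[0])

def score_interactions(regulators, targets):
    """Score every (regulator, target) pair. The regulator columns could be
    copy-number or methylation profiles, the target columns expression
    profiles (both samples x genes)."""
    n_r, n_t = regulators.shape[1], targets.shape[1]
    return np.array([[cca_first_corr(regulators[:, [i]], targets[:, [j]])
                      for j in range(n_t)] for i in range(n_r)])
```

Scores near 1 flag candidate regulatory edges; in the study such scores then feed an informative prior for dynamic Bayesian network learning rather than being used as the final network.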

  13. The Relative Success of Recognition-Based Inference in Multichoice Decisions

    McCloy, Rachel; Beaman, C. Philip; Smith, Philip T.

    2008-01-01

    The utility of an "ecologically rational" recognition-based decision rule in multichoice decision problems is analyzed, varying the type of judgment required (greater or lesser). The maximum size and range of a counterintuitive advantage associated with recognition-based judgment (the "less-is-more effect") is identified for a range of cue…
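The bookkeeping behind a less-is-more effect can be made concrete with the standard decomposition of paired-comparison accuracy into guessed, recognition-decided, and knowledge-decided pairs. The validity values below are illustrative assumptions; whenever recognition validity exceeds knowledge validity, an intermediate recognition rate beats recognizing everything.

```python
def rh_accuracy(N, n, alpha=0.8, beta=0.6):
    """Expected accuracy in a two-alternative 'which is greater?' task when
    n of N objects are recognized: guess (0.5) on pairs where neither object
    is recognized, apply recognition validity alpha on mixed pairs, and
    knowledge validity beta when both objects are recognized."""
    pairs = N * (N - 1)
    p_none = (N - n) * (N - n - 1) / pairs   # neither recognized
    p_one = 2 * n * (N - n) / pairs          # exactly one recognized
    p_both = n * (n - 1) / pairs             # both recognized
    return 0.5 * p_none + alpha * p_one + beta * p_both
```

With alpha = 0.8 and beta = 0.6, recognizing 15 of 20 objects yields higher expected accuracy than recognizing all 20, which is the counterintuitive "less-is-more" advantage the article analyzes.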

  14. Development of methods for inferring cloud thickness and cloud-base height from satellite radiance data

    Smith, William L., Jr.; Minnis, Patrick; Alvarez, Joseph M.; Uttal, Taneil; Intrieri, Janet M.; Ackerman, Thomas P.; Clothiaux, Eugene

    1993-01-01

Cloud-top height is a major factor determining the outgoing longwave flux at the top of the atmosphere. The downwelling radiation from the cloud strongly affects the cooling rate within the atmosphere and the longwave radiation incident at the surface. Thus, determination of cloud-base temperature is important for proper calculation of fluxes below the cloud. Cloud-base altitude is also an important factor in aircraft operations. Cloud-top height or temperature can be derived in a straightforward manner using satellite-based infrared data. Cloud-base temperature, however, is not observable from the satellite, but is related to the height, phase, and optical depth of the cloud in addition to other variables. This study uses surface and satellite data taken during the First ISCCP Regional Experiment (FIRE) Phase-2 Intensive Field Observation (IFO) period (13 Nov. - 7 Dec. 1991) to improve techniques for deriving cloud-base height from conventional satellite data.

  15. DEVELOPING INCIDENT DETECTION ALGORITHM BASED ON THE MAMDANI FUZZY INFERENCE ALGORITHM

    Andrey Borisovich Nikolaev

    2017-09-01

Full Text Available Applying fuzzy logic in an incident detection system allows decisions to be made under uncertainty. The incident detection phase is a process of finding difficulties in traffic. A difficulty in traffic is the main sign that a road accident has occurred, and it requires a reaction for its elimination. This leads to the use of input data that must be relevant to the vehicles and the road. These data must be considered together and compared with the corresponding values for further analysis. The main parameters of the traffic flow that can characterize its current state are the flow rate and the volume flow. It is necessary to analyze the input data received from the sensors. After processing the input data using the previously entered fuzzy rules, actions will be taken that improve the traffic situation, or at least prevent it from worsening.
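A minimal Mamdani pipeline — triangular memberships, min for rule firing, max for aggregation, centroid defuzzification — might look like the sketch below. The two rules, the normalized inputs and the membership parameters are invented for illustration, not taken from the article.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

def mamdani(flow, speed):
    """Two illustrative rules (a real detector would use many more), with
    flow and speed normalized to [0, 1]:
      R1: IF flow is HIGH AND speed is LOW  THEN incident risk is HIGH
      R2: IF flow is LOW  OR  speed is HIGH THEN incident risk is LOW
    min models AND, max models OR and aggregation; the crisp output is the
    centroid of the aggregated output fuzzy set."""
    z = np.linspace(0.0, 1.0, 101)            # discretized risk universe
    high_risk = tri(z, 0.5, 1.0, 1.5)
    low_risk = tri(z, -0.5, 0.0, 0.5)
    w1 = min(tri(flow, 0.5, 1.0, 1.5), tri(speed, -0.5, 0.0, 0.5))
    w2 = max(tri(flow, -0.5, 0.0, 0.5), tri(speed, 0.5, 1.0, 1.5))
    agg = np.maximum(np.minimum(w1, high_risk), np.minimum(w2, low_risk))
    if agg.sum() == 0:
        return 0.5                            # no rule fired: stay neutral
    return float((z * agg).sum() / agg.sum())
```

High flow combined with low speed (the congestion signature of an incident) drives the defuzzified risk toward 1, while free-flowing traffic drives it toward 0.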

  16. Inference for exponentiated general class of distributions based on record values

    Samah N. Sindi

    2017-09-01

Full Text Available The main objective of this paper is to suggest and study a new exponentiated general class (EGC) of distributions. Maximum likelihood, Bayesian and empirical Bayesian estimators of the parameter of the EGC of distributions based on lower record values are obtained. Furthermore, Bayesian prediction of future records is considered. Based on lower record values, the exponentiated Weibull distribution, its special cases of distributions and the exponentiated Gompertz distribution are applied to the EGC of distributions.
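For the simplest case, the ML estimator of the EGC exponent has a closed form. Writing F(x) = G(x)^theta with the baseline CDF G assumed fully known, the joint density of the first n lower records r_1 > ... > r_n is f(r_n) * prod_{i<n} f(r_i)/F(r_i); its logarithm is n*ln(theta) + theta*ln G(r_n) plus theta-free terms, so theta_hat = -n / ln G(r_n). A hedged sketch of just this estimator (the paper's Bayesian and empirical Bayesian estimators are not shown):

```python
import math

def egc_theta_mle(records, G):
    """ML estimate of theta in F(x) = G(x)**theta from lower record values
    r_1 > r_2 > ... > r_n, with the baseline CDF G assumed known.
    Only the last (smallest) record enters the closed form."""
    r_n = records[-1]
    return -len(records) / math.log(G(r_n))
```

Note that only the most recent lower record matters: earlier records cancel out of the likelihood's theta-dependent part.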

  17. Stochastic Watershed Models for Risk Based Decision Making

    Vogel, R. M.

    2017-12-01

Over half a century ago, the Harvard Water Program introduced the field of operational or synthetic hydrology, providing stochastic streamflow models (SSMs) that could generate ensembles of synthetic streamflow traces useful for hydrologic risk management. The application of SSMs, based on streamflow observations alone, revolutionized water resources planning activities, yet has fallen out of favor due, in part, to their inability to account for the now nearly ubiquitous anthropogenic influences on streamflow. This commentary advances the modern equivalent of SSMs, termed 'stochastic watershed models' (SWMs), useful as input to nearly all modern risk-based water resource decision-making approaches. SWMs are deterministic watershed models implemented using stochastic meteorological series, model parameters and model errors to generate ensembles of streamflow traces that represent the variability in possible future streamflows. SWMs combine deterministic watershed models, which are ideally suited to accounting for anthropogenic influences, with recent developments in uncertainty analysis and principles of stochastic simulation.
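The SWM idea can be caricatured in a few lines: a deterministic watershed model (here a single soil-moisture bucket, a stand-in for a calibrated model) is driven by an ensemble of stochastic meteorological series and perturbed by multiplicative model error. All parameter values are invented for illustration.

```python
import numpy as np

def bucket_model(precip, pet, k=0.1, smax=150.0):
    """Minimal deterministic watershed model: one soil-moisture bucket with
    evapotranspiration losses and a linear-reservoir release."""
    s, flows = smax / 2.0, []
    for p, e in zip(precip, pet):
        s = min(max(s + p - e, 0.0), smax)  # storage update, bounded
        q = k * s                           # linear-reservoir streamflow
        s -= q
        flows.append(q)
    return np.array(flows)

def stochastic_watershed(n_traces=100, n_days=365, seed=0):
    """SWM ensemble: stochastic meteorology driving the deterministic model,
    plus multiplicative lognormal model error on the simulated flows."""
    rng = np.random.default_rng(seed)
    traces = []
    for _ in range(n_traces):
        wet = rng.random(n_days) < 0.3                  # wet-day occurrence
        precip = rng.gamma(0.4, 10.0, n_days) * wet     # daily rainfall depth
        pet = np.full(n_days, 2.0)                      # constant PET, for brevity
        err = rng.lognormal(mean=0.0, sigma=0.2, size=n_days)
        traces.append(bucket_model(precip, pet) * err)
    return np.array(traces)
```

Each row of the returned array is one synthetic streamflow trace; the spread across rows is the ensemble variability that risk-based decision methods consume.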

  18. Risk-based decision making for terrorism applications.

    Dillon, Robin L; Liebe, Robert M; Bestafka, Thomas

    2009-03-01

    This article describes the anti-terrorism risk-based decision aid (ARDA), a risk-based decision-making approach for prioritizing anti-terrorism measures. The ARDA model was developed as part of a larger effort to assess investments for protecting U.S. Navy assets at risk and determine whether the most effective anti-terrorism alternatives are being used to reduce the risk to the facilities and war-fighting assets. With ARDA and some support from subject matter experts, we examine thousands of scenarios composed of 15 attack modes against 160 facility types on two installations and hundreds of portfolios of 22 mitigation alternatives. ARDA uses multiattribute utility theory to solve some of the commonly identified challenges in security risk analysis. This article describes the process and documents lessons learned from applying the ARDA model for this application.

  19. Studies in using a universal exchange and inference language for evidence based medicine. Semi-automated learning and reasoning for PICO methodology, systematic review, and environmental epidemiology.

    Robson, Barry

    2016-12-01

    The Q-UEL language of XML-like tags and the associated software applications are providing a valuable toolkit for Evidence Based Medicine (EBM). In this paper the already existing applications, data bases, and tags are brought together with new ones. The particular Q-UEL embodiment used here is the BioIngine. The main challenge is one of bringing together the methods of symbolic reasoning and calculative probabilistic inference that underlie EBM and medical decision making. Some space is taken to review this background. The unification is greatly facilitated by Q-UEL's roots in the notation and algebra of Dirac, and by extending Q-UEL into the Wolfram programming environment. Further, the overall problem of integration is also a relatively simple one because of the nature of Q-UEL as a language for interoperability in healthcare and biomedicine, while the notion of workflow is facilitated because of the EBM best practice known as PICO. What remains difficult is achieving a high degree of overall automation because of a well-known difficulty in capturing human expertise in computers: the Feigenbaum bottleneck. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Hybrid Optical Inference Machines

    1991-09-27

with labels. Now a set of facts can be generated in the dyadic form "u, R 1,2". Eichmann and Caulfield [19] consider the same type of encoding schemes. These architectures are based primarily on optical inner... [19. G. Eichmann and H. J. Caulfield, "Optical Learning (Inference)..."]

  1. Flare parameters inferred from a 3D loop model data base

    Cuambe, Valente A.; Costa, J. E. R.; Simões, P. J. A.

    2018-06-01

We developed a data base of pre-calculated flare images and spectra exploring a set of parameters which describe the physical characteristics of coronal loops and accelerated electron distribution. Due to the large number of parameters involved in describing the geometry and the flaring atmosphere in the model used, we built a large data base of models (~250 000) to facilitate the flare analysis. The geometry and characteristics of non-thermal electrons are defined on a discrete grid with spatial resolution greater than 4 arcsec. The data base was constructed based on general properties of known solar flares and convolved with instrumental resolution to replicate the observations from the Nobeyama radio polarimeter spectra and Nobeyama radioheliograph (NoRH) brightness maps. Observed spectra and brightness distribution maps are easily compared with the modelled spectra and images in the data base, indicating a possible range of solutions. The parameter search efficiency in this finite data base is discussed. 8 out of 10 parameters analysed for 1000 simulated flare searches were recovered with a relative error of less than 20 per cent on average. In addition, from the analysis of the observed correlation between NoRH flare sizes and intensities at 17 GHz, some statistical properties were derived. From these statistics, the energy spectral index was found to be δ ~ 3, with non-thermal electron densities showing a peak distribution ⪅10^7 cm^-3, and B_photosphere ⪆ 2000 G. Some bias for larger loops with heights as great as ~2.6 × 10^9 cm, and looptop events were noted. An excellent match of the spectrum and the brightness distribution at 17 and 34 GHz of the 2002 May 31 flare is presented as well.

  2. Knowledge Representation and Inference for Analysis and Design of Database and Tabular Rule-Based Systems

    Antoni Ligeza

    2001-01-01

    Rule-based systems constitute a powerful tool for the specification of knowledge in the design and implementation of knowledge-based systems. They also provide a universal programming paradigm for domains such as intelligent control, decision support, situation classification and operational knowledge encoding. To assure safe and reliable performance, such systems should satisfy certain formal requirements, including completeness and consistency. This paper addresses the analysis and verification of selected properties of such systems in a systematic way. A uniform, tabular scheme of single-level rule-based systems is considered. Such systems can be applied as a generalized form of databases for the specification of data patterns (unconditional knowledge), or can be used for defining attributive decision tables (conditional knowledge in the form of rules). They can also serve as lower-level components of hierarchical multi-level control and decision-support knowledge-based systems. An algebraic knowledge representation paradigm using an extended tabular representation, similar to relational database tables, is presented, and algebraic bases for system analysis, verification and design support are outlined.
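
    As a toy illustration of the completeness and consistency properties this record discusses, the sketch below checks a single-level attributive decision table; the dict-based encoding of rules over finite attribute domains is an illustrative assumption, not the paper's algebraic formalism.

```python
from itertools import product

def check_rules(domains, rules):
    """Check a single-level tabular rule base for completeness and consistency.

    domains: dict attribute -> list of admissible values.
    rules:   list of (condition, decision); condition is a dict
             attribute -> required value (absent attribute = "any value").
    Returns (uncovered_states, conflicting_rule_pairs).
    """
    attrs = sorted(domains)

    def matches(cond, state):
        return all(state[a] == v for a, v in cond.items())

    # Completeness: every state of the input space must fire some rule.
    uncovered = []
    for values in product(*(domains[a] for a in attrs)):
        state = dict(zip(attrs, values))
        if not any(matches(cond, state) for cond, _ in rules):
            uncovered.append(state)

    # Consistency (determinism): no two rules may fire on the same
    # state while prescribing different decisions.
    conflicts = []
    for i in range(len(rules)):
        for j in range(i + 1, len(rules)):
            (c1, d1), (c2, d2) = rules[i], rules[j]
            if d1 != d2:
                merged = {**c1, **c2}
                # The two conditions overlap iff every attribute they
                # share is required to take the same value.
                if all(c1.get(a, v) == v and c2.get(a, v) == v
                       for a, v in merged.items()):
                    conflicts.append((i, j))
    return uncovered, conflicts
```

    Here a table over `temp` and `press` that leaves one input state unmatched is reported as incomplete, and any pair of overlapping rules with different decisions as inconsistent.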

  3. Space-Time Joint Interference Cancellation Using Fuzzy-Inference-Based Adaptive Filtering Techniques in Frequency-Selective Multipath Channels

    Hu, Chia-Chang; Lin, Hsuan-Yu; Chen, Yu-Fan; Wen, Jyh-Horng

    2006-12-01

    An adaptive minimum mean-square error (MMSE) array receiver based on the fuzzy-logic recursive least-squares (RLS) algorithm is developed for asynchronous DS-CDMA interference suppression in the presence of frequency-selective multipath fading. This receiver employs a fuzzy-logic control mechanism to perform a nonlinear mapping of the squared error and the squared-error variation into a forgetting factor. For real-time applicability, a computationally efficient version of the proposed receiver is derived based on the least-mean-square (LMS) algorithm with a fuzzy-inference-controlled step size. This receiver provides both fast convergence/tracking capability and small steady-state misadjustment compared with conventional LMS- and RLS-based MMSE DS-CDMA receivers. Simulations show that the fuzzy-logic LMS and RLS algorithms outperform other variable step-size LMS (VSS-LMS) and variable forgetting factor RLS (VFF-RLS) algorithms by at least 3 dB and 1.5 dB, respectively, in bit-error rate (BER) for multipath fading channels.
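
    The error-driven step-size idea can be caricatured in a few lines. The smooth heuristic mapping below is a crude stand-in for the paper's fuzzy inference mechanism, and all names and constants are illustrative, not taken from the paper.

```python
import random

def vss_lms(x, d, n_taps=4, mu_min=0.005, mu_max=0.05):
    """LMS system identification with an error-driven variable step size.

    Stand-in for a fuzzy-inference-controlled step size: mu is pushed
    toward mu_max while the squared error is large (fast tracking) and
    decays toward mu_min as it shrinks (small misadjustment).
    """
    w = [0.0] * n_taps
    mu = mu_max
    sq_errors = []
    for n in range(n_taps, len(x)):
        tap = x[n - n_taps:n][::-1]                 # newest sample first
        y = sum(wi * xi for wi, xi in zip(w, tap))  # filter output
        e = d[n] - y
        # Heuristic "inference": first-order smoothing of e^2 drives mu.
        mu = max(mu_min, min(mu_max, 0.9 * mu + 0.1 * mu_max * min(1.0, e * e)))
        w = [wi + 2.0 * mu * e * xi for wi, xi in zip(w, tap)]
        sq_errors.append(e * e)
    return w, sq_errors
```

    Identifying a known 4-tap channel this way shows the squared error collapsing after the initial fast-tracking phase, which is the convergence/misadjustment trade-off the record describes.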

  4. Space-Time Joint Interference Cancellation Using Fuzzy-Inference-Based Adaptive Filtering Techniques in Frequency-Selective Multipath Channels

    Chen Yu-Fan

    2006-01-01

    An adaptive minimum mean-square error (MMSE) array receiver based on the fuzzy-logic recursive least-squares (RLS) algorithm is developed for asynchronous DS-CDMA interference suppression in the presence of frequency-selective multipath fading. This receiver employs a fuzzy-logic control mechanism to perform a nonlinear mapping of the squared error and the squared-error variation into a forgetting factor. For real-time applicability, a computationally efficient version of the proposed receiver is derived based on the least-mean-square (LMS) algorithm with a fuzzy-inference-controlled step size. This receiver provides both fast convergence/tracking capability and small steady-state misadjustment compared with conventional LMS- and RLS-based MMSE DS-CDMA receivers. Simulations show that the fuzzy-logic LMS and RLS algorithms outperform other variable step-size LMS (VSS-LMS) and variable forgetting factor RLS (VFF-RLS) algorithms by at least 3 dB and 1.5 dB, respectively, in bit-error rate (BER) for multipath fading channels.

  5. Towards a framework for threaded inference in rule-based systems

    Luis Casillas Santillan

    2013-11-01

    Information and communication technologies have shown significant advances and a fast pace in their performance and pervasiveness. Knowledge has become a significant asset for organizations, which need to deal with large amounts of data and information to produce valuable knowledge. Dealing with knowledge is becoming a pivotal activity for organizations in the new economy. One way to achieve the goal of knowledge management is the use of rule-based systems; this kind of approach is a new opportunity for expert-system technology. Modern languages and cheap computing allow the implementation of concurrent systems for dealing with huge volumes of information in organizations. The present work proposes the use of contemporary programming elements, such as easy-to-exploit threading, when implementing rule-based treatment over huge data volumes.
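
    A minimal sketch of the threaded rule matching this record advocates, using Python's standard concurrent.futures; the rule and fact shapes are invented for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

def threaded_match(rules, facts, workers=4):
    """Evaluate each rule's condition against a fact base in parallel.

    rules: list of (name, predicate, conclusion); predicate takes the
    fact dict and returns True when the rule fires. Matching here is
    read-only, so worker threads need no locking; conclusions are
    collected and applied by the single coordinating thread.
    """
    def fire(rule):
        name, predicate, conclusion = rule
        return (name, conclusion) if predicate(facts) else None

    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Executor.map preserves rule order, so firing order is stable.
        hits = [r for r in pool.map(fire, rules) if r is not None]
    return hits
```

    Keeping the match phase side-effect-free is what makes the naive threading safe; applying conclusions (which mutate the fact base) stays sequential.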

  6. Bootstrap-based procedures for inference in nonparametric receiver-operating characteristic curve regression analysis.

    Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G

    2018-03-01

    Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
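
    The flavour of the proposed bootstrap tests (implemented properly in the authors' npROCRegression package) can be conveyed by a percentile-bootstrap interval for the AUC difference between two covariate strata. This discretized two-group version is a deliberate simplification of the continuous-covariate, curve-level setting in the paper.

```python
import random

def auc(pos, neg):
    """Empirical AUC in Mann-Whitney form: P(diseased value > healthy
    value), counting ties as 1/2."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def bootstrap_auc_diff(pos_a, neg_a, pos_b, neg_b, n_boot=2000, seed=1):
    """Percentile bootstrap CI for the AUC difference between two
    covariate strata; an interval excluding 0 suggests the covariate
    affects diagnostic accuracy."""
    rng = random.Random(seed)
    resample = lambda xs: [rng.choice(xs) for _ in xs]
    diffs = []
    for _ in range(n_boot):
        diffs.append(auc(resample(pos_a), resample(neg_a))
                     - auc(resample(pos_b), resample(neg_b)))
    diffs.sort()
    return diffs[int(0.025 * n_boot)], diffs[int(0.975 * n_boot)]
```

    With a perfectly separating stratum against an uninformative one, the interval sits well above zero.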

  7. Health technology assessment, value-based decision making, and innovation.

    Henshall, Chris; Schuller, Tara

    2013-10-01

    Identifying treatments that offer value and value for money is becoming increasingly important, with interest in how health technology assessment (HTA) and decision makers can take appropriate account of what is of value to patients and to society, and in the relationship between innovation and assessments of value. This study summarizes points from a Health Technology Assessment International (HTAi) Policy Forum discussion, drawing on presentations, discussions among attendees, and background papers. Various perspectives on value were considered; most place patient health at the core of value. Wider elements of value comprise other benefits for: patients; caregivers; the health and social care systems; and society. Most decision-making systems seek to take account of similar elements of value, although they are assessed and combined in different ways. Judgment in decisions remains important and cannot be replaced by mathematical approaches. There was discussion of the value of innovation and of the effects of value assessments on innovation. Discussion also included moving toward "progressive health system decision making," an ongoing process whereby evidence-based decisions on use would be made at various stages in the technology lifecycle. Five actions are identified: (i) development of a general framework for the definition and assessment of value; development by HTA/coverage bodies and regulators of (ii) disease-specific guidance and (iii) further joint scientific advice for industry on demonstrating value; (iv) development of a framework for progressive licensing, usage, and reimbursement; and (v) promoting work to better adapt HTA, coverage, and procurement approaches to medical devices.

  8. Likelihood-based inference for cointegration with nonlinear error-correction

    Kristensen, Dennis; Rahbek, Anders Christian

    2010-01-01

    We consider a class of nonlinear vector error correction models where the transfer function (or loadings) of the stationary relationships is nonlinear. This includes in particular the smooth transition models. A general representation theorem is given which establishes the dynamic properties...... and a linear trend in general. Gaussian likelihood-based estimators are considered for the long-run cointegration parameters and the short-run parameters. Asymptotic theory is provided for these, and it is discussed to what extent asymptotic normality and mixed normality can be found. A simulation study...

  9. Variable selection for confounder control, flexible modeling and Collaborative Targeted Minimum Loss-based Estimation in causal inference

    Schnitzer, Mireille E.; Lok, Judith J.; Gruber, Susan

    2015-01-01

    This paper investigates the appropriateness of integrating flexible propensity score modeling (nonparametric or machine learning approaches) into semiparametric models for the estimation of a causal quantity, such as the mean outcome under treatment. We begin with an overview of some of the issues involved in knowledge-based and statistical variable selection in causal inference and the potential pitfalls of automated selection based on the fit of the propensity score. Using a simple example, we directly show the consequences of adjusting for pure causes of the exposure when using inverse probability of treatment weighting (IPTW). Such variables are likely to be selected when using a naive approach to model selection for the propensity score. We describe how the method of Collaborative Targeted minimum loss-based estimation (C-TMLE; van der Laan and Gruber, 2010) capitalizes on the collaborative double robustness property of semiparametric efficient estimators to select covariates for the propensity score based on the error in the conditional outcome model. Finally, we compare several approaches to automated variable selection in low- and high-dimensional settings through a simulation study. From this simulation study, we conclude that using IPTW with flexible prediction for the propensity score can result in inferior estimation, while Targeted minimum loss-based estimation and C-TMLE may benefit from flexible prediction and remain robust to the presence of variables that are highly correlated with treatment. However, in our study, standard influence function-based methods for the variance underestimated the standard errors, resulting in poor coverage under certain data-generating scenarios. PMID:26226129
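
    The IPTW mechanics the simulation study builds on can be shown on a toy confounded sample with known propensity scores (in practice the propensities are estimated, which is exactly where the flexible-modelling pitfalls arise). A Hájek-normalized sketch:

```python
def iptw_mean(outcomes, treated, propensity):
    """Hajek-normalized IPTW estimate of the mean outcome under treatment.

    treated[i] in {0, 1}; propensity[i] = P(treated | covariates_i),
    assumed known here purely for illustration.
    """
    weights = [t / p for t, p in zip(treated, propensity)]
    return sum(w * y for w, y in zip(weights, outcomes)) / sum(weights)
```

    On a toy population where a binary covariate raises both the treatment probability (0.8 vs 0.2) and the treated outcome (2 vs 1), the naive treated-only mean is biased upward while the weighted estimate recovers the population mean outcome under treatment.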

  10. Model-free prediction and regression a transformation-based approach to inference

    Politis, Dimitris N

    2015-01-01

    The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, co...

  11. Collinearity analysis of Brassica A and C genomes based on an updated inferred unigene order

    Ian Bancroft

    2015-06-01

    This data article includes SNP scoring across lines of the Brassica napus TNDH population based on Illumina sequencing of mRNA, expanded to 75 lines. The 21,323 mapped markers defined 887 recombination bins, representing an updated genetic linkage map for the species. Based on this new map, 5 genome sequence scaffolds were split and the order and orientation of scaffolds updated to establish a new pseudomolecule specification. The order of unigenes and SNP array probes within these pseudomolecules was determined. Unigenes were assessed for sequence similarity to the A and C genomes. The 57,246 that mapped to both enabled the collinearity of the A and C genomes to be illustrated graphically. Although the great majority were in collinear positions, some were not. Analyses of 60 such instances are presented, suggesting that the breakdown in collinearity was largely due to either the absence of the homoeologue in one genome (resulting in a sequence match to a paralogue) or the presence of multiple similar sequences. The mRNAseq datasets for the TNDH lines are available from the SRA repository (ERA283648); the remaining datasets are supplied with this article.

  12. Bayesian inference based modelling for gene transcriptional dynamics by integrating multiple source of knowledge

    Wang Shu-Qiang

    2012-07-01

    Background: A key challenge in the post-genome era is to identify genome-wide transcriptional regulatory networks, which specify the interactions between transcription factors and their target genes. Numerous methods have been developed for reconstructing gene regulatory networks from expression data. However, most of them are based on coarse-grained qualitative models and cannot provide a quantitative view of regulatory systems. Results: A binding-affinity-based regulatory model is proposed to quantify the transcriptional regulatory network. Multiple quantities, including binding affinity and the activity level of the transcription factor (TF), are incorporated into a general learning model. The sequence features of the promoter and the possible occupancy of nucleosomes are exploited to estimate the binding probability of regulators. Compared with previous models that employ only microarray data, the proposed model can bridge the gap between the relative background frequency of the observed nucleotide and the gene's transcription rate. Conclusions: We test the proposed approach on two real-world microarray datasets. Experimental results show that the proposed model can effectively identify the parameters and the activity level of the TF. Moreover, the kinetic parameters introduced in the proposed model reveal more biological insight than previous models do.

  13. PHYLOViZ: phylogenetic inference and data visualization for sequence based typing methods

    Francisco Alexandre P

    2012-05-01

    Background: With the decrease of DNA sequencing costs, sequence-based typing methods are rapidly becoming the gold standard for epidemiological surveillance. These methods provide the reproducible and comparable results needed for global-scale bacterial population analysis, while retaining their usefulness for local epidemiological surveys. Online databases that collect the generated allelic profiles and associated epidemiological data are available, but this wealth of data remains underused and is frequently poorly annotated, since no user-friendly tool exists to analyze and explore it. Results: PHYLOViZ is platform-independent Java software that allows the integrated analysis of sequence-based typing methods, including SNP data generated from whole-genome sequencing approaches, and associated epidemiological data. goeBURST and its Minimum Spanning Tree expansion are used for visualizing the possible evolutionary relationships between isolates. The results can be displayed as an annotated graph, overlaying the query results of any other epidemiological data available. Conclusions: PHYLOViZ is user-friendly software that allows the combined analysis of multiple data sources for microbial epidemiological and population studies. It is freely available at http://www.phyloviz.net.
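
    The goeBURST idea of linking allelic profiles through a minimum spanning tree can be sketched with Prim's algorithm over pairwise Hamming distances; the real goeBURST additionally applies tie-break rules on counts of single- and double-locus variants, which this sketch omits.

```python
def hamming(a, b):
    """Number of loci at which two allelic profiles differ."""
    return sum(x != y for x, y in zip(a, b))

def mst(profiles):
    """Prim's algorithm over pairwise Hamming distances.

    profiles: list of equal-length allele-number tuples.
    Returns a list of (node_in_tree, new_node, distance) edges.
    """
    n = len(profiles)
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        # Cheapest edge crossing the cut between tree and non-tree nodes.
        d, i, j = min((hamming(profiles[i], profiles[j]), i, j)
                      for i in in_tree for j in range(n) if j not in in_tree)
        edges.append((i, j, d))
        in_tree.add(j)
    return edges
```

    Close profiles (single-locus variants) are chained together first; an outlier strain attaches to the tree only through a long edge.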

  14. Protein-protein interaction inference based on semantic similarity of Gene Ontology terms.

    Zhang, Shu-Bo; Tang, Qiang-Rong

    2016-07-21

    Identifying protein-protein interactions is important in molecular biology. Experimental methods for this task have their limitations, and computational approaches have attracted increasing attention from the biological community. The semantic similarity derived from Gene Ontology (GO) annotation has been regarded as one of the most powerful indicators of protein interaction. However, conventional methods based on GO similarity fail to take advantage of the specificity of GO terms in the ontology graph. We propose a GO-based method to predict protein-protein interaction by integrating different kinds of similarity measures derived from the intrinsic structure of the GO graph. We extended five existing methods to derive semantic similarity measures from the descending part of two GO terms in the GO graph, then adopted a feature-integration strategy that combines both the ascending and descending similarity scores derived from the three sub-ontologies to construct various kinds of features characterizing each protein pair. Support vector machines (SVMs) were employed as discriminative classifiers, and five-fold cross-validation experiments were conducted on both human and yeast protein-protein interaction datasets to evaluate the performance of the different kinds of integrated features. The experimental results show the best performance for the feature that combines information from both the ascending and descending parts of the three ontologies. Our method is appealing for effective prediction of protein-protein interaction. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Neuro-fuzzy controller of low head hydropower plants using adaptive-network based fuzzy inference system

    Djukanovic, M.B. [Inst. Nikola Tesla, Belgrade (Yugoslavia). Dept. of Power Systems; Calovic, M.S. [Univ. of Belgrade (Yugoslavia). Dept. of Electrical Engineering; Vesovic, B.V. [Inst. Mihajlo Pupin, Belgrade (Yugoslavia). Dept. of Automatic Control; Sobajic, D.J. [Electric Power Research Inst., Palo Alto, CA (United States)

    1997-12-01

    This paper presents an approach to the nonlinear, multivariable control of low-head hydropower plants using an adaptive-network-based fuzzy inference system (ANFIS). The new design technique enhances fuzzy controllers with a self-learning capability for achieving prescribed control objectives in a near-optimal manner. The controller has the flexibility to accept more sensory information, with the main goal of improving the generator unit transients by adjusting the exciter input and the wicket gate and runner blade positions. The developed ANFIS controller, whose control signals are adjusted using incomplete on-line measurements, can offer better damping of generator oscillations over a wide range of operating conditions than conventional controllers. Digital simulations of a hydropower plant equipped with a low-head Kaplan turbine are performed, and comparisons of conventional excitation-governor control, state-feedback optimal control and ANFIS-based output feedback control are presented. To demonstrate the effectiveness of the proposed control scheme and the robustness of the acquired neuro-fuzzy controller, the controller has been implemented on a complex high-order nonlinear hydrogenerator model.

  16. Bridge Performance Assessment Based on an Adaptive Neuro-Fuzzy Inference System with Wavelet Filter for the GPS Measurements

    Mosbeh R. Kaloop

    2015-10-01

    This study describes the performance assessment of the Huangpu Bridge in Guangzhou, China, based on long-term monitoring in real time by the real-time kinematic global positioning system (RTK-GPS) technique. Wavelet transform de-noising is applied to filter the GPS measurements, while the adaptive neuro-fuzzy inference system (ANFIS) time-series output-only model is used to predict the deformations of GPS bridge-monitoring points. In addition, GPS and accelerometer monitoring systems are used to evaluate the bridge oscillation performance. The conclusions drawn from the numerical results show that: (1) wavelet de-noising of the GPS measurements at the different recording points on the bridge is a suitable tool to efficiently eliminate signal noise and extract the different deformation components, such as semi-static and dynamic displacements; (2) the ANFIS method with two multi-input single-output models is shown to powerfully predict GPS movement measurements and assess the bridge deformations; and (3) the installed structural health monitoring system and the applied ANFIS movement-prediction performance model are sufficient to assure bridge safety based on the analyses of the different filtered movement components.
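
    A single-level Haar soft-thresholding step conveys the wavelet de-noising idea on a 1-D series. The study applies multi-level wavelet filtering to real GPS series; this stand-alone sketch assumes an even-length signal and an illustrative threshold.

```python
def haar_denoise(signal, threshold):
    """One-level Haar wavelet soft-threshold de-noising.

    Split the signal into pairwise averages (approximation) and
    half-differences (detail), soft-threshold the noise-dominated
    detail coefficients, then invert the transform.
    """
    half = len(signal) // 2
    approx = [(signal[2 * i] + signal[2 * i + 1]) / 2 for i in range(half)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / 2 for i in range(half)]

    def shrink(d):
        # Soft threshold: small coefficients -> 0, large ones shrink.
        return (abs(d) - threshold) * (1 if d > 0 else -1) if abs(d) > threshold else 0.0

    detail = [shrink(d) for d in detail]
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out
```

    Small alternating noise riding on a level signal is flattened, while a genuine step (semi-static movement, in the bridge setting) passes through untouched.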

  17. Designing Dietary Recommendations Using System Level Interactomics Analysis and Network-Based Inference

    Tingting Zheng

    2017-09-01

    Background: A range of computational methods that rely on the analysis of genome-wide expression datasets have been developed and successfully used for drug repositioning. The success of these methods is based on the hypothesis that introducing a factor (in this case, a drug molecule) that could reverse the disease gene expression signature will lead to a therapeutic effect. However, it has also been shown that globally reversing the disease expression signature is not a prerequisite for drug activity. On the other hand, the basic idea of significant anti-correlation in expression profiles could have great value for establishing diet-disease associations and could provide new insights into the role of dietary interventions in disease. Methods: We performed an integrated analysis of publicly available gene expression profiles for foods, diseases and drugs, by calculating pairwise similarity scores for diet and disease gene expression signatures and characterizing their topological features in protein-protein interaction networks. Results: We identified 485 diet-disease pairs where diet could positively influence disease development and 472 pairs where specific diets should be avoided in a disease state. Multiple lines of evidence suggest that orange, whey and coconut fat could be beneficial for psoriasis, lung adenocarcinoma and macular degeneration, respectively. On the other hand, a fructose-rich diet should be restricted in patients with chronic intermittent hypoxia and ovarian cancer. Since humans normally do not consume foods in isolation, we also applied different algorithms to predict synergism; as a result, 58 food pairs were predicted. Interestingly, the diets identified as anti-correlated with diseases showed a topological proximity to the disease proteins similar to that of the corresponding drugs. Conclusions: In conclusion, we provide a computational framework for establishing diet-disease associations and additional information on the role of
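
    The core similarity scoring reduces to correlating expression signatures over a shared gene set. A minimal sketch follows; the gene vectors, diet names and the -0.5 cutoff are all illustrative, and the actual study combines such scores with network-proximity analysis.

```python
def pearson(xs, ys):
    """Pearson correlation between two equal-length signatures."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def candidate_diets(disease_sig, diet_sigs, cutoff=-0.5):
    """Rank diet signatures by correlation with a disease signature.

    Strong anti-correlation (score <= cutoff) flags a potentially
    beneficial diet; strong positive correlation would instead suggest
    a diet to avoid in the disease state.
    """
    scored = [(name, pearson(disease_sig, sig)) for name, sig in diet_sigs.items()]
    return sorted((s for s in scored if s[1] <= cutoff), key=lambda t: t[1])
```
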

  18. Shrinkage Estimators for Robust and Efficient Inference in Haplotype-Based Case-Control Studies

    Chen, Yi-Hau

    2009-03-01

    Case-control association studies often aim to investigate the role of genes and gene-environment interactions in terms of the underlying haplotypes (i.e., the combinations of alleles at multiple genetic loci along chromosomal regions). The goal of this article is to develop robust but efficient approaches to the estimation of disease odds-ratio parameters associated with haplotypes and haplotype-environment interactions. We consider "shrinkage" estimation techniques that can adaptively relax the model assumptions of Hardy-Weinberg-Equilibrium and gene-environment independence required by recently proposed efficient "retrospective" methods. Our proposal involves first development of a novel retrospective approach to the analysis of case-control data, one that is robust to the nature of the gene-environment distribution in the underlying population. Next, it involves shrinkage of the robust retrospective estimator toward a more precise, but model-dependent, retrospective estimator using novel empirical Bayes and penalized regression techniques. Methods for variance estimation are proposed based on asymptotic theories. Simulations and two data examples illustrate both the robustness and efficiency of the proposed methods.
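
    The adaptive compromise can be caricatured by a scalar shrinkage weight that compares the robust estimator's variance with the observed disagreement between the two estimators. The paper's empirical-Bayes and penalized-regression machinery is far richer; this conveys only the intuition.

```python
def shrink(robust_est, robust_var, model_est):
    """Shrink a robust estimate toward a more precise model-based one.

    The weight on the model-based estimate grows toward 1 when the two
    estimates agree (the model assumptions look right) and toward 0
    when they clash (the model looks misspecified).
    """
    gap2 = (robust_est - model_est) ** 2
    w = robust_var / (robust_var + gap2)
    return (1.0 - w) * robust_est + w * model_est
```

    When the model-based estimate agrees with the robust one, the combined estimate inherits the model's precision; under gross model failure it falls back to the robust value, which is the robustness-efficiency trade-off the record describes.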

  19. Shrinkage Estimators for Robust and Efficient Inference in Haplotype-Based Case-Control Studies

    Chen, Yi-Hau; Chatterjee, Nilanjan; Carroll, Raymond J.

    2009-01-01

    Case-control association studies often aim to investigate the role of genes and gene-environment interactions in terms of the underlying haplotypes (i.e., the combinations of alleles at multiple genetic loci along chromosomal regions). The goal of this article is to develop robust but efficient approaches to the estimation of disease odds-ratio parameters associated with haplotypes and haplotype-environment interactions. We consider "shrinkage" estimation techniques that can adaptively relax the model assumptions of Hardy-Weinberg-Equilibrium and gene-environment independence required by recently proposed efficient "retrospective" methods. Our proposal involves first development of a novel retrospective approach to the analysis of case-control data, one that is robust to the nature of the gene-environment distribution in the underlying population. Next, it involves shrinkage of the robust retrospective estimator toward a more precise, but model-dependent, retrospective estimator using novel empirical Bayes and penalized regression techniques. Methods for variance estimation are proposed based on asymptotic theories. Simulations and two data examples illustrate both the robustness and efficiency of the proposed methods.

  20. HLA genes in Madeira Island (Portugal) inferred from sequence-based typing: footprints from different origins.

    Spínola, Hélder; Bruges-Armas, Jácome; Mora, Marian Gantes; Middleton, Derek; Brehm, António

    2006-04-01

    Human leukocyte antigen (HLA)-A, HLA-B, and HLA-DRB1 polymorphisms were examined in the Madeira Island population. The data were obtained at high-resolution level, using sequence-based typing (SBT). The most frequent alleles at each locus were: A*020101 (24.6%), B*5101 (9.7%), B*440201 (9.2%), and DRB1*070101 (15.7%). The predominant three-locus haplotypes in Madeira were A*020101-B*510101-DRB1*130101 (2.7%) and A*010101-B*0801-DRB1*030101 (2.4%), previously found in north and central Portugal. The present study corroborates historical sources and other genetic studies indicating that Madeira was populated not only by Europeans, mostly Portuguese, but also by sub-Saharan Africans brought through the slave trade. Comparison with other populations shows that Madeira experienced a stronger African influence due to the slave trade than mainland Portugal and even the Azores archipelago. Despite this African genetic input, haplotype and allele frequencies are predominantly of European origin, mostly shared with mainland Portugal.

  1. An operant-based detection method for inferring tinnitus in mice.

    Zuo, Hongyan; Lei, Debin; Sivaramakrishnan, Shobhana; Howie, Benjamin; Mulvany, Jessica; Bao, Jianxin

    2017-11-01

    Subjective tinnitus is a hearing disorder in which a person perceives sound when no external sound is present. It can be acute or chronic. Because our current understanding of its pathology is incomplete, no effective cures have yet been established. Mouse models are useful for studying the pathophysiology of tinnitus as well as for developing therapeutic treatments. We have developed a new method for determining acute and chronic tinnitus in mice, called sound-based avoidance detection (SBAD). The SBAD method utilizes one paradigm to detect tinnitus and another paradigm to monitor possible confounding factors, such as motor impairment, loss of motivation, and deficits in learning and memory. The SBAD method has succeeded in monitoring both acute and chronic tinnitus in mice. Its detection ability is further validated by functional studies demonstrating an abnormal increase in neuronal activity in the inferior colliculus of mice that had previously been identified as having tinnitus by the SBAD method. The SBAD method provides a new means by which investigators can detect tinnitus in a single mouse accurately and with more control over potential confounding factors than existing methods. This work establishes a new behavioral method for detecting tinnitus in mice. The detection outcome is consistent with functional validation. One key advantage of mouse models is they provide researchers the opportunity to utilize an extensive array of genetic tools. This new method could lead to a deeper understanding of the molecular pathways underlying tinnitus pathology. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Genetic affinities of north and northeastern populations of India: inference from HLA-based study.

    Agrawal, S; Srivastava, S K; Borkar, M; Chaudhuri, T K

    2008-08-01

    India is like a microcosm of the world in terms of its diversity of religion, climate and ethnicity, which leads to genetic variation among its populations. As a highly polymorphic marker system, the human leukocyte antigen (HLA) system plays an important role in genetic differentiation studies. To assess the genetic diversity of HLA class II loci, we studied a total of 1336 individuals from north India using DNA-based techniques. The study included four endogamous castes (Kayastha, Mathurs, Rastogies and Vaishyas), two inbreeding Muslim populations (Shias and Sunnis) from north India and three northeast Indian populations (Lachung, Mech and Rajbanshi). A total of 36 alleles were observed at the DRB1 locus in both the Hindu castes and the Muslims from the north, while 21 alleles were seen in northeast Indians. At the DQA1 locus, the number of alleles ranged from 11 to 17 in the studied populations. The total number of alleles at DQB1 was 19, 12 and 20 in the studied castes, Muslims and northeastern populations, respectively. The most frequent haplotypes observed in all the studied populations were DRB1*0701-DQA1*0201-DQB1*0201 and DRB1*1501-DQA1*0103-DQB1*0601. Upon comparing our results with other world populations, we observed the presence of a Caucasoid element in the north Indian population. However, differential admixture among Sunnis and Shias relative to the other north Indians was evident. Northeastern populations showed genetic affinity with Mongoloids from southeast Asia. When genetic distances were calculated, we found the north Indian and northeastern populations to be markedly unrelated.

  3. HLA polymorphisms in Cabo Verde and Guiné-Bissau inferred from sequence-based typing.

    Spínola, Hélder; Bruges-Armas, Jácome; Middleton, Derek; Brehm, António

    2005-10-01

    Human leukocyte antigen (HLA)-A, -B, and -DRB1 polymorphisms were examined in the Cabo Verde and Guiné-Bissau populations. The data were obtained at high-resolution level, using sequence-based typing. The most frequent alleles at each locus were: A*020101 (16.7% in Guiné-Bissau and 13.5% in Cabo Verde), B*350101 (14.4% in Guiné-Bissau and 13.2% in Cabo Verde), DRB1*1304 (19.6% in Guiné-Bissau), and DRB1*1101 (10.1% in Cabo Verde). The predominant three-locus haplotype in Guiné-Bissau was A*2301-B*1503-DRB1*1101 (4.6%) and in Cabo Verde was A*3002-B*350101-DRB1*1001 (2.8%), exclusive to the northwestern islands (5.6%) and absent in Guiné-Bissau. The present study corroborates historical sources and other genetic studies indicating that Cabo Verde was populated not only by Africans but also by Europeans. Haplotype and dendrogram analyses show a Caucasian genetic influence in today's gene pool of Cabo Verdeans. Haplotype and allele frequencies show a differential distribution between the southeastern and northwestern Cabo Verde islands, which could be the result of different genetic influences, founder effects, or bottlenecks. Dendrograms and principal coordinates analysis show that Guineans are more similar to North Africans than the other HLA-studied sub-Saharans, probably owing to ancient and recent genetic contacts with other peoples, namely East Africans.

  4. Pragmatic Inferences in Context: Learning to Interpret Contrastive Prosody

    Kurumada, Chigusa; Clark, Eve V.

    2017-01-01

    Can preschoolers make pragmatic inferences based on the intonation of an utterance? Previous work has found that young children appear to ignore intonational meanings and come to understand contrastive intonation contours only after age six. We show that four-year-olds succeed in interpreting an English utterance, such as "It LOOKS like a…

  5. Bayesian inference model for fatigue life of laminated composites

    Dimitrov, Nikolay Krasimirov; Kiureghian, Armen Der; Berggreen, Christian

    2016-01-01

    A probabilistic model for estimating the fatigue life of laminated composite plates is developed. The model is based on lamina-level input data, making it possible to predict fatigue properties for a wide range of laminate configurations. Model parameters are estimated by Bayesian inference. The ...

  6. Speaker Reliability Guides Children's Inductive Inferences about Novel Properties

    Kim, Sunae; Kalish, Charles W.; Harris, Paul L.

    2012-01-01

    Prior work shows that children can make inductive inferences about objects based on their labels rather than their appearance (Gelman, 2003). A separate line of research shows that children's trust in a speaker's label is selective. Children accept labels from a reliable speaker over an unreliable speaker (e.g., Koenig & Harris, 2005). In the…

  7. Evaluating the impact of implementation factors on family-based prevention programming: methods for strengthening causal inference.

    Crowley, D Max; Coffman, Donna L; Feinberg, Mark E; Greenberg, Mark T; Spoth, Richard L

    2014-04-01

    Despite growing recognition of the important role implementation plays in successful prevention efforts, relatively little work has sought to demonstrate a causal relationship between implementation factors and participant outcomes. In turn, failure to explore the implementation-to-outcome link limits our understanding of the mechanisms essential to successful programming. This gap is partially due to the inability of current methodological procedures within prevention science to account for the multitude of confounders responsible for variation in implementation factors (i.e., selection bias). The current paper illustrates how propensity and marginal structural models can be used to improve causal inferences involving implementation factors not easily randomized (e.g., participant attendance). We first present analytic steps for simultaneously evaluating the impact of multiple implementation factors on prevention program outcome. Then, we demonstrate this approach for evaluating the impact of enrollment and attendance in a family program, over and above the impact of a school-based program, within PROSPER, a large-scale real-world prevention trial. Findings illustrate the capacity of this approach to successfully account for confounders that influence enrollment and attendance, thereby more accurately representing true causal relations. For instance, after accounting for selection bias, we observed a 5% reduction in the prevalence of 11th grade underage drinking for those who chose to receive a family program and school program compared to those who received only the school program. Further, we detected a 7% reduction in underage drinking for those with high attendance in the family program.
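The propensity-weighting logic described above can be sketched in a few lines of Python. This is a minimal illustration only: the confounder, attendance model, and effect size below are simulated, not the PROSPER data or estimates, and marginal structural models extend this idea to time-varying exposures such as session-by-session attendance.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Simulated confounder (e.g., family risk level) influences both
# programme attendance and the outcome -- classic selection bias.
x = rng.normal(size=n)
p_attend = 1.0 / (1.0 + np.exp(-(0.8 * x)))    # true propensity
t = rng.binomial(1, p_attend)                  # attendance (treatment)
y = 2.0 * t + 1.5 * x + rng.normal(size=n)     # true effect of t is 2.0

# Fit a logistic propensity model by gradient ascent (no external deps).
X = np.column_stack([np.ones(n), x])
w = np.zeros(2)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.1 * X.T @ (t - p) / n
p_hat = 1.0 / (1.0 + np.exp(-X @ w))

# Inverse-probability-of-treatment weights.
ipw = t / p_hat + (1 - t) / (1 - p_hat)

# Naive contrast is biased by selection; the weighted contrast is not.
naive = y[t == 1].mean() - y[t == 0].mean()
ate_ipw = (np.sum(ipw * t * y) / np.sum(ipw * t)
           - np.sum(ipw * (1 - t) * y) / np.sum(ipw * (1 - t)))
```

With this setup the naive difference overstates the effect, while the weighted estimate recovers the simulated effect of 2.0.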

  8. New PDE-based methods for image enhancement using SOM and Bayesian inference in various discretization schemes

    Karras, D A; Mertzios, G B

    2009-01-01

    A novel approach is presented in this paper for improving anisotropic diffusion PDE models, based on the Perona–Malik equation. A solution is proposed from an engineering perspective to adaptively estimate the parameters of the regularizing function in this equation. The goal of such a new adaptive diffusion scheme is to better preserve edges when the anisotropic diffusion PDE models are applied to image enhancement tasks. The proposed adaptive parameter estimation in the anisotropic diffusion PDE model involves self-organizing maps and Bayesian inference to define edge probabilities accurately. The proposed modifications attempt to capture not only simple edges but also difficult textural edges and incorporate their probability in the anisotropic diffusion model. In the context of the application of PDE models to image processing, such adaptive schemes are closely related to the discrete image representation problem and the investigation of more suitable discretization algorithms using constraints derived from image processing theory. The proposed adaptive anisotropic diffusion model illustrates these concepts when it is numerically approximated by various discretization schemes in a database of magnetic resonance images (MRI), where it is shown to be efficient in image filtering and restoration applications.
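The baseline Perona–Malik scheme that the paper builds on can be sketched as follows. The constants K and lam are fixed illustrative values here, not the adaptively estimated (SOM/Bayesian) parameters proposed in the paper, and the explicit four-neighbour discretization is only one of the schemes discussed.

```python
import numpy as np

def perona_malik(img, n_iter=20, K=0.1, lam=0.2):
    """Anisotropic diffusion: smooth within regions, preserve edges."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite differences toward the four neighbours.
        dN = np.roll(u, -1, axis=0) - u
        dS = np.roll(u, 1, axis=0) - u
        dE = np.roll(u, -1, axis=1) - u
        dW = np.roll(u, 1, axis=1) - u
        # Perona-Malik conductance: small across strong gradients (edges).
        g = lambda d: np.exp(-(d / K) ** 2)
        u += lam * (g(dN) * dN + g(dS) * dS + g(dE) * dE + g(dW) * dW)
    return u

# Noisy step image: diffusion should remove noise but keep the edge.
rng = np.random.default_rng(1)
img = np.zeros((64, 64))
img[:, 32:] = 1.0
noisy = img + 0.05 * rng.normal(size=img.shape)
smoothed = perona_malik(noisy)
```

Because the conductance collapses where the gradient is large relative to K, the step edge survives while the flat regions are denoised.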

  9. Optimization of Indoor Thermal Comfort Parameters with the Adaptive Network-Based Fuzzy Inference System and Particle Swarm Optimization Algorithm

    Jing Li

    2017-01-01

    The goal of this study is to improve thermal comfort and indoor air quality with the adaptive network-based fuzzy inference system (ANFIS) model and an improved particle swarm optimization (PSO) algorithm. A method to optimize air conditioning parameters and installation distance is proposed. The methodology is demonstrated through a prototype case, which corresponds to a typical laboratory in colleges and universities. A laboratory model is established, and simulated flow field information is obtained with the CFD software. Subsequently, the ANFIS model is employed instead of the CFD model to predict indoor flow parameters, and the CFD database is utilized to train input-output "metamodels" for the subsequent optimization. With the improved PSO algorithm and the stratified sequence method, the objective functions are optimized. The functions comprise PMV, PPD, and mean age of air. The optimal installation distance is determined with the hemisphere model. Results show that most of the staff obtain a satisfactory degree of thermal comfort and that the proposed method can significantly reduce the cost of building an experimental device. The proposed methodology can be used to determine appropriate air supply parameters and air conditioner installation position for a pleasant and healthy indoor environment.
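A minimal global-best PSO of the kind used in the optimization step might look like the sketch below. The quadratic "comfort" objective is a stand-in for the paper's PMV/PPD/mean-age-of-air functions, which require the CFD-trained ANFIS metamodel; the bounds, target values and swarm settings are invented for illustration.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=100, seed=4):
    """Minimal global-best particle swarm optimiser (minimisation)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = len(bounds)
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # positions
    v = np.zeros_like(x)                               # velocities
    pbest = x.copy()
    pbest_val = np.array([objective(p) for p in x])
    g = pbest[np.argmin(pbest_val)].copy()             # global best
    for _ in range(n_iter):
        r1, r2 = rng.uniform(size=(2, n_particles, dim))
        # Inertia + cognitive + social terms (standard update rule).
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([objective(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()

# Hypothetical objective: comfort is best at supply temp 24 C, velocity 0.15 m/s.
comfort = lambda p: (p[0] - 24.0) ** 2 + 10 * (p[1] - 0.15) ** 2
best, best_val = pso(comfort, bounds=[(16.0, 30.0), (0.05, 0.5)])
```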

  10. Relative Wave Energy based Adaptive Neuro-Fuzzy Inference System model for the Estimation of Depth of Anaesthesia.

    Benzy, V K; Jasmin, E A; Koshy, Rachel Cherian; Amal, Frank; Indiradevi, K P

    2018-01-01

    Advances in medical research and intelligent modeling techniques have led to developments in anaesthesia management. The present study aims to estimate the depth of anaesthesia using cognitive signal processing and intelligent modeling techniques. The neurophysiological signal that reflects the cognitive state under anaesthetic drugs is the electroencephalogram. The information available in electroencephalogram signals during anaesthesia is captured by extracting relative wave energy features from the anaesthetic electroencephalogram signals. The discrete wavelet transform is used to decompose the electroencephalogram signals into four levels, and relative wave energy is then computed from the approximation and detail coefficients of the sub-band signals. Relative wave energy is extracted to find the degree of importance of the different electroencephalogram frequency bands associated with the anaesthetic phases: awake, induction, maintenance and recovery. The Kruskal-Wallis statistical test is applied to the relative wave energy features to check their capability to discriminate between awake, light anaesthesia, moderate anaesthesia and deep anaesthesia. A novel depth of anaesthesia index is generated by implementing an adaptive neuro-fuzzy inference system based on a fuzzy c-means clustering algorithm, which uses the relative wave energy features as inputs. Finally, the generated depth of anaesthesia index is compared with a commercially available depth of anaesthesia monitor, the Bispectral index.
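The relative-wave-energy feature extraction can be illustrated with a hand-rolled Haar transform. Haar is used here only to keep the sketch dependency-free (the abstract does not specify the mother wavelet), and the synthetic "EEG" below is invented for illustration.

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar transform: approximation and detail coefficients."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def relative_wave_energy(signal, levels=4):
    """Energy share of each sub-band: details D1..D4, then approximation A4."""
    a = np.asarray(signal, dtype=float)
    energies = []
    for _ in range(levels):
        a, d = haar_dwt(a)
        energies.append(np.sum(d ** 2))
    energies.append(np.sum(a ** 2))    # residual low-frequency band
    total = sum(energies)
    return [e / total for e in energies]

# Toy "EEG": a slow 4 Hz oscillation plus fast noise, 256 samples over 1 s.
t = np.linspace(0, 1, 256, endpoint=False)
eeg = np.sin(2 * np.pi * 4 * t) + 0.2 * np.random.default_rng(2).normal(size=256)
rwe = relative_wave_energy(eeg)
```

Because the transform is orthonormal, the five shares sum to one, and the slow oscillation concentrates its energy in the final approximation band.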

  11. Enhancing Transparency and Control When Drawing Data-Driven Inferences About Individuals.

    Chen, Daizhuo; Fraiberger, Samuel P; Moakler, Robert; Provost, Foster

    2017-09-01

    Recent studies show the remarkable power of fine-grained information disclosed by users on social network sites to infer users' personal characteristics via predictive modeling. Similar fine-grained data are being used successfully in other commercial applications. In response, attention is turning increasingly to the transparency that organizations provide to users as to what inferences are drawn and why, as well as to what sort of control users can be given over inferences that are drawn about them. In this article, we focus on inferences about personal characteristics based on information disclosed by users' online actions. As a use case, we explore personal inferences that are made possible from "Likes" on Facebook. We first present a means for providing transparency into the information responsible for inferences drawn by data-driven models. We then introduce the "cloaking device", a mechanism for users to inhibit the use of particular pieces of information in inference. Using these analytical tools we ask two main questions: (1) How much information must users cloak to significantly affect inferences about their personal traits? We find that usually users must cloak only a small portion of their actions to inhibit inference. We also find that, encouragingly, false-positive inferences are significantly easier to cloak than true-positive inferences. (2) Can firms change their modeling behavior to make cloaking more difficult? The answer is a definitive yes. We demonstrate a simple modeling change that requires users to cloak substantially more information to affect the inferences drawn. The upshot is that organizations can provide transparency and control even into complicated, predictive model-driven inferences, but they also can make control easier or harder for their users.
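A greedy version of the cloaking idea, against a hypothetical linear trait model, might look like the sketch below. The page names, weights and threshold are invented for illustration; the paper's models and cloaking procedure are more elaborate.

```python
def cloak(likes, weights, threshold):
    """Greedily hide the most inference-revealing actions until the
    linear model's score drops below the decision threshold."""
    active = set(likes)
    # Cloak actions that push the score up, most influential first.
    order = sorted(likes, key=lambda f: weights.get(f, 0.0), reverse=True)
    cloaked = []
    for f in order:
        if sum(weights.get(a, 0.0) for a in active) < threshold:
            break
        if weights.get(f, 0.0) > 0:
            active.remove(f)
            cloaked.append(f)
    return cloaked, sum(weights.get(a, 0.0) for a in active)

# Hypothetical linear model: weight per "Like" toward some inferred trait.
weights = {"page_a": 1.2, "page_b": 0.9, "page_c": 0.4, "page_d": -0.3}
likes = ["page_a", "page_b", "page_c", "page_d"]
cloaked, score = cloak(likes, weights, threshold=1.0)
```

Here cloaking just two of four actions pushes the score below the threshold, echoing the paper's finding that only a small portion of actions need be cloaked.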

  12. Elastic Properties of Novel Co- and CoNi-Based Superalloys Determined through Bayesian Inference and Resonant Ultrasound Spectroscopy

    Goodlet, Brent R.; Mills, Leah; Bales, Ben; Charpagne, Marie-Agathe; Murray, Sean P.; Lenthe, William C.; Petzold, Linda; Pollock, Tresa M.

    2018-06-01

    Bayesian inference is employed to precisely evaluate single crystal elastic properties of novel γ-γ′ Co- and CoNi-based superalloys from simple and non-destructive resonant ultrasound spectroscopy (RUS) measurements. Nine alloys from three Co-, CoNi-, and Ni-based alloy classes were evaluated in the fully aged condition, with one alloy per class also evaluated in the solution heat-treated condition. Comparisons are made between the elastic properties of the three alloy classes and among the alloys of a single class, with the following trends observed. A monotonic rise in the c44 (shear) elastic constant by a total of 12 pct is observed between the three alloy classes as Co is substituted for Ni. Elastic anisotropy (A) is also increased, with a large majority of the nearly 13 pct increase occurring after Co becomes the dominant constituent. Together the five CoNi alloys, with Co:Ni ratios from 1:1 to 1.5:1, exhibited remarkably similar properties, with an average A 1.8 pct greater than that of the Ni-based alloy CMSX-4. Custom code demonstrating a substantial advance over previously reported methods for RUS inversion is also reported here for the first time. CmdStan-RUS is built upon the open-source probabilistic programming language Stan and formulates the inverse problem using Bayesian methods. Bayesian posterior distributions are efficiently computed with Hamiltonian Monte Carlo (HMC), while initial parameterization is randomly generated from weakly informative prior distributions. Remarkably robust convergence behavior is demonstrated across multiple independent HMC chains in spite of initial parameterizations often very far from the actual parameter values. Experimental procedures are substantially simplified by allowing any arbitrary misorientation between the specimen and crystal axes, as elastic properties and misorientation are estimated simultaneously.
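The Bayesian inversion can be illustrated with a toy one-parameter forward model and random-walk Metropolis in place of Stan's HMC. Everything here is invented for illustration: the mode factors, density, noise level and starting point are made up, and the real RUS forward model (full elastic tensor, specimen geometry, misorientation) is far richer.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy forward model: resonance frequencies scale with sqrt(c44/rho)
# times mode-dependent geometry factors.
modes = np.array([1.0, 1.4, 1.9, 2.3, 2.8])
rho = 8900.0                       # kg/m^3, assumed density
c44_true = 130e9                   # Pa, the "unknown" shear constant
f_obs = modes * np.sqrt(c44_true / rho) * (1 + 0.005 * rng.normal(size=5))

def log_post(c44):
    """Gaussian likelihood with ~0.5 pct noise; flat prior on c44 > 0."""
    if c44 <= 0:
        return -np.inf
    f_model = modes * np.sqrt(c44 / rho)
    sigma = 0.005 * f_obs
    return -0.5 * np.sum(((f_obs - f_model) / sigma) ** 2)

# Random-walk Metropolis over c44, started deliberately far from the truth.
samples, c = [], 100e9
lp = log_post(c)
for _ in range(20000):
    prop = c + 1e9 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        c, lp = prop, lp_prop
    samples.append(c)
posterior = np.array(samples[5000:])   # discard burn-in
```

Despite the distant start, the chain drifts to the high-posterior region and the posterior mean lands close to the simulated constant, mirroring the robust-convergence behaviour reported for CmdStan-RUS.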

  13. Decision Making Analysis: Critical Factors-Based Methodology

    2010-04-01

    the pitfalls associated with current wargaming methods, such as assuming a western view of rational values in decision-making regardless of the cultures... Utilization theory slightly expands the rational decision-making model as it states that "actors try to maximize their expected utility by weighing the... items to categorize the decision-making behavior of political leaders, which tends to demonstrate either a rational or cognitive leaning. Leaders

  14. Clay based superior aggregate for making light construction

    Sumarnadi, Eko Tri; Sudaryanto; Zulkarnain, Iskandar

    2002-01-01

    Aggregate is defined as material consisting of solid minerals with grain sizes ranging from sand to gravel. This material is usually used as a filling component of concrete. Clay-based aggregate has special properties that make it useful for light construction in wet environments. Concrete built using this aggregate is not impermeable; it can absorb water in significant amounts, so it is very useful as paving block for pedestrian zones (trottoirs) and zones surrounding swimming pools. Laboratory test results show that the aggregate has grain sizes in the zone 1 type, with specific gravity between 1360 and 1840 kg/m³. Its abrasion factor is about 35.32% on average, and it can be used as a building material for light construction. Concrete with a cement/aggregate mixing ratio of 1 to 6, a slump of 3 to 6 cm and a curing time of 28 days shows a compressive strength of 10 N/mm². A qualitative cost analysis indicates that the paving block production cost is relatively low, and production will be more profitable if the blocks are made from failed pentile with a failure rate between 10 and 30%.

  15. A Bayesian Network Schema for Lessening Database Inference

    Chang, LiWu; Moskowitz, Ira S

    2001-01-01

    .... The authors introduce a formal schema for database inference analysis, based upon a Bayesian network structure, which identifies critical parameters involved in the inference problem and represents...

  16. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

    We present a new intuitionistic fuzzy rule-based decision-making system, based on intuitionistic fuzzy sets, for the process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm takes as input the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers the appropriate rules of our intuitionistic fuzzy inference engine and finally calculates the dynamic priority (dp) of every process in the ready queue. Once the dp of every process has been calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
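The scheduling loop can be sketched as follows. The crisp dynamic_priority function is only a stand-in for the intuitionistic fuzzy inference engine described in the abstract, and the weights, normalisation constants and process data are invented for illustration.

```python
def dynamic_priority(nice, burst, nice_max=40.0, burst_max=100.0):
    """Crisp stand-in for the fuzzy engine: lower nice values and
    shorter bursts yield a higher dynamic priority (dp)."""
    return 0.6 * (1.0 - nice / nice_max) + 0.4 * (1.0 - burst / burst_max)

def schedule(ready_queue):
    """Sort the ready queue by decreasing dp; the head goes to the CPU."""
    return sorted(
        ready_queue,
        key=lambda p: dynamic_priority(p["nice"], p["burst"]),
        reverse=True,
    )

ready = [
    {"pid": 1, "nice": 20, "burst": 80},
    {"pid": 2, "nice": 5,  "burst": 10},
    {"pid": 3, "nice": 35, "burst": 50},
]
order = [p["pid"] for p in schedule(ready)]   # dispatch order
```

The polite, short process (pid 2) gets the highest dp and is dispatched first, matching the sort-then-dispatch flow of the abstract.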

  17. Leuconostoc mesenteroides growth in food products: prediction and sensitivity analysis by adaptive-network-based fuzzy inference systems.

    Hue-Yu Wang

    BACKGROUND: An adaptive-network-based fuzzy inference system (ANFIS) was compared with an artificial neural network (ANN) in terms of accuracy in predicting the combined effects of temperature (10.5 to 24.5°C), pH level (5.5 to 7.5), sodium chloride level (0.25% to 6.25%) and sodium nitrite level (0 to 200 ppm) on the growth rate of Leuconostoc mesenteroides under aerobic and anaerobic conditions. METHODS: The ANFIS and ANN models were compared in terms of six statistical indices calculated by comparing their prediction results with actual data: mean absolute percentage error (MAPE), root mean square error (RMSE), standard error of prediction percentage (SEP), bias factor (Bf), accuracy factor (Af), and absolute fraction of variance (R²). Graphical plots were also used for model comparison. CONCLUSIONS: The learning-based systems obtained encouraging prediction results. Sensitivity analyses of the four environmental factors showed that temperature and, to a lesser extent, NaCl had the most influence on accuracy in predicting the growth rate of Leuconostoc mesenteroides under aerobic and anaerobic conditions. The observed effectiveness of ANFIS for modeling microbial kinetic parameters confirms its potential use as a supplemental tool in predictive microbiology. Comparisons between growth rates predicted by ANFIS and actual experimental data also confirmed the high accuracy of the Gaussian membership function in ANFIS. Comparisons of the six statistical indices under both aerobic and anaerobic conditions also showed that the ANFIS model was better than all ANN models in predicting the four kinetic parameters. Therefore, the ANFIS model is a valuable tool for quickly predicting the growth rate of Leuconostoc mesenteroides under aerobic and anaerobic conditions.
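Four of the six validation indices are standard in predictive microbiology and can be computed as follows; the observed/predicted growth rates below are invented for illustration.

```python
import numpy as np

def validation_indices(obs, pred):
    """MAPE, RMSE, bias factor (Bf) and accuracy factor (Af) for
    observed vs predicted growth rates."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    ratio = np.log10(pred / obs)
    return {
        "MAPE": 100 * np.mean(np.abs((obs - pred) / obs)),
        "RMSE": np.sqrt(np.mean((obs - pred) ** 2)),
        "Bf": 10 ** np.mean(ratio),          # systematic over/under-prediction
        "Af": 10 ** np.mean(np.abs(ratio)),  # average spread, always >= 1
    }

# Hypothetical observed vs predicted growth rates (1/h).
obs = [0.10, 0.22, 0.35, 0.48]
pred = [0.11, 0.20, 0.37, 0.45]
idx = validation_indices(obs, pred)
```

A Bf near 1 indicates no systematic bias; Af measures how far predictions stray from observations on average, so a perfect model has Af = 1.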

  18. Making Evidence Based Changes on the Labor Ward of Muhima ...

    All these things can lead to increased maternal and neonatal ... information from the internet and make a PowerPoint presentation. In addition, global trainings on helping ... Staff embraced the change even though initially resistant, learned how to find information on the internet and found that making PowerPoint presentations was fun, ...

  19. Multi-criteria weighted order based maintenance decision making

    Dhanisetty, V.S.V.; Verhagen, W.J.C.; Curran, R.

    2017-01-01

    Decision making in daily maintenance requires consideration of multiple factors. The importance of each factor fluctuates depending on the repair scenario and the needs of the maintainer. In order to include the prioritisation of multiple criteria, a weighted decision-making model is
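A weighted-sum version of such a model can be sketched as follows; the criteria, weights and option scores are hypothetical, and the abstract's model may weight or aggregate criteria differently.

```python
def weighted_score(option, weights):
    """Weighted-sum ranking over criteria normalised to [0, 1], 1 = best."""
    return sum(weights[c] * option[c] for c in weights)

# Hypothetical maintainer priorities and repair options.
weights = {"cost": 0.4, "downtime": 0.35, "safety_risk": 0.25}
options = {
    "repair_now":    {"cost": 0.3, "downtime": 0.9, "safety_risk": 1.0},
    "defer":         {"cost": 0.9, "downtime": 0.4, "safety_risk": 0.5},
    "full_overhaul": {"cost": 0.1, "downtime": 0.2, "safety_risk": 1.0},
}
best = max(options, key=lambda o: weighted_score(options[o], weights))
```

Changing the weights re-ranks the options, which captures the abstract's point that criterion importance fluctuates with the repair scenario.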

  20. Inference of expanded Lrp-like feast/famine transcription factor targets in a non-model organism using protein structure-based prediction.

    Ashworth, Justin; Plaisier, Christopher L; Lo, Fang Yin; Reiss, David J; Baliga, Nitin S

    2014-01-01

    Widespread microbial genome sequencing presents an opportunity to understand the gene regulatory networks of non-model organisms. This requires knowledge of the binding sites for transcription factors whose DNA-binding properties are unknown or difficult to infer. We adapted a protein structure-based method to predict the specificities and putative regulons of homologous transcription factors across diverse species. As a proof-of-concept we predicted the specificities and transcriptional target genes of divergent archaeal feast/famine regulatory proteins, several of which are encoded in the genome of Halobacterium salinarum. This was validated by comparison to experimentally determined specificities for transcription factors in distantly related extremophiles, chromatin immunoprecipitation experiments, and cis-regulatory sequence conservation across eighteen related species of halobacteria. Through this analysis we were able to infer that Halobacterium salinarum employs a divergent local trans-regulatory strategy to regulate genes (carA and carB) involved in arginine and pyrimidine metabolism, whereas Escherichia coli employs an operon. The prediction of gene regulatory binding sites using structure-based methods is useful for the inference of gene regulatory relationships in new species that are otherwise difficult to infer.
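Once specificities are predicted, putative regulon members are typically found by scanning candidate promoter sequences with a position weight matrix (PWM). The sketch below shows a minimal scan; the three-column log-odds matrix and the sequence are invented, and real structure-derived matrices (as in this study) are much wider.

```python
import math

# Hypothetical log-odds PWM for a short motif (one dict per position).
pwm = [
    {"A": 1.2, "C": -0.8, "G": -0.5, "T": -0.6},
    {"A": -0.7, "C": 1.1, "G": -0.9, "T": -0.4},
    {"A": 1.0, "C": -0.6, "G": -0.8, "T": -0.2},
]

def scan(seq, pwm):
    """Slide the PWM along the sequence; return (best_score, best_position)."""
    best = (-math.inf, -1)
    width = len(pwm)
    for i in range(len(seq) - width + 1):
        score = sum(pwm[j][seq[i + j]] for j in range(width))
        best = max(best, (score, i))
    return best

score, pos = scan("GGACATTT", pwm)
```

High-scoring windows become candidate binding sites, which can then be filtered by cross-species conservation as in the halobacterial analysis above.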

  1. An Intelligent Fleet Condition-Based Maintenance Decision Making Method Based on Multi-Agent

    Bo Sun

    2012-01-01

    According to the demand for condition-based maintenance online decision making within a mission-oriented fleet, an intelligent maintenance decision-making method based on multi-agent systems and heuristic rules is proposed. The process of condition-based maintenance within an aircraft fleet (each aircraft containing one or more Line Replaceable Modules) based on multiple maintenance thresholds is analyzed. The process is then abstracted into a multi-agent model, a 2-layer model structure containing host negotiation and independent negotiation is established, and the heuristic rules applied to global and local maintenance decision making are proposed. Based on the Contract Net Protocol and the heuristic rules, the maintenance decision-making algorithm is put forward. Finally, a fleet consisting of 10 aircraft on a 3-wave continuous mission is used to verify this method. Simulation results indicate that this method can improve the availability of the fleet, meet mission demands, rationalize the utilization of support resources and provide support for online maintenance decision making within a mission-oriented fleet.

  2. On principles of inductive inference

    Kostecki, Ryszard Paweł

    2011-01-01

    We propose an intersubjective epistemic approach to foundations of probability theory and statistical inference, based on relative entropy and category theory, and aimed to bypass the mathematical and conceptual problems of existing foundational approaches.

  3. sick: The Spectroscopic Inference Crank

    Casey, Andrew R.

    2016-03-01

    There exists an inordinate amount of spectral data in both public and private astronomical archives that remain severely under-utilized. The lack of reliable open-source tools for analyzing large volumes of spectra contributes to this situation, which is poised to worsen as large surveys successively release orders of magnitude more spectra. In this article I introduce sick, the spectroscopic inference crank, a flexible and fast Bayesian tool for inferring astrophysical parameters from spectra. sick is agnostic to the wavelength coverage, resolving power, or general data format, allowing any user to easily construct a generative model for their data, regardless of its source. sick can be used to provide a nearest-neighbor estimate of model parameters, a numerically optimized point estimate, or full Markov Chain Monte Carlo sampling of the posterior probability distributions. This generality empowers any astronomer to capitalize on the plethora of published synthetic and observed spectra, and make precise inferences for a host of astrophysical (and nuisance) quantities. Model intensities can be reliably approximated from existing grids of synthetic or observed spectra using linear multi-dimensional interpolation, or a Cannon-based model. Additional phenomena that transform the data (e.g., redshift, rotational broadening, continuum, spectral resolution) are incorporated as free parameters and can be marginalized away. Outlier pixels (e.g., cosmic rays or poorly modeled regimes) can be treated with a Gaussian mixture model, and a noise model is included to account for systematically underestimated variance. Combining these phenomena into a scalar-justified, quantitative model permits precise inferences with credible uncertainties on noisy data. I describe the common model features, the implementation details, and the default behavior, which is balanced to be suitable for most astronomical applications. Using a forward model on low-resolution, high signal

  6. A Hybrid Fuzzy Inference System Based on Dispersion Model for Quantitative Environmental Health Impact Assessment of Urban Transportation Planning

    Behnam Tashayo

    2017-01-01

    Characterizing the spatial variation of traffic-related air pollution has long been a challenge in quantitative environmental health impact assessment of urban transportation planning. Advanced approaches are required for modeling the complex relationships among traffic, air pollution, and adverse health outcomes while considering uncertainties in the available data. A new hybrid fuzzy model is developed and implemented through a hierarchical fuzzy inference system (HFIS). This model is integrated with a dispersion model in order to model the effect of the transportation system on the PM2.5 concentration. An improved health metric based on an HFIS is developed as well to model the impact of traffic-related PM2.5 on health. Two solutions are applied to improve the performance of both models: the topologies of the HFISs are selected according to the problem, and the variables, membership functions, and rule set used are determined through learning in a simultaneous manner. The capabilities of the proposed approach are examined by assessing the impacts of three traffic scenarios involved in air pollution in the city of Isfahan, Iran, and the model's accuracy is compared with the results of available models from the literature. The advantages here are modeling the spatial variation of PM2.5 with high resolution, appropriate processing requirements, and consideration of the interaction between emissions and meteorological processes. These models are capable of using the available qualitative and uncertain data. They are of appropriate accuracy and can provide a better understanding of the phenomena, in addition to assessing the impact of each parameter, for planners.

  7. Hi-Jack: a novel computational framework for pathway-based inference of host–pathogen interactions

    Kleftogiannis, Dimitrios A.

    2015-03-09

    Motivation: Pathogens infect their host and hijack the host machinery to produce more progeny pathogens. Obligate intracellular pathogens, in particular, require resources of the host to replicate. Therefore, infections by these pathogens lead to alterations in the metabolism of the host, shifting it in favor of pathogen protein production. Some computational methods for identifying mechanisms of host-pathogen interactions have been proposed, but the problem has yet to be approached from the metabolite-hijacking angle. Results: We propose a novel computational framework, Hi-Jack, for inferring pathway-based interactions between a host and a pathogen that relies on the idea of metabolite hijacking. Hi-Jack searches metabolic network data from hosts and pathogens, and identifies candidate reactions where hijacking occurs. A novel scoring function ranks candidate hijacked reactions and identifies pathways in the host that interact with pathways in the pathogen, as well as the associated frequently hijacked metabolites. We also describe host-pathogen interaction principles that can be used in subsequent studies. Our case study on Mycobacterium tuberculosis (Mtb) revealed pathways in human, e.g. carbohydrate metabolism, lipid metabolism and pathways related to amino acid metabolism, that are likely to be hijacked by the pathogen. In addition, we report interesting potential pathway interconnections between human and Mtb, such as the linkage of human fatty acid biosynthesis with Mtb biosynthesis of unsaturated fatty acids, or the linkage of the human pentose phosphate pathway with lipopolysaccharide biosynthesis in Mtb.

  8. An Intuitive Dashboard for Bayesian Network Inference

    Reddy, Vikas; Farr, Anna Charisse; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K D V

    2014-01-01

    Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it could be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the Qt and SMILE libraries in C++.

  10. Self-disclosure decision making based on intimacy and privacy

    Such, Jose M.; Espinosa, Agustin; Garcia-Fornes, Ana; Sierra, Carles

    2012-01-01

    Autonomous agents may encapsulate their principals' personal data attributes. These attributes may be disclosed to other agents during agent interactions, producing a loss of privacy. Thus, agents need self-disclosure decision-making mechanisms to autonomously decide whether disclosing personal data attributes to other agents is acceptable or not. Current self-disclosure decision-making mechanisms consider the direct benefit and the privacy loss of disclosing an attribute. Howe...

  11. Active inference and learning.

    Friston, Karl; FitzGerald, Thomas; Rigoli, Francesco; Schwartenbeck, Philipp; O Doherty, John; Pezzulo, Giovanni

    2016-09-01

    This paper offers an active inference account of choice behaviour and learning. It focuses on the distinction between goal-directed and habitual behaviour and how they contextualise each other. We show that habits emerge naturally (and autodidactically) from sequential policy optimisation when agents are equipped with state-action policies. In active inference, behaviour has explorative (epistemic) and exploitative (pragmatic) aspects that are sensitive to ambiguity and risk respectively, where epistemic (ambiguity-resolving) behaviour enables pragmatic (reward-seeking) behaviour and the subsequent emergence of habits. Although goal-directed and habitual policies are usually associated with model-based and model-free schemes, we find the more important distinction is between belief-free and belief-based schemes. The underlying (variational) belief updating provides a comprehensive (if metaphorical) process theory for several phenomena, including the transfer of dopamine responses, reversal learning, habit formation and devaluation. Finally, we show that active inference reduces to a classical (Bellman) scheme in the absence of ambiguity. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. Impaired Flexible Reward-Based Decision-Making in Binge Eating Disorder: Evidence from Computational Modeling and Functional Neuroimaging.

    Reiter, Andrea M F; Heinze, Hans-Jochen; Schlagenhauf, Florian; Deserno, Lorenz

    2017-02-01

    Despite its clinical relevance and its recent recognition as a diagnostic category in the DSM-5, binge eating disorder (BED) has rarely been investigated from a cognitive neuroscientific perspective targeting a more precise neurocognitive profiling of the disorder. BED patients suffer from a lack of behavioral control during recurrent binge eating episodes and thus fail to adapt their behavior in the face of negative consequences, e.g., high risk for obesity. To examine impairments in flexible reward-based decision-making, we exposed BED patients (n=22) and matched healthy individuals (n=22) to a reward-guided decision-making task during functional magnetic resonance imaging (fMRI). Performing fMRI analysis informed via computational modeling of choice behavior, we were able to identify specific signatures of altered decision-making in BED. On the behavioral level, we observed impaired behavioral adaptation in BED, which was due to enhanced switching behavior, a putative deficit in striking a balance between exploration and exploitation appropriately. This was accompanied by diminished activation related to exploratory decisions in the anterior insula/ventro-lateral prefrontal cortex. Moreover, although so-called model-free reward prediction errors remained intact, representation of ventro-medial prefrontal learning signatures, incorporating inference on unchosen options, was reduced in BED, which was associated with successful decision-making in the task. On the basis of a computational psychiatry account, the presented findings contribute to defining a neurocognitive phenotype of BED.
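
    The "model-free reward prediction errors" referred to above are conventionally modelled with a delta-rule value update; the following sketch shows that standard textbook form (the learning rate and reward sequence are illustrative, not parameters from the study):

```python
def rescorla_wagner(rewards, alpha=0.5, v0=0.0):
    """Track a value estimate via model-free reward prediction errors.

    delta_t = r_t - V_t is the prediction error; the value V is
    nudged toward each observed reward by the learning rate alpha.
    """
    v = v0
    deltas = []
    for r in rewards:
        delta = r - v        # model-free reward prediction error
        v += alpha * delta   # delta-rule value update
        deltas.append(delta)
    return v, deltas

# With a constant reward of 1.0 the value converges toward 1
# and the prediction errors halve on every trial (alpha = 0.5).
v, deltas = rescorla_wagner([1.0, 1.0, 1.0, 1.0])
print(v, deltas)  # 0.9375 [1.0, 0.5, 0.25, 0.125]
```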

  13. Inferring introduction history and spread of Falcaria vulgaris Bernh. (Apiaceae) in the United States based on herbarium records

    Sarbottam Piya; Madhav P. Nepal; Achal Neupane; Gary E. Larson; Jack L. Butler

    2012-01-01

    Herbarium records were studied to infer the introduction history and spread of the exotic Eurasian sickleweed (Falcaria vulgaris Bernh.) in the United States. The spread of the plant was reconstructed using the location of early collections as the possible sites of primary introduction, and the location of subsequent collections as potential pathways along which this...

  14. Chironomid-based water depth reconstructions: an independent evaluation of site-specific and local inference models

    Engels, S.; Cwynar, L.C.; Rees, A.B.H.; Shuman, B.N.

    2012-01-01

    Water depth is an important environmental variable that explains a significant portion of the variation in the chironomid fauna of shallow lakes. We developed site-specific and local chironomid water-depth inference models using 26 and 104 surface-sediment samples, respectively, from seven

  15. Inferring facts from fiction: reading correct and incorrect information affects memory for related information.

    Butler, Andrew C; Dennis, Nancy A; Marsh, Elizabeth J

    2012-07-01

    People can acquire both true and false knowledge about the world from fictional stories. The present study explored whether the benefits and costs of learning about the world from fictional stories extend beyond memory for directly stated pieces of information. Of interest was whether readers would use correct and incorrect story references to make deductive inferences about related information in the story, and then integrate those inferences into their knowledge bases. Participants read stories containing correct, neutral, and misleading references to facts about the world; each reference could be combined with another reference that occurred in a later sentence to make a deductive inference. Later they answered general knowledge questions that tested for these deductive inferences. The results showed that participants generated and retained the deductive inferences regardless of whether the inferences were consistent or inconsistent with world knowledge, and irrespective of whether the references were placed consecutively in the text or separated by many sentences. Readers learn more than what is directly stated in stories; they use references to the real world to make both correct and incorrect inferences that are integrated into their knowledge bases.

  16. State-Based Curriculum-Making, Part 2, the Tool-Kit for the State's Curriculum-Making

    Westbury, Ian; Sivesind, Kirsten

    2016-01-01

    The paper identifies three tools that support the administrative instrument of a state-based curriculum commission: compartmentalization, licensing, and segmentation. These tools channel the state's curriculum-making towards forms of symbolic rather than regulatory action. The state curriculum becomes a framework for the ideological governance of…

  17. Making the case for evidence-based design in healthcare: a descriptive case study of organizational decision making.

    Shoemaker, Lorie K; Kazley, Abby Swanson; White, Andrea

    2010-01-01

    The aim of this study was to describe the organizational decision-making process used in the selection of evidence-based design (EBD) concepts, the criteria used to make these decisions, and the extent to which leadership style may have influenced the decision-making process. Five research questions were formulated to frame the direction of this study, including: (1) How did healthcare leaders learn of innovations in design? (2) How did healthcare leaders make decisions in the selection of healthcare design concepts? (3) What criteria did healthcare leaders use in the decision-making process? (4) How did healthcare leaders consider input from the staff in design decisions? and (5) To what extent did the leadership style of administrators affect the outcomes of the decision-making process? Current issues affecting healthcare in the community led the principal investigator's organization to undertake an ambitious facilities expansion project. As part of its planning process, the organization learned of EBD principles that seemingly had a positive impact on patient care and safety and staff working conditions. Although promising, a paucity of empirical research addressed the cost/benefit of incorporating many EBD concepts into one hospital setting, and there was no research that articulated the organizational decision-making process used by healthcare administrators when considering the use of EBD in expansion projects. A mixed-method, descriptive, qualitative, single-case study and quantitative design were used to address the five research questions. The Systems Research Organizing Model provided the theoretical framework. A variety of data collection methods was used, including interviews of key respondents, the review of documentary evidence, and the Multifactor Leadership Questionnaire. A participatory process was used throughout the design decision phases, involving staff at all levels of the organization. The Internet and architects facilitated learning about

  18. Making Cloud-based Systems Elasticity Testing Reproducible

    Albonico , Michel; Mottu , Jean-Marie; Sunyé , Gerson; Alvares , Frederico

    2017-01-01

    Elastic cloud infrastructures vary computational resources at runtime, i.e., elasticity, which is error-prone. That makes testing elasticity crucial for those systems. Those errors are detected thanks to tests that should run deterministically many times throughout development. However, elasticity testing reproduction requires several features not supported natively by the main cloud providers, such as Amazon EC2. We identify three requirements that we c...

  19. Making sense of shared sense-making in an inquiry-based science classroom: Toward a sociocultural theory of mind

    Ladewski, Barbara G.

    Despite considerable exploration of inquiry and reflection in the literatures of science education and teacher education/teacher professional development over the past century, few theoretical or analytical tools exist to characterize these processes within a naturalistic classroom context. In addition, little is known regarding possible developmental trajectories for inquiry or reflection---for teachers or students---as these processes develop within a classroom context over time. In the dissertation, I use a sociocultural lens to explore these issues with an eye to the ways in which teachers and students develop shared sense-making, rather than from the more traditional perspective of individual teacher activity or student learning. The study includes both theoretical and empirical components. Theoretically, I explore the elaborations of sociocultural theory needed to characterize teacher-student shared sense-making as it develops within a classroom context, and, in particular, the role of inquiry and reflection in that sense-making. I develop a sociocultural model of shared sense-making that attempts to represent the dialectic between the individual and the social, through an elaboration of existing sociocultural and psychological constructs, including Vygotsky's zone of proximal development and theory of mind. Using this model as an interpretive framework, I develop a case study that explores teacher-student shared sense-making within a middle-school science classroom across a year of scaffolded introduction to inquiry-based science instruction. The empirical study serves not only as a test case for the theoretical model, but also informs our understanding regarding possible developmental trajectories and important mechanisms supporting and constraining shared sense-making within inquiry-based science classrooms. 
Theoretical and empirical findings provide support for the idea that perspectival shifts---that is, shifts of point-of-view that alter relationships

  20. Inference in models with adaptive learning

    Chevillon, G.; Massmann, M.; Mavroeidis, S.

    2010-01-01

    Identification of structural parameters in models with adaptive learning can be weak, causing standard inference procedures to become unreliable. Learning also induces persistent dynamics, and this makes the distribution of estimators and test statistics non-standard. Valid inference can be

  1. Data-Based Decision-Making: Developing a Method for Capturing Teachers' Understanding of CBM Graphs

    Espin, Christine A.; Wayman, Miya Miura; Deno, Stanley L.; McMaster, Kristen L.; de Rooij, Mark

    2017-01-01

    In this special issue, we explore the decision-making aspect of "data-based decision-making". The articles in the issue address a wide range of research questions, designs, methods, and analyses, but all focus on data-based decision-making for students with learning difficulties. In this first article, we introduce the topic of…

  2. The influence of design characteristics on statistical inference in nonlinear estimation: A simulation study based on survival data and hazard modeling

    Andersen, J.S.; Bedaux, J.J.M.; Kooijman, S.A.L.M.

    2000-01-01

    This paper describes the influence of design characteristics on the statistical inference for an ecotoxicological hazard-based model using simulated survival data. The design characteristics of interest are the number and spacing of observations (counts) in time, the number and spacing of exposure concentrations (within c(min) and c(max)), and the initial number of individuals at time 0 in each concentration. A comparison of the coverage probabilities for confidence limits arising from the profile-likelihood approach and the Wald-based approach is carried out. The Wald-based approach is very sensitive...
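
    The Wald-based limits compared above can be illustrated, under the simplifying assumption of a plain exponential survival model rather than the authors' full hazard model, by the usual MLE-plus-normal-approximation construction:

```python
import math

def exponential_rate_wald_ci(times, z=1.96):
    """Wald 95% confidence interval for an exponential hazard rate.

    MLE: lam = n / sum(t); asymptotic standard error: lam / sqrt(n).
    """
    n = len(times)
    lam = n / sum(times)
    se = lam / math.sqrt(n)
    return lam, (lam - z * se, lam + z * se)

# 100 pseudo-observations with mean survival time 2.0 -> rate 0.5
lam, (lo, hi) = exponential_rate_wald_ci([2.0] * 100)
print(round(lam, 3), round(lo, 3), round(hi, 3))  # 0.5 0.402 0.598
```

    A profile-likelihood interval would instead be read off the log-likelihood curve and need not be symmetric about the MLE, which is why the two approaches can differ in coverage.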

  3. Eight challenges in phylodynamic inference

    Simon D.W. Frost

    2015-03-01

    The field of phylodynamics, which attempts to enhance our understanding of infectious disease dynamics using pathogen phylogenies, has made great strides in the past decade. Basic epidemiological and evolutionary models are now well characterized with inferential frameworks in place. However, significant challenges remain in extending phylodynamic inference to more complex systems. These challenges include accounting for evolutionary complexities such as changing mutation rates, selection, reassortment, and recombination, as well as epidemiological complexities such as stochastic population dynamics, host population structure, and different patterns at the within-host and between-host scales. An additional challenge exists in making efficient inferences from an ever increasing corpus of sequence data.

  4. Bootstrapping phylogenies inferred from rearrangement data

    Lin Yu

    2012-08-01

    support values follow a similar scale and its receiver-operating characteristics are nearly identical, indicating that it provides similar levels of sensitivity and specificity. Thus our assessment method makes it possible to conduct phylogenetic analyses on whole genomes with the same degree of confidence as for analyses on aligned sequences. Extensions to search-based inference methods such as maximum parsimony and maximum likelihood are possible, but remain to be thoroughly tested.

  6. Triptycene-based ladder monomers and polymers, methods of making each, and methods of use

    Pinnau, Ingo; Ghanem, Bader; Swaidan, Raja

    2015-01-01

    Embodiments of the present disclosure provide for a triptycene-based A-B monomer, a method of making a triptycene-based A-B monomer, a triptycene-based ladder polymer, a method of making a triptycene-based ladder polymer, a method of using

  7. Making public health nutrition relevant to evidence-based action.

    Brunner, E.; Rayner, M.; Thorogood, M.; Margetts, B.; Hooper, L.; Summerbell, C.D.; Dowler, E.; Hewitt, G.; Robertson, A.; Wiseman, M.

    2001-01-01

    Public health nutrition enjoyed many breakthroughs in the 20th century – from the discovery of vitamins and the metabolic roles of some 60 macro- and micronutrients, to the effects of maternal and childhood diet on health over the life course. Moreover, the food shortages in the UK that were experienced during World War II gave the first opportunity to show that nutritional science could make a valuable contribution to public policy. However, public health nutrition is...

  8. Green Plate-making Technology Based on Nanomaterials

    SONG Yanlin

    2011-01-01

    Movable type printing technology is one of the four great inventions of ancient China which played a significant role in passing down the history and civilization of human society. Today, with economic development and people's diverse needs for presswork, printing has penetrated into all spheres of life, and the printing industry has become one of the industries that have important impacts on China's national economy. Since China is now the third largest printing market in the world, the rapid development of this huge market calls for digital plate-making technologies for the sake of environmental protection.

  9. Maintenance Decision Making based on different types of data fusion

    Galar, D.; Gustafson, A.; Tormos Martínez, Bernardo Vicente; Berges, Luis

    2012-01-01

    Over the last decade, system integration has been applied more widely as it allows organizations to streamline business processes. A recent development in asset engineering management is to leverage the investment already made in process control systems. This allows the operations, maintenance, and process control teams to monitor and determine new alarm levels based on the physical condition data of the critical machines. Condition-based maintenance (CBM) is a maintenance philosophy base...

  10. Reflective Pedagogy: Making Meaning in Experiential Based Online Courses

    Kathy L. Guthrie

    2010-07-01

    The use of reflective pedagogies has long been considered critical to facilitating meaningful learning through experientially based curricula; however, the use of such methods has not been extensively explored as implemented in virtual environments. The study utilizes a combination of survey research and individual interviews to examine student perceptions of the meaningful learning which occurred as a result of their participation in two Web-based courses that utilized reflective pedagogies. One course focuses on topics related to service-learning and the second on placement-based internships. Both were instructed using online coursework based in reflective pedagogies to complement on-site placements within local communities.

  11. Utility systems operation: Optimisation-based decision making

    Velasco-Garcia, Patricia; Varbanov, Petar Sabev; Arellano-Garcia, Harvey; Wozny, Guenter

    2011-01-01

    Utility systems provide heat and power to industrial sites. The importance of operating these systems in an optimal way has increased significantly due to the unstable and in the long term rising prices of fossil fuels as well as the need for reducing the greenhouse gas emissions. This paper presents an analysis of the problem for supporting operator decision making under conditions of variable steam demands from the production processes on an industrial site. An optimisation model has been developed, where besides for running the utility system, also the costs associated with starting-up the operating units have been modelled. The illustrative case study shows that accounting for the shut-downs and start-ups of utility operating units can bring significant cost savings. - Highlights: → Optimisation methodology for decision making on running utility systems. → Accounting for varying steam demands. → Optimal operating specifications when a demand change occurs. → Operating costs include start-up costs of boilers and other units. → Validated on a real-life case study. Up to 20% cost savings are possible.

  12. Evidence-Based Practice: A Framework for Making Effective Decisions

    Spencer, Trina D.; Detrich, Ronnie; Slocum, Timothy A.

    2012-01-01

    The research to practice gap in education has been a long-standing concern. The enactment of No Child Left Behind brought increased emphasis on the value of using scientifically based instructional practices to improve educational outcomes. It also brought education into the broader evidence-based practice movement that started in medicine and has…

  13. Site-Based Management: Avoiding Disaster While Sharing Decision Making.

    Sorenson, Larry Dean

    This paper argues that many site-based management practices do not represent true empowerment and are not founded on a consensual framework of values, goals, and priorities developed by educational stakeholders. In addition, they often lack clearly stated operating principles. The paper distinguishes between site-based management (SBM) and…

  14. Daily Average Wind Power Interval Forecasts Based on an Optimal Adaptive-Network-Based Fuzzy Inference System and Singular Spectrum Analysis

    Zhongrong Zhang

    2016-01-01

    Wind energy has increasingly played a vital role in mitigating conventional resource shortages. Nevertheless, the stochastic nature of wind poses a great challenge when attempting to find an accurate forecasting model for wind power. Therefore, precise wind power forecasts are of primary importance for solving operational, planning and economic problems in the growing wind power scenario. Previous research has focused on deterministic forecasts of wind power values, but less attention has been paid to providing uncertainty information about wind energy. Based on an optimal Adaptive-Network-Based Fuzzy Inference System (ANFIS) and Singular Spectrum Analysis (SSA), this paper develops a hybrid uncertainty forecasting model, IFASF (Interval Forecast-ANFIS-SSA-Firefly Algorithm), to obtain the upper and lower bounds of daily average wind power, which is beneficial for the practical operation of both the grid company and independent power producers. To strengthen the practical ability of the developed model, this paper presents a comparison between IFASF and other benchmarks, which provides a general reference for statistical or artificially intelligent interval forecast methods. The comparison results show that the developed model outperforms eight benchmarks and has satisfactory forecasting effectiveness in three different wind farms with two time horizons.
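
    The SSA stage of such a hybrid model decomposes the wind power series before the fuzzy-inference forecast is fitted. A compact sketch of that decomposition step (this is a generic SSA illustration, not the paper's implementation; the firefly-optimised ANFIS part is omitted):

```python
import numpy as np

def ssa_leading_component(series, window):
    """Leading component of Singular Spectrum Analysis (SSA).

    Steps: Hankel (trajectory) embedding -> SVD -> rank-1
    reconstruction -> anti-diagonal averaging back to a series.
    """
    x = np.asarray(series, dtype=float)
    n = len(x)
    k = n - window + 1
    # Columns of the trajectory matrix are lagged windows of the series
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    rank1 = s[0] * np.outer(u[:, 0], vt[0])
    # Average the anti-diagonals to map the matrix back to a series
    recon = np.zeros(n)
    counts = np.zeros(n)
    for col in range(k):
        recon[col:col + window] += rank1[:, col]
        counts[col:col + window] += 1
    return recon / counts

# A geometric series has a rank-1 trajectory matrix, so the leading
# component reconstructs it essentially exactly.
signal = np.array([2.0 * 0.95 ** i for i in range(40)])
approx = ssa_leading_component(signal, window=8)
print(bool(np.max(np.abs(approx - signal)) < 1e-8))  # True
```

    In a forecasting pipeline the smooth components extracted this way are modelled separately from the residual noise, which is what makes the downstream interval forecast tighter.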

  15. Model distinguishability and inference robustness in mechanisms of cholera transmission and loss of immunity

    Lee, Elizabeth C.; Kelly, Michael R.; Ochocki, Brad M.; Akinwumi, Segun M.; Hamre, Karen E. S.; Tien, Joseph H.; Eisenberg, Marisa C.

    2016-01-01

    Mathematical models of cholera and waterborne disease vary widely in their structures, in terms of transmission pathways, loss of immunity, and other features. These differences may yield different predictions and parameter estimates from the same data. Given the increasing use of models to inform public health decision-making, it is important to assess distinguishability (whether models can be distinguished based on fit to data) and inference robustness (whether model inferences are robust t...

  16. Making Graphical Inferences: A Hierarchical Framework

    2004-08-01

    Making inferences from graphs is considered one of the more complex skills graph readers should possess. According to the National Council of Teachers of Mathematics (NCTM), the simplest type of question involves the extraction or comparison of a few explicitly represented data points (read-offs).

  17. The Inferences We Make: Children and Literature.

    Petrosky, Anthony R.

    1980-01-01

    Discusses classroom literary practices related to teacher questioning, retelling, literalism, and figurative language for children in the concrete operational stage; concludes that recent research on response to literature may say as much about what children are taught to do as what they do developmentally. Offers suggestions about teaching…

  18. Competency-Based Training and Simulation: Making a "Valid" Argument.

    Noureldin, Yasser A; Lee, Jason Y; McDougall, Elspeth M; Sweet, Robert M

    2018-02-01

    The use of simulation as an assessment tool is much more controversial than is its utility as an educational tool. However, without valid simulation-based assessment tools, the ability to objectively assess technical skill competencies in a competency-based medical education framework will remain challenging. The current literature in urologic simulation-based training and assessment uses a definition and framework of validity that is now outdated. This is probably due to the absence of awareness rather than an absence of comprehension. The following review article provides the urologic community an updated taxonomy on validity theory as it relates to simulation-based training and assessments and translates our simulation literature to date into this framework. While the old taxonomy considered validity as distinct subcategories and focused on the simulator itself, the modern taxonomy, for which we translate the literature evidence, considers validity as a unitary construct with a focus on interpretation of simulator data/scores.

  19. Making Theory Come Alive through Practice-based Design Research

    Markussen, Thomas; Knutz, Eva; Rind Christensen, Poul

    The aim of this paper is to demonstrate how practice-based design research is able not only to challenge, but also to push toward further development of, some of the basic assumptions in emotion theories as used within design research. In so doing, we wish to increase knowledge of a central epistemological question for design research, namely how practice-based design research can be a vehicle for the construction of new theory for design research.

  20. Ultrasonographic diagnosis of biliary atresia based on a decision-making tree model

    Lee, So Mi; Cheon, Jung Eun; Choi, Young Hun; Kim, Woo Sun; Cho, Hyun Hye; Kim, In One; You, Sun Kyoung

    2015-01-01

    To assess the diagnostic value of various ultrasound (US) findings and to make a decision-tree model for US diagnosis of biliary atresia (BA). From March 2008 to January 2014, the following US findings were retrospectively evaluated in 100 infants with cholestatic jaundice (BA, n = 46; non-BA, n = 54): length and morphology of the gallbladder, triangular cord thickness, hepatic artery and portal vein diameters, and visualization of the common bile duct. Logistic regression analyses were performed to determine the features that would be useful in predicting BA. Conditional inference tree analysis was used to generate a decision-making tree for classifying patients into the BA or non-BA groups. Multivariate logistic regression analysis showed that abnormal gallbladder morphology and greater triangular cord thickness were significant predictors of BA (p = 0.003 and 0.001; adjusted odds ratio: 345.6 and 65.6, respectively). In the decision-making tree using conditional inference tree analysis, gallbladder morphology and triangular cord thickness (optimal cutoff value of triangular cord thickness, 3.4 mm) were also selected as significant discriminators for differential diagnosis of BA, and gallbladder morphology was the first discriminator. The diagnostic performance of the decision-making tree was excellent, with sensitivity of 100% (46/46), specificity of 94.4% (51/54), and overall accuracy of 97% (97/100). Abnormal gallbladder morphology and greater triangular cord thickness (> 3.4 mm) were the most useful predictors of BA on US. We suggest that the gallbladder morphology should be evaluated first and that triangular cord thickness should be evaluated subsequently in cases with normal gallbladder morphology.
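
    The reported tree reduces to a two-level rule: gallbladder morphology first, then triangular cord thickness (cutoff 3.4 mm) when morphology is normal. A transcription of that rule as code (the function and argument names are mine, and this reading of the tree is an illustration of the published result, not a diagnostic tool):

```python
def classify_biliary_atresia(gallbladder_abnormal, triangular_cord_mm):
    """Two-level decision rule from the study's conditional inference tree.

    First discriminator: gallbladder morphology.
    Second discriminator (only when morphology is normal):
    triangular cord thickness > 3.4 mm.
    """
    if gallbladder_abnormal:
        return "BA"           # abnormal morphology decides directly
    if triangular_cord_mm > 3.4:
        return "BA"           # thick triangular cord despite normal morphology
    return "non-BA"

print(classify_biliary_atresia(True, 2.0))   # BA
print(classify_biliary_atresia(False, 4.0))  # BA
print(classify_biliary_atresia(False, 2.5))  # non-BA
```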

  1. Ultrasonographic Diagnosis of Biliary Atresia Based on a Decision-Making Tree Model.

    Lee, So Mi; Cheon, Jung-Eun; Choi, Young Hun; Kim, Woo Sun; Cho, Hyun-Hae; Cho, Hyun-Hye; Kim, In-One; You, Sun Kyoung

    2015-01-01

    To assess the diagnostic value of various ultrasound (US) findings and to make a decision-tree model for US diagnosis of biliary atresia (BA). From March 2008 to January 2014, the following US findings were retrospectively evaluated in 100 infants with cholestatic jaundice (BA, n = 46; non-BA, n = 54): length and morphology of the gallbladder, triangular cord thickness, hepatic artery and portal vein diameters, and visualization of the common bile duct. Logistic regression analyses were performed to determine the features that would be useful in predicting BA. Conditional inference tree analysis was used to generate a decision-making tree for classifying patients into the BA or non-BA groups. Multivariate logistic regression analysis showed that abnormal gallbladder morphology and greater triangular cord thickness were significant predictors of BA (p = 0.003 and 0.001; adjusted odds ratio: 345.6 and 65.6, respectively). In the decision-making tree using conditional inference tree analysis, gallbladder morphology and triangular cord thickness (optimal cutoff value of triangular cord thickness, 3.4 mm) were also selected as significant discriminators for differential diagnosis of BA, and gallbladder morphology was the first discriminator. The diagnostic performance of the decision-making tree was excellent, with sensitivity of 100% (46/46), specificity of 94.4% (51/54), and overall accuracy of 97% (97/100). Abnormal gallbladder morphology and greater triangular cord thickness (> 3.4 mm) were the most useful predictors of BA on US. We suggest that the gallbladder morphology should be evaluated first and that triangular cord thickness should be evaluated subsequently in cases with normal gallbladder morphology.

  2. Ultrasonographic diagnosis of biliary atresia based on a decision-making tree model

    Lee, So Mi; Cheon, Jung Eun; Choi, Young Hun; Kim, Woo Sun; Cho, Hyun Hye; Kim, In One; You, Sun Kyoung [Dept. of Radiology, Seoul National University College of Medicine, Seoul (Korea, Republic of)

    2015-12-15

    To assess the diagnostic value of various ultrasound (US) findings and to make a decision-tree model for US diagnosis of biliary atresia (BA). From March 2008 to January 2014, the following US findings were retrospectively evaluated in 100 infants with cholestatic jaundice (BA, n = 46; non-BA, n = 54): length and morphology of the gallbladder, triangular cord thickness, hepatic artery and portal vein diameters, and visualization of the common bile duct. Logistic regression analyses were performed to determine the features that would be useful in predicting BA. Conditional inference tree analysis was used to generate a decision-making tree for classifying patients into the BA or non-BA groups. Multivariate logistic regression analysis showed that abnormal gallbladder morphology and greater triangular cord thickness were significant predictors of BA (p = 0.003 and 0.001; adjusted odds ratio: 345.6 and 65.6, respectively). In the decision-making tree using conditional inference tree analysis, gallbladder morphology and triangular cord thickness (optimal cutoff value of triangular cord thickness, 3.4 mm) were also selected as significant discriminators for differential diagnosis of BA, and gallbladder morphology was the first discriminator. The diagnostic performance of the decision-making tree was excellent, with sensitivity of 100% (46/46), specificity of 94.4% (51/54), and overall accuracy of 97% (97/100). Abnormal gallbladder morphology and greater triangular cord thickness (> 3.4 mm) were the most useful predictors of BA on US. We suggest that the gallbladder morphology should be evaluated first and that triangular cord thickness should be evaluated subsequently in cases with normal gallbladder morphology.
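
The reported two-step rule lends itself to a compact sketch. This is an illustrative reimplementation, not the authors' code; the function and argument names are invented, and only the ordering (gallbladder morphology first, then the 3.4 mm triangular cord cutoff) comes from the abstract:

```python
def classify_biliary_atresia(gallbladder_abnormal, triangular_cord_mm):
    """Two-step US decision rule: gallbladder morphology first, then
    triangular cord thickness (cutoff 3.4 mm) when morphology is normal."""
    if gallbladder_abnormal:            # first discriminator
        return "BA"
    if triangular_cord_mm > 3.4:        # second discriminator
        return "BA"
    return "non-BA"
```

Thresholding on thickness only when morphology is normal mirrors how the conditional inference tree splits the data.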

  3. Multimodality Inferring of Human Cognitive States Based on Integration of Neuro-Fuzzy Network and Information Fusion Techniques

    P. Bhattacharya

    2007-11-01

To achieve effective and safe operation of machine systems in which human and machine interact, the machine needs to understand the human's state, especially the cognitive state, when the operation task demands intensive cognitive activity. Because human cognitive states, behaviors, and expressions or cues are highly uncertain, the recent trend in inferring the human state is to consider multimodal features of the human operator. In this paper, we present a method for multimodal inference of human cognitive states by integrating neuro-fuzzy network and information fusion techniques. To demonstrate the effectiveness of this method, we take driver fatigue detection as an example. The proposed method has, in particular, the following new features. First, human expressions are classified into four categories: (i) casual or contextual features, (ii) contact features, (iii) contactless features, and (iv) performance features. Second, the fuzzy neural network technique, in particular the Takagi-Sugeno-Kang (TSK) model, is employed to cope with uncertain behaviors. Third, the sensor fusion technique, in particular ordered weighted aggregation (OWA), is integrated with the TSK model in such a way that cues are taken as inputs to the TSK model, and the outputs of the TSK model are then fused by the OWA operator, which gives outputs corresponding to particular cognitive states of interest (e.g., fatigue). We call this method TSK-OWA. Validation of TSK-OWA, performed in the Northeastern University vehicle driving simulator, has shown that the proposed method is promising as a general tool for inferring human cognitive states and as a special tool for driver fatigue detection.
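
The fusion step can be illustrated in isolation. The sketch below omits the TSK fuzzy-rule machinery entirely and shows only the ordered weighted aggregation operator that combines the sub-model outputs; the weights and fatigue scores are made-up examples:

```python
def owa(values, weights):
    """Ordered weighted averaging: the i-th weight applies to the i-th
    largest input, regardless of which source produced it."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    ordered = sorted(values, reverse=True)
    return sum(w * v for w, v in zip(weights, ordered))

# Fuse hypothetical fatigue scores from three TSK sub-models.
fatigue = owa([0.9, 0.4, 0.7], [0.5, 0.3, 0.2])
```

Because the weights attach to ranks rather than sources, OWA can interpolate between a "max" operator (all weight on the largest input) and a "min" operator (all weight on the smallest).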

  4. Causal inference of asynchronous audiovisual speech

    John F Magnotti

    2013-11-01

During speech perception, humans integrate auditory information from the voice with visual information from the face. This multisensory integration increases perceptual precision, but only if the two cues come from the same talker; this requirement has been largely ignored by current models of speech perception. We describe a generative model of multisensory speech perception that includes this critical step of determining the likelihood that the voice and face information have a common cause. A key feature of the model is that it is based on a principled analysis of how an observer should solve this causal inference problem using the asynchrony between two cues and the reliability of the cues. This allows the model to make predictions about the behavior of subjects performing a synchrony judgment task, predictive power that does not exist in other approaches, such as post hoc fitting of Gaussian curves to behavioral data. We tested the model predictions against the performance of 37 subjects performing a synchrony judgment task viewing audiovisual speech under a variety of manipulations, including varying asynchronies, intelligibility, and visual cue reliability. The causal inference model outperformed the Gaussian model across two experiments, providing a better fit to the behavioral data with fewer parameters. Because the causal inference model is derived from a principled understanding of the task, model parameters are directly interpretable in terms of stimulus and subject properties.
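
A minimal version of such a causal inference step can be sketched as follows. The distributions and parameter values are illustrative assumptions (Gaussian asynchrony under a common cause, uniform under separate causes), not the paper's fitted model:

```python
import math

def p_common_cause(asynchrony_ms, prior=0.5, sigma=80.0, window=1000.0):
    """Posterior probability that voice and face share a common cause.
    Common cause: asynchrony ~ Normal(0, sigma) (sensory noise only).
    Separate causes: asynchrony ~ Uniform(-window/2, window/2)."""
    like_same = (math.exp(-asynchrony_ms ** 2 / (2.0 * sigma ** 2))
                 / (sigma * math.sqrt(2.0 * math.pi)))
    like_diff = 1.0 / window
    joint_same = like_same * prior
    return joint_same / (joint_same + like_diff * (1.0 - prior))
```

A "synchronous" judgment then corresponds to this posterior exceeding some decision criterion, so the model naturally predicts fewer synchrony reports as measured asynchrony grows.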

  5. Active inference, communication and hermeneutics.

    Friston, Karl J; Frith, Christopher D

    2015-07-01

Hermeneutics refers to interpretation and translation of text (typically ancient scriptures) but also applies to verbal and non-verbal communication. In a psychological setting it nicely frames the problem of inferring the intended content of a communication. In this paper, we offer a solution to the problem of neural hermeneutics based upon active inference. In active inference, action fulfils predictions about how we will behave (e.g., predicting we will speak). Crucially, these predictions can be used to predict both self and others--during speaking and listening respectively. Active inference mandates the suppression of prediction errors by updating an internal model that generates predictions--both at fast timescales (through perceptual inference) and slower timescales (through perceptual learning). If two agents adopt the same model, then--in principle--they can predict each other and minimise their mutual prediction errors. Heuristically, this ensures they are singing from the same hymn sheet. This paper builds upon recent work on active inference and communication to illustrate perceptual learning using simulated birdsongs. Our focus here is the neural hermeneutics implicit in learning, where communication facilitates long-term changes in generative models that are trying to predict each other. In other words, communication induces perceptual learning and enables others to (literally) change our minds and vice versa. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Triptycene-based ladder monomers and polymers, methods of making each, and methods of use

    Pinnau, Ingo

    2015-02-05

Embodiments of the present disclosure provide for a triptycene-based A-B monomer, a method of making a triptycene-based A-B monomer, a triptycene-based ladder polymer, a method of making triptycene-based ladder polymers, a method of using triptycene-based ladder polymers, a structure incorporating triptycene-based ladder polymers, a method of gas separation, and the like.

  7. Validating evidence based decision making in health care

    Nüssler, Emil Karl; Eskildsen, Jacob Kjær; Håkonsson, Dorthe Døjbak

Surgeons who perform prolapse surgeries face the dilemma of choosing to use mesh, with its assumed benefits, and the risks associated with mesh. In this paper, we examine whether decisions to use mesh are evidence based. Based on data from 30,398 patients in the Swedish National Quality Register of Gynecological Surgery, we examine factors related to decisions to use mesh. Our results indicate that decisions to use mesh are not evidence based and can be explained neither by FDA safety communications nor by the medical conditions usually assumed to predict its usage. Instead, decisions to use mesh are highly influenced by the geographical placement of surgeons. Therefore, decisions to use mesh are boundedly rational, rather than rational.

  8. Optimal policy for value-based decision-making.

    Tajima, Satohiro; Drugowitsch, Jan; Pouget, Alexandre

    2016-08-18

    For decades now, normative theories of perceptual decisions, and their implementation as drift diffusion models, have driven and significantly improved our understanding of human and animal behaviour and the underlying neural processes. While similar processes seem to govern value-based decisions, we still lack the theoretical understanding of why this ought to be the case. Here, we show that, similar to perceptual decisions, drift diffusion models implement the optimal strategy for value-based decisions. Such optimal decisions require the models' decision boundaries to collapse over time, and to depend on the a priori knowledge about reward contingencies. Diffusion models only implement the optimal strategy under specific task assumptions, and cease to be optimal once we start relaxing these assumptions, by, for example, using non-linear utility functions. Our findings thus provide the much-needed theory for value-based decisions, explain the apparent similarity to perceptual decisions, and predict conditions under which this similarity should break down.
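
The key ingredient, decision bounds that collapse over time, can be sketched with a simple simulation. The exponential collapse schedule and all parameter values here are illustrative choices, not the paper's derived optimal boundary:

```python
import math
import random

def ddm_trial(drift, rng, dt=0.001, sigma=1.0, b0=1.0, tau=2.0):
    """One drift-diffusion trial with exponentially collapsing bounds.
    Returns (choice, reaction_time)."""
    x, t = 0.0, 0.0
    while True:
        t += dt
        x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        bound = b0 * math.exp(-t / tau)      # bound shrinks toward 0 over time
        if x >= bound:
            return +1, t
        if x <= -bound:
            return -1, t

# With a positive drift (evidence favours option +1), most trials
# should terminate at the upper bound.
rng = random.Random(0)
choices = [ddm_trial(1.5, rng)[0] for _ in range(200)]
accuracy = choices.count(+1) / len(choices)
```

Collapsing bounds guarantee that every trial eventually terminates, which is one intuition for why an optimal value-based decider should become less demanding as time passes.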

  9. A large-scale RF-based Indoor Localization System Using Low-complexity Gaussian filter and improved Bayesian inference

    L. Xiao

    2013-04-01

The growing convergence of mobile computing devices and smart sensors boosts the development of ubiquitous computing and smart spaces, where localization is essential to realizing the larger vision. General localization methods based on GPS and cellular techniques are not suitable for tracking numerous small, power-limited objects indoors. In this paper, we propose and demonstrate a new localization method: an easy-to-set-up and cost-effective indoor localization system based on off-the-shelf active RFID technology. Our system is not only compatible with future smart spaces and ubiquitous computing systems but also suitable for large-scale indoor localization. The use of a low-complexity Gaussian Filter (GF), a Wheel Graph Model (WGM), and a Probabilistic Localization Algorithm (PLA) makes the proposed algorithm robust to uncertainty, self-adjusting to varying indoor environments, and suitable for large-scale indoor positioning. Using MATLAB simulation, we study the system's performance, especially its dependence on a number of system and environment parameters, and their statistical properties. The simulation results show that the proposed system is an accurate and cost-effective candidate for indoor localization.
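
One common low-complexity reading of such a pipeline, offered here as an illustrative sketch rather than the paper's exact GF/WGM/PLA design, is to smooth noisy RSSI readings by discarding outliers and then run a Bayesian update over discrete location cells:

```python
import math

def gaussian_filter(rssi_samples, k=1.0):
    """Drop RSSI outliers outside mean +/- k*std, then average the rest
    (a common low-complexity smoothing for noisy RFID readings)."""
    n = len(rssi_samples)
    mean = sum(rssi_samples) / n
    std = math.sqrt(sum((r - mean) ** 2 for r in rssi_samples) / n)
    kept = [r for r in rssi_samples if abs(r - mean) <= k * std] or rssi_samples
    return sum(kept) / len(kept)

def bayes_cell_posterior(filtered_rssi, expected_rssi_per_cell, sigma=4.0):
    """Posterior over candidate cells given one filtered reading,
    assuming Gaussian measurement noise and a uniform prior."""
    like = [math.exp(-(filtered_rssi - mu) ** 2 / (2.0 * sigma ** 2))
            for mu in expected_rssi_per_cell]
    z = sum(like)
    return [l / z for l in like]
```

Filtering before the Bayesian step keeps the inference cheap: the posterior is computed from a single smoothed value per tag instead of the full sample stream.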

  10. Practical risk-based decision making: Good decisions made efficiently

    Haire, M.J.; Guthrie, V.; Walker, D.; Singer, R.

    1995-01-01

The Robotics and Process Systems Division of the Oak Ridge National Laboratory and the Westinghouse Savannah River Company have teamed with JBF Associates, Inc. to address risk-based robotic planning. The objective of the project is to provide systematic, risk-based relative comparisons of competing alternatives for solving clean-up problems at DOE facilities. This paper presents the methodology developed, describes the software developed to efficiently apply the methodology, and discusses the results of initial applications for DOE. The paper also addresses current work in applying the approach to problems in other industries (including an example from the hydrocarbon processing industry).

  11. Strategy selection in cue-based decision making.

    Bryant, David J

    2014-06-01

    People can make use of a range of heuristic and rational, compensatory strategies to perform a multiple-cue judgment task. It has been proposed that people are sensitive to the amount of cognitive effort required to employ decision strategies. Experiment 1 employed a dual-task methodology to investigate whether participants' preference for heuristic versus compensatory decision strategies can be altered by increasing the cognitive demands of the task. As indicated by participants' decision times, a secondary task interfered more with the performance of a heuristic than compensatory decision strategy but did not affect the proportions of participants using either type of strategy. A stimulus set effect suggested that the conjunction of cue salience and cue validity might play a determining role in strategy selection. The results of Experiment 2 indicated that when a perceptually salient cue was also the most valid, the majority of participants preferred a single-cue heuristic strategy. Overall, the results contradict the view that heuristics are more likely to be adopted when a task is made more cognitively demanding. It is argued that people employ 2 learning processes during training, one an associative learning process in which cue-outcome associations are developed by sampling multiple cues, and another that involves the sequential examination of single cues to serve as a basis for a single-cue heuristic.
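
The contrast between a single-cue heuristic and a compensatory strategy can be made concrete. This is a generic sketch of the two strategy families discussed (a take-the-best-style heuristic versus weighted-additive integration), not the stimuli used in the experiments:

```python
def single_cue(cues_a, cues_b, validities):
    """Heuristic: decide on the single most valid cue that discriminates."""
    for i in sorted(range(len(validities)), key=lambda i: -validities[i]):
        if cues_a[i] != cues_b[i]:
            return "A" if cues_a[i] > cues_b[i] else "B"
    return "tie"

def weighted_additive(cues_a, cues_b, validities):
    """Compensatory: sum every validity-weighted cue before deciding."""
    score_a = sum(v * c for v, c in zip(validities, cues_a))
    score_b = sum(v * c for v, c in zip(validities, cues_b))
    if score_a == score_b:
        return "tie"
    return "A" if score_a > score_b else "B"
```

The two strategies can disagree when several moderately valid cues jointly outweigh the single best cue, which is exactly the kind of item that reveals which strategy a participant is using.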

  12. Decision making in advanced otosclerosis: an evidence-based strategy

    Merkus, P.; van Loon, M.C.; Smit, C.F.G.M.; Smits, J.C.M.; de Cock, A.F.C.; Hensen, E.F.

    2011-01-01

    Objectives/Hypothesis: To propose an evidence-based strategy for the management of patients with advanced otosclerosis accompanied by severe to profound hearing loss. Study Design: Systematic review of the literature and development of treatment guidelines. Methods: A systematic review was conducted

  13. Understanding Elementary Astronomy by Making Drawing-Based Models

    van Joolingen, Wouter; Aukes, A.V.A.; Gijlers, Aaltje H.; Bollen, Lars

    2015-01-01

Modeling is an important approach in the teaching and learning of science. In this study, we attempt to bring modeling within the reach of young children by creating the SimSketch modeling system, which is based on freehand drawings that can be turned into simulations. This system was used by 247 children (ages ranging from 7 to 15) to create a…

  14. Understanding Elementary Astronomy by Making Drawing-Based Models

    van Joolingen, W. R.; Aukes, Annika V.; Gijlers, H.; Bollen, L.

    2015-01-01

    Modeling is an important approach in the teaching and learning of science. In this study, we attempt to bring modeling within the reach of young children by creating the SimSketch modeling system, which is based on freehand drawings that can be turned into simulations. This system was used by 247 children (ages ranging from 7 to 15) to create a…

  15. Likelihood-based inference for discretely observed birth-death-shift processes, with applications to evolution of mobile genetic elements.

    Xu, Jason; Guttorp, Peter; Kato-Maeda, Midori; Minin, Vladimir N

    2015-12-01

Continuous-time birth-death-shift (BDS) processes are frequently used in stochastic modeling, with many applications in ecology and epidemiology. In particular, such processes can model evolutionary dynamics of transposable elements, important genetic markers in molecular epidemiology. Estimation of the effects of individual covariates on the birth, death, and shift rates of the process can be accomplished by analyzing patient data, but inferring these rates in a discretely and unevenly observed setting presents computational challenges. We propose a multi-type branching process approximation to BDS processes and develop a corresponding expectation maximization algorithm, where we use spectral techniques to reduce calculation of expected sufficient statistics to low-dimensional integration. These techniques yield an efficient and robust optimization routine for inferring the rates of the BDS process, and apply broadly to multi-type branching processes whose rates can depend on many covariates. After rigorously testing our methodology in simulation studies, we apply our method to study intrapatient time evolution of the IS6110 transposable element, a genetic marker frequently used during estimation of epidemiological clusters of Mycobacterium tuberculosis infections. © 2015, The International Biometric Society.

  16. Could one make a diamond-based quantum computer?

    Stoneham, A Marshall; Harker, A H; Morley, Gavin W

    2009-01-01

We assess routes to a diamond-based quantum computer, where we specifically look towards scalable devices, with at least 10 linked quantum gates. Such a computer should satisfy the DiVincenzo criteria and might be used at convenient temperatures. The specific examples that we examine are based on the optical control of electron spins. For some such devices, nuclear spins give additional advantages. Since there have already been demonstrations of basic initialization and readout, our emphasis is on routes to two-qubit quantum gate operations and the linking of perhaps 10-20 such gates. We analyse the dopant properties necessary, especially centres containing N and P, and give results using simple scoping calculations for the key interactions determining gate performance. Our conclusions are cautiously optimistic: it may be possible to develop a useful quantum information processor that works above cryogenic temperatures.

  17. Optimal inference with suboptimal models: Addiction and active Bayesian inference

    Schwartenbeck, Philipp; FitzGerald, Thomas H.B.; Mathys, Christoph; Dolan, Ray; Wurst, Friedrich; Kronbichler, Martin; Friston, Karl

    2015-01-01

    When casting behaviour as active (Bayesian) inference, optimal inference is defined with respect to an agent’s beliefs – based on its generative model of the world. This contrasts with normative accounts of choice behaviour, in which optimal actions are considered in relation to the true structure of the environment – as opposed to the agent’s beliefs about worldly states (or the task). This distinction shifts an understanding of suboptimal or pathological behaviour away from aberrant inference as such, to understanding the prior beliefs of a subject that cause them to behave less ‘optimally’ than our prior beliefs suggest they should behave. Put simply, suboptimal or pathological behaviour does not speak against understanding behaviour in terms of (Bayes optimal) inference, but rather calls for a more refined understanding of the subject’s generative model upon which their (optimal) Bayesian inference is based. Here, we discuss this fundamental distinction and its implications for understanding optimality, bounded rationality and pathological (choice) behaviour. We illustrate our argument using addictive choice behaviour in a recently described ‘limited offer’ task. Our simulations of pathological choices and addictive behaviour also generate some clear hypotheses, which we hope to pursue in ongoing empirical work. PMID:25561321

  18. Daytime, low latitude, vertical ExB drift velocities, inferred from ground-based magnetometer observations in the Peruvian, Philippine and Indian longitude sectors under quiet and disturbed conditions

    Anderson, D; Chau, J; Yumoto, K; Bhattacharya, A; Alex, S

    2006-01-01


  19. Making Value-Based Payment Work for Academic Health Centers.

    Miller, Harold D

    2015-10-01

    Under fee-for-service payment systems, physicians and hospitals can be financially harmed by delivering higher-quality, more efficient care. The author describes how current "value-based purchasing" initiatives fail to address the underlying problems in fee-for-service payment and can be particularly problematic for academic health centers (AHCs). Bundled payments, warranties, and condition-based payments can correct the problems with fee-for-service payments and enable physicians and hospitals to redesign care delivery without causing financial problems for themselves. However, the author explains several specific actions that are needed to ensure that payment reforms can be a "win-win-win" for patients, purchasers, and AHCs: (1) disconnecting funding for teaching and research from payment for service delivery, (2) providing predictable payment for essential hospital services, (3) improving the quality and efficiency of care at AHCs, and (4) supporting collaborative relationships between AHCs and community providers by allowing each to focus on their unique strengths and by paying AHC specialists to assist community providers in diagnosis and treatment. With appropriate payment reforms and a commitment by AHCs to redesign care delivery, medical education, and research, AHCs could provide the leadership needed to improve care for patients, lower costs for health care purchasers, and maintain the financial viability of both AHCs and community providers.

  20. Environmental performance of straw-based pulp making: A life cycle perspective.

    Sun, Mingxing; Wang, Yutao; Shi, Lei

    2018-03-01

Agricultural straw-based pulp making plays a vital role in the pulp and paper industry, especially in forest-deficient countries such as China. However, the environmental performance of straw-based pulp has scarcely been studied. A life cycle assessment of wheat straw-based pulp making in China was conducted to fill the gaps in comprehensive environmental assessments of agricultural straw-based pulp making. On average, the global warming potential (GWP), GWP excluding biogenic carbon, acidification potential, and eutrophication potential of wheat straw-based pulp making are 2299 kg CO2-eq, 4550 kg CO2-eq, 16.43 kg SO2-eq, and 2.56 kg phosphate-eq, respectively. The dominant factors contributing to environmental impacts are coal consumption, electricity consumption, and chemical (NaOH, ClO2) input. Decreasing chemical input and increasing energy recovery reduce the total environmental impacts dramatically. Compared with wood-based and recycled pulp making, wheat straw-based pulp making has higher environmental impacts, mainly due to higher energy and chemical requirements. However, the environmental impacts of wheat straw-based pulp making are lower than those of hemp- and flax-based pulp making reported in previous studies. It is also noteworthy that biogenic carbon emission is significant in bio-industries. If carbon sequestration is taken into account in the pulp making industry, wheat straw-based pulp making is a net emitter rather than a net absorber of carbon dioxide. Since wheat straw-based pulp making provides an alternative for agricultural residue management, its evaluation framework should be expanded to further reveal its environmental benefits. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Clustering economies based on multiple criteria decision making techniques

    Mansour Momeni

    2011-10-01

A primary concern in many countries is to determine the important factors affecting economic growth. In this paper, we study factors such as unemployment rate, inflation ratio, population growth, and average annual income, among others, to cluster different countries. The proposed model uses the analytical hierarchy process (AHP) to prioritize the criteria and then uses a K-means technique to cluster 59 countries into four groups based on the ranked criteria. The first group includes countries with high standards such as Germany and Japan. The second cluster contains developing countries with relatively good economic growth, such as Saudi Arabia and Iran. The third cluster belongs to countries with faster rates of growth than the countries in the second group, such as China, India, and Mexico. Finally, the fourth cluster includes countries with relatively very low rates of growth, such as Jordan, Mali, and Niger.
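
The two-stage procedure (AHP to weight the criteria, then K-means on the weighted data) can be sketched as follows. The pairwise judgments and data points are invented, and the AHP priorities are approximated by row geometric means rather than the full eigenvector method:

```python
import math
import random

def ahp_weights(pairwise):
    """Approximate AHP priority vector via row geometric means."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's algorithm on tuples of floats."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        centers = [tuple(sum(col) / len(c) for col in zip(*c)) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Hypothetical example: two criteria, the first judged 3x as important.
w = ahp_weights([[1.0, 3.0], [1.0 / 3.0, 1.0]])
data = [(1.0, 10.0), (1.2, 11.0), (8.0, 2.0), (8.3, 1.5)]
scaled = [tuple(wi * x for wi, x in zip(w, p)) for p in data]
centers, clusters = kmeans(scaled, 2)
```

Scaling the features by the AHP priorities before clustering is what lets the ranked criteria, rather than raw magnitudes, drive the grouping.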

  2. Anytime decision making based on unconstrained influence diagrams

    Luque, Manuel; Nielsen, Thomas Dyhre; Jensen, Finn Verner

    2016-01-01

Unconstrained influence diagrams extend the language of influence diagrams to cope with decision problems in which the order of the decisions is unspecified. Thus, when solving an unconstrained influence diagram we not only look for an optimal policy for each decision, but also for a so-called step-policy specifying the next decision given the observations made so far. However, due to the complexity of the problem, temporal constraints can force the decision maker to act before the solution algorithm has finished, and, in particular, before an optimal policy for the first decision has been computed. This paper addresses this problem by proposing an anytime algorithm that at any time provides a qualified recommendation for the first decisions of the problem. The algorithm performs a heuristic-based search in a decision tree representation of the problem. We provide a framework for analyzing…

  3. A human genome-wide library of local phylogeny predictions for whole-genome inference problems

    Schwartz Russell

    2008-08-01

Background: Many common inference problems in computational genetics depend on inferring aspects of the evolutionary history of a data set given a set of observed modern sequences. Detailed predictions of the full phylogenies are therefore of value in improving our ability to make further inferences about population history and sources of genetic variation. Making phylogenetic predictions on the scale needed for whole-genome analysis is, however, extremely computationally demanding. Results: In order to facilitate phylogeny-based predictions on a genomic scale, we develop a library of maximum parsimony phylogenies within local regions spanning all autosomal human chromosomes, based on Haplotype Map variation data. We demonstrate the utility of this library for population genetic inferences by examining a tree statistic we call 'imperfection,' which measures the reuse of variant sites within a phylogeny. This statistic is significantly predictive of recombination rate, shows additional regional and population-specific conservation, and allows us to identify outlier genes likely to have experienced unusual amounts of variation in recent human history. Conclusion: Recent theoretical advances in algorithms for phylogenetic tree reconstruction have made it possible to perform large-scale inferences of local maximum parsimony phylogenies from single nucleotide polymorphism (SNP) data. As results from the imperfection statistic demonstrate, phylogeny predictions encode substantial information useful for detecting genomic features and population history. This data set should serve as a platform for many kinds of inferences one may wish to make about human population history and genetic variation.
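
The 'imperfection' idea, counting how often a variant site must change beyond the single mutation a perfect phylogeny would need, can be illustrated with Fitch's small-parsimony algorithm on a toy tree. The tree encoding and example data are invented for illustration:

```python
def fitch_score(tree, states):
    """Minimum number of state changes for one site on a rooted binary
    tree (Fitch's small-parsimony algorithm). Leaves are strings;
    internal nodes are 2-tuples; `states` maps leaf name -> allele."""
    def rec(node):
        if isinstance(node, str):
            return {states[node]}, 0
        (s1, c1), (s2, c2) = rec(node[0]), rec(node[1])
        shared = s1 & s2
        if shared:                       # subtrees can agree: no change
            return shared, c1 + c2
        return s1 | s2, c1 + c2 + 1      # disagreement forces a mutation
    return rec(tree)[1]

def imperfection(tree, sites):
    """Changes beyond the single mutation a perfect phylogeny needs,
    summed over variant sites."""
    return sum(fitch_score(tree, s) - 1 for s in sites)

site = {'a': 0, 'b': 0, 'c': 1, 'd': 1}
perfect = fitch_score((('a', 'b'), ('c', 'd')), site)   # site fits the tree
tangled = fitch_score((('a', 'c'), ('b', 'd')), site)   # site must be reused
```

A site that fits its local tree contributes nothing to imperfection; recurrent mutation or recombination forces site reuse, which is why the statistic tracks recombination rate.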

  4. A subjective framework for seat comfort based on a heuristic multi criteria decision making technique and anthropometry.

    Fazlollahtabar, Hamed

    2010-12-01

Consumer expectations for automobile seat comfort continue to rise. It is therefore evident that the current automobile seat comfort development process, which is only sporadically successful, needs to change. In this context, there has been growing recognition of the need to establish theoretical and methodological foundations for automobile seat comfort. Seat producers, in turn, need to know the comfort that customers require in order to produce seats matching their interests. Current research methodologies apply qualitative approaches because of the anthropometric specifications involved. The most significant weakness of these approaches is the inexactness of the extracted inferences. Despite the qualitative nature of consumers' preferences, there are methods that transform qualitative parameters into numerical values, which can help seat producers improve or enhance their products. This approach can also help automobile manufacturers source seats from the best producer according to consumers' opinions. In this paper, a heuristic multi-criteria decision-making technique is applied to express consumers' preferences as numeric values. This technique is a combination of the Analytical Hierarchy Process (AHP), the entropy method, and the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). A case study is conducted to illustrate the applicability and effectiveness of the proposed heuristic approach. Copyright © 2010 Elsevier Ltd. All rights reserved.
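
The TOPSIS stage of such a hybrid can be sketched on its own. The seat ratings, weights, and criterion directions below are invented, and the AHP and entropy weighting steps are omitted:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives: vector-normalise each criterion, apply weights,
    then score each row by its closeness to the ideal solution."""
    cols = list(zip(*matrix))
    norms = [math.sqrt(sum(v * v for v in col)) for col in cols]
    weighted = [[w * v / n for v, w, n in zip(row, weights, norms)]
                for row in matrix]
    wcols = list(zip(*weighted))
    ideal = [max(c) if b else min(c) for c, b in zip(wcols, benefit)]
    anti = [min(c) if b else max(c) for c, b in zip(wcols, benefit)]
    scores = []
    for row in weighted:
        d_pos = math.sqrt(sum((v - i) ** 2 for v, i in zip(row, ideal)))
        d_neg = math.sqrt(sum((v - a) ** 2 for v, a in zip(row, anti)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Three hypothetical seats rated on comfort (higher is better) and
# cost (lower is better).
scores = topsis([[8.0, 300.0], [6.0, 200.0], [9.0, 450.0]],
                weights=[0.6, 0.4], benefit=[True, False])
```

In a full AHP-entropy-TOPSIS pipeline, the `weights` argument would come from the AHP and entropy stages rather than being fixed by hand.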

  5. Probability biases as Bayesian inference

    Andre; C. R. Martins

    2006-11-01

In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors under the normative view, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as they have been observed to.
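
One way to see how Bayesian reasoning with uncertain probabilities can reproduce a bias is the following toy sketch (my own illustration, not the article's model): treating a stated probability as noisy evidence combined with a Beta prior shrinks extremes toward the middle, overweighting small probabilities and underweighting large ones much as empirical weighting functions do:

```python
def perceived_probability(p, prior_a=1.0, prior_b=1.0, n=10):
    """Treat a stated probability p as if it were estimated from n noisy
    observations and combine it with a Beta(prior_a, prior_b) prior.
    The posterior mean shrinks extreme probabilities toward the prior
    mean, producing an inverse-S distortion of p."""
    return (prior_a + n * p) / (prior_a + prior_b + n)
```

Under this view the distortion is not an error in the mechanism but the signature of a prior over probabilities that are themselves uncertain.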

  6. Evidence-based decision making in health care settings: from theory to practice.

    Kohn, Melanie Kazman; Berta, Whitney; Langley, Ann; Davis, David

    2011-01-01

    The relatively recent attention that evidence-based decision making has received in health care management has been at least in part due to the profound influence of evidence-based medicine. The result has been several comparisons in the literature between the use of evidence in health care management decisions and the use of evidence in medical decision making. Direct comparison, however, may be problematic, given the differences between medicine and management as they relate to (1) the nature of evidence that is brought to bear on decision making; (2) the maturity of empirical research in each field (in particular, studies that have substantiated whether or not and how evidence-based decision making is enacted); and (3) the context within which evidence-based decisions are made. By simultaneously reviewing evidence-based medicine and management, this chapter aims to inform future theorizing and empirical research on evidence-based decision making in health care settings.

  7. Exercise Sensing and Pose Recovery Inference Tool (ESPRIT) - A Compact Stereo-based Motion Capture Solution For Exercise Monitoring

    Lee, Mun Wai

    2015-01-01

    Crew exercise is important during long-duration space flight not only for maintaining health and fitness but also for preventing adverse health problems, such as losses in muscle strength and bone density. Monitoring crew exercise via motion capture and kinematic analysis aids understanding of the effects of microgravity on exercise and helps ensure that exercise prescriptions are effective. Intelligent Automation, Inc., has developed ESPRIT to monitor exercise activities, detect body markers, extract image features, and recover three-dimensional (3D) kinematic body poses. The system relies on prior knowledge and modeling of the human body and on advanced statistical inference techniques to achieve robust and accurate motion capture. In Phase I, the company demonstrated motion capture of several exercises, including walking, curling, and dead lifting. Phase II efforts focused on enhancing algorithms and delivering an ESPRIT prototype for testing and demonstration.

  8. The Impact of Contextual Clue Selection on Inference

    Leila Barati

    2010-05-01

Linguistic information can be conveyed as speech or written text, but it is the content of the message that is ultimately essential for higher-level processes in language comprehension, such as making inferences and associations between text information and knowledge about the world. Linguistically, inference is the shovel that allows receivers to dig meaning out of the text by selecting different embedded contextual clues. Naturally, people with different world experiences infer similar contextual situations differently, and lack of contextual knowledge of the target language can present an obstacle to comprehension (Anderson & Lynch, 2003). This paper investigates how accurate selection of contextual clues from the text can influence the listener's inference. In the present study, 60 male and female teenagers (13-19) and 60 male and female young adults (20-26) were selected randomly based on the Oxford Placement Test (OPT). Two fiction and two non-fiction passages were read to the participants in the experimental and control groups, respectively, and they were scored according to the Lexile Score (LS)[1] based on their correct inferences and logical thinking ability. Overall, the results show that participants' clue selection, based on their personal schematic references and background knowledge, differs between teenagers and young adults and influences inference and listening comprehension. [1] A framework for reading and listening that matches a score to each text according to its degree of difficulty; each text was given a Lexile score from zero to four.

  9. Lower complexity bounds for lifted inference

    Jaeger, Manfred

    2015-01-01

    instances of the model. Numerous approaches for such “lifted inference” techniques have been proposed. While it has been demonstrated that these techniques will lead to significantly more efficient inference on some specific models, there are only very recent and still quite restricted results that show...... the feasibility of lifted inference on certain syntactically defined classes of models. Lower complexity bounds that imply some limitations for the feasibility of lifted inference on more expressive model classes were established earlier in Jaeger (2000; Jaeger, M. 2000. On the complexity of inference about...... that under the assumption that NETIME≠ETIME, there is no polynomial lifted inference algorithm for knowledge bases of weighted, quantifier-, and function-free formulas. Further strengthening earlier results, this is also shown to hold for approximate inference and for knowledge bases not containing...

  10. Data-driven inference for the spatial scan statistic

    Duczmal Luiz H

    2011-08-01

Background: Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. Results: A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is offered, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, regarding the correctness of the decision based on this inference. Conclusions: A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.

  11. Data-driven inference for the spatial scan statistic.

    Almeida, Alexandre C L; Duarte, Anderson R; Duczmal, Luiz H; Oliveira, Fernando L P; Takahashi, Ricardo H C

    2011-08-02

Kulldorff's spatial scan statistic for aggregated area maps searches for clusters of cases without specifying their size (number of areas) or geographic location in advance. Their statistical significance is tested while adjusting for the multiple testing inherent in such a procedure. However, as is shown in this work, this adjustment is not done in an even manner for all possible cluster sizes. A modification is proposed to the usual inference test of the spatial scan statistic, incorporating additional information about the size of the most likely cluster found. A new interpretation of the results of the spatial scan statistic is offered, posing a modified inference question: what is the probability that the null hypothesis is rejected for the original observed cases map with a most likely cluster of size k, taking into account only those most likely clusters of size k found under the null hypothesis for comparison? This question is especially important when the p-value computed by the usual inference process is near the alpha significance level, regarding the correctness of the decision based on this inference. A practical procedure is provided to make more accurate inferences about the most likely cluster found by the spatial scan statistic.
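The size-conditioned p-value described in the abstract can be sketched as a small Monte Carlo procedure. The toy below uses a 1-D map, contiguous windows, and the Poisson-based likelihood ratio; the function names and the simplified window scan are illustrative assumptions, not the authors' implementation:

```python
import math
import random

def scan_stat(cases, pop):
    """Most likely cluster over contiguous windows of a 1-D map.

    Returns (max log-likelihood ratio, window size k) using the
    Poisson-based statistic of the spatial scan."""
    C, P, n = sum(cases), sum(pop), len(cases)
    best, best_k = 0.0, 0
    for i in range(n):
        c = p = 0.0
        for j in range(i, n):
            c += cases[j]
            p += pop[j]
            e = C * p / P                     # expected cases in the window
            if e < c < C:
                llr = c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))
                if llr > best:
                    best, best_k = llr, j - i + 1
    return best, best_k

def data_driven_p(cases, pop, n_sim=999, seed=1):
    """Usual Monte Carlo p-value and the modified, size-conditioned p-value
    that compares the observed statistic only against null replicates whose
    most likely cluster has the same size k."""
    rng = random.Random(seed)
    obs, k_obs = scan_stat(cases, pop)
    C, P = sum(cases), sum(pop)
    weights = [p / P for p in pop]
    ge_all = n_all = 1                        # the observed map counts once
    ge_k = n_k = 1
    for _ in range(n_sim):
        sim = [0] * len(pop)
        for _ in range(C):                    # multinomial null: cases follow population
            r, acc = rng.random(), 0.0
            for idx, w in enumerate(weights):
                acc += w
                if r < acc:
                    sim[idx] += 1
                    break
            else:
                sim[-1] += 1                  # guard against float round-off
        s, k = scan_stat(sim, pop)
        n_all += 1
        ge_all += s >= obs
        if k == k_obs:                        # condition on the MLC size
            n_k += 1
            ge_k += s >= obs
    return ge_all / n_all, ge_k / n_k
```

On a toy map with an obvious two-region hotspot both p-values come out small, but the two can diverge when the observed statistic sits near the significance threshold, which is exactly the situation the abstract highlights.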

  12. Ground Field-Based Hyperspectral Imaging: A Preliminary Study to Assess the Potential of Established Vegetation Indices to Infer Variation in Water-Use Efficiency.

    Pelech, E. A.; McGrath, J.; Pederson, T.; Bernacchi, C.

    2017-12-01

Increases in the global average temperature will induce a higher occurrence of severe environmental conditions, such as drought, on arable land. To mitigate these threats, crops for fuel and food must be bred for higher water-use efficiency (WUE). Defining genomic variation through high-throughput phenotypic analysis in field conditions has the potential to relieve the major bottleneck in linking desirable genetic traits to the associated phenotypic response, subsequently enabling breeders to create new agricultural germplasm that supports the need for more water-use-efficient crops. From satellites to field-based aerial and ground sensors, measuring the reflectance properties of vegetation by hyperspectral imaging is becoming a rapid high-throughput phenotyping technique. A variety of physiological traits can be inferred by regression analysis of leaf reflectance, which is controlled by the properties and abundance of water, carbon, nitrogen, and pigments. However, given that the current established vegetation indices are designed to accentuate these properties from spectral reflectance, it is a challenge to infer relative measurements of WUE at the crop canopy scale without ground-truth data collection. This study aims to correlate established biomass and canopy-water-content indices with ground-truth data. Five bioenergy sorghum genotypes (Sorghum bicolor L. Moench) that differ in WUE and wild-type tobacco (Nicotiana tabacum var. Samsun) were examined under irrigated and rainfed field conditions. A linear regression analysis was conducted to determine whether variation in canopy water content and biomass, driven by natural genotypic and artificial treatment influences, can be inferred using established vegetation indices. The results from this study will elucidate the ability of ground field-based hyperspectral imaging to assess variation in water content, biomass, and water-use efficiency. This can lead to improved opportunities to
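As a concrete illustration of the kind of established band-ratio indices the abstract refers to, two widely used choices are NDVI (a greenness/biomass proxy) and Gao's NDWI (a canopy-water proxy). The reflectance values in the example below are made-up numbers, and the abstract does not state which specific indices were used:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: a greenness/biomass proxy."""
    return (nir - red) / (nir + red)

def ndwi(nir, swir):
    """Normalized Difference Water Index (Gao 1996): a canopy-water proxy."""
    return (nir - swir) / (nir + swir)

# Illustrative canopy reflectances (fractions, not real measurements):
# healthy, well-watered canopies reflect strongly in the near-infrared.
print(ndvi(nir=0.45, red=0.05))   # high greenness
print(ndwi(nir=0.45, swir=0.25))  # moderate canopy water content
```

In a study like the one above, index values computed per plot would be regressed against ground-truth biomass and water-content measurements.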

  13. NATO Guide for Judgement-Based Operational Analysis in Defence Decision Making : Client-Oriented Volume

    Wijnmalen, D.J.D.; et al

    2012-01-01

    Judgment plays an important role in all Operational Analysis (OA). NATO practitioners have determined that approaches in OA that are based on human judgement are increasingly critical to defence decision making. The purpose of the NATO Guide for Judgement-Based OA in Defence Decision Making is to

  14. NATO Guide for Judgement-Based Operational Analysis in Defence Decision Making : Executive Leaflet

    Wijnmalen, D.J.D.; et al

    2012-01-01

    Judgment plays an important role in all Operational Analysis (OA). NATO practitioners have determined that approaches in OA that are based on human judgement are increasingly critical to defence decision making. The purpose of the NATO Guide for Judgement-Based OA in Defence Decision Making is to

  15. Prerequisites for data-based decision making in the classroom: Research evidence and practical illustrations

    Hoogland, Inge; Schildkamp, Kim; van der Kleij, Fabienne; Heitink, Maaike Christine; Kippers, Wilma Berdien; Veldkamp, Bernard P.; Dijkstra, Anne M.

    2016-01-01

    Data-based decision making can lead to increased student learning. The desired effects of increased student learning can only be realized if data-based decision making is implemented successfully. Therefore, a systematic literature review was conducted to identify prerequisites of such successful

  16. [Value-based cancer care. From traditional evidence-based decision making to balanced decision making within frameworks of shared values].

    Palazzo, Salvatore; Filice, Aldo; Mastroianni, Candida; Biamonte, Rosalbino; Conforti, Serafino; Liguori, Virginia; Turano, Salvatore; De Simone, Rosanna; Rovito, Antonio; Manfredi, Caterina; Minardi, Stefano; Vilardo, Emmanuelle; Loizzo, Monica; Oriolo, Carmela

    2016-04-01

Clinical decision making in oncology has so far been based on evidence of efficacy from high-quality clinical research. Data collection and analysis from experimental studies provide valuable insight into response rates and progression-free or overall survival. Data processing generates valuable information for medical professionals involved in cancer patient care, enabling them to make objective and unbiased choices. The increased attention of many scientific associations toward more rational resource consumption in clinical decision making is mirrored in the Choosing Wisely campaign against the overuse or misuse of exams and procedures of little or no benefit for the patient. This cultural movement has been actively promoting care solutions based on the concept of "value". As a result, the value-based decision-making process for cancer care should not be dissociated from economic sustainability and from the ethics of affordability, given the growing average cost of the most recent cancer drugs. In support of this orientation, the National Comprehensive Cancer Network (NCCN) has developed innovative and "complex" guidelines based on values, defined as "evidence blocks", with the aim of assisting the medical community in making overall sustainable choices.

  17. A decision‐making framework for flood risk management based on a Bayesian Influence Diagram

    Åstrøm, Helena Lisa Alexandra; Madsen, Henrik; Friis-Hansen, Peter

    2014-01-01

We develop a Bayesian Influence Diagram (ID) approach for risk‐based decision‐making in flood management. We show that it is a flexible decision‐making tool to assess flood risk in a non‐stationary environment and with an ability to test different adaptation measures in order to agree on the best...... means to describe uncertainty in the system. Hence, an ID contributes with several advantages in risk assessment and decision‐making. We present an ID approach for risk‐based decision‐making in which we improve conventional flood risk assessments by including several types of hazards...... measures and combinations of these. Adaptation options can be tested at different points in time (in different time slices), which allows for finding the optimal time to invest. The usefulness of our decision‐making framework was exemplified through case studies in Aarhus and Copenhagen. Risk‐based decision‐making...

  18. Selecting a risk-based tool to aid in decision making

    Bendure, A.O.

    1995-03-01

Selecting a risk-based tool to aid in decision making is as much of a challenge as properly using the tool once it has been selected. Failure to consider customer and stakeholder requirements and the technical bases and differences among risk-based decision-making tools will produce confounding and/or politically unacceptable results when the tool is used. Selecting a risk-based decision-making tool must therefore be undertaken with the same, if not greater, rigor as the use of the tool once it is selected. This paper presents a process for selecting a risk-based tool appropriate to a set of prioritization or resource allocation tasks, discusses the results of applying the process to four risk-based decision-making tools, and identifies the "musts" for successful selection and implementation of a risk-based tool to aid in decision making.

  19. Is there a hierarchy of social inferences? The likelihood and speed of inferring intentionality, mind, and personality.

    Malle, Bertram F; Holbrook, Jess

    2012-04-01

People interpret behavior by making inferences about agents' intentionality, mind, and personality. Past research studied such inferences one at a time; in real life, people make these inferences simultaneously. The present studies therefore examined whether four major inferences (intentionality, desire, belief, and personality), elicited simultaneously in response to an observed behavior, might be ordered in a hierarchy of likelihood and speed. To achieve generalizability, the studies included a wide range of stimulus behaviors, presented them verbally and as dynamic videos, and assessed inferences both in a retrieval paradigm (measuring the likelihood and speed of accessing inferences immediately after they were made) and in an online processing paradigm (measuring the speed of forming inferences during behavior observation). Five studies provide evidence for a hierarchy of social inferences--from intentionality and desire to belief to personality--that is stable across verbal and visual presentations and that parallels the order found in developmental and primate research. (c) 2012 APA, all rights reserved.

  20. Inferring the distribution and demography of an invasive species from sighting data: the red fox incursion into Tasmania.

    Peter Caley

A recent study has inferred that the red fox (Vulpes vulpes) is now widespread in Tasmania as of 2010, based on the extraction of fox DNA from predator scats. Heuristically, this inference appears at first glance to be at odds with the lack of recent confirmed discoveries of either road-killed foxes--the last of which occurred in 2006--or hunter-killed foxes--the most recent in 2001. This paper demonstrates a method to codify this heuristic analysis and produce inferences consistent with assumptions and data. It does this by formalising the analysis in a transparent and repeatable manner to make inference on the past, present and future distribution of an invasive species. It utilizes Approximate Bayesian Computation to make inferences. Importantly, the method is able to inform management of invasive species within realistic time frames, and can be applied widely. We illustrate the technique using the Tasmanian fox data. Based on the pattern of carcass discoveries of foxes in Tasmania, we infer that the population of foxes in Tasmania is most likely extinct, or restricted in distribution and demographically weak, as of 2013. It is possible, though unlikely, that the population is widespread and/or demographically robust. This inference is largely at odds with the inference from the predator scat survey data. Our results suggest the chances of successfully eradicating the introduced red fox population in Tasmania may be significantly higher than previously thought.

  1. Inferring the distribution and demography of an invasive species from sighting data: the red fox incursion into Tasmania.

    Caley, Peter; Ramsey, David S L; Barry, Simon C

    2015-01-01

A recent study has inferred that the red fox (Vulpes vulpes) is now widespread in Tasmania as of 2010, based on the extraction of fox DNA from predator scats. Heuristically, this inference appears at first glance to be at odds with the lack of recent confirmed discoveries of either road-killed foxes--the last of which occurred in 2006--or hunter-killed foxes--the most recent in 2001. This paper demonstrates a method to codify this heuristic analysis and produce inferences consistent with assumptions and data. It does this by formalising the analysis in a transparent and repeatable manner to make inference on the past, present and future distribution of an invasive species. It utilizes Approximate Bayesian Computation to make inferences. Importantly, the method is able to inform management of invasive species within realistic time frames, and can be applied widely. We illustrate the technique using the Tasmanian fox data. Based on the pattern of carcass discoveries of foxes in Tasmania, we infer that the population of foxes in Tasmania is most likely extinct, or restricted in distribution and demographically weak, as of 2013. It is possible, though unlikely, that the population is widespread and/or demographically robust. This inference is largely at odds with the inference from the predator scat survey data. Our results suggest the chances of successfully eradicating the introduced red fox population in Tasmania may be significantly higher than previously thought.
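The Approximate Bayesian Computation used in this record can be sketched with a rejection sampler on a deliberately simple sighting model. Everything below (the geometric population decline, the detection probability, the flat prior on the growth rate, the presence/absence matching rule) is an illustrative assumption, not the paper's model:

```python
import random

def simulate_discoveries(r, years=10, n0=20, detect=0.1, rng=random):
    """Toy sighting model: population n0 * r**t at year t; each animal is
    discovered (e.g. as a carcass) with probability `detect`."""
    out = []
    for t in range(years):
        n = int(round(n0 * r ** t))
        found = sum(1 for _ in range(n) if rng.random() < detect)
        out.append(found)
    return out

def abc_rejection(observed, n_draws=5000, seed=7):
    """ABC rejection: draw the growth rate r from a flat prior, simulate a
    discovery series, and keep draws whose presence/absence pattern matches
    the observed sighting record."""
    rng = random.Random(seed)
    obs_pattern = [c > 0 for c in observed]
    accepted = []
    for _ in range(n_draws):
        r = rng.uniform(0.5, 1.5)          # flat prior on the growth rate
        sim = simulate_discoveries(r, years=len(observed), rng=rng)
        if [c > 0 for c in sim] == obs_pattern:
            accepted.append(r)
    return accepted
```

With discoveries early in the series and none in later years, the accepted growth rates concentrate below 1, i.e. on a declining or extinct population, mirroring the inference the authors draw from the carcass record.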

  2. An Intuitionistic Fuzzy Stochastic Decision-Making Method Based on Case-Based Reasoning and Prospect Theory

    Peng Li

    2017-01-01

According to the case-based reasoning method and prospect theory, this paper focuses on obtaining decision-makers' preferences and the criterion weights for stochastic multicriteria decision-making problems and on classifying alternatives. Firstly, we construct a new score function for an intuitionistic fuzzy number (IFN) that takes the decision-making environment into account. Then, we aggregate the decision-making information in different natural states according to prospect theory and test the decision-making matrices. A mathematical programming model based on the case-based reasoning method is presented to obtain the criterion weights. Moreover, in the original decision-making problem, we integrate all the intuitionistic fuzzy decision-making matrices into an expectation matrix using expected utility theory and classify or rank the alternatives by the case-based reasoning method. Finally, two illustrative examples are provided to illustrate the implementation process and applicability of the developed method.
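The paper constructs its own score function for IFNs; as background, a minimal sketch of the classical score/accuracy ranking for an IFN (mu, nu) with mu + nu <= 1 looks like the following (the paper's new score function differs and is not reproduced here):

```python
def ifn_score(mu, nu):
    """Classical score s = mu - nu of an IFN (mu, nu); higher is better."""
    assert 0 <= mu <= 1 and 0 <= nu <= 1 and mu + nu <= 1
    return mu - nu

def ifn_accuracy(mu, nu):
    """Accuracy h = mu + nu, used to break ties between equal scores."""
    return mu + nu

def rank_ifns(ifns):
    """Rank alternatives by (score, accuracy), best first."""
    return sorted(ifns, key=lambda a: (ifn_score(*a), ifn_accuracy(*a)),
                  reverse=True)
```

The hesitancy degree 1 - mu - nu is what the "decision-making environment" adjustment in the paper's new score function acts on.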

3. Hybrid artificial intelligence approach based on neural fuzzy inference model and metaheuristic optimization for flood susceptibility modeling in a high-frequency tropical cyclone area using GIS

    Tien Bui, Dieu; Pradhan, Biswajeet; Nampak, Haleh; Bui, Quang-Thanh; Tran, Quynh-An; Nguyen, Quoc-Phi

    2016-09-01

This paper proposes a new artificial intelligence approach, named MONF, based on a neural fuzzy inference system and metaheuristic optimization for flood susceptibility modeling. In the new approach, the neural fuzzy inference system is used to create an initial flood susceptibility model, which is then optimized with two metaheuristic algorithms: the Evolutionary Genetic Algorithm and Particle Swarm Optimization. A high-frequency tropical cyclone area, the Tuong Duong district in Central Vietnam, was used as a case study. First, a GIS database for the study area was constructed. The database, which includes 76 historical flood-inundated areas and ten flood-influencing factors, was used to develop and validate the proposed model. Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the Receiver Operating Characteristic (ROC) curve, and the area under the ROC curve (AUC) were used to assess model performance and prediction capability. Experimental results showed that the proposed model performs well on both the training (RMSE = 0.306, MAE = 0.094, AUC = 0.962) and validation datasets (RMSE = 0.362, MAE = 0.130, AUC = 0.911). The usability of the proposed model was evaluated by comparison with state-of-the-art benchmark soft computing techniques such as the J48 Decision Tree, Random Forest, Multi-layer Perceptron Neural Network, Support Vector Machine, and Adaptive Neuro-Fuzzy Inference System. The results show that the proposed MONF model outperforms these benchmark models; we conclude that MONF is a new alternative tool for flood susceptibility mapping. The results of this study are useful for planners and decision makers for sustainable management of flood-prone areas.
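One of the two metaheuristics mentioned above, Particle Swarm Optimization, can be sketched in a few lines. This is a generic minimizer, not the MONF training loop; the objective, bounds, and parameter values (inertia weight, acceleration constants) are common textbook defaults, not the paper's settings:

```python
import random

def pso(f, dim, n_particles=20, iters=100, lo=-5.0, hi=5.0, seed=3):
    """Minimal Particle Swarm Optimization: each particle tracks its personal
    best, the swarm tracks a global best, and velocities blend inertia with
    pulls toward both bests."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5                 # inertia and acceleration constants
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val
```

`pso(lambda x: sum(t * t for t in x), dim=2)` drives the sphere function toward its minimum at the origin; in an approach like MONF the objective would instead be the flood model's error on the training data.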

  4. SEMANTIC PATCH INFERENCE

    Andersen, Jesper

    2009-01-01

Collateral evolution is the problem of updating several library-using programs in response to API changes in the used library. In this dissertation we address the issue of understanding collateral evolutions by automatically inferring a high-level specification of the changes evident in a given set ...... specifications inferred by spdiff in Linux are shown. We find that the inferred specifications concisely capture the actual collateral evolution performed in the examples....

  5. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta‐analysis and group level studies

    Bakbergenuly, Ilyas; Morgenthaler, Stephan

    2016-01-01

We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group‐level studies or in meta‐analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log‐odds and arcsine transformations of the estimated probability p̂, both for single‐group studies and in combining results from several groups or studies in meta‐analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta‐analysis and result in abysmal coverage of the combined effect for large K. We also propose bias‐correction for the arcsine transformation. Our simulations demonstrate that this bias‐correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta‐analyses of prevalence. PMID:27192062

  6. Electromyography (EMG) signal recognition using combined discrete wavelet transform based adaptive neuro-fuzzy inference systems (ANFIS)

    Arozi, Moh; Putri, Farika T.; Ariyanto, Mochammad; Khusnul Ari, M.; Munadi, Setiawan, Joga D.

    2017-01-01

The number of people with disabilities increases from year to year, whether due to congenital factors, illness, accidents, or war. One form of disability is the loss of hand function. This condition motivates the search for solutions in the form of an artificial hand with abilities approaching those of a human hand. Developments in neuroscience currently allow electromyography (EMG) signals to be used as the input for controlling the motion of an artificial prosthetic hand. This study is the beginning of a larger research effort planned for the development of an artificial prosthetic hand with EMG signal input, and it focuses on EMG signal recognition. Preliminary results show that EMG signal recognition using a combined discrete wavelet transform and Adaptive Neuro-Fuzzy Inference System (ANFIS) achieves an accuracy of 98.3% in training and 98.51% in testing. These results can thus be used as an input signal for the Simulink block diagram of a prosthetic hand to be developed in the next study. The research will proceed with the construction of the artificial prosthetic hand, along with a Simulink control program integrating everything into one system.
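The feature-extraction half of such a pipeline can be sketched with the simplest wavelet, the Haar transform. The per-subband features below (mean absolute value and RMS) are typical choices in EMG work, though this record does not specify which wavelet or features the authors used:

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform (even-length input):
    pairwise sums give the approximation, pairwise differences the detail."""
    s = 1 / math.sqrt(2)
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal), 2)]
    return approx, detail

def wavelet_features(signal, levels=3):
    """Per-subband (mean absolute value, RMS) features of the kind commonly
    fed from a DWT into a classifier such as ANFIS."""
    feats = []
    for _ in range(levels):
        signal, detail = haar_dwt(signal)   # recurse on the approximation
        feats.append((sum(abs(d) for d in detail) / len(detail),
                      math.sqrt(sum(d * d for d in detail) / len(detail))))
    return feats
```

A windowed EMG recording would be passed through `wavelet_features`, and the resulting small feature vector, rather than the raw signal, would be the classifier input.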

7. Seasonal Habitat Patterns of Japanese Common Squid (Todarodes pacificus) Inferred from Satellite-Based Species Distribution Models

    Irene D. Alabia

    2016-11-01

Understanding the spatio-temporal distribution of species habitat in the marine environment is central to effective resource management and conservation. Here, we examined the potential habitat distributions of Japanese common squid (Todarodes pacificus) in the Sea of Japan during a four-year period. The seasonal patterns of preferential habitat were inferred from species distribution models, built using squid occurrences detected from night-time visible images and remotely-sensed environmental factors. The predicted squid habitat (i.e., areas with high habitat suitability) revealed strong seasonal variability, characterized by a reduction of potential habitat, confined off the southern part of the basin, during the winter–spring period (December–May). Apparent expansion of preferential habitat occurred during the summer–autumn months (June–November), concurrent with the formation of highly suitable habitat patches in certain regions of the Sea of Japan. These habitat distribution patterns were responses to changes in oceanographic conditions and were synchronous with the seasonal migration of squid. Moreover, the most important variables regulating the spatio-temporal patterns of suitable habitat were sea surface temperature, depth, sea surface height anomaly, and eddy kinetic energy. These variables could affect habitat distributions through their impacts on the growth and survival of squid, local nutrient transport, and the availability of favorable spawning and feeding grounds.

  8. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta-analysis and group level studies.

    Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan

    2016-07-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. © 2016 The Authors. Biometrical Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.
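The bias described above is easy to reproduce in a toy simulation: draw group-level probabilities from a Beta distribution with intracluster correlation ρ, form the overdispersed estimate p̂, and compare the mean of its arcsine transform with the transform of the true p. The parameter values below are arbitrary illustrations, not the paper's settings:

```python
import math
import random

def arcsine(p):
    """Variance-stabilizing arcsine transform of a proportion."""
    return math.asin(math.sqrt(p))

def simulate_bias(p=0.2, rho=0.2, n=40, reps=20000, seed=11):
    """Toy beta-binomial simulation: group probabilities come from a Beta with
    mean p and intracluster correlation rho, so p_hat is overdispersed.
    Returns mean(arcsine(p_hat)) - arcsine(p), i.e. the transformation bias."""
    rng = random.Random(seed)
    a = p * (1 - rho) / rho            # Beta(a, b) with mean p, var rho*p*(1-p)
    b = (1 - p) * (1 - rho) / rho
    total = 0.0
    for _ in range(reps):
        pi = rng.betavariate(a, b)     # group-level true probability
        k = sum(1 for _ in range(n) if rng.random() < pi)
        total += arcsine(k / n)
    return total / reps - arcsine(p)
```

For p = 0.2 the transform is concave near p, so Jensen's inequality makes the simulated bias clearly negative; as the paper shows, it grows roughly linearly in ρ for small ρ.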

  9. Maintenance planning support method for nuclear power plants based on collective decision making

    Shimizu, Shunichi; Sakurai, Shoji; Takaoka, Kazushi; Kanemoto, Shigeru; Fukutomi, Shigeki

    1992-01-01

Inspection and maintenance planning in nuclear power plants is conducted through decision making based on experts' collective consensus. However, since a great deal of time and effort is required to reach a consensus among expert judgments, effective decision-making methods need to be established. The authors therefore developed a method for supporting collective decision making based on a combination of three decision-making methods: the Characteristic Diagram method, the Interpretative Structural Modeling method, and the Analytic Hierarchy Process. The proposed method enables the evaluation criteria for collective decision making to be determined systematically, and allows collective decisions to be extracted using simplified questionnaires. It can effectively support consensus building through the evaluation of collective decision structural models and their characteristics. In this paper, the effectiveness of the proposed method is demonstrated through its application to the decision of whether the improved ultrasonic testing equipment should be adopted at nuclear power plants. (author)
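Of the three methods combined above, the Analytic Hierarchy Process is the most mechanical: expert judgments are collected as a pairwise-comparison matrix, and criterion weights are its principal eigenvector. A minimal power-iteration sketch (the matrix in the usage note is an invented example, not plant data):

```python
def ahp_weights(M, iters=50):
    """Criterion weights from a pairwise-comparison matrix via power
    iteration -- the principal-eigenvector method of the Analytic
    Hierarchy Process. M[i][j] holds how much criterion i is preferred
    to criterion j."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]         # renormalize each iteration
    return w
```

For a perfectly consistent matrix built from the preference ratios 1:2:4, the recovered weights are 1/7, 2/7, and 4/7; in practice one would also check Saaty's consistency ratio before trusting the expert judgments.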

  10. Forward and backward inference in spatial cognition.

    Will D Penny

This paper shows that the various computations underlying spatial cognition can be implemented using statistical inference in a single probabilistic model. Inference is implemented using a common set of 'lower-level' computations involving forward and backward inference over time. For example, to estimate where you are in a known environment, forward inference is used to optimally combine location estimates from path integration with those from sensory input. To decide which way to turn to reach a goal, forward inference is used to compute the likelihood of reaching that goal under each option. To work out which environment you are in, forward inference is used to compute the likelihood of sensory observations under the different hypotheses. For reaching sensory goals that require a chaining together of decisions, forward inference can be used to compute a state trajectory that will lead to that goal, and backward inference to refine the route and estimate control signals that produce the required trajectory. We propose that these computations are reflected in recent findings of pattern replay in the mammalian brain. Specifically, that theta sequences reflect decision making, theta flickering reflects model selection, and remote replay reflects route and motor planning.
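The forward-inference step for self-localization (combining a path-integration prediction with sensory evidence) can be sketched as one step of discrete Bayesian filtering on a circular 1-D grid; the grid size, motion kernel, and likelihood values below are invented toy numbers:

```python
def normalize(p):
    s = sum(p)
    return [x / s for x in p]

def forward_step(belief, move_kernel, likelihood):
    """One step of forward inference on a circular 1-D grid: path integration
    spreads the belief with the motion kernel (prediction), then Bayes' rule
    folds in the sensory likelihood (update)."""
    n = len(belief)
    predicted = [0.0] * n
    for i, b in enumerate(belief):          # convolve belief with motion noise
        for d, m in move_kernel.items():
            predicted[(i + d) % n] += b * m
    return normalize([predicted[i] * likelihood[i] for i in range(n)])
```

Starting certain at cell 0 with an intended move of +1 and a sensor that favors cell 1, the posterior concentrates on cell 1 once the likelihood is folded in; chaining such steps over time is exactly the "forward inference over time" the abstract describes.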

  11. Phylogeny of the gymnosperm genus Cycas L. (Cycadaceae) as inferred from plastid and nuclear loci based on a large-scale sampling: Evolutionary relationships and taxonomical implications.

    Liu, Jian; Zhang, Shouzhou; Nagalingum, Nathalie S; Chiang, Yu-Chung; Lindstrom, Anders J; Gong, Xun

    2018-05-18

    The gymnosperm genus Cycas is the sole member of Cycadaceae, and is the largest genus of extant cycads. There are about 115 accepted Cycas species mainly distributed in the paleotropics. Based on morphology, the genus has been divided into six sections and eight subsections, but this taxonomy has not yet been tested in a molecular phylogenetic framework. Although the monophyly of Cycas is broadly accepted, the intrageneric relationships inferred from previous molecular phylogenetic analyses are unclear due to insufficient sampling or uninformative DNA sequence data. In this study, we reconstructed a phylogeny of Cycas using four chloroplast intergenic spacers and seven low-copy nuclear genes and sampling 90% of extant Cycas species. The maximum likelihood and Bayesian inference phylogenies suggest: (1) matrices of either concatenated cpDNA markers or of concatenated nDNA lack sufficient informative sites to resolve the phylogeny alone, however, the phylogeny from the combined cpDNA-nDNA dataset suggests the genus can be roughly divided into 13 clades and six sections that are in agreement with the current classification of the genus; (2) although with partial support, a clade combining sections Panzhihuaenses + Asiorientales is resolved as the earliest diverging branch; (3) section Stangerioides is not monophyletic because the species resolve as a grade; (4) section Indosinenses is not monophyletic as it includes Cycas macrocarpa and C. pranburiensis from section Cycas; (5) section Cycas is the most derived group and its subgroups correspond with geography. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Genetic structure and inferences on potential source areas for Bactrocera dorsalis (Hendel) based on mitochondrial and microsatellite markers.

    Wei Shi

    Full Text Available Bactrocera dorsalis (Diptera: Tephritidae) is mainly distributed in tropical and subtropical Asia and in the Pacific region. Despite its economic importance, very few studies have addressed the question of the wide genetic structure and potential source area of this species. This pilot study attempts to infer the native region of this pest and its colonization pathways in Asia. Combining mitochondrial and microsatellite markers, we evaluated the level of genetic diversity, genetic structure, and the gene flow among fly populations collected across Southeast Asia and China. A complex and significant genetic structure corresponding to the geographic pattern was found with both types of molecular markers. However, the genetic structure found was rather weak in both cases, and no pattern of isolation by distance was identified. Multiple long-distance dispersal events and miscellaneous host selection by this species may explain the results. These complex patterns may have been influenced by human-mediated transportation of the pest from one area to another and the complex topography of the study region. For both mitochondrial and microsatellite data, no signs of bottleneck or founder events could be identified. Nonetheless, maximal genetic diversity was observed in Myanmar, Vietnam and Guangdong (China), and asymmetric migration patterns were found. These results provide indirect evidence that the tropical regions of Southeast Asia and the southern coast of China may be considered as the native range of the species and that the population expansion is northward. Yunnan (China) is a contact zone that has been colonized from different sources. Regions along the southern coast of Vietnam and China probably served to colonize mainly the southern region of China. Southern coastal regions of China may also have colonized central parts of China and of central Yunnan.

  13. Fiducial inference - A Neyman-Pearson interpretation

    Salome, D; von der Linden, W; Dose, V; Fischer, R; Preuss, R

    1999-01-01

    Fisher's fiducial argument is a tool for deriving inferences in the form of a probability distribution on the parameter space, not based on Bayes's Theorem. Lindley established that in exceptional situations fiducial inferences coincide with posterior distributions; in the other situations fiducial

  14. Compiling Relational Bayesian Networks for Exact Inference

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...

  15. Compiling Relational Bayesian Networks for Exact Inference

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...

  16. Assessing an ensemble Kalman filter inference of Manning’s n coefficient of an idealized tidal inlet against a polynomial chaos-based MCMC

    Siripatana, Adil

    2017-06-08

    Bayesian estimation/inversion is commonly used to quantify and reduce modeling uncertainties in coastal ocean model, especially in the framework of parameter estimation. Based on Bayes rule, the posterior probability distribution function (pdf) of the estimated quantities is obtained conditioned on available data. It can be computed either directly, using a Markov chain Monte Carlo (MCMC) approach, or by sequentially processing the data following a data assimilation approach, which is heavily exploited in large dimensional state estimation problems. The advantage of data assimilation schemes over MCMC-type methods arises from the ability to algorithmically accommodate a large number of uncertain quantities without significant increase in the computational requirements. However, only approximate estimates are generally obtained by this approach due to the restricted Gaussian prior and noise assumptions that are generally imposed in these methods. This contribution aims at evaluating the effectiveness of utilizing an ensemble Kalman-based data assimilation method for parameter estimation of a coastal ocean model against an MCMC polynomial chaos (PC)-based scheme. We focus on quantifying the uncertainties of a coastal ocean ADvanced CIRCulation (ADCIRC) model with respect to the Manning’s n coefficients. Based on a realistic framework of observation system simulation experiments (OSSEs), we apply an ensemble Kalman filter and the MCMC method employing a surrogate of ADCIRC constructed by a non-intrusive PC expansion for evaluating the likelihood, and test both approaches under identical scenarios. We study the sensitivity of the estimated posteriors with respect to the parameters of the inference methods, including ensemble size, inflation factor, and PC order. A full analysis of both methods, in the context of coastal ocean model, suggests that an ensemble Kalman filter with appropriate ensemble size and well-tuned inflation provides reliable mean estimates and
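    The ensemble Kalman update at the core of the comparison can be sketched for a single uncertain parameter such as Manning's n. The observation operator, ensemble size, and noise levels below are toy assumptions for illustration, not the ADCIRC/OSSE setup used in the study:

    ```python
    import random

    def enkf_update(ensemble, h, y_obs, obs_var, rng):
        """Stochastic ensemble Kalman filter update for a scalar parameter.
        h maps a parameter value to the model-predicted observation."""
        n = len(ensemble)
        hx = [h(x) for x in ensemble]
        x_mean = sum(ensemble) / n
        hx_mean = sum(hx) / n
        # sample covariances estimated from the ensemble
        p_xy = sum((x - x_mean) * (z - hx_mean) for x, z in zip(ensemble, hx)) / (n - 1)
        p_yy = sum((z - hx_mean) ** 2 for z in hx) / (n - 1)
        k = p_xy / (p_yy + obs_var)                      # Kalman gain
        # update each member against a perturbed observation
        return [x + k * (y_obs + rng.gauss(0, obs_var ** 0.5) - z)
                for x, z in zip(ensemble, hx)]

    rng = random.Random(0)
    true_n = 0.03                                        # "true" Manning's n (toy)
    h = lambda x: 2.0 * x                                # toy linear observation operator
    prior = [rng.gauss(0.05, 0.02) for _ in range(200)]  # prior ensemble
    post = enkf_update(prior, h, h(true_n), 1e-6, rng)
    print(sum(post) / len(post))                         # posterior mean near 0.03
    ```

    Covariance inflation and non-Gaussian priors (handled in the study via the PC surrogate and MCMC) are omitted here; this shows only the basic gain-and-shift mechanics of the ensemble update.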

  17. Assessing an ensemble Kalman filter inference of Manning's n coefficient of an idealized tidal inlet against a polynomial chaos-based MCMC

    Siripatana, Adil; Mayo, Talea; Sraj, Ihab; Knio, Omar; Dawson, Clint; Le Maitre, Olivier; Hoteit, Ibrahim

    2017-08-01

    Bayesian estimation/inversion is commonly used to quantify and reduce modeling uncertainties in coastal ocean model, especially in the framework of parameter estimation. Based on Bayes rule, the posterior probability distribution function (pdf) of the estimated quantities is obtained conditioned on available data. It can be computed either directly, using a Markov chain Monte Carlo (MCMC) approach, or by sequentially processing the data following a data assimilation approach, which is heavily exploited in large dimensional state estimation problems. The advantage of data assimilation schemes over MCMC-type methods arises from the ability to algorithmically accommodate a large number of uncertain quantities without significant increase in the computational requirements. However, only approximate estimates are generally obtained by this approach due to the restricted Gaussian prior and noise assumptions that are generally imposed in these methods. This contribution aims at evaluating the effectiveness of utilizing an ensemble Kalman-based data assimilation method for parameter estimation of a coastal ocean model against an MCMC polynomial chaos (PC)-based scheme. We focus on quantifying the uncertainties of a coastal ocean ADvanced CIRCulation (ADCIRC) model with respect to the Manning's n coefficients. Based on a realistic framework of observation system simulation experiments (OSSEs), we apply an ensemble Kalman filter and the MCMC method employing a surrogate of ADCIRC constructed by a non-intrusive PC expansion for evaluating the likelihood, and test both approaches under identical scenarios. We study the sensitivity of the estimated posteriors with respect to the parameters of the inference methods, including ensemble size, inflation factor, and PC order. A full analysis of both methods, in the context of coastal ocean model, suggests that an ensemble Kalman filter with appropriate ensemble size and well-tuned inflation provides reliable mean estimates and

  18. Mild cognitive impairment is associated with poorer decision-making in community-based older persons.

    Han, S Duke; Boyle, Patricia A; James, Bryan D; Yu, Lei; Bennett, David A

    2015-04-01

    To test the hypothesis that mild cognitive impairment (MCI) is associated with poorer financial and healthcare decision-making. Community-based epidemiological cohort study. Communities throughout northeastern Illinois. Older persons without dementia from the Rush Memory and Aging Project (N = 730). All participants underwent a detailed clinical evaluation and decision-making assessment using a measure that closely approximates materials used in real-world financial and healthcare settings. This allowed for measurement of total decision-making and financial and healthcare decision-making. Regression models were used to examine whether MCI was associated with a lower level of decision-making. In subsequent analyses, the relationship between specific cognitive systems (episodic memory, semantic memory, working memory, perceptual speed, visuospatial ability) and decision-making was explored in participants with MCI. MCI was associated with lower total, financial, and healthcare decision-making scores after accounting for the effects of age, education, and sex. The effect of MCI on total decision-making was equivalent to the effect of more than 10 additional years of age. Additional models showed that, when considering multiple cognitive systems, perceptual speed accounted for the most variance in decision-making in participants with MCI. Persons with MCI may have poorer financial and healthcare decision-making in real-world situations, and perceptual speed may be an important contributor to poorer decision-making in persons with MCI. © 2015, Copyright the Authors Journal compilation © 2015, The American Geriatrics Society.

  19. School characteristics influencing the implementation of a data-based decision making intervention

    van Geel, Marieke Johanna Maria; Visscher, Arend J.; Teunis, B.

    2017-01-01

    There is an increasing global emphasis on using data for decision making, with a growing body of research on interventions aimed at implementing and sustaining data-based decision making (DBDM) in schools. Yet, little is known about the school features that facilitate or hinder the implementation of

  20. An agent-based model for integrated emotion regulation and contagion in socially affected decision making

    Manzoor, A.; Treur, J.

    2015-01-01

    This paper addresses an agent-based computational social agent model for the integration of emotion regulation, emotion contagion and decision making in a social context. The model integrates emotion-related valuing, in order to analyse the role of emotions in socially affected decision making. The

  1. Models for inference in dynamic metacommunity systems

    Dorazio, Robert M.; Kery, Marc; Royle, J. Andrew; Plattner, Matthias

    2010-01-01

    A variety of processes are thought to be involved in the formation and dynamics of species assemblages. For example, various metacommunity theories are based on differences in the relative contributions of dispersal of species among local communities and interactions of species within local communities. Interestingly, metacommunity theories continue to be advanced without much empirical validation. Part of the problem is that statistical models used to analyze typical survey data either fail to specify ecological processes with sufficient complexity or they fail to account for errors in detection of species during sampling. In this paper, we describe a statistical modeling framework for the analysis of metacommunity dynamics that is based on the idea of adopting a unified approach, multispecies occupancy modeling, for computing inferences about individual species, local communities of species, or the entire metacommunity of species. This approach accounts for errors in detection of species during sampling and also allows different metacommunity paradigms to be specified in terms of species- and location-specific probabilities of occurrence, extinction, and colonization: all of which are estimable. In addition, this approach can be used to address inference problems that arise in conservation ecology, such as predicting temporal and spatial changes in biodiversity for use in making conservation decisions. To illustrate, we estimate changes in species composition associated with the species-specific phenologies of flight patterns of butterflies in Switzerland for the purpose of estimating regional differences in biodiversity.
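    The detection-error idea at the heart of occupancy modeling can be illustrated with the simplest single-species, single-season case, a deliberate simplification of the dynamic multispecies framework described in the abstract; the probabilities are illustrative:

    ```python
    def occupancy_likelihood(history, psi, p):
        """Likelihood of one site's detection history under a basic occupancy
        model: psi = occurrence probability, p = detection probability per visit.
        An all-zero history can arise from true absence OR from missed detections."""
        d = sum(history)
        J = len(history)
        present = psi * p ** d * (1 - p) ** (J - d)   # occupied, detected d of J times
        absent = (1 - psi) if d == 0 else 0.0         # unoccupied explains only zeros
        return present + absent

    # Three visits, never detected: absence and non-detection both contribute
    print(occupancy_likelihood([0, 0, 0], psi=0.6, p=0.5))  # 0.6*0.125 + 0.4 = 0.475
    ```

    Maximizing this likelihood over sites jointly estimates psi and p, which is how the approach separates "not there" from "there but missed", the error source the abstract says simpler survey models ignore.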

  2. Inferring relevance in a changing world

    Robert C Wilson

    2012-01-01

    Full Text Available Reinforcement learning models of human and animal learning usually concentrate on how we learn the relationship between different stimuli or actions and rewards. However, in real world situations stimuli are ill-defined. On the one hand, our immediate environment is extremely multi-dimensional. On the other hand, in every decision-making scenario only a few aspects of the environment are relevant for obtaining reward, while most are irrelevant. Thus a key question is how do we learn these relevant dimensions, that is, how do we learn what to learn about? We investigated this process of representation learning experimentally, using a task in which one stimulus dimension was relevant for determining reward at each point in time. As in real life situations, in our task the relevant dimension can change without warning, adding ever-present uncertainty engendered by a constantly changing environment. We show that human performance on this task is better described by a suboptimal strategy based on selective attention and serial hypothesis testing rather than a normative strategy based on probabilistic inference. From this, we conjecture that the problem of inferring relevance in general scenarios is too computationally demanding for the brain to solve optimally. As a result the brain utilizes approximations, employing these even in simplified scenarios in which optimal representation learning is tractable, such as the one in our experiment.
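    The suboptimal strategy the authors favor, selective attention with serial hypothesis testing, can be sketched as a win-stay/lose-shift rule over candidate relevant dimensions. The number of dimensions, trial structure, and reward rule below are illustrative assumptions, not the experimental task:

    ```python
    import random

    def serial_hypothesis_testing(trials, rng):
        """Attend to one hypothesized relevant dimension at a time: keep it
        after a reward, resample a different hypothesis after a non-reward."""
        dims = [0, 1, 2]
        hypothesis = rng.choice(dims)
        correct = 0
        for relevant_dim in trials:           # which dimension actually pays off
            rewarded = (hypothesis == relevant_dim)
            correct += rewarded
            if not rewarded:                  # lose-shift: pick a new hypothesis
                hypothesis = rng.choice([d for d in dims if d != hypothesis])
        return correct / len(trials)

    rng = random.Random(0)
    # relevant dimension is 0 for 200 trials, then switches to 2 without warning
    trials = [0] * 200 + [2] * 200
    print(serial_hypothesis_testing(trials, rng))
    ```

    The learner recovers quickly after the unsignaled switch without tracking full posterior probabilities over dimensions, which is the computational shortcut the abstract contrasts with normative probabilistic inference.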

  3. Triptycene-based dianhydrides, polyimides, methods of making each, and methods of use

    Ghanem, Bader; Pinnau, Ingo; Swaidan, Raja

    2015-01-01

    A triptycene-based monomer, a method of making a triptycene-based monomer, a triptycene-based aromatic polyimide, a method of making a triptycene- based aromatic polyimide, methods of using triptycene-based aromatic polyimides, structures incorporating triptycene-based aromatic polyimides, and methods of gas separation are provided. Embodiments of the triptycene-based monomers and triptycene-based aromatic polyimides have high permeabilities and excellent selectivities. Embodiments of the triptycene-based aromatic polyimides have one or more of the following characteristics: intrinsic microporosity, good thermal stability, and enhanced solubility. In an exemplary embodiment, the triptycene-based aromatic polyimides are microporous and have a high BET surface area. In an exemplary embodiment, the triptycene-based aromatic polyimides can be used to form a gas separation membrane.

  4. Triptycene-based dianhydrides, polyimides, methods of making each, and methods of use

    Ghanem, Bader

    2015-12-30

    A triptycene-based monomer, a method of making a triptycene-based monomer, a triptycene-based aromatic polyimide, a method of making a triptycene- based aromatic polyimide, methods of using triptycene-based aromatic polyimides, structures incorporating triptycene-based aromatic polyimides, and methods of gas separation are provided. Embodiments of the triptycene-based monomers and triptycene-based aromatic polyimides have high permeabilities and excellent selectivities. Embodiments of the triptycene-based aromatic polyimides have one or more of the following characteristics: intrinsic microporosity, good thermal stability, and enhanced solubility. In an exemplary embodiment, the triptycene-based aromatic polyimides are microporous and have a high BET surface area. In an exemplary embodiment, the triptycene-based aromatic polyimides can be used to form a gas separation membrane.

  5. Inference in 'poor' languages

    Petrov, S.

    1996-10-01

    Languages with a solvable implication problem but without complete and consistent systems of inference rules ('poor' languages) are considered. The problem of the existence of a finite complete and consistent inference rule system for a 'poor' language is stated independently of the language or rule syntax. Several properties of the problem are proved. An application of the results to the language of join dependencies is given.

  6. Bayesian statistical inference

    Bruno De Finetti

    2017-04-01

    Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian statistical inference is one of the last fundamental philosophical papers in which we can find De Finetti's essential approach to statistical inference.

  7. Geometric statistical inference

    Periwal, Vipul

    1999-01-01

    A reparametrization-covariant formulation of the inverse problem of probability is explicitly solved for finite sample sizes. The inferred distribution is explicitly continuous for finite sample size. A geometric solution of the statistical inference problem in higher dimensions is outlined

  8. Practical Bayesian Inference

    Bailer-Jones, Coryn A. L.

    2017-04-01

    Preface; 1. Probability basics; 2. Estimation and uncertainty; 3. Statistical models and inference; 4. Linear models, least squares, and maximum likelihood; 5. Parameter estimation: single parameter; 6. Parameter estimation: multiple parameters; 7. Approximating distributions; 8. Monte Carlo methods for inference; 9. Parameter estimation: Markov chain Monte Carlo; 10. Frequentist hypothesis testing; 11. Model comparison; 12. Dealing with more complicated problems; References; Index.

  9. Neural signatures of experience-based improvements in deterministic decision-making.

    Tremel, Joshua J; Laurent, Patryk A; Wolk, David A; Wheeler, Mark E; Fiez, Julie A

    2016-12-15

    Feedback about our choices is a crucial part of how we gather information and learn from our environment. It provides key information about decision experiences that can be used to optimize future choices. However, our understanding of the processes through which feedback translates into improved decision-making is lacking. Using neuroimaging (fMRI) and cognitive models of decision-making and learning, we examined the influence of feedback on multiple aspects of decision processes across learning. Subjects learned correct choices to a set of 50 word pairs across eight repetitions of a concurrent discrimination task. Behavioral measures were then analyzed with both a drift-diffusion model and a reinforcement learning model. Parameter values from each were then used as fMRI regressors to identify regions whose activity fluctuates with specific cognitive processes described by the models. The patterns of intersecting neural effects across models support two main inferences about the influence of feedback on decision-making. First, frontal, anterior insular, fusiform, and caudate nucleus regions behave like performance monitors, reflecting errors in performance predictions that signal the need for changes in control over decision-making. Second, temporoparietal, supplementary motor, and putamen regions behave like mnemonic storage sites, reflecting differences in learned item values that inform optimal decision choices. As information about optimal choices is accrued, these neural systems dynamically adjust, likely shifting the burden of decision processing from controlled performance monitoring to bottom-up, stimulus-driven choice selection. Collectively, the results provide a detailed perspective on the fundamental ability to use past experiences to improve future decisions. Copyright © 2016 Elsevier B.V. All rights reserved.
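    The reinforcement-learning side of this model-based analysis can be sketched with a basic delta-rule (Rescorla-Wagner) update; the per-trial prediction errors it produces are the kind of model-derived quantity used as parametric fMRI regressors. The learning rate and feedback sequence are illustrative, not fitted values from the study:

    ```python
    def rescorla_wagner(feedback, alpha=0.3):
        """Trial-by-trial value learning from binary feedback.
        Returns the final value estimate and the per-trial prediction errors."""
        v = 0.5                      # initial value estimate
        errors = []
        for r in feedback:
            pe = r - v               # reward prediction error
            errors.append(pe)
            v += alpha * pe          # delta-rule update
        return v, errors

    v, pes = rescorla_wagner([1, 1, 0, 1, 1, 1])
    print(round(v, 3))               # prints 0.838
    ```

    Convolving the `pes` sequence with a hemodynamic response function and regressing it against the BOLD signal is the standard way such learned-value signals are localized, consistent with the performance-monitoring versus value-storage distinction the abstract draws.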

  10. System Support for Forensic Inference

    Gehani, Ashish; Kirchner, Florent; Shankar, Natarajan

    Digital evidence is playing an increasingly important role in prosecuting crimes. The reasons are manifold: financially lucrative targets are now connected online, systems are so complex that vulnerabilities abound and strong digital identities are being adopted, making audit trails more useful. If the discoveries of forensic analysts are to hold up to scrutiny in court, they must meet the standard for scientific evidence. Software systems are currently developed without consideration of this fact. This paper argues for the development of a formal framework for constructing “digital artifacts” that can serve as proxies for physical evidence; a system so imbued would facilitate sound digital forensic inference. A case study involving a filesystem augmentation that provides transparent support for forensic inference is described.

  11. Cannabinoids and value-based decision making: Implications for neurodegenerative disorders

    Lee, AM; Oleson, E.B.; Diergaarde, L.; Cheer, J.F.; Pattij, T.

    2012-01-01

    In recent years, disturbances in cognitive function have been increasingly recognized as important symptomatic phenomena in neurodegenerative diseases, including Parkinson's disease (PD). Value-based decision making in particular is an important executive cognitive function that is not only impaired

  12. Ortho-substituted triptycene-based diamines, monomers, and polymers, methods of making and uses thereof

    Ghanem, Bader Saleh

    2017-04-13

    Described herein are ortho-dimethyl-substituted and tetramethyl-substituted triptycene-containing diamine monomers and microporous triptycene-based polyimides and polyamides, and methods of making the monomers and polymers.

  13. Ortho-substituted triptycene-based diamines, monomers, and polymers, methods of making and uses thereof

    Ghanem, Bader Saleh; Pinnau, Ingo

    2017-01-01

    Described herein are ortho-dimethyl-substituted and tetramethyl-substituted triptycene-containing diamine monomers and microporous triptycene-based polyimides and polyamides, and methods of making the monomers and polymers.

  14. An agent-based model for integrated emotion regulation and contagion in socially affected decision making

    Manzoor, A.; Treur, J.

    2015-01-01

    This paper addresses an agent-based computational social agent model for the integration of emotion regulation, emotion contagion and decision making in a social context. The model integrates emotion-related valuing, in order to analyse the role of emotions in socially affected decision making. The agent-based model is illustrated for the interaction between two persons. Simulation experiments for different kinds of scenarios help to understand how decisions can be affected by regulating the ...

  15. Make or buy decision considering uncertainty based on fuzzy logic using simulation and multiple criteria decision making

    Ali Mohtashami

    2013-01-01

    Full Text Available The make-or-buy problem has always been a challenge to decision makers. In this paper a methodology is proposed to resolve this challenge, capable of evaluating make-or-buy decisions under uncertainty. Uncertainty is handled using fuzzy logic and simulation approaches. The proposed methodology can be applied to parts with multi-stage manufacturing processes and different suppliers; it therefore provides a scale for decision making ranging from full outsourcing to full manufacturing, together with selection of an appropriate supplier.

  16. Grey Language Hesitant Fuzzy Group Decision Making Method Based on Kernel and Grey Scale.

    Li, Qingsheng; Diao, Yuzhu; Gong, Zaiwu; Hu, Aqin

    2018-03-02

    Based on grey language multi-attribute group decision making, a scoring function built on the kernel and grey scale is put forward according to the definition of grey language and the meaning of the kernel and grey scale. The function introduces the grey scale into the decision-making method to avoid information distortion. This method is applied to grey language hesitant fuzzy group decision making, and the grey correlation degree is used to rank the alternatives. The effectiveness and practicability of the decision-making method are further verified by an example evaluating the sustainable development ability of a circular-economy industry chain. Moreover, its simplicity and feasibility are verified by comparing it with the traditional grey language decision-making method and the grey language hesitant fuzzy weighted arithmetic averaging (GLHWAA) operator integration method after determining the index weight based on grey correlation.

  17. Design of nuclear emergency decision-making support system based on the results of radiation monitoring

    Zheng Qiyan; Zhang Lijun; Huang Weiqi; Chen Lin

    2010-01-01

    The main task of a nuclear emergency decision-making support system based on radiation monitoring results is to receive and analyze radiation monitoring data in order to carry out tasks such as environmental impact evaluation, dose assessment for emergency responders, decision-making analysis, and effectiveness evaluation of emergency actions. The system is made up of a server, a communication terminal, a data-analyzing terminal, GPRS modules, a printer, and other components, which together form a LAN. The system's software consists of six subsystems: data analysis, reporting, GIS, communication, user management, and a database. (authors)

  18. Apathy and Emotion-Based Decision-Making in Amnesic Mild Cognitive Impairment and Alzheimer's Disease

    Bayard, Sophie; Jacus, Jean-Pierre; Raffard, Stéphane; Gely-Nargeot, Marie-Christine

    2014-01-01

    Background. Apathy and reduced emotion-based decision-making are two behavioral modifications independently described in Alzheimer’s disease (AD) and amnestic mild cognitive impairment (aMCI). Objectives. The aims of this study were to investigate decision-making based on emotional feedback processing in AD and aMCI and to study the impact of reduced decision-making performances on apathy. Methods. We recruited 20 patients with AD, 20 participants with aMCI, and 20 healthy controls. All parti...

  19. Research-Based Knowledge: Researchers' Contribution to Evidence-Based Practice and Policy Making in Career Guidance

    Haug, Erik Hagaseth; Plant, Peter

    2016-01-01

    To present evidence for the outcomes of career guidance is increasingly seen as pivotal for a further professionalization of policy making and service provision. This paper puts an emphasis on researchers' contribution to evidence-based practice and policy making in career guidance. We argue for a broader and more pluralistic research strategy to…

  20. A Novel Technique for Maximum Power Point Tracking of a Photovoltaic Based on Sensing of Array Current Using Adaptive Neuro-Fuzzy Inference System (ANFIS)

    El-Zoghby, Helmy M.; Bendary, Ahmed F.

    2016-10-01

    Maximum Power Point Tracking (MPPT) is now a widely used method for increasing photovoltaic (PV) efficiency. Conventional MPPT methods have many problems concerning accuracy, flexibility, and efficiency. The MPP depends on the PV temperature and solar irradiation, which vary randomly. In this paper an artificial-intelligence-based controller is presented, implemented as an Adaptive Neuro-Fuzzy Inference System (ANFIS), to obtain maximum power from the PV. The ANFIS inputs are the temperature and cell current, and the output is the optimal voltage at maximum power. During operation the trained ANFIS senses the PV current using a suitable sensor, and also senses the temperature, to determine the optimal operating voltage that corresponds to the current at the MPP. This voltage is used to control the boost converter duty cycle. The MATLAB simulation results show the effectiveness of the ANFIS, with sensing of the PV current, in obtaining the MPP from the PV.
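    The control step that follows the inference stage can be sketched as follows: given the inferred optimal PV voltage, the boost-converter duty cycle is set from the ideal relation v_bus = v_mpp / (1 - D). The `optimal_voltage` function below is a hypothetical toy stand-in for the trained ANFIS, not the model from the paper:

    ```python
    def boost_duty_cycle(v_mpp, v_bus):
        """Duty cycle that holds the PV terminal at v_mpp for an ideal
        boost converter feeding a v_bus DC link (v_bus = v_mpp / (1 - D))."""
        if not 0 < v_mpp < v_bus:
            raise ValueError("need 0 < v_mpp < v_bus for a boost converter")
        return 1 - v_mpp / v_bus

    # Stand-in for the trained ANFIS: a toy linear map from (temperature in C,
    # PV current in A) to optimal voltage; a real system learns this from data.
    def optimal_voltage(temp_c, i_pv):
        return 30.0 - 0.1 * (temp_c - 25.0) + 0.05 * i_pv   # illustrative only

    v_mpp = optimal_voltage(35.0, 6.0)          # 30 - 1 + 0.3 = 29.3 V
    print(round(boost_duty_cycle(v_mpp, 48.0), 3))
    ```

    In a real converter the duty cycle would also pass through a PWM stage and a control loop; this shows only the voltage-to-duty-cycle arithmetic implied by the abstract.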

  1. Children's Pragmatic Inferences as a Route for Learning about the World

    Horowitz, Alexandra C.; Frank, Michael C.

    2016-01-01

    This study investigated whether children can infer category properties based on how a speaker describes an individual (e.g., saying something is a "small zib" implies that zibs are generally bigger than this one). Three- to 5-year-olds (N = 264) from a university preschool and a children's museum were tested on their ability to make this…

  2. Students' Informal Inference about the Binomial Distribution of "Bunny Hops": A Dialogic Perspective

    Kazak, Sibel; Fujita, Taro; Wegerif, Rupert

    2016-01-01

    The study explores the development of 11-year-old students' informal inference about random bunny hops through student talk and use of computer simulation tools. Our aim in this paper is to draw on dialogic theory to explain how students make shifts in perspective, from intuition-based reasoning to more powerful, formal ways of using probabilistic…
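    The kind of computer simulation the students worked with can be sketched by tallying many rounds of random left/right hops; the number of hops, trial count, and seed below are arbitrary choices for illustration:

    ```python
    import random

    def bunny_hop_counts(n_hops=4, n_trials=10000, seed=1):
        """Simulate many rounds of fair left/right hops and tally how many
        'rights' occur per round; counts approximate Binomial(n_hops, 0.5)."""
        rng = random.Random(seed)
        counts = [0] * (n_hops + 1)
        for _ in range(n_trials):
            rights = sum(rng.random() < 0.5 for _ in range(n_hops))
            counts[rights] += 1
        return counts

    counts = bunny_hop_counts()
    print(counts)   # middle outcome (2 rights of 4 hops) is the most common
    ```

    Seeing the hump around the middle outcome emerge from repeated random trials is the shift from intuition-based to distribution-based reasoning the study describes.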

  3. Statistical inference and Aristotle's Rhetoric.

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.
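    Aristotle's "too many cases in one direction" argument corresponds to a one-sided sign test; a minimal sketch follows (the 14-of-16 example is invented for illustration):

    ```python
    from math import comb

    def sign_test_p(k, n):
        """One-sided exact binomial test: probability of at least k of n
        outcomes falling in one direction if direction were pure chance."""
        return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

    # 14 of 16 outcomes in the same direction: implausible under chance
    print(sign_test_p(14, 16))   # ≈ 0.0021
    ```

    A small tail probability is the "sign" that the chance hypothesis is unlikely, the enthymeme-like step the paper attributes to both Aristotle and Fisher.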

  4. The Interplay of Hippocampus and Ventromedial Prefrontal Cortex in Memory-Based Decision Making

    Regina A. Weilbächer

    2016-12-01

    Full Text Available Episodic memory and value-based decision making are two central and intensively studied research domains in cognitive neuroscience, but we are just beginning to understand how they interact to enable memory-based decisions. The two brain regions that have been associated with episodic memory and value-based decision making are the hippocampus and the ventromedial prefrontal cortex, respectively. In this review article, we first give an overview of these brain–behavior associations and then focus on the mechanisms of potential interactions between the hippocampus and ventromedial prefrontal cortex that have been proposed and tested in recent neuroimaging studies. Based on those possible interactions, we discuss several directions for future research on the neural and cognitive foundations of memory-based decision making.

  5. Knowledge and inference

    Nagao, Makoto

    1990-01-01

    Knowledge and Inference discusses an important problem for software systems: How do we treat knowledge and ideas on a computer and how do we use inference to solve problems on a computer? The book talks about the problems of knowledge and inference for the purpose of merging artificial intelligence and library science. The book begins by clarifying the concept of "knowledge" from many points of view, followed by a chapter on the current state of library science and the place of artificial intelligence in library science. Subsequent chapters cover central topics in the artificial intellig

  6. Mild Cognitive Impairment is Associated with Poorer Decision Making in Community-Based Older Persons

    Duke Han, S.; Boyle, Patricia A.; James, Bryan D.; Yu, Lei; Bennett, David A.

    2015-01-01

    Background/Objectives: Financial and healthcare decision making are important for maintaining wellbeing and independence in old age. We tested the hypothesis that Mild Cognitive Impairment (MCI) is associated with poorer decision making in financial and healthcare matters. Design: Community-based epidemiologic cohort study. Setting: Communities throughout Northeastern Illinois. Participants: Participants were 730 older nondemented persons from the Rush Memory and Aging Project. Measurements: All participants underwent a detailed clinical evaluation and decision making assessment using a measure that closely approximates materials utilized in real world financial and healthcare settings. This allowed for measurement of total decision making, as well as financial and healthcare decision making. Regression models were used to examine whether the presence of MCI was associated with a lower level of decision making. In subsequent analyses, we explored the relation of specific cognitive systems (i.e., episodic memory, semantic memory, working memory, perceptual speed, and visuospatial ability) with decision making in those with MCI. Results: Results showed that MCI was associated with lower decision making total scores as well as financial and healthcare scores, respectively, after accounting for the effects of age, education, and sex. The effect of MCI on total decision making was equivalent to the effect of more than 10 additional years of age. Additional models showed that when considering multiple cognitive systems, perceptual speed accounted for the most variance in decision making among participants with MCI. Conclusion: Results suggest that persons with MCI may exhibit poorer financial and healthcare decision making in real world situations, and that perceptual speed may be an important contributor to poorer decision making among persons with MCI. PMID:25850350

  7. Behavior Intention Derivation of Android Malware Using Ontology Inference

    Jian Jiao

    2018-01-01

    Previous research on Android malware has mainly focused on malware detection, and malware's evolution makes that process lag behind. The information presented by these detection results (malice judgment, family classification, and behavior characterization) is limited for analysts. Therefore, a method is needed to restore the intention of malware, which reflects the relation between the multiple behaviors of complex malware and its ultimate purpose. This paper proposes a novel description and derivation model of Android malware intention based on the theory of intention and malware reverse engineering. The approach creates an ontology for malware intention to model the semantic relation between behaviors and their objects, and automates the process of intention derivation by using SWRL rules transformed from the intention model and the Jess inference engine. Experiments on 75 typical samples show that the inference system can derive malware intention effectively, and 89.3% of the inference results are consistent with artificial analysis, which demonstrates the feasibility and effectiveness of our theory and inference system.

  8. Hydrological inferences through morphometric analysis of lower Kosi river basin of India for water resource management based on remote sensing data

    Rai, Praveen Kumar; Chandel, Rajeev Singh; Mishra, Varun Narayan; Singh, Prafull

    2018-03-01

    Satellite-based remote sensing has proven to be an effective tool for analyzing drainage networks, studying surface morphological features, and correlating them with groundwater management prospects at the basin level. The present study highlights the effectiveness and advantages of remote sensing and GIS-based analysis for quantitative and qualitative assessment of the flood plain region of the lower Kosi river basin, based on morphometric analysis. In this study, the ASTER DEM is used to extract the vital hydrological parameters of the lower Kosi river basin in ArcGIS software. Morphometric parameters, e.g., stream order, stream length, bifurcation ratio, drainage density, drainage frequency, drainage texture, form factor, circularity ratio, and elongation ratio, were calculated for the Kosi basin and their hydrological inferences were discussed. Most of the morphometric parameters, such as bifurcation ratio, drainage density, drainage frequency, and drainage texture, indicate that the basin has good prospects for water management programs for various purposes. The study also generated a database that can provide scientific information for site selection of water-harvesting structures and flood management activities in the basin. Land use land cover (LULC) maps of the basin were also prepared from Landsat data of 2005, 2010 and 2015 to assess change dynamics in the basin; these layers are valuable for further watershed prioritization.
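The morphometric parameters named in this record follow standard Horton/Schumm definitions. A minimal sketch of three of them, using textbook formulas rather than anything specific to this study:

```python
import math

# Textbook Horton/Schumm morphometric measures; units noted per function.
def bifurcation_ratio(n_u: int, n_u_plus_1: int) -> float:
    """Number of streams of order u divided by the number of order u + 1."""
    return n_u / n_u_plus_1

def drainage_density(total_stream_length_km: float, basin_area_km2: float) -> float:
    """Total channel length per unit basin area (km per km^2)."""
    return total_stream_length_km / basin_area_km2

def elongation_ratio(basin_area_km2: float, basin_length_km: float) -> float:
    """Diameter of the circle with the basin's area, divided by the
    maximum basin length (dimensionless)."""
    return (2.0 / basin_length_km) * math.sqrt(basin_area_km2 / math.pi)
```

In a GIS workflow, the stream counts, lengths, and areas fed to such formulas would come from the DEM-derived drainage network.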

  9. Probability and Statistical Inference

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  10. On quantum statistical inference

    Barndorff-Nielsen, O.E.; Gill, R.D.; Jupp, P.E.

    2003-01-01

    Interest in problems of statistical inference connected to measurements of quantum systems has recently increased substantially, in step with dramatic new developments in experimental techniques for studying small quantum systems. Furthermore, developments in the theory of quantum measurements have

  11. How to Make Reminiscence Movies: A Project-Based Gerontology Course

    Yancura, Loriena A.

    2013-01-01

    One key to successful gerontological education lies in teaching students to integrate information from diverse academic disciplines into practical contexts. This article describes a project-based course within which students learn to integrate theories by working with older adult partners to make reminiscence movies based on an important event or…

  12. Advances in the application of decision theory to test-based decision making

    van der Linden, Willem J.

    This paper reviews recent research in the Netherlands on the application of decision theory to test-based decision making about personnel selection and student placement. The review is based on an earlier model proposed for the classification of decision problems, and emphasizes an empirical

  13. Interval-Valued Intuitionistic Fuzzy Multicriteria Group Decision Making Based on VIKOR and Choquet Integral

    Chunqiao Tan

    2013-01-01

    An effective decision-making approach based on VIKOR and the Choquet integral is developed to solve multicriteria group decision-making problems with conflicting criteria and interdependent subjective preferences of decision makers in a fuzzy environment, where the preferences of decision makers with respect to criteria are represented by interval-valued intuitionistic fuzzy sets. First, an interval-valued intuitionistic fuzzy Choquet integral operator is given, and some of its properties are investigated in detail. An extended VIKOR decision procedure based on the proposed operator is then developed for solving the multicriteria group decision-making problem, where the interactive criteria weights are measured by the Shapley value. An illustrative example demonstrates the applicability of the proposed decision procedure for solving multicriteria group decision-making problems in an interval-valued intuitionistic fuzzy environment.
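For readers unfamiliar with VIKOR, the classical crisp procedure underlying such extensions can be sketched as follows. This is a simplified illustration assuming benefit-type criteria with distinct best/worst values and ordinary weighted sums; the paper replaces these with interval-valued intuitionistic fuzzy values and Choquet-integral aggregation.

```python
# Classical (crisp) VIKOR sketch. X: decision matrix (alternatives x criteria,
# benefit criteria assumed), w: criterion weights summing to 1, v: weight of
# the "group utility" strategy versus individual regret.
def vikor(X, w, v=0.5):
    m, n = len(X), len(X[0])
    f_best = [max(row[j] for row in X) for j in range(n)]
    f_worst = [min(row[j] for row in X) for j in range(n)]
    S, R = [], []
    for row in X:
        d = [w[j] * (f_best[j] - row[j]) / (f_best[j] - f_worst[j])
             for j in range(n)]
        S.append(sum(d))  # group utility (weighted sum of normalized regrets)
        R.append(max(d))  # maximal individual regret
    Q = [v * (S[i] - min(S)) / (max(S) - min(S))
         + (1 - v) * (R[i] - min(R)) / (max(R) - min(R))
         for i in range(m)]
    return Q  # lower Q means a better compromise ranking
```

For example, `vikor([[7, 9], [9, 6], [5, 5]], [0.6, 0.4])` ranks the third alternative worst (Q = 1.0), since it is farthest from the ideal on both criteria.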

  14. INFERENCE BUILDING BLOCKS

    2018-02-15

    expressed a variety of inference techniques on discrete and continuous distributions: exact inference, importance sampling, Metropolis-Hastings (MH)...without redoing any math or rewriting any code. And although our main goal is composable reuse, our performance is also good because we can use...control paths. • The Hakaru language can express mixtures of discrete and continuous distributions, but the current disintegration transformation

  15. Introductory statistical inference

    Mukhopadhyay, Nitis

    2014-01-01

    This gracefully organized text reveals the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, figures, tables, and computer simulations to develop and illustrate concepts. Drills and boxed summaries emphasize and reinforce important ideas and special techniques.Beginning with a review of the basic concepts and methods in probability theory, moments, and moment generating functions, the author moves to more intricate topics. Introductory Statistical Inference studies multivariate random variables, exponential families of dist

  16. Rationality versus reality: the challenges of evidence-based decision making for health policy makers.

    McCaughey, Deirdre; Bruning, Nealia S

    2010-05-26

    Current healthcare systems have extended the evidence-based medicine (EBM) approach to health policy and delivery decisions, such as access-to-care, healthcare funding and health program continuance, through attempts to integrate valid and reliable evidence into the decision making process. These policy decisions have major impacts on society and have high personal and financial costs associated with those decisions. Decision models such as these function under a shared assumption of rational choice and utility maximization in the decision-making process. We contend that health policy decision makers are generally unable to attain the basic goals of evidence-based decision making (EBDM) and evidence-based policy making (EBPM) because humans make decisions with their naturally limited, faulty, and biased decision-making processes. A cognitive information processing framework is presented to support this argument, and subtle cognitive processing mechanisms are introduced to support the focal thesis: health policy makers' decisions are influenced by the subjective manner in which they individually process decision-relevant information rather than on the objective merits of the evidence alone. As such, subsequent health policy decisions do not necessarily achieve the goals of evidence-based policy making, such as maximizing health outcomes for society based on valid and reliable research evidence. In this era of increasing adoption of evidence-based healthcare models, the rational choice, utility maximizing assumptions in EBDM and EBPM, must be critically evaluated to ensure effective and high-quality health policy decisions. The cognitive information processing framework presented here will aid health policy decision makers by identifying how their decisions might be subtly influenced by non-rational factors. In this paper, we identify some of the biases and potential intervention points and provide some initial suggestions about how the EBDM/EBPM process can be

  17. Rationality versus reality: the challenges of evidence-based decision making for health policy makers

    Bruning Nealia S

    2010-05-01

    Background: Current healthcare systems have extended the evidence-based medicine (EBM) approach to health policy and delivery decisions, such as access-to-care, healthcare funding and health program continuance, through attempts to integrate valid and reliable evidence into the decision making process. These policy decisions have major impacts on society and have high personal and financial costs associated with those decisions. Decision models such as these function under a shared assumption of rational choice and utility maximization in the decision-making process. Discussion: We contend that health policy decision makers are generally unable to attain the basic goals of evidence-based decision making (EBDM) and evidence-based policy making (EBPM) because humans make decisions with their naturally limited, faulty, and biased decision-making processes. A cognitive information processing framework is presented to support this argument, and subtle cognitive processing mechanisms are introduced to support the focal thesis: health policy makers' decisions are influenced by the subjective manner in which they individually process decision-relevant information rather than on the objective merits of the evidence alone. As such, subsequent health policy decisions do not necessarily achieve the goals of evidence-based policy making, such as maximizing health outcomes for society based on valid and reliable research evidence. Summary: In this era of increasing adoption of evidence-based healthcare models, the rational choice, utility-maximizing assumptions in EBDM and EBPM must be critically evaluated to ensure effective and high-quality health policy decisions. The cognitive information processing framework presented here will aid health policy decision makers by identifying how their decisions might be subtly influenced by non-rational factors.
In this paper, we identify some of the biases and potential intervention points and provide some initial

  18. Rationality versus reality: the challenges of evidence-based decision making for health policy makers

    2010-01-01

    Background Current healthcare systems have extended the evidence-based medicine (EBM) approach to health policy and delivery decisions, such as access-to-care, healthcare funding and health program continuance, through attempts to integrate valid and reliable evidence into the decision making process. These policy decisions have major impacts on society and have high personal and financial costs associated with those decisions. Decision models such as these function under a shared assumption of rational choice and utility maximization in the decision-making process. Discussion We contend that health policy decision makers are generally unable to attain the basic goals of evidence-based decision making (EBDM) and evidence-based policy making (EBPM) because humans make decisions with their naturally limited, faulty, and biased decision-making processes. A cognitive information processing framework is presented to support this argument, and subtle cognitive processing mechanisms are introduced to support the focal thesis: health policy makers' decisions are influenced by the subjective manner in which they individually process decision-relevant information rather than on the objective merits of the evidence alone. As such, subsequent health policy decisions do not necessarily achieve the goals of evidence-based policy making, such as maximizing health outcomes for society based on valid and reliable research evidence. Summary In this era of increasing adoption of evidence-based healthcare models, the rational choice, utility maximizing assumptions in EBDM and EBPM, must be critically evaluated to ensure effective and high-quality health policy decisions. The cognitive information processing framework presented here will aid health policy decision makers by identifying how their decisions might be subtly influenced by non-rational factors. 
In this paper, we identify some of the biases and potential intervention points and provide some initial suggestions about how the

  19. Robust Inventory System Optimization Based on Simulation and Multiple Criteria Decision Making

    Ahmad Mortazavi

    2014-01-01

    Inventory management in retail is a difficult and complex decision-making process involving conflicting criteria; moreover, cyclic changes and trends in demand are inevitable in many industries. In this paper, simulation modeling is used as an efficient tool for modeling a retailer's multiproduct inventory system. For simulation model optimization, a novel multicriteria and robust surrogate model is designed based on a multiple attribute decision making (MADM) method, design of experiments (DOE), and principal component analysis (PCA). This approach, the main contribution of this paper, provides a framework for robust multiple criteria decision making under uncertainty.

  20. Troger's base-based monomers, and polymers, methods of making and uses thereof

    Ma, Xiaohua; Pinnau, Ingo

    2017-01-01

    Embodiments of the present disclosure provide compounds derived by Troger's amine as shown below, microporous structures, membranes, methods of making said compounds, structures, and membranes, methods of use for gas separation, and the like (Formula A1).

  1. Troger's base-based monomers, and polymers, methods of making and uses thereof

    Ma, Xiaohua

    2017-12-28

    Embodiments of the present disclosure provide compounds derived by Troger's amine as shown below, microporous structures, membranes, methods of making said compounds, structures, and membranes, methods of use for gas separation, and the like (Formula A1).

  2. New Multi-Criteria Group Decision-Making Method Based on Vague Set Theory

    Kuo-Sui Lin

    2016-01-01

    In light of the deficiencies and limitations of existing score functions, Lin proposed a more effective and reasonable new score function for measuring vague values. Using Lin's score function and a new weighted aggregation score function, an algorithm for a multi-criteria group decision-making method was proposed to solve vague-set-based group decision-making problems in vague environments. Finally, a numerical example was illustrated to show the effectiveness of the proposed multi-...

  3. A Costing Analysis for Decision Making Grid Model in Failure-Based Maintenance

    Burhanuddin M. A.

    2011-01-01

    Background. In the current economic downturn, industries must keep tight control of production costs to maintain their profit margins. The maintenance department, as an essential unit in industry, should gather all maintenance data, process the information promptly, and transform it into useful decisions, then act on the alternatives to reduce production cost. The Decision Making Grid model is used to identify strategies for maintenance decisions. However, the model has a limitation, as it considers only two factors: downtime and frequency of failures. In this study we consider a third factor, cost, for failure-based maintenance. The objective of this paper is to introduce formulae to estimate maintenance cost. Methods. Fishbone analysis with the Ishikawa model and the Decision Making Grid method are used in this study to reveal underlying risk factors that delay failure-based maintenance. The goal of the study is to estimate the risk factor, i.e., repair cost, to fit into the Decision Making Grid model, which considers two variables, frequency of failure and downtime, in its analysis. This paper introduces repair cost as a third variable for the Decision Making Grid model. This approach gives better results in categorizing the machines, reducing cost, and boosting earnings for the manufacturing plant. Results. We collected data from a food processing factory in Malaysia. From our empirical results, Machine C, Machine D, Machine F, and Machine I must be included in the Decision Making Grid model based on the costing analysis, even though their frequencies of failure and downtime are lower than those of Machine B and Machine N. The case study and experimental results show that the cost analysis in the Decision Making Grid model yields more promising strategies for failure-based maintenance. Conclusions. The improvement of the Decision Making Grid model for decision analysis with costing analysis is our contribution in this paper.
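The three-factor categorization described in this record can be sketched as a simple rule table. The thresholds, the specific strategy labels, and the function below are illustrative assumptions, not the paper's calibrated values:

```python
# Illustrative Decision Making Grid extended with a third, cost, criterion.
def maintenance_strategy(downtime_h: float, failures: int, repair_cost: float) -> str:
    high_downtime = downtime_h > 50        # hours of downtime per period
    high_frequency = failures > 10         # breakdowns per period
    costly = repair_cost > 10_000          # the added third factor
    if high_downtime and high_frequency:
        return "design out maintenance"
    if high_downtime or costly:
        return "condition-based maintenance"
    if high_frequency:
        return "fixed time maintenance"
    return "operate to failure"
```

The point of the cost criterion is visible in the last two branches: a machine with modest downtime and failure counts but a large repair cost is still escalated rather than left to run to failure.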

  4. Probability- and model-based approaches to inference for proportion forest using satellite imagery as ancillary data

    Ronald E. McRoberts

    2010-01-01

    Estimates of forest area are among the most common and useful information provided by national forest inventories. The estimates are used for local and national purposes and for reporting to international agreements such as the Montréal Process, the Ministerial Conference on the Protection of Forests in Europe, and the Kyoto Protocol. The estimates are usually based on...

  5. Randomization-Based Inference about Latent Variables from Complex Samples: The Case of Two-Stage Sampling

    Li, Tiandong

    2012-01-01

    In large-scale assessments, such as the National Assessment of Educational Progress (NAEP), plausible values based on Multiple Imputations (MI) have been used to estimate population characteristics for latent constructs under complex sample designs. Mislevy (1991) derived a closed-form analytic solution for a fixed-effect model in creating…

  6. Social Inference Through Technology

    Oulasvirta, Antti

    Awareness cues are computer-mediated, real-time indicators of people’s undertakings, whereabouts, and intentions. Already in the mid-1970s, UNIX users could use commands such as “finger” and “talk” to find out who was online and to chat. The small icons in instant messaging (IM) applications that indicate co-conversants’ presence in the discussion space are the successors of “finger” output. Similar indicators can be found in online communities, media-sharing services, Internet relay chat (IRC), and location-based messaging applications. But presence and availability indicators are only the tip of the iceberg. Technological progress has enabled richer, more accurate, and more intimate indicators. For example, there are mobile services that allow friends to query and follow each other’s locations. Remote monitoring systems developed for health care allow relatives and doctors to assess the wellbeing of homebound patients (see, e.g., Tang and Venables 2000). But users also utilize cues that have not been deliberately designed for this purpose. For example, online gamers pay attention to other characters’ behavior to infer what the other players are like “in real life.” There is a common denominator underlying these examples: shared activities rely on the technology’s representation of the remote person. The other human being is not physically present but present only through a narrow technological channel.

  7. Development and Testing of a Decision Making Based Method to Adjust Automatically the Harrowing Intensity

    Rueda-Ayala, Victor; Weis, Martin; Keller, Martina; Andújar, Dionisio; Gerhards, Roland

    2013-01-01

    Harrowing is often used to reduce weed competition, generally using a constant intensity across a whole field. The efficacy of weed harrowing in wheat and barley can be optimized if site-specific conditions of soil, weed infestation and crop growth stage are taken into account. This study aimed to develop and test an algorithm to automatically adjust the harrowing intensity by varying the tine angle and number of passes. The field variability of crop leaf cover, weed density and soil density was acquired with geo-referenced sensors to investigate the harrowing selectivity and crop recovery. Crop leaf cover and weed density were assessed using bispectral cameras through differential image analysis. The draught force of the soil opposite to the direction of travel was measured with an electronic load cell sensor connected to a rigid tine mounted in front of the harrow. Optimal harrowing intensity levels were derived in previously implemented experiments, based on the weed control efficacy and yield gain. The assessments of crop leaf cover, weed density and soil density were combined via rules with the aforementioned optimal intensities in a linguistic fuzzy inference system (LFIS). The system was evaluated in two field experiments that compared constant intensities with variable intensities inferred by the system. A higher weed density reduction could be achieved when the harrowing intensity was not kept constant along the cultivated plot. Varying the intensity tended to reduce the crop leaf cover, though slightly improving crop yield. A real-time intensity adjustment with this system is achievable if the cameras are attached at the front and at the rear or sides of the harrow. PMID:23669712

  8. Development and Testing of a Decision Making Based Method to Adjust Automatically the Harrowing Intensity

    Roland Gerhards

    2013-05-01

    Harrowing is often used to reduce weed competition, generally using a constant intensity across a whole field. The efficacy of weed harrowing in wheat and barley can be optimized if site-specific conditions of soil, weed infestation and crop growth stage are taken into account. This study aimed to develop and test an algorithm to automatically adjust the harrowing intensity by varying the tine angle and number of passes. The field variability of crop leaf cover, weed density and soil density was acquired with geo-referenced sensors to investigate the harrowing selectivity and crop recovery. Crop leaf cover and weed density were assessed using bispectral cameras through differential image analysis. The draught force of the soil opposite to the direction of travel was measured with an electronic load cell sensor connected to a rigid tine mounted in front of the harrow. Optimal harrowing intensity levels were derived in previously implemented experiments, based on the weed control efficacy and yield gain. The assessments of crop leaf cover, weed density and soil density were combined via rules with the aforementioned optimal intensities in a linguistic fuzzy inference system (LFIS). The system was evaluated in two field experiments that compared constant intensities with variable intensities inferred by the system. A higher weed density reduction could be achieved when the harrowing intensity was not kept constant along the cultivated plot. Varying the intensity tended to reduce the crop leaf cover, though slightly improving crop yield. A real-time intensity adjustment with this system is achievable if the cameras are attached at the front and at the rear or sides of the harrow.

  9. NetCooperate: a network-based tool for inferring host-microbe and microbe-microbe cooperation

    Levy, Roie; Carr, Rogan; Kreimer, Anat; Freilich, Shiri; Borenstein, Elhanan

    2015-01-01

    Background Host-microbe and microbe-microbe interactions are often governed by the complex exchange of metabolites. Such interactions play a key role in determining the way pathogenic and commensal species impact their host and in the assembly of complex microbial communities. Recently, several studies have demonstrated how such interactions are reflected in the organization of the metabolic networks of the interacting species, and introduced various graph theory-based methods to predict host...

  10. Where and why hyporheic exchange is important: Inferences from a parsimonious, physically-based river network model

    Gomez-Velez, J. D.; Harvey, J. W.

    2014-12-01

    Hyporheic exchange has been hypothesized to have basin-scale consequences; however, predictions throughout river networks are limited by available geomorphic and hydrogeologic data as well as models that can analyze and aggregate hyporheic exchange flows across large spatial scales. We developed a parsimonious but physically-based model of hyporheic flow for application in large river basins: Networks with EXchange and Subsurface Storage (NEXSS). At the core of NEXSS is a characterization of the channel geometry, geomorphic features, and related hydraulic drivers based on scaling equations from the literature and readily accessible information such as river discharge, bankfull width, median grain size, sinuosity, channel slope, and regional groundwater gradients. Multi-scale hyporheic flow is computed based on combining simple but powerful analytical and numerical expressions that have been previously published. We applied NEXSS across a broad range of geomorphic diversity in river reaches and synthetic river networks. NEXSS demonstrates that vertical exchange beneath submerged bedforms dominates hyporheic fluxes and turnover rates along the river corridor. Moreover, the hyporheic zone's potential for biogeochemical transformations is comparable across stream orders, but the abundance of lower-order channels results in a considerably higher cumulative effect for low-order streams. Thus, vertical exchange beneath submerged bedforms has more potential for biogeochemical transformations than lateral exchange beneath banks, although lateral exchange through meanders may be important in large rivers. These results have implications for predicting outcomes of river and basin management practices.

  11. Integration Strategy Is a Key Step in Network-Based Analysis and Dramatically Affects Network Topological Properties and Inferring Outcomes

    Jin, Nana; Wu, Deng; Gong, Yonghui; Bi, Xiaoman; Jiang, Hong; Li, Kongning; Wang, Qianghu

    2014-01-01

    An increasing number of experiments have been designed to detect intracellular and intercellular molecular interactions. Based on these molecular interactions (especially protein interactions), molecular networks have been built for use in several typical applications, such as the discovery of new disease genes and the identification of drug targets and molecular complexes. Because the data are incomplete and a considerable number of false-positive interactions exist, protein interactions from different sources are commonly integrated in network analyses to build a stable molecular network. Although various types of integration strategies are being applied in current studies, the topological properties of the networks from these different integration strategies, especially typical applications based on these network integration strategies, have not been rigorously evaluated. In this paper, systematic analyses were performed to evaluate 11 frequently used methods using two types of integration strategies: empirical and machine learning methods. The topological properties of the networks of these different integration strategies were found to significantly differ. Moreover, these networks were found to dramatically affect the outcomes of typical applications, such as disease gene predictions, drug target detections, and molecular complex identifications. The analysis presented in this paper could provide an important basis for future network-based biological research. PMID:25243127

  12. Efficient fuzzy Bayesian inference algorithms for incorporating expert knowledge in parameter estimation

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad

    2016-05-01

    Bayesian inference has traditionally been conceived as the proper framework for the formal incorporation of expert knowledge in parameter estimation of groundwater models. However, conventional Bayesian inference is incapable of taking into account the imprecision essentially embedded in expert-provided information. In order to solve this problem, a number of extensions to conventional Bayesian inference have been introduced in recent years. One of these extensions is 'fuzzy Bayesian inference', which is the result of integrating fuzzy techniques into Bayesian statistics. Fuzzy Bayesian inference has a number of desirable features which make it an attractive approach for incorporating expert knowledge in the parameter estimation process of groundwater models: (1) it is well adapted to the nature of expert-provided information, (2) it allows both uncertainty and imprecision to be modeled distinguishably, and (3) it presents a framework for fusing expert-provided information regarding the various inputs of the Bayesian inference algorithm. However, an important obstacle to employing fuzzy Bayesian inference in groundwater numerical modeling applications is the computational burden, as the required number of numerical model simulations often becomes extremely large and computationally infeasible. In this paper, a novel approach to accelerating the fuzzy Bayesian inference algorithm is proposed, based on using approximate posterior distributions derived from surrogate modeling as a screening tool in the computations. The proposed approach is first applied to a synthetic test case of seawater intrusion (SWI) in a coastal aquifer. It is shown that for this synthetic test case, the proposed approach decreases the number of required numerical simulations by an order of magnitude. Then the proposed approach is applied to a real-world test case involving three-dimensional numerical modeling of SWI in Kish Island, located in the Persian Gulf. An expert

  13. The Effect of Conflict Theory Based Decision-Making Skill Training Psycho-Educational Group Experience on Decision Making Styles of Adolescents

    Colakkadioglu, Oguzhan; Gucray, S. Sonay

    2012-01-01

    In this study, the effect of conflict theory based decision making skill training group applications on decision making styles of adolescents was investigated. A total of 36 students, including 18 students in experimental group and 18 students in control group, participated in the research. When assigning students to experimental group or control…

  14. Vicarious Effort-Based Decision-Making in Autism Spectrum Disorders.

    Mosner, Maya G; Kinard, Jessica L; McWeeny, Sean; Shah, Jasmine S; Markiewitz, Nathan D; Damiano-Goodwin, Cara R; Burchinal, Margaret R; Rutherford, Helena J V; Greene, Rachel K; Treadway, Michael T; Dichter, Gabriel S

    2017-10-01

    This study investigated vicarious effort-based decision-making in 50 adolescents with autism spectrum disorders (ASD) compared to 32 controls using the Effort Expenditure for Rewards Task. Participants made choices to win money for themselves or for another person. When choosing for themselves, the ASD group exhibited relatively similar patterns of effort-based decision-making across reward parameters. However, when choosing for another person, the ASD group demonstrated relatively decreased sensitivity to reward magnitude, particularly in the high magnitude condition. Finally, patterns of responding in the ASD group were related to individual differences in consummatory pleasure capacity. These findings indicate atypical vicarious effort-based decision-making in ASD and more broadly add to the growing body of literature addressing social reward processing deficits in ASD.

  15. NetCooperate: a network-based tool for inferring host-microbe and microbe-microbe cooperation.

    Levy, Roie; Carr, Rogan; Kreimer, Anat; Freilich, Shiri; Borenstein, Elhanan

    2015-05-17

    Host-microbe and microbe-microbe interactions are often governed by the complex exchange of metabolites. Such interactions play a key role in determining the way pathogenic and commensal species impact their host and in the assembly of complex microbial communities. Recently, several studies have demonstrated how such interactions are reflected in the organization of the metabolic networks of the interacting species, and introduced various graph theory-based methods to predict host-microbe and microbe-microbe interactions directly from network topology. Using these methods, such studies have revealed evolutionary and ecological processes that shape species interactions and community assembly, highlighting the potential of this reverse-ecology research paradigm. NetCooperate is a web-based tool and a software package for determining host-microbe and microbe-microbe cooperative potential. It specifically calculates two previously developed and validated metrics for species interaction: the Biosynthetic Support Score which quantifies the ability of a host species to supply the nutritional requirements of a parasitic or a commensal species, and the Metabolic Complementarity Index which quantifies the complementarity of a pair of microbial organisms' niches. NetCooperate takes as input a pair of metabolic networks, and returns the pairwise metrics as well as a list of potential syntrophic metabolic compounds. The Biosynthetic Support Score and Metabolic Complementarity Index provide insight into host-microbe and microbe-microbe metabolic interactions. NetCooperate determines these interaction indices from metabolic network topology, and can be used for small- or large-scale analyses. NetCooperate is provided as both a web-based tool and an open-source Python module; both are freely available online at http://elbo.gs.washington.edu/software_netcooperate.html.
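
The Metabolic Complementarity Index idea can be illustrated with a toy calculation. Here each network is a list of reactions `(substrates, products)`, and seed compounds are approximated as compounds consumed but never produced; this is a crude simplification of the SCC-based seed detection NetCooperate actually uses, and the compound names are invented:

```python
def seed_set(reactions):
    # compounds consumed but never produced: a rough proxy for the
    # exogenously acquired "seed set" of a metabolic network
    consumed, produced = set(), set()
    for substrates, products in reactions:
        consumed.update(substrates)
        produced.update(products)
    return consumed - produced

def complementarity_index(provider, consumer):
    # fraction of the consumer's seed compounds the provider can synthesize
    seeds = seed_set(consumer)
    provider_products = set()
    for _, products in provider:
        provider_products.update(products)
    return len(seeds & provider_products) / len(seeds) if seeds else 0.0

host = [({"A"}, {"B"}), ({"B"}, {"C"})]      # host can make B and C
microbe = [({"C"}, {"D"}), ({"E"}, {"F"})]   # microbe needs C and E from outside
score = complementarity_index(host, microbe)  # C is covered, E is not
```

The seeds shared between `seeds` and `provider_products` are also exactly the candidate syntrophic compounds the tool reports.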

  16. Satellite-based evidence of wavelength-dependent aerosol absorption in biomass burning smoke inferred from Ozone Monitoring Instrument

    H. Jethva

    2011-10-01

    We provide satellite-based evidence of the spectral dependence of absorption in biomass burning aerosols over South America using near-UV measurements made by the Ozone Monitoring Instrument (OMI) during 2005–2007. In the current near-UV OMI aerosol algorithm (OMAERUV), it is implicitly assumed that the only absorbing component in carbonaceous aerosols is black carbon, whose imaginary component of the refractive index is wavelength independent. With this assumption, OMI-derived aerosol optical depth (AOD) is found to be significantly overestimated compared to that of AERONET at several sites during intense biomass burning events (August-September). Other well-known sources of error affecting the near-UV method of aerosol retrieval do not explain the large observed AOD discrepancies between the satellite and the ground-based observations. A number of studies have revealed strong spectral dependence in carbonaceous aerosol absorption in the near-UV region, suggesting the presence of organic carbon in biomass burning generated aerosols. A sensitivity analysis examining the importance of accounting for the presence of wavelength-dependent aerosol absorption in carbonaceous particles in satellite-based remote sensing was carried out in this work. The results convincingly show that the inclusion of spectrally-dependent aerosol absorption in the radiative transfer calculations leads to a more accurate characterization of the atmospheric load of carbonaceous aerosols. The use of a new set of aerosol models assuming wavelength-dependent aerosol absorption in the near-UV region (absorption varying as λ^−2.5 to λ^−3.0, i.e., an Absorption Angstrom Exponent of 2.5 to 3.0) improved the OMAERUV retrieval results by significantly reducing the AOD bias observed when gray aerosols were assumed. In addition, the new retrieval of single-scattering albedo is in better agreement with those of AERONET within the uncertainties (ΔSSA = ±0.03). The new colored carbonaceous aerosol model was also found to
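
The wavelength dependence at issue is conventionally summarized by the Absorption Angstrom Exponent (AAE), defined through τ_abs(λ) ∝ λ^−AAE, so two absorption optical depths at two wavelengths determine it. A sketch with illustrative values at OMI's near-UV channels (not actual OMI retrievals):

```python
import math

def absorption_angstrom_exponent(tau_1, lam_1, tau_2, lam_2):
    # AAE from absorption optical depths tau at wavelengths lam,
    # assuming tau_abs(lam) is proportional to lam ** (-AAE)
    return -math.log(tau_1 / tau_2) / math.log(lam_1 / lam_2)

# illustrative absorption optical depths at 354 nm and 388 nm
aae = absorption_angstrom_exponent(0.10, 354.0, 0.08, 388.0)
```

Black carbon alone gives AAE near 1 (gray absorption); values around 2.5 to 3.0, as adopted in the new aerosol models above, indicate additional organic (brown carbon) absorption.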

  17. Inference of hierarchical regulatory network of estrogen-dependent breast cancer through ChIP-based data

    Parvin Jeffrey

    2010-12-01

    Background: Global profiling of in vivo protein-DNA interactions using ChIP-based technologies has evolved rapidly in recent years. Although many genome-wide studies have identified thousands of ERα binding sites and have revealed the associated transcription factor (TF) partners, such as AP1, FOXA1 and CEBP, little is known about ERα-associated hierarchical transcriptional regulatory networks. Results: In this study, we applied computational approaches to analyze three publicly available ChIP-based datasets: ChIP-seq, ChIP-PET and ChIP-chip, and to investigate the hierarchical regulatory network for ERα and its partner TFs in estrogen-dependent breast cancer MCF7 cells. Sixteen common TFs and two common new TF partners (RORA and PITX2) were found among the ChIP-seq, ChIP-chip and ChIP-PET datasets. The regulatory networks were constructed by scanning the ChIP-peak regions with TF-specific position weight matrices (PWMs). A permutation test was performed to test the reliability of each connection of the network. We then used the DREM software to perform gene ontology function analysis on the common genes. We found that FOS, PITX2, RORA and FOXA1 were involved in the up-regulated genes. We also conducted ERα and Pol-II ChIP-seq experiments in tamoxifen-resistant MCF7 cells (denoted MCF7-T in this study) and compared the differences between MCF7 and MCF7-T cells. The results showed very little overlap between these two cell lines in terms of targeted genes (21.2% common genes) and targeted TFs (25% common TFs). This significant dissimilarity may indicate entirely different transcriptional regulatory mechanisms in the two cancer cell lines. Conclusions: Our study uncovers new estrogen-mediated regulatory networks by mining three ChIP-based datasets in MCF7 cells and ChIP-seq data in MCF7-T cells. We compared the different ChIP-based technologies as well as the different breast cancer cells. Our computational analytical approach may guide biologists to
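
Scanning a ChIP-peak region with a TF-specific position weight matrix reduces to summing per-position log-odds scores over a sliding window. A minimal sketch with an invented length-4 PWM (a real ERα or partner-TF PWM would be longer and estimated from binding data):

```python
# invented log-odds PWM favoring the motif "ACGT" (illustrative values)
PWM = [
    {"A": 1.0, "C": -1.0, "G": -1.0, "T": -0.5},
    {"A": -1.0, "C": 1.2, "G": -0.8, "T": -1.0},
    {"A": -0.7, "C": -1.0, "G": 1.1, "T": -1.0},
    {"A": -1.0, "C": -0.5, "G": -1.0, "T": 0.9},
]

def scan(seq, pwm, threshold):
    # score every window of len(pwm) bases; keep windows above threshold
    width = len(pwm)
    hits = []
    for i in range(len(seq) - width + 1):
        score = sum(pwm[j][seq[i + j]] for j in range(width))
        if score >= threshold:
            hits.append((i, score))
    return hits

hits = scan("TTACGTGG", PWM, threshold=3.0)  # single hit at offset 2
```

The permutation test mentioned above would then compare such hit scores against scores obtained on shuffled sequences to assess the reliability of each network connection.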

  18. Temporal Variability of Total Ozone in the Asian Region Inferred from Ground-Based and Satellite Measurement Data

    Visheratin, K. N.; Nerushev, A. F.; Orozaliev, M. D.; Zheng, Xiangdong; Sun, Shumen; Liu, Li

    2017-12-01

    This paper reports data on the temporal variability of total ozone content (TOC) in the Central Asian and Tibetan Plateau mountain regions obtained by conventional methods, as well as by spectral, cross-wavelet, and composite analyses. Data from ground-based observation stations located at Huang He, Kunming, and Lake Issyk-Kul, along with satellite data from SBUV/SBUV2 (SBUV merged total and profile ozone data, Version 8.6) for 1980-2013 and from OMI (Ozone Monitoring Instrument) and TOU (Total Ozone Unit) for 2009-2013, have been used. The average relative deviation from the SBUV/SBUV2 data is less than 1% at Kunming and Issyk-Kul for the period 1980-2013, while the Huang He station is characterized by an excess of the satellite data over the ground-based measurements, with an average deviation of 2%. According to the Fourier analysis results, the distribution of amplitudes and periods of TOC oscillations in the range above 14 months is similar for all series analyzed. Meanwhile, according to the cross-wavelet and composite analysis results, the phase relationships between the series may differ considerably, especially at periods of 5-7 years. The phase of the quasi-decennial oscillations at the Kunming station is close to that of the 11-year solar cycle, while at the Huang He and Issyk-Kul stations the TOC variations lead the solar cycle.
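
The Fourier step in such an analysis amounts to locating the dominant periods in the amplitude spectrum of a TOC series. A sketch on a synthetic monthly series with a built-in annual cycle (illustrative values, not the station data):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 360  # 30 years of monthly values
t = np.arange(n)
# synthetic TOC-like series: mean level + annual cycle + noise (Dobson units)
series = 300.0 + 20.0 * np.sin(2 * np.pi * t / 12) + 5.0 * rng.standard_normal(n)

amplitudes = np.abs(np.fft.rfft(series - series.mean()))
freqs = np.fft.rfftfreq(n, d=1.0)                      # cycles per month
dominant_period = 1.0 / freqs[np.argmax(amplitudes)]   # in months
```

Fourier amplitudes are global, however; the cross-wavelet analysis used above additionally resolves how the phase relation between two series varies over time.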

  19. Genealogy-based methods for inference of historical recombination and gene flow and their application in Saccharomyces cerevisiae.

    Jenkins, Paul A; Song, Yun S; Brem, Rachel B

    2012-01-01

    Genetic exchange between isolated populations, or introgression between species, serves as a key source of novel genetic material on which natural selection can act. While detecting historical gene flow from DNA sequence data is of much interest, many existing methods can be limited by requirements for deep population genomic sampling. In this paper, we develop a scalable genealogy-based method to detect candidate signatures of gene flow into a given population when the source of the alleles is unknown. Our method does not require sequenced samples from the source population, provided that the alleles have not reached fixation in the sampled recipient population. The method utilizes recent advances in algorithms for the efficient reconstruction of ancestral recombination graphs, which encode genealogical histories of DNA sequence data at each site, and is capable of detecting the signatures of gene flow whose footprints are of length up to single genes. Further, we employ a theoretical framework based on coalescent theory to test for statistical significance of certain recombination patterns consistent with gene flow from divergent sources. Implementing these methods for application to whole-genome sequences of environmental yeast isolates, we illustrate the power of our approach to highlight loci with unusual recombination histories. By developing innovative theory and methods to analyze signatures of gene flow from population sequence data, our work establishes a foundation for the continued study of introgression and its evolutionary relevance.

  20. Inferring the demographic history from DNA sequences: An importance sampling approach based on non-homogeneous processes.

    Ait Kaci Azzou, S; Larribe, F; Froda, S

    2016-10-01

    In Ait Kaci Azzou et al. (2015) we introduced an Importance Sampling (IS) approach for estimating the demographic history of a sample of DNA sequences, the skywis plot. More precisely, we proposed a new nonparametric estimate of a population size that changes over time. We showed on simulated data that the skywis plot can work well in typical situations where the effective population size does not undergo very steep changes. In this paper, we introduce an iterative procedure which extends the previous method and gives good estimates under such rapid variations. In the iterative calibrated skywis plot we approximate the effective population size by a piecewise constant function, whose values are re-estimated at each step. These piecewise constant functions are used to generate the waiting times of non-homogeneous Poisson processes related to a coalescent process with mutation under a variable population size model. Moreover, the present IS procedure is based on a modified version of the Stephens and Donnelly (2000) proposal distribution. Finally, we apply the iterative calibrated skywis plot method to a simulated data set from a rapidly expanding exponential model, and show that the method based on this new IS strategy correctly reconstructs the demographic history.
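
Generating the waiting times of a non-homogeneous Poisson process with a piecewise-constant intensity, as used above, can be done by thinning: propose events from a homogeneous process at the maximum rate and keep each with probability λ(t)/λ_max. A sketch with invented rates (not the coalescent-specific intensities of the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def nhpp_event_times(rate, rate_max, t_end):
    # thinning: homogeneous proposals at rate_max, each accepted
    # with probability rate(t) / rate_max
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > t_end:
            return np.array(times)
        if rng.uniform() < rate(t) / rate_max:
            times.append(t)

def rate(t):
    # piecewise-constant intensity, mimicking a piecewise-constant
    # effective population size (values invented)
    return 2.0 if t < 5.0 else 0.5

events = nhpp_event_times(rate, rate_max=2.0, t_end=10.0)
waits = np.diff(events, prepend=0.0)  # waiting times between events
```

In the iterative procedure above, the piecewise-constant function playing the role of `rate` is itself re-estimated at each step.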

  1. Bayesian inferences of generation and growth of corrosion defects on energy pipelines based on imperfect inspection data

    Qin, H.; Zhou, W.; Zhang, S.

    2015-01-01

    Stochastic process-based models are developed to characterize the generation and growth of metal-loss corrosion defects on oil and gas steel pipelines. The generation of corrosion defects over time is characterized by the non-homogeneous Poisson process, and the growth of the depths of individual defects is modeled by the non-homogeneous gamma process (NHGP). The defect generation and growth models are formulated in a hierarchical Bayesian framework, whereby the parameters of the models are evaluated from the in-line inspection (ILI) data through Bayesian updating, accounting for the probability of detection (POD) and measurement errors associated with the ILI data. Markov Chain Monte Carlo (MCMC) simulation in conjunction with the data augmentation (DA) technique is employed to carry out the Bayesian updating. Numerical examples that involve simulated ILI data are used to illustrate and validate the proposed methodology. - Highlights: • Bayesian updating of growth and generation models of defects on energy pipelines. • Non-homogeneous Poisson process for defect generation. • Non-homogeneous gamma process for defect growth. • Updating based on inspection data with detecting and sizing uncertainties. • MCMC in conjunction with data augmentation technique employed for the updating.
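
The growth component can be sketched directly: a non-homogeneous gamma process has independent increments over [t_{i-1}, t_i] distributed Gamma(η(t_i) − η(t_{i-1}), scale) for a nondecreasing shape function η. The power-law η and the parameter values below are illustrative, not calibrated to ILI data:

```python
import numpy as np

rng = np.random.default_rng(2)

def gamma_process_path(times, shape_fn, scale):
    # cumulative defect depth; each increment is Gamma(delta_shape, scale)
    depth = [0.0]
    for a, b in zip(times[:-1], times[1:]):
        increment = rng.gamma(shape_fn(b) - shape_fn(a), scale)
        depth.append(depth[-1] + increment)
    return np.array(depth)

def shape_fn(t):
    # power-law shape function, a common NHGP choice (invented values)
    return 0.8 * t ** 1.2

times = np.linspace(0.0, 10.0, 21)                      # years since generation
path = gamma_process_path(times, shape_fn, scale=0.1)   # depth in, say, mm
```

In the hierarchical Bayesian setup above, the parameters of `shape_fn` and `scale` are what get updated from the ILI data, with POD and sizing errors entering through the likelihood.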

  2. Beyond Bioethics: A Child Rights-Based Approach to Complex Medical Decision-Making.

    Wade, Katherine; Melamed, Irene; Goldhagen, Jeffrey

    2016-01-01

    This analysis adopts a child rights approach, based on the principles, standards, and norms of child rights and the U.N. Convention on the Rights of the Child (CRC), to explore how decisions could be made with regard to treatment of a severely impaired infant (Baby G). While a child rights approach does not provide neat answers to ethically complex issues, it does provide a framework for decision-making in which the infant is viewed as an independent rights-holder. The state has obligations to develop the capacity of those who make decisions for infants in such situations to meet their obligations to respect, protect, and fulfill their rights as delineated in the CRC. Furthermore, a child rights approach requires procedural clarity and transparency in decision-making processes. As all rights in the CRC are interdependent and indivisible, all must be considered in the process of ethical decision-making, and the reasons for decisions must be delineated by reference to how these rights were considered. It is also important that decisions made in this context be monitored and reviewed to ensure consistency. A rights-based framework ensures decision-making is child-centered and that there are transparent criteria and legitimate procedures for making decisions regarding the child's most basic human right: the right to life, survival, and development.

  3. Apathy and emotion-based decision-making in amnesic mild cognitive impairment and Alzheimer's disease.

    Bayard, Sophie; Jacus, Jean-Pierre; Raffard, Stéphane; Gely-Nargeot, Marie-Christine

    2014-01-01

    Apathy and reduced emotion-based decision-making are two behavioral modifications independently described in Alzheimer's disease (AD) and amnestic mild cognitive impairment (aMCI). The aims of this study were to investigate decision-making based on emotional feedback processing in AD and aMCI and to study the impact of reduced decision-making performances on apathy. We recruited 20 patients with AD, 20 participants with aMCI, and 20 healthy controls. All participants completed the Lille apathy rating scale (LARS) and the Iowa gambling task (IGT). Both aMCI and AD participants had reduced performances on the IGT and were more apathetic compared to controls without any difference between aMCI and AD groups. For the entire sample, LARS initiation dimension was related to IGT disadvantageous decision-making profile. We provide the first study showing that both aMCI and AD individuals make less profitable decisions than controls, whereas aMCI and AD did not differ. Disadvantageous decision-making profile on the IGT was associated with higher level of apathy on the action initiation dimension. The role of an abnormal IGT performance as a risk factor for the development of apathy needs to be investigated in other clinical populations and in normal aging.

  4. Hesitant Fuzzy Thermodynamic Method for Emergency Decision Making Based on Prospect Theory.

    Ren, Peijia; Xu, Zeshui; Hao, Zhinan

    2017-09-01

    Due to the timeliness required of emergency response and the large amount of unknown information in emergency situations, this paper proposes a method for emergency decision making that comprehensively reflects the emergency decision making process. Utilizing hesitant fuzzy elements to represent the fuzziness of the objects and the hesitant thinking of the experts, this paper introduces a negative exponential function into prospect theory so as to portray the psychological behaviors of the experts, which transforms the hesitant fuzzy decision matrix into the hesitant fuzzy prospect decision matrix (HFPDM) according to the expectation-levels. Then, this paper applies the concepts of energy and entropy from thermodynamics to take both the quantity and the quality of the decision values into account, and defines thermodynamic decision making parameters based on the HFPDM. Accordingly, a whole procedure for emergency decision making is constructed. Moreover, experiments are designed to demonstrate and validate the emergency decision making procedure. Finally, this paper presents a case study of emergency decision making for the fire and explosion at the Port Group in Tianjin Binhai New Area, which demonstrates the effectiveness and practicability of the proposed method.
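
The role of the prospect value function can be illustrated with the classic Kahneman-Tversky form; the paper itself uses a negative-exponential variant, so treat this as a generic sketch of how gains and losses around an expectation-level reference point are valued asymmetrically (the parameter values are the conventional literature estimates):

```python
def prospect_value(outcome, reference=0.0, alpha=0.88, beta=0.88, lam=2.25):
    # value relative to a reference point; losses loom larger than gains
    x = outcome - reference
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

gain = prospect_value(1.0)   # a unit gain
loss = prospect_value(-1.0)  # a unit loss: same magnitude, larger felt impact
```

Replacing the power functions with negative exponentials, as the paper does, preserves this asymmetry while bounding the value of extreme outcomes.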

  5. Evaluating and improving count-based population inference: A case study from 31 years of monitoring Sandhill Cranes

    Gerber, Brian D.; Kendall, William L.

    2017-01-01

    Monitoring animal populations can be difficult. Limited resources often force monitoring programs to rely on unadjusted or smoothed counts as an index of abundance. Smoothing counts is commonly done using a moving-average estimator to dampen sampling variation. These indices are commonly used to inform management decisions, although their reliability is often unknown. We outline a process to evaluate the biological plausibility of annual changes in population counts and indices from a typical monitoring scenario and compare results with a hierarchical Bayesian time series (HBTS) model. We evaluated spring and fall counts, fall indices, and model-based predictions for the Rocky Mountain population (RMP) of Sandhill Cranes (Antigone canadensis) by integrating juvenile recruitment, harvest, and survival into a stochastic stage-based population model. We used simulation to evaluate population indices from the HBTS model and the commonly used 3-yr moving average estimator. We found counts of the RMP to exhibit biologically unrealistic annual change, while the fall population index was largely biologically realistic. HBTS model predictions suggested that the RMP changed little over 31 yr of monitoring, but the pattern depended on assumptions about the observational process. The HBTS model fall population predictions were biologically plausible if observed crane harvest mortality was compensatory up to natural mortality, as empirical evidence suggests. Simulations indicated that the predicted mean of the HBTS model was generally a more reliable estimate of the true population than population indices derived using a moving 3-yr average estimator. Practitioners could gain considerable advantages from modeling population counts using a hierarchical Bayesian autoregressive approach. 
Advantages would include: (1) obtaining measures of uncertainty; (2) incorporating direct knowledge of the observational and population processes; (3) accommodating missing years of data; and (4
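
The index smoothing referred to above is typically a 3-yr moving average of the raw counts. A centered version with shrinking windows at the series ends (monitoring programs sometimes use a trailing window instead; the count values are invented):

```python
def moving_average(counts, window=3):
    # centered moving average; windows shrink at the ends of the series
    half = window // 2
    smoothed = []
    for i in range(len(counts)):
        lo, hi = max(0, i - half), min(len(counts), i + half + 1)
        smoothed.append(sum(counts[lo:hi]) / (hi - lo))
    return smoothed

index = moving_average([18000, 21000, 17000, 23000, 19000])
```

Unlike the HBTS model's predictions, such an index carries no uncertainty measure and cannot absorb knowledge of the observation or population processes, which is the core limitation the paper addresses.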

  6. Inferring gene dependency network specific to phenotypic alteration based on gene expression data and clinical information of breast cancer.

    Zhou, Xionghui; Liu, Juan

    2014-01-01

    Although many methods have been proposed to reconstruct gene regulatory networks, most of them, when applied to sample-based data, cannot reveal the gene regulatory relations underlying a phenotypic change (e.g. normal versus cancer). In this paper, we adopt phenotype as a variable when constructing the gene regulatory network, whereas previous studies either neglected it or used it only to select the differentially expressed genes as inputs for constructing the network. To be specific, we integrate phenotype information with gene expression data to identify gene dependency pairs by using conditional mutual information. A gene dependency pair (A,B) means that the influence of gene A on the phenotype depends on gene B. All identified gene dependency pairs constitute a directed network underlying the phenotype, namely the gene dependency network. In this way, we have constructed the gene dependency network of breast cancer from gene expression data along with two different phenotype states (metastasis and non-metastasis). Moreover, we have found the network to be scale-free, indicating that its hub genes with high out-degrees may play critical roles in the network. After functional investigation, these hub genes are found to be biologically significant and specifically related to breast cancer, which suggests that our gene dependency network is meaningful. Its validity has also been confirmed by literature investigation. From the network, we selected 43 discriminative hubs as a signature to build a classification model for distinguishing the distant metastasis risks of breast cancer patients, and the result outperforms classification models built with published signatures. In conclusion, we have proposed a promising way to construct the gene regulatory network from sample-based data, which has been shown to be effective and accurate in uncovering the hidden mechanism of the biological process and identifying the gene signature for
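
For discretized expression levels and a binary phenotype, the conditional mutual information I(A;P|B) that underlies such dependency detection can be estimated from empirical frequencies. A minimal sketch (real analyses must also discretize continuous expression and correct for estimation bias):

```python
import math
from collections import Counter

def conditional_mutual_information(a, p, b):
    # I(A;P|B) = sum over (a,b,p) of
    #   p(a,b,p) * log[ p(a,b,p) * p(b) / (p(a,b) * p(b,p)) ]   (in nats)
    n = len(a)
    n_abp = Counter(zip(a, b, p))
    n_ab = Counter(zip(a, b))
    n_bp = Counter(zip(b, p))
    n_b = Counter(b)
    total = 0.0
    for (ai, bi, pi), c in n_abp.items():
        total += (c / n) * math.log(c * n_b[bi] / (n_ab[(ai, bi)] * n_bp[(bi, pi)]))
    return total

# phenotype P mirrors gene A exactly while B is constant, so I(A;P|B) = log 2
a = [0, 1, 0, 1]
p = [0, 1, 0, 1]
b = [0, 0, 0, 0]
score = conditional_mutual_information(a, p, b)
```

A large I(A;P|B) relative to I(A;P) alone is what flags (A,B) as a candidate dependency pair.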

  7. Goal inferences about robot behavior : goal inferences and human response behaviors

    Broers, H.A.T.; Ham, J.R.C.; Broeders, R.; De Silva, P.; Okada, M.

    2014-01-01

    This explorative research focused on the goal inferences human observers draw based on a robot's behavior, and the extent to which those inferences predict people's behavior in response to that robot. Results show that different robot behaviors cause different response behavior from people.

  8. Design of an expert system based on neuro-fuzzy inference analyzer for on-line microstructural characterization using magnetic NDT method

    Ghanei, S.; Vafaeenezhad, H.; Kashefi, M.; Eivani, A.R.; Mazinani, M.

    2015-01-01

    Tracing microstructural evolution is of significant importance and priority in the manufacturing lines of dual-phase steels. In this paper, an artificial intelligence method is presented for on-line microstructural characterization of dual-phase steels. A new method for microstructure characterization, based on the magnetic Barkhausen noise nondestructive testing method, is introduced using an adaptive neuro-fuzzy inference system (ANFIS). In order to accurately predict the martensite volume fraction of dual-phase steels while eliminating the effect and interference of frequency on the magnetic Barkhausen noise outputs, the magnetic responses were fed into the ANFIS structure in terms of the position, height and width of the Barkhausen profiles. The results showed that the ANFIS approach has the potential to detect and characterize microstructural evolution while the considerable effect of frequency on the magnetic outputs is bypassed. In fact, implementing multiple outputs simultaneously enables ANFIS to arrive at accurate results using only the height, position and width of the magnetic Barkhausen noise peaks, without knowing the value of the frequency used. - Highlights: • New NDT system for microstructural evaluation based on MBN using ANFIS modeling. • Sensitivity of magnetic Barkhausen noise to microstructure changes of the DP steels. • Accurate prediction of martensite by feeding multiple MBN outputs simultaneously. • Obtaining the modeled output without knowing the amount of the used frequency

  9. Design of an expert system based on neuro-fuzzy inference analyzer for on-line microstructural characterization using magnetic NDT method

    Ghanei, S., E-mail: Sadegh.Ghanei@yahoo.com [Department of Materials Engineering, Faculty of Engineering, Ferdowsi University of Mashhad, Azadi Square, Mashhad (Iran, Islamic Republic of); Vafaeenezhad, H. [Centre of Excellence for High Strength Alloys Technology (CEHSAT), School of Metallurgical and Materials Engineering, Iran University of Science and Technology (IUST), Narmak, Tehran (Iran, Islamic Republic of); Kashefi, M. [Department of Materials Engineering, Faculty of Engineering, Ferdowsi University of Mashhad, Azadi Square, Mashhad (Iran, Islamic Republic of); Eivani, A.R. [Centre of Excellence for High Strength Alloys Technology (CEHSAT), School of Metallurgical and Materials Engineering, Iran University of Science and Technology (IUST), Narmak, Tehran (Iran, Islamic Republic of); Mazinani, M. [Department of Materials Engineering, Faculty of Engineering, Ferdowsi University of Mashhad, Azadi Square, Mashhad (Iran, Islamic Republic of)

    2015-04-01

    Tracing microstructural evolution is of significant importance and priority in the manufacturing lines of dual-phase steels. In this paper, an artificial intelligence method is presented for on-line microstructural characterization of dual-phase steels. A new method for microstructure characterization, based on the magnetic Barkhausen noise nondestructive testing method, is introduced using an adaptive neuro-fuzzy inference system (ANFIS). In order to accurately predict the martensite volume fraction of dual-phase steels while eliminating the effect and interference of frequency on the magnetic Barkhausen noise outputs, the magnetic responses were fed into the ANFIS structure in terms of the position, height and width of the Barkhausen profiles. The results showed that the ANFIS approach has the potential to detect and characterize microstructural evolution while the considerable effect of frequency on the magnetic outputs is bypassed. In fact, implementing multiple outputs simultaneously enables ANFIS to arrive at accurate results using only the height, position and width of the magnetic Barkhausen noise peaks, without knowing the value of the frequency used. - Highlights: • New NDT system for microstructural evaluation based on MBN using ANFIS modeling. • Sensitivity of magnetic Barkhausen noise to microstructure changes of the DP steels. • Accurate prediction of martensite by feeding multiple MBN outputs simultaneously. • Obtaining the modeled output without knowing the amount of the used frequency.

  10. Using Artificial Intelligence to Retrieve the Optimal Parameters and Structures of Adaptive Network-Based Fuzzy Inference System for Typhoon Precipitation Forecast Modeling

    Chien-Lin Huang

    2015-01-01

    This study aims to construct a typhoon precipitation forecast model providing forecasts one to six hours in advance, using optimal model parameters and structures retrieved from a combination of the adaptive network-based fuzzy inference system (ANFIS) and artificial intelligence. To enhance the accuracy of the precipitation forecast, two structures were used to establish the precipitation forecast model for a specific lead time: a single-model structure and a dual-model hybrid structure in which forecast models for higher and lower precipitation were integrated. In order to rapidly, automatically, and accurately retrieve the optimal parameters and structures of the ANFIS-based precipitation forecast model, a tabu search was applied to identify the adjacent radius in subtractive clustering when constructing the ANFIS structure. A coupled structure was also employed to establish a precipitation forecast model across short and long lead times in order to improve the accuracy of long-term precipitation forecasts. The study area is the Shimen Reservoir, and the analyzed period is from 2001 to 2009. Results showed that the optimal initial ANFIS parameters selected by the tabu search, combined with the dual-model hybrid method and the coupled structure, yielded computationally efficient and highly reliable typhoon precipitation forecasts across short to long lead-time horizons.
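
The tabu search over candidate subtractive-clustering radii can be sketched generically: walk over a grid of radii, move to the best non-tabu neighbor each step, and remember the best radius visited. The cost function here is a hypothetical stand-in for the ANFIS validation error:

```python
def tabu_search(candidates, cost, start=0, n_iter=30, tabu_len=5):
    # 1-D tabu search: move to the cheapest non-tabu neighbor each step,
    # keeping a short memory of recently visited indices
    i, best_i = start, start
    tabu = [start]
    for _ in range(n_iter):
        neighbors = [j for j in (i - 1, i + 1)
                     if 0 <= j < len(candidates) and j not in tabu]
        if not neighbors:
            break
        i = min(neighbors, key=lambda j: cost(candidates[j]))
        tabu = (tabu + [i])[-tabu_len:]
        if cost(candidates[i]) < cost(candidates[best_i]):
            best_i = i
    return candidates[best_i]

radii = [k / 10 for k in range(1, 10)]  # candidate cluster radii 0.1 .. 0.9
error = lambda r: (r - 0.5) ** 2        # hypothetical validation error
best_radius = tabu_search(radii, error, start=8)
```

The tabu list lets the walk step away from a local minimum rather than immediately backtracking; the best radius seen is tracked separately.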

  11. An adaptive neuro fuzzy inference system controlled space vector pulse width modulation based HVDC Light transmission system under AC fault conditions

    Ajay Kumar, M.; Srikanth, N. V.

    2014-03-01

    In HVDC Light transmission systems, converter control is one of the major fields of present-day research. In this paper, a fuzzy logic controller is utilized for controlling both converters of the space vector pulse width modulation (SVPWM) based HVDC Light transmission system. Due to the complexity of forming its rule base, an intelligent controller known as the adaptive neuro-fuzzy inference system (ANFIS) controller is also introduced in this paper. The proposed ANFIS controller changes the PI gains automatically for different operating conditions. A hybrid learning method, which combines and exploits the best features of both the back-propagation algorithm and the least-squares estimation method, is used to train the 5-layer ANFIS controller. The performance of the proposed ANFIS controller is compared and validated against the fuzzy logic controller and also against the fixed-gain conventional PI controller. The simulations are carried out in the MATLAB/SIMULINK environment. The results reveal that the proposed ANFIS controller reduces power fluctuations at both converters. It also effectively improves the dynamic performance of the test power system when tested under various AC fault conditions.

  12. The meaning-making of science teachers participating in a school-based PD project

    Nielsen, Birgitte Lund

    The meaning-making of four science teachers involved in collaboratively analyzing video and other artifacts from practice in local science classrooms in a school-based professional development project is examined through repeated interviews and represented as meaning-making maps. The research aim...... is to examine how these collaborative inquiries make sense to the teachers: what they identify as outcomes, how they make use of inputs and support in their classrooms and in collegial interactions and how their ideas about teaching and learning of science might play a role. An adapted version...... learning of science in concrete situations. They refer to outcomes from sharing experiments with new tools and materials and refer to being encouraged to continue collaboration around science at the school. Beside this the teachers emphasize various outcomes apparently for each of them in areas where...

  14. Grey situation group decision-making method based on prospect theory.

    Zhang, Na; Fang, Zhigeng; Liu, Xiaqing

    2014-01-01

    This paper puts forward a grey situation group decision-making method based on prospect theory, addressing grey situation group decision-making problems in which decisions are made by multiple experts who hold risk preferences. The method takes the distances to the positive and negative ideal situations as reference points, defines positive and negative prospect value functions, and introduces the experts' risk preferences into grey situation decision-making so that the final decision better matches their psychological behavior. Based on the TOPSIS method, the paper determines the weight of each decision expert, sets up a comprehensive prospect value matrix from the experts' evaluations, and finally determines the optimal situation. The effectiveness and feasibility of the method are verified by means of a specific example.
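The reference-point idea at the core of the method can be illustrated with a minimal prospect value function. This is a generic sketch of prospect theory, not the paper's full grey-situation procedure: the exponents alpha = beta = 0.88 and loss-aversion coefficient lam = 2.25 are the commonly cited Tversky-Kahneman estimates, and the scores and ideal points in the usage example are invented.

```python
# Minimal prospect-theory sketch: outcomes are valued relative to a
# reference point, with losses weighted more heavily (loss aversion).

def prospect_value(x, ref, alpha=0.88, beta=0.88, lam=2.25):
    """Value of outcome x relative to reference point ref."""
    d = x - ref
    if d >= 0:
        return d ** alpha            # concave value function for gains
    return -lam * ((-d) ** beta)     # steeper, convex value function for losses

def best_situation(scores, ref_neg, ref_pos):
    """Pick the situation with the highest combined prospect value.
    Scores lie between the ideals, so the negative ideal acts as the
    reference for gains and the positive ideal as the reference for losses."""
    totals = [prospect_value(s, ref_neg) + prospect_value(s, ref_pos)
              for s in scores]
    return max(range(len(scores)), key=lambda i: totals[i])
```

For instance, `best_situation([0.3, 0.8, 0.5], 0.0, 1.0)` selects index 1, the alternative closest to the positive ideal. The paper additionally weights experts via TOPSIS before aggregating, which this sketch omits.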

  15. Statistical inference: an integrated approach

    Migon, Helio S; Louzada, Francisco

    2014-01-01

    Contents: Introduction (Information; The concept of probability; Assessing subjective probabilities; An example; Linear algebra and probability; Notation; Outline of the book). Elements of Inference (Common statistical models; Likelihood-based functions; Bayes theorem; Exchangeability; Sufficiency and exponential family; Parameter elimination). Prior Distribution (Entirely subjective specification; Specification through functional forms; Conjugacy with the exponential family; Non-informative priors; Hierarchical priors). Estimation (Introduction to decision theory; Bayesian point estimation; Classical point estimation; Empirical Bayes estimation; Comparison of estimators; Interval estimation; Estimation in the Normal model). Approximating Methods (The general problem of inference; Optimization techniques; Asymptotic theory; Other analytical approximations; Numerical integration methods; Simulation methods). Hypothesis Testing (Introduction; Classical hypothesis testing; Bayesian hypothesis testing; Hypothesis testing and confidence intervals; Asymptotic tests). Prediction …
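One topic from the outline, conjugacy with the exponential family, admits a tiny worked example: for Binomial data with a Beta prior on the success probability, the posterior is again Beta and is obtained by simple counting. The numbers below are illustrative, not drawn from the book.

```python
# Beta-Binomial conjugate update: Beta(a, b) prior + Binomial data
# -> Beta(a + successes, b + failures) posterior.

def beta_binomial_posterior(a, b, successes, failures):
    """Posterior hyperparameters after observing Binomial data."""
    return a + successes, b + failures

def beta_mean(a, b):
    """Bayesian point estimate under squared-error loss (posterior mean)."""
    return a / (a + b)

# Start from a uniform Beta(1, 1) prior and observe 7 successes in 10 trials.
a_post, b_post = beta_binomial_posterior(1, 1, 7, 3)
```

Here the posterior is Beta(8, 4) with mean 2/3, illustrating how the prior acts like pseudo-counts added to the data.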

  16. Activation of the cannabinoid system in the nucleus accumbens affects effort-based decision making.

    Fatahi, Zahra; Haghparast, Abbas

    2018-02-01

    Effort-based decision making addresses how we choose an action based on an integration of action and goal values. The nucleus accumbens (NAc) is implicated in allowing an animal to overcome effort constraints to obtain greater benefits, and it has previously been shown that cannabis derivatives may affect such processes. Therefore, in this study, we evaluated the involvement of the cannabinoid system in the entire NAc in effort-based decision making. Rats were trained in a T-maze cost-benefit decision-making task in which they could choose either to climb a barrier in one arm to obtain a large reward or to run into the other arm, without a barrier, to obtain a small reward. Following training, the animals were bilaterally implanted with guide cannulae in the NAc. On the test day, rats received a cannabinoid agonist (Win 55,212-2; 2, 10 and 50 μM) and/or antagonist (AM251; 45 μM), after which the percentage of large-reward choices and the latency of reward attainment were measured. Results revealed that administration of the cannabinoid agonist decreased the percentage of large-reward choices, such that the animals preferred to receive a small reward with low effort instead of a large reward with high effort. Administration of the antagonist alone did not affect effort-based decision making but did attenuate the Win 55,212-2-induced impairments in effort allocation. In agonist-treated animals, the latency of reward collection increased. Moreover, when the effort was equated in both arms, the animals returned to choosing the large reward, showing that the results were not caused by spatial memory impairment. Our findings suggest that activation of the cannabinoid system in the NAc impairs effort-based decision making and makes rats less willing to invest physical effort to gain a large reward. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Biogeography of the Pistia clade (Araceae): based on chloroplast and mitochondrial DNA sequences and Bayesian divergence time inference.

    Renner, Susanne S; Zhang, Li-Bing

    2004-06-01

    Pistia stratiotes (water lettuce) and Lemna (duckweeds) are the only free-floating aquatic Araceae. The geographic origin and phylogenetic placement of these unrelated aroids present long-standing problems because of their highly modified reproductive structures and wide geographical distributions. We sampled chloroplast (trnL-trnF and rpl20-rps12 spacers, trnL intron) and mitochondrial sequences (nad1 b/c intron) for all genera implicated as close relatives of Pistia by morphological, restriction site, and sequencing data, and present a hypothesis about its geographic origin based on the consensus of trees obtained from the combined data, using Bayesian, maximum likelihood, parsimony, and distance analyses. Of the 14 genera closest to Pistia, only Alocasia, Arisaema, and Typhonium are species-rich, and the latter two were studied previously, facilitating the choice of representatives that span the roots of these genera. Results indicate that Pistia and the Seychelles endemic Protarum sechellarum are the basalmost branches in a grade comprising the tribes Colocasieae (Ariopsis, Steudnera, Remusatia, Alocasia, Colocasia), Arisaemateae (Arisaema, Pinellia), and Areae (Arum, Biarum, Dracunculus, Eminium, Helicodiceros, Theriophonum, Typhonium). Unexpectedly, all Areae genera are embedded in Typhonium, which throws new light on the geographic history of Areae. A Bayesian analysis of divergence times that explores the effects of multiple fossil and geological calibration points indicates that the Pistia lineage is 90 to 76 million years (my) old. The oldest fossils of the Pistia clade, though not Pistia itself, are 45-my-old leaves from Germany; the closest outgroup, Peltandreae (comprising a few species in Florida, the Mediterranean, and Madagascar), is known from 60-my-old leaves from Europe, Kazakhstan, North Dakota, and Tennessee. Based on the geographic ranges of close relatives, Pistia likely originated in the Tethys region, with Protarum then surviving on the

  18. Persistent misunderstandings about evidence-based (sorry: informed!) policy-making.

    Bédard, Pierre-Olivier; Ouimet, Mathieu

    2016-01-01

    The field of research on knowledge mobilization and evidence-informed policy-making has seen enduring debates over fundamental assumptions such as the definition of 'evidence', the relative validity of various research methods, and the actual role of evidence in informing policy-making. In many cases, these discussions serve a useful purpose, but they also stem from serious disagreement on methodological and epistemological issues. This essay reviews the rationale for evidence-informed policy-making by examining some of the common claims made about the aims and practices of this perspective on public policy. Supplementing the existing justifications for evidence-informed policy-making, we argue in favor of a greater inclusion of research evidence in the policy process, but in a structured fashion based on methodological considerations. In this respect, we present an overview of the intricate relation between policy questions and appropriate research designs. By closely examining this relation, we claim that the usual points of disagreement are mitigated. For instance, when focusing on the variety of research designs that can answer a range of policy questions, the common critical claim about 'RCT-based policy-making' seems to lose some, if not all, of its grip.

  19. Genomic organization, annotation, and ligand-receptor inferences of chicken chemokines and chemokine receptor genes based on comparative genomics

    Sze Sing-Hoi

    2005-03-01

    Background: Chemokines and their receptors play important roles in host defense, organogenesis, hematopoiesis, and neuronal communication. Forty-two chemokines and 19 cognate receptors have been found in the human genome. Prior to this report, only 11 chicken chemokines and 7 receptors had been reported. The objectives of this study were to systematically identify chicken chemokines and their cognate receptor genes in the chicken genome and to annotate these genes and ligand-receptor binding by a comparative genomics approach. Results: Twenty-three chemokine and 14 chemokine receptor genes were identified in the chicken genome. All of the chicken chemokines contained a conserved CC, CXC, CX3C, or XC motif, whereas all the chemokine receptors had seven conserved transmembrane helices, four extracellular domains with a conserved cysteine, and a conserved DRYLAIV sequence in the second intracellular domain. The number of coding exons in these genes and the syntenies are highly conserved between human, mouse, and chicken, although the amino acid sequence homologies are generally low between mammalian and chicken chemokines. Chicken genes were named with the systematic nomenclature used in humans and mice based on phylogeny, synteny, and sequence homology. Conclusion: The independent nomenclature of chicken chemokines and chemokine receptors suggests that the chicken may have ligand-receptor pairings similar to mammals. All identified chicken chemokines and their cognate receptors were found in the chicken genome except CCR9, whose ligand was not identified in this study. The organization of these genes suggests that a substantial number of them were present before the divergence between aves and mammals, with more gene duplications of the CC, CXC, CCR, and CXCR subfamilies in mammals than in aves after the divergence.
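The cysteine-spacing motifs named in the abstract (CC, CXC, CX3C, XC) lend themselves to a short classification sketch. This is a simplification for illustration only: it looks at the spacing of the first two cysteines in a sequence, and the toy strings in the tests are not real chicken chemokine sequences.

```python
# Classify a chemokine family by the spacing between the first two
# conserved cysteines: CC (adjacent), CXC (one residue apart),
# CX3C (three residues apart), or XC (a single cysteine).
import re

def chemokine_motif(seq):
    """Return 'CC', 'CXC', 'CX3C', 'XC', or None if no motif matches."""
    m = re.search(r"C(?P<gap>[^C]*)C", seq)  # first two cysteines
    if m is None:
        return "XC" if seq.count("C") == 1 else None
    gap = len(m.group("gap"))                # residues between them
    return {0: "CC", 1: "CXC", 3: "CX3C"}.get(gap)
```

In real annotation pipelines the motif is located within an aligned mature-peptide region rather than by the first cysteines of the raw sequence, but the spacing logic is the same.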

  20. Supporting decision-making processes for evidence-based mental health promotion.

    Jané-Llopis, Eva; Katschnig, Heinz; McDaid, David; Wahlbeck, Kristian

    2011-12-01

    The use of evidence is critical in guiding decision-making, but evidence from effect studies will be only one of a number of factors taken into account in decision-making processes. Equally important for policymakers will be the use of other types of evidence, including implementation essentials, and of decision-making principles such as social justice and political, ethical, and equity issues, reflecting public attitudes and the level of resources available, rather than relying on health outcomes alone. This paper, aimed at supporting decision-makers, highlights the importance of commissioning high-quality evaluations, the key aspects of assessing levels of evidence, the importance of supporting evidence-based implementation, and what to look out for before, during, and after implementation of mental health promotion and mental disorder prevention programmes.