WorldWideScience

Sample records for model rule-performance testing

  1. Testing the performance of technical trading rules in the Chinese markets based on superior predictive test

    Science.gov (United States)

    Wang, Shan; Jiang, Zhi-Qiang; Li, Sai-Ping; Zhou, Wei-Xing

    2015-12-01

    Technical trading rules have a long history of use by practitioners in financial markets, yet their profitability and efficiency remain controversial. In this paper, we test the performance of more than seven thousand traditional technical trading rules on the Shanghai Securities Composite Index (SSCI) from May 21, 1992 through June 30, 2013 and the China Securities Index 300 (CSI 300) from April 8, 2005 through June 30, 2013, to check whether an effective trading strategy can be found using performance measures based on return and the Sharpe ratio. To correct for the data-snooping effect, we adopt the Superior Predictive Ability test to evaluate whether any trading rule significantly outperforms the benchmark. The results show that for the SSCI, technical trading rules offer significant profitability, while for the CSI 300 this ability is lost. We further partition the SSCI into two sub-series and find that the efficiency of technical trading in the sub-series spanning exactly the same period as the CSI 300 is severely weakened. By testing the trading rules on both indexes with a five-year moving window, we find that during the financial bubble from 2005 to 2007 the effectiveness of technical trading rules is greatly improved. This is consistent with the view that the predictive ability of technical trading rules appears when the market is less efficient.
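
    The data-snooping correction described above can be illustrated with a simplified White-style bootstrap; the actual SPA test additionally uses a stationary bootstrap and studentized statistics, and the function name and all data here are illustrative:

```python
import numpy as np

def reality_check_pvalue(excess_returns, n_boot=2000, seed=0):
    """Bootstrap p-value for the null that no trading rule beats the
    benchmark, guarding against data snooping over many rules.

    excess_returns: (T, K) array of rule-minus-benchmark returns for
    T periods and K candidate rules.
    """
    rng = np.random.default_rng(seed)
    T, K = excess_returns.shape
    means = excess_returns.mean(axis=0)
    stat = np.sqrt(T) * means.max()        # best rule's scaled mean return
    centered = excess_returns - means      # impose the null of no outperformance
    boot_stats = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, T, size=T)   # naive i.i.d. resample of periods
        boot_stats[b] = np.sqrt(T) * centered[idx].mean(axis=0).max()
    return float((boot_stats >= stat).mean())
```

    Because the statistic is the maximum over all rules, a single genuinely profitable rule must beat the best of many lucky noise rules before the p-value becomes small.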

  2. A rule-based software test data generator

    Science.gov (United States)

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2,000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed, and the success of the two methods is compared using standard coverage metrics. Simple statistical tests show that even this primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.
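
    The contrast between the two generation strategies can be sketched as follows; the boundary-value rules and function names are illustrative, not the prototype's actual rule base:

```python
import random

def rule_based_cases(lo, hi):
    """Generate test inputs for an integer-range parameter using
    simple boundary-value rules (just below, at, and just above each
    bound, plus a midpoint) rather than blind sampling."""
    return [lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1]

def random_cases(lo, hi, n, seed=0):
    """Baseline: n uniformly random inputs from a widened range."""
    rng = random.Random(seed)
    return [rng.randint(lo - 10, hi + 10) for _ in range(n)]
```

    A handful of rule-derived cases deliberately targets the boundaries where faults cluster, whereas random generation needs many draws to hit the same points by chance.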

  3. Rule induction performance in amnestic mild cognitive impairment and Alzheimer's dementia: examining the role of simple and biconditional rule learning processes.

    Science.gov (United States)

    Oosterman, Joukje M; Heringa, Sophie M; Kessels, Roy P C; Biessels, Geert Jan; Koek, Huiberdina L; Maes, Joseph H R; van den Berg, Esther

    2017-04-01

    Rule induction tests such as the Wisconsin Card Sorting Test require executive control processes, but also the learning and memorization of simple stimulus-response rules. In this study, we examined the contribution of diminished learning and memorization of simple rules to complex rule induction test performance in patients with amnestic mild cognitive impairment (aMCI) or Alzheimer's dementia (AD). Twenty-six aMCI patients, 39 AD patients, and 32 control participants were included. A task was used in which the memory load and the complexity of the rules were independently manipulated. This task consisted of three conditions: a simple two-rule learning condition (Condition 1), a simple four-rule learning condition (inducing an increase in memory load; Condition 2), and a complex biconditional four-rule learning condition (inducing an increase in complexity and, hence, executive control load; Condition 3). Performance of AD patients declined disproportionately when the number of simple rules that had to be memorized increased (from Condition 1 to 2). An additional increment in complexity (from Condition 2 to 3) did not, however, disproportionately affect performance of the patients. Performance of the aMCI patients did not differ from that of the control participants. In the patient group, correlation analysis showed that memory performance correlated with Condition 1 performance, whereas executive task performance correlated with Condition 2 performance. These results indicate that the reduced learning and memorization of underlying task rules explains a significant part of the diminished complex rule induction performance commonly reported in AD, although results from the correlation analysis suggest involvement of executive control functions as well. Taken together, these findings suggest that care is needed when interpreting rule induction task performance in terms of executive function deficits in these patients.

  4. Contrast class cues and performance facilitation in a hypothesis-testing task: evidence for an iterative counterfactual model.

    Science.gov (United States)

    Gale, Maggie; Ball, Linden J

    2012-04-01

    Hypothesis-testing performance on Wason's (Quarterly Journal of Experimental Psychology 12:129-140, 1960) 2-4-6 task is typically poor, with only around 20% of participants announcing the to-be-discovered "ascending numbers" rule on their first attempt. Enhanced solution rates can, however, readily be observed with dual-goal (DG) task variants requiring the discovery of two complementary rules, one labeled "DAX" (the standard "ascending numbers" rule) and the other labeled "MED" ("any other number triples"). Two DG experiments are reported in which we manipulated the usefulness of a presented MED exemplar, where usefulness denotes cues that can establish a helpful "contrast class" that can stand in opposition to the presented 2-4-6 DAX exemplar. The usefulness of MED exemplars had a striking facilitatory effect on DAX rule discovery, which supports the importance of contrast-class information in hypothesis testing. A third experiment ruled out the possibility that the useful MED triple seeded the correct rule from the outset and obviated any need for hypothesis testing. We propose that an extension of Oaksford and Chater's (European Journal of Cognitive Psychology 6:149-169, 1994) iterative counterfactual model can neatly capture the mechanisms by which DG facilitation arises.

  5. Simulation of operating rules and discretional decisions using a fuzzy rule-based system integrated into a water resources management model

    Science.gov (United States)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel

    2013-04-01

    Water resources systems are mostly operated using a set of pre-defined rules that usually reflect historical and institutional reasons rather than an optimal allocation in terms of water use or economic benefit. These operating policies are commonly expressed as hedging rules, pack rules or zone-based operations, and simulation models can be used to test their performance under a wide range of hydrological and/or socio-economic hypotheses. Despite the high degree of acceptance and testing that these models have achieved, the actual operation of water resources systems rarely follows the pre-defined rules at all times, with consequent uncertainty about system performance. Real-world reservoir operation is very complex: it is affected by input uncertainty (imprecision in forecast inflows, seepage and evaporation losses, etc.), filtered by the reservoir operator's experience and natural risk aversion, and subject to physical and legal/institutional constraints in meeting the different demands and system requirements. The aim of this work is to present a fuzzy logic approach for deriving and assessing the historical operation of a system. The framework uses a fuzzy rule-based system to reproduce pre-defined rules and to match as closely as possible the actual decisions made by managers. Once built, the fuzzy rule-based system can be integrated into a water resources management model, making it possible to assess system performance at the basin scale. The case study of the Mijares basin (eastern Spain) illustrates the method. A reservoir operating curve regulates the two main reservoir releases (operated conjunctively) with the purpose of guaranteeing a high reliability of supply to the traditional irrigation districts with higher priority (more senior demands that funded the reservoir construction). 
A fuzzy rule-based system has been created to reproduce the operating curve's performance, defining the system state (total
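
    A minimal sketch of the kind of fuzzy rule-based inference described above, using triangular membership functions and weighted-average defuzzification; the rules, membership shapes, and release fractions are invented for illustration and are not the Mijares system's actual rules:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def release_fraction(storage):
    """Fuzzy rules on normalized storage in [0, 1]:
      IF storage is LOW    THEN release SMALL  (0.2)
      IF storage is MEDIUM THEN release NORMAL (0.6)
      IF storage is HIGH   THEN release LARGE  (1.0)
    Defuzzified as the firing-strength-weighted average."""
    rules = [(tri(storage, -0.5, 0.0, 0.5), 0.2),
             (tri(storage, 0.0, 0.5, 1.0), 0.6),
             (tri(storage, 0.5, 1.0, 1.5), 1.0)]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, out in rules)
    return num / den if den else 0.0
```

    Overlapping membership functions make the release decision vary smoothly with storage, which is how such a system can approximate an operator's gradual, judgment-based departures from a crisp operating curve.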

  6. The Linear Logistic Test Model (LLTM) as the methodological foundation of item generating rules for a new verbal reasoning test

    Directory of Open Access Journals (Sweden)

    HERBERT POINSTINGL

    2009-06-01

    Based on the demand for new verbal reasoning tests to enrich the psychological test inventory, a pilot version of a new test was analysed: the 'Family Relation Reasoning Test' (FRRT; Poinstingl, Kubinger, Skoda & Schechtner, forthcoming), in which several basic cognitive operations (logical rules) have been embedded. Given family relationships of varying complexity embedded in short stories, testees had to logically conclude the correct relationship between two individuals within a family. Using empirical data, the linear logistic test model (LLTM; Fischer, 1972), a special case of the Rasch model, was used to test the construct validity of the test: the hypothetically assumed basic cognitive operations had to explain the Rasch model's item difficulty parameters. After being shaped in the LLTM's matrix of weights (q_ij), none of these operations were corroborated by means of Andersen's Likelihood Ratio Test.
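
    The LLTM's core idea, that each Rasch item difficulty is a weighted sum of basic-operation parameters via the matrix of weights (q_ij), can be sketched numerically; the weight matrix and parameter values below are made up:

```python
import numpy as np

# Hypothetical LLTM weight matrix Q: rows = items, columns = basic
# cognitive operations; q_ij = how often operation j is required by item i.
Q = np.array([[1, 0, 0],
              [1, 1, 0],
              [0, 1, 1],
              [2, 0, 1]])

eta = np.array([0.3, 0.8, -0.2])  # basic-parameter values (invented)

beta = Q @ eta  # LLTM reconstruction of the Rasch item difficulties
```

    Construct validation then amounts to checking whether these reconstructed difficulties agree with the freely estimated Rasch difficulties, which is what the Likelihood Ratio Test mentioned above rejects for the FRRT.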

  7. Ruling Out Pulmonary Embolism in Primary Care: Comparison of the Diagnostic Performance of "Gestalt" and the Wells Rule

    NARCIS (Netherlands)

    Hendriksen, Janneke M. T.; Lucassen, Wim A. M.; Erkens, Petra M. G.; Stoffers, Henri E. J. H.; van Weert, Henk C. P. M.; Büller, Harry R.; Hoes, Arno W.; Moons, Karel G. M.; Geersing, Geert-Jan

    2016-01-01

    Diagnostic prediction models such as the Wells rule can be used for safely ruling out pulmonary embolism (PE) when it is suspected. A physician's own probability estimate ("gestalt"), however, is commonly used instead. We evaluated the diagnostic performance of both approaches in primary care.

  8. Targeted training of the decision rule benefits rule-guided behavior in Parkinson's disease.

    Science.gov (United States)

    Ell, Shawn W

    2013-12-01

    The impact of Parkinson's disease (PD) on rule-guided behavior has received considerable attention in cognitive neuroscience. The majority of research has used PD as a model of dysfunction in frontostriatal networks, but very few attempts have been made to investigate the possibility of adapting common experimental techniques in an effort to identify the conditions that are most likely to facilitate successful performance. The present study investigated a targeted training paradigm designed to facilitate rule learning and application using rule-based categorization as a model task. Participants received targeted training in which there was no selective-attention demand (i.e., stimuli varied along a single, relevant dimension) or nontargeted training in which there was selective-attention demand (i.e., stimuli varied along a relevant dimension as well as an irrelevant dimension). Following training, all participants were tested on a rule-based task with selective-attention demand. During the test phase, PD patients who received targeted training performed similarly to control participants and outperformed patients who did not receive targeted training. As a preliminary test of the generalizability of the benefit of targeted training, a subset of the PD patients were tested on the Wisconsin card sorting task (WCST). PD patients who received targeted training outperformed PD patients who did not receive targeted training on several WCST performance measures. These data further characterize the contribution of frontostriatal circuitry to rule-guided behavior. Importantly, these data also suggest that PD patient impairment, on selective-attention-demanding tasks of rule-guided behavior, is not inevitable and highlight the potential benefit of targeted training.

  9. Anthraquinone Final Reporting and Recordkeeping Requirements and Test Rule

    Science.gov (United States)

    EPA is issuing a final rule, under section 4 of the Toxic Substances Control Act (TSCA), requiring manufacturers and processors of 9,10-anthraquinone (CAS No. 84-65-1), hereinafter anthraquinone, to perform testing.

  10. Retrieving Knowledge in Social Situations: A Test of the Implicit Rules Model.

    Science.gov (United States)

    Meyer, Janet R.

    1996-01-01

    Supports the Implicit Rules Model, which suggests that individuals acquire implicit rules that connect request situation schemas to behaviors. Shows how individuals, in two experiments, learned, based on feedback, which behaviors were "correct" for multiple instances, and then, on their own, chose the correct behavior for new instances.…

  11. Distrust and the positive test heuristic: dispositional and situated social distrust improves performance on the Wason rule discovery task.

    Science.gov (United States)

    Mayo, Ruth; Alfasi, Dana; Schwarz, Norbert

    2014-06-01

    Feelings of distrust alert people not to take information at face value, which may influence their reasoning strategy. Using the Wason (1960) rule identification task, we tested whether chronic and temporary distrust increase the use of negative hypothesis testing strategies suited to falsify one's own initial hunch. In Study 1, participants who were low in dispositional trust were more likely to engage in negative hypothesis testing than participants high in dispositional trust. In Study 2, trust and distrust were induced through an alleged person-memory task. Paralleling the effects of chronic distrust, participants exposed to a single distrust-eliciting face were three times as likely to engage in negative hypothesis testing as participants exposed to a trust-eliciting face. In both studies, distrust increased negative hypothesis testing, which was associated with better performance on the Wason task. In contrast, participants' initial rule generation was not consistently affected by distrust. These findings provide first evidence that distrust can influence which reasoning strategy people adopt.

  12. A Machine Learning Approach to Discover Rules for Expressive Performance Actions in Jazz Guitar Music

    Science.gov (United States)

    Giraldo, Sergio I.; Ramirez, Rafael

    2016-01-01

    Expert musicians introduce expression in their performances by manipulating sound properties such as timing, energy, pitch, and timbre. Here, we present a data-driven computational approach to induce expressive performance rule models for note duration, onset, energy, and ornamentation transformations in jazz guitar music. We extract high-level features from a set of 16 commercial audio recordings (and corresponding music scores) of jazz guitarist Grant Green in order to characterize the expression in the pieces. We apply machine learning techniques to the resulting features to learn expressive performance rule models. We (1) quantitatively evaluate the accuracy of the induced models, (2) analyse the relative importance of the considered musical features, (3) discuss some of the learnt expressive performance rules in the context of previous work, and (4) assess their generality. The accuracies of the induced predictive models are significantly above baseline levels, indicating that the audio performances and the extracted musical features contain sufficient information to automatically learn informative expressive performance patterns. Feature analysis shows that the most important musical features for predicting expressive transformations are note duration, pitch, metrical strength, phrase position, Narmour structure, and the tempo and key of the piece. Similarities and differences between the induced expressive rules and the rules reported in the literature were found. Differences may be due to the fact that most previously studied performance data consist of classical music recordings. Finally, the rules' performer specificity/generality is assessed by applying the induced rules to performances of the same pieces by two other professional jazz guitar players. Results show a consistency in the ornamentation patterns between Grant Green and the other two musicians, which may be interpreted as a good indicator of the generality of the ornamentation rules.
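
    As a toy illustration of inducing an interpretable performance rule from note-level features, a one-rule (1R-style) learner can pick a single threshold on one feature; the data and the duration/ornamentation relationship below are invented, not taken from the Grant Green corpus:

```python
def one_rule(xs, ys):
    """Learn a single threshold rule (x <= t -> label) by choosing the
    split and label assignment with the fewest training errors (1R-style)."""
    best = None
    for t in sorted(set(xs)):
        for left in (0, 1):
            right = 1 - left
            errs = sum((y != left if x <= t else y != right)
                       for x, y in zip(xs, ys))
            if best is None or errs < best[0]:
                best = (errs, t, left, right)
    errs, t, left, right = best
    return lambda x: left if x <= t else right

# Toy data: short notes (duration in beats) tend to be ornamented (label 1).
durations = [0.25, 0.25, 0.5, 0.5, 1.0, 1.0, 2.0, 2.0]
ornament  = [1,    1,    1,   0,   0,   0,   0,   0]
rule = one_rule(durations, ornament)
```

    The appeal of such rule models over black-box predictors, in the spirit of the paper, is that the learnt rule can be read off directly (here roughly "ornament notes no longer than a quarter beat").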


  14. Ruling out pulmonary embolism in primary care: Comparison of the diagnostic performance of “gestalt” and the Wells rule

    NARCIS (Netherlands)

    Hendriksen, Janneke M T; Lucassen, Wim A M; Erkens, Petra M G; Stoffers, Henri E J H; van Weert, Henk C P M; Büller, Harry R.; Hoes, Arno W.; Moons, Karel G M; Geersing, Geert Jan

    2016-01-01

    PURPOSE Diagnostic prediction models such as the Wells rule can be used for safely ruling out pulmonary embolism (PE) when it is suspected. A physician’s own probability estimate (“gestalt”), however, is commonly used instead. We evaluated the diagnostic performance of both approaches in primary care.

  15. Performance based regulation - The maintenance rule

    Energy Technology Data Exchange (ETDEWEB)

    Correia, Richard P. [NRR/DOTS/TQMP, U.S. Nuclear Regulatory Commission, Office of Nuclear Reactor Regulation, M/S OWFN 10A19, Washington, D.C. 20555 (United States)

    1997-07-01

    The U.S. Nuclear Regulatory Commission has begun a transition from 'process-oriented' to 'results-oriented' regulations. The maintenance rule is a results-oriented rule that mandates consideration of risk and plant performance. The Maintenance Rule allows licensees to devise the most effective and efficient means of achieving the results described in the rule including the use of Probabilistic Risk (or Safety) Assessments. The NRC staff conducted a series of site visits to evaluate implementation of the Rule. Conclusions from the site visits indicated that the results-oriented Maintenance Rule can be successfully implemented and enforced. (author)

  16. Performance based regulation - The maintenance rule

    International Nuclear Information System (INIS)

    Correia, Richard P.

    1997-01-01

    The U.S. Nuclear Regulatory Commission has begun a transition from 'process-oriented' to 'results-oriented' regulations. The maintenance rule is a results-oriented rule that mandates consideration of risk and plant performance. The Maintenance Rule allows licensees to devise the most effective and efficient means of achieving the results described in the rule including the use of Probabilistic Risk (or Safety) Assessments. The NRC staff conducted a series of site visits to evaluate implementation of the Rule. Conclusions from the site visits indicated that the results-oriented Maintenance Rule can be successfully implemented and enforced. (author)

  17. A Machine Learning Approach to Discover Rules for Expressive Performance Actions in Jazz Guitar Music

    Directory of Open Access Journals (Sweden)

    Sergio Ivan Giraldo

    2016-12-01

    Expert musicians introduce expression in their performances by manipulating sound properties such as timing, energy, pitch, and timbre. Here, we present a data-driven computational approach to induce expressive performance rule models for note duration, onset, energy, and ornamentation transformations in jazz guitar music. We extract high-level features from a set of 16 commercial audio recordings (and corresponding music scores) of jazz guitarist Grant Green in order to characterize the expression in the pieces. We apply machine learning techniques to the resulting features to learn expressive performance rule models. We (1) quantitatively evaluate the accuracy of the induced models, (2) analyse the relative importance of the considered musical features, (3) discuss some of the learnt expressive performance rules in the context of previous work, and (4) assess their generality. The accuracies of the induced predictive models are significantly above baseline levels, indicating that the audio performances and the extracted musical features contain sufficient information to automatically learn informative expressive performance patterns. Feature analysis shows that the most important musical features for predicting expressive transformations are note duration, pitch, metrical strength, phrase position, Narmour structure, and the tempo and key of the piece. Similarities and differences between the induced expressive rules and the rules reported in the literature were found. Differences may be due to the fact that most previously studied performance data consist of classical music recordings. Finally, the rules’ performer specificity/generality is assessed by applying the induced rules to performances of the same pieces performed by two other professional jazz guitar players. Results show a consistency in the ornamentation patterns between Grant Green and the other two musicians, which may be interpreted as a good indicator for generality of the ornamentation rules.

  18. Model tests on dynamic performance of RC shear walls

    International Nuclear Information System (INIS)

    Nagashima, Toshio; Shibata, Akenori; Inoue, Norio; Muroi, Kazuo.

    1991-01-01

    For the inelastic dynamic response analysis of a reactor building subjected to earthquakes, it is essential to properly evaluate its restoring force characteristics under dynamic loading and its damping performance. Reinforced concrete shear walls are the main structural members of a reactor building and dominate its seismic behavior. To obtain basic information on the dynamic restoring force characteristics and damping performance of shear walls, a dynamic test using a large shaking table, a static displacement-control test, and a pseudo-dynamic test were conducted on shear wall models. In the dynamic test, four specimens were tested on a large shaking table. In the static test, four specimens were tested, and in the pseudo-dynamic test, three specimens were tested. These tests are outlined. The results of these tests were compared, placing emphasis on the restoring force characteristics and damping performance of the RC wall models. The strength was higher in the dynamic test models than in the static test models, mainly due to the effect of loading rate. (K.I.)

  19. Coulomb sum rules in the relativistic Fermi gas model

    International Nuclear Information System (INIS)

    Do Dang, G.; L'Huillier, M.; Nguyen Giai, Van.

    1986-11-01

    Coulomb sum rules are studied in the framework of the Fermi gas model. A distinction is made between mathematical and observable sum rules. Differences between non-relativistic and relativistic Fermi gas predictions are stressed. A method to deduce a Coulomb response function from the longitudinal response is proposed and tested numerically. This method is applied to the ⁴⁰Ca data to obtain the experimental Coulomb sum rule as a function of momentum transfer.

  20. Conformance Testing: Measurement Decision Rules

    Science.gov (United States)

    Mimbs, Scott M.

    2010-01-01

    The goal of a Quality Management System (QMS) as specified in ISO 9001 and AS9100 is to provide assurance to the customer that end products meet specifications. Measuring devices, often called measuring and test equipment (MTE), are used to provide the evidence of product conformity to specified requirements. Unfortunately, processes that employ MTE can become a weak link to the overall QMS if proper attention is not given to the measurement process design, capability, and implementation. Documented "decision rules" establish the requirements to ensure measurement processes provide the measurement data that supports the needs of the QMS. Measurement data are used to make the decisions that impact all areas of technology. Whether measurements support research, design, production, or maintenance, ensuring the data supports the decision is crucial. Measurement data quality can be critical to the resulting consequences of measurement-based decisions. Historically, most industries required simplistic, one-size-fits-all decision rules for measurements. One-size-fits-all rules in some cases are not rigorous enough to provide adequate measurement results, while in other cases are overly conservative and too costly to implement. Ideally, decision rules should be rigorous enough to match the criticality of the parameter being measured, while being flexible enough to be cost effective. The goal of a decision rule is to ensure that measurement processes provide data with a sufficient level of quality to support the decisions being made - no more, no less. This paper discusses the basic concepts of providing measurement-based evidence that end products meet specifications. Although relevant to all measurement-based conformance tests, the target audience is the MTE end-user, which is anyone using MTE other than calibration service providers. Topics include measurement fundamentals, the associated decision risks, verifying conformance to specifications, and basic measurement
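
    One common family of measurement decision rules is guarded acceptance, where the tolerance zone is shrunk by the expanded measurement uncertainty before a pass decision is made; the sketch below assumes a simple symmetric guard band equal to U, with illustrative values:

```python
def accept(measured, lower, upper, U):
    """Guarded acceptance: the tolerance zone [lower, upper] is shrunk
    by the expanded uncertainty U on each side, so that a 'pass'
    decision carries a low risk of accepting nonconforming product."""
    return (lower + U) <= measured <= (upper - U)
```

    This is the "rigorous" end of the spectrum described above: a large U relative to the tolerance makes the rule conservative and costly, which is exactly the trade-off a well-chosen decision rule balances against the criticality of the parameter.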

  1. Predicting higher selection in elite junior Australian Rules football: The influence of physical performance and anthropometric attributes.

    Science.gov (United States)

    Robertson, Sam; Woods, Carl; Gastin, Paul

    2015-09-01

    To develop a physiological performance and anthropometric attribute model to predict Australian Football League draft selection. Cross-sectional observational. Data were obtained (n=4902) from three Under-18 Australian football competitions between 2010 and 2013. Players were allocated to one of three groups based on their highest level of selection in their final year of junior football (Australian Football League Drafted, n=292; National Championship, n=293; State-level club, n=4317). Physiological performance (vertical jumps, agility, speed and running endurance) and anthropometric (body mass and height) data were obtained. Hedges' effect sizes were calculated to assess the influence of selection level and competition on these physical attributes, with logistic regression models constructed to discriminate Australian Football League Drafted and National Championship players. Rule induction analysis was undertaken to determine a set of rules for discriminating selection level. Effect size comparisons revealed a range of small to moderate differences between State-level club players and both other groups for all attributes, with trivial to small differences between Australian Football League Drafted and National Championship players. Logistic regression models showed the multistage fitness test, height, and 20 m sprint time to be the most important attributes in predicting draft success. Rule induction analysis showed that multistage fitness test scores above 14.01 and/or fast 20 m sprint times characterized the football players recruited to the highest level of the sport.
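
    The reported rule-induction output can be expressed as a simple threshold classifier; the multistage fitness test cutoff of 14.01 comes from the abstract, while the sprint cutoff default is a hypothetical placeholder because the abstract's sprint-time figure is garbled in this copy:

```python
def draft_likely(msft_score, sprint_20m_s, sprint_cutoff=3.0):
    """Threshold rule in the spirit of the reported rule induction:
    flag a player as draft-likely when the multistage fitness test
    score exceeds 14.01 (value from the abstract) or the 20 m sprint
    is faster than a cutoff (the 3.0 s default is a hypothetical
    placeholder, not the study's figure)."""
    return msft_score > 14.01 or sprint_20m_s < sprint_cutoff
```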

  2. Business model for business rules

    NARCIS (Netherlands)

    Eline Haan; Martin Zoet; Koen Smit

    2014-01-01

    Business rule models are widely applied, standalone and embedded in smart objects. They have become segregated from information technology and they are now a valuable asset in their own right. As more business rule models are becoming assets, business models to monetize these assets are designed.

  3. Analyzing Strategic Business Rules through Simulation Modeling

    Science.gov (United States)

    Orta, Elena; Ruiz, Mercedes; Toro, Miguel

    Service Oriented Architecture (SOA) holds promise for business agility, since it allows business processes to change to meet new customer demands or market needs without causing a cascade of changes in the underlying IT systems. Business rules are the instrument chosen to help business and IT collaborate. In this paper, we propose the use of simulation models to model and simulate strategic business rules that are then disaggregated at different levels of an SOA architecture. Our proposal aims to help find a good configuration for strategic business objectives and IT parameters. The paper includes a case study in which a simulation model is built to support business decision-making in a context where finding a good configuration of business parameters and performance is too complex to analyze by trial and error.

  4. An Improved Cognitive Model of the Iowa and Soochow Gambling Tasks With Regard to Model Fitting Performance and Tests of Parameter Consistency

    Directory of Open Access Journals (Sweden)

    Junyi Dai

    2015-03-01

    The Iowa Gambling Task (IGT) and the Soochow Gambling Task (SGT) are two experience-based risky decision-making tasks for examining decision-making deficits in clinical populations. Several cognitive models, including the expectancy-valence learning model (EVL) and the prospect valence learning model (PVL), have been developed to disentangle the motivational, cognitive, and response processes underlying the explicit choices in these tasks. The purpose of the current study was to develop an improved model that fits empirical data better than the EVL and PVL models and, in addition, produces more consistent parameter estimates across the IGT and SGT. Twenty-six opiate users (mean age 34.23; SD 8.79) and 27 control participants (mean age 35; SD 10.44) completed both tasks. Eighteen cognitive models varying in evaluation, updating, and choice rules were fit to individual data, and their performances were compared to that of a statistical baseline model to find a best-fitting model. The results showed that the model combining the prospect utility function treating gains and losses separately, the decay-reinforcement updating rule, and the trial-independent choice rule performed best in both tasks. Furthermore, the winning model produced more consistent individual parameter estimates across the two tasks than any of the other models.
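
    The winning model's three components can be sketched as follows, with a separate-gains-and-losses prospect utility, decay-reinforcement updating, and a softmax choice rule whose consistency parameter is held constant across trials; all parameter values are illustrative, not the study's estimates:

```python
import math

def choice_probs(E, theta):
    """Softmax over deck expectancies; theta is constant over trials
    (the 'trial-independent' choice rule)."""
    zs = [math.exp(theta * e) for e in E]
    s = sum(zs)
    return [z / s for z in zs]

def update(E, gain, loss, chosen, alpha=0.5, lam=2.0, decay=0.8):
    """Decay-reinforcement update with a prospect utility that values
    gains and losses separately (illustrative parameters)."""
    u = gain ** alpha - lam * abs(loss) ** alpha
    E = [decay * e for e in E]   # every deck's expectancy decays toward zero
    E[chosen] += u               # only the chosen deck is reinforced
    return E
```

    Separating gains and losses in the utility lets the model handle SGT decks, whose trials mix gains and losses in ways a net-outcome utility blurs together.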

  5. Rule-based model of vein graft remodeling.

    Directory of Open Access Journals (Sweden)

    Minki Hwang

    When vein segments are implanted into the arterial system for use in arterial bypass grafting, adaptation to the higher pressure and flow of the arterial system is accomplished through wall thickening and expansion. These early remodeling events have been found to be closely coupled to the local hemodynamic forces, such as shear stress and wall tension, and are believed to be the foundation for later vein graft failure. To further our mechanistic understanding of the cellular and extracellular interactions that lead to global changes in tissue architecture, a rule-based modeling method is developed through the application of basic rules of behavior for these molecular and cellular activities. In the current method, smooth muscle cells (SMC), extracellular matrix (ECM), and monocytes are selected as the three components that occupy the elements of a grid system representing the developing vein graft intima. The probabilities of the cellular behaviors are developed based on data extracted from in vivo experiments. At each time step, the various probabilities are computed and applied to the SMC and ECM elements to determine their next physical state and behavior. One- and two-dimensional models are developed to test and validate the computational approach. The importance of monocyte infiltration, and its associated effect in augmenting extracellular matrix deposition, was evaluated and found to be an important component of model development. Final model validation is performed using an independent set of experiments, in which model predictions of intimal growth are evaluated against experimental data obtained from the complex geometry and shear stress patterns offered by a mid-graft focal stenosis; simulation results show good agreement with the experimental data.
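
    A minimal sketch of the grid-update idea, with 'S' for an SMC element, 'E' for ECM and '.' for an empty element; the rule probabilities here are hypothetical stand-ins for the in vivo-derived values:

```python
import random

def step(grid, p_divide, p_ecm, rng=random):
    """One update of a 1-D intimal grid: each SMC may divide into an
    adjacent empty element or deposit ECM there, with given
    probabilities (hypothetical, not the paper's fitted values)."""
    new = list(grid)
    for i, cell in enumerate(grid):
        if cell != "S":
            continue
        for j in (i - 1, i + 1):  # inspect the two neighbours
            if 0 <= j < len(new) and new[j] == ".":
                if rng.random() < p_divide:
                    new[j] = "S"  # SMC proliferation
                elif rng.random() < p_ecm:
                    new[j] = "E"  # ECM deposition
    return new
```

With the probabilities forced to 0 or 1 the update is deterministic, which makes the rule behavior easy to inspect.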

  6. A neural model of rule generation in inductive reasoning.

    Science.gov (United States)

    Rasmussen, Daniel; Eliasmith, Chris

    2011-01-01

    Inductive reasoning is a fundamental and complex aspect of human intelligence. In particular, how do subjects, given a set of particular examples, generate general descriptions of the rules governing that set? We present a biologically plausible method for accomplishing this task and implement it in a spiking neuron model. We demonstrate the success of this model by applying it to the problem domain of Raven's Progressive Matrices, a widely used tool in the field of intelligence testing. The model is able to generate the rules necessary to correctly solve Raven's items, as well as recreate many of the experimental effects observed in human subjects. Copyright © 2011 Cognitive Science Society, Inc.

  7. Rule-based decision making model

    International Nuclear Information System (INIS)

    Sirola, Miki

    1998-01-01

    A rule-based decision making model is designed in the G2 environment. A theoretical and methodological frame for the model is composed and motivated. The rule-based decision making model is based on object-oriented modelling, knowledge engineering and decision theory. The idea of a safety objective tree is utilized. Advanced rule-based methodologies are applied. A general decision making model, the 'decision element', is constructed. The strategy planning of the decision element is based on, e.g., value theory and utility theory. A hypothetical process model is built to provide input data for the decision element. The basic principle of the object model in decision making is division into tasks. Probability models are used to characterize component availabilities. Bayes' theorem is used to recalculate the probability figures when new information is obtained. The model includes simple learning features to save the solution path. A decision analytic interpretation is given to the decision making process. (author)
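
    The Bayesian updating step can be illustrated with a minimal sketch (the numbers are hypothetical, not from the model):

```python
def bayes_update(prior, likelihood, likelihood_complement):
    """P(H|E) = P(E|H) P(H) / [P(E|H) P(H) + P(E|~H) P(~H)]."""
    numerator = likelihood * prior
    evidence = numerator + likelihood_complement * (1.0 - prior)
    return numerator / evidence

# Hypothetical example: prior probability the component has failed is
# 0.1; an alarm fires with P(alarm | failed) = 0.95 and a false-alarm
# rate P(alarm | working) = 0.05. The posterior failure probability:
posterior_failed = bayes_update(0.1, 0.95, 0.05)
```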

  8. Online Rule Generation Software Process Model

    OpenAIRE

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    For production systems like expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier refers to the various steps required to be executed for the development of a web-based software model for decision rule generation. Royce's final waterfall model has been used in this paper to explain the software development process. The paper presents the specific output of various steps of modified wat...

  9. 77 FR 21065 - Certain High Production Volume Chemicals; Test Rule and Significant New Use Rule; Fourth Group of...

    Science.gov (United States)

    2012-04-09

    ... 2070-AJ66 Certain High Production Volume Chemicals; Test Rule and Significant New Use Rule; Fourth... an opportunity to comment on a proposed test rule for 23 high production volume (HPV) chemical... necessary, to prohibit or limit that activity before it occurs. The opportunity to present oral comment was...

  10. Performance of thirteen clinical rules to distinguish bacterial and presumed viral meningitis in Vietnamese children.

    Directory of Open Access Journals (Sweden)

    Nguyen Tien Huy

    BACKGROUND AND PURPOSE: Successful outcomes from bacterial meningitis require rapid antibiotic treatment; however, unnecessary treatment of viral meningitis may lead to increased toxicities and expense. Thus, improved diagnostics are required to maximize treatment and minimize side effects and cost. Thirteen clinical decision rules have been reported to distinguish bacterial from viral meningitis. However, few rules have been tested and compared in a single study, while several rules are yet to be tested by independent researchers or in pediatric populations. Thus, simultaneous testing and comparison of these rules are required to enable clinicians to select an optimal diagnostic rule for bacterial meningitis in settings and populations similar to ours. METHODS: A retrospective cross-sectional study was conducted at the Infectious Department of Pediatric Hospital Number 1, Ho Chi Minh City, Vietnam. The performance of the clinical rules was evaluated by area under a receiver operating characteristic curve (ROC-AUC) using the method of DeLong, and the McNemar test for specificity comparison. RESULTS: Our study included 129 patients, of whom 80 had bacterial meningitis and 49 had presumed viral meningitis. Spanos's rule had the highest AUC at 0.938 but was not significantly greater than that of the other rules. No rule provided 100% sensitivity with a specificity higher than 50%. Based on our calculation of theoretical sensitivity and specificity, we suggest that a perfect rule requires at least four independent variables that possess both sensitivity and specificity higher than 85-90%. CONCLUSIONS: No clinical decision rule provided an acceptable specificity (>50%) with 100% sensitivity when applied to our data set in children. More studies in Vietnam and developing countries are required to develop and/or validate clinical rules, and better biomarkers are required to develop such a perfect rule.
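
    The evaluation quantities used above can be sketched in a few lines; the data are hypothetical, and the AUC is computed via the rank (Mann-Whitney) identity rather than the full DeLong procedure:

```python
def confusion_metrics(y_true, y_pred):
    """Sensitivity and specificity from binary labels (1 = bacterial)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

def roc_auc(y_true, scores):
    """ROC-AUC as the probability that a random positive case scores
    higher than a random negative case (ties count one half)."""
    pos = [s for t, s in zip(y_true, scores) if t == 1]
    neg = [s for t, s in zip(y_true, scores) if t == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```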

  11. Revisiting the debate on the relationship between display rules and performance: considering the explicitness of display rules.

    Science.gov (United States)

    Christoforou, Paraskevi S; Ashforth, Blake E

    2015-01-01

    We argue that the strength with which the organization communicates expectations regarding the appropriate emotional expression toward customers (i.e., explicitness of display rules) has an inverted U-shaped relationship with service delivery behaviors, customer satisfaction, and sales performance. Further, we argue that service organizations need a particular blend of explicitness of display rules and role discretion for the purpose of optimizing sales performance. As hypothesized, findings from 2 samples of salespeople suggest that either high or low explicitness of display rules impedes service delivery behaviors and sales performance, which peaks at moderate explicitness of display rules and high role discretion. The findings also suggest that the explicitness of display rules has a positive relationship with customer satisfaction. (c) 2015 APA, all rights reserved.

  12. Improved model reduction and tuning of fractional-order PI(λ)D(μ) controllers for analytical rule extraction with genetic programming.

    Science.gov (United States)

    Das, Saptarshi; Pan, Indranil; Das, Shantanu; Gupta, Amitava

    2012-03-01

    Genetic algorithm (GA) has been used in this study for a new approach of suboptimal model reduction in the Nyquist plane and optimal time domain tuning of proportional-integral-derivative (PID) and fractional-order (FO) PI(λ)D(μ) controllers. Simulation studies show that the new Nyquist-based model reduction technique outperforms the conventional H(2)-norm-based reduced parameter modeling technique. With the tuned controller parameters and reduced-order model parameter dataset, optimum tuning rules have been developed with a test-bench of higher-order processes via genetic programming (GP). The GP performs a symbolic regression on the reduced process parameters to evolve a tuning rule which provides the best analytical expression to map the data. The tuning rules are developed for a minimum time domain integral performance index described by a weighted sum of error index and controller effort. From the reported Pareto optimal front of the GP-based optimal rule extraction technique, a trade-off can be made between the complexity of the tuning formulae and the control performance. The efficacy of the single-gene and multi-gene GP-based tuning rules has been compared with the original GA-based control performance for the PID and PI(λ)D(μ) controllers, handling four different classes of representative higher-order processes. These rules are very useful for process control engineers, as they inherit the power of the GA-based tuning methodology, but can be easily calculated without the requirement for running the computationally intensive GA every time. Three-dimensional plots of the required variation in PID/fractional-order PID (FOPID) controller parameters with reduced process parameters have been shown as a guideline for the operator. Parametric robustness of the reported GP-based tuning rules has also been shown with credible simulation examples. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.

  13. Performance of technical trading rules: evidence from Southeast Asian stock markets.

    Science.gov (United States)

    Tharavanij, Piyapas; Siraprapasiri, Vasan; Rajchamaha, Kittichai

    2015-01-01

    This paper examines the profitability of technical trading rules in five Southeast Asian stock markets. The data cover a period of 14 years, from January 2000 to December 2013. The instruments investigated are five Southeast Asian stock market indices: the SET index (Thailand), the FTSE Bursa Malaysia KLC index (Malaysia), the FTSE Straits Times index (Singapore), the JSX Composite index (Indonesia), and the PSE composite index (the Philippines). Trading strategies investigated include the Relative Strength Index, Stochastic oscillator, Moving Average Convergence-Divergence, Directional Movement Indicator and On Balance Volume. Performances are compared to a simple buy-and-hold strategy. Statistical tests are also performed. Our empirical results show a strong performance of technical trading rules in the emerging stock market of Thailand but not in the more mature stock market of Singapore. The technical trading rules also generate statistically significant returns in the Malaysian, Indonesian and Philippine markets. However, after taking transaction costs into account, most technical trading rules do not generate net returns. This fact suggests different levels of market efficiency among Southeast Asian stock markets. This paper offers three new insights. Firstly, technical indicators do not help much in terms of market timing. Basically, traders cannot expect to buy at a relative low and sell at a relative high just by using technical trading rules. Secondly, technical trading rules can be beneficial to individual investors, as they help them counter the behavioral bias called the disposition effect, which is the tendency to sell winning stocks too soon and to hold on to losing stocks too long. Thirdly, even profitable strategies could not reliably predict subsequent market directions. They make money by having a higher average profit on profitable trades than the average loss on unprofitable ones.
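
    As an illustration of the kind of rule tested, here is a minimal long/flat moving-average crossover backtest with a per-switch transaction cost; it is a generic sketch, not the paper's exact methodology:

```python
def sma(prices, window):
    """Simple moving average; None until the window fills."""
    out = []
    for i in range(len(prices)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(prices[i + 1 - window:i + 1]) / window)
    return out

def crossover_returns(prices, fast=2, slow=3, cost=0.001):
    """Cumulative simple returns of a long/flat SMA crossover rule,
    charging a fixed cost on every position switch."""
    f, s = sma(prices, fast), sma(prices, slow)
    position, total = 0, 0.0
    for i in range(1, len(prices)):
        if position:  # accrue return on the position held overnight
            total += prices[i] / prices[i - 1] - 1.0
        new_pos = 0
        if f[i] is not None and s[i] is not None:
            new_pos = 1 if f[i] > s[i] else 0
        if new_pos != position:
            total -= cost  # transaction cost on each switch
        position = new_pos
    return total
```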

  14. Rule-based Mamdani-type fuzzy modelling of thermal performance of fintube evaporator under frost conditions

    Directory of Open Access Journals (Sweden)

    Ozen Dilek Nur

    2016-01-01

    Frost formation has an insulating effect on the surface of a heat exchanger, thereby deteriorating the total heat transfer of the heat exchanger. In this study, a fin-tube evaporator is modeled by making use of Rule-based Mamdani-Type Fuzzy (RBMTF) logic, where the total heat transfer, air inlet temperatures of 2 °C to 7 °C and four different fluid speed groups (ua1=1; 1.44; 1.88 m s-1, ua2=2.32; 2.76 m s-1, ua3=3.2; 3.64 m s-1, ua4=4.08; 4.52; 4.96 m s-1) for the evaporator were taken into consideration. In the developed RBMTF system, the outlet parameter UA was determined using the inlet parameters Ta and ua. The RBMTF system was trained and tested by using the MATLAB® fuzzy logic toolbox. R2 (%) for the training data and test data was found to be 99.91%. This study shows that the RBMTF model can be reliably used to determine the total heat transfer of a fin-tube evaporator.
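
    The Mamdani-type inference cycle (triangular membership, min for rule firing, max aggregation, centroid defuzzification) can be sketched as follows; the rule base and fuzzy sets are invented for illustration and are not the paper's:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    if x == b:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def mamdani_ua(ta, ua, rules, out_sets, samples=201):
    """Min for AND, max aggregation, centroid defuzzification."""
    lo = min(a for a, _, _ in out_sets.values())
    hi = max(c for _, _, c in out_sets.values())
    num = den = 0.0
    for i in range(samples):
        y = lo + (hi - lo) * i / (samples - 1)
        mu = 0.0
        for ta_set, ua_set, out in rules:
            w = min(tri(ta, *ta_set), tri(ua, *ua_set))
            mu = max(mu, min(w, tri(y, *out_sets[out])))
        num += mu * y
        den += mu
    return num / den if den else None

# Hypothetical two-rule base mapping (Ta, ua) to a UA level.
RULES = [((0, 2, 4), (0, 1, 2), "low"),
         ((2, 4, 6), (1, 2, 3), "high")]
OUT_SETS = {"low": (0, 1, 2), "high": (2, 3, 4)}
UA_pred = mamdani_ua(2.0, 1.0, RULES, OUT_SETS)
```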

  15. Rule-based Test Generation with Mind Maps

    Directory of Open Access Journals (Sweden)

    Dimitry Polivaev

    2012-02-01

    This paper introduces the basic concepts of rule-based test generation with mind maps, and reports lessons learned from industrial application of this technique in the domain of smart card testing by Giesecke & Devrient GmbH over the last several years. It describes the formalization of the test selection criteria used by our test generator, our test generation architecture and our test generation framework.

  16. Statistical metrology - measurement and modeling of variation for advanced process development and design rule generation

    International Nuclear Information System (INIS)

    Boning, Duane S.; Chung, James E.

    1998-01-01

    Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments, including the design of test structures and test masks to gather electrical or optical data; techniques for statistical decomposition and analysis of the data; and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of 'dummy fill' or 'metal fill' to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvement in uniformity achievable versus the effect of increased capacitance due to the additional metal.

  17. Validation of the 3-day rule for stool bacterial tests in Japan.

    Science.gov (United States)

    Kobayashi, Masanori; Sako, Akahito; Ogami, Toshiko; Nishimura, So; Asayama, Naoki; Yada, Tomoyuki; Nagata, Naoyoshi; Sakurai, Toshiyuki; Yokoi, Chizu; Kobayakawa, Masao; Yanase, Mikio; Masaki, Naohiko; Takeshita, Nozomi; Uemura, Naomi

    2014-01-01

    Stool cultures are expensive and time consuming, and the positive rate of enteric pathogens in cases of nosocomial diarrhea is low. The 3-day rule, whereby clinicians order a Clostridium difficile (CD) toxin test rather than a stool culture for inpatients developing diarrhea >3 days after admission, has been well studied in Western countries. The present study sought to validate the 3-day rule in an acute care hospital setting in Japan. Stool bacterial and CD toxin test results for adult patients hospitalized in an acute care hospital in 2008 were retrospectively analyzed. Specimens collected after an initial positive test were excluded. The positive rate and cost-effectiveness of the tests were compared among three patient groups. The adult patients were divided into three groups for comparison: outpatients, patients hospitalized for ≤3 days and patients hospitalized for ≥4 days. Over the 12-month period, 1,597 stool cultures were obtained from 992 patients, and 880 CD toxin tests were performed in 529 patients. In the outpatient, inpatient ≤3 days and inpatient ≥4 days groups, the rate of positive stool cultures was 14.2%, 3.6% and 1.3% and that of positive CD toxin tests was 1.9%, 7.1% and 8.5%, respectively. The medical costs required to obtain one positive result were 9,181, 36,075 and 103,600 JPY and 43,200, 11,333 and 9,410 JPY, respectively. The 3-day rule was validated for the first time in a setting other than a Western country. Our results revealed that the "3-day rule" is also useful and cost-effective in Japan.
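
    The decision rule and the cost-per-positive calculation can be sketched as follows (the unit cost is hypothetical; the abstract reports only the resulting totals):

```python
def order_test(days_since_admission):
    """The 3-day rule: diarrhea with onset within 3 days of admission
    gets a stool culture; later onset gets a CD toxin test instead."""
    return "stool culture" if days_since_admission <= 3 else "CD toxin test"

def yield_metrics(n_tests, n_positive, unit_cost):
    """Positive rate (%) and cost per positive result, given a
    hypothetical per-test unit cost."""
    rate = 100.0 * n_positive / n_tests
    cost = unit_cost * n_tests / n_positive if n_positive else float("inf")
    return rate, cost
```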

  18. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers, as it can rarely be automated. This paper proposes a model-based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. The paper defines the translation procedure and rules from UNICOS to BIP, which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  19. Modelling collective foraging by means of individual behaviour rules in honey-bees

    NARCIS (Netherlands)

    Vries, Han de; Biesmeijer, J.C.

    1998-01-01

    An individual-oriented model is constructed which simulates the collective foraging behaviour of a colony of honey-bees, Apis mellifera. Each bee follows the same set of behavioural rules. Each rule consists of a set of conditions followed by the behavioural act to be performed if the

  20. FUZZY MODELING BY SUCCESSIVE ESTIMATION OF RULES ...

    African Journals Online (AJOL)

    This paper presents an algorithm for automatically deriving fuzzy rules directly from a set of input-output data of a process for the purpose of modeling. The rules are extracted by a method termed successive estimation. This method is used to generate a model without truncating the number of fired rules, to within user ...

  1. Performance of the Osteoporosis Self-Assessment Tool in ruling out low bone mineral density in postmenopausal women: a systematic review

    DEFF Research Database (Denmark)

    Rud, B; Hilden, J; Hyldstrup, L

    2007-01-01

    SUMMARY: The Osteoporosis Self-Assessment Tool (OST) is a simple test that may be of clinical value to rule out low bone mineral density. We performed a systematic review to assess its performance in postmenopausal women. We included 36 studies. OST performed moderately in ruling out femoral neck T...

  3. How much monetary policy rules do we need to estimate DSGE model for Russia?

    OpenAIRE

    Shulgin, Andrei

    2014-01-01

    This paper presents a three-sector DSGE model for a small open economy under the intermediate exchange rate regime. The central bank balance sheet equations are added to allow introducing two different monetary policy rules in the model. The principal question is how many independent monetary policy rules we need to describe Russian monetary policy in 2001–2012. To get an answer we perform Bayesian estimation of the DSGE model for four different combinations of monetary policy rules. The main...

  5. Model dependence of energy-weighted sum rules

    International Nuclear Information System (INIS)

    Kirson, M.W.

    1977-01-01

    The contribution of the nucleon-nucleon interaction to energy-weighted sum rules for electromagnetic multipole transitions is investigated. It is found that only isoscalar electric transitions might have model-independent energy-weighted sum rules. For these transitions, explicit momentum and angular momentum dependence of the nuclear force give rise to corrections to the sum rule which are found to be negligibly small, thus confirming the model independence of these specific sum rules. These conclusions are unaffected by correlation effects. (author)

  6. Hedging Rules for Water Supply Reservoir Based on the Model of Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Yi Ji

    2016-06-01

    This study proposes a hedging rule model composed of a two-period reservoir operation model considering damage depth and a hedging rule parameter optimization model. The former solves the hedging rules based on a given period's water supply weighting factor and carryover storage target, while the latter optimization model is used to optimize the weighting factor and carryover storage target based on the hedging rules. The coupled model gives the optimal period's water supply weighting factor and carryover storage target to guide release. The conclusions of this study are as follows: (1) the water supply weighting factor and carryover storage target have a direct impact on the three elements of the hedging rule; (2) the parameters can guide reservoirs to supply water reasonably after optimization with the simulation and optimization model; and (3) to verify the utility of the hedging rule, the Heiquan reservoir is used as a case study, and a particle swarm optimization algorithm with a simulation model is adopted to optimize the parameters. The results show that the proposed hedging rule can improve the operational performance of the water supply reservoir.
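
    A hedging rule of this general family can be sketched with a simple two-zone form; the trigger and hedge factors below are hypothetical and stand in only loosely for the optimized weighting factor and carryover storage target:

```python
def hedged_release(available, demand, trigger=1.2, hedge=0.7):
    """Two-zone hedging rule (an illustrative form, not the paper's
    optimized rule): supply the full demand when available water
    exceeds trigger*demand; otherwise ration the release to
    hedge*demand, capped by what is physically available."""
    if available >= trigger * demand:
        return demand
    return min(available, hedge * demand)
```

Rationing before the reservoir is actually empty spreads a small shortage over several periods instead of allowing one deep failure, which is the basic trade-off hedging rules encode.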

  7. On the effects of adaptive reservoir operating rules in hydrological physically-based models

    Science.gov (United States)

    Giudici, Federico; Anghileri, Daniela; Castelletti, Andrea; Burlando, Paolo

    2017-04-01

    Recent years have seen a significant increase in human influence on natural systems at both the global and local scale. Accurately modeling the human component and its interaction with the natural environment is key to characterizing the real system dynamics and anticipating future potential changes to hydrological regimes. Modern distributed, physically-based hydrological models are able to describe hydrological processes with a high level of detail and high spatiotemporal resolution. Yet, they lack sophistication in the behavioral component, and human decisions are usually described by very simplistic rules, which might underperform in reproducing the catchment dynamics. In the case of water reservoir operators, these simplistic rules usually consist of target-level rule curves, which represent the average historical level trajectory. Whilst these rules can reasonably reproduce the average seasonal water volume shifts due to the reservoirs' operation, they cannot properly represent peculiar conditions which influence the actual reservoirs' operation, e.g., variations in energy price or water demand, or dry or wet meteorological conditions. Moreover, target-level rule curves are not suitable for exploring the water system response to changing climatic and socio-economic contexts, because they assume business-as-usual operation. In this work, we quantitatively assess how the inclusion of adaptive reservoir operating rules in physically-based hydrological models contributes to the proper representation of the hydrological regime at the catchment scale. In particular, we contrast target-level rule curves and detailed optimization-based behavioral models. We first perform the comparison on past observational records, showing that target-level rule curves underperform in representing the hydrological regime over multiple time scales (e.g., weekly, seasonal, inter-annual). Then, we compare how future hydrological changes are affected by the two modeling

  8. Testing the effects of safety climate and disruptive children behavior on school bus drivers performance: A multilevel model.

    Science.gov (United States)

    Zohar, Dov; Lee, Jin

    2016-10-01

    The study was designed to test a multilevel path model whose variables exert opposing effects on school bus drivers' performance. Whereas departmental safety climate was expected to improve driving safety, the opposite was true for in-vehicle disruptive children behavior. The driving safety path in this model consists of increasing risk-taking practices starting with safety shortcuts leading to rule violations and to near-miss events. The study used a sample of 474 school bus drivers in rural areas, driving children to school and school-related activities. Newly developed scales for measuring predictor, mediator and outcome variables were validated with video data taken from inner and outer cameras, which were installed in 29 buses. Results partially supported the model by indicating that group-level safety climate and individual-level children distraction exerted opposite effects on the driving safety path. Furthermore, as hypothesized, children disruption moderated the strength of the safety rule violation-near miss relationship, resulting in greater strength under high disruptiveness. At the same time, the hypothesized interaction between the two predictor variables was not supported. Theoretical and practical implications for studying safety climate in general and distracted driving in particular for professional drivers are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Models of the Structure of Some Rule-Governed Mathematical Behaviors.

    Science.gov (United States)

    Bergan, John R.

    1981-01-01

    This study investigated the extent to which various latent class models adequately described elementary rule-governed mathematical behaviors. Children were given a fraction concepts test. Results supported the adoption of a set of three-class models including a mastery class, a nonmastery class, and a transitional class to describe the data.

  10. Item Response Theory Models for Performance Decline during Testing

    Science.gov (United States)

    Jin, Kuan-Yu; Wang, Wen-Chung

    2014-01-01

    Sometimes, test-takers may not be able to attempt all items to the best of their ability (with full effort) due to personal factors (e.g., low motivation) or testing conditions (e.g., time limit), resulting in poor performances on certain items, especially those located toward the end of a test. Standard item response theory (IRT) models fail to…
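
    A standard two-parameter logistic (2PL) item response function, plus a hypothetical linear-decline term of the kind such extended models add, can be sketched as:

```python
import math

def p_correct(theta, a, b):
    """2PL IRT model: probability that a test-taker with ability theta
    answers an item with discrimination a and difficulty b correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def p_with_decline(theta, a, b, position, decay=0.05):
    """Hypothetical performance-decline variant (not the paper's exact
    specification): effective ability drops linearly with item
    position, mimicking reduced effort toward the end of the test."""
    return p_correct(theta - decay * position, a, b)
```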

  11. Optimization of Simple Monetary Policy Rules on the Base of Estimated DSGE-model

    OpenAIRE

    Shulgin, A.

    2015-01-01

    Optimization of the coefficients in monetary policy rules is performed on the basis of a DSGE model with two independent monetary policy instruments estimated on Russian data. It was found that welfare-maximizing policy rules lead to inadequate results and pro-cyclical monetary policy. Optimal coefficients in the Taylor rule and the exchange rate rule decrease the volatility estimated on Russian data for 2001-2012 by about 20%. The degree of exchange rate flexibility parameter was found to be low...
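
    A textbook Taylor rule of the kind whose coefficients are optimized here can be sketched as follows (the coefficient values are the conventional defaults, not the paper's estimates):

```python
def taylor_rate(neutral_real, inflation, target, output_gap,
                phi_pi=1.5, phi_y=0.5):
    """Textbook Taylor rule: i = r* + pi + phi_pi*(pi - pi*) + phi_y*y.
    Coefficients phi_pi and phi_y are the values a DSGE-based
    optimization would tune."""
    return (neutral_real + inflation
            + phi_pi * (inflation - target) + phi_y * output_gap)
```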

  12. Performance verification tests of JT-60SA CS model coil

    Energy Technology Data Exchange (ETDEWEB)

    Obana, Tetsuhiro, E-mail: obana.tetsuhiro@LHD.nifs.ac.jp [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Murakami, Haruyuki [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan); Takahata, Kazuya; Hamaguchi, Shinji; Chikaraishi, Hirotaka; Mito, Toshiyuki; Imagawa, Shinsaku [National Institute for Fusion Science, 322-6 Oroshi, Toki, Gifu 509-5292 (Japan); Kizu, Kaname; Natsume, Kyohei; Yoshida, Kiyoshi [Japan Atomic Energy Agency, 801-1 Mukoyama, Naka, Ibaraki 311-0193 (Japan)

    2015-11-15

    Highlights: • The performance of the JT-60SA CS model coil was verified. • The CS model coil comprised a quad-pancake wound with a Nb{sub 3}Sn CIC conductor. • The CS model coil met the design requirements. - Abstract: As a final check of the coil manufacturing method of the JT-60 Super Advanced (JT-60SA) central solenoid (CS), we verified the performance of a CS model coil. The model coil comprised a quad-pancake wound with a Nb{sub 3}Sn cable-in-conduit conductor. Measurements of the critical current, joint resistance, pressure drop, and magnetic field were conducted in the verification tests. In the critical-current measurement, the critical current of the model coil coincided with the estimation derived from a strain of −0.62% for the Nb{sub 3}Sn strands. As a result, critical-current degradation caused by the coil manufacturing process was not observed. The results of the performance verification tests indicate that the model coil met the design requirements. Consequently, the manufacturing process of the JT-60SA CS was established.

  13. Class association rules mining from students’ test data (Abstract)

    NARCIS (Netherlands)

    Romero, C.; Ventura, S.; Vasilyeva, E.; Pechenizkiy, M.; Baker, de R.S.J.; Merceron, A.; Pavlik Jr., P.I.

    2010-01-01

    In this paper we propose the use of a special type of association rule mining for discovering interesting relationships in students' test data, collected in our case with the Moodle learning management system (LMS). In particular, we apply Class Association Rule (CAR) mining to different data
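
    Support and confidence of a class association rule can be computed with a minimal sketch; the test-log records below are invented for illustration:

```python
def car_metrics(records, antecedent, label):
    """Support and confidence of a class association rule
    antecedent -> label over (item-set, class) records."""
    antecedent = set(antecedent)
    covered = [cls for items, cls in records if antecedent <= set(items)]
    hits = sum(1 for cls in covered if cls == label)
    support = hits / len(records)
    confidence = hits / len(covered) if covered else 0.0
    return support, confidence

# Hypothetical test logs: (questions answered incorrectly, final result)
records = [({"q1", "q3"}, "fail"), ({"q1"}, "fail"),
           ({"q2"}, "pass"), ({"q1", "q2"}, "pass")]
support, confidence = car_metrics(records, {"q1"}, "fail")
```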

  14. Predictive performance of universal termination of resuscitation rules in an Asian community: are they accurate enough?

    Science.gov (United States)

    Chiang, Wen-Chu; Ko, Patrick Chow-In; Chang, Anna Marie; Liu, Sot Shih-Hung; Wang, Hui-Chih; Yang, Chih-Wei; Hsieh, Ming-Ju; Chen, Shey-Ying; Lai, Mei-Shu; Ma, Matthew Huei-Ming

    2015-04-01

    Prehospital termination of resuscitation (TOR) rules have not been widely validated outside of Western countries. This study evaluated the performance of TOR rules in an Asian metropolis with a mixed-tier emergency medical service (EMS). We analysed the Utstein registry of adult, non-traumatic out-of-hospital cardiac arrests (OHCAs) in Taipei to test the performance of TOR rules for advanced life support (ALS) or basic life support (BLS) providers. ALS and BLS TOR rules were tested in OHCAs among three subgroups: (1) resuscitated by ALS, (2) by BLS and (3) by mixed ALS and BLS. The outcome was defined as in-hospital death. Sensitivity, specificity, positive predictive value (PPV), negative predictive value and decreased transport rate (DTR) among various provider combinations were calculated. Of the 3489 OHCAs included, 240 were resuscitated by ALS, 1727 by BLS and 1522 by ALS and BLS. Overall, 197 patients (5.6%) survived to hospital discharge. Specificity and PPV of the ALS-TOR and BLS-TOR rules for identifying death ranged from 70.7% to 81.8% and 95.1% to 98.1%, respectively. Applying the TOR rules would have yielded a DTR of 34.2-63.9%. BLS rules had better predictive accuracy and DTR than ALS rules among all subgroups. Application of the ALS and BLS TOR rules would have decreased the number of OHCAs transported to hospital, and the BLS rules are reasonable as universal criteria in a mixed-tier EMS. However, 1.9-4.9% of those who survived would have been misclassified as non-survivors, raising concern that implementing the rules could compromise patient safety. Published by the BMJ Publishing Group Limited.
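    The screening statistics reported in this record (sensitivity, specificity, PPV, NPV and decreased transport rate) all derive from a single 2x2 confusion matrix. A minimal Python sketch, using illustrative counts rather than the Taipei registry data:

    ```python
    def tor_metrics(tp, fp, tn, fn):
        """Metrics as used for TOR rule validation.

        'Positive' = the rule recommends termination; the outcome it
        predicts is in-hospital death. Counts here are hypothetical.
        """
        sensitivity = tp / (tp + fn)           # deaths correctly flagged
        specificity = tn / (tn + fp)           # survivors correctly spared
        ppv = tp / (tp + fp)                   # flagged cases that truly died
        npv = tn / (tn + fn)
        dtr = (tp + fp) / (tp + fp + tn + fn)  # decreased transport rate
        return sensitivity, specificity, ppv, npv, dtr

    sens, spec, ppv, npv, dtr = tor_metrics(tp=900, fp=20, tn=150, fn=600)
    ```

    A high PPV with imperfect sensitivity is exactly the pattern the abstract describes: few survivors are misclassified relative to flagged cases, but some are, which drives the patient-safety concern.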

  15. QEFSM model and Markov Algorithm for translating Quran reciting rules into Braille code

    Directory of Open Access Journals (Sweden)

    Abdallah M. Abualkishik

    2015-07-01

    Full Text Available The Holy Quran is the central religious verbal text of Islam. Muslims are expected to read, understand, and apply the teachings of the Holy Quran. The Holy Quran was translated to Braille code as normal Arabic text without its reciting rules included, so users of that transliteration cannot recite the Quran in the right way. In this work, the Quran Braille Translator (QBT) is presented as a specific translator that converts Quran verses and their reciting rules into Braille code. The Quran Extended Finite State Machine (QEFSM) model is proposed in this study, as it is able to detect the Quran reciting rules (QRR) from the Quran text. Basis path testing was used to evaluate the inner working of the model by checking all of its test cases. A Markov Algorithm (MA) was used for translating the detected QRR and Quran text into the matching Braille code. The data entries for QBT are Arabic letters and diacritics. The output is given in two lines of Braille symbols: the first line carries the proposed Quran reciting rules and the second the Quran script.

  16. Delay model and performance testing for FPGA carry chain TDC

    International Nuclear Information System (INIS)

    Kang Xiaowen; Liu Yaqiang; Cui Junjian; Yang Zhangcan; Jin Yongjie

    2011-01-01

    Time-of-flight (TOF) information would improve the performance of PET (positron emission tomography), and TDC design is a key technique. This paper proposes a delay model for carry-chain TDCs. By varying the significant delay parameters of the model, the resulting differences in TDC performance were compared, and a Time-to-Digital Converter (TDC) based on the carry-chain method was finally implemented in an FPGA (EP2C20Q240C8N) with a 69 ps LSB and a maximum error below 2 LSB, which meets the TOF requirement. A coaxial-cable measurement method for TDC testing, requiring no high-precision test equipment, is also proposed. (authors)

  17. Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0

    Science.gov (United States)

    Schmidt, Conrad K.

    2013-01-01

    Model-based Systems Engineering (MBSE) is an emerging modeling application that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all system lifecycles. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.

  18. Modelling of LOCA Tests with the BISON Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, Richard L [Idaho National Laboratory; Pastore, Giovanni [Idaho National Laboratory; Novascone, Stephen Rhead [Idaho National Laboratory; Spencer, Benjamin Whiting [Idaho National Laboratory; Hales, Jason Dean [Idaho National Laboratory

    2016-05-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that is under development at Idaho National Laboratory (USA). Recent advances of BISON include the extension of the code to the analysis of LWR fuel rod behaviour during loss-of-coolant accidents (LOCAs). In this work, BISON models for the phenomena relevant to LWR cladding behaviour during LOCAs are described, followed by presentation of code results for the simulation of LOCA tests. Analysed experiments include separate effects tests of cladding ballooning and burst, as well as the Halden IFA-650.2 fuel rod test. Two-dimensional modelling of the experiments is performed, and calculations are compared to available experimental data. Comparisons include cladding burst pressure and temperature in separate effects tests, as well as the evolution of fuel rod inner pressure during ballooning and time to cladding burst. Furthermore, BISON three-dimensional simulations of separate effects tests are performed, which demonstrate the capability to reproduce the effect of azimuthal temperature variations in the cladding. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and Halden Reactor Project, and the IAEA Coordinated Research Project FUMAC.

  19. Design Transformations for Rule-based Procedural Modeling

    KAUST Repository

    Lienhard, Stefan; Lau, Cheryl; Müller, Pascal; Wonka, Peter; Pauly, Mark

    2017-01-01

    We introduce design transformations for rule-based procedural models, e.g., for buildings and plants. Given two or more procedural designs, each specified by a grammar, a design transformation combines elements of the existing designs to generate new designs. We introduce two technical components to enable design transformations. First, we extend the concept of discrete rule switching to rule merging, leading to a very large shape space for combining procedural models. Second, we propose an algorithm to jointly derive two or more grammars, called grammar co-derivation. We demonstrate two applications of our work: we show that our framework leads to a larger variety of models than previous work, and we show fine-grained transformation sequences between two procedural models.

  20. Design Transformations for Rule-based Procedural Modeling

    KAUST Repository

    Lienhard, Stefan

    2017-05-24

    We introduce design transformations for rule-based procedural models, e.g., for buildings and plants. Given two or more procedural designs, each specified by a grammar, a design transformation combines elements of the existing designs to generate new designs. We introduce two technical components to enable design transformations. First, we extend the concept of discrete rule switching to rule merging, leading to a very large shape space for combining procedural models. Second, we propose an algorithm to jointly derive two or more grammars, called grammar co-derivation. We demonstrate two applications of our work: we show that our framework leads to a larger variety of models than previous work, and we show fine-grained transformation sequences between two procedural models.

  1. 1997 Performance Testing of Multi-Metal Continuous Emissions Monitors

    International Nuclear Information System (INIS)

    1998-01-01

    Five prototype and two commercially available multi-metal continuous emissions monitors (CEMs) were tested in September 1997 at the Rotary Kiln Incinerator Simulator (RKIS) facility at the EPA National Risk Management Research Laboratory, Research Triangle Park, North Carolina. The seven CEMs were tested side by side in a long section of duct following the secondary combustion chamber of the RKIS. Two different concentrations of six toxic metals were introduced into the incinerator: approximately 15 and 75 µg/dscm of arsenic, beryllium, cadmium, chromium, lead, and mercury. (We also tested for antimony, but we are not reporting on it here because EPA recently dropped antimony from the list of metals addressed by the draft MACT rule.) These concentrations were chosen to be close to the emission standards in the draft MACT rule and the estimated Method Detection Limit (MDL) required of a CEM for regulatory compliance purposes. Results from this test show that no CEM currently meets the performance specifications in the EPA draft MACT rule for hazardous waste incinerators. Only one of the CEMs tested was able to measure all six metals at the concentrations tested; even so, its relative accuracy varied between 35% and 100%, not 20% or less as required by the EPA performance specification. We therefore conclude that no CEM is ready for long-term performance validation for compliance monitoring applications. Because sampling and measuring Hg is a recurring problem for multi-metal CEMs as well as Hg CEMs, we recommended that developers participate in a 1998 DOE-sponsored workshop to solve these and other common CEM measurement issues.

  2. Fuzzy rule-based model for hydropower reservoirs operation

    Energy Technology Data Exchange (ETDEWEB)

    Moeini, R.; Afshar, A.; Afshar, M.H. [School of Civil Engineering, Iran University of Science and Technology, Tehran (Iran, Islamic Republic of)

    2011-02-15

    Real-time hydropower reservoir operation is a continuous decision-making process of determining the water level of a reservoir or the volume of water released from it. Hydropower operation is usually based on operating policies and rules defined and decided upon in strategic planning. This paper presents a fuzzy rule-based model for the operation of hydropower reservoirs. The proposed fuzzy rule-based model presents a set of suitable operating rules for release from the reservoir based on ideal or target storage levels. The model operates on an 'if-then' principle, in which the 'if' is a vector of fuzzy premises and the 'then' is a vector of fuzzy consequences. In this paper, reservoir storage, inflow, and period are used as premises and the release as the consequence. The steps involved in the development of the model include construction of membership functions for the inflow, storage and release, formulation of fuzzy rules, implication, aggregation and defuzzification. The required knowledge base for the formulation of the fuzzy rules is obtained from a stochastic dynamic programming (SDP) model with a steady-state policy. The proposed model is applied to the hydropower operation of the 'Dez' reservoir in Iran, and the results are presented and compared with those of the SDP model. The results indicate the ability of the method to solve hydropower reservoir operation problems. (author)
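    The 'if-then' machinery described in this record can be sketched in a few lines. The following minimal Mamdani-style example uses a single premise (storage) and weighted-average defuzzification; the triangular membership functions and release values are illustrative assumptions, not the calibrated 'Dez' model:

    ```python
    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b, falling to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    # Premise: storage fraction of capacity; consequence: representative release.
    storage_sets = {"low": (-0.5, 0.0, 0.5),
                    "medium": (0.0, 0.5, 1.0),
                    "high": (0.5, 1.0, 1.5)}
    rules = {"low": 20.0, "medium": 60.0, "high": 120.0}  # "if storage is X then release R_X" (m^3/s)

    def release(storage):
        """Fire all rules, then defuzzify by weighted average of consequences."""
        num = den = 0.0
        for label, (a, b, c) in storage_sets.items():
            w = tri(storage, a, b, c)  # degree to which the premise holds
            num += w * rules[label]
            den += w
        return num / den if den else 0.0
    ```

    A full model would use vectors of premises (storage, inflow, period) and fuzzy consequences, but the implication-aggregation-defuzzification pipeline is the same.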

  3. Tree Branching: Leonardo da Vinci's Rule versus Biomechanical Models

    Science.gov (United States)

    Minamino, Ryoko; Tateno, Masaki

    2014-01-01

    This study examined Leonardo da Vinci's rule (i.e., the sum of the cross-sectional area of all tree branches above a branching point at any height is equal to the cross-sectional area of the trunk or the branch immediately below the branching point) using simulations based on two biomechanical models: the uniform stress and elastic similarity models. Model calculations of the daughter/mother ratio (i.e., the ratio of the total cross-sectional area of the daughter branches to the cross-sectional area of the mother branch at the branching point) showed that both biomechanical models agreed with da Vinci's rule when the branching angles of daughter branches and the weights of lateral daughter branches were small; however, the models deviated from da Vinci's rule as the weights and/or the branching angles of lateral daughter branches increased. The calculated values of the two models were largely similar but differed in some ways. Field measurements of Fagus crenata and Abies homolepis also fit this trend, wherein models deviated from da Vinci's rule with increasing relative weights of lateral daughter branches. However, this deviation was small for a branching pattern in nature, where empirical measurements were taken under realistic measurement conditions; thus, da Vinci's rule did not critically contradict the biomechanical models in the case of real branching patterns, though the model calculations described the contradiction between da Vinci's rule and the biomechanical models. The field data for Fagus crenata fit the uniform stress model best, indicating that stress uniformity is the key constraint of branch morphology in Fagus crenata rather than elastic similarity or da Vinci's rule. On the other hand, mechanical constraints are not necessarily significant in the morphology of Abies homolepis branches, depending on the number of daughter branches. Rather, these branches were often in agreement with da Vinci's rule. PMID:24714065

  4. Tree branching: Leonardo da Vinci's rule versus biomechanical models.

    Science.gov (United States)

    Minamino, Ryoko; Tateno, Masaki

    2014-01-01

    This study examined Leonardo da Vinci's rule (i.e., the sum of the cross-sectional area of all tree branches above a branching point at any height is equal to the cross-sectional area of the trunk or the branch immediately below the branching point) using simulations based on two biomechanical models: the uniform stress and elastic similarity models. Model calculations of the daughter/mother ratio (i.e., the ratio of the total cross-sectional area of the daughter branches to the cross-sectional area of the mother branch at the branching point) showed that both biomechanical models agreed with da Vinci's rule when the branching angles of daughter branches and the weights of lateral daughter branches were small; however, the models deviated from da Vinci's rule as the weights and/or the branching angles of lateral daughter branches increased. The calculated values of the two models were largely similar but differed in some ways. Field measurements of Fagus crenata and Abies homolepis also fit this trend, wherein models deviated from da Vinci's rule with increasing relative weights of lateral daughter branches. However, this deviation was small for a branching pattern in nature, where empirical measurements were taken under realistic measurement conditions; thus, da Vinci's rule did not critically contradict the biomechanical models in the case of real branching patterns, though the model calculations described the contradiction between da Vinci's rule and the biomechanical models. The field data for Fagus crenata fit the uniform stress model best, indicating that stress uniformity is the key constraint of branch morphology in Fagus crenata rather than elastic similarity or da Vinci's rule. On the other hand, mechanical constraints are not necessarily significant in the morphology of Abies homolepis branches, depending on the number of daughter branches. Rather, these branches were often in agreement with da Vinci's rule.
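    The daughter/mother ratio examined in these two records is straightforward to compute. A small sketch with illustrative branch diameters (da Vinci's rule predicts a ratio of 1 at every branching point):

    ```python
    import math

    def cross_section(diameter):
        """Cross-sectional area of a branch treated as a circle."""
        return math.pi * (diameter / 2) ** 2

    def daughter_mother_ratio(mother_d, daughter_ds):
        """Ratio of summed daughter cross-sections to the mother's at a branching point."""
        return sum(cross_section(d) for d in daughter_ds) / cross_section(mother_d)

    # Two equal daughters obey the rule when d_mother^2 = 2 * d_daughter^2:
    ratio = daughter_mother_ratio(10.0, [10.0 / math.sqrt(2)] * 2)
    ```

    Deviations of this ratio from 1, as functions of branching angle and lateral-branch weight, are what the study compares against the uniform stress and elastic similarity models.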

  5. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    Full Text Available When developing an information system, it is important to create clear models and to choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL and OCL rule-specification languages and the UML, DFD, CPN, EPC, IDEF3 and BPMN business process modeling languages. It presents a theoretical comparison of business rule and business process modeling languages, comparing the different languages against selected modeling aspects. Finally, the best-fitting set of languages is selected for a three-layer framework for business-rule-based software modeling.

  6. Optimal sizing of plug-in fuel cell electric vehicles using models of vehicle performance and system cost

    International Nuclear Information System (INIS)

    Xu, Liangfei; Ouyang, Minggao; Li, Jianqiu; Yang, Fuyuan; Lu, Languang; Hua, Jianfeng

    2013-01-01

    Highlights: ► An analytical model for vehicle performance and power-train parameters. ► Quantitative relationships between vehicle performance and power-train parameters. ► Optimal sizing rules that help in designing an optimal PEM fuel cell power-train. ► On-road testing showing the performance of the proposed vehicle. -- Abstract: This paper presents an optimal sizing method for plug-in proton exchange membrane (PEM) fuel cell and lithium-ion battery (LIB) powered city buses. We propose a theoretical model describing the relationship between component parameters and vehicle performance. Analysis results show that within the working range of the electric motor, the maximal velocity and driving distance are influenced linearly by the parameters of the components, e.g. fuel cell efficiency, fuel cell output power, stored hydrogen mass, vehicle auxiliary power, battery capacity, and battery average resistance. Moreover, accelerating time is also linearly dependent on the abovementioned parameters, except for those of the battery. Next, we attempt to minimize fixed and operating costs by introducing an optimal sizing problem that uses the requirements on vehicle performance as constraints. By solving this problem, we obtain several optimal sizing rules. Finally, we use these rules to design a plug-in PEM fuel cell city bus and present performance results obtained by on-road testing.
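    The linear dependence of driving distance on component parameters can be seen in a back-of-envelope energy balance. All function names and values below are illustrative assumptions, not the paper's model or bus design:

    ```python
    H2_LHV_KWH_PER_KG = 33.3  # lower heating value of hydrogen, kWh/kg

    def driving_range_km(h2_mass_kg, fc_efficiency, battery_kwh,
                         avg_drive_power_kw, aux_power_kw, avg_speed_kmh):
        """Range = usable onboard energy / energy consumed per km.

        Range is linear in hydrogen mass, fuel cell efficiency and battery
        capacity, matching the linear relationships the abstract describes.
        """
        energy_kwh = h2_mass_kg * H2_LHV_KWH_PER_KG * fc_efficiency + battery_kwh
        energy_per_km = (avg_drive_power_kw + aux_power_kw) / avg_speed_kmh
        return energy_kwh / energy_per_km

    r = driving_range_km(h2_mass_kg=20.0, fc_efficiency=0.5, battery_kwh=100.0,
                         avg_drive_power_kw=80.0, aux_power_kw=10.0, avg_speed_kmh=30.0)
    ```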

  7. Life fraction rules

    International Nuclear Information System (INIS)

    Maile, K.

    1989-01-01

    Evaluations for lifetime estimation of high-temperature-loaded HTR components under creep-fatigue load were performed. The evaluations were carried out on the basis of experimental data from strain-controlled fatigue tests, with and without hold times, performed on the material NiCr 22 Co 12 Mo (Inconel 617). Life prediction was made by means of the linear damage accumulation rule. Due to the high temperatures, no realistic estimates of creep damage can be obtained with this rule, so the rule was modified. The modifications consist of a different analysis of the relaxation curve, including a different calculation of the creep damage estimate, and of an extended rule taking into consideration the interaction between creep and fatigue. In order to achieve better result transparency and to reduce data-set-dependent scattering of results, a round robin with a given data set was carried out. The round robin showed that for a given test temperature of T = 950 deg C, a realistic estimate of damage can be obtained with each modification. Furthermore, a reduction of the resulting scatterbands in the interaction diagram can be observed, i.e. the practicability of the rule has been increased. (orig.)
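    The linear damage accumulation rule referred to in this record sums cycle fractions (fatigue) and time fractions (creep). A small sketch with illustrative allowable-cycle and rupture-time values, not Inconel 617 data:

    ```python
    def creep_fatigue_damage(cycle_blocks, hold_blocks):
        """Linear damage accumulation: D = sum(n_i / N_i) + sum(t_j / t_rj).

        cycle_blocks: pairs (applied cycles n_i, allowable cycles N_i)
        hold_blocks:  pairs (hold time t_j, creep rupture time t_rj)
        In the simplest form of the rule, failure is predicted when D
        reaches 1.0; creep-fatigue codes replace this with an
        interaction envelope.
        """
        fatigue = sum(n / N for n, N in cycle_blocks)
        creep = sum(t / tr for t, tr in hold_blocks)
        return fatigue + creep

    # 200 cycles against an allowable 1000, plus 50 h of hold time
    # against a 125 h rupture time (all numbers hypothetical):
    D = creep_fatigue_damage(cycle_blocks=[(200, 1000)], hold_blocks=[(50.0, 125.0)])
    ```

    The modifications discussed in the record change how the creep term is evaluated from the relaxation curve, not the additive structure itself.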

  8. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems

    Science.gov (United States)

    Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.

    2013-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887

  9. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems.

    Science.gov (United States)

    Chylek, Lily A; Harris, Leonard A; Tung, Chang-Shung; Faeder, James R; Lopez, Carlos F; Hlavacek, William S

    2014-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and posttranslational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). © 2013 Wiley Periodicals, Inc.
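    The combinatorial gap between rules and reactions described in these two records is easy to illustrate: one binding rule that is indifferent to modification state implies a reaction for every state combination. A small sketch with hypothetical molecules and site counts:

    ```python
    from itertools import product

    def implied_binding_reactions(a_sites, b_sites):
        """Enumerate the reactions implied by a single rule 'A binds B'
        that ignores the phosphorylation state of either molecule."""
        a_states = list(product("up", repeat=a_sites))  # u = unmodified, p = phosphorylated
        b_states = list(product("up", repeat=b_sites))
        # One reaction per pair of states; all inherit the rule's rate law.
        return [(a, b) for a in a_states for b in b_states]

    # 1 rule over molecules with 3 and 2 sites -> 2**3 * 2**2 = 32 reactions.
    reactions = implied_binding_reactions(3, 2)
    ```

    This is the coarse graining the abstract describes: the rule base stays one line while the implied network grows exponentially with the number of sites.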

  10. High speed high stakes scoring rule: assessing the performance of a new scoring rule for digital assessment

    NARCIS (Netherlands)

    Klinkenberg, S.; Kalz, M.; Ras, E.

    2014-01-01

    In this paper we will present the results of a three year subsidized research project investigating the performance of a new scoring rule for digital assessment. The scoring rule incorporates response time and accuracy in an adaptive environment. The project aimed to assess the validity and

  11. PSA Model Improvement Using Maintenance Rule Function Mapping

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Mi Ro [KHNP-CRI, Nuclear Safety Laboratory, Daejeon (Korea, Republic of)

    2011-10-15

    The Maintenance Rule (MR) program is, in nature, a performance-based program. Therefore, risk information derived from the Probabilistic Safety Assessment (PSA) model is introduced into the MR program during the safety significance determination and performance criteria selection processes. However, this process also facilitates the identification of vulnerabilities in currently utilized PSA models and offers a means of improving them. To find vulnerabilities in an existing PSA model, an initial review determines whether the safety-related MR functions are included in the PSA model. Because safety-related MR functions are related to accident prevention and mitigation, it is generally necessary for them to be included in the PSA model. In the process of determining the safety significance of each function, quantitative risk importance levels are determined through a process known as mapping PSA model basic events to MR functions. During this process, inadequate or overlooked models are commonly uncovered. In this paper, the PSA model and the MR program of Wolsong Unit 1 were used as references.

  12. Assessing the operation rules of a reservoir system based on a detailed modelling-chain

    Science.gov (United States)

    Bruwier, M.; Erpicum, S.; Pirotton, M.; Archambeau, P.; Dewals, B.

    2014-09-01

    According to available climate change scenarios for Belgium, drier summers and wetter winters are expected. In this study, we focus on two multi-purpose reservoirs located in the Vesdre catchment, which is part of the Meuse basin. The current operation rules of the reservoirs are first analysed. Next, the impacts of two climate change scenarios are assessed and enhanced operation rules are proposed to mitigate these impacts. For this purpose, an integrated model of the catchment was used. It includes a hydrological model, one-dimensional and two-dimensional hydraulic models of the river and its main tributaries, a model of the reservoir system and a flood damage model. Five performance indicators of the reservoir system have been defined, reflecting its ability to provide sufficient drinking water, to control floods, to produce hydropower and to reduce low-flow conditions. As shown by the results, enhanced operation rules may improve the drinking water potential and the low-flow augmentation, while the existing operation rules are efficient for flood control and for hydropower production.

  13. Assessing the operation rules of a reservoir system based on a detailed modelling chain

    Science.gov (United States)

    Bruwier, M.; Erpicum, S.; Pirotton, M.; Archambeau, P.; Dewals, B. J.

    2015-03-01

    According to available climate change scenarios for Belgium, drier summers and wetter winters are expected. In this study, we focus on two multi-purpose reservoirs located in the Vesdre catchment, which is part of the Meuse basin. The current operation rules of the reservoirs are first analysed. Next, the impacts of two climate change scenarios are assessed and enhanced operation rules are proposed to mitigate these impacts. For this purpose, an integrated model of the catchment was used. It includes a hydrological model, one-dimensional and two-dimensional hydraulic models of the river and its main tributaries, a model of the reservoir system and a flood damage model. Five performance indicators of the reservoir system have been defined, reflecting its ability to provide sufficient drinking water, to control floods, to produce hydropower and to reduce low-flow conditions. As shown by the results, enhanced operation rules may improve the drinking water potential and the low-flow augmentation while the existing operation rules are efficient for flood control and for hydropower production.

  14. Rule-Blocking and Forward-Looking Conditions in the Computational Modelling of Pāṇinian Derivation

    Science.gov (United States)

    Scharf, Peter M.

    Attempting to model Pāṇinian procedure computationally forces one to clarify concepts explicitly and allows one to test various versions and interpretations of his grammar against each other and against bodies of extant Sanskrit texts. To model Pāṇinian procedure requires creating data structures and a framework that allow one to approximate the statement of Pāṇinian rules in an executable language. Scharf (2009: 117-125) provided a few examples of how rules would be formulated in a computational model of Pāṇinian grammar as opposed to in software that generated speech forms without regard to Pāṇinian procedure. Mishra (2009) described the extensive use of attributes to track classification, marking and other features of phonetic strings. Goyal, Kulkarni, and Behera (2009, especially sec. 3.5) implemented a model of the asiddhavat section of rules (6.4.22-129) in which the state of the data passed to rules of the section is maintained unchanged and is utilized by those rules as conditions, yet the rules of the section are applied in parallel, and the result of all applicable rules applying exits the section. The current paper describes Scharf and Hyman's implementation of rule blocking and forward-looking conditions. The former deals with complex groups of rules concerned with domains included within the scope of a general rule. The latter concerns a case where a decision at an early stage in the derivation requires evaluation of conditions that do not obtain until a subsequent stage in the derivation.

  15. Performance testing of real-time AI systems using the activation framework

    International Nuclear Information System (INIS)

    Becker, L.; Duckworth, J.; Laznovsky, A.; Green, P.

    1992-01-01

    This paper describes methods for automated performance testing of real-time artificial intelligence systems using the Activation Framework software development tool. The Activation Framework is suitable for applications such as the diagnosis of power system failures, which require the interpretation of large volumes of data in real time. The Activation Framework consists of tools for compiling groups of expert system rules into executable code modules, for automatically generating code modules from high-level system configuration descriptions, and for automatically generating command files for program compilation and linking. It includes an operating-system environment providing the code that is common from one real-time AI application to the next. It also includes mechanisms, described here, for automatic performance testing. The principal emphasis of this paper is on a rule-based language used to capture performance specifications. A specification is compiled into code modules which are used to automatically test the system. This testing can validate that the system meets performance requirements during development and after maintenance. A large number of tests can be randomly generated and executed, and the correctness of the outputs automatically validated. The paper also describes graph-directed testing methods to minimize the number of test runs required.

  16. Rules, Models, and Self-Reinforcement in Children

    Science.gov (United States)

    Hildebrandt, David E.; And Others

    1973-01-01

    The study found that concordance between a rule and a recent model's behavior was most effective in leading to acceptance of the rule, despite a tendency for the subjects to adopt lenient self-reinforcement criteria when given an opportunity to do so. (JB)

  17. Discovery of Transition Rules for Cellular Automata Using Artificial Bee Colony and Particle Swarm Optimization Algorithms in Urban Growth Modeling

    Directory of Open Access Journals (Sweden)

    Fereydoun Naghibi

    2016-12-01

    This paper presents an advanced method in urban growth modeling to discover transition rules of cellular automata (CA) using the artificial bee colony (ABC) optimization algorithm. Comparisons between the simulation results of CA models optimized by the ABC algorithm and by the particle swarm optimization (PSO) algorithm, as intelligent approaches, were also performed to evaluate the potential of the proposed methods. According to previous studies, swarm intelligence algorithms for solving optimization problems such as discovering transition rules of CA in land use change/urban growth modeling can produce reasonable results. Modeling urban growth as a dynamic process is not straightforward because of the nonlinearity and heterogeneity among the effective variables involved, which can pose a number of challenges for traditional CA. The ABC algorithm, a powerful new swarm-based optimization algorithm, can be used to capture optimized transition rules of CA. This paper proposes a methodology based on remote sensing data for modeling urban growth with CA calibrated by the ABC algorithm. The performance of the ABC-CA, PSO-CA, and CA-logistic models in land use change detection is tested for the city of Urmia, Iran, between 2004 and 2014. Validations of the models based on statistical measures such as overall accuracy, figure of merit, and total operating characteristic were made. We showed that the overall accuracy of the ABC-CA model was 89%, which was 1.5% and 6.2% higher than those of the PSO-CA and CA-logistic models, respectively. Moreover, the allocation disagreement (simulation error) of the simulation results for the ABC-CA, PSO-CA, and CA-logistic models is 11%, 12.5%, and 17.2%, respectively.
Finally, for all evaluation indices including running time, convergence capability, flexibility, statistical measurements, and the produced spatial patterns, the ABC-CA model performance showed relative improvement and therefore its superiority was

  18. Test results of the SMES model coil. Pulse performance

    International Nuclear Information System (INIS)

    Hamajima, Takataro; Shimada, Mamoru; Ono, Michitaka

    1998-01-01

    A model coil for superconducting magnetic energy storage (SMES model coil) has been developed to establish the component technologies needed for a small-scale 100 kWh SMES device. The SMES model coil was fabricated, and performance tests were carried out in 1996. The coil was successfully charged up to around 30 kA and discharged to zero at the same ramp rate of magnetic field as experienced in a 100 kWh SMES device. AC loss in the coil was measured by an enthalpy method, with ramp rate and flat-top current as parameters. The results were evaluated by analysis and compared with short-sample test results. The measured hysteresis loss is in good agreement with that estimated from the short-sample results. It was found that the coupling loss of the coil consists of two major coupling time constants. One is a short time constant of about 200 ms, which is in agreement with the test results of a short real conductor. The other is a long time constant of about 30 s, which could not be expected from the short-sample test results. (author)

  19. Maintenance Personnel Performance Simulation (MAPPS) model: description of model content, structure, and sensitivity testing. Volume 2

    International Nuclear Information System (INIS)

    Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Knee, H.E.

    1984-12-01

    This volume of NUREG/CR-3626 presents details of the content, structure, and sensitivity testing of the Maintenance Personnel Performance Simulation (MAPPS) model that was described in summary in volume one of this report. The MAPPS model is a generalized stochastic computer simulation model developed to simulate the performance of maintenance personnel in nuclear power plants. The MAPPS model considers workplace, maintenance technician, motivation, human factors, and task-oriented variables to yield predictive information about the effects of these variables on successful maintenance task performance. All major model variables are discussed in detail and their implementation and interactive effects are outlined. The model was examined for disqualifying defects from a number of viewpoints, including sensitivity testing. This examination led to the identification of some minor recalibration efforts which were carried out. These positive results indicate that MAPPS is ready for initial and controlled applications which are in conformity with its purposes.

  20. Guidelines for visualizing and annotating rule-based models.

    Science.gov (United States)

    Chylek, Lily A; Hu, Bin; Blinov, Michael L; Emonet, Thierry; Faeder, James R; Goldstein, Byron; Gutenkunst, Ryan N; Haugh, Jason M; Lipniacki, Tomasz; Posner, Richard G; Yang, Jin; Hlavacek, William S

    2011-10-01

    Rule-based modeling provides a means to represent cell signaling systems in a way that captures site-specific details of molecular interactions. For rule-based models to be more widely understood and (re)used, conventions for model visualization and annotation are needed. We have developed the concepts of an extended contact map and a model guide for illustrating and annotating rule-based models. An extended contact map represents the scope of a model by providing an illustration of each molecule, molecular component, direct physical interaction, post-translational modification, and enzyme-substrate relationship considered in a model. A map can also illustrate allosteric effects, structural relationships among molecular components, and compartmental locations of molecules. A model guide associates elements of a contact map with annotation and elements of an underlying model, which may be fully or partially specified. A guide can also serve to document the biological knowledge upon which a model is based. We provide examples of a map and guide for a published rule-based model that characterizes early events in IgE receptor (FcεRI) signaling. We also provide examples of how to visualize a variety of processes that are common in cell signaling systems but not considered in the example model, such as ubiquitination. An extended contact map and an associated guide can document knowledge of a cell signaling system in a form that is visual as well as executable. As a tool for model annotation, a map and guide can communicate the content of a model clearly and with precision, even for large models.

  1. Simple Decision-Analytic Functions of the AUC for Ruling Out a Risk Prediction Model and an Added Predictor.

    Science.gov (United States)

    Baker, Stuart G

    2018-02-01

    When using risk prediction models, an important consideration is weighing performance against the cost (monetary and harms) of ascertaining predictors. The minimum test tradeoff (MTT) for ruling out a model is the minimum number of all-predictor ascertainments per correct prediction to yield a positive overall expected utility. The MTT for ruling out an added predictor is the minimum number of added-predictor ascertainments per correct prediction to yield a positive overall expected utility. An approximation to the MTT for ruling out a model is 1/[P × H(AUC_Model)], where H(AUC) = AUC − {(1/2)(1 − AUC)}^(1/2), AUC is the area under the receiver operating characteristic (ROC) curve, and P is the probability of the predicted event in the target population. An approximation to the MTT for ruling out an added predictor is 1/[P × {H(AUC_Model2) − H(AUC_Model1)}], where Model 2 includes an added predictor relative to Model 1. The latter approximation requires the Tangent Condition: that the true positive rate at the point on the ROC curve with a slope of 1 is larger for Model 2 than for Model 1. These approximations are suitable for back-of-the-envelope calculations. For example, in a study predicting the risk of invasive breast cancer, Model 2 adds to the predictors in Model 1 a set of 7 single nucleotide polymorphisms (SNPs). Based on the AUCs and the Tangent Condition, an MTT of 7200 was computed, which indicates that 7200 sets of SNPs are needed for every correct prediction of breast cancer to yield a positive overall expected utility. If ascertaining the SNPs costs $500, this MTT suggests that SNP ascertainment is not likely worthwhile for this risk prediction.
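As a sketch only, the MTT approximations described in this abstract can be computed directly; the event probability and AUC values in the example call are illustrative assumptions, not figures from the study:

```python
import math

def h(auc):
    """Baker's H function: H(AUC) = AUC - sqrt((1 - AUC) / 2)."""
    return auc - math.sqrt(0.5 * (1.0 - auc))

def mtt_model(p, auc_model):
    """Approximate minimum test tradeoff (MTT) for ruling out a model,
    where p is the probability of the predicted event in the population."""
    return 1.0 / (p * h(auc_model))

def mtt_added_predictor(p, auc_model2, auc_model1):
    """Approximate MTT for ruling out an added predictor; assumes the
    Tangent Condition holds and AUC_Model2 > AUC_Model1."""
    return 1.0 / (p * (h(auc_model2) - h(auc_model1)))

# Illustrative (hypothetical) values: 10% event probability, AUCs of 0.70 and 0.72.
print(mtt_model(0.1, 0.70))
print(mtt_added_predictor(0.1, 0.72, 0.70))
```

An MTT that is large relative to any plausible number of ascertainments per correct prediction argues against paying for the predictors, which is the reasoning behind the SNP example above.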

  2. Multiprocessor performance modeling with ADAS

    Science.gov (United States)

    Hayes, Paul J.; Andrews, Asa M.

    1989-01-01

    A graph managing strategy referred to as the Algorithm to Architecture Mapping Model (ATAMM) appears useful for the time-optimized execution of application algorithm graphs in embedded multiprocessors and for the performance prediction of graph designs. This paper reports the modeling of ATAMM in the Architecture Design and Assessment System (ADAS) to make an independent verification of ATAMM's performance prediction capability and to provide a user framework for the evaluation of arbitrary algorithm graphs. Following an overview of ATAMM and its major functional rules are descriptions of the ADAS model of ATAMM, methods to enter an arbitrary graph into the model, and techniques to analyze the simulation results. The performance of a 7-node graph example is evaluated using the ADAS model and verifies the ATAMM concept by substantiating previously published performance results.

  3. Learning general phonological rules from distributional information: a computational model.

    Science.gov (United States)

    Calamaro, Shira; Jarosz, Gaja

    2015-04-01

    Phonological rules create alternations in the phonetic realizations of related words. These rules must be learned by infants in order to identify the phonological inventory, the morphological structure, and the lexicon of a language. Recent work proposes a computational model for the learning of one kind of phonological alternation, allophony (Peperkamp, Le Calvez, Nadal, & Dupoux, 2006). This paper extends the model to account for learning of a broader set of phonological alternations and the formalization of these alternations as general rules. In Experiment 1, we apply the original model to new data in Dutch and demonstrate its limitations in learning nonallophonic rules. In Experiment 2, we extend the model to allow it to learn general rules for alternations that apply to a class of segments. In Experiment 3, the model is further extended to allow for generalization by context; we argue that this generalization must be constrained by linguistic principles. Copyright © 2014 Cognitive Science Society, Inc.

  4. Higher Education: New Models, New Rules

    Science.gov (United States)

    Soares, Louis; Eaton, Judith S.; Smith, Burck

    2013-01-01

    The Internet enables new models. In the commercial world, for example, we have eBay, Amazon.com, and Netflix. These new models operate with a different set of rules than do traditional models. New models are emerging in higher education as well--for example, competency-based programs. In addition, courses that are being provided from outside the…

  5. RISMA: A Rule-based Interval State Machine Algorithm for Alerts Generation, Performance Analysis and Monitoring Real-Time Data Processing

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2013-04-01

    The monitoring of real-time systems is a challenging and complicated process. There is thus a continuous need to improve the monitoring process through the use of new intelligent techniques and algorithms for detecting exceptions and anomalous behaviours and for generating the necessary alerts during workflow monitoring of such systems. Interval-based or period-based theorems have been discussed, analysed, and used by many researchers in Artificial Intelligence (AI), philosophy, and linguistics. As explained by Allen, there are 13 relations between any two intervals. There have also been many studies of interval-based temporal reasoning and logics over the past decades. Interval-based theorems can be used for monitoring real-time interval-based data processing. However, increasing the number of processed intervals makes the implementation of such theorems a complex and time-consuming process, as the number of relationships between intervals grows exponentially. To overcome this problem, this paper presents a Rule-based Interval State Machine Algorithm (RISMA) for processing, monitoring, and analysing the behaviour of interval-based data received from real-time sensors. The proposed intelligent algorithm uses the Interval State Machine (ISM) approach to model any number of interval-based data into well-defined states as well as inferring them. An interval-based state transition model and methodology are presented to identify the relationships between the different states of the proposed algorithm. By using such a model, the unlimited number of relationships between similarly large numbers of intervals can be reduced to only 18 direct relationships using the proposed well-defined states. For testing the proposed algorithm, the necessary inference rules and code have been designed and applied to the continuous data received in near real-time from the stations of the International Monitoring System (IMS) by the International Data Centre (IDC) of the Preparatory
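The 13 Allen relations referred to in this abstract can be enumerated with a small classifier over interval endpoints; this is a generic sketch of Allen's algebra, not the RISMA implementation:

```python
def allen_relation(a, b):
    """Classify the Allen relation between intervals a = (a1, a2) and
    b = (b1, b2), assuming a1 < a2 and b1 < b2. Returns one of the
    13 relation names."""
    a1, a2 = a
    b1, b2 = b
    if a2 < b1:
        return "before"
    if b2 < a1:
        return "after"
    if a2 == b1:
        return "meets"
    if b2 == a1:
        return "met-by"
    if a1 == b1 and a2 == b2:
        return "equals"
    if a1 == b1:
        return "starts" if a2 < b2 else "started-by"
    if a2 == b2:
        return "finishes" if a1 > b1 else "finished-by"
    if b1 < a1 and a2 < b2:
        return "during"
    if a1 < b1 and b2 < a2:
        return "contains"
    return "overlaps" if a1 < b1 else "overlapped-by"
```

The quadratic blow-up the abstract mentions comes from evaluating such a relation for every pair of tracked intervals; RISMA's state-machine formulation is a way of avoiding that pairwise enumeration.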

  6. Residents' surgical performance during the laboratory years: an analysis of rule-based errors.

    Science.gov (United States)

    Nathwani, Jay N; Wise, Brett J; Garren, Margaret E; Mohamadipanah, Hossein; Van Beek, Nicole; DiMarco, Shannon M; Pugh, Carla M

    2017-11-01

    Nearly one-third of surgical residents will enter into academic development during their surgical residency by dedicating time to a research fellowship for 1-3 y. Major interest lies in understanding how laboratory residents' surgical skills are affected by minimal clinical exposure during academic development. A widely held concern is that the time away from clinical exposure results in surgical skills decay. This study examines the impact of the academic development years on residents' operative performance. We hypothesize that the use of repeated, annual assessments may result in learning even without individual feedback on participants' simulated performance. Surgical performance data were collected from laboratory residents (postgraduate years 2-5) during the summers of 2014, 2015, and 2016. Residents had 15 min to complete a shortened, simulated laparoscopic ventral hernia repair procedure. Final hernia repair skins from all participants were scored using a previously validated checklist. An analysis of variance test compared the mean performance scores of repeat participants to those of first-time participants. Twenty-seven (37% female) laboratory residents provided 2-year assessment data over the 3-year span of the study. Second-time performance revealed improvement from a mean score of 14 (standard error = 1.0) in the first year to 17.2 (SD = 0.9) in the second year (F[1, 52] = 5.6, P = 0.022). Detailed analysis demonstrated improvement in performance for 3 grading criteria that were considered to be rule-based errors. There was no improvement in operative strategy errors. Analysis of longitudinal performance of laboratory residents shows higher scores for repeat participants in the category of rule-based errors. These findings suggest that laboratory residents can learn from rule-based mistakes when provided with annual performance-based assessments. This benefit was not seen with operative strategy errors and has important implications for

  7. Split-Ring Springback Simulations with the Non-associated Flow Rule and Evolutionary Elastic-Plasticity Models

    Science.gov (United States)

    Lee, K. J.; Choi, Y.; Choi, H. J.; Lee, J. Y.; Lee, M. G.

    2018-06-01

    Finite element simulations and experiments for the split-ring test were conducted to investigate the effect of anisotropic constitutive models on the predictive capability of sheet springback. As an alternative to the commonly employed associated flow rule, a non-associated flow rule for Hill1948 yield function was implemented in the simulations. Moreover, the evolution of anisotropy with plastic deformation was efficiently modeled by identifying equivalent plastic strain-dependent anisotropic coefficients. Comparative study with different yield surfaces and elasticity models showed that the split-ring springback could be best predicted when the anisotropy in both the R value and yield stress, their evolution and variable apparent elastic modulus were taken into account in the simulations. Detailed analyses based on deformation paths superimposed on the anisotropic yield functions predicted by different constitutive models were provided to understand the complex springback response in the split-ring test.

  8. High Performance Electrical Modeling and Simulation Verification Test Suite - Tier I; TOPICAL

    International Nuclear Information System (INIS)

    SCHELLS, REGINA L.; BOGDAN, CAROLYN W.; WIX, STEVEN D.

    2001-01-01

    This document describes the High Performance Electrical Modeling and Simulation (HPEMS) Global Verification Test Suite (VERTS). The VERTS is a regression test suite used for verification of the electrical circuit simulation codes currently being developed by the HPEMS code development team. This document contains descriptions of the Tier I test cases.

  9. Constructing rule-based models using the belief functions framework

    NARCIS (Netherlands)

    Almeida, R.J.; Denoeux, T.; Kaymak, U.; Greco, S.; Bouchon-Meunier, B.; Coletti, G.; Fedrizzi, M.; Matarazzo, B.; Yager, R.R.

    2012-01-01

    Abstract. We study a new approach to regression analysis. We propose a new rule-based regression model using the theoretical framework of belief functions. For this purpose we use the recently proposed Evidential c-means (ECM) to derive rule-based models solely from data. ECM allocates, for each

  10. Modeling and Cognitive Behavior: The Effects of Modeling, Modes of Modeling and Selected Model Attributes on Rule-Governed Language Behavior.

    Science.gov (United States)

    Grieshop, James Ivo

    The effect of modeling on the performance of rule-governed language behaviors of 208 male and female, Anglo and Chicano, sixth grade students in Albuquerque, N.M. was experimentally investigated. Eight boys and 8 girls (4 each Chicano and Anglo) were randomly assigned to each of the 12 experimental conditions and to the control group. Three modes…

  11. Selecting short-statured children needing growth hormone testing: Derivation and validation of a clinical decision rule

    Directory of Open Access Journals (Sweden)

    Bréart Gérard

    2008-07-01

    Background: Numerous short-statured children are evaluated for growth hormone (GH) deficiency (GHD). In most patients, GH provocative tests are normal and are thus in retrospect unnecessary. Methods: A retrospective cohort study was conducted to identify predictors of GHD in children seen for short stature, and to construct a very sensitive and fairly specific predictive tool to avoid unnecessary GH provocative tests. GHD was defined by the presence of 2 GH concentration peaks Results: The initial study included 167 patients, 36 (22%) of whom had GHD, including 5 (3%) with certain GHD. Independent predictors of GHD were: growth rate Conclusion: We have derived and performed an internal validation of a highly sensitive decision rule that could safely help to avoid more than 2/3 of the unnecessary GH tests. External validation of this rule is needed before any application.

  12. Development of turbopump cavitation performance test facility and the test of inducer performance

    International Nuclear Information System (INIS)

    Sohn, Dong Kee; Kim, Chun Tak; Yoon, Min Soo; Cha, Bong Jun; Kim, Jin Han; Yang, Soo Seok

    2001-01-01

    A performance test facility for turbopump inducer cavitation was developed and inducer cavitation performance tests were performed. The major components of the test facility are the driving unit, test section, piping, water tank, and data acquisition and control system. The maximum testing capabilities of this facility are as follows: flow rate, 30 kg/s; pressure, 13 bar; rotational speed, 10,000 rpm. This cavitation test facility is characterized by a booster pump installed at the pump outlet, which extends the flow rate range, and by a pressure control system that can bring the line pressure down to the vapor pressure. A vacuum pump is used to remove dissolved air from the water as well as to lower the line pressure. Performance tests were carried out and preliminary data for the test model inducer were obtained. Cavitation performance tests and cavitation bubble flow visualization were also made. This facility was originally designed for turbopump inducer performance and cavitation tests; however, it can be applied to pump impeller performance tests in the future with little modification.

  13. The Balance-Scale Task Revisited: A Comparison of Statistical Models for Rule-Based and Information-Integration Theories of Proportional Reasoning.

    Directory of Open Access Journals (Sweden)

    Abe D Hofman

    We propose and test three statistical models for the analysis of children's responses to the balance scale task, a seminal task for studying proportional reasoning. We use a latent class modelling approach to formulate a rule-based latent class model (RB LCM), following from a rule-based perspective on proportional reasoning, and a new statistical model, the Weighted Sum Model, following from an information-integration approach. Moreover, a hybrid LCM using item covariates is proposed, combining aspects of both the rule-based and information-integration perspectives. These models are applied to two different datasets: a standard paper-and-pencil test dataset (N = 779), and a dataset collected within an online learning environment that included direct feedback, time pressure, and a reward system (N = 808). For the paper-and-pencil dataset the RB LCM resulted in the best fit, whereas for the online dataset the hybrid LCM provided the best fit. The standard paper-and-pencil dataset yielded more evidence for distinct solution rules than the online dataset, in which quantitative item characteristics are more prominent in determining responses. These results shed new light on the discussion between sequential rule-based and information-integration perspectives on cognitive development.
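A minimal illustration of the information-integration idea is a weighted-sum response rule for a single balance-scale item; the function and its weights `a` and `b` are hypothetical, not the fitted Weighted Sum Model from the paper:

```python
def weighted_sum_response(w_left, d_left, w_right, d_right, a=1.0, b=1.0):
    """Predict the response to a balance-scale item from a weighted sum of
    the weight difference and the distance difference; a and b are free
    weights on the two dimensions."""
    s = a * (w_left - w_right) + b * (d_left - d_right)
    if s > 0:
        return "left"
    if s < 0:
        return "right"
    return "balance"
```

With b = 0 this degenerates to comparing weights only, resembling the simplest rule-based strategy; letting both dimensions contribute continuously is what distinguishes the information-integration account from discrete rules.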

  14. Display rules versus display autonomy: emotion regulation, emotional exhaustion, and task performance in a call center simulation.

    Science.gov (United States)

    Goldberg, Lori Sideman; Grandey, Alicia A

    2007-07-01

    "Service with a smile" is satisfying for the customer, but such display rules may be costly to the employee and the organization. Most previous research on such costs has used self-reported and cross-sectional designs. The authors use an experimental approach to test tenets of resource depletion theories; specifically, whether the self-regulation of emotions required by display rules depletes energy and attentional resources during a service encounter. Using a call center simulation with three "customer" interactions, the authors found that participants given positive display rules (e.g., be enthusiastic and hide frustration) reported more postsimulation exhaustion and made more errors on the order form compared to those with display autonomy. Customer hostility during one of the calls also increased exhaustion overall and the number of errors during that specific call, though proposed interactions with display rules were not supported. Surface-level emotion regulation, but not deep-level, was the mechanism for the energy depletion effect of display rules, while display rules had a direct effect on performance decrements. Theoretical and practical implications for display rules as part of job requirements are discussed. Copyright 2007 APA

  15. Choice Rules and Accumulator Networks

    Science.gov (United States)

    2015-01-01

    This article presents a preference accumulation model that can be used to implement a number of different multi-attribute heuristic choice rules, including the lexicographic rule, the majority of confirming dimensions (tallying) rule and the equal weights rule. The proposed model differs from existing accumulators in terms of attribute representation: Leakage and competition, typically applied only to preference accumulation, are also assumed to be involved in processing attribute values. This allows the model to perform a range of sophisticated attribute-wise comparisons, including comparisons that compute relative rank. The ability of a preference accumulation model composed of leaky competitive networks to mimic symbolic models of heuristic choice suggests that these 2 approaches are not incompatible, and that a unitary cognitive model of preferential choice, based on insights from both these approaches, may be feasible. PMID:28670592
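The three heuristic rules named in this abstract have simple symbolic forms, which the accumulator network is shown to mimic; a generic sketch of those symbolic rules (not the article's network implementation):

```python
def lexicographic(options, attr_order):
    """Choose by the most important attribute; break ties with the next.
    options: list of equal-length attribute tuples (higher is better);
    attr_order: attribute indices in decreasing importance."""
    candidates = list(range(len(options)))
    for attr in attr_order:
        best = max(options[i][attr] for i in candidates)
        candidates = [i for i in candidates if options[i][attr] == best]
        if len(candidates) == 1:
            break
    return candidates[0]

def tallying(a, b):
    """Majority of confirming dimensions: pick the option that wins on
    more attributes (0 for a on ties)."""
    wins_a = sum(x > y for x, y in zip(a, b))
    wins_b = sum(y > x for x, y in zip(a, b))
    return 0 if wins_a >= wins_b else 1

def equal_weights(options):
    """Choose the option with the largest unweighted sum of attributes."""
    return max(range(len(options)), key=lambda i: sum(options[i]))
```

The article's point is that leaky, competitive attribute processing lets a single accumulator architecture reproduce each of these discrete rules, including the rank comparisons that tallying requires.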

  16. Evaluation and modelling of SWIW tests performed within the SKB site characterisation programme

    International Nuclear Information System (INIS)

    Nordqvist, Rune

    2008-08-01

    In this report, a comprehensive overview of SWIW (Single Well Injection-Withdrawal) tests carried out within the SKB site investigations at Oskarshamn and Forsmark is presented. The purpose of this study is to make a general review and comparison of the SWIW tests performed within the site investigation programmes at the two sites. The study summarises experimental conditions for each test and discusses factors that may influence the experimental results and the evaluation of the tests. Further, an extended model evaluation is carried out using a one-dimensional radial flow and transport model with matrix diffusion and matrix sorption. The intended outcome is an improved understanding of various mechanisms that may influence the SWIW test results and also an improved interpretation of the tests. Six SWIW tests at each site have been carried out, generally resulting in high-quality and well documented experimental data with high tracer recovery. The tests have been performed in surface boreholes at repository depth, at approximately 300 to 700 m borehole length. In all of the tests, a non-sorbing tracer (Uranine) and one or two sorbing tracers (cesium and rubidium) have been used simultaneously. A general result is that all of the tests demonstrate a very clear and relatively large retardation effect for the sorbing tracers. Basic initial modelling of the SWIW test data, using a one-dimensional radial flow model with advection and dispersion, generally resulted in relatively good agreement between model and experimental data. However, a consistent feature of the initial modelling was a discrepancy between model and experimental data in the later parts of the recovery tracer breakthrough curve. It was concluded that this was likely caused by processes occurring in the tested rock formation, and therefore an extended model evaluation (presented in this report) including matrix diffusion was carried out on all of the performed tests. Evaluated retardation

  17. Evaluation and modelling of SWIW tests performed within the SKB site characterisation programme

    Energy Technology Data Exchange (ETDEWEB)

    Nordqvist, Rune (Geosigma AB, Uppsala (SE))

    2008-08-15

    In this report, a comprehensive overview of SWIW (Single Well Injection-Withdrawal) tests carried out within the SKB site investigations at Oskarshamn and Forsmark is presented. The purpose of this study is to make a general review and comparison of the SWIW tests performed within the site investigation programmes at the two sites. The study summarises experimental conditions for each test and discusses factors that may influence the experimental results and the evaluation of the tests. Further, an extended model evaluation is carried out using a one-dimensional radial flow and transport model with matrix diffusion and matrix sorption. The intended outcome is an improved understanding of various mechanisms that may influence the SWIW test results and also an improved interpretation of the tests. Six SWIW tests at each site have been carried out, generally resulting in high-quality and well documented experimental data with high tracer recovery. The tests have been performed in surface boreholes at repository depth, at approximately 300 to 700 m borehole length. In all of the tests, a non-sorbing tracer (Uranine) and one or two sorbing tracers (cesium and rubidium) have been used simultaneously. A general result is that all of the tests demonstrate a very clear and relatively large retardation effect for the sorbing tracers. Basic initial modelling of the SWIW test data, using a one-dimensional radial flow model with advection and dispersion, generally resulted in relatively good agreement between model and experimental data. However, a consistent feature of the initial modelling was a discrepancy between model and experimental data in the later parts of the recovery tracer breakthrough curve. It was concluded that this was likely caused by processes occurring in the tested rock formation, and therefore an extended model evaluation (presented in this report) including matrix diffusion was carried out on all of the performed tests. Evaluated retardation

  18. Inflexibility and independence: Phase transitions in the majority-rule model.

    Science.gov (United States)

    Crokidakis, Nuno; de Oliveira, Paulo Murilo Castro

    2015-12-01

    In this work we study opinion formation in a population participating in a public debate with two distinct choices. We consider three distinct mechanisms of social interactions and individuals' behavior: conformity, nonconformity, and inflexibility. The conformity is ruled by the majority-rule dynamics, whereas the nonconformity is introduced in the population as an independent behavior, implying the failure of attempted group influence. Finally, the inflexible agents are introduced in the population with a given density. These individuals present a singular behavior, in a way that their stubbornness makes them reluctant to change their opinions. We consider these effects separately and all together, with the aim to analyze the critical behavior of the system. We perform numerical simulations in some lattice structures and for distinct population sizes. Our results suggest that the different formulations of the model undergo order-disorder phase transitions in the same universality class as the Ising model. Some of our results are complemented by analytical calculations.
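A minimal Monte Carlo sketch of these dynamics, using mean-field groups of three agents rather than the lattice structures studied in the paper (parameter names and the exact form of the independence move are illustrative):

```python
import random

def majority_rule_step(opinions, inflexible, p_independent, rng):
    """One update of a majority-rule model with independence and
    inflexibility. opinions: list of +1/-1; inflexible: parallel list
    of bools marking agents that never change opinion."""
    n = len(opinions)
    group = rng.sample(range(n), 3)
    if rng.random() < p_independent:
        # Independence: group influence fails; one flexible member
        # may flip at random.
        i = rng.choice(group)
        if not inflexible[i] and rng.random() < 0.5:
            opinions[i] = -opinions[i]
    else:
        # Conformity: flexible members adopt the local majority opinion.
        majority = 1 if sum(opinions[i] for i in group) > 0 else -1
        for i in group:
            if not inflexible[i]:
                opinions[i] = majority
    return opinions
```

With p_independent = 0 and no inflexible agents, full consensus is an absorbing state; independence and inflexibility keep the system away from consensus, which is the competition behind the order-disorder transitions the paper analyses.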

  19. Computational modeling of spiking neural network with learning rules from STDP and intrinsic plasticity

    Science.gov (United States)

    Li, Xiumin; Wang, Wei; Xue, Fangzheng; Song, Yongduan

    2018-02-01

    Recently there has been continuously increasing interest in building computational models of spiking neural networks (SNN), such as the Liquid State Machine (LSM). Biologically inspired self-organized neural networks with neural plasticity can enhance computational performance, with the characteristic features of dynamical memory and recurrent connection cycles which distinguish them from the more widely used feedforward neural networks. Although a variety of computational models for brain-like learning and information processing have been proposed, the modeling of self-organized neural networks with multiple forms of neural plasticity is still an important open challenge. The main difficulties lie in the interplay among different neural plasticity rules and in understanding how the structure and dynamics of neural networks shape computational performance. In this paper, we propose a novel approach to developing models of LSM with a biologically inspired self-organizing network based on two neural plasticity learning rules. The connectivity among excitatory neurons is adapted by spike-timing-dependent plasticity (STDP) learning; meanwhile, the degree of neuronal excitability is regulated to maintain a moderate average activity level by another learning rule: intrinsic plasticity (IP). Our study shows that LSM with STDP+IP performs better than LSM with a random SNN or an SNN obtained by STDP alone. The noticeable improvement with the proposed method is due to the better reflected competition among different neurons in the developed SNN model, as well as the more effectively encoded and processed relevant dynamic information with its learning and self-organizing mechanism. This result gives insights into the optimization of computational models of spiking neural networks with neural plasticity.
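The two plasticity rules can be sketched in their standard textbook forms (generic pair-based STDP and homeostatic threshold regulation, with illustrative constants, not the paper's exact parameterization):

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight change for spike-time difference
    dt = t_post - t_pre (ms): potentiation when the presynaptic spike
    precedes the postsynaptic one, depression otherwise."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    return -a_minus * math.exp(dt / tau)

def intrinsic_plasticity(threshold, rate, target_rate, eta=0.001):
    """IP as homeostatic regulation: raise the firing threshold when the
    neuron fires above its target rate, lower it when below, so the
    average activity level stays moderate."""
    return threshold + eta * (rate - target_rate)
```

STDP shapes which connections strengthen, while IP keeps individual excitability near a set point; the paper's result is that the two rules together organize the reservoir better than either random wiring or STDP alone.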

  20. Testing decision rules for categorizing species' extinction risk to help develop quantitative listing criteria for the U.S. Endangered Species Act.

    Science.gov (United States)

    Regan, Tracey J; Taylor, Barbara L; Thompson, Grant G; Cochrane, Jean Fitts; Ralls, Katherine; Runge, Michael C; Merrick, Richard

    2013-08-01

    Lack of guidance for interpreting the definitions of endangered and threatened in the U.S. Endangered Species Act (ESA) has resulted in case-by-case decision making, leaving the process vulnerable to being considered arbitrary or capricious. Adopting quantitative decision rules would remedy this but requires the agency to specify the relative urgency concerning extinction events over time, cutoff risk values corresponding to different levels of protection, and the importance given to different types of listing errors. We tested the performance of 3 sets of decision rules that use alternative functions for weighting the relative urgency of future extinction events: a threshold rule set, which uses a decision rule of x% probability of extinction over y years; a concave rule set, where the relative importance of future extinction events declines exponentially over time; and a shoulder rule set that uses a sigmoid shape function, where relative importance declines slowly at first and then more rapidly. We obtained decision cutoffs by interviewing several biologists and then emulated the listing process with simulations that covered a range of extinction risks typical of ESA listing decisions. We evaluated performance of the decision rules under different data quantities and qualities on the basis of the relative importance of misclassification errors. Although there was little difference between the performance of alternative decision rules for correct listings, the distribution of misclassifications differed depending on the function used. Misclassifications for the threshold and concave listing criteria resulted in more overprotection errors, particularly as uncertainty increased, whereas errors for the shoulder listing criteria were more symmetrical. We developed and tested the framework for quantitative decision rules for listing species under the U.S. ESA. If policy values can be agreed on, use of this framework would improve the implementation of the ESA by

  1. Understanding protocol performance: impact of test performance.

    Science.gov (United States)

    Turner, Robert G

    2013-01-01

    This is the second of two articles that examine the factors that determine protocol performance. The objective of these articles is to provide a general understanding of protocol performance that can be used to estimate performance, establish limits on performance, decide if a protocol is justified, and ultimately select a protocol. The first article was concerned with protocol criterion and test correlation. It demonstrated the advantages and disadvantages of different criteria when all tests had the same performance. It also examined the impact of increasing test correlation on protocol performance and the characteristics of the different criteria. This article examines the impact on protocol performance when individual tests in a protocol have different performance. This is evaluated for different criteria and test correlations, and the results of the two articles are combined and summarized. A mathematical model is used to calculate protocol performance for different protocol criteria and test correlations when there are small to large variations in the performance of individual tests in the protocol. The performance of the individual tests that make up a protocol has a significant impact on the performance of the protocol. As expected, the better the performance of the individual tests, the better the performance of the protocol. Many of the characteristics of the different criteria are relatively independent of the variation in the performance of the individual tests. However, increasing test variation degrades some of the criteria's advantages and causes a new disadvantage to appear. This negative impact increases as test variation increases and as more tests are added to the protocol. Best protocol performance is obtained when individual tests are uncorrelated and have the same performance. In general, the greater the variation in the performance of tests in the protocol, the more detrimental this variation is to protocol performance. Since this negative impact is increased as

  2. Hierarchical graphs for rule-based modeling of biochemical systems

    Directory of Open Access Journals (Sweden)

    Hu Bin

    2011-02-01

    Full Text Available Abstract Background In rule-based modeling, graphs are used to represent molecules: a colored vertex represents a component of a molecule, a vertex attribute represents the internal state of a component, and an edge represents a bond between components. Components of a molecule share the same color. Furthermore, graph-rewriting rules are used to represent molecular interactions. A rule that specifies addition (removal) of an edge represents a class of association (dissociation) reactions, and a rule that specifies a change of a vertex attribute represents a class of reactions that affect the internal state of a molecular component. A set of rules comprises an executable model that can be used to determine, through various means, the system-level dynamics of molecular interactions in a biochemical system. Results For purposes of model annotation, we propose the use of hierarchical graphs to represent structural relationships among components and subcomponents of molecules. We illustrate how hierarchical graphs can be used to naturally document the structural organization of the functional components and subcomponents of two proteins: the protein tyrosine kinase Lck and the T cell receptor (TCR) complex. We also show that computational methods developed for regular graphs can be applied to hierarchical graphs. In particular, we describe a generalization of Nauty, a graph isomorphism and canonical labeling algorithm. The generalized version of the Nauty procedure, which we call HNauty, can be used to assign canonical labels to hierarchical graphs or more generally to graphs with multiple edge types. The difference between the Nauty and HNauty procedures is minor, but for completeness, we provide an explanation of the entire HNauty algorithm. Conclusions Hierarchical graphs provide more intuitive formal representations of proteins and other structured molecules with multiple functional components than do the regular graphs of current languages for

  3. Strategy-Driven Exploration for Rule-Based Models of Biochemical Systems with Porgy

    OpenAIRE

    Andrei , Oana; Fernández , Maribel; Kirchner , Hélène; Pinaud , Bruno

    2016-01-01

    This paper presents Porgy – an interactive visual environment for rule-based modelling of biochemical systems. We model molecules and molecule interactions as port graphs and port graph rewrite rules, respectively. We use rewriting strategies to control which rules to apply, and where and when to apply them. Our main contributions to rule-based modelling of biochemical systems lie in the strategy language and the associated visual and interactive features offered by Porgy. These features faci...

  4. Synthesizing Service Composition Models on the Basis of Temporal Business Rules

    Institute of Scientific and Technical Information of China (English)

    Jian Yu; Yan-Bo Han; Jun Han; Yan Jin; Paolo Falcarin; Maurizio Morisio

    2008-01-01

    Transformational approaches to generating design and implementation models from requirements can bring effectiveness and quality to software development. In this paper we present a framework and associated techniques to generate the process model of a service composition from a set of temporal business rules. Dedicated techniques including pathfinding, branching structure identification and parallel structure identification are used for semi-automatically synthesizing the process model from the semantics-equivalent Finite State Automata of the rules. These process models naturally satisfy the prescribed behavioral constraints of the rules. With the domain knowledge encoded in the temporal business rules, an executable service composition program, e.g., a BPEL program, can be further generated from the process models. A running example in the e-business domain is used for illustrating our approach throughout this paper.

  5. An empirical model to describe performance degradation for warranty abuse detection in portable electronics

    International Nuclear Information System (INIS)

    Oh, Hyunseok; Choi, Seunghyuk; Kim, Keunsu; Youn, Byeng D.; Pecht, Michael

    2015-01-01

    Portable electronics makers have introduced liquid damage indicators (LDIs) into their products to detect warranty abuse caused by water damage. However, under certain conditions, these indicators can exhibit inconsistencies in detecting liquid damage. This study is motivated by the fact that the reliability of LDIs in portable electronics is in question. In this paper, first, a scheme of life tests is devised for LDIs in conjunction with a robust color classification rule. Second, a degradation model is proposed by considering two physical mechanisms for LDIs: (1) phase change from vapor to water and (2) water transport in the porous paper. Finally, the degradation model is validated with additional tests using actual smartphone sets subjected to thermal cycling from −15 °C to 25 °C at a relative humidity of 95%. By employing the innovative life testing scheme and the novel performance degradation model, it is expected that the performance of LDIs for a particular application can be assessed quickly and accurately. - Highlights: • Devise an efficient scheme of life testing for a warranty abuse detector in portable electronics. • Develop a performance degradation model for the warranty abuse detector used in portable electronics. • Validate the performance degradation model with life tests of actual smartphone sets. • Help make a decision on warranty service in portable electronics manufacturers

  6. Transition sum rules in the shell model

    Science.gov (United States)

    Lu, Yi; Johnson, Calvin W.

    2018-03-01

    An important characterization of electromagnetic and weak transitions in atomic nuclei is given by sum rules. We focus on the non-energy-weighted sum rule (NEWSR), or total strength, and the energy-weighted sum rule (EWSR); the ratio of the EWSR to the NEWSR is the centroid or average energy of transition strengths from a nuclear initial state to all allowed final states. These sum rules can be expressed as expectation values of operators, which in the case of the EWSR is a double commutator. While most prior applications of the double commutator have been to special cases, we derive general formulas for matrix elements of both operators in a shell model framework (occupation space), given the input matrix elements for the nuclear Hamiltonian and for the transition operator. With these new formulas, we easily evaluate centroids of transition strength functions, with no need to calculate daughter states. We apply this simple tool to a number of nuclides, demonstrate that the sum rules follow smooth secular behavior as a function of initial energy, and compare the electric dipole (E1) sum rule against the famous Thomas-Reiche-Kuhn version. We also find surprising systematic behaviors for ground-state electric quadrupole (E2) centroids in the sd shell.
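    The centroid defined here (EWSR divided by NEWSR) is easy to illustrate numerically. The toy strength distribution below is invented for the example, not taken from the paper:

```python
def newsr(strengths):
    """Non-energy-weighted sum rule: total transition strength."""
    return sum(b for _, b in strengths)

def ewsr(strengths):
    """Energy-weighted sum rule: sum of E_f * B(E_f) over final states."""
    return sum(e * b for e, b in strengths)

def centroid(strengths):
    """Average transition energy: EWSR / NEWSR."""
    return ewsr(strengths) / newsr(strengths)

# hypothetical (energy, strength) pairs for a single initial state
demo = [(2.0, 1.0), (4.0, 3.0), (6.0, 1.0)]
print(centroid(demo))  # → 4.0
```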

  7. Nonlinear adaptive synchronization rule for identification of a large amount of parameters in dynamical models

    International Nuclear Information System (INIS)

    Ma Huanfei; Lin Wei

    2009-01-01

    The existing adaptive synchronization technique based on the stability theory and invariance principle of dynamical systems, though theoretically proved valid for parameter identification in specific models, often shows a slow convergence rate and even fails in practice when the number of parameters becomes large. Here, a novel nonlinear adaptive rule for parameter updates is proposed to accelerate convergence. Its feasibility is validated by analytical arguments as well as by parameter identification in the Lotka-Volterra model with multiple species. Two adjustable factors in this rule influence the identification accuracy, so a proper choice of these factors leads to an optimal performance of the rule. In addition, a feasible method is proposed for avoiding the occurrence of approximate linear dependence among terms with parameters on the synchronized manifold.

  8. Is the Factor-of-2 Rule Broadly Applicable for Evaluating the Prediction Accuracy of Metal-Toxicity Models?

    Science.gov (United States)

    Meyer, Joseph S; Traudt, Elizabeth M; Ranville, James F

    2018-01-01

    In aquatic toxicology, a toxicity-prediction model is generally deemed acceptable if its predicted median lethal concentrations (LC50 values) or median effect concentrations (EC50 values) are within a factor of 2 of their paired, observed LC50 or EC50 values. However, that rule of thumb is based on results from only two studies: multiple LC50 values for the fathead minnow (Pimephales promelas) exposed to Cu in one type of exposure water, and multiple EC50 values for Daphnia magna exposed to Zn in another type of exposure water. We tested whether the factor-of-2 rule of thumb also is supported in a different dataset in which D. magna were exposed separately to Cd, Cu, Ni, or Zn. Overall, the factor-of-2 rule of thumb appeared to be a good guide to evaluating the acceptability of a toxicity model's underprediction or overprediction of observed LC50 or EC50 values in these acute toxicity tests.
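    The factor-of-2 rule of thumb tested in this record reduces to a one-line check; a minimal sketch (function name ours):

```python
def within_factor_of_2(predicted, observed):
    """True if a predicted LC50/EC50 lies within a factor of 2 of the
    paired observed value, i.e. 0.5 <= predicted/observed <= 2."""
    if predicted <= 0 or observed <= 0:
        raise ValueError("concentrations must be positive")
    ratio = predicted / observed
    return 0.5 <= ratio <= 2.0
```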

  9. Mindset Changes Lead to Drastic Impairments in Rule Finding

    Science.gov (United States)

    ErEl, Hadas; Meiran, Nachshon

    2011-01-01

    Rule finding is an important aspect of human reasoning and flexibility. Previous studies associated rule finding "failure" with past experience with the test stimuli and stable personality traits. We additionally show that rule finding performance is severely impaired by a mindset associated with applying an instructed rule. The mindset was…

  10. Methods for Quantifying the Uncertainties of LSIT Test Parameters, Test Results, and Full-Scale Mixing Performance Using Models Developed from Scaled Test Data

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, Gregory F. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Cooley, Scott K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kuhn, William L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rector, David R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Heredia-Langner, Alejandro [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-05-01

    This report discusses the statistical methods for quantifying uncertainties in 1) test responses and other parameters in the Large Scale Integrated Testing (LSIT), and 2) estimates of coefficients and predictions of mixing performance from models that relate test responses to test parameters. Testing at a larger scale has been committed to by Bechtel National, Inc. and the U.S. Department of Energy (DOE) to “address uncertainties and increase confidence in the projected, full-scale mixing performance and operations” in the Waste Treatment and Immobilization Plant (WTP).

  11. Decision Rules, Trees and Tests for Tables with Many-valued Decisions–comparative Study

    KAUST Repository

    Azad, Mohammad; Zielosko, Beata; Moshkov, Mikhail; Chikalov, Igor

    2013-01-01

    In this paper, we present three approaches for the construction of decision rules for decision tables with many-valued decisions. We construct decision rules directly for the rows of a decision table, based on paths in a decision tree, and based on attributes contained in a test (super-reduct). Experimental results for data sets taken from the UCI Machine Learning Repository contain a comparison of the maximum and average length of the rules for the mentioned approaches.

  12. Decision Rules, Trees and Tests for Tables with Many-valued Decisions–comparative Study

    KAUST Repository

    Azad, Mohammad

    2013-10-04

    In this paper, we present three approaches for the construction of decision rules for decision tables with many-valued decisions. We construct decision rules directly for the rows of a decision table, based on paths in a decision tree, and based on attributes contained in a test (super-reduct). Experimental results for data sets taken from the UCI Machine Learning Repository contain a comparison of the maximum and average length of the rules for the mentioned approaches.

  13. Simple and complex rule induction performance in young and older adults: contribution of episodic memory and working memory

    NARCIS (Netherlands)

    Oosterman, J.M.; Boeschoten, M.S.; Eling, P.A.; Kessels, R.P.C.; Maes, J.H.

    2014-01-01

    This study tested the hypothesis that part of the age-related decline in performance on executive function tasks is due to a decline in episodic memory. For this, we developed a rule induction task in which we manipulated the involvement of episodic memory and executive control processes; age

  14. Methods for Quantifying the Uncertainties of LSIT Test Parameters, Test Results, and Full-Scale Mixing Performance Using Models Developed from Scaled Test Data

    International Nuclear Information System (INIS)

    Piepel, Gregory F.; Cooley, Scott K.; Kuhn, William L.; Rector, David R.; Heredia-Langner, Alejandro

    2015-01-01

    This report discusses the statistical methods for quantifying uncertainties in 1) test responses and other parameters in the Large Scale Integrated Testing (LSIT), and 2) estimates of coefficients and predictions of mixing performance from models that relate test responses to test parameters. Testing at a larger scale has been committed to by Bechtel National, Inc. and the U.S. Department of Energy (DOE) to “address uncertainties and increase confidence in the projected, full-scale mixing performance and operations” in the Waste Treatment and Immobilization Plant (WTP).

  15. Fiber Bragg Grating-Based Performance Monitoring of Piles Fiber in a Geotechnical Centrifugal Model Test

    Directory of Open Access Journals (Sweden)

    Xiaolin Weng

    2014-01-01

    Full Text Available In centrifugal tests, conventional sensors can hardly capture the performance of reinforcement in small-scale models. However, recent advances in fiber optic sensing technologies enable the accurate and reliable monitoring of strain and temperature in laboratory geotechnical tests. This paper outlines a centrifugal model test, performed using a 60 g-ton geocentrifuge, to investigate the performance of pipe piles used to reinforce the loess foundation below a widened embankment. Prior to the test, quasidistributed fiber Bragg grating (FBG) strain sensors were attached to the surface of the pipe piles to measure the lateral friction resistance in real time. Via the centrifuge actuator, the driving of pipe piles was simulated. During testing, the variations of the skin friction distribution along the pipe piles were measured automatically using an optical fiber interrogator. This paper presents a detailed analysis of the monitoring results. We verify the reliability of the fiber optic sensors in monitoring the model piles without affecting the integrity of the centrifugal model. We furthermore show that lateral friction resistance developed in stages as the pipe piles were pressed in and that it sometimes became negative.

  16. Bayesian hypothesis testing and the maintenance rule

    International Nuclear Information System (INIS)

    Kelly, D.L.

    1997-01-01

    The Maintenance Rule (10 CFR 50.65) went into effect in the United States in July 1996. It requires commercial nuclear utilities to monitor system performance (system reliability and maintenance unavailability) for systems that are determined by the utility to be important to plant safety. Utilities must set performance goals for such systems and monitor system performance against these goals. In addition, these performance goals are intended to be commensurate with the safety significance of the system, which can be established by a probabilistic safety assessment of the plant. The author examines the frequentist approach to monitoring performance, which is being used by several utilities, and proposes an alternative Bayesian approach. The Bayesian approach makes more complete use of the information in the probabilistic safety assessment, is consistent philosophically with the subjective interpretation given to probability in most probabilistic safety assessments, overcomes several pitfalls in the frequentist approach, provides results which are easily interpretable, and is straightforward to implement using the information in the probabilistic safety assessment.
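    As a hedged illustration of the Bayesian approach described (not the author's exact formulation), a conjugate Beta-Binomial update lets a utility compute the posterior probability that a system's failure-on-demand probability meets its performance goal. The numerical CDF below is stdlib-only; SciPy's `beta.cdf` would normally be used instead:

```python
from math import exp, lgamma

def beta_posterior(alpha, beta, failures, demands):
    """Conjugate update: Beta(alpha, beta) prior on the failure probability
    plus Binomial data (failures out of demands) gives a Beta posterior."""
    return alpha + failures, beta + demands - failures

def beta_cdf(x, alpha, beta, steps=100_000):
    """P(p <= x) under Beta(alpha, beta), by midpoint integration."""
    log_norm = lgamma(alpha + beta) - lgamma(alpha) - lgamma(beta)
    dx = x / steps
    total = sum(((i + 0.5) * dx) ** (alpha - 1) * (1 - (i + 0.5) * dx) ** (beta - 1)
                for i in range(steps)) * dx
    return exp(log_norm) * total

# hypothetical data: one failure in 50 demands, uniform Beta(1, 1) prior;
# performance goal: failure probability no greater than 0.05
a, b = beta_posterior(1.0, 1.0, 1, 50)
prob_goal_met = beta_cdf(0.05, a, b)
```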

  17. Association Rule-based Predictive Model for Machine Failure in Industrial Internet of Things

    Science.gov (United States)

    Kwon, Jung-Hyok; Lee, Sol-Bee; Park, Jaehoon; Kim, Eui-Jik

    2017-09-01

    This paper proposes an association rule-based predictive model for machine failure in the industrial Internet of things (IIoT), which can accurately predict machine failure in a real manufacturing environment by investigating the relationship between the cause and type of machine failure. To develop the predictive model, we consider three major steps: 1) binarization, 2) rule creation, and 3) visualization. The binarization step translates item values in a dataset into ones and zeros; the rule creation step then creates association rules as IF-THEN structures using the Lattice model and the Apriori algorithm. Finally, the created rules are visualized in various ways for users’ understanding. An experimental implementation was conducted using R Studio version 3.3.2. The results show that the proposed predictive model realistically predicts machine failure based on association rules.
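    The first two steps described (binarization and rule creation) can be sketched. This brute-force miner over item pairs illustrates the support/confidence logic that Apriori computes more efficiently; the failure-cause item names and thresholds are invented for the example:

```python
from itertools import combinations

def binarize(records, items):
    """Step 1: translate item presence into ones and zeros."""
    return [[1 if item in rec else 0 for item in items] for rec in records]

def association_rules(transactions, min_support=0.4, min_confidence=0.7):
    """Step 2: brute-force IF-THEN rules over item pairs; Apriori reaches
    the same rules while pruning the search space."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    rules = []
    for a, b in combinations(items, 2):
        for lhs, rhs in ((a, b), (b, a)):
            both = sum(1 for t in transactions if lhs in t and rhs in t)
            lhs_count = sum(1 for t in transactions if lhs in t)
            if lhs_count and both / n >= min_support and both / lhs_count >= min_confidence:
                rules.append((lhs, rhs))  # IF lhs THEN rhs
    return rules
```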

  18. Optimization of conventional rule curves coupled with hedging rules for reservoir operation

    DEFF Research Database (Denmark)

    Taghian, Mehrdad; Rosbjerg, Dan; Haghighi, Ali

    2014-01-01

    As a common approach to reservoir operating policies, water levels at the end of each time interval should be kept at or above the rule curve. In this study, the policy is captured using rationing of the target yield to reduce the intensity of severe water shortages. For this purpose, a hybrid model is developed to optimize simultaneously both the conventional rule curve and the hedging rule. In the compound model, a simple genetic algorithm is coupled with a simulation program, including an inner linear programming algorithm. In this way, operational policies are imposed by priority concepts to achieve the optimal water allocation and the target storage levels for reservoirs. As a case study, a multipurpose, multireservoir system in southern Iran is selected. The results show that the model has good performance in extracting the optimum policy for reservoir operation under both normal ...
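    The hedging idea described (rationing the target yield during shortages) can be illustrated with a simple one-point hedging rule. The trigger volume and rationing factor below are invented parameters, not the optimized values from the study:

```python
def hedged_release(storage, inflow, target_demand, trigger, ratio):
    """One-point hedging: ration the target release by a fixed factor
    whenever available water drops below the trigger volume."""
    available = storage + inflow
    release = target_demand if available >= trigger else ratio * target_demand
    return min(release, available)  # never release more than is available
```

    With a trigger of 80 units and a rationing factor of 0.6, a reservoir holding 50 units that receives 10 units of inflow releases only 60% of a 20-unit demand, preserving storage for future shortage periods.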

  19. Decision making under internal uncertainty: the case of multiple-choice tests with different scoring rules.

    Science.gov (United States)

    Bereby-Meyer, Yoella; Meyer, Joachim; Budescu, David V

    2003-02-01

    This paper assesses framing effects on decision making with internal uncertainty, i.e., partial knowledge, by focusing on examinees' behavior in multiple-choice (MC) tests with different scoring rules. In two experiments participants answered a general-knowledge MC test that consisted of 34 solvable and 6 unsolvable items. Experiment 1 studied two scoring rules involving Positive (only gains) and Negative (only losses) scores. Although answering all items was the dominating strategy for both rules, the results revealed a greater tendency to answer under the Negative scoring rule. These results are in line with the predictions derived from Prospect Theory (PT) [Econometrica 47 (1979) 263]. The second experiment studied two scoring rules, which allowed respondents to exhibit partial knowledge. Under the Inclusion-scoring rule the respondents mark all answers that could be correct, and under the Exclusion-scoring rule they exclude all answers that might be incorrect. As predicted by PT, respondents took more risks under the Inclusion rule than under the Exclusion rule. The results illustrate that the basic process that underlies choice behavior under internal uncertainty and especially the effect of framing is similar to the process of choice under external uncertainty and can be described quite accurately by PT. Copyright 2002 Elsevier Science B.V.
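    The Prospect Theory predictions invoked here rest on the Kahneman-Tversky value function (risk-averse over gains, loss-averse and risk-seeking over losses). A sketch with the commonly cited 1992 parameter estimates, which are not from this paper; a full account of the answering decision would also need PT's probability weighting:

```python
def pt_value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky (1992) value function: concave over gains,
    convex and loss-averse (factor lam) over losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha
```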

  20. Early implementation of the Maintenance Rule

    International Nuclear Information System (INIS)

    Green, J.R.

    1995-01-01

    On July 10, 1991, the U.S. Nuclear Regulatory Commission (NRC) published the Maintenance Rule as Section 50.65 of 10 CFR 50. The purpose of the Maintenance Rule is to ensure that the effectiveness of maintenance activities is assessed on an ongoing basis in a manner that ensures key structures, systems, and components are capable of performing their intended function. Full implementation of the rule is required by July 10, 1996. On May 31, 1994, the NRC published Generic Letter 94-01, which allowed removal of technical specification accelerated testing and special reporting requirements for emergency diesel generators provided a program is implemented for monitoring and maintaining emergency diesel generator performance consistent with the provisions of the Maintenance Rule

  1. Thurstonian models for sensory discrimination tests as generalized linear models

    DEFF Research Database (Denmark)

    Brockhoff, Per B.; Christensen, Rune Haubo Bojesen

    2010-01-01

    Sensory discrimination tests such as the triangle, duo-trio, 2-AFC and 3-AFC tests produce binary data, and the Thurstonian decision rule links the underlying sensory difference δ to the observed number of correct responses. In this paper it is shown how each of these four situations can be viewed as a so-called generalized linear model. The underlying sensory difference δ becomes directly a parameter of the statistical model, and the estimate d' and its standard error become the "usual" output of the statistical analysis. The d' for the monadic A-NOT A method is shown to appear as a standard ...
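    For the 2-AFC protocol mentioned here, the Thurstonian link between proportion correct and the estimated sensory difference has a closed form, d' = √2 · Φ⁻¹(pc); the triangle and duo-trio links require numerical integration and are omitted. A minimal sketch:

```python
from statistics import NormalDist

def dprime_2afc(pc):
    """Thurstonian d' for 2-AFC: d' = sqrt(2) * inverse-normal of the
    proportion correct."""
    if not 0.5 < pc < 1.0:
        raise ValueError("proportion correct must lie in (0.5, 1)")
    return 2 ** 0.5 * NormalDist().inv_cdf(pc)
```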

  2. Delta Learning Rule for the Active Sites Model

    OpenAIRE

    Lingashetty, Krishna Chaithanya

    2010-01-01

    This paper reports results on methods of comparing the memory retrieval capacity of a Hebbian neural network that implements the B-Matrix approach, using the Widrow-Hoff rule of learning. We then extend the recently proposed Active Sites model by developing a delta rule to increase memory capacity. This paper also extends the binary neural network to a multi-level (non-binary) neural network.
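    The Widrow-Hoff (delta) rule referenced here has a standard form, w ← w + η(t − y)x with a linear output y = w·x. A minimal single-output sketch; the Active Sites and B-Matrix specifics of the paper are not reproduced:

```python
def delta_rule_epoch(weights, samples, eta=0.1):
    """One epoch of Widrow-Hoff updates over (input vector, target) pairs,
    using a linear output unit y = w . x."""
    for x, target in samples:
        y = sum(w * xi for w, xi in zip(weights, x))
        error = target - y
        weights = [w + eta * error * xi for w, xi in zip(weights, x)]
    return weights
```

    Iterating the epoch shrinks the squared error; on a single pattern the weight converges geometrically to the least-squares solution.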

  3. FeynRules - Feynman rules made easy

    OpenAIRE

    Christensen, Neil D.; Duhr, Claude

    2008-01-01

    In this paper we present FeynRules, a new Mathematica package that facilitates the implementation of new particle physics models. After the user implements the basic model information (e.g. particle content, parameters and Lagrangian), FeynRules derives the Feynman rules and stores them in a generic form suitable for translation to any Feynman diagram calculation program. The model can then be translated to the format specific to a particular Feynman diagram calculator via F...

  4. Comparative analysis of diagnostic performance, feasibility and cost of different test-methods for thyroid nodules with indeterminate cytology.

    Science.gov (United States)

    Sciacchitano, Salvatore; Lavra, Luca; Ulivieri, Alessandra; Magi, Fiorenza; De Francesco, Gian Paolo; Bellotti, Carlo; Salehi, Leila B; Trovato, Maria; Drago, Carlo; Bartolazzi, Armando

    2017-07-25

    Since it is impossible to recognize malignancy at fine needle aspiration (FNA) cytology in indeterminate thyroid nodules, surgery is recommended for all of them. However, cancer rate at final histology is blood assay. We performed systematic reviews and meta-analyses to compare their features, feasibility, diagnostic performance and cost. GEC, GEC+BRAF, M/F panel+miRNA GEC and M/F panel by NGS were the best at ruling out malignancy (sensitivity = 90%, 89%, 89% and 90%, respectively). BRAF and M/F panel alone and by NGS were the best at ruling in malignancy (specificity = 100%, 93% and 93%). The M/F panel by NGS showed the highest accuracy (92%) and BRAF the highest diagnostic odds ratio (DOR) (247). GAL-3-ICC performed well as a rule-out (sensitivity = 83%) and rule-in test (specificity = 85%), with good accuracy (84%) and high DOR (27), and is one of the cheapest (113 USD) and easiest to perform in different clinical settings. In conclusion, the more accurate molecular-based test-methods are still expensive and restricted to a few highly specialized and centralized laboratories. GAL-3-ICC, although limited by some false negatives, represents the most suitable screening test-method to be applied on a large-scale basis in the diagnostic algorithm of indeterminate thyroid lesions.

  5. Spin structure of the neutron (³He) and the Bjoerken sum rule

    Energy Technology Data Exchange (ETDEWEB)

    Meziani, Z.E. [Stanford Univ., CA (United States)

    1994-12-01

    A first measurement of the longitudinal asymmetry of deep-inelastic scattering of polarized electrons from a polarized ³He target at energies ranging from 19 to 26 GeV has been performed at the Stanford Linear Accelerator Center (SLAC). The spin-structure function of the neutron g₁ⁿ has been extracted from the measured asymmetries. The Quark Parton Model (QPM) interpretation of the nucleon spin-structure function is examined in light of the new results. A test of the Ellis-Jaffe sum rule (E-J) on the neutron is performed at high momentum transfer and found to be satisfied. Furthermore, combining the proton results of the European Muon Collaboration (EMC) and the neutron results of E-142, the Bjoerken sum rule test is carried out at high Q², where higher-order Perturbative Quantum Chromodynamics (PQCD) corrections and higher-twist corrections are smaller. The sum rule is saturated to within one standard deviation.

  6. A two-stage stochastic rule-based model to determine pre-assembly buffer content

    Science.gov (United States)

    Gunay, Elif Elcin; Kula, Ufuk

    2018-01-01

    This study considers the instant decision-making needs of automobile manufacturers for resequencing vehicles before final assembly (FA). We propose a rule-based two-stage stochastic model to determine the number of spare vehicles that should be kept in the pre-assembly buffer to restore the sequence altered by paint defects and upstream department constraints. The first stage of the model decides the spare vehicle quantities, and the second stage recovers the scrambled sequence with respect to pre-defined rules. The problem is solved by a sample average approximation (SAA) algorithm. We conduct a numerical study to compare the solutions of the heuristic model with optimal ones and provide the following insights: (i) as the mismatch between the paint entrance and scheduled sequences decreases, the rule-based heuristic model recovers the scrambled sequence as well as the optimal resequencing model; (ii) the rule-based model is more sensitive to the mismatch between the paint entrance and scheduled sequences when recovering the scrambled sequence; (iii) as the defect rate increases, the difference in recovery effectiveness between the rule-based heuristic and optimal solutions increases; (iv) as buffer capacity increases, the recovery effectiveness of the optimization model outperforms the heuristic model; and (v) as expected, the rule-based model holds more inventory than the optimization model.

  7. Predicting Examination Performance Using an Expanded Integrated Hierarchical Model of Test Emotions and Achievement Goals

    Science.gov (United States)

    Putwain, Dave; Deveney, Carolyn

    2009-01-01

    The aim of this study was to examine an expanded integrative hierarchical model of test emotions and achievement goal orientations in predicting the examination performance of undergraduate students. Achievement goals were theorised as mediating the relationship between test emotions and performance. 120 undergraduate students completed…

  8. When none of us perform better than all of us together: the role of analogical decision rules in groups.

    Directory of Open Access Journals (Sweden)

    Nicoleta Meslec

    Full Text Available During social interactions, groups develop collective competencies that (ideally) should assist groups to outperform the average standalone individual member (weak cognitive synergy) or the best-performing member in the group (strong cognitive synergy). In two experimental studies we manipulate the type of decision rule used in group decision-making (identify the best vs. collaborative) and the way in which the decision rules are induced (direct vs. analogical), and we test the effect of these two manipulations on the emergence of strong and weak cognitive synergy. Our most important results indicate that an analogically induced decision rule (the imitate-the-successful heuristic), in which groups have to identify the best member and build on his/her performance (the take-the-best heuristic), is the most conducive to strong cognitive synergy. Our studies bring evidence for the role of analogy-making in groups as well as the role of fast-and-frugal heuristics in group decision-making.
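A minimal sketch of the heuristic and the two synergy criteria named above; the field names and scoring are hypothetical, not the study's materials:

```python
def imitate_the_successful(members):
    """Adopt the answer of the member with the best prior performance
    (the 'identify the best' decision rule, sketched)."""
    return max(members, key=lambda m: m["past_score"])["answer"]

def cognitive_synergy(group_score, individual_scores):
    """Classify a group outcome: 'strong' beats the best member,
    'weak' beats only the average member, otherwise 'none'."""
    average = sum(individual_scores) / len(individual_scores)
    if group_score > max(individual_scores):
        return "strong"
    if group_score > average:
        return "weak"
    return "none"
```

For example, a group scoring 9 when its members individually scored 4, 6, and 8 shows strong synergy; a group scoring 7 shows only weak synergy.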

  9. Differential impact of relevant and irrelevant dimension primes on rule-based and information-integration category learning.

    Science.gov (United States)

    Grimm, Lisa R; Maddox, W Todd

    2013-11-01

    Research has identified multiple category-learning systems with each being "tuned" for learning categories with different task demands and each governed by different neurobiological systems. Rule-based (RB) classification involves testing verbalizable rules for category membership while information-integration (II) classification requires the implicit learning of stimulus-response mappings. In the first study to directly test rule priming with RB and II category learning, we investigated the influence of the availability of information presented at the beginning of the task. Participants viewed lines that varied in length, orientation, and position on the screen, and were primed to focus on stimulus dimensions that were relevant or irrelevant to the correct classification rule. In Experiment 1, we used an RB category structure, and in Experiment 2, we used an II category structure. Accuracy and model-based analyses suggested that a focus on relevant dimensions improves RB task performance later in learning while a focus on an irrelevant dimension improves II task performance early in learning.

  10. Performance Indicators for Business Rule Management

    NARCIS (Netherlands)

    Eline de Haan; dr. Martijn Zoet; Koen Smit

    2016-01-01

    From the article: With increasing investments in business rules management (BRM), organizations are searching for ways to value and benchmark their processes to elicit, design, accept, deploy and execute business rules. To realize valuation and benchmarking of previously mentioned processes,

  11. Performance Model for High-Power Lithium Titanate Oxide Batteries based on Extended Characterization Tests

    DEFF Research Database (Denmark)

    Stroe, Ana-Irina; Swierczynski, Maciej Jozef; Stroe, Daniel Ioan

    2015-01-01

    Lithium-ion (Li-ion) batteries are found nowadays not only in portable/consumer electronics but also in more power-demanding applications, such as stationary renewable energy storage, automotive, and back-up power supply, because of their superior characteristics in comparison to other energy storage technologies. Nevertheless, prior to being used in any of the aforementioned applications, a Li-ion battery cell must be intensively characterized and its behavior needs to be understood. This can be realized by performing extended laboratory characterization tests and developing Li-ion battery performance models. Furthermore, accurate performance models are necessary in order to analyze the behavior of the battery cell under different mission profiles, by simulation, thus avoiding time- and cost-demanding real-life tests. This paper presents the development and the parametrization of a performance...
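A common structure for such Li-ion performance models is a first-order Thevenin equivalent circuit; the sketch below is a generic textbook formulation with made-up parameter values, not the model or parametrization from this paper:

```python
def simulate_thevenin(current, dt, ocv, r0, r1, c1):
    """First-order Thevenin equivalent-circuit model: terminal voltage
    under a current profile (positive current = discharge)."""
    v_rc = 0.0          # voltage across the RC polarization pair
    out = []
    for i in current:
        # explicit-Euler update of the RC branch: dv/dt = i/C1 - v/(R1*C1)
        v_rc += dt * (i / c1 - v_rc / (r1 * c1))
        # terminal voltage = open-circuit voltage minus ohmic and RC drops
        out.append(ocv - i * r0 - v_rc)
    return out
```

Characterization tests of the kind described above are what pin down ocv, r0, r1, and c1 as functions of state-of-charge and temperature; here they are constants for brevity.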

  12. Logical-Rule Models of Classification Response Times: A Synthesis of Mental-Architecture, Random-Walk, and Decision-Bound Approaches

    Science.gov (United States)

    Fific, Mario; Little, Daniel R.; Nosofsky, Robert M.

    2010-01-01

    We formalize and provide tests of a set of logical-rule models for predicting perceptual classification response times (RTs) and choice probabilities. The models are developed by synthesizing mental-architecture, random-walk, and decision-bound approaches. According to the models, people make independent decisions about the locations of stimuli…

  13. A knowledge representation meta-model for rule-based modelling of signalling networks

    Directory of Open Access Journals (Sweden)

    Adrien Basso-Blandin

    2016-03-01

    Full Text Available The study of cellular signalling pathways and their deregulation in disease states, such as cancer, is a large and extremely complex task. Indeed, these systems involve many parts and processes but are studied piecewise and their literatures and data are consequently fragmented, distributed and sometimes—at least apparently—inconsistent. This makes it extremely difficult to build significant explanatory models with the result that effects in these systems that are brought about by many interacting factors are poorly understood. The rule-based approach to modelling has shown some promise for the representation of the highly combinatorial systems typically found in signalling where many of the proteins are composed of multiple binding domains, capable of simultaneous interactions, and/or peptide motifs controlled by post-translational modifications. However, the rule-based approach requires highly detailed information about the precise conditions for each and every interaction which is rarely available from any one single source. Rather, these conditions must be painstakingly inferred and curated, by hand, from information contained in many papers—each of which contains only part of the story. In this paper, we introduce a graph-based meta-model, attuned to the representation of cellular signalling networks, which aims to ease this massive cognitive burden on the rule-based curation process. This meta-model is a generalization of that used by Kappa and BNGL which allows for the flexible representation of knowledge at various levels of granularity. In particular, it allows us to deal with information which has either too little, or too much, detail with respect to the strict rule-based meta-model. Our approach provides a basis for the gradual aggregation of fragmented biological knowledge extracted from the literature into an instance of the meta-model from which we can define an automated translation into executable Kappa programs.

  14. Electronuclear sum rules for the lightest nuclei

    International Nuclear Information System (INIS)

    Efros, V.D.

    1992-01-01

    It is shown that the model-independent longitudinal electronuclear sum rules for nuclei with A = 3 and A = 4 have an accuracy on the order of a percent in the traditional single-nucleon approximation with free nucleons for the nuclear charge-density operator. This makes it possible to test this approximation by using these sum rules. The longitudinal sum rules for A = 3 and A = 4 are calculated using the wave functions of these nuclei corresponding to a large set of realistic NN interactions. The values of the model-independent sum rules lie in the range of values calculated by this method. Model-independent expressions are obtained for the transverse sum rules for nuclei with A = 3 and A = 4. These sum rules are calculated using a large set of realistic wave functions of these nuclei. The contribution of the convection current and the changes in the results for different versions of realistic NN forces are given. 29 refs., 4 tabs

  15. Functional networks inference from rule-based machine learning models.

    Science.gov (United States)

    Lazzarini, Nicola; Widera, Paweł; Williamson, Stuart; Heer, Rakesh; Krasnogor, Natalio; Bacardit, Jaume

    2016-01-01

    Functional networks play an important role in the analysis of biological processes and systems. The inference of these networks from high-throughput (-omics) data is an area of intense research. So far, the similarity-based inference paradigm (e.g. gene co-expression) has been the most popular approach. It assumes a functional relationship between genes which are expressed at similar levels across different samples. An alternative to this paradigm is the inference of relationships from the structure of machine learning models. These models are able to capture complex relationships between variables that are often different from, or complementary to, those found by similarity-based methods. We propose a protocol to infer functional networks from machine learning models, called FuNeL. It assumes that genes used together within a rule-based machine learning model to classify the samples might also be functionally related at a biological level. The protocol is first tested on synthetic datasets and then evaluated on a test suite of 8 real-world datasets related to human cancer. The networks inferred from the real-world data are compared against gene co-expression networks of equal size, generated with 3 different methods. The comparison is performed from two different points of view. We analyse the enriched biological terms in the set of network nodes and the relationships between known disease-associated genes in the context of the network topology. The comparison confirms both the biological relevance and the complementary character of the knowledge captured by the FuNeL networks in relation to similarity-based methods and demonstrates its potential to identify known disease associations as core elements of the network. Finally, using a prostate cancer dataset as a case study, we confirm that the biological knowledge captured by our method is relevant to the disease and consistent with the specialised literature and with an independent dataset not used in the inference process.

  16. Tests of Parameters Instability: Theoretical Study and Empirical Applications on Two Types of Models (ARMA Model and Market Model)

    Directory of Open Access Journals (Sweden)

    Sahbi FARHANI

    2012-01-01

    Full Text Available This paper considers tests of parameters instability and structural change with known, unknown or multiple breakpoints. The results apply to a wide class of parametric models that are suitable for estimation by strong rules for detecting the number of breaks in a time series. For that, we use Chow, CUSUM, CUSUM of squares, Wald, likelihood ratio and Lagrange multiplier tests. Each test implicitly uses an estimate of a change point. We conclude with an empirical analysis on two different models (ARMA model and simple linear regression model).
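For a known breakpoint, the Chow test listed above compares the pooled residual sum of squares against the two sub-sample fits. A minimal pure-Python sketch for a simple linear regression (an illustrative textbook form, not the paper's implementation):

```python
def rss_linreg(xs, ys):
    """Residual sum of squares of an OLS straight-line fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx                 # slope
    a = my - b * mx               # intercept
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def chow_statistic(xs, ys, brk, k=2):
    """F-statistic for a structural break at index brk
    (k = parameters per regression, here intercept + slope)."""
    rss_p = rss_linreg(xs, ys)                    # pooled fit
    rss_1 = rss_linreg(xs[:brk], ys[:brk])        # pre-break fit
    rss_2 = rss_linreg(xs[brk:], ys[brk:])        # post-break fit
    num = (rss_p - rss_1 - rss_2) / k
    den = (rss_1 + rss_2) / (len(xs) - 2 * k)
    return num / den
```

A large statistic (against the F(k, n - 2k) distribution) signals that one pooled regression fits much worse than two separate ones, i.e. a parameter break.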

  17. A Method for the Comparison of Item Selection Rules in Computerized Adaptive Testing

    Science.gov (United States)

    Barrada, Juan Ramon; Olea, Julio; Ponsoda, Vicente; Abad, Francisco Jose

    2010-01-01

    In a typical study comparing the relative efficiency of two item selection rules in computerized adaptive testing, the common result is that they simultaneously differ in accuracy and security, making it difficult to reach a conclusion on which is the more appropriate rule. This study proposes a strategy to conduct a global comparison of two or…

  18. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
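The bin-wise likelihood scoring underlying both evaluations is easy to sketch. Assuming independent Poisson rates per space-magnitude-time bin (a standard RELM-style assumption; the details here are illustrative, not the paper's exact formulation):

```python
import math

def poisson_loglik(forecast_rates, observed_counts):
    """Joint log-likelihood of observed bin counts under a forecast that
    assigns an independent Poisson rate to each bin."""
    ll = 0.0
    for lam, n in zip(forecast_rates, observed_counts):
        # log Poisson pmf: n*log(lam) - lam - log(n!)
        ll += n * math.log(lam) - lam - math.lgamma(n + 1)
    return ll

def compare_models(rates_a, rates_b, observed):
    """Pairwise relative-consistency score: positive favours model A."""
    return poisson_loglik(rates_a, observed) - poisson_loglik(rates_b, observed)
```

Consistency testing then amounts to asking whether the observed catalog's log-likelihood is plausible under catalogs simulated from the forecast itself.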

  19. Sum rule limitations of kinetic particle-production models

    International Nuclear Information System (INIS)

    Knoll, J.; CEA Centre d'Etudes Nucleaires de Grenoble, 38; Guet, C.

    1988-04-01

    Photoproduction and absorption sum rules generalized to systems at finite temperature provide a stringent check on the validity of kinetic models for the production of hard photons in intermediate-energy nuclear collisions. We inspect such models for the case of nuclear matter at finite temperature, employed in a kinetic regime which copes with those encountered in energetic nuclear collisions, and find photon production rates which significantly exceed the limits imposed by the sum rule even under favourable concessions. This suggests that coherence effects are quite important and the production of photons cannot be considered as an incoherent addition of individual NNγ production processes. The deficiencies of present kinetic models may also apply to the production of probes such as the pion which do not couple perturbatively to the nuclear currents. (orig.)

  20. Achievement Goals and Achievement Emotions: Testing a Model of Their Joint Relations with Academic Performance

    Science.gov (United States)

    Pekrun, Reinhard; Elliot, Andrew J.; Maier, Markus A.

    2009-01-01

    The authors propose a theoretical model linking achievement goals and achievement emotions to academic performance. This model was tested in a prospective study with undergraduates (N = 213), using exam-specific assessments of both goals and emotions as predictors of exam performance in an introductory-level psychology course. The findings were…

  1. High Level Rule Modeling Language for Airline Crew Pairing

    Science.gov (United States)

    Mutlu, Erdal; Birbil, Ş. Ilker; Bülbül, Kerem; Yenigün, Hüsnü

    2011-09-01

    The crew pairing problem is an airline optimization problem where a set of least costly pairings (consecutive flights to be flown by a single crew) that covers every flight in a given flight network is sought. A pairing is defined by using a very complex set of feasibility rules imposed by international and national regulatory agencies, and also by the airline itself. The cost of a pairing is also defined by using complicated rules. When an optimization engine generates a sequence of flights from a given flight network, it has to check all these feasibility rules to ensure whether the sequence forms a valid pairing. Likewise, the engine needs to calculate the cost of the pairing by using certain rules. However, the rules used for checking the feasibility and calculating the costs are usually not static. Furthermore, the airline companies carry out what-if-type analyses through testing several alternate scenarios in each planning period. Therefore, embedding the implementation of feasibility checking and cost calculation rules into the source code of the optimization engine is not a practical approach. In this work, a high level language called ARUS is introduced for describing the feasibility and cost calculation rules. A compiler for ARUS is also implemented in this work to generate a dynamic link library to be used by crew pairing optimization engines.

  2. Optimal Operational Monetary Policy Rules in an Endogenous Growth Model: a calibrated analysis

    OpenAIRE

    Arato, Hiroki

    2009-01-01

    This paper constructs an endogenous growth New Keynesian model and considers the growth and welfare effects of Taylor-type (operational) monetary policy rules. The Ramsey equilibrium and the optimal operational monetary policy rule are also computed. In the calibrated model, the Ramsey-optimal volatility of the inflation rate is smaller than that in a standard exogenous growth New Keynesian model with physical capital accumulation. The optimal operational monetary policy rule makes the nominal interest rate respond s...

  3. A Nonlinear Programming and Artificial Neural Network Approach for Optimizing the Performance of a Job Dispatching Rule in a Wafer Fabrication Factory

    Directory of Open Access Journals (Sweden)

    Toly Chen

    2012-01-01

    Full Text Available A nonlinear programming and artificial neural network approach is presented in this study to optimize the performance of a job dispatching rule in a wafer fabrication factory. The proposed methodology fuses two existing rules and constructs a nonlinear programming model to choose the best values of the parameters in the two rules by dynamically maximizing the standard deviation of the slack, which several studies have shown to benefit scheduling performance. In addition, a more effective approach is also applied to estimate the remaining cycle time of a job, which is empirically shown to be conducive to scheduling performance. The efficacy of the proposed methodology was validated with a simulated case; evidence was found to support its effectiveness. We also suggest several directions in which it can be exploited in the future.
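The fuse-and-tune idea can be sketched as a grid search for the fusion weight that maximizes the standard deviation of the slack. The job fields and the two fused estimators below are hypothetical placeholders, not the paper's actual rules:

```python
import statistics

def slack(job, alpha):
    # fused remaining-cycle-time estimate: convex combination of two
    # hypothetical estimators standing in for the paper's two rules
    est = alpha * job["est_a"] + (1 - alpha) * job["est_b"]
    return job["due"] - job["elapsed"] - est

def best_alpha(jobs, grid):
    # pick the fusion weight that maximizes the standard deviation of
    # slack, the surrogate objective described in the abstract
    return max(grid, key=lambda a: statistics.pstdev(slack(j, a) for j in jobs))
```

Maximizing slack dispersion spreads job priorities apart, which is the property the abstract credits with better dispatching decisions.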

  4. Concurrence of rule- and similarity-based mechanisms in artificial grammar learning.

    Science.gov (United States)

    Opitz, Bertram; Hofmann, Juliane

    2015-03-01

    A current theoretical debate regards whether rule-based or similarity-based learning prevails during artificial grammar learning (AGL). Although the majority of findings are consistent with a similarity-based account of AGL it has been argued that these results were obtained only after limited exposure to study exemplars, and performance on subsequent grammaticality judgment tests has often been barely above chance level. In three experiments the conditions were investigated under which rule- and similarity-based learning could be applied. Participants were exposed to exemplars of an artificial grammar under different (implicit and explicit) learning instructions. The analysis of receiver operating characteristics (ROC) during a final grammaticality judgment test revealed that explicit but not implicit learning led to rule knowledge. It also demonstrated that this knowledge base is built up gradually while similarity knowledge governed the initial state of learning. Together these results indicate that rule- and similarity-based mechanisms concur during AGL. Moreover, it could be speculated that two different rule processes might operate in parallel; bottom-up learning via gradual rule extraction and top-down learning via rule testing. Crucially, the latter is facilitated by performance feedback that encourages explicit hypothesis testing.
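For rating data like grammaticality judgments, an ROC analysis reduces to an AUC that can be computed nonparametrically; a small sketch (the variable names are mine, not the study's):

```python
def auc(grammatical_ratings, ungrammatical_ratings):
    """Area under the ROC curve: the probability that a randomly chosen
    grammatical item receives a higher endorsement rating than a randomly
    chosen ungrammatical one (ties count half)."""
    wins = ties = 0
    for g in grammatical_ratings:
        for u in ungrammatical_ratings:
            if g > u:
                wins += 1
            elif g == u:
                ties += 1
    pairs = len(grammatical_ratings) * len(ungrammatical_ratings)
    return (wins + 0.5 * ties) / pairs
```

An AUC of 0.5 is chance-level discrimination (the "barely above chance" pattern mentioned above); 1.0 is perfect separation of grammatical from ungrammatical items.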

  5. The Latent Class Model as a Measurement Model for Situational Judgment Tests

    Directory of Open Access Journals (Sweden)

    Frank Rijmen

    2011-11-01

    Full Text Available In a situational judgment test, it is often debatable what constitutes a correct answer to a situation. There is currently a multitude of scoring procedures. Establishing a measurement model can guide the selection of a scoring rule. It is argued that the latent class model is a good candidate for a measurement model. Two latent class models are applied to the Managing Emotions subtest of the Mayer, Salovey, Caruso Emotional Intelligence Test: a plain-vanilla latent class model, and a second-order latent class model that takes into account the clustering of several possible reactions within each hypothetical scenario of the situational judgment test. The results for both models indicated that there were three subgroups characterised by the degree to which differentiation occurred between possible reactions in terms of perceived effectiveness. Furthermore, the results for the second-order model indicated a moderate cluster effect.

  6. Engine Performance Test of the 1975 Chrysler - Nissan Model CN633 Diesel Engine

    Science.gov (United States)

    1975-09-01

    An engine test of the Chrysler-Nissan Model CN633 diesel engine was performed to determine its steady-state fuel consumption and emissions (HC, CO, NOx) maps. The data acquired are summarized in this report.

  7. Reservoir Operating Rule Optimization for California's Sacramento Valley

    Directory of Open Access Journals (Sweden)

    Timothy Nelson

    2016-03-01

    Full Text Available doi: http://dx.doi.org/10.15447/sfews.2016v14iss1art6
    Reservoir operating rules for water resource systems are typically developed by combining intuition, professional discussion, and simulation modeling. This paper describes a joint optimization–simulation approach to develop preliminary economically-based operating rules for major reservoirs in California's Sacramento Valley, based on optimized results from CALVIN, a hydro-economic optimization model. We infer strategic operating rules from the optimization model results, including storage allocation rules to balance storage among multiple reservoirs, and reservoir release rules to determine monthly release for individual reservoirs. Results show the potential utility of considering previous year type on water availability and various system and sub-system storage conditions, in addition to normal consideration of local reservoir storage, season, and current inflows. We create a simple simulation to further refine and test the derived operating rules. Optimization model results show particular insights for balancing the allocation of water storage among Shasta, Trinity, and Oroville reservoirs over drawdown and refill seasons, as well as some insights for release rules at major reservoirs in the Sacramento Valley. We also discuss the applicability and limitations of developing reservoir operation rules from optimization model results.
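A storage-balancing rule of the kind inferred from the optimization results can be sketched as a proportional allocation with capacity clipping. The weights and the greedy redistribution below are illustrative assumptions, not the rules derived in the paper:

```python
def balance_storage(total, capacities, weights):
    """Allocate system-wide storage among reservoirs by fixed balancing
    weights, clipping at capacity and redistributing any clipped excess."""
    n = len(capacities)
    alloc = [0.0] * n
    # proportional fill according to the balancing weights
    for i in range(n):
        alloc[i] = min(capacities[i], total * weights[i])
    # greedy spill of whatever the clipping left over
    leftover = total - sum(alloc)
    for i in range(n):
        take = min(capacities[i] - alloc[i], leftover)
        alloc[i] += take
        leftover -= take
    return alloc
```

In a fuller rule the weights would themselves depend on season and the previous year's water availability, which is exactly the kind of conditioning the abstract suggests.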

  8. Hierarchy-associated semantic-rule inference framework for classifying indoor scenes

    Science.gov (United States)

    Yu, Dan; Liu, Peng; Ye, Zhipeng; Tang, Xianglong; Zhao, Wei

    2016-03-01

    Typically, the initial task of classifying indoor scenes is challenging, because the spatial layout and decoration of a scene can vary considerably. Recent efforts at classifying object relationships commonly depend on the results of scene annotation and predefined rules, making classification inflexible. Furthermore, annotation results are easily affected by external factors. Inspired by human cognition, a scene-classification framework was proposed using the empirically based annotation (EBA) and a match-over rule-based (MRB) inference system. The semantic hierarchy of images is exploited by EBA to construct rules empirically for MRB classification. The problem of scene classification is divided into low-level annotation and high-level inference from a macro perspective. Low-level annotation involves detecting the semantic hierarchy and annotating the scene with a deformable-parts model and a bag-of-visual-words model. In high-level inference, hierarchical rules are extracted to train the decision tree for classification. The categories of testing samples are generated from the parts to the whole. Compared with traditional classification strategies, the proposed semantic hierarchy and corresponding rules reduce the effect of a variable background and improve the classification performance. The proposed framework was evaluated on a popular indoor scene dataset, and the experimental results demonstrate its effectiveness.

  9. The combining of multiple hemispheric resources in learning-disabled and skilled readers' recall of words: a test of three information-processing models.

    Science.gov (United States)

    Swanson, H L

    1987-01-01

    Three theoretical models (additive, independence, maximum rule) that characterize and predict the influence of independent hemispheric resources on learning-disabled and skilled readers' simultaneous processing were tested. Predictions related to word recall performance during simultaneous encoding conditions (dichotic listening task) were made from unilateral (dichotic listening task) presentations. The maximum rule model best characterized both ability groups in that simultaneous encoding produced no better recall than unilateral presentations. While the results support the hypothesis that both ability groups use similar processes in the combining of hemispheric resources (i.e., weak/dominant processing), ability group differences do occur in the coordination of such resources.
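The three candidate models make different quantitative predictions for dichotic recall given the two unilateral baselines. One illustrative reading of those predictions follows; the exact formalizations in the paper may differ, so treat the formulas as assumptions:

```python
def predict_dichotic(left_recall, right_recall, model):
    """Predicted recall under simultaneous (dichotic) presentation,
    from the two unilateral recall probabilities."""
    if model == "additive":
        # resources sum, capped at perfect recall
        return min(1.0, left_recall + right_recall)
    if model == "independence":
        # each hemisphere succeeds independently
        return left_recall + right_recall - left_recall * right_recall
    if model == "maximum":
        # performance tracks the stronger channel only
        return max(left_recall, right_recall)
    raise ValueError(f"unknown model: {model}")
```

The finding that simultaneous encoding was no better than unilateral presentation is what singles out the maximum rule: only that model predicts no gain from combining channels.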

  10. Standardization of test conditions for gamma camera performance measurement

    International Nuclear Information System (INIS)

    Jordan, K.

    1980-01-01

    The current way of measuring gamma camera performance is to use point sources or flood sources in air, often in combination with bar phantoms. This method mostly yields the best performance parameters for cameras, but it has little in common with the use of a camera in clinical practice. Particularly in the case of low-energy emitters, like Tc-99m, the influence of scattered radiation on camera performance is very high. It is therefore important to have test conditions for radionuclide imaging devices that approach, as closely as practicable, the measuring conditions of clinical applications. It is thus good news that the International Electrotechnical Commission (IEC) has prepared a draft 'Characteristics and test conditions of radionuclide imaging devices', which is now submitted to the national committees for formal approval under the Six Months' Rule. Some essential points of this document are discussed in the paper. (orig.)

  11. Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing

    Energy Technology Data Exchange (ETDEWEB)

    Price, Phillip N.; Granderson, Jessica; Sohn, Michael; Addy, Nathan; Jump, David

    2013-09-01

    The overarching goal of this work is to advance the capabilities of technology evaluators in evaluating the building-level baseline modeling capabilities of Energy Management and Information System (EMIS) software. Through their customer engagement platforms and products, EMIS software products have the potential to produce whole-building energy savings through multiple strategies: building system operation improvements, equipment efficiency upgrades and replacements, and inducement of behavioral change among the occupants and operations personnel. Some offerings may also automate the quantification of whole-building energy savings, relative to a baseline period, using empirical models that relate energy consumption to key influencing parameters, such as ambient weather conditions and building operation schedule. These automated baseline models can be used to streamline the whole-building measurement and verification (M&V) process, and therefore are of critical importance in the context of multi-measure whole-building focused utility efficiency programs. This report documents the findings of a study that was conducted to begin answering critical questions regarding quantification of savings at the whole-building level, and the use of automated and commercial software tools. To evaluate the modeling capabilities of EMIS software particular to the use case of whole-building savings estimation, four research questions were addressed: 1. What is a general methodology that can be used to evaluate baseline model performance, both in terms of a) overall robustness, and b) relative to other models? 2. How can that general methodology be applied to evaluate proprietary models that are embedded in commercial EMIS tools? How might one handle practical issues associated with data security, intellectual property, appropriate testing ‘blinds’, and large data sets? 3. How can buildings be pre-screened to identify those that are the most model-predictable, and therefore those
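Baseline-model goodness-of-fit in this whole-building M&V context is commonly scored with normalized metrics such as CV(RMSE) and NMBE. The choice of these two metrics is my assumption; the report's own metric set may differ:

```python
import math

def cv_rmse(actual, predicted):
    """Coefficient of variation of the RMSE: RMSE normalized by the
    mean of the actual (metered) consumption."""
    n = len(actual)
    mean = sum(actual) / n
    rmse = math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)
    return rmse / mean

def nmbe(actual, predicted):
    """Normalized mean bias error: net over/under-prediction relative
    to mean actual consumption (0 means unbiased)."""
    n = len(actual)
    mean = sum(actual) / n
    return sum(a - p for a, p in zip(actual, predicted)) / (n * mean)
```

Computing both matters for savings estimation: a model can have low bias (NMBE near zero) yet large scatter (high CV(RMSE)), and the two failures affect savings uncertainty differently.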

  12. Knowledge base rule partitioning design for CLIPS

    Science.gov (United States)

    Mainardi, Joseph D.; Szatkowski, G. P.

    1990-01-01

    This paper describes a knowledge base (KB) partitioning approach to solve the problem of real-time performance when using the CLIPS AI shell with large numbers of rules and facts. This work is funded under the joint USAF/NASA Advanced Launch System (ALS) Program as applied research in expert systems to perform vehicle checkout for real-time controller and diagnostic monitoring tasks. The main objective of the Expert System advanced development project (ADP-2302) is to provide robust systems responding to new data frames at 0.1 to 1.0 second intervals. The intelligent system control must be performed within the specified real-time window in order to meet the demands of the given application. Partitioning the KB reduces the complexity of the inferencing Rete net at any given time. This reduced complexity improves performance without undue impact during load and unload cycles. The second objective is to produce highly reliable intelligent systems. This requires simple and automated approaches to the KB verification and validation task. Partitioning the KB reduces rule interaction complexity overall. Reduced interaction simplifies the V&V testing necessary by focusing attention only on individual areas of interest. Many systems require a robustness that involves a large number of rules, most of which are mutually exclusive under different phases or conditions. The ideal solution is to control the knowledge base by loading rules that directly apply for that condition, while stripping out all rules and facts that are not used during that cycle. The practical approach is to cluster rules and facts into associated 'blocks'. A simple approach has been designed to control the addition and deletion of 'blocks' of rules and facts, while allowing real-time operations to run freely. Timing tests of real-time performance for specific machines under R/T operating systems have not been completed but are planned as part of the analysis process to validate the design.
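The block-loading control described above can be sketched language-agnostically; the class and method names below are invented for illustration and are not the CLIPS API:

```python
class RuleBlockManager:
    """Phase-based KB partitioning sketch: only the rule blocks for the
    current phase stay loaded, shrinking the active rule network."""

    def __init__(self, blocks):
        self.blocks = blocks      # phase name -> list of rule names
        self.loaded = set()

    def enter_phase(self, phase):
        # unload every rule outside the phase's block, load the block itself
        self.loaded = set(self.blocks.get(phase, []))
        return sorted(self.loaded)
```

Because the phases are mutually exclusive, V&V can then be done block by block, which mirrors the testing simplification the abstract claims for partitioning.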

  13. GRAMMAR RULE BASED INFORMATION RETRIEVAL MODEL FOR BIG DATA

    Directory of Open Access Journals (Sweden)

    T. Nadana Ravishankar

    2015-07-01

    Full Text Available Though Information Retrieval (IR) in big data has been an active field of research for the past few years, the popularity of native languages presents a unique challenge in big data information retrieval systems. There is a need to retrieve information which is present in English and display it in the native language for users. This aim of cross-language information retrieval is complicated by unique features of the native languages, such as morphology, compound word formation, word spelling variations, ambiguity, synonyms, and the influence of other languages. To overcome some of these issues, the native language is modelled using a grammar-rule-based approach in this work. The advantage of this approach is that the native language is modelled and its unique features are encoded using a set of inference rules. This rule base, coupled with the customized ontological system, shows considerable potential and is found to give better precision and recall.
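
    One common way to encode morphology as inference rules is suffix rewriting. The sketch below is purely illustrative (the rules and words are invented English placeholders, not the paper's native-language rule base), but it shows the "set of rewrite rules applied before retrieval" pattern the abstract describes.

```python
# Toy grammar-rule sketch: morphology encoded as ordered suffix-rewrite
# rules that reduce a word to a stem before indexing/retrieval.
# These rules are invented placeholders, not the paper's rule base.

RULES = [("ization", "ize"), ("ations", "ate"), ("ing", ""), ("s", "")]

def stem(word):
    for suffix, repl in RULES:                       # first matching rule wins
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)] + repl
    return word

print(stem("normalization"), stem("cats"))
```

    A real system would pair such rules with the ontology mentioned above to map stems across languages; ordering the rules from most to least specific keeps the "first match wins" strategy sound.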

  14. A General Attribute and Rule Based Role-Based Access Control Model

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Growing numbers of users, and access control policies that involve many different resource attributes in service-oriented environments, bring various problems in protecting resources. This paper analyzes the relationships of resource attributes to user attributes in all policies, and proposes a general attribute and rule based role-based access control (GAR-RBAC) model to meet these security needs. The model can dynamically assign users to roles via rules to accommodate growing numbers of users. These rules use attribute expressions and permissions as part of the authorization constraints, and are defined by analyzing the relations of resource attributes to user attributes in the many access policies defined by the enterprise. The model is a general access control model: it can support many access control policies and can be applied widely to services. The paper also describes how to use the GAR-RBAC model in Web service environments.
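
    The core mechanism, assigning users to roles by evaluating attribute rules, can be sketched as follows. This is an illustrative reading of the abstract, not the paper's formal GAR-RBAC definition; the attribute names, thresholds and role names are invented.

```python
# Illustrative rule-based user-to-role assignment: each rule pairs an
# attribute predicate with a role; a user receives every role whose
# predicate holds over that user's attributes. All names are invented.

ROLE_RULES = [
    (lambda u: u.get("department") == "finance", "invoice-approver"),
    (lambda u: u.get("clearance", 0) >= 3,       "audit-viewer"),
]

def assign_roles(user_attrs):
    """Dynamically derive the role set for a user from the rule base."""
    return {role for pred, role in ROLE_RULES if pred(user_attrs)}

print(assign_roles({"department": "finance", "clearance": 3}))
```

    Because assignment is recomputed from rules rather than stored per user, the scheme scales with growing user populations, which is the motivation given in the abstract.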

  15. FORMAL MODELLING OF BUSINESS RULES: WHAT KIND OF TOOL TO USE?

    Directory of Open Access Journals (Sweden)

    Sandra Lovrenčić

    2006-12-01

    Full Text Available Business rules are today essential parts of a business system model. But at present there are still various approaches to, definitions of, and classifications of this concept. Similarly, there are also different approaches to business rule formalization and implementation. This paper investigates formalization using a formal language in association with easy domain modelling. Two of the tools that enable such an approach are described and compared according to several factors. They represent ontology modelling and UML, nowadays a widely used standard for object-oriented modelling. A simple example is also presented.

  16. Annotation of rule-based models with formal semantics to enable creation, analysis, reuse and visualization

    Science.gov (United States)

    Misirli, Goksel; Cavaliere, Matteo; Waites, William; Pocock, Matthew; Madsen, Curtis; Gilfellon, Owen; Honorato-Zimmer, Ricardo; Zuliani, Paolo; Danos, Vincent; Wipat, Anil

    2016-01-01

    Motivation: Biological systems are complex and challenging to model and therefore model reuse is highly desirable. To promote model reuse, models should include both information about the specifics of simulations and the underlying biology in the form of metadata. The availability of computationally tractable metadata is especially important for the effective automated interpretation and processing of models. Metadata are typically represented as machine-readable annotations which enhance programmatic access to information about models. Rule-based languages have emerged as a modelling framework to represent the complexity of biological systems. Annotation approaches have been widely used for reaction-based formalisms such as SBML. However, rule-based languages still lack a rich annotation framework to add semantic information, such as machine-readable descriptions, to the components of a model. Results: We present an annotation framework and guidelines for annotating rule-based models, encoded in the commonly used Kappa and BioNetGen languages. We adapt widely adopted annotation approaches to rule-based models. We initially propose a syntax to store machine-readable annotations and describe a mapping between rule-based modelling entities, such as agents and rules, and their annotations. We then describe an ontology to both annotate these models and capture the information contained therein, and demonstrate annotating these models using examples. Finally, we present a proof-of-concept tool for extracting annotations from a model that can be queried and analyzed in a uniform way. The uniform representation of the annotations can be used to facilitate the creation, analysis, reuse and visualization of rule-based models. Although the examples are given using specific implementations, the proposed techniques can be applied to rule-based models in general. Availability and implementation: The annotation ontology for rule-based models can be found at http

  17. Intelligent control for modeling of real-time reservoir operation, part II: artificial neural network with operating rule curves

    Science.gov (United States)

    Chang, Ya-Ting; Chang, Li-Chiu; Chang, Fi-John

    2005-04-01

    To bridge the gap between academic research and actual operation, we propose an intelligent control system for reservoir operation. The methodology includes two major processes, the knowledge acquired and implemented, and the inference system. In this study, a genetic algorithm (GA) and a fuzzy rule base (FRB) are used to extract knowledge based on the historical inflow data with a design objective function and on the operating rule curves respectively. The adaptive network-based fuzzy inference system (ANFIS) is then used to implement the knowledge, to create the fuzzy inference system, and then to estimate the optimal reservoir operation. To investigate its applicability and practicability, the Shihmen reservoir, Taiwan, is used as a case study. For the purpose of comparison, a simulation of the currently used M-5 operating rule curve is also performed. The results demonstrate that (1) the GA is an efficient way to search the optimal input-output patterns, (2) the FRB can extract the knowledge from the operating rule curves, and (3) the ANFIS models built on different types of knowledge can produce much better performance than the traditional M-5 curves in real-time reservoir operation. Moreover, we show that the model can be more intelligent for reservoir operation if more information (or knowledge) is involved.
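
    The flavor of a fuzzy rule base over an operating curve can be shown with a toy Sugeno-style inference: the storage level fires overlapping fuzzy rules whose consequents are blended into a release decision. The membership breakpoints and release volumes below are invented numbers, not Shihmen reservoir data, and the real ANFIS tunes such parameters from the GA-optimized patterns.

```python
# Toy fuzzy-rule sketch of an operating rule curve: release is the
# firing-strength-weighted blend of constant consequents (Sugeno style).
# Breakpoints and release volumes are invented, illustrative numbers.

def tri(x, a, b, c):
    """Triangular membership function with support (a, c) and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def release(storage):
    # three rules: LOW / NORMAL / HIGH storage -> small / medium / large release
    low    = tri(storage, -0.5, 0.0, 0.5)   # -> release 10 units
    normal = tri(storage,  0.0, 0.5, 1.0)   # -> release 40 units
    high   = tri(storage,  0.5, 1.0, 1.5)   # -> release 80 units
    w = low + normal + high
    return (10 * low + 40 * normal + 80 * high) / w

print(release(0.75))   # between NORMAL and HIGH -> blended release
```

    At storage 0.5 only the NORMAL rule fires (release 40); at 0.75 the NORMAL and HIGH rules fire equally and the output interpolates smoothly between them.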

  18. Slave-particle quantization and sum rules in the t-J model

    International Nuclear Information System (INIS)

    Le Guillou, J.C.; Ragoucy, E.

    1994-12-01

    In the framework of constrained systems, the classical Hamiltonian formulation of slave-particle models and their correct quantization are given. The electron-momentum distribution function in the t-J and Hubbard models is then studied in the framework of slave-particle approaches and within the decoupling scheme. It is shown that criticisms that have been raised in this context, based on a claimed violation of the sum rule for the physical electron, are not valid. Due to the correct quantization rules for the slave particles, the sum rule for the physical electron is indeed obeyed, both exactly and within the decoupling scheme. (author). 15 refs

  19. Fault Detection Using the Clustering-kNN Rule for Gas Sensor Arrays

    Directory of Open Access Journals (Sweden)

    Jingli Yang

    2016-12-01

    Full Text Available The k-nearest neighbour (kNN) rule, which naturally handles the possible non-linearity of data, is introduced to solve the fault detection problem of gas sensor arrays. In traditional fault detection methods based on the kNN rule, the detection process of each new test sample involves all samples in the entire training sample set. Therefore, these methods can be computation intensive in monitoring processes with a large volume of variables and training samples and may be impossible for real-time monitoring. To address this problem, a novel clustering-kNN rule is presented. The landmark-based spectral clustering (LSC) algorithm, which has low computational complexity, is employed to divide the entire training sample set into several clusters. Further, the kNN rule is only conducted in the cluster that is nearest to the test sample; thus, the efficiency of the fault detection methods can be enhanced by reducing the number of training samples involved in the detection process of each test sample. The performance of the proposed clustering-kNN rule is fully verified in numerical simulations with both linear and non-linear models and a real gas sensor array experimental system with different kinds of faults. The results of simulations and experiments demonstrate that the clustering-kNN rule can greatly enhance both the accuracy and efficiency of fault detection methods and provide an excellent solution to reliable and real-time monitoring of gas sensor arrays.

  20. Fault Detection Using the Clustering-kNN Rule for Gas Sensor Arrays

    Science.gov (United States)

    Yang, Jingli; Sun, Zhen; Chen, Yinsheng

    2016-01-01

    The k-nearest neighbour (kNN) rule, which naturally handles the possible non-linearity of data, is introduced to solve the fault detection problem of gas sensor arrays. In traditional fault detection methods based on the kNN rule, the detection process of each new test sample involves all samples in the entire training sample set. Therefore, these methods can be computation intensive in monitoring processes with a large volume of variables and training samples and may be impossible for real-time monitoring. To address this problem, a novel clustering-kNN rule is presented. The landmark-based spectral clustering (LSC) algorithm, which has low computational complexity, is employed to divide the entire training sample set into several clusters. Further, the kNN rule is only conducted in the cluster that is nearest to the test sample; thus, the efficiency of the fault detection methods can be enhanced by reducing the number of training samples involved in the detection process of each test sample. The performance of the proposed clustering-kNN rule is fully verified in numerical simulations with both linear and non-linear models and a real gas sensor array experimental system with different kinds of faults. The results of simulations and experiments demonstrate that the clustering-kNN rule can greatly enhance both the accuracy and efficiency of fault detection methods and provide an excellent solution to reliable and real-time monitoring of gas sensor arrays. PMID:27929412
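
    The clustering-kNN idea can be sketched compactly: cluster the training data offline, then at test time compute the kNN distance only inside the nearest cluster and flag a fault when it exceeds a threshold. The sketch below substitutes a plain k-means for the paper's landmark-based spectral clustering, and the data, threshold and cluster count are invented.

```python
import numpy as np

# Sketch of the clustering-kNN fault detection rule. Plain k-means stands
# in for the paper's LSC algorithm; data and threshold are illustrative.

def kmeans(X, k, iters=20):
    # deterministic farthest-point initialisation avoids empty clusters here
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

def knn_distance(x, X, k=3):
    """Mean distance to the k nearest training samples."""
    return np.sort(np.linalg.norm(X - x, axis=1))[:k].mean()

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (100, 2)),   # normal operating mode A
               rng.normal(3.0, 0.1, (100, 2))])  # normal operating mode B
centers, labels = kmeans(X, 2)

def detect(x, threshold=0.5):
    c = int(np.argmin(np.linalg.norm(centers - x, axis=1)))  # nearest cluster only
    return knn_distance(x, X[labels == c]) > threshold       # True -> fault

print(detect(np.array([0.0, 0.0])), detect(np.array([1.5, 1.5])))
```

    Only the samples of one cluster enter each distance computation, which is exactly where the claimed efficiency gain over plain kNN comes from.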

  1. Bridging Weighted Rules and Graph Random Walks for Statistical Relational Models

    Directory of Open Access Journals (Sweden)

    Seyed Mehran Kazemi

    2018-02-01

    Full Text Available The aim of statistical relational learning is to learn statistical models from relational or graph-structured data. Three main statistical relational learning paradigms include weighted rule learning, random walks on graphs, and tensor factorization. These paradigms have been mostly developed and studied in isolation for many years, with few works attempting to understand the relationships among them or to combine them. In this article, we study the relationship between the path ranking algorithm (PRA), one of the most well-known relational learning methods in the graph random walk paradigm, and relational logistic regression (RLR), one of the recent developments in weighted rule learning. We provide a simple way to normalize relations and prove that relational logistic regression using normalized relations generalizes the path ranking algorithm. This result provides a better understanding of relational learning, especially for the weighted rule learning and graph random walk paradigms. It opens up the possibility of using the more flexible RLR rules within PRA models and even of generalizing both by including normalized and unnormalized relations in the same model.

  2. Parton model (Moessbauer) sum rules for b → c decays

    International Nuclear Information System (INIS)

    Lipkin, H.J.

    1993-01-01

    The parton model is a starting point or zero-order approximation in many treatments. The author follows an approach previously used for the Moessbauer effect and shows how parton model sum rules derived for certain moments of the lepton energy spectrum in b → c semileptonic decays remain valid even when binding effects are included. The parton model appears as a 'semiclassical' model whose results for certain averages also hold (correspondence principle) in quantum mechanics. Algebraic techniques developed for the Moessbauer effect exploit simple features of the commutator between the weak current operator and the bound state Hamiltonian to find the appropriate sum rules and show the validity of the parton model in the classical limit, ℎ → 0, where all commutators vanish

  3. The effect of the number of seed variables on the performance of Cooke's classical model

    International Nuclear Information System (INIS)

    Eggstaff, Justin W.; Mazzuchi, Thomas A.; Sarkani, Shahram

    2014-01-01

    In risk analysis, Cooke's classical model for aggregating expert judgment has been widely used for over 20 years. However, the validity of this model has been the subject of much debate. Critics assert that this model's scoring rule may unintentionally reward experts who manipulate their quantile estimates in order to receive a greater weight. In addition, the question of the number of seed variables required to ensure adequate performance of Cooke's classical model remains unanswered. In this study, we conduct a comprehensive examination of the model through an iterative, cross validation test to perform an out-of-sample comparison between Cooke's classical model and the equal-weight linear opinion pool method on almost all of the expert judgment studies compiled by Cooke and colleagues to date. Our results indicate that Cooke's classical model significantly outperforms equally weighting expert judgment, regardless of the number of seed variables used; however, there may, in fact, be a maximum number of seed variables beyond which Cooke's model cannot outperform an equally-weighted panel. - Highlights: • We examine Cooke's classical model through an iterative, cross validation test. • The performance-based and equally weighted decision makers are compared. • Results strengthen Cooke's argument for a two-fold cross-validation approach. • Accuracy test results show strong support in favor of Cooke's classical method. • There may be a maximum number of seed variables that ensures model performance
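
    The seed-variable mechanism behind the model can be illustrated with a heavily simplified sketch of the calibration component only: each expert states 5%/50%/95% quantiles for seed variables, realizations are counted per interquantile bin, and the divergence from the expected bin proportions measures calibration. The counts below are invented, and the full classical model additionally uses a chi-square based calibration score, an information score, and a weight cutoff.

```python
import math

# Simplified sketch of calibration against seed variables. Expected mass
# per interquantile bin for 5%/50%/95% assessments; counts are invented.
EXPECTED = [0.05, 0.45, 0.45, 0.05]   # below 5%, 5-50%, 50-95%, above 95%

def bin_counts(quantile_sets, realizations):
    """Count which interquantile bin each seed realization falls in."""
    counts = [0, 0, 0, 0]
    for (q5, q50, q95), x in zip(quantile_sets, realizations):
        if x < q5:
            counts[0] += 1
        elif x < q50:
            counts[1] += 1
        elif x < q95:
            counts[2] += 1
        else:
            counts[3] += 1
    return counts

def relative_information(counts):
    """KL divergence of observed bin proportions from EXPECTED."""
    n = sum(counts)
    return sum((c / n) * math.log((c / n) / p)
               for c, p in zip(counts, EXPECTED) if c)

i_good = relative_information([1, 9, 9, 1])   # matches expected proportions
i_over = relative_information([6, 4, 4, 6])   # overconfident: too many misses
print(i_good, i_over)   # lower divergence -> better calibration -> more weight
```

    The well-calibrated expert scores zero divergence while the overconfident one is penalized, which is the behavior the number-of-seed-variables question in the abstract probes: with few seeds, these bin counts are too noisy to separate the two reliably.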

  4. Effect of Linked Rules on Business Process Model Understanding

    DEFF Research Database (Denmark)

    Wang, Wei; Indulska, Marta; Sadiq, Shazia

    2017-01-01

    Business process models are widely used in organizations by information systems analysts to represent complex business requirements and by business users to understand business operations and constraints. This understanding is extracted from graphical process models as well as business rules. Prior...

  5. Test planning and performance

    International Nuclear Information System (INIS)

    Zola, Maurizio

    2001-01-01

    A testing plan should be prepared in accordance with Safety Guide Q4 (Inspection and Testing), including the following information: general information (facility name, item or system reference, procurement document reference, document reference number and status, associated procedures and drawings); a sequential listing of all testing activities; the procedure, work instruction, specification or standard to be followed in respect of each operation and test; acceptance criteria; identification of who is performing tests; identification of hold points; the type of records to be prepared for each test; and the persons and organizations having authority for final acceptance. The proposed sequence of activities is: visual, electrical and mechanical checks; environmental tests (thermal aging, vibration aging, radioactive aging); performance evaluation in extreme conditions; dynamic tests with functional checks; and final electrical and mechanical checks. The planning of the tests should always be performed taking into account an interpretative model: very tight cooperation is advisable between the experimental people and the numerical people dealing with the analysis of more or less complex models for the seismic assessment of structures and components. In the preparatory phase, the choice of the following items should be agreed upon with the final user of the tests: excitation points, excitation types, excitation amplitude with respect to frequency, and measuring points. Data acquisition, recording and storage should take into account the characteristics of the subsequent data processing: too much data can be cumbersome to process, but too little data can make the experimental results unusable. The parameters for time-history acquisition should be chosen taking data processing into account: for Shock Response Spectrum calculation some special requirements should be met: a frequency-bounded signal, high-frequency sampling, shock noise. For stationary random-like excitation, the sample length

  6. Advantages and limitations of anticipating laboratory test results from regression- and tree-based rules derived from electronic health-record data.

    Directory of Open Access Journals (Sweden)

    Fahim Mohammad

    Full Text Available Laboratory testing is the single highest-volume medical activity, making it useful to ask how well one can anticipate whether a given test result will be high, low, or within the reference interval ("normal"). We analyzed 10 years of electronic health records--a total of 69.4 million blood tests--to see how well standard rule-mining techniques can anticipate test results based on patient age and gender, recent diagnoses, and recent laboratory test results. We evaluated rules according to their positive and negative predictive value (PPV and NPV) and area under the receiver-operator characteristic curve (ROC AUCs). Using a stringent cutoff of PPV and/or NPV≥0.95, standard techniques yield few rules for sendout tests but several for in-house tests, mostly for repeat laboratory tests that are part of the complete blood count and basic metabolic panel. Most rules were clinically and pathophysiologically plausible, and several seemed clinically useful for informing pre-test probability of a given result. But overall, rules were unlikely to be able to function as a general substitute for actually ordering a test. Improving laboratory utilization will likely require different input data and/or alternative methods.

  7. Advantages and limitations of anticipating laboratory test results from regression- and tree-based rules derived from electronic health-record data.

    Science.gov (United States)

    Mohammad, Fahim; Theisen-Toupal, Jesse C; Arnaout, Ramy

    2014-01-01

    Laboratory testing is the single highest-volume medical activity, making it useful to ask how well one can anticipate whether a given test result will be high, low, or within the reference interval ("normal"). We analyzed 10 years of electronic health records--a total of 69.4 million blood tests--to see how well standard rule-mining techniques can anticipate test results based on patient age and gender, recent diagnoses, and recent laboratory test results. We evaluated rules according to their positive and negative predictive value (PPV and NPV) and area under the receiver-operator characteristic curve (ROC AUCs). Using a stringent cutoff of PPV and/or NPV≥0.95, standard techniques yield few rules for sendout tests but several for in-house tests, mostly for repeat laboratory tests that are part of the complete blood count and basic metabolic panel. Most rules were clinically and pathophysiologically plausible, and several seemed clinically useful for informing pre-test probability of a given result. But overall, rules were unlikely to be able to function as a general substitute for actually ordering a test. Improving laboratory utilization will likely require different input data and/or alternative methods.
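
    The screening criterion described above reduces to computing PPV and NPV from each candidate rule's confusion counts against observed results. A minimal sketch, with invented counts and a hypothetical rule name:

```python
# PPV/NPV screening of a candidate rule from its confusion counts.
# The counts and the rule described in the comment are invented.

def ppv_npv(tp, fp, tn, fn):
    ppv = tp / (tp + fp) if tp + fp else float("nan")
    npv = tn / (tn + fn) if tn + fn else float("nan")
    return ppv, npv

# hypothetical rule: "recent potassium high -> repeat potassium will be high"
ppv, npv = ppv_npv(tp=190, fp=10, tn=760, fn=40)
print(ppv, npv)   # rule passes the stringent screen if PPV and/or NPV >= 0.95
```

    Note that both metrics depend on the prevalence of abnormal results in the tested population, which is one reason rules mined this way inform pre-test probability rather than replace testing.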

  8. Transforming process models : executable rewrite rules versus a formalized Java program

    NARCIS (Netherlands)

    Van Gorp, P.M.E.; Eshuis, H.; Petriu, D.C.; Rouquette, N.

    2010-01-01

    In the business process management community, transformations for process models are usually programmed using imperative languages (such as Java). The underlying mapping rules tend to be documented using informal visual rules whereas they tend to be formalized using mathematical set constructs. In

  9. Transforming process models : executable rewrite rules versus a formalized Java program

    NARCIS (Netherlands)

    Van Gorp, P.M.E.; Eshuis, H.

    2010-01-01

    In the business process management community, transformations for process models are usually programmed using imperative languages. The underlying mapping rules tend to be documented using informal visual rules whereas they tend to be formalized using mathematical set constructs. In the Graph and

  10. Comparison of Heuristics for Inhibitory Rule Optimization

    KAUST Repository

    Alsolami, Fawaz

    2014-09-13

    Knowledge representation and extraction are very important tasks in data mining. In this work, we propose a variety of rule-based greedy algorithms that are able to represent the knowledge contained in a given dataset as a series of inhibitory rules, each containing an expression "attribute ≠ value" on the right-hand side. The main goal of this paper is to determine, based on the rule characteristics of length and coverage, whether the proposed rule heuristics are statistically significantly different; if so, we aim to identify the best-performing rule heuristics for minimization of rule length and maximization of rule coverage. The Friedman test with the Nemenyi post-hoc test is used to compare the greedy algorithms statistically against each other for length and coverage. The experiments are carried out on real datasets from the UCI Machine Learning Repository. For the leading heuristics, the constructed rules are compared with optimal ones obtained using a dynamic programming approach. The results seem promising for the best heuristics: the average relative difference between the length (coverage) of constructed and optimal rules is at most 2.27% (7%, respectively). Furthermore, the quality of classifiers based on the sets of inhibitory rules constructed by the considered heuristics is compared, and the results show that the three best heuristics from the point of view of classification accuracy coincide with the three best-performing heuristics from the point of view of rule length minimization.
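
    The Friedman test used above ranks the algorithms within each dataset and checks whether the mean ranks differ more than chance would allow. A small numpy sketch, with invented scores for three hypothetical heuristics (a real comparison would also apply the Nemenyi post-hoc test and, for few datasets, exact critical-value tables rather than the chi-square approximation):

```python
import numpy as np

# Friedman statistic over N datasets x k algorithms (larger score = better).
# Scores are invented; ties are assumed absent for simplicity.

def friedman_statistic(scores):
    n, k = scores.shape
    ranks = k - np.argsort(np.argsort(scores, axis=1), axis=1)  # rank 1 = best
    mean_ranks = ranks.mean(axis=0)
    return 12 * n / (k * (k + 1)) * (np.sum(mean_ranks ** 2) - k * (k + 1) ** 2 / 4)

scores = np.array([   # rows: datasets, cols: three hypothetical heuristics
    [0.90, 0.85, 0.70],
    [0.88, 0.80, 0.75],
    [0.95, 0.90, 0.60],
    [0.85, 0.86, 0.70],
])
stat = friedman_statistic(scores)
print(stat)   # compare to the chi-square critical value with k-1 = 2 dof
```

    Here the statistic (6.5) exceeds the chi-square 5% critical value for 2 degrees of freedom (about 5.99), so the null hypothesis of identical performance would be rejected on these toy numbers.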

  11. Mathematical modelling of contact of ruled surfaces: theory and practical application

    Science.gov (United States)

    Panchuk, K. L.; Niteyskiy, A. S.

    2016-04-01

    In the theory of ruled surfaces, the contact of ruled surfaces along their common generator line is well studied (the Klein image is often used [1]). In this paper we propose a study of the contact of non-developable ruled surfaces via the dual vector calculus. The advantages of this method were demonstrated by E. Study, W. Blaschke and D. N. Zeiliger in differential-geometry studies of ruled surfaces in the space R3 over the algebra of dual numbers. A practical use of contact is demonstrated by modeling the working surface of a progressive tillage tool.
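
    The dual-number algebra underlying this calculus is small enough to implement directly: numbers of the form a + εb with ε² = 0. The toy class below shows only the arithmetic (in line geometry a dual angle packs the angle and the distance between two lines into one such number; the full dual vector calculus of the paper is not reproduced here).

```python
# Minimal dual-number arithmetic: a + eps*b with eps**2 = 0.

class Dual:
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b            # real part, dual part

    def __add__(self, other):
        return Dual(self.a + other.a, self.b + other.b)

    def __mul__(self, other):
        # (a1 + eps b1)(a2 + eps b2) = a1*a2 + eps*(a1*b2 + b1*a2)
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)

eps = Dual(0.0, 1.0)
sq = eps * eps
print(sq.a, sq.b)        # eps squared vanishes: both parts are 0.0

# a well-known side effect of eps**2 = 0: evaluating a polynomial at
# (x + eps) carries the derivative in the dual part
x = Dual(3.0, 1.0)
y = x * x                # dual part of x^2 at 3 is 2*3 = 6
print(y.a, y.b)
```

    This nilpotency is what lets dual-valued formulas treat the angular and linear parts of ruled-surface quantities in one computation.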

  12. Tutorial on Modeling VAT Rules Using OWL-DL

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    In an ERP setting such a model could reduce the Total Cost of Ownership (TCO) and increase the quality of the system. We have selected OWL-DL because we believe that description logic is suited for modeling VAT rules due to the decidability of important inference problems that are key to the way we plan ... to use the model and because OWL-DL is relatively intuitive to use ...

  13. Multiphase pumping: indoor performance test and oilfield application

    Science.gov (United States)

    Kong, Xiangling; Zhu, Hongwu; Zhang, Shousen; Li, Jifeng

    2010-03-01

    Multiphase pumping is essentially a means of adding energy to the unprocessed effluent, which enables the liquid and gas mixture to be transported over long distances without prior separation. A reduction, consolidation, or elimination of production infrastructure, such as separation equipment and offshore platforms, allows fields to be developed more economically. Multiphase pumping has also successfully lowered the backpressure of wells, revived dead wells, and improved oilfield production and efficiency. This paper reviews issues related to the indoor performance test and an oilfield application of the helico-axial multiphase pump designed by China University of Petroleum (Beijing). The pump specification and its hydraulic design are given. Results of performance testing under different conditions, such as operational speed and gas volume fraction (GVF), are presented. Experimental studies combined with theoretical analysis showed that the multiphase pump satisfies the similitude rule, which can be used in the development of new multiphase pump designs and in performance prediction. Test results showed that raising the rotation speed and suction pressure improves performance: the pressure boost increases, the high-efficiency zone expands, and the flow rate at the optimum working condition increases. The pump became unstable as the GVF increased beyond a certain level, and slip between the two phases created surging and gas lock at high GVF. A case of application in the Nanyang oilfield is also studied.

  14. Testing a model of intonation in a tone language.

    Science.gov (United States)

    Lindau, M

    1986-09-01

    Schematic fundamental frequency curves of simple statements and questions are generated for Hausa, a two-tone language of Nigeria, using a modified version of an intonational model developed by Gårding and Bruce [Nordic Prosody II, edited by T. Fretheim (Tapir, Trondheim, 1981), pp. 33-39]. In this model, rules for intonation and tones are separated. Intonation is represented as sloping grids of (near) parallel lines, inside which tones are placed. The tones are associated with turning points of the fundamental frequency contour. Local rules may also modify the exact placement of a tone within the grid. The continuous fundamental frequency contour is modeled by concatenating the tonal points using polynomial equations. Thus the final pitch contour is modeled as an interaction between global and local factors. The slope of the intonational grid lines depends at least on sentence type (statement or question), sentence length, and tone pattern. The model is tested by reference to data from nine speakers of Kano Hausa.
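
    The model's two ingredients, a sloping intonation grid and tonal turning points concatenated inside it, can be sketched numerically. All times, frequencies and the declination slope below are invented illustrative numbers, not Hausa data, and linear interpolation stands in for the paper's polynomial concatenation.

```python
import numpy as np

# Toy F0 contour: local tonal turning points interpolated in time, with a
# global sloping grid (declination) superimposed. Numbers are invented.

def f0_contour(turning_points, slope, t):
    """turning_points: (time, local F0) pairs; slope: declination in Hz/s."""
    times, f0 = zip(*turning_points)
    local = np.interp(t, times, f0)   # concatenate between turning points
    return local + slope * t          # superimpose the global grid

points = [(0.0, 120.0), (0.3, 150.0), (0.6, 110.0), (1.0, 130.0)]
t = np.array([0.0, 0.3, 1.0])
print(f0_contour(points, slope=-20.0, t=t))
```

    Separating the two terms mirrors the model's separation of rules: changing only `slope` turns a statement-like falling grid into a question-like level or rising one while the tonal turning points stay fixed.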

  15. Quantifying Listeria monocytogenes prevalence and concentration in minced pork meat and estimating performance of three culture media from presence/absence microbiological testing using a deterministic and stochastic approach.

    Science.gov (United States)

    Andritsos, Nikolaos D; Mataragas, Marios; Paramithiotis, Spiros; Drosinos, Eleftherios H

    2013-12-01

    Listeria monocytogenes poses a serious threat to public health, and the majority of cases of human listeriosis are associated with contaminated food. Reliable microbiological testing is needed for effective pathogen control by the food industry and competent authorities. The aims of this work were to estimate the prevalence and concentration of L. monocytogenes in minced pork meat by the application of a Bayesian modeling approach, and also to determine the performance of three culture media commonly used for detecting L. monocytogenes in foods, from both a deterministic and a stochastic perspective. Samples (n = 100) collected from local markets were tested for L. monocytogenes using in parallel the PALCAM, ALOA and RAPID'L.mono selective media according to the ISO 11290-1:1996 and 11290-2:1998 methods. Presence of the pathogen was confirmed by conducting biochemical and molecular tests. Independent experiments (n = 10) were performed for model validation purposes. Performance attributes were calculated from the presence/absence microbiological test results by combining the results obtained from the culture media and the confirmatory tests. The Dirichlet distribution, the multivariate expression of the Beta distribution, was used to analyze the performance data from a stochastic perspective. No L. monocytogenes was enumerated by direct plating; the media were better at ruling in L. monocytogenes presence than at ruling it out. Sensitivity and specificity varied depending on the culture-dependent method. None of the culture media alone was perfect in detecting L. monocytogenes in minced pork meat. The use of at least two culture media in parallel enhanced the efficiency of L. monocytogenes detection. Bayesian modeling may reduce the time needed to draw conclusions regarding L. monocytogenes presence and the uncertainty of the results obtained. Furthermore, the problem of observing zero counts may be overcome by applying Bayesian analysis, making the determination of test performance feasible.
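
    The Bayesian prevalence idea can be shown in its simplest form: with a Beta(1, 1) prior, observing k positives among n presence/absence tests gives a Beta(1 + k, 1 + n - k) posterior for prevalence. This toy assumes a perfect test and uses invented counts; the paper's actual model additionally handles imperfect sensitivity/specificity via the Dirichlet distribution.

```python
# Beta-binomial sketch of Bayesian prevalence estimation from
# presence/absence results. Counts are invented; a perfect test is
# assumed (the paper's Dirichlet model relaxes this).

def beta_posterior_mean(k, n, a=1.0, b=1.0):
    """Posterior mean prevalence under a Beta(a, b) prior."""
    return (a + k) / (a + b + n)

print(beta_posterior_mean(k=34, n=100))   # posterior mean prevalence
```

    The same conjugate update also explains the zero-count remark above: with k = 0 the posterior mean stays strictly positive, so an all-negative sample still yields a usable prevalence estimate instead of an undefined one.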

  16. Topic categorisation of statements in suicide notes with integrated rules and machine learning.

    Science.gov (United States)

    Kovačević, Aleksandar; Dehghan, Azad; Keane, John A; Nenadic, Goran

    2012-01-01

    We describe and evaluate an automated approach used as part of the i2b2 2011 challenge to identify and categorise statements in suicide notes into one of 15 topics, including Love, Guilt, Thankfulness, Hopelessness and Instructions. The approach combines a set of lexico-syntactic rules with a set of models derived by machine learning from a training dataset. The machine learning models rely on named entities, lexical, lexico-semantic and presentation features, as well as the rules that are applicable to a given statement. On a testing set of 300 suicide notes, the approach showed the overall best micro F-measure of up to 53.36%. The best precision achieved was 67.17% when only rules are used, whereas best recall of 50.57% was with integrated rules and machine learning. While some topics (eg, Sorrow, Anger, Blame) prove challenging, the performance for relatively frequent (eg, Love) and well-scoped categories (eg, Thankfulness) was comparatively higher (precision between 68% and 79%), suggesting that automated text mining approaches can be effective in topic categorisation of suicide notes.

  17. Preliminary tests of a model of cooling-pond thermal performance

    International Nuclear Information System (INIS)

    Hicks, B.B.; Wesely, M.L.; Wilczek, J.

    1975-01-01

    Experiments performed during recent years at the cooling pond complex at the Dresden nuclear power station have been designed to improve our understanding of the fundamental properties of thermal exchange at a warm-water surface. To a considerable extent, the field studies have been successful in that they have shown that modern micrometeorological techniques can be successfully applied to the demanding circumstances of an industrial cooling lake at temperatures of at least 40 °C. The intent of these studies has been to create a set of parameterization schemes good enough to allow simulation of the performance of the Dresden cooling lake without adjustment of numerical constants. An obvious extension of these studies, and one of the goals of the cooling-pond research program as presently stated, is to obtain an accurate numerical simulation of the thermal performance of ponds using the improved formulations that have resulted from the experimental work at the Dresden lake. The computer model is divided into two sections and can be used to test the sensitivity of predicted performance to variations in the procedures for determining the thermal transfer from the surface

  18. Rule-bases construction through self-learning for a table-based Sugeno-Takagi fuzzy logic control system

    Directory of Open Access Journals (Sweden)

    C. Boldisor

    2009-12-01

    Full Text Available A self-learning methodology for building the rule-base of a fuzzy logic controller (FLC) is presented and verified, aiming to endow a fuzzy logic control system with intelligent characteristics. The methodology is a simplified version of those presented in today's literature. Some aspects are intentionally ignored, since they rarely appear in control system engineering, and a SISO process is considered here. The fuzzy inference system obtained is of the table-based Sugeno-Takagi type. The system's desired performance is defined by a reference model, and rules are extracted from recorded data after the correct control actions are learned. The presented algorithm is tested in constructing the rule-base of a fuzzy controller for a DC drive application. The system's performance and the method's viability are analyzed.

  19. 47 CFR 76.1905 - Petitions to modify encoding rules for new services within defined business models.

    Science.gov (United States)

    2010-10-01

    ... services within defined business models. 76.1905 Section 76.1905 Telecommunication FEDERAL COMMUNICATIONS... Rules § 76.1905 Petitions to modify encoding rules for new services within defined business models. (a) The encoding rules for defined business models in § 76.1904 reflect the conventional methods for...

  20. Generating Concise Rules for Human Motion Retrieval

    Science.gov (United States)

    Mukai, Tomohiko; Wakisaka, Ken-Ichi; Kuriyama, Shigeru

    This paper proposes a method for retrieving human motion data with concise retrieval rules based on the spatio-temporal features of motion appearance. Our method first converts a motion clip into a clausal language that represents geometrical relations between body parts and their temporal relationships. A retrieval rule is then learned from a set of manually classified examples using inductive logic programming (ILP). ILP automatically discovers the essential rule in the same clausal form with a user-defined hypothesis-testing procedure. All motions are indexed using this clausal language, and the desired clips are retrieved by subsequence matching using the rule. Such rule-based retrieval offers reasonable performance, and the rule can be intuitively edited in the same language form. Consequently, our method enables efficient and flexible search of a large dataset with a simple query language.

  1. Testing proton spin models with polarized beams

    International Nuclear Information System (INIS)

    Ramsey, G.P.

    1991-01-01

    We review models for spin-weighted parton distributions in a proton. Sum rules involving the nonsinglet components of the structure function xg_1^p help narrow the range of parameters in these models. The contribution of the γ_5 anomaly term depends on the size of the integrated polarized gluon distribution, and experimental predictions depend on its size. We have proposed three models for the polarized gluon distribution, whose range is considerable. These model distributions give an overall range of parameters that can be tested with polarized beam experiments. These are discussed with regard to specific predictions for polarized beam experiments at energies typical of UNK.

  2. Improving cloud network security using tree-rule firewall

    NARCIS (Netherlands)

    He, Xiangjian; Chomsiri, Thawatchai; Nanda, Priyadarsi; Tan, Zhiyuan

    This study proposes a new model of firewall called the ‘Tree-Rule Firewall’, which offers various benefits and is applicable for large networks such as ‘cloud’ networks. The recently available firewalls (i.e., Listed-Rule firewalls) have their limitations in performing the tasks and are inapplicable

  3. 47 CFR 76.1904 - Encoding rules for defined business models.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Encoding rules for defined business models. 76... defined business models. (a) Commercial audiovisual content delivered as unencrypted broadcast television... the Commission pursuant to a petition with respect to a defined business model other than unencrypted...

  4. Bisphenol A; Final Test Rule

    Science.gov (United States)

    EPA is issuing a final rule, under section 4 of the Toxic Substances Control Act (TSCA), requiring manufacturers and processors of bisphenol A, hereinafter BPA (4,4'-isopropylidenediphenol, CAS No. 80-05-7), to conduct a 90-day inhalation study.

  5. Implementation and automated validation of the minimal Z' model in FeynRules

    International Nuclear Information System (INIS)

    Basso, L.; Christensen, N.D.; Duhr, C.; Fuks, B.; Speckner, C.

    2012-01-01

    We describe the implementation of a well-known class of U(1) gauge models, the 'minimal' Z' models, in FeynRules. We also describe a new automated validation tool for FeynRules models which is controlled by a web interface and allows the user to run a complete set of 2 → 2 processes on different matrix element generators, in different gauges, and compare between them all. Where they exist, comparisons with independent implementations are also possible. This tool has been used to validate our implementation of the 'minimal' Z' models. (authors)

  6. Experimental test of Neel's theory of the Rayleigh rule using gradually devitrified Co-based glass

    International Nuclear Information System (INIS)

    Lachowicz, H.K.

    2000-01-01

    It is shown that gradually devitrified Co-based nonmagnetostrictive metallic glass is an excellent model material for verifying Louis Neel's theory of the Rayleigh rule. In the course of the calculations, Neel showed that the parameter p = bH_c/a (where H_c is the coercivity and a and b are the coefficients of the quadratic polynomial expressing the Rayleigh rule) is expected to range between 0.6 (hard magnets) and 1.6 (soft magnets). However, the experimental values of this parameter reported in the literature for a number of mono- and poly-crystalline magnets are much greater than those expected from Neel's theory (in some cases even by two orders of magnitude). The measurements, performed for a series of Co-based metallic glass samples annealed at gradually increasing temperatures to produce nanocrystalline structures with differing density and size of the crystallites, have shown that the calculated values of the parameter p fall within the range expected from Neel's theory.

  7. Progress in Developing Finite Element Models Replicating Flexural Graphite Testing

    International Nuclear Information System (INIS)

    Bratton, Robert

    2010-01-01

    This report documents the status of flexural strength evaluations from current ASTM procedures and of developing finite element models predicting the probability of failure. This work is covered under QLD REC-00030. Flexural testing procedures of the American Society for Testing and Materials (ASTM) assume a linear elastic material that has the same modulus in tension and compression. Contrary to this assumption, graphite is known to have different moduli in tension and compression. A finite element model was developed and demonstrated that accounts for this difference. Brittle materials such as graphite exhibit significant scatter in tensile strength, so probabilistic design approaches must be used when designing components fabricated from brittle materials. ASTM procedures for predicting the probability of failure in ceramics were compared to methods from the current version of the ASME graphite core component rules. Using the ASTM procedures yields failure curves at lower applied forces than the ASME rules. A journal paper exploring statistical models of fracture in graphite was published in Nuclear Engineering and Design.
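The probabilistic design approaches mentioned here typically rest on Weibull statistics for brittle fracture. As an illustrative sketch, a generic two-parameter Weibull form (this is not the specific ASTM or ASME procedure, and the parameter values below are invented):

```python
import math

def weibull_failure_probability(stress, sigma_0, m):
    """Two-parameter Weibull probability of failure for a brittle material:
    P_f = 1 - exp(-(stress / sigma_0)^m), where sigma_0 is the characteristic
    strength and m is the Weibull modulus (scatter parameter)."""
    return 1.0 - math.exp(-((stress / sigma_0) ** m))

# Hypothetical graphite-like values: sigma_0 = 30 MPa, m = 10
print(weibull_failure_probability(25.0, 30.0, 10.0))  # about 0.149
```

A larger Weibull modulus m means less scatter in strength, so the failure probability rises more sharply around the characteristic strength.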

  8. Causal judgment from contingency information: a systematic test of the pCI rule.

    Science.gov (United States)

    White, Peter A

    2004-04-01

    Contingency information is information about the occurrence or nonoccurrence of an effect when a possible cause is present or absent. Under the evidential evaluation model, instances of contingency information are transformed into evidence and causal judgment is based on the proportion of relevant instances evaluated as confirmatory for the candidate cause. In this article, two experiments are reported that were designed to test systematic manipulations of the proportion of confirming instances in relation to other variables: the proportion of instances on which the candidate cause is present, the proportion of instances in which the effect occurs when the cause is present, and the objective contingency. Results showed that both unweighted and weighted versions of the proportion-of-confirmatory-instances rule successfully predicted the main features of the results, with the weighted version proving more successful. Other models, including the power PC theory, failed to predict the results.
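The unweighted and weighted proportion-of-confirmatory-instances rule can be sketched directly from a 2x2 contingency table; the cell labels follow the usual a/b/c/d convention, and the counts and weights below are illustrative, not taken from the experiments:

```python
def pci(a, b, c, d, weights=(1.0, 1.0, 1.0, 1.0)):
    """Proportion of confirmatory instances from a 2x2 contingency table.

    a: cause present, effect present   (confirmatory)
    b: cause present, effect absent    (disconfirmatory)
    c: cause absent, effect present    (disconfirmatory)
    d: cause absent, effect absent     (confirmatory)
    weights: per-cell evidential weights (the weighted variant of the rule).
    """
    wa, wb, wc, wd = weights
    confirm = wa * a + wd * d
    total = wa * a + wb * b + wc * c + wd * d
    return confirm / total

# Unweighted: 16 a-type, 4 b-type, 4 c-type and 16 d-type instances
print(pci(16, 4, 4, 16))  # 0.8
```

Weighting cell a more heavily than the others shifts the judgment toward cause-present/effect-present evidence, which is the kind of asymmetry the weighted version captures.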

  9. A Hierarchal Risk Assessment Model Using the Evidential Reasoning Rule

    Directory of Open Access Journals (Sweden)

    Xiaoxiao Ji

    2017-02-01

    Full Text Available This paper aims to develop a hierarchical risk assessment model using the newly-developed evidential reasoning (ER) rule, which constitutes a generic conjunctive probabilistic reasoning process. In this paper, we first provide a brief introduction to the basics of the ER rule and emphasize its strengths in representing and aggregating uncertain information from multiple experts and sources. Further, we discuss the key steps of developing the hierarchical risk assessment framework systematically, including (1) formulation of the risk assessment hierarchy; (2) representation of both qualitative and quantitative information; (3) elicitation of attribute weights and information reliabilities; (4) aggregation of assessment information using the ER rule; and (5) quantification and ranking of risks using utility-based transformation. The proposed hierarchical risk assessment framework can potentially be applied to various complex and uncertain systems. A case study on the fire/explosion risk assessment of marine vessels demonstrates the applicability of the proposed risk assessment model.
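For intuition, the conjunctive aggregation at the core of the ER rule can be illustrated, in much simplified form, by Dempster's rule of combination over a small frame of discernment; the full ER rule additionally handles per-source weights and reliabilities, which this sketch omits, and the risk grades and masses below are invented:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dict: frozenset -> mass) by Dempster's
    rule: multiply masses pairwise, keep non-empty intersections, and
    renormalize by the total non-conflicting mass. A much-simplified
    stand-in for the ER rule's conjunctive aggregation step."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}

low, high = frozenset({"low"}), frozenset({"high"})
both = low | high
expert1 = {low: 0.6, both: 0.4}   # 0.4 left uncommitted (ignorance)
expert2 = {low: 0.5, high: 0.3, both: 0.2}
print(dempster_combine(expert1, expert2))
```

Two experts who both lean toward "low risk" reinforce each other, while their disagreement (0.6 x 0.3 of conflicting mass) is discarded and the remainder renormalized.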

  10. Review of Bose-Fermi and ''Supersymmetry'' models; problems in particle transfer tests

    International Nuclear Information System (INIS)

    Vergnes, M.

    1986-01-01

    The first case suggested for a supersymmetry in nuclei was that of a j = 3/2 particle coupled to an O(6) core. A more recent and elaborate scheme is the ''multi-j'' supersymmetry, describing the coupling of a particle in more than just one orbital with the three possible cores of the interacting boson model. A general survey of the particle transfer tests of these different models is presented and the results are summarized. A comparison of IBFM-2 calculations with experimental data is discussed, as well as results of sum rules analysis. Present and future tests concerning extensions of the above-mentioned models, particularly to odd-odd nuclei, are briefly indicated. It appears necessary to determine clearly whether the origin of the difficulties outlined for transfer reactions indeed lies, as often suggested, in the simplified form of the transfer operator used in deriving the selection rules, and not in the models themselves.

  11. Model-Scale Aerodynamic Performance Testing of Proposed Modifications to the NASA Langley Low Speed Aeroacoustic Wind Tunnel

    Science.gov (United States)

    Booth, Earl R., Jr.; Coston, Calvin W., Jr.

    2005-01-01

    Tests were performed on a 1/20th-scale model of the Low Speed Aeroacoustic Wind Tunnel to determine the performance effects of inserting acoustic baffles in the tunnel inlet, replacing the existing collector with a new collector design in the open jet test section, and adding flow splitters to the acoustic baffle section downstream of the test section. As expected, the inlet baffles caused a reduction in facility performance. About half of the performance loss was recovered by adding the flow splitters to the downstream baffles. All collectors tested reduced facility performance. However, test chamber recirculation flow was reduced by the new collector designs, and shielding of some of the microphones was reduced owing to the smaller size of the new collector. The overall performance loss in the facility is expected to be a 5 percent reduction in top flow speed, but the facility will meet OSHA limits for external noise levels, and recirculation in the test section will be reduced.

  12. Performance test results of mock-up model test facility with a full-scale reaction tube for HTTR hydrogen production system. Contract research

    Energy Technology Data Exchange (ETDEWEB)

    Inagaki, Yoshiyuki; Hayashi, Koji; Kato, Michio [Japan Atomic Energy Research Inst., Oarai, Ibaraki (Japan). Oarai Research Establishment] [and others]

    2003-03-01

    Research on a hydrogen production system based on steam reforming of methane (chemical reaction CH{sub 4} + H{sub 2}O {yields} 3H{sub 2} + CO) has been carried out for coupling with the HTTR, to establish high-temperature nuclear heat utilization technology and to contribute to a future hydrogen energy society. The mock-up test facility with a full-scale reaction tube, a model simulating one reaction tube of a steam reformer of the HTTR hydrogen production system in full scale, was fabricated to perform tests on controllability, hydrogen production performance, etc. under the same pressure and temperature conditions as those of the HTTR hydrogen production system. The design and fabrication of the test facility started in 1997, and all components were installed by September 2001. In a performance test conducted from October 2001 to February 2002, the performance of each component was examined and hydrogen production of 120 m{sup 3}{sub N}/h was successfully achieved with high-temperature helium gas. This report describes the performance test results on component performance, hydrogen production characteristics, etc., as well as the main troubles and countermeasures. (author)

  13. Performance test results of mock-up model test facility with a full-scale reaction tube for HTTR hydrogen production system. Contract research

    International Nuclear Information System (INIS)

    Inagaki, Yoshiyuki; Hayashi, Koji; Kato, Michio

    2003-03-01

    Research on a hydrogen production system based on steam reforming of methane (chemical reaction CH4 + H2O → 3H2 + CO) has been carried out for coupling with the HTTR, to establish high-temperature nuclear heat utilization technology and to contribute to a future hydrogen energy society. The mock-up test facility with a full-scale reaction tube, a model simulating one reaction tube of a steam reformer of the HTTR hydrogen production system in full scale, was fabricated to perform tests on controllability, hydrogen production performance, etc. under the same pressure and temperature conditions as those of the HTTR hydrogen production system. The design and fabrication of the test facility started in 1997, and all components were installed by September 2001. In a performance test conducted from October 2001 to February 2002, the performance of each component was examined and hydrogen production of 120 Nm³/h was successfully achieved with high-temperature helium gas. This report describes the performance test results on component performance, hydrogen production characteristics, etc., as well as the main troubles and countermeasures. (author)

  14. Modeling and fuzzy control of the engine coolant conditioning system in an IC engine test bed

    International Nuclear Information System (INIS)

    Mohtasebi, Seyed Saeid; Shirazi, Farzad A.; Javaheri, Ahmad; Nava, Ghodrat Hamze

    2010-01-01

    Mechanical and thermodynamic performance of internal combustion engines is significantly affected by the engine working temperature. In an engine test bed, internal combustion engines are tested in different operating conditions using a dynamometer. The engine temperature must be controlled precisely, particularly in transient states. This precise control can be achieved by an engine coolant conditioning system consisting mainly of a heat exchanger, a control valve and a controller. In this study, the constitutive equations of the system are derived first. These differential equations show the second-order nonlinear time-varying dynamics of the system. The model is validated against experimental data, providing satisfactory results. After presenting the dynamic equations of the system, a fuzzy controller is designed based on prior knowledge of the system. The fuzzy rules and the membership functions are derived by trial and error and heuristics. Because of the nonlinear nature of the system, the fuzzy rules are set to satisfy the requirements of temperature control over the engine's different operating conditions. The performance of the fuzzy controller is compared with that of a PI controller for different transient conditions. The simulation results show the better performance of the fuzzy controller: its main advantages are a shorter settling time, a smaller overshoot and improved performance in the transient states of the system.
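The flavour of such a rule-based temperature controller can be sketched with a zero-order Sugeno scheme: triangular memberships over the temperature error, three rules, and weighted-average defuzzification. The universes, rules and output gains here are invented for illustration and are not the controller from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_valve_step(error):
    """Zero-order Sugeno controller: map a coolant temperature error (degC)
    to a normalized valve-opening change via three rules and
    weighted-average defuzzification."""
    # Rule antecedents: error is Negative / Zero / Positive
    mu_neg = tri(error, -10.0, -5.0, 0.0)
    mu_zero = tri(error, -5.0, 0.0, 5.0)
    mu_pos = tri(error, 0.0, 5.0, 10.0)
    # Rule consequents: close valve / hold / open valve
    strengths = (mu_neg, mu_zero, mu_pos)
    outputs = (-1.0, 0.0, 1.0)
    total = sum(strengths)
    if total == 0.0:
        return 0.0  # outside the universe of discourse: hold
    return sum(s * o for s, o in zip(strengths, outputs)) / total

print(fuzzy_valve_step(2.5))  # 0.5: partially open the valve
```

In practice a table-based rule set over both error and its derivative would be used, as in the paper, but the interpolation mechanism is the same.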

  15. MODELING MONETARY POLICY RULES IN THE MENACOUNTRIES: ISSUES AND EVIDENCE

    Directory of Open Access Journals (Sweden)

    Mohamad Husam Helmi

    2011-07-01

    Full Text Available This paper estimates the monetary policy reaction function for two sets of MENA countries: the inflation-targeting countries (Turkey and Israel) and the exchange-rate-targeting countries (Jordan and Morocco). We motivate our empirical analysis by analyzing a simple Taylor rule. This model looks at the effects of inflation and output on the setting of the interest rate by the central bank. Furthermore, we extend our model by adding the exchange rate and the foreign interest rate, using a model similar to that of Clarida et al (1998) with a GMM estimator. The findings of this study yield some interesting results: all the central banks in the sample use interest rate smoothing in managing their monetary policy. In addition, the central banks of Turkey, Israel and Morocco focus on achieving a low level of inflation, whereas the monetary authority in Jordan cares about stabilizing the output gap. Estimating the extended Taylor rule suggests a highly significant effect of the foreign interest rate on the setting of the interest rate in Turkey. Taken together, the results lend support to the importance of following a rule rather than discretion in reducing the inflation rate and in achieving credible monetary policy. Finally, the simple Taylor rule can be applied to MENA countries, but it requires some modification, such as adding the exchange rate and the foreign interest rate.
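The simple Taylor rule that motivates the analysis can be written down directly. A sketch of the textbook Taylor (1993) parameterization follows; the paper's estimated specification additionally includes interest rate smoothing, the exchange rate and the foreign interest rate:

```python
def taylor_rate(inflation, output_gap, r_star=2.0, pi_target=2.0,
                a_pi=0.5, a_y=0.5):
    """Textbook Taylor (1993) rule for the nominal policy rate (percent):
    i = pi + r* + a_pi * (pi - pi*) + a_y * output_gap,
    with the classic coefficients a_pi = a_y = 0.5."""
    return inflation + r_star + a_pi * (inflation - pi_target) + a_y * output_gap

# 3% inflation, output 1% above potential -> 6.0% policy rate
print(taylor_rate(3.0, 1.0))  # 6.0
```

With inflation at target and a closed output gap, the rule returns the neutral nominal rate r* + pi* (4.0% here).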

  16. 47 CFR 76.1906 - Encoding rules for undefined business models.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Encoding rules for undefined business models... for undefined business models. (a) Upon public notice and subject to requirements as set forth herein, a covered entity may launch a program service pursuant to an undefined business model. Subject to...

  17. Tier 1 and Tier 3 eAdjudication Business Rule Validation

    Science.gov (United States)

    2018-04-01

    correct rejections. • Research ways to safely approve more cases through eAdjudication. PERSEREC has established a business rule test environment that can... Performing organization: Defense Personnel and Security Research Center, Office of People Analytics. ...interagency working group of personnel security and suitability experts on business rule development for T3 and T3R. The results of rule development and

  18. The Performance test of Mechanical Sodium Pump with Water Environment

    International Nuclear Information System (INIS)

    Cho, Chungho; Kim, Jong-Man; Ko, Yung Joo; Jeong, Ji-Young; Kim, Jong-Bum; Ko, Bock Seong; Park, Sang Jun; Lee, Yoon Sang

    2015-01-01

    As contrasted with the PWR (pressurized light water reactor), which uses water as a coolant, sodium is used as the coolant in an SFR because of its low melting temperature, high thermal conductivity, high boiling temperature (allowing the reactor to operate at ambient pressure) and low neutron absorption cross section, which is required to achieve a high neutron flux. However, like the other alkali metals, sodium reacts violently with water or oxygen, so very strict requirements are imposed on the design and fabrication of sodium experimental facilities. Furthermore, performance testing in a high-temperature sodium environment is expensive and time consuming and requires extra precautions, because operating and maintaining sodium experimental facilities is difficult. Water is therefore often selected as a surrogate test fluid: it is not only cheap, easily available and easy to handle, but its important hydraulic properties (density and kinematic viscosity) are also very similar to those of sodium. It is normal practice to thoroughly test a design or component before it is applied or installed in a reactor, to ensure the safety and operability of the sodium-cooled fast reactor (SFR). The present paper describes the performance test results of a mechanical sodium pump tested with water, after some design changes, using a water test facility at SAM JIN Industrial Co. To compare the hydraulic characteristics of the model pump in water and in sodium, the performance tests of the model pump were performed using the vendor's experimental facility for the mechanical sodium pump. To accommodate non-uniform thermal expansion and to secure operability and safety, the gap size of some parts of the original model pump was modified. Performance tests of the modified mechanical sodium pump with water were successfully performed in order to estimate the hydraulic behavior of the PHTS pump of the DSFR (600 MWe Demonstration SFR).

  19. Development of a rule-based diagnostic platform on an object-oriented expert system shell

    International Nuclear Information System (INIS)

    Wang, Wenlin; Yang, Ming; Seong, Poong Hyun

    2016-01-01

    Highlights: • Multilevel Flow Model represents system knowledge as a domain map in the expert system. • A rule-based fault diagnostic expert system can identify the root cause via a causal chain. • The rule-based fault diagnostic expert system can be used for fault simulation training. - Abstract: This paper presents the development and implementation of a real-time rule-based diagnostic platform. The knowledge was acquired from domain experts and textbooks, and the design of the fault diagnosis expert system was performed in the following ways: (i) establishing corresponding classes and instances to build the domain map, (ii) creating generic fault models based on events, and (iii) building diagnostic reasoning based on rules. Knowledge representation is a complicated issue for expert systems. One highlight of this paper is that the Multilevel Flow Model has been used to represent the knowledge, composing the domain map within the expert system as well as providing a concise description of the system. The developed platform is illustrated using the pressure safety system of a pressurized water reactor as an example of the simulation test bed; the platform is developed using the commercial and industrially validated software G2. The emulation test was conducted, and it has been proven that the fault diagnosis expert system can identify the faults correctly and in a timely way; this system can be used as a simulation-based training tool to assist operators in making better decisions.
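A rule-based diagnostic reasoner of this kind can be sketched as naive forward chaining over symptom facts until a root cause is derived; the rules and fact names below are toy examples, not the paper's G2 knowledge base:

```python
def forward_chain(facts, rules):
    """Naive forward-chaining inference: repeatedly fire any rule whose
    antecedents are all satisfied, until no new facts are derived.
    rules: list of (frozenset_of_antecedents, consequent)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True
    return facts

# Toy causal chain in the style of a pressure-safety-system diagnosis
rules = [
    (frozenset({"valve_stuck_open"}), "pressure_drop"),
    (frozenset({"pressure_drop", "flow_high"}), "leak_suspected"),
    (frozenset({"leak_suspected"}), "root_cause:valve"),
]
print(forward_chain({"valve_stuck_open", "flow_high"}, rules))
```

Tracing which rules fired yields the causal chain from symptom to root cause, which is what makes such systems usable for operator training.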

  20. A performance benchmark over semantic rule checking approaches in construction industry

    NARCIS (Netherlands)

    Pauwels, P.; de Farias, T.; Zhang, C.; Roxin, A.; Beetz, J.; De Roo, J.; Nicolle, C.

    2017-01-01

    As more and more architectural design and construction data is represented using the Resource Description Framework (RDF) data model, it makes sense to take advantage of the logical basis of RDF and implement a semantic rule checking process as it is currently not available in the architectural

  1. Ruling out the Weinberg Model of Spontaneous CP Violation

    International Nuclear Information System (INIS)

    Chang, Darwin

    2000-01-01

    There have been many declarations of the death of the Weinberg model of spontaneous CP violation. Previous studies, before the recent measurements of ε'/ε, indicated that the model could not accommodate the experimental values of ε in K^0 - anti-K^0 mixing, the neutron electric dipole moment (EDM), the branching ratio of b → sγ and the upper limit on ε'/ε. The authors point out that these studies were based on optimistic estimates of the uncertainties in the calculations, and that when more realistic estimates of these errors are used the Weinberg model cannot be conclusively ruled out from these considerations alone. Here we use these realistic error estimates to analyze the present situation of the Weinberg model. The latest results from Belle and BaBar on sin 2β allow the small values of this parameter which occur naturally in the Weinberg model. However, in this model, the recently measured value of Re(ε'/ε) = (1.92 ± 0.25) x 10^-3 cannot be made compatible with the branching ratio B(b → sγ) = (3.15 ± 0.54) x 10^-4. As a result, they conclude that the Weinberg model is now confidently and conservatively ruled out.

  2. Ruling out the Weinberg Model of Spontaneous CP Violation

    International Nuclear Information System (INIS)

    Chang, Darwin

    2000-01-01

    There have been many declarations of the death of the Weinberg model of spontaneous CP violation. Previous studies, before the recent measurements of ε'/ε, indicated that the model could not accommodate the experimental values of ε in K^0 - anti-K^0 mixing, the neutron electric dipole moment (EDM), the branching ratio of b → sγ and the upper limit on ε'/ε. We point out that these studies were based on optimistic estimates of the uncertainties in the calculations, and that when more realistic estimates of these errors are used the Weinberg model cannot be conclusively ruled out from these considerations alone. Here we use these realistic error estimates to analyze the present situation of the Weinberg model. The latest results from Belle and BaBar on sin 2β allow the small values of this parameter which occur naturally in the Weinberg model. However, in this model, the recently measured value of Re(ε'/ε) = (1.92 ± 0.25) x 10^-3 cannot be made compatible with the branching ratio B(b → sγ) = (3.15 ± 0.54) x 10^-4. As a result, we conclude that the Weinberg model is now confidently and conservatively ruled out.

  3. Progress in sensor performance testing, modeling and range prediction using the TOD method: an overview

    Science.gov (United States)

    Bijl, Piet; Hogervorst, Maarten A.; Toet, Alexander

    2017-05-01

    The Triangle Orientation Discrimination (TOD) methodology includes i) a widely applicable, accurate end-to-end EO/IR sensor test, ii) an image-based sensor system model and iii) a Target Acquisition (TA) range model. The method has been extensively validated against TA field performance for a wide variety of well- and under-sampled imagers, systems with advanced image processing techniques such as dynamic super resolution and local adaptive contrast enhancement, and sensors showing smear or noise drift, for both static and dynamic test stimuli and as a function of target contrast. Recently, significant progress has been made in various directions. Dedicated visual and NIR test charts for lab and field testing are available and thermal test benches are on the market. Automated sensor testing using an objective synthetic human observer is within reach. Both an analytical and an image-based TOD model have recently been developed and are being implemented in the European Target Acquisition model ECOMOS and in the EOSTAR TDA. Further, the methodology is being applied for design optimization of high-end security camera systems. Finally, results from a recent perception study suggest that DRI ranges for real targets can be predicted by replacing the relevant distinctive target features by TOD test patterns of the same characteristic size and contrast, enabling a new TA modeling approach. This paper provides an overview.

  4. LFK, FORTRAN Application Performance Test

    International Nuclear Information System (INIS)

    McMahon, F.H.

    1991-01-01

    -loop controls so that short, medium, and long vector performance is sampled and can be compared. Following these three executions, the 72 timings are combined for statistical analysis and printed. The entire LFK test is executed seven times to measure experimental timing errors. An analysis of these timing errors for each kernel is provided to confirm the accuracy of the test. The LFK test also computes a sensitivity analysis of the weighted harmonic mean rate by assigning 49 sets of weights to the kernels. This analysis may be used for risk analysis to understand the variation in net performance that different workloads would cause. The LFK test report concludes with an analysis of the sensitivity of the net FORTRAN rate to optimization using the SISD/SIMD model, a two-component form of the weighted harmonic mean (harmonic Mflops) model. This analysis may be used to gauge the performance of applications from a knowledge of their vectorizability. 3 - Restrictions on the complexity of the problem: Although the LFK test evaluates the performance of a broad sampling of FORTRAN computations, it is not an application program; neither is it a complete benchmark test nor a substitute for one
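The weighted harmonic mean rate at the heart of the LFK summary statistic is straightforward to compute; a sketch (the function name and the kernel rates are ours, not from the benchmark):

```python
def weighted_harmonic_mean(rates, weights):
    """Weighted harmonic mean of kernel rates (e.g. Mflops), the form used
    to summarize per-kernel timings under a given workload weighting:
    H = sum(w_i) / sum(w_i / r_i)."""
    assert len(rates) == len(weights)
    return sum(weights) / sum(w / r for w, r in zip(weights, rates))

# Two kernels at 100 and 50 Mflops, equally weighted -> 66.67 Mflops
print(round(weighted_harmonic_mean([100.0, 50.0], [1.0, 1.0]), 2))  # 66.67
```

The harmonic mean is dominated by the slowest kernels, which is why the report's sensitivity analysis over 49 weight sets matters: different workloads emphasize different kernels.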

  5. Performance of the majority voting rule in solving the density classification problem in high dimensions

    Energy Technology Data Exchange (ETDEWEB)

    Gomez Soto, Jose Manuel [Unidad Academica de Matematicas, Universidad Autonoma de Zacatecas, Calzada Solidaridad entronque Paseo a la Bufa, Zacatecas, Zac. (Mexico); Fuks, Henryk, E-mail: jmgomezgoo@gmail.com, E-mail: hfuks@brocku.ca [Department of Mathematics, Brock University, St. Catharines, ON (Canada)

    2011-11-04

    The density classification problem (DCP) is one of the most widely studied problems in the theory of cellular automata. After it was shown that the DCP cannot be solved perfectly, the research in this area has been focused on finding better rules that could solve the DCP approximately. In this paper, we argue that the majority voting rule in high dimensions can achieve high performance in solving the DCP, and that its performance increases with dimension. We support this conjecture with arguments based on the mean-field approximation and direct computer simulations. (paper)

  6. Performance of the majority voting rule in solving the density classification problem in high dimensions

    International Nuclear Information System (INIS)

    Gomez Soto, Jose Manuel; Fuks, Henryk

    2011-01-01

    The density classification problem (DCP) is one of the most widely studied problems in the theory of cellular automata. After it was shown that the DCP cannot be solved perfectly, the research in this area has been focused on finding better rules that could solve the DCP approximately. In this paper, we argue that the majority voting rule in high dimensions can achieve high performance in solving the DCP, and that its performance increases with dimension. We support this conjecture with arguments based on the mean-field approximation and direct computer simulations. (paper)
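The majority voting rule studied above is easy to state as a cellular automaton update; a two-dimensional sketch with periodic boundaries (the lattice size, initial density and step count are illustrative choices of ours):

```python
import numpy as np

def majority_step(grid):
    """One synchronous update of the majority voting rule on a 2-D binary
    lattice with periodic boundaries: each cell takes the majority value of
    its 3x3 neighbourhood (9 cells including itself, so there are no ties)."""
    total = np.zeros_like(grid)
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            total += np.roll(np.roll(grid, dx, axis=0), dy, axis=1)
    return (total > 4).astype(grid.dtype)  # majority of 9 -> at least 5 ones

def classify_density(grid, steps=50):
    """Iterate the rule; an ideal density classifier would converge to
    all-1s iff the initial density of 1s exceeds 1/2 (the rule only
    approximates this, which is the point of the paper)."""
    for _ in range(steps):
        grid = majority_step(grid)
    return grid

rng = np.random.default_rng(0)
g = (rng.random((32, 32)) < 0.7).astype(int)  # initial density around 0.7
print(classify_density(g).mean())  # typically close to 1 for a high-density start
```

Stable minority domains can survive, which is one reason the rule classifies density only approximately; the paper's claim is that performance improves as the lattice dimension grows.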

  7. Test anxiety and academic performance in chiropractic students.

    Science.gov (United States)

    Zhang, Niu; Henderson, Charles N R

    2014-01-01

    Objective: We assessed the level of students' test anxiety and the relationship between test anxiety and academic performance. Methods: We recruited 166 third-quarter students. The Test Anxiety Inventory (TAI) was administered to all participants. Total scores from written examinations and objective structured clinical examinations (OSCEs) were used as response variables. Results: Multiple regression analysis shows that there was a modest but statistically significant negative correlation between TAI scores and written exam scores, but not OSCE scores. Worry and emotionality were the best predictors of written exam scores. Mean total anxiety and emotionality scores for females were significantly higher than those for males, but worry scores were not. Conclusion: Moderate-to-high test anxiety was observed in 85% of the chiropractic students examined. However, total test anxiety, as measured by the TAI score, was a very weak predictive model for written exam performance. Multiple regression analysis demonstrated that replacing total anxiety (TAI) with worry and emotionality (TAI subscales) produces a much more effective predictive model of written exam performance. Sex, age, highest current academic degree, and ethnicity contributed little additional predictive power in either regression model. Moreover, TAI scores were not found to be statistically significant predictors of physical exam skill performance, as measured by OSCEs.

  8. 49 CFR 40.321 - What is the general confidentiality rule for drug and alcohol test information?

    Science.gov (United States)

    2010-10-01

    ... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Confidentiality and Release of Information § 40.321 What is the general confidentiality rule for drug and alcohol test... DOT drug or alcohol testing process, you are prohibited from releasing individual test results or...

  9. Optimizing Fuzzy Rule Base for Illumination Compensation in Face Recognition using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Bima Sena Bayu Dewantara

    2014-12-01

    Full Text Available Fuzzy rule optimization is a challenging step in the development of a fuzzy model. A simple two-input fuzzy model may have thousands of combinations of fuzzy rules when it deals with a large number of input variations, making intuitive or trial-and-error determination of fuzzy rules very difficult. This paper addresses the problem of optimizing a fuzzy rule base using a Genetic Algorithm to compensate for illumination effects in face recognition. Since uneven illumination degrades the performance of face recognition, its effects must be compensated. We have developed a novel algorithm based on a reflectance model to compensate for the effect of illumination on human face recognition. We build a pair of models from a single image and reason over those models using fuzzy logic. The fuzzy rule base is then optimized using a Genetic Algorithm. This approach reduces computation cost while still maintaining high performance. Based on the experimental results, we show that our algorithm is feasible for recognizing the desired person under variable lighting conditions with faster computation time. Keywords: Face recognition, harsh illumination, reflectance model, fuzzy, genetic algorithm

  10. Learning Classification Models of Cognitive Conditions from Subtle Behaviors in the Digital Clock Drawing Test.

    Science.gov (United States)

    Souillard-Mandar, William; Davis, Randall; Rudin, Cynthia; Au, Rhoda; Libon, David J; Swenson, Rodney; Price, Catherine C; Lamar, Melissa; Penney, Dana L

    2016-03-01

    The Clock Drawing Test - a simple pencil and paper test - has been used for more than 50 years as a screening tool to differentiate normal individuals from those with cognitive impairment, and has proven useful in helping to diagnose cognitive dysfunction associated with neurological disorders such as Alzheimer's disease, Parkinson's disease, and other dementias and conditions. We have been administering the test using a digitizing ballpoint pen that reports its position with considerable spatial and temporal precision, making available far more detailed data about the subject's performance. Using pen stroke data from these drawings categorized by our software, we designed and computed a large collection of features, then explored the tradeoffs in performance and interpretability in classifiers built using a number of different subsets of these features and a variety of different machine learning techniques. We used traditional machine learning methods to build prediction models that achieve high accuracy. We operationalized widely used manual scoring systems so that we could use them as benchmarks for our models. We worked with clinicians to define guidelines for model interpretability, and constructed sparse linear models and rule lists designed to be as easy to use as scoring systems currently used by clinicians, but more accurate. While our models will require additional testing for validation, they offer the possibility of substantial improvement in detecting cognitive impairment earlier than currently possible, a development with considerable potential impact in practice.

  11. Rules of thumb in life-cycle savings models

    OpenAIRE

    Rodepeter, Ralf; Winter, Joachim

    1999-01-01

    We analyze life-cycle savings decisions when households use simple heuristics, or rules of thumb, rather than solve the underlying intertemporal optimization problem. The decision rules we explore are a simple Keynesian rule where consumption follows income; a simple consumption rule where only a fraction of positive income shocks is saved; a rule that corresponds to the permanent income hypothesis; and two rules that have been found in experimental studies. Using these rules, we simulate lif...
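Two of the rules of thumb listed above can be sketched in a few lines (the income path, interest rate, and savings fraction are invented for illustration; the paper's rules also include permanent-income and experimentally derived variants):

```python
def simulate_savings(income_path, rule, r=0.02):
    """End-of-horizon wealth when each period's consumption comes from a
    rule of thumb `rule(income, wealth)` instead of an optimizer."""
    wealth = 0.0
    for income in income_path:
        consumption = rule(income, wealth)
        wealth = (wealth + income - consumption) * (1 + r)
    return wealth

# Simple Keynesian rule: consumption follows income one for one.
keynesian = lambda income, wealth: income
# Fixed-fraction rule: save 10% of income (fraction invented).
fixed_fraction = lambda income, wealth: 0.9 * income
```

Under the Keynesian rule no wealth is ever accumulated, while the fixed-fraction rule compounds savings at the interest rate, which is the kind of behavioural difference such simulations compare against the optimal policy.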

  12. Relative performance of priority rules for hybrid flow shop scheduling with setup times

    Directory of Open Access Journals (Sweden)

    Helio Yochihiro Fuchigami

    2015-12-01

    Full Text Available This paper focuses on the hybrid flow shop scheduling problem with explicit and sequence-independent setup times. This production environment is a multistage system with a unidirectional flow of jobs, in which each stage may contain multiple machines available for processing. The optimized measure was the total time to complete the schedule (makespan). The aim was to propose new priority rules to support scheduling and to evaluate their relative performance in the production system considered, using the percentage of success, relative deviation, standard deviation of relative deviation, and average CPU time. Computational experiments indicated that the rules using ascending order of the sum of processing and setup times of the first stage (SPT1 and SPT1_ERD) performed best, jointly reaching more than 56% success.
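The SPT1 idea can be sketched on a simplified single-machine stage (job data invented; the paper's environment is multistage, where the sequencing choice also affects makespan):

```python
jobs = {"A": (4, 1), "B": (2, 1), "C": (6, 2)}  # job -> (processing, setup); invented

def spt1_order(jobs):
    """SPT1: sequence jobs in ascending order of the sum of their
    processing and setup times at the first stage."""
    return sorted(jobs, key=lambda j: jobs[j][0] + jobs[j][1])

def mean_flow_time(jobs, order):
    """Mean completion time on a single machine with sequence-independent
    setups: each job incurs its setup plus processing, back to back."""
    clock, total = 0, 0
    for j in order:
        processing, setup = jobs[j]
        clock += setup + processing
        total += clock
    return total / len(order)
```

On one machine the makespan is order-invariant, but sequencing short jobs first minimizes mean flow time; in a hybrid flow shop this plausibly releases work to downstream stages earlier, which is where the rule's makespan benefit would come from.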

  13. 40 CFR Table 5 to Subpart Bbbb of... - Model Rule-Carbon Monoxide Emission Limits for Existing Small Municipal Waste Combustion Units

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Model Rule-Carbon Monoxide Emission... BBBB of Part 60—Model Rule—Carbon Monoxide Emission Limits for Existing Small Municipal Waste... PERFORMANCE FOR NEW STATIONARY SOURCES Emission Guidelines and Compliance Times for Small Municipal Waste...

  14. Specialist committee's review reports for experimental fast reactor JOYO' MK-III performance tests

    International Nuclear Information System (INIS)

    Yamashita, Kiyonobu; Okubo, Toshiyuki; Kamide, Hideki

    2004-02-01

    Performance tests (start-up physics tests and power elevation tests) were planned for the experimental fast reactor 'JOYO' MK-III, whose irradiation performance was upgraded by a power increase from 100 to 140 MW. The reactor safety committee of the O-arai Engineering Center established a specialist committee for the 'JOYO' MK-III performance tests at its first meeting of 2003, on 23 April 2003, to accomplish the tests successfully. The subjects of the specialist committee were reviews of the following items, covering a wide range: 1) contents of the modification works; 2) reflection of functional test results in the plant and facilities; 3) reflection of safety rule modifications in instructions and manuals for operation; 4) quality assurance and pre-calculation for the performance tests; 5) the inspection plan and its results; 6) adequacy of the performance test plan; 7) confirmation of the performance test results. Before the tests started, the specialist committee confirmed, by reviewing items 1) to 6) based on explanations and documents from the Division of Experimental Reactor, that the test plan and pre-inspections were adequate. After the tests, the specialist committee confirmed, by reviewing item 7) in the same way, that each test result satisfied the corresponding criterion. From these review results before and after the tests, the specialist committee concluded that the 'JOYO' MK-III performance tests were carried out appropriately. The first criticality of the JOYO MK-III was achieved on 2 July 2003, and continuous full-power operation was carried out on 20 November 2003. Finally, all performance tests were completed with the passing of the last governmental pre-service inspection (dose rate measurement during the shutdown condition). (author)

  15. Measurement properties of continuous text reading performance tests.

    Science.gov (United States)

    Brussee, Tamara; van Nispen, Ruth M A; van Rens, Ger H M B

    2014-11-01

    Measurement properties of tests to assess reading acuity or reading performance have not been extensively evaluated. This study aims to provide an overview of the literature on available continuous text reading tests and their measurement properties. A literature search was performed in PubMed, Embase and PsycInfo. Subsequently, information on the design and content of reading tests, study design, and measurement properties was extracted using consensus-based standards for the selection of health measurement instruments. The quality of studies, reading tests, and measurement properties was systematically assessed using pre-specified criteria. From 2334 identified articles, 20 relevant articles were found on the measurement properties of three reading tests in various languages: the IReST, the MNread Reading Test, and the Radner Reading Charts. All three reading tests scored high on content validity. Reproducibility studies (repeated measurements between different testing sessions) of the commercially available IReST and MNread tests in different languages were missing. The IReST scored best on inter-language comparison, the MNread scored well in repeatability studies (repeated measurements under the same conditions), and the Radner showed good reproducibility in studies. Although in daily practice other continuous text reading tests are available that meet the criteria of this review, measurement properties were described in scientific studies for only three of them. In the few available studies, the quality and content of the study design and methodology used varied.
For testing existing reading tests and the development of new ones, for example in other languages, we make several recommendations, including careful description of patient characteristics, use of objective and subjective lighting levels, good control of working distance, documentation of the number of raters and their training, careful documentation of scoring rules and the use of Bland-Altman analyses or similar for

  16. Rule knowledge aids performance on spatial and object alternation tasks by alcoholic patients with and without Korsakoff’s amnesia

    Directory of Open Access Journals (Sweden)

    Fiona J Bardenhagen

    2007-01-01

    Full Text Available Fiona J Bardenhagen1,2, Marlene Oscar-Berman3, Stephen C Bowden2,4. 1School of Psychology, Victoria University, Melbourne, Victoria, Australia; 2Clinical Neurosciences, St. Vincent’s Hospital, Melbourne, Australia; 3Division of Psychiatry and Departments of Neurology and Anatomy and Neurobiology, Boston University School of Medicine, and Psychology Research Service, US Department of Veterans Affairs (VA) Healthcare System, Jamaica Plain Campus, MA, USA; 4School of Behavioural Science, University of Melbourne, Parkville, Victoria, Australia. Abstract: Delayed alternation (DA) and object alternation (OA) tasks have traditionally been used to measure defective response inhibition associated with dysfunction of frontal brain systems. However, these tasks are also sensitive to nonfrontal lesions, and cognitive processes such as the induction of rule-learning strategies are also needed in order to perform well on these tasks. Performance on DA and OA tasks was explored in 10 patients with alcohol-induced persisting amnestic disorder (Korsakoff’s syndrome), 11 abstinent long-term alcoholics, and 13 healthy non-alcoholic controls under each of two rule provision conditions: Alternation Rule and Correction Rule. Results confirmed that rule knowledge is a crucial cognitive component for solving problems such as DA and OA, and therefore that errors on these tasks are not due to defective response inhibition alone. Further, rule-induction strategies were helpful to Korsakoff patients, despite their poorer performance on the tasks. These results stress the role of multiple cognitive abilities in successful performance on rule-induction tasks. Evidence that these cognitive abilities are served by diffusely distributed neural networks should be considered when interpreting behavioral impairments on these tasks. Keywords: alcoholism, Korsakoff’s syndrome, comparative neuropsychology, perseveration, rule induction, working memory

  17. Comparison of Heuristics for Inhibitory Rule Optimization

    KAUST Repository

    Alsolami, Fawaz; Chikalov, Igor; Moshkov, Mikhail

    2014-01-01

    Friedman test with Nemenyi post-hoc analysis is used to compare the greedy algorithms statistically against each other for length and coverage. The experiments are carried out on real datasets from the UCI Machine Learning Repository. For the leading heuristics, the constructed rules are compared with optimal ones obtained by a dynamic programming approach. The results seem promising for the best heuristics: the average relative difference between the length (coverage) of constructed and optimal rules is at most 2.27% (7%, respectively). Furthermore, the quality of classifiers based on sets of inhibitory rules constructed by the considered heuristics is compared, and the results show that the three best heuristics in terms of classification accuracy coincide with the three best-performing heuristics in terms of rule length minimization.

  18. New scheduling rules for a dynamic flexible flow line problem with sequence-dependent setup times

    Science.gov (United States)

    Kia, Hamidreza; Ghodsypour, Seyed Hassan; Davoudpour, Hamid

    2017-09-01

    In the literature, the application of simple priority rules to multi-objective dynamic scheduling problems is widely studied. Although simple rules are not efficient enough, owing to their simplicity and lack of general insight, composite dispatching rules perform well because they are derived from experiments. In this paper, a dynamic flexible flow line problem with sequence-dependent setup times is studied. The objective of the problem is the minimization of mean flow time and mean tardiness. A 0-1 mixed integer model of the problem is formulated. Since the problem is NP-hard, four new composite dispatching rules are proposed to solve it by applying a genetic programming framework and choosing proper operators. Furthermore, a discrete-event simulation model is built to examine the performance of the scheduling rules, considering the four new heuristic rules and six heuristic rules adapted from the literature. The experimental results show that the composite dispatching rules formed by genetic programming perform better than the others in minimizing mean flow time and mean tardiness.

  19. Game-theoretic modeling of curtailment rules and network investments with distributed generation

    International Nuclear Information System (INIS)

    Andoni, Merlinda; Robu, Valentin; Früh, Wolf-Gerrit; Flynn, David

    2017-01-01

    Highlights: •Comparative study on curtailment rules and their effects on RES profitability. •Proposal of novel fair curtailment rule which minimises generators’ disruption. •Modeling of private network upgrade as leader-follower (Stackelberg) game. •New model incorporating stochastic generation and variable demand. •New methodology for setting transmission charges in private network upgrade. -- Abstract: Renewable energy has achieved high penetration rates in many areas, leading to curtailment, especially if existing network infrastructure is insufficient and energy generated cannot be exported. In this context, Distribution Network Operators (DNOs) face a significant knowledge gap about how to implement curtailment rules that achieve desired operational objectives, but at the same time minimise disruption and economic losses for renewable generators. In this work, we study the properties of several curtailment rules widely used in UK renewable energy projects, and their effect on the viability of renewable generation investment. Moreover, we propose a new curtailment rule which guarantees fair allocation of curtailment amongst all generators with minimal disruption. Another key knowledge gap faced by DNOs is how to incentivise private network upgrades, especially in settings where several generators can use the same line against the payment of a transmission fee. In this work, we provide a solution to this problem by using tools from algorithmic game theory. Specifically, this setting can be modelled as a Stackelberg game between the private transmission line investor and local renewable generators, who are required to pay a transmission fee to access the line. We provide a method for computing the equilibrium of this game, using a model that captures the stochastic nature of renewable energy generation and demand. Finally, we use the practical setting of a grid reinforcement project from the UK and a large dataset of wind speed measurements and demand
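The fairness idea behind the proposed curtailment rule can be illustrated with a proportional-sharing sketch (the paper's actual rule and its disruption-minimising details are more elaborate; generator names and numbers here are invented):

```python
def fair_curtailment(outputs, export_limit):
    """Share the required curtailment among generators in proportion to
    their current output, so no single generator bears all of it.
    outputs: generator -> current output (MW)."""
    total = sum(outputs.values())
    excess = max(0.0, total - export_limit)
    if excess == 0.0:
        return {g: 0.0 for g in outputs}
    return {g: excess * p / total for g, p in outputs.items()}
```

Contrast this with "last-in, first-off" style rules, under which the most recently connected generator absorbs the whole excess and thus faces much higher investment risk.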

  20. Neutrino mass sum rules and symmetries of the mass matrix

    Energy Technology Data Exchange (ETDEWEB)

    Gehrlein, Julia [Karlsruhe Institute of Technology, Institut fuer Theoretische Teilchenphysik, Karlsruhe (Germany); Universidad Autonoma de Madrid, Departamento de Fisica Teorica, Madrid (Spain); Instituto de Fisica Teorica UAM/CSIC, Madrid (Spain); Spinrath, Martin [Karlsruhe Institute of Technology, Institut fuer Theoretische Teilchenphysik, Karlsruhe (Germany); National Center for Theoretical Sciences, Physics Division, Hsinchu (China)

    2017-05-15

    Neutrino mass sum rules have recently gained again more attention as a powerful tool to discriminate and test various flavour models in the near future. A related question which has not yet been discussed fully satisfactorily was the origin of these sum rules and if they are related to any residual or accidental symmetry. We will address this open issue here systematically and find previous statements confirmed. Namely, the sum rules are not related to any enhanced symmetry of the Lagrangian after family symmetry breaking but they are simply the result of a reduction of free parameters due to skillful model building. (orig.)

  1. Performance testing of an off-plane reflection grating and silicon pore optic spectrograph at PANTER

    Science.gov (United States)

    Marlowe, Hannah; McEntaffer, Randall L.; Allured, Ryan; DeRoo, Casey T.; Donovan, Benjamin D.; Miles, Drew M.; Tutt, James H.; Burwitz, Vadim; Menz, Benedikt; Hartner, Gisela D.; Smith, Randall K.; Cheimets, Peter; Hertz, Edward; Bookbinder, Jay A.; Günther, Ramses; Yanson, Alex; Vacanti, Giuseppe; Ackermann, Marcelo

    2015-10-01

    An x-ray spectrograph consisting of aligned, radially ruled off-plane reflection gratings and silicon pore optics (SPO) was tested at the Max Planck Institute for Extraterrestrial Physics PANTER x-ray test facility. SPO is a test module for the proposed Arcus mission, which will also feature aligned off-plane reflection gratings. This test is the first time two off-plane gratings were actively aligned to each other and with an SPO to produce an overlapped spectrum. We report the performance of the complete spectrograph utilizing the aligned gratings module and plans for future development.

  2. Impact of high-performance work systems on individual- and branch-level performance: test of a multilevel model of intermediate linkages.

    Science.gov (United States)

    Aryee, Samuel; Walumbwa, Fred O; Seidu, Emmanuel Y M; Otaye, Lilian E

    2012-03-01

    We proposed and tested a multilevel model, underpinned by empowerment theory, that examines the processes linking high-performance work systems (HPWS) and performance outcomes at the individual and organizational levels of analyses. Data were obtained from 37 branches of 2 banking institutions in Ghana. Results of hierarchical regression analysis revealed that branch-level HPWS relates to empowerment climate. Additionally, results of hierarchical linear modeling that examined the hypothesized cross-level relationships revealed 3 salient findings. First, experienced HPWS and empowerment climate partially mediate the influence of branch-level HPWS on psychological empowerment. Second, psychological empowerment partially mediates the influence of empowerment climate and experienced HPWS on service performance. Third, service orientation moderates the psychological empowerment-service performance relationship such that the relationship is stronger for those high rather than low in service orientation. Last, ordinary least squares regression results revealed that branch-level HPWS influences branch-level market performance through cross-level and individual-level influences on service performance that emerges at the branch level as aggregated service performance.

  3. How well do clinical prediction rules perform in identifying serious infections in acutely ill children across an international network of ambulatory care datasets?

    Directory of Open Access Journals (Sweden)

    Verbakel Jan Y

    2013-01-01

    Full Text Available Abstract Background Diagnosing serious infections in children is challenging, because of the low incidence of such infections and their non-specific presentation early in the course of illness. Prediction rules are promoted as a means to improve recognition of serious infections. A recent systematic review identified seven clinical prediction rules, of which only one had been prospectively validated, calling into question their appropriateness for clinical practice. We aimed to examine the diagnostic accuracy of these rules in multiple ambulatory care populations in Europe. Methods Four clinical prediction rules and two national guidelines, based on signs and symptoms, were validated retrospectively in seven individual patient datasets from primary care and emergency departments, comprising 11,023 children from the UK, the Netherlands, and Belgium. The accuracy of each rule was tested, with pre-test and post-test probabilities displayed using dumbbell plots, and serious infection settings stratified as low (LP), intermediate (IP), or high (HP; >20%) prevalence. In LP and IP settings, sensitivity should be >90% for effectively ruling out infection. Results In LP settings, a five-stage decision tree and a pneumonia rule had sensitivities of >90% at a correspondingly low negative likelihood ratio (NLR). Conclusions None of the clinical prediction rules examined in this study provided perfect diagnostic accuracy. In LP or IP settings, prediction rules and evidence-based guidelines had high sensitivity, providing promising rule-out value for serious infections in these datasets, although all had a percentage of residual uncertainty. Additional clinical assessment or testing, such as point-of-care laboratory tests, may be needed to increase clinical certainty. None of the prediction rules identified seemed to be valuable for HP settings such as emergency departments.

  4. Application of decision rules for empowering of Indonesian telematics services SMEs

    Science.gov (United States)

    Tosida, E. T.; Hairlangga, O.; Amirudin, F.; Ridwanah, M.

    2018-03-01

    The independence of the telematics sector is one of Indonesia's visions for 2024. One effort toward achieving it is the empowerment of SMEs in the telematics field. Such empowerment requires a practical, data-centered mechanism, for instance one based on the National Economic Census database (Susenas). From Susenas, decision rules for granting assistance to SMEs in the telematics field can be formulated by generating a rule base through classification techniques. The CART-based decision rule model performs better than the C4.5 and ID3 models. The model's high performance is also in line with the regulations applied by the government. This is one of the strengths of the research, because the resulting model is consistent with existing conditions in Indonesia. The rule bases generated by the three classification techniques differ. The CART technique shows pattern matching with the realized activities of the Ministry of Cooperatives and SMEs. So far, the government has had difficulty finding reference data related to the empowerment of telematics service SMEs. Therefore, the findings of this research can be used as an alternative decision support system for telematics SME empowerment programs.
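A CART rule base grows from greedy impurity-minimising splits. A minimal single-split sketch of that criterion follows (real CART recurses and prunes, and the Susenas features are not modelled here; the data is invented):

```python
def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return 1.0 - sum((c / n) ** 2 for c in counts.values())

def best_split(xs, ys):
    """CART-style best binary split on one numeric feature: the
    threshold minimising the weighted Gini impurity of both branches."""
    n = len(ys)
    best_t, best_score = None, float("inf")
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = len(left) / n * gini(left) + len(right) / n * gini(right)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score
```

Each chosen threshold becomes one condition in a readable rule such as "if feature <= t then assist", which is why tree-derived rule bases are easy to compare against written government regulations.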

  5. Base Station Performance Model

    OpenAIRE

    Walsh, Barbara; Farrell, Ronan

    2005-01-01

    At present the testing of power amplifiers within base station transmitters is limited to testing at component level as opposed to testing at the system level. While the detection of catastrophic failure is possible, that of performance degradation is not. This paper proposes a base station model with respect to transmitter output power with the aim of introducing system level monitoring of the power amplifier behaviour within the base station. Our model reflects the expe...

  6. Advantages and Limitations of Anticipating Laboratory Test Results from Regression- and Tree-Based Rules Derived from Electronic Health-Record Data

    OpenAIRE

    Mohammad, Fahim; Theisen-Toupal, Jesse C.; Arnaout, Ramy

    2014-01-01

    Laboratory testing is the single highest-volume medical activity, making it useful to ask how well one can anticipate whether a given test result will be high, low, or within the reference interval ("normal"). We analyzed 10 years of electronic health records--a total of 69.4 million blood tests--to see how well standard rule-mining techniques can anticipate test results based on patient age and gender, recent diagnoses, and recent laboratory test results. We evaluated rules according to thei...
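Standard rule-mining techniques of the kind mentioned above score candidate rules by measures such as support and confidence. A minimal sketch (item names and records are invented, not from the study's data):

```python
def rule_stats(records, antecedent, consequent):
    """Support and confidence of the association rule
    antecedent -> consequent over a list of item sets."""
    n = len(records)
    covered = [r for r in records if antecedent <= r]
    both = sum(1 for r in covered if consequent <= r)
    support = both / n
    confidence = both / len(covered) if covered else 0.0
    return support, confidence

# Hypothetical encounters: each set holds one patient's recent findings.
records = [
    {"glucose_high", "hba1c_high"},
    {"glucose_high"},
    {"glucose_normal"},
    {"glucose_high", "hba1c_high"},
]
support, confidence = rule_stats(records, {"glucose_high"}, {"hba1c_high"})
```

Support measures how often the rule applies across all records, while confidence measures how reliably the consequent follows when the antecedent holds; anticipating a test result requires rules that score well on both.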

  7. Ontological Analysis of Integrated Process Models: testing hypotheses

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    2001-11-01

    Full Text Available Integrated process modeling is achieving prominence in helping to document and manage business administration and IT processes in organizations. The ARIS framework is a popular example of an integrated process modeling framework, not least because it underlies the 800 or more reference models embedded in the world's most popular ERP package, SAP R/3. This paper demonstrates the usefulness of the Bunge-Wand-Weber (BWW) representation model for evaluating modeling grammars such as those constituting ARIS. It reports some initial insights gained from pilot testing Green and Rosemann's (2000) evaluative propositions. Even when considering all five views of ARIS, modelers have problems representing business rules, the scope and boundary of systems, and decomposing models. However, even though it is completely ontologically redundant, users still find the function view useful in modeling.

  8. Introducing an interface between FeynRules and WHIZARD

    Energy Technology Data Exchange (ETDEWEB)

    Christensen, Neil D. [Pittsburgh Univ., PA (United States). PITTsburgh Particle Physics, Astrophysics and Cosmology Center; Duhr, Claude [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology; Fuks, Benjamin [Strasbourg Univ. (France). Inst. Pluridisciplinaire Hubert Curien - Dept. Recherches Subatomiques; Reuter, Juergen [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany). Theory Group; Freiburg Univ. (Germany). Physikalisches Inst.; Edinburgh Univ. (United Kingdom). School of Physics and Astronomy; Speckner, Christian [Freiburg Univ. (Germany). Physikalisches Inst.

    2012-04-15

    While Monte Carlo event generators like WHIZARD have become indispensable tools in studying the impact of new physics on collider observables over the last decades, the implementation of new models in such packages has remained a rather awkward and error-prone process. Recently, the FeynRules package was introduced which greatly simplifies this process by providing a single unified model format from which model implementations for many different Monte Carlo codes can be derived automatically. In this note, we present an interface which extends FeynRules to provide this functionality also for the WHIZARD package, thus making WHIZARD's strengths and performance easily available to model builders.

  9. Introducing an interface between FeynRules and WHIZARD

    International Nuclear Information System (INIS)

    Christensen, Neil D.; Duhr, Claude; Fuks, Benjamin; Reuter, Juergen; Freiburg Univ.; Edinburgh Univ.; Speckner, Christian

    2012-04-01

    While Monte Carlo event generators like WHIZARD have become indispensable tools in studying the impact of new physics on collider observables over the last decades, the implementation of new models in such packages has remained a rather awkward and error-prone process. Recently, the FeynRules package was introduced which greatly simplifies this process by providing a single unified model format from which model implementations for many different Monte Carlo codes can be derived automatically. In this note, we present an interface which extends FeynRules to provide this functionality also for the WHIZARD package, thus making WHIZARD's strengths and performance easily available to model builders.

  10. Motivation to comply with task rules and multitasking performance: The role of need for cognitive closure and goal importance.

    Science.gov (United States)

    Szumowska, Ewa; Kossowska, Małgorzata; Roets, Arne

    2018-01-01

    In three studies, we examined the role task rules play in multitasking performance. We postulated that rules should be especially important for individuals highly motivated to have structure and clear answers, i.e., those high on need for cognitive closure (NFC). High NFC should thus be related to greater compliance with task rules. Specifically, given high goal importance, NFC should be more strongly related to a multitasking strategy when multitasking is imposed by the rules, and to a mono-tasking strategy when monotasking is imposed by the rules. This should translate into better multitasking or mono-tasking performance, depending on condition. Overall, the results were supportive as NFC was related to a more mono-tasking strategy in the mono-tasking condition (Studies 1 and 2 only) and more dual-tasking strategy in the dual-tasking condition (Studies 1-3). This translated into respective differences in performance. The effects were significant only when goal importance was high (Study 1) and held when cognitive ability was controlled for (Study 2).

  11. Test of a causal Human Resource Management-Performance Linkage Model: Evidence from the Greek manufacturing sector

    OpenAIRE

    Katou, A.; Katou, A.

    2011-01-01

    Although a number of studies have recognized the relationship between Human Resource Management (HRM) policies and organisational performance, the mechanisms through which HRM policies lead to organisational performance still remain unexplored. The purpose of this paper is to investigate the pathways leading from HRM policies to organisational performance by using structural equation modelling. Specifically, this analytical tool has been used to test a research framework that is constituted ...

  12. Cognitive information based on large-scale tests: the rule space method of representation [Información cognitiva a partir de pruebas de gran escala: el método de representación del espacio de reglas]

    Directory of Open Access Journals (Sweden)

    Alvaro Artavia Medrano

    2012-06-01

    Full Text Available Over the last years, there has been a growing demand for standardized achievement tests that provide useful information to educational processes, by adding diagnostic information to improve the students’ deficient areas, while maintaining advantages in technical development. This paper advocates for Tatsuoka’s rule space method (1983, 2009) as an option to incorporate the benefits of cognitive diagnostic assessment to current educational practices. The article briefly explains the process of problem solving according to some information processing theories, provides the basis for item analysis, and describes the rule space method with particular emphasis on the development of the Q matrix as a cognitive model that allows the interpretation of results. The aim of this method is to link performance on a test with specific inferences about the knowledge and skills of examinees.

  13. Autonomous Rule Creation for Intrusion Detection

    Energy Technology Data Exchange (ETDEWEB)

    Todd Vollmer; Jim Alves-Foss; Milos Manic

    2011-04-01

    Many computational intelligence techniques for anomaly based network intrusion detection can be found in the literature. Translating a newly discovered intrusion recognition criterion into a distributable rule can be a human-intensive effort. This paper explores a multi-modal genetic algorithm solution for autonomous rule creation. This algorithm focuses on the process of creating rules once an intrusion has been identified, rather than the evolution of rules to provide a solution for intrusion detection. The algorithm was demonstrated on anomalous ICMP network packets (input) and Snort rules (output of the algorithm). Output rules were sorted according to a fitness value and any duplicates were removed. The experimental results on ten test cases demonstrated a 100 percent rule alert rate. Out of 33,804 test packets, 3 produced false positives. Each test case produced a minimum of three rule variations that could be used as candidates for a production system.
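    As a hedged sketch of the general approach described above (and not the paper's actual implementation, which evolves Snort rules over ICMP packet fields), the following toy genetic algorithm evolves a packet-size interval rule, then sorts the surviving rules by fitness and removes duplicates, mirroring the post-processing step; all data and constants here are hypothetical.

```python
import random

def fitness(rule, anomalous, normal):
    """Reward alerts on anomalous packets, penalize false positives."""
    lo, hi = rule
    hits = sum(lo <= p <= hi for p in anomalous)       # true alerts
    false_pos = sum(lo <= p <= hi for p in normal)     # false alarms
    return hits - 2 * false_pos

def evolve(anomalous, normal, pop_size=30, gens=50, seed=0):
    rng = random.Random(seed)
    # each candidate rule is an interval over packet sizes
    pop = [tuple(sorted((rng.randint(0, 2000), rng.randint(0, 2000))))
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda r: fitness(r, anomalous, normal), reverse=True)
        parents = pop[:pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            child = tuple(sorted((a[0], b[1])))                     # crossover
            if rng.random() < 0.2:                                  # mutation
                child = tuple(sorted((child[0] + rng.randint(-50, 50), child[1])))
            children.append(child)
        pop = parents + children
    # sort by fitness and drop duplicates, as in the paper's post-processing
    return sorted(set(pop), key=lambda r: fitness(r, anomalous, normal),
                  reverse=True)

anomalous = [1480, 1490, 1500, 1510]   # hypothetical anomalous packet sizes
normal = [64, 128, 256, 512]
rules = evolve(anomalous, normal)
```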

  14. Sum rules for four-spinon dynamic structure factor in XXX model

    International Nuclear Information System (INIS)

    Si Lakhal, B.; Abada, A.

    2005-01-01

    In the context of the antiferromagnetic spin-1/2 Heisenberg quantum spin chain (XXX model), we estimate the contribution of the exact four-spinon dynamic structure factor S4 by calculating a number of sum rules the total dynamic structure factor S is known to satisfy exactly. These sum rules are: the static susceptibility, the integrated intensity, the total integrated intensity, the first frequency moment and the nearest-neighbor correlation function. We find that the contribution of S4 is between 1% and 2.5%, depending on the sum rule, whereas the contribution of the exact two-spinon dynamic structure factor S2 is between 70% and 75%. The calculations are numerical and Monte Carlo based. Good statistics are obtained

  15. Multi-scale inference of interaction rules in animal groups using Bayesian model selection.

    Directory of Open Access Journals (Sweden)

    Richard P Mann

    2012-01-01

    Full Text Available Inference of interaction rules of animals moving in groups usually relies on an analysis of large scale system behaviour. Models are tuned through repeated simulation until they match the observed behaviour. More recent work has used the fine scale motions of animals to validate and fit the rules of interaction of animals in groups. Here, we use a Bayesian methodology to compare a variety of models to the collective motion of glass prawns (Paratya australiensis). We show that these exhibit a stereotypical 'phase transition', whereby an increase in density leads to the onset of collective motion in one direction. We fit models to this data, which range from a mean-field model where all prawns interact globally; to a spatial Markovian model where prawns are self-propelled particles influenced only by the current positions and directions of their neighbours; up to non-Markovian models where prawns have 'memory' of previous interactions, integrating their experiences over time when deciding to change behaviour. We show that the mean-field model fits the large scale behaviour of the system, but does not capture fine scale rules of interaction, which are primarily mediated by physical contact. Conversely, the Markovian self-propelled particle model captures the fine scale rules of interaction but fails to reproduce global dynamics. The most sophisticated model, the non-Markovian model, provides a good match to the data at both the fine scale and in terms of reproducing global dynamics. We conclude that prawns' movements are influenced by not just the current direction of nearby conspecifics, but also those encountered in the recent past. Given the simplicity of prawns as a study system our research suggests that self-propelled particle models of collective motion should, if they are to be realistic at multiple biological scales, include memory of previous interactions and other non-Markovian effects.
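    The model-comparison idea above can be illustrated with a much-simplified sketch: here the BIC stands in for the full Bayesian evidence used in the paper, and synthetic data are generated so that a "memory" (non-Markovian) predictor truly drives the response, as the paper reports for prawns. All variable names and constants are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 500
neighbour = rng.normal(size=T)           # current neighbour heading (toy)
memory = np.zeros(T)                     # exponentially weighted history
for t in range(1, T):
    memory[t] = 0.8 * memory[t - 1] + 0.2 * neighbour[t - 1]
# generate headings that truly depend on the memory trace, not the instant cue
heading = 1.5 * memory + rng.normal(scale=0.3, size=T)

def bic(X, y):
    """BIC of a linear model y ~ 1 + X (lower is better)."""
    X1 = np.column_stack([np.ones_like(y), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    n, k = len(y), X1.shape[1]
    sigma2 = resid @ resid / n
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return k * np.log(n) - 2 * loglik

bic_markov = bic(neighbour, heading)     # "Markovian" model: current cue only
bic_memory = bic(memory, heading)        # "memory" model: integrated history
```

    On this synthetic data the memory model attains the lower (better) BIC, the same qualitative conclusion the paper reaches with full Bayesian model selection.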

  16. Patch Departure Behavior of Bumble Bees: Rules and Mechanisms

    Directory of Open Access Journals (Sweden)

    Dale E. Taneyhill

    2010-01-01

    Full Text Available I present an increment-decay model for the mechanism of bumble bees' decision to depart from inflorescences. The probability of departure is the consequence of a dynamic threshold level of stimuli necessary to elicit a stereotyped landing reaction. Reception of floral nectar lowers this threshold, making the bee less likely to depart. Concurrently the threshold increases, making departure from the inflorescence more probable. Increments to the probability of landing are an increasing, decelerating function of nectar volume, and are worth less, in sequence, for the same amount of nectar. The model is contrasted to threshold departure rules, which predict that bees will depart from inflorescences if the amount of nectar in the last one or two flowers visited is below a given level. Field tests comparing the two models were performed with monkshood (Aconitum columbianum). Treated flowers contained a descending series of nectar volumes (6 to 0 µL of 30% sucrose solution). The more nectar that bees encountered in the treated flowers, the more likely they were to remain within the inflorescence after subsequently visiting one to three empty flowers. I discuss the differences between rules and mechanisms in regard to cognitive models of foraging behavior.
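    The increment-decay mechanism can be sketched as follows; all constants are hypothetical and chosen only to illustrate the qualitative prediction that bees receiving more nectar tolerate more subsequent empty flowers before departing.

```python
import math

def simulate_visits(nectar_volumes, threshold=1.0, rise=0.3, gain=0.6):
    """Return the number of flowers visited before the bee departs.

    The threshold rises by a fixed amount per flower (decay toward departure)
    and is lowered by a decelerating function of the nectar volume received.
    Departure occurs once the threshold exceeds a fixed departure level.
    """
    visited = 0
    for v in nectar_volumes:
        visited += 1
        threshold += rise                        # decay: departure more likely
        threshold -= gain * (1 - math.exp(-v))   # decelerating reward increment
        if threshold > 1.5:                      # stimulus no longer sufficient
            return visited
    return visited

rich = simulate_visits([4, 3, 0, 0, 0, 0])   # rewarding flowers first
poor = simulate_visits([0, 0, 0, 0, 0, 0])   # all flowers empty
```

    With these toy constants the "rich" bee persists through several empty flowers after being rewarded, while the "poor" bee departs almost immediately.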

  17. Horizontal and Vertical Rule Bases Method in Fuzzy Controllers

    Directory of Open Access Journals (Sweden)

    Sadegh Aminifar

    2013-01-01

    Full Text Available The concept of horizontal and vertical rule bases is introduced. Using this method enables designers to look for the main behaviors of a system and describe them with greater approximation. The rules which describe the system in the first stage are called the horizontal rule base. In the second stage, the designer modulates the obtained surface by describing the changes needed on the first surface to handle the real behaviors of the system. The rules used in the second stage are called the vertical rule base. The horizontal and vertical rule bases method plays a great role in easing the extraction of the optimum control surface, using far fewer rules than traditional fuzzy systems. This research involves the control of a system with high nonlinearity that is difficult to model with classical methods. As a case study for testing the proposed method under real conditions, the designed controller is applied to a steaming room with uncertain data and variable parameters. A comparison between PID and traditional fuzzy counterparts and our proposed system shows that our proposed system outperforms PID and traditional fuzzy systems in terms of the number of valve switchings and better surface following. The evaluations have been done both with model simulation and DSP implementation.

  18. Models of the heat dynamics of solar collectors for performance testing

    DEFF Research Database (Denmark)

    Bacher, Peder; Madsen, Henrik; Perers, Bengt

    2011-01-01

    accurate estimates of parameters in physical models. The applied method is described by Kristensen et al. (2004) and implemented in the software CTSM1. Examples of successful applications of the method include modelling of the heat dynamics of integrated photo-voltaic modules (Friling et al., 2009......) and modelling of the heat dynamics of buildings (Madsen and Holst, 1995). Measurements obtained at a test site in Denmark during the spring 2010 are used for the modelling. The tested collector is a single glazed large area flat plate collector with selective absorber and Teflon anti convection layer. The test

  19. A rule-based backchannel prediction model using pitch and pause information

    NARCIS (Netherlands)

    Truong, Khiet Phuong; Poppe, Ronald Walter; Heylen, Dirk K.J.

    We manually designed rules for a backchannel (BC) prediction model based on pitch and pause information. In short, the model predicts a BC when there is a pause of a certain length that is preceded by a falling or rising pitch. This model was validated against the Dutch IFADV Corpus in a
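    A minimal sketch of such a pitch-and-pause rule is shown below; the thresholds and the exact rule set used in the paper are not reproduced here, so both parameters are assumptions.

```python
def predict_backchannel(pause_ms, pitch_slope_st_per_s,
                        min_pause=200, min_slope=2.0):
    """Predict a backchannel (BC) opportunity.

    Rule sketch: a BC is predicted when a pause is long enough AND is
    preceded by a sufficiently rising or falling pitch slope (in
    semitones per second). Thresholds are hypothetical.
    """
    return pause_ms >= min_pause and abs(pitch_slope_st_per_s) >= min_slope
```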

  20. Highly scalable and robust rule learner: performance evaluation and comparison.

    Science.gov (United States)

    Kurgan, Lukasz A; Cios, Krzysztof J; Dick, Scott

    2006-02-01

    Business intelligence and bioinformatics applications increasingly require the mining of datasets consisting of millions of data points, or crafting real-time enterprise-level decision support systems for large corporations and drug companies. In all cases, there needs to be an underlying data mining system, and this mining system must be highly scalable. To this end, we describe a new rule learner called DataSqueezer. The learner belongs to the family of inductive supervised rule extraction algorithms. DataSqueezer is a simple, greedy, rule builder that generates a set of production rules from labeled input data. In spite of its relative simplicity, DataSqueezer is a very effective learner. The rules generated by the algorithm are compact, comprehensible, and have accuracy comparable to rules generated by other state-of-the-art rule extraction algorithms. The main advantages of DataSqueezer are very high efficiency, and missing data resistance. DataSqueezer exhibits log-linear asymptotic complexity with the number of training examples, and it is faster than other state-of-the-art rule learners. The learner is also robust to large quantities of missing data, as verified by extensive experimental comparison with the other learners. DataSqueezer is thus well suited to modern data mining and business intelligence tasks, which commonly involve huge datasets with a large fraction of missing data.
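    To illustrate the family of inductive rule extraction algorithms DataSqueezer belongs to (this is a generic greedy sequential-covering sketch, not the DataSqueezer algorithm itself), consider a rule builder that repeatedly picks the pure single-attribute test covering the most remaining positive examples:

```python
def learn_rules(examples, labels, target=1):
    """Greedy sequential covering over single-attribute equality tests."""
    rules = []
    remaining = [(x, y) for x, y in zip(examples, labels)]
    while any(y == target for _, y in remaining):
        best = None
        for attr in range(len(remaining[0][0])):
            values = {x[attr] for x, y in remaining if y == target}
            for v in values:
                cover = [(x, y) for x, y in remaining if x[attr] == v]
                pos = sum(y == target for _, y in cover)
                if pos == len(cover):          # only accept pure rules
                    if best is None or pos > best[0]:
                        best = (pos, attr, v)
        if best is None:
            break                              # no pure rule left
        _, attr, v = best
        rules.append((attr, v))
        # remove the examples this rule covers, then repeat
        remaining = [(x, y) for x, y in remaining if x[attr] != v]
    return rules

# toy data: attribute 0 perfectly separates class 1 when it equals 'a'
X = [('a', 'x'), ('a', 'y'), ('b', 'x'), ('b', 'y')]
y = [1, 1, 0, 0]
rules = learn_rules(X, y)
```

    The production rules come out as (attribute, value) tests; the real learner adds the scalability and missing-data machinery the abstract describes.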

  1. PSOLA: A Heuristic Land-Use Allocation Model Using Patch-Level Operations and Knowledge-Informed Rules.

    Directory of Open Access Journals (Sweden)

    Yaolin Liu

    Full Text Available Optimizing land-use allocation is important to regional sustainable development, as it promotes the social equality of public services, increases the economic benefits of land-use activities, and reduces the ecological risk of land-use planning. Most land-use optimization models allocate land-use using cell-level operations that fragment land-use patches. These models do not cooperate well with land-use planning knowledge, leading to irrational land-use patterns. This study focuses on building a heuristic land-use allocation model (PSOLA using particle swarm optimization. The model allocates land-use with patch-level operations to avoid fragmentation. The patch-level operations include a patch-edge operator, a patch-size operator, and a patch-compactness operator that constrain the size and shape of land-use patches. The model is also integrated with knowledge-informed rules to provide auxiliary knowledge of land-use planning during optimization. The knowledge-informed rules consist of suitability, accessibility, land use policy, and stakeholders' preference. To validate the PSOLA model, a case study was performed in Gaoqiao Town in Zhejiang Province, China. The results demonstrate that the PSOLA model outperforms a basic PSO (Particle Swarm Optimization) in terms of the social, economic, ecological, and overall benefits by 3.60%, 7.10%, 1.53% and 4.06%, respectively, which confirms the effectiveness of our improvements. Furthermore, the model has an open architecture, enabling its extension as a generic tool to support decision making in land-use planning.
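    The PSO core that models such as PSOLA build upon can be sketched as below; the patch-level operators and knowledge-informed rules of the paper are omitted, and the swarm simply minimizes a two-dimensional quadratic with generic parameter values.

```python
import random

def pso(f, dim=2, n=20, iters=100, seed=3):
    """Minimal particle swarm optimization minimizing f over [-5, 5]^dim."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]                  # personal bests
    gbest = min(pbest, key=f)[:]                 # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                       # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

sphere = lambda p: sum(x * x for x in p)   # toy objective
best = pso(sphere)
```

    PSOLA replaces the continuous position update with patch-level moves and filters candidates through the knowledge-informed rules; the inertia/cognitive/social structure is the shared core.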

  2. Connections between the Sznajd model with general confidence rules and graph theory

    Science.gov (United States)

    Timpanaro, André M.; Prado, Carmen P. C.

    2012-10-01

    The Sznajd model is a sociophysics model that is used to model opinion propagation and consensus formation in societies. Its main feature is that its rules favor bigger groups of agreeing people. In a previous work, we generalized the bounded confidence rule in order to model biases and prejudices in discrete opinion models. In that work, we applied this modification to the Sznajd model and presented some preliminary results. The present work extends what we did in that paper. We present results linking many of the properties of the mean-field fixed points, with only a few qualitative aspects of the confidence rule (the biases and prejudices modeled), finding an interesting connection with graph theory problems. More precisely, we link the existence of fixed points with the notion of strongly connected graphs and the stability of fixed points with the problem of finding the maximal independent sets of a graph. We state these results and present comparisons between the mean field and simulations in Barabási-Albert networks, followed by the main mathematical ideas and appendices with the rigorous proofs of our claims and some graph theory concepts, together with examples. We also show that there is no qualitative difference in the mean-field results if we require that a group of size q>2, instead of a pair, of agreeing agents be formed before they attempt to convince other sites (for the mean field, this would coincide with the q-voter model).

  3. The Reasonableness Test of the Principal Purpose Test Rule in OECD BEPS Action 6 (Tax Treaty Abuse) versus the EU Principle of Legal Certainty and the EU Abuse of Law Case Law

    OpenAIRE

    Weber, Dennis

    2017-01-01

    The OECD BEPS Action 6 report contains a principal purpose test rule (PPT rule) for the purpose of combating abuse of tax treaties. This PPT rule is also included in the OECD Multilateral Instrument. The PPT rule is (amongst others) applicable when ‘it is reasonable to conclude’ that a benefit (granted by a tax treaty) was one of the principal purposes of any arrangement/transaction. This requirement contains two elements: the reasonableness test and the principal purpose test ...

  4. SPATKIN: a simulator for rule-based modeling of biomolecular site dynamics on surfaces.

    Science.gov (United States)

    Kochanczyk, Marek; Hlavacek, William S; Lipniacki, Tomasz

    2017-11-15

    Rule-based modeling is a powerful approach for studying biomolecular site dynamics. Here, we present SPATKIN, a general-purpose simulator for rule-based modeling in two spatial dimensions. The simulation algorithm is a lattice-based method that tracks Brownian motion of individual molecules and the stochastic firing of rule-defined reaction events. Because rules are used as event generators, the algorithm is network-free, meaning that it does not require generating the complete reaction network implied by the rules prior to simulation. In a simulation, each molecule (or complex of molecules) is taken to occupy a single lattice site that cannot be shared with another molecule (or complex). SPATKIN is capable of simulating a wide array of membrane-associated processes, including adsorption, desorption and crowding. Models are specified using an extension of the BioNetGen language, which allows one to account for spatial features of the simulated process. The C++ source code for SPATKIN is distributed freely under the terms of the GNU GPLv3 license. The source code can be compiled for execution on popular platforms (Windows, Mac and Linux). An installer for 64-bit Windows and a macOS app are available. The source code and precompiled binaries are available at the SPATKIN Web site (http://pmbm.ippt.pan.pl/software/spatkin). spatkin.simulator@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
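    The lattice-based, network-free idea can be illustrated with a toy simulation (not SPATKIN's actual algorithm): molecules occupy exclusive sites on a periodic 2-D lattice, hop at random, and a hypothetical rule fires whenever a hop lands on an occupied site, merging the pair into a complex. The rule acts as an event generator at simulation time; no reaction network is enumerated in advance.

```python
import random

def simulate(width=20, height=20, n_molecules=40, steps=2000, seed=5):
    """Toy lattice simulation with excluded volume and a single merge rule."""
    rng = random.Random(seed)
    sites = rng.sample([(x, y) for x in range(width) for y in range(height)],
                       n_molecules)
    occupied = set(sites)          # each site holds at most one molecule/complex
    molecules = list(occupied)
    complexes = 0
    for _ in range(steps):
        i = rng.randrange(len(molecules))
        x, y = molecules[i]
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        nx, ny = (x + dx) % width, (y + dy) % height   # periodic boundaries
        if (nx, ny) in occupied:
            # rule fires: the moving molecule merges into the one it bumps into
            occupied.discard((x, y))
            molecules[i] = molecules[-1]
            molecules.pop()
            complexes += 1
        else:
            occupied.discard((x, y))
            occupied.add((nx, ny))
            molecules[i] = (nx, ny)
    return complexes, len(occupied)

complexes, remaining = simulate()
```

    Every merge removes one occupied site, so the number of complexes formed plus the entities still on the lattice always equals the initial molecule count.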

  5. Testing the cranial evolutionary allometric 'rule' in Galliformes.

    Science.gov (United States)

    Linde-Medina, M

    2016-09-01

    Recent comparative studies have indicated the existence of a common cranial evolutionary allometric (CREA) pattern in mammals and birds, in which smaller species have relatively smaller faces and bigger braincases than larger species. In these studies, cranial allometry was tested using a multivariate regression between shape (described using landmarks coordinates) and size (i.e. centroid size), after accounting for phylogenetic relatedness. Alternatively, cranial allometry can be determined by comparing the sizes of two anatomical parts using a bivariate regression analysis. In this analysis, a slope higher or lower than one indicates the existence of positive or negative allometry, respectively. Thus, in those species that support the CREA 'rule', positive allometry is expected for the association between face size and braincase size, which would indicate that larger species have disproportionally larger faces. In this study, I applied these two approaches to explore cranial allometry in 83 Galliformes (Aves, Galloanserae), ranging in mean body weight from 30 g to 2.5 kg. The multivariate regression between shape and centroid size revealed the existence of a significant allometric pattern resembling CREA, whereas the second analysis revealed a negative allometry for beak size and braincase size (i.e. contrary to the CREA 'rule', larger galliform species have disproportionally shorter beaks than smaller galliform species). This study suggests that the presence of CREA may be overestimated when using cranium size as the standard measurement. © 2016 European Society For Evolutionary Biology.
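    The bivariate approach described above reduces to a log-log regression whose slope is compared with one; the sketch below constructs synthetic data with a known slope of 0.8 (negative allometry, as reported here for beak versus braincase size) and recovers it. The sample size of 83 matches the study, but the data themselves are simulated.

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic log-transformed sizes for 83 hypothetical species,
# built with a true allometric slope of 0.8 (negative allometry)
log_braincase = rng.uniform(1.0, 3.0, size=83)
log_beak = 0.8 * log_braincase + rng.normal(scale=0.05, size=83)

# bivariate allometry test: slope > 1 => positive allometry,
# slope < 1 => negative allometry
slope, intercept = np.polyfit(log_braincase, log_beak, 1)
```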

  6. Performance Measurement Model A TarBase model with ...

    Indian Academy of Sciences (India)

    rohit

    Model A: 8.0, 2.0, 94.52%, 88.46%, 76, 108, 12, 12, 0.86, 0.91, 0.78, 0.94
    Model B: 2.0, 2.0, 93.18%, 89.33%, 64, 95, 10, 9, 0.88, 0.90, 0.75, 0.98
    The above results for TEST – 1 show details for our two models (Model A and Model B). Performance of Model A after adding 32 negative examples from MiRTif to our testing set (MiRecords) ...

  7. Supporting the Constructive Use of Existing Hydrological Models in Participatory Settings: a Set of "Rules of the Game"

    Directory of Open Access Journals (Sweden)

    Pieter W. G. Bots

    2011-06-01

    Full Text Available When hydrological models are used in support of water management decisions, stakeholders often contest these models because they perceive certain aspects to be inadequately addressed. A strongly contested model may be abandoned completely, even when stakeholders could potentially agree on the validity of part of the information it can produce. The development of a new model is costly, and the results may be contested again. We consider how existing hydrological models can be used in a policy process so as to benefit from both hydrological knowledge and the perspectives and local knowledge of stakeholders. We define a code of conduct as a set of "rules of the game" that we base on a case study of developing a water management plan for a Natura 2000 site in the Netherlands. We propose general rules for agenda management and information sharing, and more specific rules for model use and option development. These rules structure the interactions among actors, help them to explicitly acknowledge uncertainties, and prevent expertise from being neglected or overlooked. We designed the rules to favor openness, protection of core stakeholder values, the use of relevant substantive knowledge, and the momentum of the process. We expect that these rules, although developed on the basis of a water-management issue, can also be applied to support the use of existing computer models in other policy domains. As rules will shape actions only when they are constantly affirmed by actors, we expect that the rules will become less useful in an "unruly" social environment where stakeholders constantly challenge the proceedings.

  8. Approximating optimal behavioural strategies down to rules-of-thumb: energy reserve changes in pairs of social foragers.

    Directory of Open Access Journals (Sweden)

    Sean A Rands

    Full Text Available Functional explanations of behaviour often propose optimal strategies for organisms to follow. These 'best' strategies could be difficult to perform given biological constraints such as neural architecture and physiological constraints. Instead, simple heuristics or 'rules-of-thumb' that approximate these optimal strategies may instead be performed. From a modelling perspective, rules-of-thumb are also useful tools for considering how group behaviour is shaped by the behaviours of individuals. Using simple rules-of-thumb reduces the complexity of these models, but care needs to be taken to use rules that are biologically relevant. Here, we investigate the similarity between the outputs of a two-player dynamic foraging game (which generated optimal but complex solutions) and a computational simulation of the behaviours of the two members of a foraging pair, who instead followed a rule-of-thumb approximation of the game's output. The original game generated complex results, and we demonstrate here that the simulations following the much-simplified rules-of-thumb also generate complex results, suggesting that the rule-of-thumb was sufficient to make some of the model outcomes unpredictable. There was some agreement between both modelling techniques, but some differences arose - particularly when pair members were not identical in how they gained and lost energy. We argue that exploring how rules-of-thumb perform in comparison to their optimal counterparts is an important exercise for biologically validating the output of agent-based models of group behaviour.

  9. Approximating optimal behavioural strategies down to rules-of-thumb: energy reserve changes in pairs of social foragers.

    Science.gov (United States)

    Rands, Sean A

    2011-01-01

    Functional explanations of behaviour often propose optimal strategies for organisms to follow. These 'best' strategies could be difficult to perform given biological constraints such as neural architecture and physiological constraints. Instead, simple heuristics or 'rules-of-thumb' that approximate these optimal strategies may instead be performed. From a modelling perspective, rules-of-thumb are also useful tools for considering how group behaviour is shaped by the behaviours of individuals. Using simple rules-of-thumb reduces the complexity of these models, but care needs to be taken to use rules that are biologically relevant. Here, we investigate the similarity between the outputs of a two-player dynamic foraging game (which generated optimal but complex solutions) and a computational simulation of the behaviours of the two members of a foraging pair, who instead followed a rule-of-thumb approximation of the game's output. The original game generated complex results, and we demonstrate here that the simulations following the much-simplified rules-of-thumb also generate complex results, suggesting that the rule-of-thumb was sufficient to make some of the model outcomes unpredictable. There was some agreement between both modelling techniques, but some differences arose - particularly when pair members were not identical in how they gained and lost energy. We argue that exploring how rules-of-thumb perform in comparison to their optimal counterparts is an important exercise for biologically validating the output of agent-based models of group behaviour.

  10. Levels of Organisation in agent-based modelling for renewable resources management. Agricultural water management collective rules enforcement in the French Drome River Valley Case Study

    International Nuclear Information System (INIS)

    Abrami, G.

    2004-11-01

    Levels of Organisation in agent-based modelling for renewable resources management. Agricultural water management collective rules enforcement in the French Drôme River Valley Case Study. In the context of Agent-Based Modelling for participative renewable resources management, this thesis is concerned with representing multiple tangled levels of organisation of a system. The Agent-Group-Role (AGR) formalism is borrowed from computer science research. It has been conceptually specified to handle levels of organisation, and behaviours within levels of organisation. A design methodology dedicated to AGR modelling has been developed, together with an implementation of the formalism over a multi-agent platform. AGR models of agricultural water management in the French Drôme River Valley have been built and tested. This experiment demonstrates the AGR formalism's ability to (1) clarify usually implicit hypotheses on action modes, scales or viewpoints, (2) facilitate the definition of scenarios with various collective rules and various rule-enforcement behaviours, and (3) generate bricks for generic irrigated catchment models. (author)

  11. A high-level language for rule-based modelling.

    Science.gov (United States)

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages.

  12. Numerical analysis of strain localization for transversely isotropic model with non-coaxial flow rule

    Science.gov (United States)

    Wei, Ding; Cong-cong, Yu; Chen-hui, Wu; Zheng-yi, Shu

    2018-03-01

    To analyse the strain localization behavior of geomaterials, the forward Euler schemes and the tangent modulus matrix are formulated based on the transversely isotropic yield criterion with the non-coaxial flow rule developed by Lade; the program code is implemented in the user subroutine (UMAT) of ABAQUS. The influence of the material principal direction on the strain localization and the bearing capacity of the structure is investigated and analyzed. Numerical results show the validity and performance of the proposed model in simulating the strain localization behavior of geostructures.

  13. Lesion size affects diagnostic performance of IOTA logistic regression models, IOTA simple rules and risk of malignancy index in discriminating between benign and malignant adnexal masses.

    Science.gov (United States)

    Di Legge, A; Testa, A C; Ameye, L; Van Calster, B; Lissoni, A A; Leone, F P G; Savelli, L; Franchi, D; Czekierdowski, A; Trio, D; Van Holsbeke, C; Ferrazzi, E; Scambia, G; Timmerman, D; Valentin, L

    2012-09-01

    To estimate the ability to discriminate between benign and malignant adnexal masses of different size using: subjective assessment, two International Ovarian Tumor Analysis (IOTA) logistic regression models (LR1 and LR2), the IOTA simple rules and the risk of malignancy index (RMI). We used a multicenter IOTA database of 2445 patients with at least one adnexal mass, i.e. the database previously used to prospectively validate the diagnostic performance of LR1 and LR2. The masses were categorized into three subgroups according to their largest diameter: small, medium-sized and large tumors. Subjective assessment, LR1 and LR2, the IOTA simple rules and the RMI were applied to each of the three groups. Sensitivity, specificity, positive and negative likelihood ratio (LR+, LR-), diagnostic odds ratio (DOR) and area under the receiver-operating characteristics curve (AUC) were used to describe diagnostic performance. A moving window technique was applied to estimate the effect of tumor size as a continuous variable on the AUC. The reference standard was the histological diagnosis of the surgically removed adnexal mass. The frequency of invasive malignancy was 10% in small tumors, 19% in medium-sized tumors and 40% in large tumors; 11% of the large tumors were borderline tumors vs 3% and 4%, respectively, of the small and medium-sized tumors. The type of benign histology also differed among the three subgroups. For all methods, sensitivity with regard to malignancy was lowest in small tumors (56-84% vs 67-93% in medium-sized tumors and 74-95% in large tumors) while specificity was lowest in large tumors (60-87% vs 83-95% in medium-sized tumors and 83-96% in small tumors). The DOR and the AUC value were highest in medium-sized tumors and the AUC was largest in tumors with a largest diameter of 7-11 cm. Tumor size affects the performance of subjective assessment, LR1 and LR2, the IOTA simple rules and the RMI in discriminating correctly between benign and malignant adnexal masses. The likely explanation, at least in part, is
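    The diagnostic measures named above follow directly from a 2x2 confusion table; the sketch below uses illustrative counts, not the study's data.

```python
def diagnostics(tp, fn, fp, tn):
    """Standard diagnostic measures from a 2x2 confusion table."""
    sens = tp / (tp + fn)            # sensitivity
    spec = tn / (tn + fp)            # specificity
    lr_pos = sens / (1 - spec)       # positive likelihood ratio (LR+)
    lr_neg = (1 - sens) / spec       # negative likelihood ratio (LR-)
    dor = lr_pos / lr_neg            # diagnostic odds ratio (DOR)
    return sens, spec, lr_pos, lr_neg, dor

# hypothetical counts: 90 true positives, 10 false negatives,
# 20 false positives, 80 true negatives
sens, spec, lr_pos, lr_neg, dor = diagnostics(tp=90, fn=10, fp=20, tn=80)
```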

  14. Optimal Sequential Rules for Computer-Based Instruction.

    Science.gov (United States)

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…

  15. Profitability of simple stationary technical trading rules with high-frequency data of Chinese Index Futures

    Science.gov (United States)

    Chen, Jing-Chao; Zhou, Yu; Wang, Xi

    2018-02-01

    Technical trading rules have been widely used by practitioners in financial markets for a long time. Their profitability remains controversial, however, and few studies consider the stationarity of the technical indicators used in trading rules. We convert MA, KDJ and Bollinger bands into stationary processes and investigate the profitability of the resulting trading rules using three sets of high-frequency data (15 s, 30 s and 60 s) of CSI 300 Stock Index Futures from January 4th 2012 to December 31st 2016. Several performance and risk measures are adopted to assess the practical value of all trading rules directly, while the ADF test is used to verify stationarity and the SPA test to check whether trading rules perform well due to intrinsic superiority or pure luck. The results show that there are several significant combinations of parameters for each indicator when transaction costs are not taken into consideration. Once transaction costs are included, trading profits are eliminated completely. We also propose a method to reduce the risk of technical trading rules.
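
    The abstract does not give the authors' exact transformation, but a common way to make an MA-based indicator stationary is to trade on the deviation of price from its moving average rather than on the price itself. The sketch below illustrates that idea; the price series, window, and threshold are illustrative assumptions, not the paper's parameters.

```python
# Sketch: a mean-reversion rule on the (stationary) deviation of price
# from its moving average. Window, threshold, and prices are hypothetical.

def moving_average(prices, window):
    """Simple moving average; returns one value per fully covered window."""
    return [sum(prices[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(prices))]

def ma_deviation_signals(prices, window, threshold):
    """Long (+1) when price sits well below its MA, short (-1) well above,
    flat (0) otherwise."""
    ma = moving_average(prices, window)
    signals = []
    for price, avg in zip(prices[window - 1:], ma):
        deviation = price - avg  # stationary if price mean-reverts to the MA
        if deviation < -threshold:
            signals.append(1)
        elif deviation > threshold:
            signals.append(-1)
        else:
            signals.append(0)
    return signals

prices = [10.0, 10.2, 10.1, 9.6, 9.5, 10.4, 10.9, 10.2]
print(ma_deviation_signals(prices, window=3, threshold=0.3))
```

    Unlike a raw price rule, the deviation series has a stable mean under mean reversion, which is what makes tests such as the ADF test applicable.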

  16. Rule-based modularization in model transformation languages illustrated with ATL

    NARCIS (Netherlands)

    Ivanov, Ivan; van den Berg, Klaas; Jouault, Frédéric

    2007-01-01

    This paper studies ways of modularizing transformation definitions in current rule-based model transformation languages. Two scenarios are shown in which the modular units are identified on the basis of relations between source and target metamodels and on the basis of generic transformation

  17. The Reasonableness Test of the Principal Purpose Test Rule in OECD BEPS Action 6 (Tax Treaty Abuse) versus the EU Principle of Legal Certainty and the EU Abuse of Law Case Law

    NARCIS (Netherlands)

    D. Weber (Dennis)

    2017-01-01

    The OECD BEPS Action 6 report contains a principal purpose test rule (PPT rule) for the purpose of combating abuse of tax treaties. This PPT rule is also included in the OECD Multilateral Instrument. The PPT rule is (amongst others) applicable when ‘it is reasonable to conclude’

  18. Rules of performance in the nursing home: A grounded theory of nurse-CNA communication.

    Science.gov (United States)

    Madden, Connie; Clayton, Margaret; Canary, Heather E; Towsley, Gail; Cloyes, Kristin; Lund, Dale

    This study offers an initial theoretical understanding of nurse-CNA communication processes from the perspectives of nurses and CNAs who are providing direct care to residents in nursing homes. A grounded theory approach provided an understanding of nurse-CNA communication process within the complexities of the nursing home setting. Four themes (maintaining information flow, following procedure, fostering collegiality, and showing respect) describe the "rules of performance" that intertwine in nuanced relationships to guide nurse-CNA communication processes. Understanding how these rules of performance guide nurse-CNA communication processes, and how they are positively and negatively influenced, suggests that nurse-CNA communication during direct care of nursing home residents could be improved through policy and education that is specifically designed to be relevant and applicable to direct care providers in the nursing home environment. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. MODELING MONETARY POLICY RULES IN THE MENA COUNTRIES: ISSUES AND EVIDENCE

    OpenAIRE

    Mohamad Husam Helmi

    2011-01-01

    This paper estimates the monetary policy reaction function for two sets of MENA countries: the inflation-target countries (Turkey and Israel) and the exchange-rate-target countries (Jordan and Morocco). We motivate our empirical analysis by analyzing a simple Taylor rule. This model looks at the effects of inflation and output on the setting of the interest rate by the central bank. Furthermore, we extended our model by adding the exchange rate and the foreign interest rate ...

  20. Rule-Based Storytelling Text-to-Speech (TTS) Synthesis

    Directory of Open Access Journals (Sweden)

    Ramli Izzad

    2016-01-01

    In recent years, various real-life applications such as talking books, gadgets and humanoid robots have drawn attention to research in the area of expressive speech synthesis. Speech synthesis is widely used in various applications. However, there is a growing need for expressive speech synthesis, especially for communication and robotics. In this paper, global and local rules are developed to convert neutral speech to storytelling-style speech for the Malay language. To generate the rules, modifications of prosodic parameters such as pitch, intensity, duration, tempo and pauses are considered. The modifications of prosodic parameters were derived by performing prosodic analysis on stories collected from an experienced female and an experienced male storyteller. The global and local rules are applied at the sentence level and synthesized using HNM. Subjective tests were conducted to evaluate the quality of the synthesized storytelling speech produced by both rule sets in terms of naturalness, intelligibility, and similarity to the original storytelling speech. The results showed that the global rule gives better results than the local rule.

  1. Uncertainty Analysis of Resistance Tests in Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University

    Directory of Open Access Journals (Sweden)

    Cihad DELEN

    2015-12-01

    In this study, systematic resistance tests performed in the Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University (ITU) were analyzed in order to determine their uncertainties. Experiments conducted within the framework of mathematical and physical rules for the solution of engineering problems, together with the associated measurements and calculations, involve uncertainty. To question the reliability of the obtained values, the existing uncertainties should be expressed as quantities; if the uncertainty of a measurement system is not known, its results do not carry universal value. On the other hand, resistance is one of the most important parameters that should be considered in the process of ship design. Ship resistance cannot be determined precisely and reliably during the design phase because of the uncertainty sources involved in determining the resistance value. This may make it difficult to meet the required specifications in later design steps. The uncertainty arising from the resistance test was estimated and compared for a displacement-type ship and for high-speed marine vehicles according to the ITTC 2002 and ITTC 2014 regulations on uncertainty analysis methods. The advantages and disadvantages of both ITTC uncertainty analysis methods are also discussed.

  2. Modeling reliability measurement of interface on information system: Towards the forensic of rules

    Science.gov (United States)

    Nasution, M. K. M.; Sitompul, Darwin; Harahap, Marwan

    2018-02-01

    Today almost all machines depend on software, and a software and hardware system depends in turn on the rules, that is, the procedures governing its use. If a procedure or program can be reliably characterized using the concepts of graphs, logic, and probability, then regulatory strength can also be measured accordingly. This paper therefore proposes an enumeration model for measuring the reliability of interfaces, based on the case of information systems whose use is governed by the rules of the relevant agencies. The enumeration model is derived from software reliability calculation.

  3. A retrospective study of two populations to test a simple rule for spirometry.

    Science.gov (United States)

    Ohar, Jill A; Yawn, Barbara P; Ruppel, Gregg L; Donohue, James F

    2016-06-04

    Chronic lung disease is common and often under-diagnosed. To test a simple rule for conducting spirometry, we reviewed spirograms from two populations: occupational medicine evaluations (OME) conducted by Saint Louis and Wake Forest Universities at 3 sites (n = 3260, mean age 64.14 years, 95 % CI 58.94-69.34, 97 % men) and preoperative clinic (POC) evaluations conducted by Wake Forest University at one site (n = 845, mean age 62.10 years, 95 % CI 50.46-73.74, 57 % men). This retrospective review of database information, which the first author had collected prospectively, identified the rates, types, sensitivity, specificity and positive and negative predictive values for lung function abnormalities, and the associated mortality rate, found when conducting spirometry based on the 20/40 rule (≥20 years of smoking in those aged ≥40 years) in the OME population. To determine the reproducibility of the 20/40 rule for conducting spirometry, the rule was applied to the POC population. A lung function abnormality was found in 74 % of the OME population and 67 % of the POC population. Sensitivity of the rule was 85 % for an obstructive pattern and 77 % for any abnormality on spirometry. Positive and negative predictive values of the rule for a spirometric abnormality were 74 and 55 %, respectively. Patients with an obstructive pattern were at greater risk of coronary heart disease (odds ratio (OR) 1.39 [confidence interval (CI) 1.00-1.93] vs. normal) and death (hazard ratio (HR) 1.53, 95 % CI 1.20-1.84) than subjects with normal spirometry. Restrictive spirometry patterns were also associated with greater risk of coronary disease (OR 1.7 [CI 1.23-2.35]) and death (HR 1.40, 95 % CI 1.08-1.72). Smokers (≥20 pack-years) aged ≥40 years are at an increased risk for lung function abnormalities, and those abnormalities are associated with greater presence of coronary heart disease and increased all-cause mortality. Use of the 20/40 rule could provide a
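
    As stated in the abstract, the 20/40 rule reduces to a one-line predicate (smoking history of ≥20 years in patients aged ≥40 years); a minimal sketch, with function and parameter names chosen here for illustration:

```python
def should_offer_spirometry(age_years, years_smoked):
    """The '20/40 rule' from the abstract: conduct spirometry for
    patients aged >= 40 years with >= 20 years of smoking history."""
    return age_years >= 40 and years_smoked >= 20

print(should_offer_spirometry(64, 30))  # long-term smoker over 40 -> True
print(should_offer_spirometry(35, 25))  # under the age cutoff -> False
```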

  4. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large scale model testing performed using large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. The results are described from testing the material resistance to fracture (non-ductile). The testing included the base materials and welded joints. The rated specimen thickness was 150 mm with defects of a depth between 15 and 100 mm. The results are also presented of nozzles of 850 mm inner diameter in a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  5. Sleep promotes the extraction of grammatical rules.

    Directory of Open Access Journals (Sweden)

    Ingrid L C Nieuwenhuis

    Grammar acquisition is a high-level cognitive function that requires the extraction of complex rules. While it has been proposed that offline time might benefit this type of rule extraction, this remains to be tested. Here, we addressed this question using an artificial grammar learning paradigm. During a short-term memory cover task, eighty-one human participants were exposed to letter sequences generated according to an unknown artificial grammar. Following a time delay of 15 min, 12 h (wake or sleep) or 24 h, participants classified novel test sequences as Grammatical or Non-Grammatical. Previous behavioral and functional neuroimaging work has shown that classification can be guided by two distinct underlying processes: (1) the holistic abstraction of the underlying grammar rules and (2) the detection of sequence chunks that appear at varying frequencies during exposure. Here, we show that classification performance improved after sleep. Moreover, this improvement was due to an enhancement of rule abstraction, while the effect of chunk frequency was unaltered by sleep. These findings suggest that sleep plays a critical role in extracting complex structure from separate but related items during integrative memory processing. Our findings stress the importance of alternating periods of learning with sleep in settings in which complex information must be acquired.

  6. Performance testing of a novel off-plane reflection grating and silicon pore optic spectrograph at PANTER

    Science.gov (United States)

    Marlowe, Hannah; McEntaffer, Randall L.; Allured, Ryan; DeRoo, Casey; Miles, Drew M.; Donovan, Benjamin D.; Tutt, James H.; Burwitz, Vadim; Menz, Benedikt; Hartner, Gisela D.; Smith, Randall K.; Günther, Ramses; Yanson, Alex; Vacanti, Giuseppe; Ackermann, Marcelo

    2015-05-01

    An X-ray spectrograph consisting of aligned, radially ruled off-plane reflection gratings and silicon pore optics (SPO) was tested at the Max Planck Institute for extraterrestrial Physics PANTER X-ray test facility. The SPO is a test module for the proposed Arcus mission, which will also feature aligned off-plane reflection gratings. This test is the first time two off-plane gratings were actively aligned to each other and with a SPO to produce an overlapped spectrum. We report the performance of the complete spectrograph utilizing the aligned gratings module and plans for future development.

  7. Test anxiety, perfectionism, goal orientation, and academic performance.

    Science.gov (United States)

    Eum, KoUn; Rice, Kenneth G

    2011-03-01

    Dimensions of perfectionism and goal orientation have been reported to have differential relationships with test anxiety. However, the degree of inter-relationship between different dimensions of perfectionism, the 2 × 2 model of goal orientations proposed by Elliot and McGregor, cognitive test anxiety, and academic performance indicators is not known. Based on data from 134 university students, we conducted correlation and regression analyses to test associations between adaptive and maladaptive perfectionism, four types of goal orientations, cognitive test anxiety, and two indicators of academic performance: proximal cognitive performance on a word list recall test and distal academic performance in terms of grade point average. Cognitive test anxiety was inversely associated with both performance indicators, and positively associated with maladaptive perfectionism and avoidance goal orientations. Adaptive and maladaptive perfectionism accounted for significant variance in cognitive test anxiety after controlling for approach and avoidance goal orientations. Overall, nearly 50% of the variance in cognitive test anxiety could be attributed to gender, goal orientations, and perfectionism. Results suggested that students who are highly test anxious are likely to be women who endorse avoidance goal orientations and are maladaptively perfectionistic.

  8. Evaluation of rules to distinguish unique female grizzly bears with cubs in Yellowstone

    Science.gov (United States)

    Schwartz, C.C.; Haroldson, M.A.; Cherry, S.; Keating, K.A.

    2008-01-01

    The United States Fish and Wildlife Service uses counts of unduplicated female grizzly bears (Ursus arctos) with cubs-of-the-year to establish limits of sustainable mortality in the Greater Yellowstone Ecosystem, USA. Sightings are clustered into observations of unique bears based on an empirically derived rule set. The method has never been tested or verified. To evaluate the rule set, we used data from radiocollared females obtained during 1975-2004 to simulate populations under varying densities, distributions, and sighting frequencies. We tested individual rules and rule-set performance, using custom software to apply the rule set and cluster sightings. Results indicated most rules were violated to some degree, and rule-based clustering consistently underestimated the minimum number of females and total population size derived from a nonparametric estimator (Chao2). We conclude that the current rule set returns conservative estimates, but with minor improvements, counts of unduplicated females-with-cubs can serve as a reasonable index of population size useful for establishing annual mortality limits. For the Yellowstone population, the index is more practical and cost-effective than capture-mark-recapture using either DNA hair snagging or aerial surveys with radiomarked bears. The method has useful application in other ecosystems, but we recommend rules used to distinguish unique females be adapted to local conditions and tested.
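
    The Chao2 benchmark used above is a standard incidence-based richness estimator. A sketch of a common bias-corrected form follows; it omits the finite-number-of-occasions correction factor that some variants include, and the detection counts are hypothetical:

```python
from collections import Counter

def chao2(detection_counts):
    """Bias-corrected Chao2 estimator (simplified common form).
    detection_counts: for each detected individual, the number of
    sampling occasions on which it was detected."""
    s_obs = len(detection_counts)          # individuals actually detected
    tally = Counter(detection_counts)
    q1 = tally[1]                          # detected on exactly one occasion
    q2 = tally[2]                          # detected on exactly two occasions
    return s_obs + q1 * (q1 - 1) / (2 * (q2 + 1))

# Hypothetical season: 8 females detected, five once, two twice, one thrice.
print(chao2([1, 1, 1, 1, 1, 2, 2, 3]))
```

    Many singletons relative to doubletons push the estimate well above the observed count, which is why rule-based clustering that undercounts unique females falls short of the Chao2 figure.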

  9. Probabilistic safety assessment support for the maintenance rule at Duke Power Company

    International Nuclear Information System (INIS)

    Brewer, H. Duncan; Canady, Ken S.

    1999-01-01

    The Nuclear Regulatory Commission (NRC) published the Maintenance Rule on July 10, 1991 with an implementation date of July 10, 1996. Maintenance rule implementation at the Duke Power Company has used probabilistic safety assessment (PSA) insights to help focus the monitoring of structures, systems and components (SSC) performance and to ensure that maintenance is effectively performed. This paper describes how the probabilistic risk assessment (PRA) group at the Duke Power Company provides support for the maintenance rule by performing the following tasks: (1) providing a member of the expert panel; (2) determining the risk-significant SSCs; (3) establishing SSC performance criteria for availability and reliability; (4) evaluating past performance and its impact on core damage risk as part of the periodic assessment; (5) providing input to the PRA matrix; (6) providing risk analyses of combinations of SSCs out of service; (7) providing support for the SENTINEL program; and (8) providing support for PSA training. These tasks are not simply tied to the initial implementation of the rule. The maintenance rule must be kept consistent with the current design and operation of the plant. This will require that the PRA models and the many PSA calculations performed to support the maintenance rule are kept up-to-date. Therefore, support of the maintenance rule will be one of the primary roles of the PSA group for the remainder of the life of the plant.

  10. Exploration of SWRL Rule Bases through Visualization, Paraphrasing, and Categorization of Rules

    Science.gov (United States)

    Hassanpour, Saeed; O'Connor, Martin J.; Das, Amar K.

    Rule bases are increasingly being used as repositories of knowledge content on the Semantic Web. As the size and complexity of these rule bases increases, developers and end users need methods of rule abstraction to facilitate rule management. In this paper, we describe a rule abstraction method for Semantic Web Rule Language (SWRL) rules that is based on lexical analysis and a set of heuristics. Our method results in a tree data structure that we exploit in creating techniques to visualize, paraphrase, and categorize SWRL rules. We evaluate our approach by applying it to several biomedical ontologies that contain SWRL rules, and show how the results reveal rule patterns within the rule base. We have implemented our method as a plug-in tool for Protégé-OWL, the most widely used ontology modeling software for the Semantic Web. Our tool can allow users to rapidly explore content and patterns in SWRL rule bases, enabling their acquisition and management.

  11. Design rules for piping: Plastic stability of straight parts under level D loadings

    International Nuclear Information System (INIS)

    Touboul, F.; Ben Djidia, M.; Acker, D.

    1989-01-01

    Design rules for piping, elaborated for fast breeder reactors, are based on analyses performed for pressurized water reactors. Interpretation of a widely diversified set of tests on straight parts enables us to validate and improve the existing rules and to propose a more suitable formula. The design rules for piping appear to be non-conservative for austenitic thin tubes in bending or torsion. By introducing a geometry-dependent B2 coefficient, the gap between thin and thick tubes can be accounted for. Conservatism of the rules can be ensured by considering the allowable stress defined by ASME, Section III, Appendix F.

  12. Item-saving assessment of self-care performance in children with developmental disabilities: A prospective caregiver-report computerized adaptive test

    Science.gov (United States)

    Chen, Cheng-Te; Chen, Yu-Lan; Lin, Yu-Ching; Hsieh, Ching-Lin; Tzeng, Jeng-Yi

    2018-01-01

    Objective The purpose of this study was to construct a computerized adaptive test (CAT) for measuring self-care performance (the CAT-SC) in children with developmental disabilities (DD) aged from 6 months to 12 years in a content-inclusive, precise, and efficient fashion. Methods The study was divided into 3 phases: (1) item bank development, (2) item testing, and (3) a simulation study to determine the stopping rules for the administration of the CAT-SC. A total of 215 caregivers of children with DD were interviewed with the 73-item CAT-SC item bank. An item response theory model was adopted for examining the construct validity to estimate item parameters after investigation of the unidimensionality, equality of slope parameters, item fitness, and differential item functioning (DIF). In the last phase, the reliability and concurrent validity of the CAT-SC were evaluated. Results The final CAT-SC item bank contained 56 items. The stopping rules suggested were (a) reliability coefficient greater than 0.9 or (b) 14 items administered. The results of simulation also showed that 85% of the estimated self-care performance scores would reach a reliability higher than 0.9 with a mean test length of 8.5 items, and the mean reliability for the rest was 0.86. Administering the CAT-SC could reduce the number of items administered by 75% to 84%. In addition, self-care performances estimated by the CAT-SC and the full item bank were very similar to each other (Pearson r = 0.98). Conclusion The newly developed CAT-SC can efficiently measure self-care performance in children with DD whose performances are comparable to those of TD children aged from 6 months to 12 years as precisely as the whole item bank. The item bank of the CAT-SC has good reliability and a unidimensional self-care construct, and the CAT can estimate self-care performance with less than 25% of the items in the item bank. 
Therefore, the CAT-SC could be useful for measuring self-care performance in children with
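
    The CAT-SC stopping rule above (stop once reliability exceeds 0.9 or 14 items have been administered) can be sketched as a simple predicate. Reliability is approximated here as 1 - SE(θ)², a standard IRT identity for a standardized latent trait; the function name and default arguments are illustrative assumptions:

```python
def should_stop(num_administered, standard_error,
                max_items=14, reliability_target=0.90):
    """CAT stopping rule as described in the abstract: stop when estimated
    reliability reaches the target, or when the item cap is hit.
    Reliability is approximated as 1 - SE(theta)^2 (standardized theta)."""
    reliability = 1.0 - standard_error ** 2
    return reliability >= reliability_target or num_administered >= max_items

print(should_stop(8, 0.30))   # SE = 0.30 -> reliability ~0.91: stop
print(should_stop(10, 0.40))  # reliability ~0.84 and under the cap: continue
print(should_stop(14, 0.40))  # item cap reached: stop regardless
```

    Under this rule the test length adapts to the respondent: precise response patterns terminate early, while noisy ones run to the 14-item cap, which is how the CAT-SC achieves its reported 75-84% reduction in items administered.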

  13. Concurrent approach for evolving compact decision rule sets

    Science.gov (United States)

    Marmelstein, Robert E.; Hammack, Lonnie P.; Lamont, Gary B.

    1999-02-01

    The induction of decision rules from data is important to many disciplines, including artificial intelligence and pattern recognition. To improve the state of the art in this area, we introduced the genetic rule and classifier construction environment (GRaCCE). It was previously shown that GRaCCE consistently evolved decision rule sets from data which were significantly more compact than those produced by other methods (such as decision tree algorithms). The primary disadvantage of GRaCCE, however, is its relatively poor run-time execution performance. In this paper, a concurrent version of the GRaCCE architecture is introduced, which improves the efficiency of the original algorithm. A prototype of the algorithm is tested on an in-house parallel processor configuration and the results are discussed.

  14. Efficient Encoding of Inflection Rules in NLP Systems

    Directory of Open Access Journals (Sweden)

    Péter BARABÁSS

    2012-12-01

    The grammatical parsing unit is a core module in natural language processing engines. This unit determines the grammatical roles of the incoming words and it converts the sentences into semantic models. A special grammar rule in agglutinative languages is the inflection rule. The traditional, automata-based parsers are usually not very effective in the parsing of inflection transformations. The paper presents implementation alternatives and compares them from the viewpoint of time efficiency and accuracy. The prototype system was tested with examples from Hungarian.

  15. Textiles Performance Testing Facilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Textiles Performance Testing Facilities has the capabilities to perform all physical wet and dry performance testing, and visual and instrumental color analysis...

  16. Isokinetic strength testing does not predict hamstring injury in Australian Rules footballers

    OpenAIRE

    Bennell, K.; Wajswelner, H.; Lew, P.; Schall-Riaucour, A.; Leslie, S.; Plant, D.; Cirone, J.

    1998-01-01

    OBJECTIVE: To determine the relation of hamstring and quadriceps muscle strength and imbalance to hamstring injury using a prospective observational cohort study. METHOD: A total of 102 senior male Australian Rules footballers aged 22.2 (3.6) years were tested at the start of a football season. Maximum voluntary concentric and eccentric torque of the hamstring and quadriceps muscles of both legs was assessed using a Kin-Com isokinetic dynamometer at angular velocities of 60 and 180 degre...

  17. The Cynomolgus Macaque Natural History Model of Pneumonic Tularemia for Predicting Clinical Efficacy Under the Animal Rule

    Science.gov (United States)

    Guina, Tina; Lanning, Lynda L.; Omland, Kristian S.; Williams, Mark S.; Wolfraim, Larry A.; Heyse, Stephen P.; Houchens, Christopher R.; Sanz, Patrick; Hewitt, Judith A.

    2018-01-01

    Francisella tularensis is a highly infectious Gram-negative bacterium that is the etiologic agent of tularemia in animals and humans and a Tier 1 select agent. The natural incidence of pneumonic tularemia worldwide is very low; therefore, it is not feasible to conduct clinical efficacy testing of tularemia medical countermeasures (MCM) in human populations. Development and licensure of tularemia therapeutics and vaccines need to occur under the Food and Drug Administration's (FDA's) Animal Rule, under which efficacy studies are conducted in well-characterized animal models that reflect the pathophysiology of human disease. The Tularemia Animal Model Qualification (AMQ) Working Group is seeking qualification of the cynomolgus macaque (Macaca fascicularis) model of pneumonic tularemia under the FDA's Drug Development Tools Qualification Programs based upon the results of the studies described in this manuscript. Analysis of data on survival, average time to death, average time to fever onset, average interval between fever and death, and bacteremia, together with summaries of clinical signs, necropsy findings, and histopathology from the animals exposed to aerosolized F. tularensis Schu S4 in five natural history studies and one antibiotic efficacy study, forms the basis for the proposed cynomolgus macaque model. The results support the conclusion that signs of pneumonic tularemia in cynomolgus macaques exposed to 300–3,000 colony forming units (cfu) of aerosolized F. tularensis Schu S4, under the conditions described herein, and human pneumonic tularemia cases are highly similar. Animal age, weight, and sex did not impact fever onset in animals challenged with 300–3,000 cfu Schu S4 in the studies described herein. This study summarizes the critical parameters and endpoints of a well-characterized cynomolgus macaque model of pneumonic tularemia and demonstrates that this model is appropriate for qualification, and for testing the efficacy of tularemia therapeutics, under the Animal Rule.

  18. Decision support system for triage management: A hybrid approach using rule-based reasoning and fuzzy logic.

    Science.gov (United States)

    Dehghani Soufi, Mahsa; Samad-Soltani, Taha; Shams Vahdati, Samad; Rezaei-Hachesu, Peyman

    2018-06-01

    Fast and accurate patient triage is a critical first step of the response process in emergency situations. This process is often performed in a paper-based mode, which intensifies workload and difficulty, wastes time, and is at risk of human error. This study aims to design and evaluate a decision support system (DSS) to determine the triage level. A combination of the Rule-Based Reasoning (RBR) and Fuzzy Logic Classifier (FLC) approaches was used to predict the triage level of patients according to the triage specialists' opinions and the Emergency Severity Index (ESI) guidelines. RBR was applied to model the first to fourth decision points of the ESI algorithm. The data relating to vital signs were used as input variables and modeled using fuzzy logic. Narrative knowledge was converted to If-Then rules using XML. The extracted rules were then used to create the rule-based engine and predict the triage levels. Fourteen RBR and 27 fuzzy rules were extracted and used in the rule-based engine. The performance of the system was evaluated using three methods with real triage data. The accuracy of the clinical decision support system (CDSS; on the test data) was 99.44%. The evaluation of the error rate revealed that, when using the traditional method, 13.4% of the patients were mis-triaged, which is statistically significant. The completeness of the documentation also improved from 76.72% to 98.5%. The designed system was effective in determining the triage level of patients and proved helpful to nurses as they made decisions and generated nursing diagnoses based on triage guidelines. The hybrid approach can reduce triage misdiagnosis in a highly accurate manner and improve triage outcomes. Copyright © 2018 Elsevier B.V. All rights reserved.
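
    The hybrid idea described above (crisp If-Then rules for the clear-cut ESI decision points, fuzzy membership on vital signs for the borderline cases) can be sketched as follows. All rules, thresholds, and membership shapes here are hypothetical illustrations, not the paper's actual knowledge base:

```python
# Sketch of a hybrid RBR + fuzzy triage decision. Crisp rules handle the
# first ESI decision points; a fuzzy membership grades a vital sign.
# Thresholds and membership parameters below are invented for illustration.

def fuzzy_high_heart_rate(hr):
    """Membership in 'tachycardic': 0 below 100 bpm, 1 above 130 bpm,
    linear ramp in between."""
    if hr <= 100:
        return 0.0
    if hr >= 130:
        return 1.0
    return (hr - 100) / 30.0

def triage_level(needs_resuscitation, high_risk, hr):
    # Rule-based part: hard If-Then rules for the top ESI levels.
    if needs_resuscitation:
        return 1
    if high_risk:
        return 2
    # Fuzzy part: grade remaining patients by vital-sign membership.
    if fuzzy_high_heart_rate(hr) >= 0.5:
        return 3
    return 4

print(triage_level(False, False, 125))  # strongly tachycardic -> level 3
print(triage_level(False, False, 90))   # normal heart rate -> level 4
```

    Splitting the logic this way keeps the safety-critical decisions deterministic while letting the fuzzy layer absorb the measurement noise in vital signs.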

  19. [Usefulness of clinical prediction rules for ruling out deep vein thrombosis in a hospital emergency department].

    Science.gov (United States)

    Rosa-Jiménez, Francisco; Rosa-Jiménez, Ascensión; Lozano-Rodríguez, Aquiles; Santoro-Martínez, María Del Carmen; Duro-López, María Del Carmen; Carreras-Álvarez de Cienfuegos, Amelia

    2015-01-01

    To compare the efficacy of the most familiar clinical prediction rules in combination with D-dimer testing to rule out a diagnosis of deep vein thrombosis (DVT) in a hospital emergency department. Retrospective cross-sectional analysis of the case records of all patients attending a hospital emergency department with suspected lower-limb DVT between 1998 and 2002. Ten clinical prediction scores were calculated and D-dimer levels were available for all patients. The gold standard was ultrasound diagnosis of DVT by an independent radiologist who was blinded to clinical records. For each prediction rule, we analyzed the effectiveness of the prediction strategy defined by "low clinical probability and negative D-dimer level" against the ultrasound diagnosis. A total of 861 case records were reviewed and 577 cases were selected; the mean (SD) age was 66.7 (14.2) years. DVT was diagnosed in 145 patients (25.1%). Only the Wells clinical prediction rule and 4 other models had a false negative rate under 2%. The Wells criteria and the score published by Johanning and colleagues identified higher percentages of cases (15.6% and 11.6%, respectively). This study shows that several clinical prediction rules can be safely used in the emergency department, although none of them have proven more effective than the Wells criteria.
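
    The rule-out strategy evaluated above ("low clinical probability and negative D-dimer") is itself a simple conjunction. A minimal sketch, assuming the common conventions of a Wells score below 1 for "low probability" and a 500 ng/mL D-dimer cutoff (both assumptions for illustration, not values reported in this record):

```python
def dvt_ruled_out(wells_score, d_dimer_ng_ml, d_dimer_cutoff=500):
    """Rule-out strategy: DVT is excluded without ultrasound only when
    clinical probability is low AND the D-dimer test is negative.
    Score threshold and cutoff are assumed conventions."""
    low_probability = wells_score < 1
    negative_d_dimer = d_dimer_ng_ml < d_dimer_cutoff
    return low_probability and negative_d_dimer

print(dvt_ruled_out(0, 320))  # low score, negative D-dimer: ruled out
print(dvt_ruled_out(0, 850))  # positive D-dimer: ultrasound still needed
print(dvt_ruled_out(3, 320))  # high clinical probability: ultrasound needed
```

    The false-negative rate of this conjunction against the ultrasound gold standard is what the study's "under 2%" safety criterion measures.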

  20. Maintenance Personnel Performance Simulation (MAPPS) model

    International Nuclear Information System (INIS)

    Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Knee, H.E.; Haas, P.M.

    1984-01-01

    A stochastic computer model for simulating the actions and behavior of nuclear power plant maintenance personnel is described. The model considers personnel, environmental, and motivational variables to yield predictions of maintenance performance quality and time to perform. The model has been fully developed and sensitivity tested. Additional evaluation of the model is now taking place.

  1. Operator performance in non-destructive testing: A study of operator performance in a performance test

    Energy Technology Data Exchange (ETDEWEB)

    Enkvist, J.; Edland, A.; Svenson, Ola [Stockholm Univ. (Sweden). Dept. of Psychology]

    2000-05-15

    In the process industries there is a need to inspect the integrity of critical components without disrupting the process. Such in-service inspections are typically performed with non-destructive testing (NDT). In NDT the task of the operator is to decide, based on diagnostic information, whether the component can remain in service. The present study looks at performance in NDT. The aim is to improve performance, in the long run, by exploring the operators' decision strategies and other underlying factors, and in this way to find out what makes some operators more successful than others. Sixteen operators performed manual ultrasonic inspections of four test pieces with the aim of detecting (implanted) cracks. In addition to these performance demonstration tests (PDT), the operators performed independent ability tests and filled out questionnaires. The results show that operators who trust their gut feeling more than the procedure (when the two come to different results), and who at the same time have a positive attitude towards the procedure, have a higher PDT performance. These results indicate the need for operators to be motivated and confident when performing NDT. It was also found that the operators who performed better rated more decision criteria higher in the detection phase than the operators who performed worse. For characterizing it was the other way around. Also, the operators who performed better used more time, both detecting and characterizing, than the operators who performed worse.

  2. Operator performance in non-destructive testing: A study of operator performance in a performance test

    International Nuclear Information System (INIS)

    Enkvist, J.; Edland, A.; Svenson, Ola

    2000-05-01

    In the process industries there is a need to inspect the integrity of critical components without disrupting the process. Such in-service inspections are typically performed with non-destructive testing (NDT). In NDT the task of the operator is to decide, based on diagnostic information, whether the component can remain in service. The present study looks at performance in NDT. The aim is to improve performance, in the long run, by exploring the operators' decision strategies and other underlying factors, and in this way to find out what makes some operators more successful than others. Sixteen operators performed manual ultrasonic inspections of four test pieces with the aim of detecting (implanted) cracks. In addition to these performance demonstration tests (PDT), the operators performed independent ability tests and filled out questionnaires. The results show that operators who trust their gut feeling more than the procedure (when the two come to different results), and who at the same time have a positive attitude towards the procedure, have a higher PDT performance. These results indicate the need for operators to be motivated and confident when performing NDT. It was also found that the operators who performed better rated more decision criteria higher in the detection phase than the operators who performed worse. For characterizing it was the other way around. Also, the operators who performed better used more time, both detecting and characterizing, than the operators who performed worse.

  3. A rule-based fault detection method for air handling units

    Energy Technology Data Exchange (ETDEWEB)

    Schein, J.; Bushby, S. T.; Castro, N. S. [National Institute of Standards and Technology, Gaithersburg, MD (United States); House, J. M. [Iowa Energy Center, Ankeny, IA (United States)

    2006-07-01

    Air handling unit performance assessment rules (APAR) is a fault detection tool that uses a set of expert rules derived from mass and energy balances to detect faults in air handling units (AHUs). Control signals are used to determine the mode of operation of the AHU. A subset of the expert rules which correspond to that mode of operation are then evaluated to determine whether a fault exists. APAR is computationally simple enough that it can be embedded in commercial building automation and control systems and relies only upon the sensor data and control signals that are commonly available in these systems. APAR was tested using data sets collected from a 'hardware-in-the-loop' emulator and from several field sites. APAR was also embedded in commercial AHU controllers and tested in the emulator. (author)
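    The mode-then-rules structure described above can be sketched in a few lines. The signal names, mode logic, rule bodies and thresholds below are invented for illustration; they are not the published APAR rule set.

```python
# Hypothetical sketch of the APAR idea: pick an operating mode from control
# signals, then evaluate only the expert rules registered for that mode.
# Signal names and rules are illustrative, not the published APAR set.

def detect_mode(signals):
    if signals["heating_valve"] > 0:
        return "heating"
    if signals["cooling_valve"] > 0:
        return "cooling"
    return "economizer"

RULES = {
    "heating": [
        # In heating mode the supply air should be warmer than the mixed air.
        lambda s: "supply air not warmer than mixed air"
        if s["supply_temp"] <= s["mixed_temp"] else None,
    ],
    "cooling": [
        lambda s: "supply air not cooler than mixed air"
        if s["supply_temp"] >= s["mixed_temp"] else None,
    ],
    "economizer": [],
}

def check_faults(signals):
    mode = detect_mode(signals)
    return [msg for rule in RULES[mode] if (msg := rule(signals))]

faults = check_faults(
    {"heating_valve": 0.4, "cooling_valve": 0.0,
     "supply_temp": 16.0, "mixed_temp": 18.0})
print(faults)  # heating mode, supply colder than mixed air -> fault reported
```

    Only the rules for the detected mode are ever evaluated, which mirrors why the method stays cheap enough to embed in commercial controllers.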

  4. Stochastic prey arrivals and crab spider giving-up times: simulations of spider performance using two simple "rules of thumb".

    Science.gov (United States)

    Kareiva, Peter; Morse, Douglass H; Eccleston, Jill

    1989-03-01

    We compared the patch-choice performances of an ambush predator, the crab spider Misumena vatia (Thomisidae) hunting on common milkweed Asclepias syriaca (Asclepiadaceae) umbels, with two stochastic rule-of-thumb simulation models: one that employed a threshold giving-up time and one that assumed a fixed probability of moving. Adult female Misumena were placed on milkweed plants with three umbels, each with markedly different numbers of flower-seeking prey. Using a variety of visitation regimes derived from observed visitation patterns of insect prey, we found that decreases in among-umbel variance in visitation rates or increases in overall mean visitation rates reduced the "clarity of the optimum" (the difference in the yield obtained as foraging behavior changes), both locally and globally. Yield profiles from both models were extremely flat or jagged over a wide range of prey visitation regimes; thus, differences between optimal and "next-best" strategies differed only modestly over large parts of the "foraging landscape". Although optimal yields from fixed probability simulations were one-third to one-half those obtained from threshold simulations, spiders appear to depart umbels in accordance with the fixed probability rule.
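    The two rules of thumb compared above can be simulated in a few lines. The arrival probabilities, time horizon, threshold and move probability below are invented parameters, not those fitted in the study.

```python
import random

# Illustrative simulation of the two rules of thumb: a threshold giving-up
# time (leave after T prey-free steps) versus a fixed per-step probability of
# moving. Patch arrival rates and all parameters are invented.

def simulate(rule, patch_rates, steps=200, seed=0):
    rng = random.Random(seed)
    patch, since_prey, total = 0, 0, 0
    for _ in range(steps):
        if rng.random() < patch_rates[patch]:   # stochastic prey arrival
            total += 1
            since_prey = 0
        else:
            since_prey += 1
        if rule(rng, since_prey):               # decide whether to move on
            patch = (patch + 1) % len(patch_rates)
            since_prey = 0
    return total

threshold_rule = lambda rng, t: t >= 10              # giving-up time threshold
fixed_prob_rule = lambda rng, t: rng.random() < 0.1  # fixed move probability

rates = [0.05, 0.2, 0.5]  # umbels with different prey visitation rates
print(simulate(threshold_rule, rates), simulate(fixed_prob_rule, rates))
```

    Sweeping `rates` and the rule parameters is one way to reproduce the flat "yield profiles" the abstract describes.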

  5. Patterns of sexual size dimorphism in horseshoe bats: Testing Rensch's rule and potential causes.

    Science.gov (United States)

    Wu, Hui; Jiang, Tinglei; Huang, Xiaobin; Feng, Jiang

    2018-02-08

    Rensch's rule, stating that sexual size dimorphism (SSD) becomes more evident and male-biased with increasing body size, has been well supported for taxa that exhibit male-biased SSD. Bats, primarily having female-biased SSD, have so far been tested for whether SSD allometry conforms to Rensch's rule in only three studies. However, these studies did not consider phylogeny, and thus the mechanisms underlying SSD variations in bats remain unclear. Thus, the present study reviewed published and original data, including body size, baculum size, and habitat types in 45 bats of the family Rhinolophidae to determine whether horseshoe bats follow Rensch's rule using a phylogenetic comparative framework. We also investigated the potential effect of postcopulatory sexual selection and habitat type on SSD. Our findings indicated that Rensch's rule did not apply to Rhinolophidae, suggesting that SSD did not significantly vary with increasing size. This pattern may be attributable to interactions between weak sexual selection on male body size and strong fecundity selection on female body size. The degree of SSD among horseshoe bats may be attributed to a phylogenetic effect rather than to intersexual competition for food or to baculum length. Interestingly, we observed that species in open habitats exhibited greater SSD than those in dense forests, suggesting that habitat types may be associated with variations in SSD in horseshoe bats.

  6. RULES FOR SELECTING AND USING KEY PERFORMANCE INDICATORS FOR THE SERVICE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Alexandra - Elena RUSĂNEANU

    2014-06-01

    There is no question that performance is the desired result of every activity or action. In order to correctly measure an organization's performance it is necessary to select key performance indicators (KPIs) that will deliver long-term value to the company. KPIs present performance information for all levels of the organization and reflect the progress made so far towards strategic objectives. The selection of key performance indicators must be made according to the organization's industry and activity. The company must truly understand its business and its mission. Also, KPIs must be closely linked to the strategic objectives. The focus of this research is to present effective rules for defining key performance indicators for the service industry. This sector of the economy consists of generating intangible goods such as experience, expertise and information. Therefore, monitoring this type of service requires a different approach to defining performance indicators compared to the manufacturing industry.

  7. Spatial Rule-Based Modeling: A Method and Its Application to the Human Mitotic Kinetochore

    Directory of Open Access Journals (Sweden)

    Jan Huwald

    2013-07-01

    A common problem in the analysis of biological systems is the combinatorial explosion that emerges from the complexity of multi-protein assemblies. Conventional formalisms, like differential equations, Boolean networks and Bayesian networks, are unsuitable for dealing with the combinatorial explosion, because they are designed for a restricted state space with fixed dimensionality. To overcome this problem, the rule-based modeling language, BioNetGen, and the spatial extension, SRSim, have been developed. Here, we describe how to apply rule-based modeling to integrate experimental data from different sources into a single spatial simulation model and how to analyze the output of that model. The starting point for this approach can be a combination of molecular interaction data, reaction network data, proximities, binding and diffusion kinetics and molecular geometries at different levels of detail. We describe the technique and then use it to construct a model of the human mitotic inner and outer kinetochore, including the spindle assembly checkpoint signaling pathway. This allows us to demonstrate the utility of the procedure, show how a novel perspective for understanding such complex systems becomes accessible and elaborate on challenges that arise in the formulation, simulation and analysis of spatial rule-based models.
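    The core rule-based idea — stating a local interaction rule instead of pre-enumerating every possible species — can be caricatured in plain Python. The molecule names and the single binding rule below are invented for illustration; a real model would be written in BioNetGen/SRSim as the abstract describes.

```python
# Minimal illustration of rule-based modeling: one binding rule applied to
# whatever unbound molecules happen to exist, rather than an enumerated
# reaction per species. Names and the rule itself are invented.

def apply_binding_rule(molecules):
    """Bind one free A to one free B, if both exist (one rule firing)."""
    a = next((m for m in molecules if m["type"] == "A" and m["bound"] is None), None)
    b = next((m for m in molecules if m["type"] == "B" and m["bound"] is None), None)
    if a and b:
        a["bound"], b["bound"] = id(b), id(a)
        return True
    return False

pool = [{"type": "A", "bound": None}, {"type": "B", "bound": None},
        {"type": "A", "bound": None}]
fired = 0
while apply_binding_rule(pool):
    fired += 1
print(fired)  # one A-B pair can form; the spare A stays free -> 1
```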

  8. BNL NONLINEAR PRE TEST SEISMIC ANALYSIS FOR THE NUPEC ULTIMATE STRENGTH PIPING TEST PROGRAM

    International Nuclear Information System (INIS)

    DEGRASSI, G.; HOFMAYER, C.; MURPHY, C.; SUZUKI, K.; NAMITA, Y.

    2003-01-01

    The Nuclear Power Engineering Corporation (NUPEC) of Japan has been conducting a multi-year research program to investigate the behavior of nuclear power plant piping systems under large seismic loads. The objectives of the program are: to develop a better understanding of the elasto-plastic response and ultimate strength of nuclear piping; to ascertain the seismic safety margin of current piping design codes; and to assess new piping code allowable stress rules. Under this program, NUPEC has performed a large-scale seismic proving test of a representative nuclear power plant piping system. In support of the proving test, a series of materials tests, static and dynamic piping component tests, and seismic tests of simplified piping systems have also been performed. As part of collaborative efforts between the United States and Japan on seismic issues, the US Nuclear Regulatory Commission (USNRC) and its contractor, the Brookhaven National Laboratory (BNL), are participating in this research program by performing pre-test and post-test analyses, and by evaluating the significance of the program results with regard to safety margins. This paper describes BNL's pre-test analysis to predict the elasto-plastic response for one of NUPEC's simplified piping system seismic tests. The capability to simulate the anticipated ratcheting response of the system was of particular interest. Analyses were performed using classical bilinear and multilinear kinematic hardening models as well as a nonlinear kinematic hardening model. Comparisons of analysis results for each plasticity model against test results for a static cycling elbow component test and for a simplified piping system seismic test are presented in the paper

  9. IOTA Simple Rules in Differentiating between Benign and Malignant Adnexal Masses by Non-expert Examiners.

    Science.gov (United States)

    Tinnangwattana, Dangcheewan; Vichak-Ururote, Linlada; Tontivuthikul, Paponrad; Charoenratana, Cholaros; Lerthiranwong, Thitikarn; Tongsong, Theera

    2015-01-01

    To evaluate the diagnostic performance of IOTA simple rules in predicting malignant adnexal tumors by non-expert examiners. Five obstetric/gynecologic residents, who had never performed gynecologic ultrasound examination by themselves before, were trained for IOTA simple rules by an experienced examiner. One trained resident performed ultrasound examinations including IOTA simple rules on 100 women, who were scheduled for surgery due to ovarian masses, within 24 hours of surgery. The gold standard diagnosis was based on pathological or operative findings. The five trained residents performed IOTA simple rules on 30 patients for evaluation of inter-observer variability. A total of 100 patients underwent ultrasound examination for the IOTA simple rules. Of them, IOTA simple rules could be applied in 94 (94%) masses including 71 (71.0%) benign masses and 29 (29.0%) malignant masses. The diagnostic performance of IOTA simple rules showed sensitivity of 89.3% (95%CI, 77.8%; 100.7%) and specificity of 83.3% (95%CI, 74.3%; 92.3%). Inter-observer variability was analyzed using Cohen's kappa coefficient. Kappa indices of the four pairs of raters are 0.713-0.884 (0.722, 0.827, 0.713, and 0.884). IOTA simple rules have high diagnostic performance in discriminating adnexal masses even when applied by non-expert sonographers, though a training course may be required. Nevertheless, they should be further tested by a greater number of general practitioners before widespread use.
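    The performance measures reported above — sensitivity, specificity, and Cohen's kappa for inter-observer agreement — can be computed as follows. The counts and rater judgements below are illustrative, not the study's data.

```python
# Sketch of the metrics used in the abstract. All numbers are invented.

def sens_spec(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

def cohens_kappa(a, b):
    """Kappa between two raters' binary judgements (1 = malignant)."""
    n = len(a)
    p_o = sum(x == y for x, y in zip(a, b)) / n   # observed agreement
    p_yes = (sum(a) / n) * (sum(b) / n)           # chance agreement on "yes"
    p_no = (1 - sum(a) / n) * (1 - sum(b) / n)    # chance agreement on "no"
    p_e = p_yes + p_no
    return (p_o - p_e) / (1 - p_e)

print(sens_spec(tp=25, fn=3, tn=60, fp=12))  # sensitivity ~0.893, specificity ~0.833

rater1 = [1, 1, 0, 0, 1, 0, 1, 0]
rater2 = [1, 1, 0, 0, 1, 0, 0, 0]
print(round(cohens_kappa(rater1, rater2), 3))  # 0.75
```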

  10. Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?

    Science.gov (United States)

    Lum, Karen; Hihn, Jairus; Menzies, Tim

    2006-01-01

    While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models because of the large variance problem inherent in cost data and because they include far more effort multipliers than the data support. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records and use multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem, a leading cause of cost model brittleness or instability.

  11. Collaboration rules.

    Science.gov (United States)

    Evans, Philip; Wolf, Bob

    2005-01-01

    Corporate leaders seeking to boost growth, learning, and innovation may find the answer in a surprising place: the Linux open-source software community. Linux is developed by an essentially volunteer, self-organizing community of thousands of programmers. Most leaders would sell their grandmothers for workforces that collaborate as efficiently, frictionlessly, and creatively as the self-styled Linux hackers. But Linux is software, and software is hardly a model for mainstream business. The authors have, nonetheless, found surprising parallels between the anarchistic, caffeinated, hirsute world of Linux hackers and the disciplined, tea-sipping, clean-cut world of Toyota engineering. Specifically, Toyota and Linux operate by rules that blend the self-organizing advantages of markets with the low transaction costs of hierarchies. In place of markets' cash and contracts and hierarchies' authority are rules about how individuals and groups work together (with rigorous discipline); how they communicate (widely and with granularity); and how leaders guide them toward a common goal (through example). Those rules, augmented by simple communication technologies and a lack of legal barriers to sharing information, create rich common knowledge, the ability to organize teams modularly, extraordinary motivation, and high levels of trust, which radically lowers transaction costs. Low transaction costs, in turn, make it profitable for organizations to perform more and smaller transactions--and so increase the pace and flexibility typical of high-performance organizations. Once the system achieves critical mass, it feeds on itself. The larger the system, the more broadly shared the knowledge, language, and work style. The greater individuals' reputational capital, the louder the applause and the stronger the motivation. The success of Linux is evidence of the power of that virtuous circle. Toyota's success is evidence that it is also powerful in conventional companies.

  12. French recent developments in support to rules for creep and creep-fatigue analyses

    International Nuclear Information System (INIS)

    Touboul, F.; Moulin, D.

    1997-01-01

    RCC-MR proposes design rules for creep and creep-fatigue damage evaluation in zones with no geometrical discontinuities. Rules based on the σ_d concept have been developed in order to cover zones with geometrical discontinuities. Rules for welds are proposed in the paragraph on shell design rules, and reduction coefficients due to material properties are given in Appendix A9. For fatigue analysis, the latest version of RCC-MR (1993) proposed a reduction factor on fatigue curves (the Jf value), derived from preliminary tests performed within a European program. Studies have been carried out in order to gain a better understanding of the phenomena involved in these fatigue reduction factors. Tests have been performed on large plates, with varying applied displacements, weld geometry, plate thickness and weld direction. It appears that the material effect is not the only factor to be considered: it is also necessary to account for the geometrical effect, linked to the welded-zone dimensions, and the elastic follow-up effect between the two materials, base metal and weld metal. As a first approach, simplified calculations have been carried out with precise material characterization. Roche's method and Zarka's method give conservative results in comparison with test results. (author). 3 refs, 4 tabs

  13. Do Group Decision Rules Affect Trust? A Laboratory Experiment on Group Decision Rules and Trust

    DEFF Research Database (Denmark)

    Nielsen, Julie Hassing

    2016-01-01

    Enhanced participation has been prescribed as the way forward for improving democratic decision making while generating positive attributes like trust. Yet we do not know the extent to which rules affect the outcome of decision making. This article investigates how different group decision rules affect group trust by testing three ideal types of decision rules (i.e., a Unilateral rule, a Representative rule and a 'Non-rule') in a laboratory experiment. The article shows significant differences between the three decision rules on trust after deliberation. Interestingly, however, it finds that non-hierarchical decision-making procedures enhance trust vis-à-vis other more hierarchical decision-making procedures.

  14. Medicare Program; Cancellation of Advancing Care Coordination Through Episode Payment and Cardiac Rehabilitation Incentive Payment Models; Changes to Comprehensive Care for Joint Replacement Payment Model: Extreme and Uncontrollable Circumstances Policy for the Comprehensive Care for Joint Replacement Payment Model. Final rule; interim final rule with comment period.

    Science.gov (United States)

    2017-12-01

    This final rule cancels the Episode Payment Models (EPMs) and Cardiac Rehabilitation (CR) Incentive Payment Model and rescinds the regulations governing these models. It also implements certain revisions to the Comprehensive Care for Joint Replacement (CJR) model, including: Giving certain hospitals selected for participation in the CJR model a one-time option to choose whether to continue their participation in the model; technical refinements and clarifications for certain payment, reconciliation and quality provisions; and a change to increase the pool of eligible clinicians that qualify as affiliated practitioners under the Advanced Alternative Payment Model (Advanced APM) track. An interim final rule with comment period is being issued in conjunction with this final rule in order to address the need for a policy to provide some flexibility in the determination of episode costs for providers located in areas impacted by extreme and uncontrollable circumstances.

  15. A comparison of item response models for accuracy and speed of item responses with applications to adaptive testing.

    Science.gov (United States)

    van Rijn, Peter W; Ali, Usama S

    2017-05-01

    We compare three modelling frameworks for accuracy and speed of item responses in the context of adaptive testing. The first framework is based on modelling scores that result from a scoring rule that incorporates both accuracy and speed. The second framework is the hierarchical modelling approach developed by van der Linden (2007, Psychometrika, 72, 287) in which a regular item response model is specified for accuracy and a log-normal model for speed. The third framework is the diffusion framework in which the response is assumed to be the result of a Wiener process. Although the three frameworks differ in the relation between accuracy and speed, one commonality is that the marginal model for accuracy can be simplified to the two-parameter logistic model. We discuss both conditional and marginal estimation of model parameters. Models from all three frameworks were fitted to data from a mathematics and spelling test. Furthermore, we applied a linear and adaptive testing mode to the data off-line in order to determine differences between modelling frameworks. It was found that a model from the scoring rule framework outperformed a hierarchical model in terms of model-based reliability, but the results were mixed with respect to correlations with external measures. © 2017 The British Psychological Society.
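    One commonality the abstract notes is that the marginal model for accuracy simplifies to the two-parameter logistic (2PL) model in all three frameworks. A minimal sketch of that model, with illustrative parameter values:

```python
import math

# Two-parameter logistic (2PL) item response model:
#   P(correct) = 1 / (1 + exp(-a * (theta - b)))
# with ability theta, discrimination a, and difficulty b.
# Parameter values below are illustrative.

def p_correct_2pl(theta, a, b):
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An ability equal to the item difficulty gives a 50% chance of success.
print(p_correct_2pl(theta=0.0, a=1.2, b=0.0))  # 0.5
```

    The discrimination `a` controls how sharply the success probability rises around the difficulty `b`, which is what an adaptive test exploits when selecting the next item.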

  16. Rule-Based and Case-Based Reasoning in Housing Prices

    OpenAIRE

    Gabrielle Gayer; Itzhak Gilboa; Offer Lieberman

    2004-01-01

    People reason about real-estate prices both in terms of general rules and in terms of analogies to similar cases. We propose to empirically test which mode of reasoning fits the data better. To this end, we develop the statistical techniques required for the estimation of the case-based model. It is hypothesized that case-based reasoning will have relatively more explanatory power in databases of rental apartments, whereas rule-based reasoning will have a relative advantage in sales data. We ...

  17. Verification of design rules for EUROFER under TBM operating conditions

    International Nuclear Information System (INIS)

    Sunyk, R.; Aktaa, J.

    2007-01-01

    The aim of the activity presented in this work is, firstly, to evaluate existing design rules considered for austenitic steels, which exhibit cycle-by-cycle hardening, in contrast to the reduced-activation ferritic-martensitic steels (RAFM), which soften under cyclic loading. Secondly, we aim to define the range of operating temperatures and loads for the current design of the test blanket module (TBM). Results of cyclic tests of EUROFER 97 have thereby been used to adjust the material parameters needed for ABAQUS's built-in combined non-linear isotropic-kinematic hardening model. Furthermore, a visco-plastic material model that considers material damage, recently implemented as an ABAQUS user material (UMAT), has also been applied for simulations. Some important design rules within the elastic route have been evaluated and their predictions compared to the results of cyclic simulations using the advanced material models mentioned above.

  18. Local Culture and Rules as Competitive Strategic Predictor and the Impact on Real Estate Industry Performance in Indonesia

    Directory of Open Access Journals (Sweden)

    Dewa Putu Selawa

    2013-05-01

    Resources and the operational environment of a company are known to play an important role in its competitive strategy formulation. These variables have also been found to influence business performance. However, little is known about the influence of an understanding of the local culture and rules where a company operates on competitive strategy formulation and the impact on business performance. This research aims to find out about the influence of the understanding and implementation of Tri Hita Karana, as one of the intangible strategic resources, and the influence of local rules, whether in the form of regional regulations or customary rules, on the competitive strategy formulation of real estate companies operating in Indonesia, specifically in Bali Province. In order to harmonize a company with its environment through three dimensions based on Hindu philosophy (parahyangan as the manifestation of the God dimension, pawongan as the manifestation of the humanity dimension and palemahan as the manifestation of the natural environment dimension), it is shown that the understanding and implementation of Tri Hita Karana have the highest influence of the seven strategic resources on a company's strategy formulation. In addition, local rules are also shown to significantly influence the environmental dynamics faced by real estate companies operating in Bali. The research verified that the resources owned by a company and the environmental dynamics it faces have a significant influence on competitive strategy, which in turn influences business performance measured using the Balanced Scorecard method.

  19. Predictive performance for population models using stochastic differential equations applied on data from an oral glucose tolerance test

    DEFF Research Database (Denmark)

    Møller, Jonas Bech; Overgaard, R.V.; Madsen, Henrik

    2010-01-01

    Several articles have investigated stochastic differential equations (SDEs) in PK/PD models, but few have quantitatively investigated the benefits to predictive performance of models based on real data. Estimation of first-phase insulin secretion, which reflects beta-cell function, using models of ...... obtained from the glucose tolerance tests. Since the estimation time of extended models was not heavily increased compared to basic models, the applied method is concluded to have high relevance not only in theory but also in practice.

  20. Geochemical Testing And Model Development - Residual Tank Waste Test Plan

    International Nuclear Information System (INIS)

    Cantrell, K.J.; Connelly, M.P.

    2010-01-01

    This Test Plan describes the testing and chemical analyses for release-rate studies on tank residual samples collected following the retrieval of waste from the tank. This work will provide the data required to develop a contaminant release model for the tank residuals from both sludge and salt cake single-shell tanks. The data are intended for use in the long-term performance assessment and conceptual model development.

  1. Use of ontology structure and Bayesian models to aid the crowdsourcing of ICD-11 sanctioning rules.

    Science.gov (United States)

    Lou, Yun; Tu, Samson W; Nyulas, Csongor; Tudorache, Tania; Chalmers, Robert J G; Musen, Mark A

    2017-04-01

    The International Classification of Diseases (ICD) is the de facto standard international classification for mortality reporting and for many epidemiological, clinical, and financial use cases. The next version of ICD, ICD-11, will be submitted for approval by the World Health Assembly in 2018. Unlike previous versions of ICD, where coders mostly select single codes from pre-enumerated disease and disorder codes, ICD-11 coding will allow extensive use of multiple codes to give more detailed disease descriptions. For example, "severe malignant neoplasms of left breast" may be coded using the combination of a "stem code" (e.g., code for malignant neoplasms of breast) with a variety of "extension codes" (e.g., codes for laterality and severity). The use of multiple codes (a process called post-coordination), while avoiding the pitfall of having to pre-enumerate vast number of possible disease and qualifier combinations, risks the creation of meaningless expressions that combine stem codes with inappropriate qualifiers. To prevent that from happening, "sanctioning rules" that define legal combinations are necessary. In this work, we developed a crowdsourcing method for obtaining sanctioning rules for the post-coordination of concepts in ICD-11. Our method utilized the hierarchical structures in the domain to improve the accuracy of the sanctioning rules and to lower the crowdsourcing cost. We used Bayesian networks to model crowd workers' skills, the accuracy of their responses, and our confidence in the acquired sanctioning rules. We applied reinforcement learning to develop an agent that constantly adjusted the confidence cutoffs during the crowdsourcing process to maximize the overall quality of sanctioning rules under a fixed budget. Finally, we performed formative evaluations using a skin-disease branch of the draft ICD-11 and demonstrated that the crowd-sourced sanctioning rules replicated those defined by an expert dermatologist with high precision and recall
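    A sanctioning rule of the kind described above — a stem code may only be combined with extension codes sanctioned for it or for one of its ancestors — can be sketched as a hierarchy walk. The codes, hierarchy, and sanction table below are invented for illustration and are not ICD-11 content.

```python
# Hypothetical sketch of post-coordination "sanctioning rules": an extension
# code is legal for a stem code if it is sanctioned for the stem or for any
# ancestor in the classification hierarchy. All codes here are invented.

PARENT = {"breast-ca-left": "breast-ca", "breast-ca": "neoplasm"}
SANCTIONED = {
    "neoplasm": {"severity"},
    "breast-ca": {"laterality"},
}

def allowed_extensions(stem):
    exts, node = set(), stem
    while node is not None:              # inherit sanctions from ancestors
        exts |= SANCTIONED.get(node, set())
        node = PARENT.get(node)
    return exts

def is_sanctioned(stem, extension):
    return extension in allowed_extensions(stem)

print(is_sanctioned("breast-ca-left", "laterality"))  # True (inherited)
print(is_sanctioned("breast-ca-left", "handedness"))  # False
```

    Exploiting the hierarchy this way is also what lets one crowd judgment about a parent code propagate to its descendants, which is how the paper lowers crowdsourcing cost.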

  2. Finite energy sum rules and instantons in the instanton liquid model

    International Nuclear Information System (INIS)

    Elias, V.; Fang Shi; Steele, T.G.

    1998-01-01

    We obtain the imaginary part of the direct single-instanton contribution to the pseudoscalar correlator, as defined by the appropriate dispersion relation, in order to derive an explicit integral representation for the instanton contribution to finite energy sum rules in the instanton liquid model. (author)

  3. A rule-based stemmer for Arabic Gulf dialect

    Directory of Open Access Journals (Sweden)

    Belal Abuata

    2015-04-01

    Arabic dialects have been widely used for many years in place of Modern Standard Arabic in many fields. The presence of dialects in any language is a big challenge. Dialects add a new set of variational dimensions in fields like natural language processing and information retrieval, and even in Arabic chat between different Arab nationals. Spoken dialects have no standard morphology, phonology or lexicon, unlike Modern Standard Arabic. Hence, the objective of this paper is to describe a procedure or algorithm by which a stem for the Arabian Gulf dialect can be defined. The algorithm is rule based. Special rules are created to remove the suffixes and prefixes of dialect words. The algorithm also applies rules related to word size and the relation between adjacent letters. The algorithm was tested on a number of words and gave a good correct-stem ratio. It was also compared with two Modern Standard Arabic stemmers. The results showed that the Modern Standard Arabic stemmers performed poorly on the Arabic Gulf dialect, and our algorithm performed poorly when applied to Modern Standard Arabic words.
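    A rule-based stemmer in the spirit described — strip known affixes, guarded by a word-size rule — can be sketched as below. The affix lists (shown transliterated) and the minimum-stem length are invented for illustration; they are not the paper's rules.

```python
# Toy rule-based stemmer: strip one known prefix and one known suffix, but
# only when the remaining stem stays long enough (a word-size rule).
# Affix lists are illustrative transliterations, not the paper's rule set.

PREFIXES = ["al", "wa", "bi"]
SUFFIXES = ["at", "in", "ha"]
MIN_STEM = 3  # never strip a word below three letters

def stem(word):
    for p in PREFIXES:
        if word.startswith(p) and len(word) - len(p) >= MIN_STEM:
            word = word[len(p):]
            break
    for s in SUFFIXES:
        if word.endswith(s) and len(word) - len(s) >= MIN_STEM:
            word = word[:-len(s)]
            break
    return word

print(stem("alkitabat"))  # "al" and "at" stripped -> "kitab"
print(stem("bin"))        # too short to strip -> unchanged
```

    The size guard is what keeps short function words intact, which is one reason rule order and length checks matter more for dialects than for Modern Standard Arabic.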

  4. Aniseikonia quantification: error rate of rule of thumb estimation.

    Science.gov (United States)

    Lubkin, V; Shippman, S; Bennett, G; Meininger, D; Kramer, P; Poppinga, P

    1999-01-01

    To find the error rate in quantifying aniseikonia by using "Rule of Thumb" estimation in comparison with proven space eikonometry. Study 1: 24 adult pseudophakic individuals were measured for anisometropia and astigmatic interocular difference. Rule of Thumb quantification for prescription was calculated and compared with aniseikonia measurement by the classical Essilor Projection Space Eikonometer. Study 2: a parallel analysis was performed on 62 consecutive phakic patients from our strabismus clinic group. Frequency of error: For Group 1 (24 cases): 5 (21%) were equal (i.e., 1% or less difference); 16 (67%) were greater (more than 1% different); and 3 (13%) were less by Rule of Thumb calculation in comparison to aniseikonia determined on the Essilor eikonometer. For Group 2 (62 cases): 45 (73%) were equal (1% or less); 10 (16%) were greater; and 7 (11%) were lower in the Rule of Thumb calculations in comparison to Essilor eikonometry. Magnitude of error: In Group 1, in 10/24 (29%) aniseikonia by Rule of Thumb estimation was 100% or more greater than by space eikonometry, and in 6 of those ten by 200% or more. In Group 2, in 4/62 (6%) aniseikonia by Rule of Thumb estimation was 200% or more greater than by space eikonometry. The frequency and magnitude of apparent clinical errors of Rule of Thumb estimation are disturbingly large. This problem is greatly magnified by the time, effort and cost of prescribing and executing an aniseikonic correction for a patient. The higher the refractive error, the greater the anisometropia, and the worse the errors in Rule of Thumb estimation of aniseikonia. Accurate eikonometric methods and devices should be employed in all cases where such measurements can be made. Rule of Thumb estimations should be limited to cases where such subjective testing and measurement cannot be performed, as in infants after unilateral cataract surgery.

  5. Electronuclear sum rules

    International Nuclear Information System (INIS)

    Arenhoevel, H.; Drechsel, D.; Weber, H.J.

    1978-01-01

    Generalized sum rules are derived by integrating the electromagnetic structure functions along lines of constant ratio of momentum and energy transfer. For non-relativistic systems these sum rules are related to the conventional photonuclear sum rules by a scaling transformation. The generalized sum rules are connected with the absorptive part of the forward scattering amplitude of virtual photons. The analytic structure of the scattering amplitudes and the possible existence of dispersion relations have been investigated in schematic relativistic and non-relativistic models. While for the non-relativistic case analyticity does not hold, the relativistic scattering amplitude is analytical for time-like (but not for space-like) photons and relations similar to the Gell-Mann-Goldberger-Thirring sum rule exist. (Auth.)

  6. Analysis of dispatching rules in a stochastic dynamic job shop manufacturing system with sequence-dependent setup times

    Science.gov (United States)

    Sharma, Pankaj; Jain, Ajai

    2014-12-01

    Stochastic dynamic job shop scheduling problems with sequence-dependent setup times are among the most difficult classes of scheduling problems. This paper assesses the performance of nine dispatching rules in such a shop from the viewpoint of makespan, mean flow time, maximum flow time, mean tardiness, maximum tardiness, number of tardy jobs, total setups and mean setup time. A discrete event simulation model of a stochastic dynamic job shop manufacturing system is developed for the investigation. Nine dispatching rules identified from the literature are incorporated in the simulation model. The simulation experiments are conducted under a due date tightness factor of 3, a shop utilization of 90% and setup times less than processing times. Results indicate that the shortest setup time (SIMSET) rule provides the best performance for the mean flow time and number of tardy jobs measures. The job with similar setup and modified earliest due date (JMEDD) rule provides the best performance for the makespan, maximum flow time, mean tardiness, maximum tardiness, total setups and mean setup time measures.
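The SIMSET rule mentioned above selects, among the jobs queued at a machine, the one with the smallest sequence-dependent setup time given the machine's current state; a minimal sketch, with an invented setup-time matrix over job families:

```python
# Minimal shortest-setup-time (SIMSET) dispatching: among queued jobs,
# pick the one with the smallest sequence-dependent setup time given the
# family of the machine's last processed job. The matrix is illustrative.
setup_time = {            # setup_time[from_family][to_family]
    "A": {"A": 0, "B": 4, "C": 2},
    "B": {"A": 3, "B": 0, "C": 5},
    "C": {"A": 2, "B": 1, "C": 0},
}

def simset(queue, last_family):
    """Return the queued job (family, job_id) minimizing setup time."""
    return min(queue, key=lambda job: setup_time[last_family][job[0]])

queue = [("B", 1), ("C", 2), ("A", 3)]
print(simset(queue, "A"))  # -> ('A', 3): same family, zero setup
```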

  7. Performance assessment of the Greater Confinement Disposal facility on the Nevada Test Site: Comparing the performance of two conceptual site models

    International Nuclear Information System (INIS)

    Baer, T.A.; Price, L.L.; Gallegos, D.P.

    1993-01-01

    A small amount of transuranic (TRU) waste has been disposed of at the Greater Confinement Disposal (GCD) site located on the Nevada Test Site's (NTS) Radioactive Waste Management Site (RWMS). The waste has been buried in several deep (37 m) boreholes dug into the floor of an alluvial basin. For the waste to remain in its current configuration, the DOE must demonstrate compliance of the site with the TRU disposal requirements, 40 CFR 191. Sandia's approach to process modelling in performance assessment is to use demonstrably conservative models of the site. Choosing the most conservative model, however, can be uncertain. As an example, diffusion of contaminants upward from the buried waste in the vadose zone water is the primary mechanism of release. This process can be modelled as straight upward planar diffusion or as spherical diffusion in all directions. The former has high fluxes but low release areas; the latter has lower fluxes but is spread over a greater area. We have developed analytic solutions to a simple test problem for both models and compared the total integrated discharges. The spherical diffusion conceptual model results in at least five times greater release to the accessible environment than the planar model at all diffusivities. Modifying the planar model to allow for a larger release, however, compensated for the smaller original planar discharge and resulted in a new planar model that was more conservative than the spherical model except at low diffusivities.

  8. Preliminary test conditions for KNGR SBLOCA DVI ECCS performance test

    International Nuclear Information System (INIS)

    Bae, Kyoo Whan; Song, Jin Ho; Chung, Young Jong; Sim, Suk Ku; Park, Jong Kyun

    1999-03-01

    The Korean Next Generation Reactor (KNGR) adopts a 4-train Direct Vessel Injection (DVI) configuration and injects the safety injection water directly into the downcomer through the 8.5'' DVI nozzle. Thus, thermal hydraulic phenomena such as ECCS mixing and bypass are expected to differ from those observed in cold leg injection. In order to investigate the realistic injection phenomena and modify the analysis code developed on the basis of cold leg injection, a thermal hydraulic test with performance evaluation is required. Preliminarily, the sequence of events and major thermal hydraulic phenomena during the small break LOCA for KNGR are identified from the analysis results calculated by CEFLASH-4AS/REM. The analysis shows that the major transient behaviors, including the core mixture level, are largely affected by the downcomer modeling. Therefore, to investigate the thermal hydraulic phenomena occurring in the downcomer within a limited budget and time, a separate effects test focusing on this region is considered effective, and a conceptual test facility based on it is recommended. For this test facility the initial and boundary conditions are developed using the CEFLASH-4AS/REM analysis results, which will be used as input for the preliminary test requirements. The final test requirements will be developed through further discussions with the test performance group. (Author). 10 refs., 18 tabs., 4 figs

  9. Testing and modelling the performance of inorganic exchangers for radionuclide removal from aqueous nuclear waste

    International Nuclear Information System (INIS)

    Harjula, R.; Lehto, J.; Paajanen, A.; Saarinen, L.

    1997-01-01

    Three different inorganic sorbents/ion exchangers have been tested in this work. A granular hexacyanoferrate-based ion exchanger was developed for Cs removal from radioactive liquid waste at NPPs. It was tested for Cs removal from waste solutions containing different complexing agents and detergents. Radiation and thermal stability tests have shown that this sorbent can be used for the treatment of medium-active waste. Active carbon materials were tested for Co removal from liquid waste effluents at NPPs. It was found that 60Co cannot be removed from the evaporator concentrates with reasonable efficiency, and a combined process with an up-stream precipitation step is needed for better Co separation efficiency. Granular modified titanium oxide was tested for 90Sr removal from the waste effluents and showed very high efficiency. A mathematical model was developed to analyze ion exchange performance in feeds of different chemical and radiochemical compositions. (author). 9 refs, 7 figs, 3 tabs

  10. Integrable anyon chains: From fusion rules to face models to effective field theories

    International Nuclear Information System (INIS)

    Finch, Peter E.; Flohr, Michael; Frahm, Holger

    2014-01-01

    Starting from the fusion rules for the algebra SO(5)_2 we construct one-dimensional lattice models of interacting anyons with commuting transfer matrices of ‘interactions round the face’ (IRF) type. The conserved topological charges of the anyon chain are recovered from the transfer matrices in the limit of large spectral parameter. The properties of the models in the thermodynamic limit and the low energy excitations are studied using Bethe ansatz methods. Two of the anyon models are critical at zero temperature. From the analysis of the finite size spectrum we find that they are effectively described by rational conformal field theories invariant under extensions of the Virasoro algebra, namely WB_2 and WD_5, respectively. The latter contains primaries with half and quarter spin. The modular partition function and fusion rules are derived and found to be consistent with the results for the lattice model.

  11. Rule-based topology system for spatial databases to validate complex geographic datasets

    Science.gov (United States)

    Martinez-Llario, J.; Coll, E.; Núñez-Andrés, M.; Femenia-Ribera, C.

    2017-06-01

    A rule-based topology software system providing a highly flexible and fast procedure to enforce integrity in spatial relationships among datasets is presented. This improved topology rule system is built over the spatial extension Jaspa. Both projects are open source, freely available software developed by the corresponding author of this paper. Currently, there is no spatial DBMS that implements a rule-based topology engine (considering that the topology rules are designed and performed in the spatial backend). If the topology rules are applied in the frontend (as in many GIS desktop programs), ArcGIS is the most advanced solution. The system presented in this paper has several major advantages over the ArcGIS approach: it can be extended with new topology rules, it has a much wider set of rules, and it can mix feature attributes with topology rules as filters. In addition, the topology rule system can work with various DBMSs, including PostgreSQL, H2 or Oracle, and the logic is performed in the spatial backend. The proposed topology system allows users to check the complex spatial relationships among features (from one or several spatial layers) that complex cartographic datasets require, such as the data specifications proposed by INSPIRE in Europe and the Land Administration Domain Model (LADM) for cadastral data.
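A rule-based topology check of the kind described can be pictured as predicates over feature pairs with optional attribute filters. The toy sketch below is not Jaspa's API; it uses axis-aligned rectangles in place of real geometries, and all feature names are invented:

```python
# Toy rule-based topology check: a rule is a predicate over feature pairs,
# optionally filtered on attributes (as the described system allows).
# Rectangles (xmin, ymin, xmax, ymax) stand in for real geometries.
def overlaps(a, b):
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def check_rule(features, predicate, attr_filter=lambda f: True):
    """Return id pairs violating the rule 'predicate must not hold'."""
    flagged = [f for f in features if attr_filter(f)]
    return [(f1["id"], f2["id"])
            for i, f1 in enumerate(flagged)
            for f2 in flagged[i + 1:]
            if predicate(f1["geom"], f2["geom"])]

parcels = [
    {"id": 1, "kind": "cadastral", "geom": (0, 0, 2, 2)},
    {"id": 2, "kind": "cadastral", "geom": (1, 1, 3, 3)},  # overlaps parcel 1
    {"id": 3, "kind": "road",      "geom": (1, 1, 4, 4)},  # excluded by filter
]
# Rule: cadastral parcels must not overlap one another.
print(check_rule(parcels, overlaps, lambda f: f["kind"] == "cadastral"))
# -> [(1, 2)]
```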

  12. The study of the energetic performance in test cycles

    Science.gov (United States)

    Rentea, Cristian; Tuca, Alexandra; Oprean, Mircea; Marius, Bataus

    2017-10-01

    One of the most important subsystems of modern passenger cars is the transmission. This paper investigates the global performance of a modern transmission in different test cycles, including the newly introduced WLTC (Worldwide Harmonized Light Vehicles Test Cycle). The study is carried out using a complex model developed in a high-performance simulation environment. Transmission efficiency calculation is emphasized, the efficiency being treated as variable depending on engine torque, engine speed and gear ratio. The most important parameters (vehicle speed fluctuation, overall transmission efficiency, fuel consumption ratio) needed to compare test cycles and transmission performance are determined.

  13. Science and the rules governing anti-doping violations.

    Science.gov (United States)

    Bowers, Larry D

    2010-01-01

    The fight against the use of performance-enhancing drugs in sports has been in effect for nearly 90 years. The formation of the World Anti-Doping Agency in 1999 was a major event because an independent agency was entrusted with harmonization of the antidoping program. In addition to sports governing bodies, governments have endorsed WADA and its programs by signing a United Nations Education, Science, and Cultural Organization Convention on Doping. The first step in the harmonization process was the development of the World Anti-Doping Program. This program consisted of five documents - the Code, the International Standard for Testing, the International Standard for Laboratories, the Prohibited List, and the International Standard for Therapeutic Use Exemptions - which unified the approach of the international federations and national antidoping agencies in applying antidoping rules. For laboratory testing, the International Standard for Laboratories establishes the performance expectations for and competence of laboratories recognized by WADA, including accreditation under ISO/IEC 17025. The antidoping rules are adjudicated by arbitration using the internationally recognized Court of Arbitration for Sport.

  14. Testing Decision Rules for Multiattribute Decision Making

    NARCIS (Netherlands)

    Seidl, C.; Traub, S.

    1996-01-01

    This paper investigates the existence of an editing phase and studies the compliance of subjects' behaviour with the most popular multiattribute decision rules. We observed that our data comply well with the existence of an editing phase, at least if we allow for a natural error rate of some 25%.

  15. The Empirical Testing of a Musical Performance Assessment Paradigm

    Science.gov (United States)

    Russell, Brian E.

    2010-01-01

    The purpose of this study was to test a hypothesized model of aurally perceived performer-controlled musical factors that influence assessments of performance quality. Previous research studies on musical performance constructs, musical achievement, musical expression, and scale construction were examined to identify the factors that influence…

  16. Organisational Rules in Schools: Teachers' Opinions about Functions of Rules, Rule-Following and Breaking Behaviours in Relation to Their Locus of Control

    Science.gov (United States)

    Demirkasimoglu, Nihan; Aydin, Inayet; Erdogan, Cetin; Akin, Ugur

    2012-01-01

    The main aim of this research is to examine teachers' opinions about functions of school rules, reasons for rule-breaking and results of rule-breaking in relation to their locus of control, gender, age, seniority and branch. 350 public elementary school teachers in Ankara are included in the correlational survey model study. According to the…

  17. Quantitative Assessment of Optical Coherence Tomography Imaging Performance with Phantom-Based Test Methods And Computational Modeling

    Science.gov (United States)

    Agrawal, Anant

    Optical coherence tomography (OCT) is a powerful medical imaging modality that uniquely produces high-resolution cross-sectional images of tissue using low energy light. Its clinical applications and technological capabilities have grown substantially since its invention about twenty years ago, but efforts have been limited to develop tools to assess performance of OCT devices with respect to the quality and content of acquired images. Such tools are important to ensure information derived from OCT signals and images is accurate and consistent, in order to support further technology development, promote standardization, and benefit public health. The research in this dissertation investigates new physical and computational models which can provide unique insights into specific performance characteristics of OCT devices. Physical models, known as phantoms, are fabricated and evaluated in the interest of establishing standardized test methods to measure several important quantities relevant to image quality. (1) Spatial resolution is measured with a nanoparticle-embedded phantom and model eye which together yield the point spread function under conditions where OCT is commonly used. (2) A multi-layered phantom is constructed to measure the contrast transfer function along the axis of light propagation, relevant for cross-sectional imaging capabilities. (3) Existing and new methods to determine device sensitivity are examined and compared, to better understand the detection limits of OCT. A novel computational model based on the finite-difference time-domain (FDTD) method, which simulates the physics of light behavior at the sub-microscopic level within complex, heterogeneous media, is developed to probe device and tissue characteristics influencing the information content of an OCT image. This model is first tested in simple geometric configurations to understand its accuracy and limitations, then a highly realistic representation of a biological cell, the retinal

  18. Consistence of Network Filtering Rules

    Institute of Scientific and Technical Information of China (English)

    SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian

    2004-01-01

    The inconsistency of firewall/VPN (Virtual Private Network) rules incurs a huge maintenance cost. With the development of multinational companies, SOHO offices and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables, whether stand-alone or networked, will grow geometrically. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for the intelligent management of rule tables. In this paper, a formalization of host rules and network rules for automatic rule validation based on set theory is proposed, and a rule validation scheme is defined. The analysis results show the superior performance of the methods and demonstrate their potential for intelligent management based on rule tables.
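One concrete instance of the set-theoretic validation described above is shadowing detection: a rule is inconsistent if an earlier rule with a different action matches a superset of its traffic. The sketch below reduces subnet containment to set inclusion over literal address strings for brevity; a real checker would compare address ranges:

```python
# Set-theoretic check for one common firewall inconsistency: shadowing.
# A rule is shadowed if an earlier rule with a different action matches a
# superset of its traffic. Rules are (name, src_set, port_set, action);
# subnet containment is simplified to set inclusion of literal strings.
def shadowed(rules):
    hits = []
    for i, (name_i, src_i, ports_i, act_i) in enumerate(rules):
        for name_j, src_j, ports_j, act_j in rules[:i]:
            if src_i <= src_j and ports_i <= ports_j and act_i != act_j:
                hits.append((name_i, name_j))  # rule i shadowed by rule j
                break
    return hits

rules = [
    ("r1", {"10.0.0.0/8"}, {80, 443}, "deny"),
    ("r2", {"10.0.0.0/8"}, {80}, "allow"),     # shadowed by r1
    ("r3", {"192.168.1.0/24"}, {22}, "allow"),
]
print(shadowed(rules))  # -> [('r2', 'r1')]
```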

  19. Spatio-temporal correlations in models of collective motion ruled by different dynamical laws.

    Science.gov (United States)

    Cavagna, Andrea; Conti, Daniele; Giardina, Irene; Grigera, Tomas S; Melillo, Stefania; Viale, Massimiliano

    2016-11-15

    Information transfer is an essential factor in determining the robustness of biological systems with distributed control. The most direct way to study the mechanisms ruling information transfer is to experimentally observe the propagation across the system of a signal triggered by some perturbation. However, this method may be inefficient for experiments in the field, as the possibilities to perturb the system are limited and empirical observations must rely on natural events. An alternative approach is to use spatio-temporal correlations to probe the information transfer mechanism directly from the spontaneous fluctuations of the system, without the need to have an actual propagating signal on record. Here we test this method on models of collective behaviour in their deeply ordered phase by using ground truth data provided by numerical simulations in three dimensions. We compare two models characterized by very different dynamical equations and information transfer mechanisms: the classic Vicsek model, describing an overdamped noninertial dynamics, and the inertial spin model, characterized by an underdamped inertial dynamics. By using dynamic finite-size scaling, we show that spatio-temporal correlations are able to distinguish unambiguously the diffusive information transfer mechanism of the Vicsek model from the linear mechanism of the inertial spin model.
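The Vicsek model referenced above updates each particle's heading toward the average heading of its neighbours, plus noise; a minimal 2D sketch (the paper's simulations are in 3D, and all parameters here are illustrative, chosen to land in the ordered phase):

```python
# Minimal 2D Vicsek update: each particle aligns its heading with the
# average heading of neighbours within radius R, plus angular noise,
# then moves at constant speed v in a periodic box of side L.
import math, random

def vicsek_step(pos, theta, v=0.03, R=1.0, eta=0.1, L=5.0):
    new_theta = []
    for xi, yi in pos:
        sx = sy = 0.0
        for (xj, yj), tj in zip(pos, theta):
            if (xi - xj) ** 2 + (yi - yj) ** 2 <= R ** 2:
                sx += math.cos(tj)
                sy += math.sin(tj)
        noise = eta * (random.random() - 0.5) * 2 * math.pi
        new_theta.append(math.atan2(sy, sx) + noise)
    new_pos = [((x + v * math.cos(t)) % L, (y + v * math.sin(t)) % L)
               for (x, y), t in zip(pos, new_theta)]
    return new_pos, new_theta

random.seed(0)
pos = [(random.uniform(0, 5), random.uniform(0, 5)) for _ in range(50)]
theta = [random.uniform(-math.pi, math.pi) for _ in range(50)]
for _ in range(200):
    pos, theta = vicsek_step(pos, theta)
# Polar order parameter: close to 1 means strong alignment (ordered phase).
phi = math.hypot(sum(math.cos(t) for t in theta),
                 sum(math.sin(t) for t in theta)) / len(theta)
print(round(phi, 2))
```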

  20. Rules of thumb to increase the software quality through testing

    Science.gov (United States)

    Buttu, M.; Bartolini, M.; Migoni, C.; Orlati, A.; Poppi, S.; Righini, S.

    2016-07-01

    Software maintenance typically requires 40-80% of the overall project costs, and this considerable variability mostly depends on the software's internal quality: the more the software is designed and implemented to constantly welcome new changes, the lower will be the maintenance costs. The internal quality is typically enforced through testing, which in turn also affects the development and maintenance costs. This is the reason why testing methodologies have become a major concern for any company that builds - or is involved in building - software. Although there is no testing approach that suits all contexts, we infer some general guidelines learned during the Development of the Italian Single-dish COntrol System (DISCOS), which is a project aimed at producing the control software for the three INAF radio telescopes (the Medicina and Noto dishes, and the newly-built SRT). These guidelines concern both the development and the maintenance phases, and their ultimate goal is to maximize the DISCOS software quality through a Behavior-Driven Development (BDD) workflow beside a continuous delivery pipeline. We consider different topics and patterns; they involve the proper apportionment of tests (from end-to-end to low-level tests), the choice between hardware simulators and mockers, why and how to apply TDD and dependency injection to increase the test coverage, the emerging technologies available for test isolation, bug fixing, how to protect the system from changes in external resources (firmware updates, hardware substitution, etc.) and, eventually, how to accomplish BDD starting from functional tests and going through integration and unit tests. We discuss pros and cons of each solution and point out the motivations of our choices either as a general rule or narrowed in the context of the DISCOS project.
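The dependency-injection point above can be illustrated with a toy controller that receives its device through the constructor, so a test can inject a fake instead of talking to real telescope hardware; all class and method names here are hypothetical, not DISCOS APIs:

```python
# Illustrative dependency injection for testability (hypothetical names,
# not DISCOS APIs): the controller receives its drive via the constructor,
# so tests can substitute a fake that records commands instead of moving
# real hardware.
class FakeDrive:
    def __init__(self):
        self.commands = []
    def slew_to(self, az, el):
        self.commands.append(("slew", az, el))
        return True

class AntennaController:
    def __init__(self, drive):
        self.drive = drive  # injected dependency: real or fake drive
    def point(self, az, el):
        if not 0 <= el <= 90:
            raise ValueError("elevation out of range")
        return self.drive.slew_to(az, el)

drive = FakeDrive()
ctrl = AntennaController(drive)
print(ctrl.point(120.0, 45.0), drive.commands)
# -> True [('slew', 120.0, 45.0)]
```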

  1. A Fuzzy Rule-Based Expert System for Evaluating Intellectual Capital

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Fazel Zarandi

    2012-01-01

    A fuzzy rule-based expert system is developed for evaluating intellectual capital. A fuzzy linguistic approach assists managers to understand and evaluate the level of each intellectual capital item. The proposed fuzzy rule-based expert system applies fuzzy linguistic variables to express the level of qualitative evaluation and criteria of experts. Feasibility of the proposed model is demonstrated by the result of intellectual capital performance evaluation for a sample company.

  2. Testing constancy of unconditional variance in volatility models by misspecification and specification tests

    DEFF Research Database (Denmark)

    Silvennoinen, Annastiina; Terasvirta, Timo

    The topic of this paper is testing the hypothesis of constant unconditional variance in GARCH models against the alternative that the unconditional variance changes deterministically over time. Tests of this hypothesis have previously been performed as misspecification tests after fitting a GARCH...... models. An application to exchange rate returns is included....

  3. Bayesian models based on test statistics for multiple hypothesis testing problems.

    Science.gov (United States)

    Ji, Yuan; Lu, Yiling; Mills, Gordon B

    2008-04-01

    We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as the differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check if our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. In the end, we apply the proposed methodology to an siRNA screening and a gene expression experiment.
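A generic version of the posterior-probability FDR control described above (a common Bayesian FDR rule, not necessarily the authors' exact algorithm): reject hypotheses in increasing order of posterior null probability while the running mean, an estimate of the Bayesian FDR, stays below the target level:

```python
# Generic posterior-probability FDR control: sort hypotheses by posterior
# probability of the null, and reject the largest set whose average
# posterior null probability (the Bayesian FDR estimate) stays <= alpha.
def bayesian_fdr_reject(post_null, alpha=0.10):
    order = sorted(range(len(post_null)), key=lambda i: post_null[i])
    rejected, running = [], 0.0
    for k, i in enumerate(order, start=1):
        running += post_null[i]
        if running / k <= alpha:   # running mean is nondecreasing, so
            rejected.append(i)     # we can stop at the first violation
        else:
            break
    return rejected

# Posterior null probabilities for six hypothetical test statistics.
p0 = [0.01, 0.03, 0.50, 0.08, 0.90, 0.02]
print(sorted(bayesian_fdr_reject(p0, alpha=0.10)))  # -> [0, 1, 3, 5]
```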

  4. Rule-Governed Imitative Verbal Behavior as a Function of Modeling Procedures

    Science.gov (United States)

    Clinton, LeRoy; Boyce, Kathleen D.

    1975-01-01

    Investigated the effectiveness of modeling procedures alone and complemented by the appropriate rule statement on the production of plurals. Subjects were 20 normal and 20 retarded children who were randomly assigned to one of two learning conditions and who received either affective or informative social reinforcement. (Author/SDH)

  5. Grating scattering BRDF and imaging performances: A test survey performed in the frame of the flex mission

    Science.gov (United States)

    Harnisch, Bernd; Deep, Atul; Vink, Ramon; Coatantiec, Claude

    2017-11-01

    Key components in optical spectrometers are the gratings. Their influence on the overall in-field straylight of the spectrometer depends not only on the technology used for grating fabrication but also on the potential existence of ghost images caused by irregularities of the grating constant. For the straylight analysis of spectrometers, no general Bidirectional Reflectance Distribution Function (BRDF) model of gratings exists, as one does for optically smooth surfaces. Such models are needed for determining the spectrometer straylight background and for calculating the spectrometer's out-of-band rejection performance. Within the frame of the Fluorescence Earth Explorer mission (FLEX), gratings manufactured using different technologies have been investigated in terms of straylight background and imaging performance in the used diffraction order. The gratings investigated comprise a lithographically written grating, a volume Bragg grating, two holographic gratings and an off-the-shelf ruled grating. In this paper we present a survey of the measured bidirectional reflectance/transmittance distribution functions and the determination of an equivalent surface micro-roughness of the gratings, describing the scattering of each grating around its diffraction order. This is specifically needed for the straylight modeling of the spectrometer.

  6. Momentum sum rules for fragmentation functions

    International Nuclear Information System (INIS)

    Meissner, S.; Metz, A.; Pitonyak, D.

    2010-01-01

    Momentum sum rules for fragmentation functions are considered. In particular, we give a general proof of the Schaefer-Teryaev sum rule for the transverse momentum dependent Collins function. We also argue that corresponding sum rules for related fragmentation functions do not exist. Our model-independent analysis is supplemented by calculations in a simple field-theoretical model.

  7. Oxytocin modulates trait-based rule following

    NARCIS (Netherlands)

    Gross, J.; de Dreu, C.K.W.

    Rules, whether in the form of norms, taboos or laws, regulate and coordinate human life. Some rules, however, are arbitrary and adhering to them can be personally costly. Rigidly sticking to such rules can be considered maladaptive. Here, we test whether, at the neurobiological level, (mal)adaptive

  8. Electrical circuit models for performance modeling of Lithium-Sulfur batteries

    DEFF Research Database (Denmark)

    Knap, Vaclav; Stroe, Daniel Ioan; Teodorescu, Remus

    2015-01-01

    emerging technology for various applications, there is a need for Li-S battery performance model; however, developing such models represents a challenging task due to batteries' complex ongoing chemical reactions. Therefore, the literature review was performed to summarize electrical circuit models (ECMs......) used for modeling the performance behavior of Li-S batteries. The studied Li-S pouch cell was tested in the laboratory in order to parametrize four basic ECM topologies. These topologies were compared by analyzing their voltage estimation accuracy values, which were obtained for different battery...... current profiles. Based on these results, the 3 R-C ECM was chosen and the Li-S battery cell discharging performance model with current dependent parameters was derived and validated....

  9. Efficiency Of Different Teaching Models In Teaching Of Frisbee Ultimate

    Directory of Open Access Journals (Sweden)

    Žuffová Zuzana

    2015-05-01

    The aim of the study was to verify the efficiency of two frisbee ultimate teaching models at 8-year grammar schools relative to age. In the experimental group a game-based model (Teaching Games for Understanding) was used, and in the control group the traditional model based on teaching techniques. Six groups of female students took part in the experiment: experimental group 1 (n=10, age=11.6), experimental group 2 (n=12, age=13.8), experimental group 3 (n=14, age=15.8), control group 1 (n=11, age=11.7), control group 2 (n=10, age=13.8) and control group 3 (n=9, age=15.8). Efficiency of the teaching models was evaluated on the basis of game performance and special knowledge results. Game performance was evaluated by the method of game performance assessment based on GPAI (Game Performance Assessment Instrument) through video records. To verify the level of knowledge, we used a knowledge test consisting of questions on the rules and tactics of frisbee ultimate. For statistical evaluation the Mann-Whitney U-test was used. Game performance assessment and knowledge level indicated higher efficiency of TGfU in general, though mostly statistically insignificant. Experimental groups 1 and 2 were significantly better in the indicator that evaluates the tactical aspect of game performance, decision making (p<0.05). Experimental group 3 was better in the indicator that evaluates skill execution, disc catching. The results showed that the students of classes taught by the game-based model reached partially better game performance in general. Experimental groups achieved from 79.17% to 80% of correct answers on the rules and from 75% to 87.5% of correct answers on tactical knowledge in the knowledge test. Control groups achieved from 57.69% to 72.22% of correct answers on the rules and from 51.92% to 72.22% of correct answers on tactical knowledge in the knowledge test.

  10. A Test and Extension of Lane and Terry's (2000) Conceptual Model of Mood-Performance Relationships Using a Large Internet Sample.

    Science.gov (United States)

    Lane, Andrew M; Terry, Peter C; Devonport, Tracey J; Friesen, Andrew P; Totterdell, Peter A

    2017-01-01

    The present study tested and extended Lane and Terry's (2000) conceptual model of mood-performance relationships using a large dataset from an online experiment. Methodological and theoretical advances included testing a more balanced model of pleasant and unpleasant emotions, and evaluating relationships among emotion regulation traits, states and beliefs, psychological skills use, perceptions of performance, mental preparation, and effort exerted during competition. Participants (N = 73,588) completed measures of trait emotion regulation, emotion regulation beliefs, regulation efficacy, and use of psychological skills, and rated their anger, anxiety, dejection, excitement, energy, and happiness before completing a competitive concentration task. Post-competition, participants completed measures of effort exerted, beliefs about the quality of mental preparation, and subjective performance. Results showed that dejection was associated with worse performance, with the no-dejection group performing 3.2% better. Dejection was associated with higher anxiety and anger scores and lower energy, excitement, and happiness scores. The proposed moderating effect of dejection was supported for the anxiety-performance relationship but not the anger-performance relationship. In the no-dejection group, participants who reported moderate or high anxiety outperformed those reporting low anxiety by about 1.6%. Overall, results showed partial support for Lane and Terry's model. In terms of extending the model, results showed dejection was associated with greater use of suppression, less frequent use of re-appraisal and psychological skills, lower emotion regulation beliefs, and lower emotion regulation efficacy. Further, dejection was associated with greater effort during performance, beliefs that pre-competition emotions did not assist goal achievement, and low subjective performance. Future research is required to investigate the role of intense emotions in emotion regulation and performance.

  11. Development of Flight-Test Performance Estimation Techniques for Small Unmanned Aerial Systems

    Science.gov (United States)

    McCrink, Matthew Henry

    This dissertation provides a flight-testing framework for assessing the performance of fixed-wing, small-scale unmanned aerial systems (sUAS) by leveraging sub-system models of components unique to these vehicles. The development of the sub-system models, and their links to broader impacts on sUAS performance, is the key contribution of this work. The sub-system modeling and analysis focuses on the vehicle's propulsion, navigation and guidance, and airframe components. Quantification of the uncertainty in the vehicle's power available and control states is essential for assessing the validity of both the methods and results obtained from flight-tests. Therefore, detailed propulsion and navigation system analyses are presented to validate the flight testing methodology. Propulsion system analysis required the development of an analytic model of the propeller in order to predict the power available over a range of flight conditions. The model is based on the blade element momentum (BEM) method. Additional corrections are added to the basic model in order to capture the Reynolds-dependent scale effects unique to sUAS. The model was experimentally validated using a ground based testing apparatus. The BEM predictions and experimental analysis allow for a parameterized model relating the electrical power, measurable during flight, to the power available required for vehicle performance analysis. Navigation system details are presented with a specific focus on the sensors used for state estimation, and the resulting uncertainty in vehicle state. Uncertainty quantification is provided by detailed calibration techniques validated using quasi-static and hardware-in-the-loop (HIL) ground based testing. The HIL methods introduced use a soft real-time flight simulator to provide inertial quality data for assessing overall system performance. Using this tool, the uncertainty in vehicle state estimation based on a range of sensors, and vehicle operational environments is
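
    The propulsion analysis above rests on a blade element momentum (BEM) model of the propeller. As a rough illustration of the general approach only (not the dissertation's implementation — the geometry, lift-curve slope, drag coefficient, and induction update below are invented for the sketch, and the Reynolds-scale corrections it mentions are omitted), a fixed-point BEM sweep over radial elements might look like:

```python
import math

def bem_thrust_power(radius, chord, pitch_deg, n_blades, rpm, v_inf,
                     rho=1.225, n_elements=20, cl_alpha=5.7, cd0=0.02):
    """Minimal blade-element-momentum sketch: integrate thrust and power
    over radial elements, updating the axial induction factor by
    fixed-point iteration against simple momentum theory."""
    omega = rpm * 2.0 * math.pi / 60.0
    dr = radius / n_elements
    thrust = power = 0.0
    for i in range(n_elements):
        r = (i + 0.5) * dr
        a = 0.1  # initial axial induction guess
        for _ in range(50):
            v_ax = v_inf * (1.0 + a)          # axial velocity at the disk
            v_tan = omega * r                  # tangential blade velocity
            phi = math.atan2(v_ax, v_tan)      # inflow angle
            alpha = math.radians(pitch_deg) - phi
            cl = cl_alpha * alpha              # thin-airfoil lift (assumed)
            cd = cd0                           # constant profile drag (assumed)
            w2 = v_ax ** 2 + v_tan ** 2
            # blade-element thrust on this annulus
            dT = 0.5 * rho * w2 * n_blades * chord * \
                (cl * math.cos(phi) - cd * math.sin(phi)) * dr
            # momentum theory: dT = 4*pi*r*rho*v_inf^2*(1+a)*a*dr
            denom = 4.0 * math.pi * r * rho * max(v_inf, 1e-6) ** 2 * (1.0 + a) * dr
            a_new = dT / denom if denom > 0 else a
            a = 0.5 * a + 0.5 * min(max(a_new, 0.0), 0.7)  # damped, clamped update
        dQ = 0.5 * rho * w2 * n_blades * chord * \
            (cl * math.sin(phi) + cd * math.cos(phi)) * r * dr
        thrust += dT
        power += dQ * omega
    return thrust, power
```

    Relating the shaft power from such a model to measured electrical power (via motor and ESC efficiencies) is what yields the parameterized power-available model the abstract describes.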

  12. A Framework for Fully Automated Performance Testing for Smart Buildings

    DEFF Research Database (Denmark)

    Markoska, Elena; Johansen, Aslak; Lazarova-Molnar, Sanja

    2018-01-01

    A significant proportion of energy consumption by buildings worldwide, estimated at ca. 40%, has yielded a high importance to studying buildings’ performance. Performance testing is a means by which buildings can be continuously commissioned to ensure that they operate as designed. Historically, setup of performance tests has been manual and labor-intensive and has required intimate knowledge of buildings’ complexity and systems. The emergence of the concept of smart buildings has provided an opportunity to overcome this restriction. In this paper, we propose a framework for automated performance testing of smart buildings that utilizes metadata models. The approach features automatic detection of applicable performance tests using metadata queries and their corresponding instantiation, as well as continuous commissioning based on metadata. The presented approach has been implemented…
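
    The core idea — a performance test declares the metadata points it needs, and a query over the building's metadata model decides where it applies — can be sketched as follows. The point names and the toy metadata dictionary are assumptions for illustration; the paper's framework uses richer metadata models, not this structure.

```python
# Hypothetical building metadata: each zone maps point types to point IDs.
BUILDING = {
    "zone1": {"Zone_Air_Temp_Sensor": "s1", "Heating_Setpoint": "sp1"},
    "zone2": {"Zone_Air_Temp_Sensor": "s2"},
}

def applicable_zones(required_points, metadata=BUILDING):
    """Metadata query: return the zones that expose every point type a
    performance test requires, so the test can be instantiated there."""
    return [zone for zone, points in metadata.items()
            if all(p in points for p in required_points)]
```

    A heating-setpoint-tracking test requiring both a temperature sensor and a setpoint would then be instantiated only for `zone1`, while a simpler sensor-sanity test would run in both zones.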

  13. Test of the Okubo-Zweig-Iizuka rule in phi production

    International Nuclear Information System (INIS)

    Etkin, A.; Foley, K.J.; Goldman, J.H.; Love, W.A.; Morris, T.W.; Ozaki, S.; Platner, E.D.; Saulys, A.C.; Wheeler, C.D.; Willen, E.H.; Lindenbaum, S.J.; Kramer, M.A.; Mallik, U.

    1978-01-01

    We have measured the reaction π⁻p → K⁺K⁻K⁺K⁻n at 22.6 GeV/c and detected strong phi signals in the K⁺K⁻ effective-mass plots. We do not observe the expected Okubo-Zweig-Iizuka-rule suppression of the φφn final state and conclude that the rule is working poorly in the observed production processes.

  14. Wind Tunnel and Hover Performance Test Results for Multicopter UAS Vehicles

    Science.gov (United States)

    Russell, Carl R.; Jung, Jaewoo; Willink, Gina; Glasner, Brett

    2016-01-01

    There is currently a lack of published data for the performance of multicopter unmanned aircraft system (UAS) vehicles, such as quadcopters and octocopters, often referred to collectively as drones. With the rapidly increasing popularity of multicopter UAS, there is interest in better characterizing the performance of this type of aircraft. By studying the performance of currently available vehicles, it will be possible to develop models for vehicles at this scale that can accurately predict performance and model trajectories. This paper describes a wind tunnel test that was recently performed in the U.S. Army's 7- by 10-ft Wind Tunnel at NASA Ames Research Center. During this wind tunnel entry, five multicopter UAS vehicles were tested to determine forces and moments as well as electrical power as a function of wind speed, rotor speed, and vehicle attitude. The test is described here in detail, and a selection of the key results from the test is presented.

  15. Rule-based category learning in children: the role of age and executive functioning.

    Directory of Open Access Journals (Sweden)

    Rahel Rabi

    Rule-based category learning was examined in 4- to 11-year-olds and adults. Participants were asked to learn a set of novel perceptual categories in a classification learning task. Categorization performance improved with age, with younger children showing the strongest rule-based deficit relative to older children and adults. Model-based analyses provided insight into the type of strategy being used to solve the categorization task, demonstrating that use of the task-appropriate strategy increased with age. When children and adults who identified the correct categorization rule were compared, the performance deficit was no longer evident. Executive functions were also measured. While both working memory and inhibitory control were related to rule-based categorization and improved with age, working memory specifically was found to marginally mediate the age-related improvements in categorization. When analyses focused only on the sample of children, results showed that working memory ability and inhibitory control were associated with categorization performance and strategy use. The current findings track changes in categorization performance across childhood, demonstrating at which points performance begins to mature and resemble that of adults. Additionally, findings highlight the potential role that working memory and inhibitory control may play in rule-based category learning.

  16. Testing the Self-Efficacy-Performance Linkage of Social-Cognitive Theory.

    Science.gov (United States)

    Harrison, Allison W.; Rainer, R. Kelly, Jr.; Hochwarter, Wayne A.; Thompson, Kenneth R.

    1997-01-01

    Briefly reviews Albert Bandura's Self-Efficacy Performance Model (ability to perform a task is influenced by an individual's belief in their capability). Tests this model with a sample of 776 university employees and computer-related knowledge and skills. Results supported Bandura's thesis. Includes statistical tables and a discussion of related…

  17. Results of steel containment vessel model test

    International Nuclear Information System (INIS)

    Luk, V.K.; Ludwigsen, J.S.; Hessheimer, M.F.; Komine, Kuniaki; Matsumoto, Tomoyuki; Costello, J.F.

    1998-05-01

    A series of static overpressurization tests of scale models of nuclear containment structures is being conducted by Sandia National Laboratories for the Nuclear Power Engineering Corporation of Japan and the US Nuclear Regulatory Commission. Two tests are being conducted: (1) a test of a model of a steel containment vessel (SCV) and (2) a test of a model of a prestressed concrete containment vessel (PCCV). This paper summarizes the conduct of the high pressure pneumatic test of the SCV model and the results of that test. Results of this test are summarized and are compared with pretest predictions performed by the sponsoring organizations and others who participated in a blind pretest prediction effort. Questions raised by this comparison are identified and plans for posttest analysis are discussed

  18. Max-out-in pivot rule with Dantzig's safeguarding rule for the simplex method

    International Nuclear Information System (INIS)

    Tipawanna, Monsicha; Sinapiromsaran, Krung

    2014-01-01

    The simplex method solves a linear programming problem by iteratively improving the current basic feasible solution, using a pivot rule to guide the search in the feasible region. The pivot rule selects the entering index at each iteration. Many pivot rules have been proposed, but none shows consistently superior performance over the others, so pivot-rule design remains an active research area in linear programming. In this research, we present the max-out-in pivot rule with Dantzig's safeguarding for the simplex method. This rule is based on the maximum improvement of the objective value at the current basic feasible point, similar to Dantzig's rule. We illustrate on the Klee and Minty problems that our rule outperforms Dantzig's rule in the number of iterations required to solve linear programming problems
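
    To make the role of the pivot rule concrete, here is a tiny tableau simplex with a pluggable entering-variable rule: Dantzig's rule (most positive reduced cost) versus a greatest-improvement rule (the column whose ratio-limited step raises the objective most). This is a generic illustration of how pivot rules slot into the method, not the paper's max-out-in rule, and it assumes a bounded problem in the form max c'x, Ax ≤ b, x ≥ 0 with b ≥ 0.

```python
import numpy as np

def simplex(c, A, b, rule="dantzig"):
    """Tiny dense simplex for: max c'x  s.t.  Ax <= b, x >= 0  (b >= 0).
    rule='dantzig'  : enter the column with the most positive reduced cost.
    rule='greatest' : enter the column giving the largest objective gain
                      this pivot (reduced cost * ratio-test step)."""
    m, n = A.shape
    T = np.zeros((m + 1, n + m + 1))
    T[:m, :n] = A
    T[:m, n:n + m] = np.eye(m)   # slack variables
    T[:m, -1] = b
    T[-1, :n] = c                # objective row
    basis = list(range(n, n + m))
    for _ in range(200):
        red = T[-1, :-1]
        if np.all(red <= 1e-9):
            break                # optimal: no improving column
        if rule == "dantzig":
            j = int(np.argmax(red))
        else:
            best, j = -1.0, -1
            for k in np.where(red > 1e-9)[0]:
                col = T[:m, k]
                pos = col > 1e-9
                if not pos.any():
                    continue     # unbounded direction; skip in this sketch
                step = np.min(T[:m, -1][pos] / col[pos])
                if red[k] * step > best:
                    best, j = red[k] * step, int(k)
        col = T[:m, j]
        pos = col > 1e-9
        ratios = np.full(m, np.inf)
        ratios[pos] = T[:m, -1][pos] / col[pos]
        i = int(np.argmin(ratios))           # leaving row (ratio test)
        T[i] /= T[i, j]                      # pivot
        for r in range(m + 1):
            if r != i:
                T[r] -= T[r, j] * T[i]
        basis[i] = j
    x = np.zeros(n)
    for i, bv in enumerate(basis):
        if bv < n:
            x[bv] = T[i, -1]
    return x, -T[-1, -1]
```

    On a given problem the two rules may visit different sequences of bases but reach the same optimum; counting iterations across such rules is exactly how comparisons like the one in this paper are made.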

  19. Assess and Predict Automatic Generation Control Performances for Thermal Power Generation Units Based on Modeling Techniques

    Science.gov (United States)

    Zhao, Yan; Yang, Zijiang; Gao, Song; Liu, Jinbiao

    2018-02-01

    Automatic generation control (AGC) is a key technology to maintain the real-time balance between power generation and load, and to ensure the quality of power supply. Power grids require each power generation unit to have a satisfactory AGC performance, as specified in two detailed rules. The two rules provide a set of indices to measure the AGC performance of a power generation unit. However, the commonly-used method to calculate these indices is based on particular data samples from AGC responses and will lead to incorrect results in practice. This paper proposes a new method to estimate the AGC performance indices via system identification techniques. In addition, a nonlinear regression model between performance indices and load command is built in order to predict the AGC performance indices. The effectiveness of the proposed method is validated through industrial case studies.
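
    The prediction step — regressing a performance index on load command — can be sketched with an ordinary polynomial fit. The quadratic functional form and the data below are assumptions for illustration; the paper builds its own nonlinear regression model from identified system responses.

```python
import numpy as np

def fit_performance_model(load_cmd, index, degree=2):
    """Fit a polynomial regression between load command and an AGC
    performance index, and return a callable predictor."""
    coeffs = np.polyfit(np.asarray(load_cmd, float),
                        np.asarray(index, float), degree)
    return np.poly1d(coeffs)  # predictor: index_hat = f(load_cmd)
```

    Given historical (load command, index) pairs, the returned predictor estimates the index at load levels not yet observed, which is the practical use the abstract describes.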

  20. Welfare Implications of Alternative Monetary Policy Rules: A New Keynesian DSGE Model for Turkey

    Directory of Open Access Journals (Sweden)

    Yağcıbaşı Özge Filiz

    2017-12-01

    In recent years, there has been extensive research on the conduct of monetary policy in small open economies that are subject to inflation and output fluctuations. Policymakers must decide whether to implement strict inflation targeting or to respond to output fluctuations while conducting monetary policy. This study aims to examine the response of alternative monetary policy rules in the Turkish economy by means of a DSGE model that is subject to demand and technology shocks. The New Keynesian model we use is borrowed from Gali (2015) and calibrated for the Turkish economy. Welfare effects of alternative Taylor rules are evaluated under different specifications of the central bank loss function. One of the main findings of this paper is that in the case of a technology shock, strict inflation targeting rules provide the minimum welfare loss under all loss function configurations. On the contrary, the losses are reduced if the monetary authority responds to output fluctuations in the presence of a demand shock. Finally, there exists a trade-off between the volatility of output and inflation in the case of a technology shock, while the volatility of both variables moves in the same direction in response to a demand shock.
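
    The objects being compared — a Taylor-type rule and a quadratic central-bank loss — are easy to state in code. The sketch below is a deliberately toy, backward-looking IS/Phillips loop, not the forward-looking Gali (2015) New Keynesian model; all coefficients (phi_pi = 1.5, phi_y, sigma, kappa, rho) are illustrative, with strict inflation targeting corresponding to phi_y = 0.

```python
import numpy as np

def taylor_rate(pi, y_gap, r_star=0.02, pi_star=0.02, phi_pi=1.5, phi_y=0.5):
    """Taylor-type rule; strict inflation targeting sets phi_y = 0."""
    return r_star + pi + phi_pi * (pi - pi_star) + phi_y * y_gap

def welfare_loss(pi, y_gap, lam=0.5, pi_star=0.02):
    """Quadratic loss: inflation deviations plus lam-weighted output gap."""
    pi, y_gap = np.asarray(pi, float), np.asarray(y_gap, float)
    return float(np.mean((pi - pi_star) ** 2 + lam * y_gap ** 2))

def simulate(phi_y, shocks_d, shocks_s, sigma=1.0, kappa=0.3, rho=0.5):
    """Toy backward-looking IS/Phillips loop closed by the Taylor rule:
    demand (ed) and supply (es) shocks drive the output gap and inflation."""
    pi, y = 0.02, 0.0
    pis, ys = [], []
    for ed, es in zip(shocks_d, shocks_s):
        i = taylor_rate(pi, y, phi_y=phi_y)
        y = -sigma * (i - pi - 0.02) + ed                 # IS curve, r* = 2%
        pi = 0.02 + rho * (pi - 0.02) + kappa * y + es    # Phillips curve
        pis.append(pi)
        ys.append(y)
    return pis, ys
```

    Evaluating `welfare_loss` on paths simulated under phi_y = 0 versus phi_y > 0, shock by shock, mirrors the kind of rule comparison the paper performs within its full DSGE model.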

  1. Method for automatic control rod operation using rule-based control

    International Nuclear Information System (INIS)

    Kinoshita, Mitsuo; Yamada, Naoyuki; Kiguchi, Takashi

    1988-01-01

    An automatic control rod operation method using rule-based control is proposed. Its features are as follows: (1) a production system to recognize plant events, determine control actions and realize fast inference (fast selection of a suitable production rule); (2) use of the fuzzy control technique to determine quantitative control variables. The method's performance was evaluated by simulation tests of automatic control rod operation at a BWR plant start-up. The results were as follows: (1) the performance, in terms of stabilization of controlled variables and time required for reactor start-up, was superior to that of other methods such as PID control and program control; (2) the process time to select and interpret the suitable production rule, which was the same as that required for event recognition or determination of a control action, was short enough (below 1 s) for real-time control. The results showed that the method is effective for automatic control rod operation. (author)
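
    The second feature — fuzzy rules turning a qualitative situation into a quantitative control variable — can be illustrated with triangular memberships and centroid defuzzification. The three-rule base, the normalized error range, and the output values here are invented for the sketch and are not the paper's rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def rod_speed(power_error):
    """Fuzzy sketch (hypothetical rules): map a normalized power error in
    [-1, 1] to a rod-speed command via three rules and centroid
    defuzzification. Positive output = insert, negative = withdraw."""
    rules = [
        (tri(power_error, -1.5, -1.0, 0.0), -1.0),  # power too low  -> withdraw
        (tri(power_error, -1.0,  0.0, 1.0),  0.0),  # power on target -> hold
        (tri(power_error,  0.0,  1.0, 1.5),  1.0),  # power too high -> insert
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0
```

    Between the rule peaks the output interpolates smoothly, which is what makes fuzzy rules suitable for producing the quantitative control variables the abstract mentions.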

  2. WINE ADVISOR EXPERT SYSTEM USING DECISION RULES

    Directory of Open Access Journals (Sweden)

    Dinuca Elena Claudia

    2013-07-01

    In this article I focus on developing an expert system for advising the choice of wine that best matches a specific occasion. An expert system is a computer application that performs a task that would otherwise be performed by a human expert. The implementation is done using the Delphi programming language. The knowledge base is represented as a set of rules. The rules are IF-THEN-ELSE decision rules based on different important wine features.
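
    The IF-THEN-ELSE rule pattern is language-independent; the sketch below shows the same idea in Python rather than Delphi. The features, rules, and wine names are invented for illustration and are not the article's knowledge base.

```python
# Hypothetical rule base: (condition over feature dict, recommended wine).
RULES = [
    (lambda f: f["dish"] == "red meat" and f["body"] == "full", "Cabernet Sauvignon"),
    (lambda f: f["dish"] == "fish", "Sauvignon Blanc"),
    (lambda f: f["dish"] == "dessert", "Port"),
]

def advise(features, default="House wine"):
    """Fire the first rule whose IF-part matches; the ELSE branch is the
    default recommendation when no rule applies."""
    for condition, wine in RULES:
        if condition(features):
            return wine
    return default
```

    Ordering the rules from most to least specific, as here, is a common design choice so that a narrow match is never shadowed by a broader one.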

  3. Verification Test of Hydraulic Performance for Reactor Coolant Pump

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sang Jun; Kim, Jae Shin; Ryu, In Wan; Ko, Bok Seong; Song, Keun Myung [Samjin Ind. Co., Seoul (Korea, Republic of)

    2010-01-15

    According to this project, the basic design of the prototype pump and model pump of the reactor coolant pump and of the test facilities has been completed. Basic design of the prototype pump to establish structure, dimensions and hydraulic performance has been completed, and through primary flow analysis by computational fluid dynamics (CFD), flow characteristics and hydraulic performance have been established. The pump was designed as a mixed flow pump with the following design requirements: specific speed (Ns) 1080.9 (rpm·m³/m·m), capacity 3115 m³/h, total head 26.3 m, pump speed 1710 rpm, pump efficiency 77.0%, impeller outer diameter 349 mm, motor output 360 kW, design pressure 17 MPaG. The features of the pump are freedom from leakage, due to the absence of a mechanical seal on the pump shaft, which ensures the reactor's safety, and low noise and vibration levels, due to the absence of a cooling fan on the motor, which makes it an eco-friendly product. The model pump was reduced to 44% of the prototype pump size for the verification test of the hydraulic performance of the reactor coolant pump, and was designed as a mixed flow pump with a canned motor and the following design requirements: specific speed (Ns) 1060.9 (rpm·m³/m·m), capacity 539.4 m³/h, total head 21.0 m, pump speed 3476 rpm, pump efficiency 72.9%, impeller outer diameter 154 mm, motor output 55 kW, design pressure 1.0 MPaG. The test facilities were designed for verification of hydraulic performance, suitable for the pump performance test, homologous test, NPSH (cavitation) test, coast-down test and pressure pulsation test of the inlet and outlet ports. The test tank was designed with a testing capacity of up to 2000 m³/h and a design pressure of 1.0 MPaG. The auxiliary pump was designed as a centrifugal pump with capacity 1100 m³/h, total head 42.0 m and motor output 190 kW

  4. Sewage Treatment Plants: Standards of Performance for New Stationary Sources 1977 Final Rule (42 FR 58520)

    Science.gov (United States)

    This document includes a copy of the Federal Register publication of the November 10, 1977 Final Rule for the Standards of Performance for New Stationary Sources, 40 CFR 60 Subpart O. This document is provided courtesy of HeinOnline.

  5. Criterion learning in rule-based categorization: simulation of neural mechanism and new data.

    Science.gov (United States)

    Helie, Sebastien; Ell, Shawn W; Filoteo, J Vincent; Maddox, W Todd

    2015-04-01

    In perceptual categorization, rule selection consists of selecting one or several stimulus dimensions to be used to categorize the stimuli (e.g., categorize lines according to their length). Once a rule has been selected, criterion learning consists of defining how stimuli will be grouped using the selected dimension(s) (e.g., if the selected rule is line length, define 'long' and 'short'). Very little is known about the neuroscience of criterion learning, and most existing computational models do not provide a biological mechanism for this process. In this article, we introduce a new model of rule learning called Heterosynaptic Inhibitory Criterion Learning (HICL). HICL includes a biologically-based explanation of criterion learning, and we use new category-learning data to test key aspects of the model. In HICL, rule-selective cells in prefrontal cortex modulate stimulus-response associations using pre-synaptic inhibition. Criterion learning is implemented by a new type of heterosynaptic error-driven Hebbian learning at inhibitory synapses that uses feedback to drive cell activation above/below thresholds representing ionic gating mechanisms. The model is used to account for new human categorization data from two experiments showing that: (1) changing the rule criterion on a given dimension is easier if irrelevant dimensions are also changing (Experiment 1), and (2) changing the relevant rule dimension and learning a new criterion is more difficult, but is also facilitated by a change in the irrelevant dimension (Experiment 2). We conclude with a discussion of some of HICL's implications for future research on rule learning. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Program for aerodynamic performance tests of helium gas compressor model of the gas turbine high temperature reactor (GTHTR300)

    International Nuclear Information System (INIS)

    Takada, Shoji; Takizuka, Takakazu; Kunimoto, Kazuhiko; Yan, Xing; Itaka, Hidehiko; Mori, Eiji

    2003-01-01

    Research and development program for helium gas compressor aerodynamics was planned for the power conversion system of the Gas Turbine High Temperature Reactor (GTHTR300). The axial compressor with polytropic efficiency of 90% and surge margin more than 30% was designed with 3-dimensional aerodynamic design. Performance and surge margin of the helium gas compressor tends to be lower due to the higher boss ratio which makes the tip clearance wide relative to the blade height, as well as due to a larger number of stages. The compressor was designed on the basis of methods and data for the aerodynamic design of industrial open-cycle gas-turbine. To validate the design of the helium gas compressor of the GTHTR300, aerodynamic performance tests were planned, and a 1/3-scale, 4-stage compressor model was designed. In the tests, the performance data of the helium gas compressor model will be acquired by using helium gas as a working fluid. The maximum design pressure at the model inlet is 0.88 MPa, which allows the Reynolds number to be sufficiently high. The present study is entrusted from the Ministry of Education, Culture, Sports, Science and Technology of Japan. (author)

  7. Cosmic Sum Rules

    DEFF Research Database (Denmark)

    T. Frandsen, Mads; Masina, Isabella; Sannino, Francesco

    2011-01-01

    We introduce new sum rules that allow one to determine universal properties of the unknown component of the cosmic rays, and we show how they can be used to predict the positron fraction at energies not yet explored by current experiments and to constrain specific models.

  8. A Spectral Evaluation of Models Performances in Mediterranean Oak Woodlands

    Science.gov (United States)

    Vargas, R.; Baldocchi, D. D.; Abramowitz, G.; Carrara, A.; Correia, A.; Kobayashi, H.; Papale, D.; Pearson, D.; Pereira, J.; Piao, S.; Rambal, S.; Sonnentag, O.

    2009-12-01

    Ecosystem processes are influenced by climatic trends at multiple temporal scales, including diel patterns and other mid-term climatic modes, such as interannual and seasonal variability. Because interactions between biophysical components of ecosystem processes are complex, it is important to test how models perform in the frequency domain (e.g. hours, days, weeks, months, years) and the time domain (i.e. day of the year) in addition to traditional tests of annual or monthly sums. Here we present a spectral evaluation, using wavelet time series analysis, of model performance in seven Mediterranean Oak Woodlands that encompass three deciduous and four evergreen sites. We tested the performance of five models (CABLE, ORCHIDEE, BEPS, Biome-BGC, and JULES) on measured variables of gross primary production (GPP) and evapotranspiration (ET). In general, model performance fails at intermediate periods (e.g. weeks to months), likely because these models do not represent the water pulse dynamics that influence GPP and ET in these Mediterranean systems. To improve the performance of a model it is critical to identify first where and when the model fails. Only by identifying where a model fails can we improve model performance, use models as prognostic tools, and generate further hypotheses that can be tested by new experiments and measurements.
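
    The essence of a spectral evaluation is to split model-minus-observation residuals by frequency band so that, say, weekly-scale failures can be separated from daily-scale ones. As a crude stand-in for the paper's wavelet analysis, a hand-rolled Haar decomposition (chosen here only because it needs no external library) of the residual looks like:

```python
import numpy as np

def haar_band_errors(obs, mod, levels=3):
    """Decompose the model-minus-observation residual with the orthonormal
    Haar wavelet and report squared-error energy per dyadic frequency band
    (level 1 = finest scale). Truncates to a power-of-two length."""
    r = np.asarray(mod, float) - np.asarray(obs, float)
    n = 2 ** int(np.floor(np.log2(len(r))))
    r = r[:n]
    energies = {}
    for lev in range(1, levels + 1):
        approx = (r[0::2] + r[1::2]) / np.sqrt(2.0)   # smooth part
        detail = (r[0::2] - r[1::2]) / np.sqrt(2.0)   # band-limited part
        energies[f"level {lev}"] = float(np.sum(detail ** 2))
        r = approx
    energies["residual"] = float(np.sum(r ** 2))       # remaining coarse scales
    return energies
```

    Because the Haar transform is orthonormal, the band energies sum exactly to the total squared residual, so the output is a clean partition of model error across scales; for half-hourly flux data, successive levels correspond roughly to hourly, sub-daily, daily, and longer periods.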

  9. Employee conscientiousness, agreeableness, and supervisor justice rule compliance: A three-study investigation.

    Science.gov (United States)

    Huang, Jason L; Cropanzano, Russell; Li, Andrew; Shao, Ping; Zhang, Xin-An; Li, Yuhui

    2017-11-01

    Researchers have paid limited attention to what makes organizational authority figures decide to treat their employees either justly or unjustly. Drawing from the actor-focused model of justice, as well as the stereotype content model, we argue that employee conscientiousness and agreeableness can impact the extent to which supervisors adhere to normative rules for distributive, procedural, informational, and interpersonal justice, as a result of supervisors' evaluations of their employees' effort and their liking of the employees. Supervisory compliance with justice rules may, in turn, impact the extent to which employees judge themselves to be treated either justly or unjustly. We tested these possibilities in 3 studies. In Study 1, we utilized a meta-analysis to demonstrate positive relationships between employees' conscientiousness, agreeableness, and their justice perceptions. In Study 2, we conducted 3 experiments to test the causal relationship between employee personality and supervisor intentions to comply with justice rules. In Study 3, we conducted an employee-supervisor dyadic field survey to examine the entire mediation model. Results are discussed in terms of the potential roles that both employees and supervisors may play in shaping employees' justice perceptions. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. Revised Rules for Concrete Bridges

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Jensen, F. M.; Middleton, C.

    This paper is based on research performed for the Highway Agency, London, UK under the project DPU/9/44 "Revision of Bridge Assessment Rules Based on Whole Life Performance: Concrete Bridges". It contains details of a methodology which can be used to generate Whole Life (WL) reliability profiles. These WL reliability profiles may be used to establish revised rules for concrete bridges.

  11. Experimental Evaluation for the Microvibration Performance of a Segmented PC Method Based High Technology Industrial Facility Using 1/2 Scale Test Models

    Directory of Open Access Journals (Sweden)

    Sijun Kim

    2017-01-01

    The precast concrete (PC) method used in the construction process of high technology industrial facilities is limited when applied to those with greater span lengths, due to the transport length restriction (maximum length of 15~16 m in Korea) set by traffic laws. In order to resolve this, this study introduces a structural system with a segmented PC system, and a 1/2 scale model with a width of 9000 mm (hereafter, the Segmented Model) is manufactured to evaluate vibration performance. Since a real vibrational environment cannot be reproduced for vibration testing using a scale model, a comparative analysis of relative performances is conducted in this study. For this purpose, a 1/2 scale model with a width of 7200 mm (hereafter, the Nonsegmented Model) of a high technology industrial facility is additionally prepared using the conventional PC method. By applying the same experimental method to both scale models and comparing the results, the relative vibration performance of the Segmented Model is observed. Through impact testing, the natural frequencies of the two scale models are compared. Also, in order to analyze the estimated response induced by the equipment, the vibration responses due to the exciter are compared. The experimental results show that the Segmented Model exhibits similar or superior performance when compared to the Nonsegmented Model.

  12. Agent-oriented enterprise modeling based on business rules

    NARCIS (Netherlands)

    Taveter, K.; Wagner, G.R.; Kunii, H.S.; Jajodia, S.; Solvberg, A.

    2001-01-01

    Business rules are statements that express (certain parts of) a business policy, defining business terms and defining or constraining the operations of an enterprise, in a declarative manner. Since these rules define and constrain the interaction among business agents in the course of business

  13. The research on business rules classification and specification methods

    OpenAIRE

    Baltrušaitis, Egidijus

    2005-01-01

    The work is based on research into business rules classification and specification methods. The basics of the business rules approach are discussed. The most common business rules classification and modeling methods are analyzed. Business rules modeling techniques, and tools for supporting them in information systems, are presented. Based on the analysis results, a business rules classification method is proposed. Templates for every business rule type are presented. Business rules structuring ...

  14. Multistate modelling extended by behavioural rules: An application to migration.

    Science.gov (United States)

    Klabunde, Anna; Zinn, Sabine; Willekens, Frans; Leuchter, Matthias

    2017-10-01

    We propose to extend demographic multistate models by adding a behavioural element: behavioural rules explain intentions and thus transitions. Our framework is inspired by the Theory of Planned Behaviour. We exemplify our approach with a model of migration from Senegal to France. Model parameters are determined using empirical data where available; parameters for which no empirical correspondence exists are determined by calibration. Age- and period-specific migration rates are used for model validation. Our approach adds to the toolkit of demographic projection by allowing for shocks and social influence, which alter behaviour in non-linear ways, while sticking to the general framework of multistate modelling. Our simulations show that higher income growth in Senegal leads to higher emigration rates in the medium term, while a decrease in fertility yields lower emigration rates.
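
    The coupling of a behavioural rule to a multistate transition can be sketched in a few lines: an intention score in the spirit of the Theory of Planned Behaviour scales the baseline migration hazard. The two-state setup, the weights, and the rates below are invented for illustration and are not the paper's calibrated model.

```python
import random

def migration_intention(attitude, norm, control, w=(0.4, 0.3, 0.3)):
    """TPB-style intention score in [0, 1] from attitude, subjective norm,
    and perceived behavioural control (weights are illustrative)."""
    s = w[0] * attitude + w[1] * norm + w[2] * control
    return min(max(s, 0.0), 1.0)

def simulate_cohort(n, base_rate, attitude, norm, control, years=10, seed=1):
    """Two-state multistate sketch ('home' -> 'abroad'): the baseline annual
    migration hazard is scaled by the behavioural intention; returns the
    share of the cohort that migrated within the horizon."""
    rng = random.Random(seed)
    p = base_rate * migration_intention(attitude, norm, control)
    migrated = 0
    for _ in range(n):
        for _ in range(years):
            if rng.random() < p:
                migrated += 1
                break  # absorbed in 'abroad' in this sketch
    return migrated / n
```

    Shocks or social influence would enter by changing attitude, norm, or control over time, which is precisely the non-linear channel the abstract highlights.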

  15. A Rule-Based Policy-Level Model of Nonsuperpower Behavior in Strategic Conflicts.

    Science.gov (United States)

    1982-12-01

    … a mechanism. The human mind tends to work linearly and to focus implicitly on a few variables. Experience results in subconscious models with far … which is slower. Alternatives to the current ROSIE implementation include reprogramming Scenario Agent in the C language (the language used for the Red …) perception, opportunity perception, opportunity response, and assertiveness. As rules are refined, maintenance and reprogramming of the model will be required

  16. Post-test analysis for the APR1400 LBLOCA DVI performance test using MARS

    International Nuclear Information System (INIS)

    Bae, Kyoo Hwan; Lee, Y. J.; Kim, H. C.; Bae, Y. Y.; Park, J. K.; Lee, W.

    2002-03-01

    Post-test analyses using a multi-dimensional best-estimate analysis code, MARS, are performed for the APR1400 LBLOCA DVI (Direct Vessel Injection) performance tests. This report describes the code evaluation results for the test data of the various void height tests and direct bypass tests that have been performed at the MIDAS test facility. MIDAS is a scaled test facility of the APR1400, with the objective of identifying multi-dimensional thermal-hydraulic phenomena in the downcomer during the reflood conditions of a large break LOCA. A modified linear scale ratio was applied in its construction and test conditions. The major thermal-hydraulic parameters, such as ECC bypass fraction, steam condensation fraction, and temperature distributions in the downcomer, are compared and evaluated. The evaluation results of the MARS code for the various test cases show that: (a) the MARS code has an advanced modeling capability, predicting well the major multi-dimensional thermal-hydraulic phenomena occurring in the downcomer; (b) the MARS code under-predicts the steam condensation rates, which in turn causes it to over-predict the ECC bypass rates. However, the trend of decreasing steam condensation rate and increasing ECC bypass rate with increasing steam flow rate, and the calculation results of the ECC bypass rates under the EM analysis conditions, generally agree with the test data

  17. Mixing Rules Formulation for a Kinetic Model of the Langmuir-Hinshelwood Semipredictive Type Applied to the Heterogeneous Photocatalytic Degradation of Multicomponent Mixtures

    Directory of Open Access Journals (Sweden)

    John Wilman Rodriguez-Acosta

    2014-01-01

    Mixing rules coupled to a semipredictive kinetic model of the Langmuir-Hinshelwood type were proposed to determine the behavior of heterogeneous solar photodegradation with TiO2-P25 of multicomponent mixtures at pilot scale. The kinetic expressions were written in terms of the effective concentration of total organic carbon (xTOC). An expression was obtained in a generalized form as a function of the mixing rules, as the product of a global contribution of the reaction rate constant k′ and a mixing function fC. Kinetic parameters of the model were obtained using the Nelder and Mead (N-M) algorithm. The kinetic model was validated with experimental data obtained from the degradation of binary mixtures of chlorinated compounds (DCA: dichloroacetic acid, and 4-CP: 4-chlorophenol) at different initial global concentrations, using a CPC reactor at pilot scale. A simplex-lattice {2,3} design of experiments was adopted to perform the runs.
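
    The separation of a global rate constant k′ from a mixing function fC can be illustrated with the textbook competitive Langmuir-Hinshelwood form, r_i = k′ · K_i·C_i / (1 + Σ_j K_j·C_j). This is the generic competitive-adsorption expression, not necessarily the paper's exact mixing rule, and k′ and the adsorption constants K below are placeholders rather than fitted values.

```python
def lh_rate(k_global, K, C, i):
    """Competitive Langmuir-Hinshelwood rate for component i in a mixture:
    r_i = k' * K_i*C_i / (1 + sum_j K_j*C_j).
    k_global: global rate constant k' (placeholder value).
    K, C: adsorption constants and concentrations for all components."""
    denom = 1.0 + sum(Kj * Cj for Kj, Cj in zip(K, C))
    return k_global * K[i] * C[i] / denom
```

    The competitive denominator captures the key mixture effect: raising the concentration of one component depresses the degradation rate of the others, which is what a mixing rule for xTOC must reproduce.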

  18. Validation of transport models for use in repository performance assessments: a view illustrated for INTRAVAL test case 1b

    International Nuclear Information System (INIS)

    Jackson, C.P.; Lever, D.A.; Sumner, P.J.

    1991-03-01

    We present our views on validation. We consider that validation is slightly different for general models and specific models. We stress the importance of presenting for review the case for (or against) a model. We outline a formal framework for validation, which helps to ensure that all the issues are addressed. Our framework includes calibration, testing predictions, comparison with alternative models, which we consider particularly important, analysis of discrepancies, presentation, consideration of implications and suggested improved experiments. We illustrate the approach by application to an INTRAVAL test case based on laboratory experiments. Three models were considered: a simple model that included the effects of advection, dispersion and equilibrium sorption, a model that also included the effects of rock-matrix diffusion, and a model with kinetic sorption. We show that the model with rock-matrix diffusion is the only one to provide a good description of the data. We stress the implications of extrapolating to larger length and time scales for repository performance assessments. (author)

  19. Analyzing large gene expression and methylation data profiles using StatBicRM: statistical biclustering-based rule mining.

    Directory of Open Access Journals (Sweden)

    Ujjwal Maulik

    Full Text Available Microarray and beadchip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining), to identify special types of rules and potential biomarkers using integrated approaches of statistical and binary inclusion-maximal biclustering techniques from the biological datasets. At first, a novel statistical strategy has been utilized to eliminate the insignificant/low-significant/redundant genes in such a way that the significance level must satisfy the data distribution property (viz., either normal distribution or non-normal distribution). The data is then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. Corresponding special types of rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than the other rule mining algorithms as it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets. Thus, it saves elapsed time, and can work on big datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the David database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to know how accurately the evolved rules are able to describe the remaining test (unknown) data. Subsequently, we also compare the average classification accuracy, and other related factors, with other rule-based classifiers. Statistical significance tests are also performed for verifying the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers also starts with the same post-discretized data

  20. Analyzing large gene expression and methylation data profiles using StatBicRM: statistical biclustering-based rule mining.

    Science.gov (United States)

    Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra

    2015-01-01

    Microarray and beadchip are two of the most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining) to identify special types of rules and potential biomarkers using integrated approaches of statistical and binary inclusion-maximal biclustering techniques from the biological datasets. At first, a novel statistical strategy has been utilized to eliminate the insignificant/low-significant/redundant genes in such a way that the significance level must satisfy the data distribution property (viz., either normal distribution or non-normal distribution). The data is then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. Corresponding special types of rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than the other rule mining algorithms as it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets. Thus, it saves elapsed time, and can work on big datasets. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using the David database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to know how accurately the evolved rules are able to describe the remaining test (unknown) data. Subsequently, we also compare the average classification accuracy, and other related factors, with other rule-based classifiers. Statistical significance tests are also performed for verifying the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers also starts with the same post-discretized data
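    The core notion behind the abstract's "frequent closed itemsets" can be illustrated with a deliberately tiny brute-force enumeration over a binary (post-discretized) gene-by-sample matrix. The gene/sample names and the minimum-support threshold below are invented for illustration; StatBicRM itself uses an inclusion-maximal biclustering algorithm rather than this exhaustive search:

```python
from itertools import combinations

# toy post-discretized matrix: for each gene, the samples where it is "on" (1)
rows = {
    "g1": {"s1", "s2", "s3"},
    "g2": {"s1", "s2"},
    "g3": {"s2", "s3"},
    "g4": {"s1", "s2", "s3"},
}
min_support = 2

def supporting_samples(itemset):
    # samples in which every gene of the itemset is "on"
    return frozenset.intersection(*(frozenset(rows[g]) for g in itemset))

frequent = {}
genes = sorted(rows)
for r in range(1, len(genes) + 1):
    for combo in combinations(genes, r):
        sup = supporting_samples(combo)
        if len(sup) >= min_support:
            frequent[combo] = sup

# closed itemsets: no proper superset has exactly the same supporting samples
closed = [c for c in frequent
          if not any(set(c) < set(d) and frequent[c] == frequent[d]
                     for d in frequent)]
print(closed)
```

    Restricting rule extraction to closed itemsets is what saves time: every non-closed itemset is redundant because some superset occurs in exactly the same samples.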

  1. Mixing Languages during Learning? Testing the One Subject—One Language Rule

    Science.gov (United States)

    2015-01-01

    In bilingual communities, mixing languages is avoided in formal schooling: even if two languages are used on a daily basis for teaching, only one language is used to teach each given academic subject. This tenet, known as the one subject-one language rule, avoids mixing languages in formal schooling because it may hinder learning. The aim of this study was to test the scientific ground of this assumption by investigating the consequences of acquiring new concepts using a method in which two languages are mixed as compared to a purely monolingual method. Native balanced bilingual speakers of Basque and Spanish—adults (Experiment 1) and children (Experiment 2)—learnt new concepts by associating two different features to novel objects. Half of the participants completed the learning process in a multilingual context (one feature was described in Basque and the other one in Spanish); while the other half completed the learning phase in a purely monolingual context (both features were described in Spanish). Different measures of learning were taken, as well as direct and indirect indicators of concept consolidation. We found no evidence in favor of the non-mixing method when comparing the results of two groups in either experiment, and thus failed to give scientific support for the educational premise of the one subject—one language rule. PMID:26107624

  2. Mixing Languages during Learning? Testing the One Subject-One Language Rule.

    Directory of Open Access Journals (Sweden)

    Eneko Antón

    Full Text Available In bilingual communities, mixing languages is avoided in formal schooling: even if two languages are used on a daily basis for teaching, only one language is used to teach each given academic subject. This tenet, known as the one subject-one language rule, avoids mixing languages in formal schooling because it may hinder learning. The aim of this study was to test the scientific ground of this assumption by investigating the consequences of acquiring new concepts using a method in which two languages are mixed as compared to a purely monolingual method. Native balanced bilingual speakers of Basque and Spanish, adults (Experiment 1) and children (Experiment 2), learnt new concepts by associating two different features to novel objects. Half of the participants completed the learning process in a multilingual context (one feature was described in Basque and the other one in Spanish), while the other half completed the learning phase in a purely monolingual context (both features were described in Spanish). Different measures of learning were taken, as well as direct and indirect indicators of concept consolidation. We found no evidence in favor of the non-mixing method when comparing the results of the two groups in either experiment, and thus failed to give scientific support for the educational premise of the one subject-one language rule.

  3. A "Sweet 16" of Rules About Teamwork

    Science.gov (United States)

    Laufer, Alexander (Editor)

    2002-01-01

    The following "Sweet 16" rules included in this paper derive from a longer paper by APPL Director Dr. Edward Hoffman and myself entitled "99 Rules for Managing Faster, Better, Cheaper Projects." Our sources consisted mainly of "war stories" told by master project managers in my book Simultaneous Management: Managing Projects in a Dynamic Environment (AMACOM, The American Management Association, 1996). The Simultaneous Management model was a result of 10 years of intensive research and testing conducted with the active participation of master project managers from leading private organizations such as AT&T, DuPont, Exxon, General Motors, IBM, Motorola and Procter & Gamble. In a more recent study, led by Dr. Hoffman, we learned that master project managers in leading public organizations employ most of these rules as well. Both studies, in private and public organizations, found that a dynamic environment calls for dynamic management, and that is especially clear in how successful project managers think about their teams.

  4. Evaluation of Rule-based Modularization in Model Transformation Languages illustrated with ATL

    NARCIS (Netherlands)

    Ivanov, Ivan; van den Berg, Klaas; Jouault, Frédéric

    This paper studies ways of modularizing transformation definitions in current rule-based model transformation languages. Two scenarios are shown in which the modular units are identified on the basis of the relations between source and target metamodels and on the basis of generic transformation

  5. Small wind turbine performance evaluation using field test data and a coupled aero-electro-mechanical model

    Science.gov (United States)

    Wallace, Brian D.

    A series of field tests and theoretical analyses were performed on various wind turbine rotor designs at two Penn State residential-scale wind-electric facilities. This work involved the prediction and experimental measurement of the electrical and aerodynamic performance of three wind turbines: a 3 kW rated Whisper 175, a 2.4 kW rated Skystream 3.7, and the Penn State designed Carolus wind turbine. Both the Skystream and Whisper 175 turbines use OEM blades that were originally installed at the facilities. The Carolus rotor is a carbon-fiber composite 2-bladed machine, designed and assembled at Penn State, with the intent of replacing the Whisper 175 rotor at the off-grid system. Rotor aerodynamic performance is modeled using WT_Perf, a Blade Element Momentum theory performance prediction code developed by the National Renewable Energy Laboratory. Steady-state power curves are predicted by coupling experimentally determined electrical characteristics with the aerodynamic performance of the rotor simulated with WT_Perf. A dynamometer test stand is used to establish the electromechanical efficiencies of the wind-electric system generator. Through the coupling of WT_Perf and dynamometer test results, an aero-electro-mechanical analysis procedure is developed and provides accurate predictions of wind system performance. The analysis of three different wind turbines gives a comprehensive assessment of the capability of the field test facilities and the accuracy of aero-electro-mechanical analysis procedures. Results from this study show that the Carolus and Whisper 175 rotors are running at higher tip-speed ratios than are optimum for power production. The aero-electro-mechanical analysis predicted the high operating tip-speed ratios of the rotors and was accurate at predicting output power for the systems. It is shown that the wind turbines operate at high tip-speed ratios because of a mismatch between the aerodynamic drive torque and the operating torque of the wind
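    The tip-speed ratio this record keeps returning to is simply λ = ωR/V, the ratio of blade-tip speed to free-stream wind speed. A minimal sketch, with entirely hypothetical numbers for a small residential-scale turbine (not the measured values from the study):

```python
import math

def tip_speed_ratio(rpm, radius_m, wind_speed_ms):
    # lambda = omega * R / V, with omega converted from rpm to rad/s
    omega = rpm * 2.0 * math.pi / 60.0
    return omega * radius_m / wind_speed_ms

# hypothetical operating point for a small residential-scale turbine
lam = tip_speed_ratio(rpm=400.0, radius_m=1.7, wind_speed_ms=8.0)
print(round(lam, 2))  # about 8.9
```

    A rotor whose λ at the operating point sits well above the value that maximizes the power coefficient is "running too fast" in the sense the abstract describes.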

  6. Measurement and modelling of liquidity risk under the Basel III rules

    OpenAIRE

    Turkuner, Ercan

    2016-01-01

    In compliance with the Basel III rules, this study aims to create a model capable of generating a balance sheet. In the light of several hypotheses and general data about the Turkish Banking System, the model generates a balance sheet, and hence threshold values for the Basel III liquidity ratios can be set. In addition, a sensitivity analysis examines possible impacts of the balance-sheet structure on the Liquidity Coverage Ratio, which promotes the short-term resilience of the liquidity risk profiles of banks h...

  7. Test-Driven, Model-Based Systems Engineering

    DEFF Research Database (Denmark)

    Munck, Allan

    Hearing systems have evolved over many years from simple mechanical devices (horns) to electronic units consisting of microphones, amplifiers, analog filters, loudspeakers, batteries, etc. Digital signal processors replaced analog filters to provide better performance and new features. Central....... This thesis concerns methods for identifying, selecting and implementing tools for various aspects of model-based systems engineering. A comprehensive method was proposed that includes several novel steps such as techniques for analyzing the gap between requirements and tool capabilities. The method...... was verified with good results in two case studies for selection of a traceability tool (single-tool scenario) and a set of modeling tools (multi-tool scenarios). Models must be subjected to testing to allow engineers to predict functionality and performance of systems. Test-first strategies are known...

  8. Decision mining revisited - Discovering overlapping rules

    NARCIS (Netherlands)

    Mannhardt, Felix; De Leoni, Massimiliano; Reijers, Hajo A.; Van Der Aalst, Wil M P

    2016-01-01

    Decision mining enriches process models with rules underlying decisions in processes using historical process execution data. Choices between multiple activities are specified through rules defined over process data. Existing decision mining methods focus on discovering mutually-exclusive rules,

  9. Decision Mining Revisited - Discovering Overlapping Rules

    NARCIS (Netherlands)

    Mannhardt, F.; De Leoni, M.; Reijers, H.A.; van der Aalst, W.M.P.; Nurcan, S.; Soffer, P.; Bajec, M.; Eder, J.

    2016-01-01

    Decision mining enriches process models with rules underlying decisions in processes using historical process execution data. Choices between multiple activities are specified through rules defined over process data. Existing decision mining methods focus on discovering mutually-exclusive rules,

  10. Luria-Delbrück, revisited: the classic experiment does not rule out Lamarckian evolution

    Science.gov (United States)

    Holmes, Caroline M.; Ghafari, Mahan; Abbas, Anzar; Saravanan, Varun; Nemenman, Ilya

    2017-10-01

    We re-examined data from the classic Luria-Delbrück fluctuation experiment, which is often credited with establishing a Darwinian basis for evolution. We argue that, for the Lamarckian model of evolution to be ruled out by the experiment, the experiment must favor pure Darwinian evolution over both the Lamarckian model and a model that allows both Darwinian and Lamarckian mechanisms (as would happen for bacteria with CRISPR-Cas immunity). Analysis of the combined model was not performed in the original 1943 paper. The Luria-Delbrück paper also did not consider the possibility of neither model fitting the experiment. Using Bayesian model selection, we find that the Luria-Delbrück experiment, indeed, favors the Darwinian evolution over purely Lamarckian. However, our analysis does not rule out the combined model, and hence cannot rule out Lamarckian contributions to the evolutionary dynamics.
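    The distinction the record draws can be sketched numerically: a purely Lamarckian (induced-mutation) model predicts Poisson-distributed mutant counts, while the Darwinian model predicts the heavy-tailed Luria-Delbrück distribution with occasional "jackpot" cultures. The counts below are invented, the mutation parameter m is fixed by hand, and a plain log-likelihood ratio is used as a crude stand-in for the paper's full Bayesian model selection (which integrates over parameters):

```python
import math

def luria_delbruck_pmf(m, n_max):
    # Ma-Sandri-Sarkar recursion for the Luria-Delbruck distribution:
    # p0 = exp(-m);  pn = (m/n) * sum_{i<n} p_i / (n - i + 1)
    p = [math.exp(-m)]
    for n in range(1, n_max + 1):
        p.append((m / n) * sum(p[i] / (n - i + 1) for i in range(n)))
    return p

def poisson_pmf(lam, n_max):
    # Poisson counts: what a purely induced-mutation (Lamarckian) model predicts
    return [math.exp(-lam) * lam ** n / math.factorial(n)
            for n in range(n_max + 1)]

def loglik(pmf, data):
    return sum(math.log(max(pmf[c], 1e-300)) for c in data)

# hypothetical mutant counts per culture: mostly small, with one "jackpot"
counts = [0, 1, 0, 2, 0, 1, 35, 0, 1, 0]
n_max = max(counts)
ld_ll = loglik(luria_delbruck_pmf(1.0, n_max), counts)          # m chosen by hand
po_ll = loglik(poisson_pmf(sum(counts) / len(counts), n_max), counts)
print(ld_ll > po_ll)  # True: the jackpot strongly favors the Darwinian model
```

    The point of the record is subtler than this sketch: a combined Darwinian-plus-Lamarckian model can fit such data about as well as the pure Darwinian one, which is why the classic experiment does not rule the Lamarckian contribution out.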

  11. Testing Born's rule in quantum mechanics for three mutually exclusive events

    DEFF Research Database (Denmark)

    Sollner, Immo Nathanael; Gschösser, Benjamin; Mai, Patrick

    2012-01-01

    We present a new experimental approach using a three-path interferometer and find a tighter empirical upper bound on possible violations of Born's Rule. A deviation from Born's rule would result in multi-order interference. Among the potential systematic errors that could lead to an apparent viol...

  12. Extracting classification rules from an informatic security incidents repository by genetic programming

    Directory of Open Access Journals (Sweden)

    Carlos Javier Carvajal Montealegre

    2015-04-01

    Full Text Available This paper describes the data mining process used to obtain classification rules over an information security incident data collection, explaining in detail the use of genetic programming as a means to model the incidents' behavior and to represent such rules as decision trees. The described mining process includes several tasks, such as the evaluation of the GP (Genetic Programming) approach, the individuals' representation, and the tuning of the algorithm parameters to improve performance. The paper concludes with the analysis of the results and the description of the rules obtained, suggesting measures to avoid the occurrence of new informatics attacks. This paper is part of the degree thesis: Information Security Incident Analytics by Data Mining for Behavioral Modeling and Pattern Recognition (Carvajal, 2012).

  13. Evaluation of impact limiter performance during end-on and slapdown drop tests of a one-third scale model storage/transport cask system

    International Nuclear Information System (INIS)

    Yoshimura, H.R.; Bronowski, D.R.; Uncapher, W.L.; Attaway, S.W.; Bateman, V.I.; Carne, T.G.; Gregory, D.L.; Huerta, M.

    1990-12-01

    This report describes drop testing of a one-third scale model shipping cask system. Two casks were designed and fabricated by Transnuclear, Inc., to ship spent fuel from the former Nuclear Fuel Services West Valley reprocessing facility in New York to the Idaho National Engineering Laboratory for a long-term spent fuel dry storage demonstration project. As part of the NRC's regulatory certification process, one-third scale model tests were performed to obtain experimental data on impact limiter performance during impact testing. The objectives of the testing program were to (1) obtain deceleration and displacement information for the cask and impact limiter system, (2) obtain dynamic force-displacement data for the impact limiters, (3) verify the integrity of the impact limiter retention system, and (4) examine the crush behavior of the limiters. Two 30-ft (9-m) drop tests were conducted on a mass model of the cask body and scaled balsa and redwood-filled impact limiters. This report describes the results of both tests in terms of measured decelerations, posttest deformation measurements, and the general structural response of the system. 3 refs., 32 figs

  14. The performance testing

    International Nuclear Information System (INIS)

    Mayr, A.

    1975-01-01

    Concerning the time schedule, reactor performance tests normally begin when the suppliers or constructors have finished construction and completed all necessary construction and coordination tests. If the latter tests are conducted thoroughly, they contribute substantially to a quick and simple execution of the final performance tests and to the general quality of components and systems. At this stage all components of a system should be properly fixed; machinery, instruments and electrical components adjusted and calibrated; all set-points tested; electrical and other supply units in operation or ready to operate; and all functions pretested. It is at this stage of the work that most of the existing defects and failures of systems can be found. Remembering that the difficulty of operating complex systems results from detail problems, it is extremely useful to remove all such defects as soon as possible, at the latest at this time, when it can be done easily and normally quickly without influencing the start-up procedures of other systems or even of the total plant. (orig./TK) [de

  15. Constraining performance assessment models with tracer test results: a comparison between two conceptual models

    Science.gov (United States)

    McKenna, Sean A.; Selroos, Jan-Olof

    Tracer tests are conducted to ascertain solute transport parameters of a single rock feature over a 5-m transport pathway. Two different conceptualizations of double-porosity solute transport provide estimates of the tracer breakthrough curves. One of the conceptualizations (single-rate) employs a single effective diffusion coefficient in a matrix with infinite penetration depth. However, the tracer retention between different flow paths can vary as the ratio of flow-wetted surface to flow rate differs between the path lines. The other conceptualization (multirate) employs a continuous distribution of multiple diffusion rate coefficients in a matrix with variable, yet finite, capacity. Application of these two models with the parameters estimated on the tracer test breakthrough curves produces transport results that differ by orders of magnitude in peak concentration and time to peak concentration at the performance assessment (PA) time and length scales (100,000 years and 1,000 m). These differences are examined by calculating the time limits for the diffusive capacity to act as an infinite medium. These limits are compared across both conceptual models and also against characteristic times for diffusion at both the tracer test and PA scales. Additionally, the differences between the models are examined by re-estimating parameters for the multirate model from the traditional double-porosity model results at the PA scale. Results indicate that for each model the amount of the diffusive capacity that acts as an infinite medium over the specified time scale explains the differences between the model results and that tracer tests alone cannot provide reliable estimates of transport parameters for the PA scale. 
Results of Monte Carlo runs of the transport models with varying travel times and path lengths show consistent results between models and suggest that the variation in flow-wetted surface to flow rate along path lines is insignificant relative to variability in
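    The scale argument in this record rests on the characteristic diffusion time t ≈ L²/De, which grows quadratically with penetration depth, so a matrix that behaves as an effectively infinite medium over a tracer test need not do so over PA time scales. A sketch with a hypothetical effective diffusivity (not a value from the study):

```python
def characteristic_diffusion_time(length_m, De_m2_per_s):
    # t ~ L^2 / De: time for diffusion to penetrate a distance L
    return length_m ** 2 / De_m2_per_s

SECONDS_PER_YEAR = 365.25 * 24 * 3600.0
De = 1.0e-13  # hypothetical effective matrix diffusivity, m^2/s

t_cm = characteristic_diffusion_time(0.01, De) / SECONDS_PER_YEAR  # ~cm scale
t_m = characteristic_diffusion_time(1.0, De) / SECONDS_PER_YEAR    # ~m scale
print(round(t_cm, 1), round(t_m))  # decades versus hundreds of millennia
```

    The hundredfold change in length scale produces a ten-thousand-fold change in characteristic time, which is why parameters calibrated on a 5-m tracer test cannot simply be carried over to the 1,000-m, 100,000-year PA scale.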

  16. New developments in FeynRules

    CERN Document Server

    Alloul, Adam; Degrande, Céline; Duhr, Claude; Fuks, Benjamin

    2014-01-01

    The program FeynRules is a Mathematica package developed to facilitate the implementation of new physics theories into high-energy physics tools. Starting from a minimal set of information such as the model gauge symmetries, its particle content, parameters and Lagrangian, FeynRules provides all necessary routines to extract automatically from the Lagrangian (that can also be computed semi-automatically for supersymmetric theories) the associated Feynman rules. These can be further exported to several Monte Carlo event generators through dedicated interfaces, as well as translated into a Python library, under the so-called UFO model format, agnostic of the model complexity, especially in terms of Lorentz and/or color structures appearing in the vertices or of number of external legs. In this work, we briefly report on the most recent new features that have been added to FeynRules, including full support for spin-3/2 fermions, a new module allowing for the automated diagonalization of the particle spectrum and...

  17. Test performance of the QSE series of 5 cm aperture quadrupole model magnets

    International Nuclear Information System (INIS)

    Archer, B.; Bein, D.; Cunningham, G.; DiMarco, J.; Gathright, T.; Jayakumar, J.; LaBarge, A.; Li, W.; Lambert, D.; Scott, M.

    1994-01-01

    The QSE series of magnets, a 5 cm aperture quadrupole design, were the first to be tested in the Short Magnet and Cable Test Laboratory (SMCTL) at the SSCL. The test performance of the first two magnets of the series is presented, including quench performance, quench localization, strain gage readings, and magnetic measurements. Both magnets behaved reasonably well, with no quenches below the collider operating current, four training quenches to plateau, and good training memory between thermal cycles. Future magnets in the QSE series will be used to reduce the initial training and to tune out unwanted magnetic harmonics

  18. Test performance of the QSE series of 5 cm aperture quadrupole model magnets

    International Nuclear Information System (INIS)

    Archer, B.; Bein, D.; Cunningham, G.; DiMarco, J.; Gathright, T.; Jayakumar, J.; Labarge, A.; Li, W.; Lambert, D.; Scott, M.; Snitchler, G.; Zeigler, R.

    1993-04-01

    The QSE series of magnets, a 5 cm aperture quadrupole design, were the first to be tested in the Short Magnet and Cable Test Laboratory (SMCTL) at the SSCL. The test performance of the first two magnets of the series is presented, including quench performance, quench localization, strain gage readings, and magnetic measurements. Both magnets behaved reasonably well, with no quenches below the collider operating current, four training quenches to plateau, and good training memory between thermal cycles. Future magnets in the QSE series will be used to reduce the initial training and to tune out unwanted magnetic harmonics

  19. 40 CFR 60.8 - Performance tests.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Performance tests. 60.8 Section 60.8... PERFORMANCE FOR NEW STATIONARY SOURCES General Provisions § 60.8 Performance tests. (a) Except as specified in... conduct performance test(s) and furnish the Administrator a written report of the results of such...

  20. Developing and testing a model of psychosocial work environment and performance

    DEFF Research Database (Denmark)

    Edwards, Kasper; Pejtersen, Jan Hyld; Møller, Niels

    2011-01-01

    Good psychosocial work environment has been assumed to result in good work performance. However, little documentation exists which supports the claim, and the same goes for the opposite claim. This paper reports findings from a combined quantitative and qualitative study of the relationship between...... psychosocial work environment and performance in a large Danish firm. The objects of the study were more than 45 customer centers with 9-20 employees each. A substantial database covering the 45 customer centers over a period of 5 years has been gathered. In this period the Copenhagen psychosocial...... questionnaire (COPSOQ) has been used twice, with two years in between. This allows us to build a model of the relationship between psychosocial work environment, selected context variables and performance data. The model proposes that good psychosocial work environment is a function of leadership which...

  1. Developing and testing a model of psychosocial work environment and performance

    DEFF Research Database (Denmark)

    Edwards, Kasper; Pejtersen, Jan Hyld; Møller, Niels

    Good psychosocial work environment has been assumed to result in good work performance. However, little documentation exists which supports the claim, and the same goes for the opposite claim. This paper reports findings from a combined quantitative and qualitative study of the relationship between...... psychosocial work environment and performance in a large Danish firm. The objects of the study were more than 45 customer centers with 9-20 employees each. A substantial database covering the 45 customer centers over a period of 5 years has been gathered. In this period the Copenhagen psychosocial...... questionnaire (COPSOQ) has been used twice, with two years in between. This allows us to build a model of the relationship between psychosocial work environment, selected context variables and performance data. The model proposes that good psychosocial work environment is a function of leadership which...

  2. Testing the performance of empirical remote sensing algorithms in the Baltic Sea waters with modelled and in situ reflectance data

    Directory of Open Access Journals (Sweden)

    Martin Ligi

    2017-01-01

    Full Text Available Remote sensing studies published up to now show that the performance of empirical (band-ratio type) algorithms in different parts of the Baltic Sea is highly variable. The best-performing algorithms differ between regions of the Baltic Sea. Moreover, there are indications that the algorithms have to be seasonal, as the optical properties of the phytoplankton assemblages dominating in spring and summer are different. We modelled 15,600 reflectance spectra using the HydroLight radiative transfer model to test 58 previously published empirical algorithms. 7200 of the spectra were modelled using specific inherent optical properties (SIOPs) of the open parts of the Baltic Sea in summer and 8400 with SIOPs of the spring season. The concentration ranges of chlorophyll-a, coloured dissolved organic matter (CDOM) and suspended matter used in the model simulations were based on actually measured values available in the literature. For each optically active constituent we added one concentration below the actually measured minimum and one above the actually measured maximum in order to test the performance of the algorithms over a wider range. 77 in situ reflectance spectra from rocky (Sweden) and sandy (Estonia, Latvia) coastal areas were used to evaluate the performance of the algorithms also in coastal waters. Seasonal differences in algorithm performance were confirmed, but we also found algorithms that can be used in both spring and summer conditions. The algorithms that use bands available on OLCI, launched in February 2016, are highlighted, as this sensor will be available for Baltic Sea monitoring for coming decades.
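    A band-ratio algorithm of the kind tested here is typically a polynomial in the log of a blue-to-green reflectance ratio, log10(chl) = a0 + a1·x + a2·x² with x = log10(Rrs_blue/Rrs_green). The sketch below uses invented coefficients and reflectances purely to show the functional form; the 58 published algorithms in the record each come with their own tuned bands and coefficients:

```python
import math

def band_ratio_chl(rrs_blue, rrs_green, coeffs=(0.25, -2.4, 1.0)):
    # generic OC-style polynomial band-ratio algorithm (coefficients invented):
    # log10(chl) = a0 + a1*x + a2*x^2,  x = log10(Rrs_blue / Rrs_green)
    x = math.log10(rrs_blue / rrs_green)
    a0, a1, a2 = coeffs
    return 10.0 ** (a0 + a1 * x + a2 * x * x)

# hypothetical reflectances: blue < green, typical of CDOM-rich Baltic water
chl = band_ratio_chl(0.002, 0.004)
print(round(chl, 2))  # estimated chlorophyll-a, mg/m^3
```

    The Baltic-specific difficulty the record describes is that CDOM also depresses blue reflectance, so a ratio tuned for clear ocean water systematically overestimates chlorophyll there.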

  3. Sum rules, asymptotic behaviour and (multi)baryon states in the Skyrme model

    International Nuclear Information System (INIS)

    Mignaco, J.A.; Wulck, S.

    1990-01-01

    We obtain sum rules that should be satisfied by the solutions of the Euler-Lagrange equation for the chiral angle in the Skyrme model in the hedgehog representation. The sum rules allow one to determine that solutions with integer baryon number exist only for well-determined values of a relevant dimensionless parameter Φ. For all other values, there are no solutions with integer baryon number, in particular for the pure non-linear sigma model. (author)

  4. Numerical analysis and experiment research on fluid orbital performance of vane type propellant management device

    International Nuclear Information System (INIS)

    Hu, Q; Li, Y; Pan, H L; Liu, J T; Zhuang, B T

    2015-01-01

    Vane type propellant management device (PMD) is one of the key components of the vane-type surface tension tank (STT), and its fluid orbital performance directly determines the STT's success or failure. In the present paper, numerical analysis and a microgravity experiment study of the fluid orbital performance of a vane type PMD were carried out. Using a two-phase volume-of-fluid (VOF) flow model, the fluid flow characteristics in the tank with the vane type PMD were numerically calculated, and the rules of fluid transfer and distribution were obtained. An abbreviated model test system of the vane type PMD was established and microgravity drop-tower tests were performed; the fluid management and transmission rules of the vane type PMD were then obtained under a microgravity environment. The analysis and test results show that the vane type PMD has good and initiative fluid orbital management ability and meets the demands of fluid orbital extrusion in the vane type STT. The results offer valuable guidance for the design and optimization of the new generation of vane type PMD, and also provide a new approach for fluid management and control in the space environment

  5. Reliability of the CARE rule and the HEART score to rule out an acute coronary syndrome in non-traumatic chest pain patients.

    Science.gov (United States)

    Moumneh, Thomas; Richard-Jourjon, Vanessa; Friou, Emilie; Prunier, Fabrice; Soulie-Chavignon, Caroline; Choukroun, Jacques; Mazet-Guilaumé, Betty; Riou, Jérémie; Penaloza, Andréa; Roy, Pierre-Marie

    2018-03-02

    In patients consulting in the Emergency Department for chest pain, a HEART score ≤ 3 has been shown to rule out an acute coronary syndrome (ACS) with a low risk of major adverse cardiac event (MACE) occurrence. A negative CARE rule (≤ 1) that stands for the first four elements of the HEART score may have similar rule-out reliability without troponin assay requirement. We aim to prospectively assess the performance of the CARE rule and of the HEART score to predict MACE in a chest pain population. Prospective two-center non-interventional study. Patients admitted to the ED for non-traumatic chest pain were included, and followed-up at 6 weeks. The main study endpoint was the 6-week rate of MACE (myocardial infarction, coronary angioplasty, coronary bypass, and sudden unexplained death). 641 patients were included, of whom 9.5% presented a MACE at 6 weeks. The CARE rule was negative for 31.2% of patients, and none presented a MACE during follow-up [0, 95% confidence interval: (0.0-1.9)]. The HEART score was ≤ 3 for 63.0% of patients, and none presented a MACE during follow-up [0% (0.0-0.9)]. With an incidence below 2% in the negative group, the CARE rule seemed able to safely rule out a MACE without any biological test for one-third of patients with chest pain and the HEART score for another third with a single troponin assay.
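    The two decision tools compared in this record are simple additive scores: HEART sums five components (History, ECG, Age, Risk factors, Troponin), each graded 0-2, while CARE uses only the first four, so it needs no troponin assay. A minimal sketch of the scoring logic and the cut-offs stated in the abstract (CARE ≤ 1 negative, HEART ≤ 3 low risk); the example patient is hypothetical:

```python
def heart_score(history, ecg, age, risk_factors, troponin):
    # each HEART component is graded 0, 1 or 2
    for c in (history, ecg, age, risk_factors, troponin):
        assert c in (0, 1, 2)
    return history + ecg + age + risk_factors + troponin

def care_rule_negative(history, ecg, age, risk_factors):
    # CARE = first four HEART elements; the rule is negative when the sum <= 1
    return history + ecg + age + risk_factors <= 1

# hypothetical patient: slightly suspicious history, everything else unremarkable
print(care_rule_negative(1, 0, 0, 0))   # True  -> rule-out without troponin
print(heart_score(1, 0, 0, 0, 0) <= 3)  # True  -> low risk on HEART
```

    The clinical grading of each component (what counts as a "1" history, which risk factors score "2", etc.) is defined by the published HEART score and is not reproduced here.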

  6. A statistical model for predicting muscle performance

    Science.gov (United States)

    Byerly, Diane Leslie De Caix

    The objective of these studies was to develop a capability for predicting muscle performance and fatigue to be utilized for both space- and ground-based applications. To develop this predictive model, healthy test subjects performed a defined, repetitive dynamic exercise to failure using a Lordex spinal machine. Throughout the exercise, surface electromyography (SEMG) data were collected from the erector spinae using a Mega Electronics ME3000 muscle tester and surface electrodes placed on both sides of the back muscle. These data were analyzed using a 5th order Autoregressive (AR) model and statistical regression analysis. It was determined that an AR derived parameter, the mean average magnitude of AR poles, significantly correlated with the maximum number of repetitions (designated Rmax) that a test subject was able to perform. Using the mean average magnitude of AR poles, a test subject's performance to failure could be predicted as early as the sixth repetition of the exercise. This predictive model has the potential to provide a basis for improving post-space flight recovery, monitoring muscle atrophy in astronauts and assessing the effectiveness of countermeasures, monitoring astronaut performance and fatigue during Extravehicular Activity (EVA) operations, providing pre-flight assessment of the ability of an EVA crewmember to perform a given task, improving the design of training protocols and simulations for strenuous International Space Station assembly EVA, and enabling EVA work task sequences to be planned enhancing astronaut performance and safety. Potential ground-based, medical applications of the predictive model include monitoring muscle deterioration and performance resulting from illness, establishing safety guidelines in the industry for repetitive tasks, monitoring the stages of rehabilitation for muscle-related injuries sustained in sports and accidents, and enhancing athletic performance through improved training protocols while reducing
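    The AR-derived feature described above (the mean magnitude of the AR poles) can be sketched numerically. This is an illustrative reconstruction, not the study's exact SEMG pipeline: the least-squares fit and the synthetic test signal are assumptions.

    ```python
    import numpy as np

    def ar_pole_magnitudes(x, order=5):
        """Fit an AR(order) model to signal x by least squares and return the
        magnitudes of its poles (roots of the characteristic polynomial)."""
        x = np.asarray(x, dtype=float)
        y = x[order:]
        # Lagged design matrix: column k holds x[t-k] for each row t
        X = np.column_stack([x[order - k : len(x) - k] for k in range(1, order + 1)])
        a, *_ = np.linalg.lstsq(X, y, rcond=None)
        # Characteristic polynomial z^p - a1*z^(p-1) - ... - ap
        poles = np.roots(np.concatenate(([1.0], -a)))
        return np.abs(poles)
    ```

    The study's predictor would then be the mean of these magnitudes, tracked across repetitions.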

  7. Assessment of the Radiation Enclosure Models in SPACE and RELAP5 with GOTA Test 27

    Energy Technology Data Exchange (ETDEWEB)

    Lee, T. B.; Lee, G. W.; Choi, T. S. [KEPCO, Daejeon (Korea, Republic of)

    2016-05-15

    SPACE (Safety and Performance Analysis Code) has been developed to calculate the transient thermal-hydraulic response of PWRs containing multiple types of fluids. Without accounting for 3-D effects such as changes in fuel rod/guide tube thermal behavior caused by radiation heat transfer, a 1-D code can predict an unrealistically high peak clad temperature. A function to simulate wall-to-wall radiation heat transfer is therefore implemented in the SPACE and RELAP5 codes. This paper discusses the assessment results for the radiation enclosure models of SPACE and RELAP5. The capability of the two codes to handle wall-to-wall radiation problems has been evaluated using experimental data from the GOTA test facility. At the top of the bundle, the maximum errors of SPACE and RELAP5 are less than 1.6% and 2.3%, respectively. Apart from the predictions near the top of the test section, there is only a small discrepancy between the calculated results and the experimental data. The SPACE code used is based on version 2.16 distributed by KHNP. To perform the simulation of GOTA test 27, it was necessary to modify the SPACE code: the input-processing subroutine for the radiation model (the inp_check function of the RadEncData class) contained a fragile algorithm for verifying the reciprocity rule of the view factors.
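    The consistency checks that such an input processor performs on an enclosure's view-factor matrix can be illustrated compactly. A sketch under stated assumptions: the function name and matrix layout below are illustrative, not SPACE's actual routine.

    ```python
    import numpy as np

    def check_view_factors(areas, F, tol=1e-6):
        """Check enclosure view-factor consistency: the summation rule
        (each row of F sums to 1) and the reciprocity rule A_i*F_ij = A_j*F_ji."""
        areas = np.asarray(areas, dtype=float)
        F = np.asarray(F, dtype=float)
        AF = areas[:, None] * F          # matrix of A_i * F_ij
        row_ok = bool(np.allclose(F.sum(axis=1), 1.0, atol=tol))
        recip_ok = bool(np.allclose(AF, AF.T, atol=tol))
        return row_ok, recip_ok
    ```

    For a two-surface enclosure with areas 1 and 2, F = [[0, 1], [0.5, 0.5]] passes both checks, since 1·1 = 2·0.5.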

  8. Debt Shifting and Thin-Capitalization Rules – German Experience and Alternative Approaches

    Directory of Open Access Journals (Sweden)

    Ruf Martin

    2015-09-01

    Full Text Available This paper presents the general design of thin-capitalization rules and summarizes the economic effects of such rules as identified in theoretical models. We review empirical studies providing evidence on the experience with (German) thin-capitalization rules as well as on the adjustment of German multinationals to foreign thin-capitalization rules. Special emphasis is given to the development in Germany, because Germany went a long way in limiting interest deductibility by enacting a drastic change in its thin-capitalization rules in 2008, and because superb German data on multinational finance allows for testing several aspects consistently. We then discuss the experience of the Nordic countries with thin-capitalization rules. Briefly reviewing potential alternatives as well, we believe that the arm’s-length principle is administratively too costly and impracticable, whereas we argue that controlled-foreign-company rules might be another promising avenue for limiting internal debt shifting. Fundamental tax reforms towards a system with either "allowance for corporate equity" (ACE or a "comprehensive business income tax" (CBIT should also eliminate any thin-capitalization incentive.

  9. A rule-based approach to model checking of UML state machines

    Science.gov (United States)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In this paper a new approach to the formal verification of control process specifications expressed as UML 2.x state machines is proposed. In contrast to other approaches in the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.

  10. Towards Accurate Modelling of Galaxy Clustering on Small Scales: Testing the Standard ΛCDM + Halo Model

    Science.gov (United States)

    Sinha, Manodeep; Berlind, Andreas A.; McBride, Cameron K.; Scoccimarro, Roman; Piscionere, Jennifer A.; Wibking, Benjamin D.

    2018-04-01

    Interpreting the small-scale clustering of galaxies with halo models can elucidate the connection between galaxies and dark matter halos. Unfortunately, the modelling is typically not sufficiently accurate for ruling out models statistically. It is thus difficult to use the information encoded in small scales to test cosmological models or probe subtle features of the galaxy-halo connection. In this paper, we attempt to push halo modelling into the "accurate" regime with a fully numerical mock-based methodology and careful treatment of statistical and systematic errors. With our forward-modelling approach, we can incorporate clustering statistics beyond the traditional two-point statistics. We use this modelling methodology to test the standard ΛCDM + halo model against the clustering of SDSS DR7 galaxies. Specifically, we use the projected correlation function, group multiplicity function and galaxy number density as constraints. We find that while the model fits each statistic separately, it struggles to fit them simultaneously. Adding group statistics leads to a more stringent test of the model and significantly tighter constraints on model parameters. We explore the impact of varying the adopted halo definition and cosmological model and find that changing the cosmology makes a significant difference. The most successful model we tried (Planck cosmology with Mvir halos) matches the clustering of low luminosity galaxies, but exhibits a 2.3σ tension with the clustering of luminous galaxies, thus providing evidence that the "standard" halo model needs to be extended. This work opens the door to adding interesting freedom to the halo model and including additional clustering statistics as constraints.

  11. ML-Space: Hybrid Spatial Gillespie and Particle Simulation of Multi-Level Rule-Based Models in Cell Biology.

    Science.gov (United States)

    Bittig, Arne T; Uhrmacher, Adelinde M

    2017-01-01

    Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial Stochastic Simulation Algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which include dynamically and hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and makes it easy to adapt their spatial resolution.

  12. Weak solutions to interdiffusion models with Vegard rule

    Science.gov (United States)

    Sapa, Lucjan; BoŻek, Bogusław; Danielewski, Marek

    2018-01-01

    In this work we consider diffusional transport in an r-component solid solution. The one- and multidimensional models are expressed by nonlinear systems of strongly coupled differential equations with initial and nonlinear coupled boundary conditions. They are obtained from the local mass conservation law for fluxes that are a sum of diffusional and Darken drift terms, together with the Vegard rule. The considered boundary conditions allow the physical system to be either closed or open. Theorems on the existence, uniqueness, and properties of global weak solutions in the one-dimensional case are formulated. Agreement between the theoretical results, numerical simulations, and experimental data in the one-dimensional case is shown.

  13. High Dimensional Classification Using Features Annealed Independence Rules.

    Science.gov (United States)

    Fan, Jianqing; Fan, Yingying

    2008-01-01

    Classification using high-dimensional features arises frequently in many contemporary statistical studies, such as tumor classification using microarray or other high-throughput data. The impact of dimensionality on classification is poorly understood. In a seminal paper, Bickel and Levina (2004) showed that the Fisher discriminant performs poorly due to diverging spectra, and proposed using the independence rule to overcome the problem. We first demonstrate that even for the independence classification rule, classification using all the features can be as bad as random guessing, due to noise accumulation in estimating population centroids in a high-dimensional feature space. In fact, we demonstrate further that almost all linear discriminants can perform as badly as random guessing. It is therefore critically important to select a subset of important features for high-dimensional classification, resulting in Features Annealed Independence Rules (FAIR). The conditions under which all the important features can be selected by the two-sample t-statistic are established. The choice of the optimal number of features, or equivalently the threshold value of the test statistic, is proposed based on an upper bound on the classification error. Simulation studies and real data analysis support our theoretical results and demonstrate convincingly the advantage of the new classification procedure.
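    The core FAIR idea, screening features by their two-sample t-statistic and then applying an independence (diagonal-covariance) centroid rule on the survivors, can be sketched as follows. This is a simplified illustration of the idea, not the paper's exact estimator; the pooled-variance scaling is an assumption.

    ```python
    import numpy as np

    def fair_classifier(X0, X1, m):
        """Rank features by absolute two-sample t-statistic and build a
        diagonal (independence-rule) centroid classifier on the top m."""
        X0 = np.asarray(X0, dtype=float)
        X1 = np.asarray(X1, dtype=float)
        n0, n1 = len(X0), len(X1)
        mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
        s0, s1 = X0.var(axis=0, ddof=1), X1.var(axis=0, ddof=1)
        t = (mu1 - mu0) / np.sqrt(s0 / n0 + s1 / n1)   # per-feature t-statistic
        sel = np.argsort(-np.abs(t))[:m]               # top-m features by |t|
        pooled = ((n0 - 1) * s0 + (n1 - 1) * s1) / (n0 + n1 - 2)

        def predict(X):
            X = np.asarray(X, dtype=float)[..., sel]
            # Variance-weighted distance to each class centroid (diagonal rule)
            d0 = np.sum((X - mu0[sel]) ** 2 / pooled[sel], axis=-1)
            d1 = np.sum((X - mu1[sel]) ** 2 / pooled[sel], axis=-1)
            return (d1 < d0).astype(int)

        return predict, sel
    ```

    With only a few informative features among many noise features, restricting the rule to the top-ranked features avoids the noise-accumulation effect the paper describes.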

  14. A Model to Identify the Most Effective Business Rule in Information Systems using Rough Set Theory: Study on Loan Business Process

    Directory of Open Access Journals (Sweden)

    Mohammad Aghdasi

    2011-09-01

    In this paper, a practical model is used to identify the most effective rules in information systems. In this model, critical business attributes, those that fit strategic expectations, are first taken into account; these are the attributes whose changes matter more than others for achieving the strategic expectations. To identify these attributes we utilize rough set theory. The business rules that use critical information attributes in their structure are identified as the most effective business rules. The proposed model helps information system developers to identify the scope of effective business rules, decreasing the time and cost of information system maintenance. It also helps business analysts focus on managing critical business attributes in order to achieve a specific goal.

  15. Connecting clinical and actuarial prediction with rule-based methods.

    Science.gov (United States)

    Fokkema, Marjolein; Smits, Niels; Kelderman, Henk; Penninx, Brenda W J H

    2015-06-01

    Meta-analyses comparing the accuracy of clinical versus actuarial prediction have shown actuarial methods to outperform clinical methods, on average. However, actuarial methods are still not widely used in clinical practice, and there has been a call for the development of actuarial prediction methods for clinical practice. We argue that rule-based methods may be more useful than the linear main effect models usually employed in prediction studies, from a data and decision analytic as well as a practical perspective. In addition, decision rules derived with rule-based methods can be represented as fast and frugal trees, which, unlike main effects models, can be used in a sequential fashion, reducing the number of cues that have to be evaluated before making a prediction. We illustrate the usability of rule-based methods by applying RuleFit, an algorithm for deriving decision rules for classification and regression problems, to a dataset on prediction of the course of depressive and anxiety disorders from Penninx et al. (2011). The RuleFit algorithm provided a model consisting of 2 simple decision rules, requiring evaluation of only 2 to 4 cues. Predictive accuracy of the 2-rule model was very similar to that of a logistic regression model incorporating 20 predictor variables, originally applied to the dataset; moreover, the 2-rule model required, on average, evaluation of only 3 cues. Therefore, the RuleFit algorithm appears to be a promising method for creating decision tools that are less time-consuming and easier to apply in psychological practice, with accuracy comparable to traditional actuarial methods. (c) 2015 APA, all rights reserved.
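    A short decision list of the kind RuleFit produces can be applied sequentially: the first rule whose condition holds determines the prediction. The rules below are invented placeholders for illustration, not the ones derived by the study.

    ```python
    def decision_list_predict(x, rules, default=0):
        """Apply a short decision list: the first rule whose condition holds
        on case x determines the prediction; otherwise return the default."""
        for condition, outcome in rules:
            if condition(x):
                return outcome
        return default

    # Hypothetical two-rule model over a dict of cues (illustrative only;
    # the cue names and thresholds are not from the paper)
    example_rules = [
        (lambda x: x["severity"] >= 20 and x["duration_months"] >= 12, 1),
        (lambda x: x["severity"] >= 30, 1),
    ]
    ```

    Because evaluation stops at the first matching rule, many cases need only a couple of cues, which is the "fast and frugal" property discussed above.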

  16. Implementation of maintenance rule for structures

    International Nuclear Information System (INIS)

    Ashar, H.; Bagchi, G.

    1999-01-01

    The maintenance rule, 10 CFR 50.65, 'Requirements for Monitoring the Effectiveness of Maintenance at Nuclear Power Plants', was published by the Nuclear Regulatory Commission (NRC) in the Federal Register (56 FR 31324) on July 10, 1991. The rule became effective on July 10, 1996, giving nuclear power plant licensees 5 years to implement it. During 1994-1995, NRC staff visited nine nuclear power plant sites to observe licensees' preparations for implementation of the rule. The teams found that most of the licensees had not established goals, or performance criteria for monitoring structures at their sites. The licensees contended that the structures were inherently reliable and required no monitoring under the maintenance rule. On the basis of earlier site visits performed by NRC staff to assess the condition of structures, the NRC staff could not accept this contention, and clarified its position in Revision 1 of Regulatory Guide 1.160, 'Monitoring the Effectiveness of Maintenance at Nuclear Power Plants'. This paper discusses the applicability of the maintenance rule criteria for structures and its usefulness in ensuring that the structures, systems, and components within the scope of the maintenance rule are capable of fulfilling their intended functions. Also discussed are the aspects of maintenance rule efforts that could be useful for license renewal applications. (orig.)

  17. Computerized Adaptive Testing with R: Recent Updates of the Package catR

    Directory of Open Access Journals (Sweden)

    David Magis

    2017-01-01

    Full Text Available The purpose of this paper is to list the recent updates of the R package catR. This package allows for generating response patterns under a computerized adaptive testing (CAT) framework with underlying item response theory (IRT) models. Among the most important updates: well-known polytomous IRT models are now supported by catR; several item selection rules have been added; and it is now possible to perform post-hoc simulations. Some functions were also rewritten or withdrawn to improve the usefulness and performance of the package.

  18. Distributional Benefit Analysis of a National Air Quality Rule

    Directory of Open Access Journals (Sweden)

    Jin Huang

    2011-06-01

    Full Text Available Under Executive Order 12898, the U.S. Environmental Protection Agency (EPA must perform environmental justice (EJ reviews of its rules and regulations. EJ analyses address the hypothesis that environmental disamenities are experienced disproportionately by poor and/or minority subgroups. Such analyses typically use communities as the unit of analysis. While community-based approaches make sense when considering where polluting sources locate, they are less appropriate for national air quality rules affecting many sources and pollutants that can travel thousands of miles. We compare exposures and health risks of EJ-identified individuals rather than communities to analyze EPA’s Heavy Duty Diesel (HDD rule as an example national air quality rule. Air pollutant exposures are estimated within grid cells by air quality models; all individuals in the same grid cell are assigned the same exposure. Using an inequality index, we find that inequality within racial/ethnic subgroups far outweighs inequality between them. We find, moreover, that the HDD rule leaves between-subgroup inequality essentially unchanged. Changes in health risks depend also on subgroups’ baseline incidence rates, which differ across subgroups. Thus, health risk reductions may not follow the same pattern as reductions in exposure. These results are likely representative of other national air quality rules as well.
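    The within/between-subgroup decomposition described above can be illustrated with the Theil index. The abstract does not name the inequality index it uses; the Theil index is chosen here only because it is a standard choice that decomposes exactly into between- and within-group terms.

    ```python
    import numpy as np

    def theil_decomposition(values, groups):
        """Split the Theil-T inequality index of `values` into between-group
        and within-group components, given a group label per observation."""
        values = np.asarray(values, dtype=float)
        groups = np.asarray(groups)
        mu, n = values.mean(), len(values)
        between = within = 0.0
        for g in np.unique(groups):
            v = values[groups == g]
            mug, ng = v.mean(), len(v)
            share = (ng / n) * (mug / mu)          # income-weighted group share
            between += share * np.log(mug / mu)
            within += share * np.mean((v / mug) * np.log(v / mug))
        return between, within
    ```

    The two components sum exactly to the overall Theil index, which is what lets the analysis compare inequality within racial/ethnic subgroups against inequality between them.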

  19. A comparison of Reduced Life Expectancy (RLE) model with Haber's Rule to describe effects of exposure time on toxicity

    International Nuclear Information System (INIS)

    Verma, Vibha; Yu, Qiming J.; Connell, Des W.

    2015-01-01

    The Reduced Life Expectancy (RLE) model (LC50 = [ln(NLE) − ln(LT50)]/d) has been proposed as an alternative to Haber's Rule. The model is based on a linear relationship between LC50 (lethal exposure concentration) and ln LT50 (lethal exposure time), and uses NLE (normal life expectancy) both as a limiting point and as a long-term data point (d is a constant). The purposes of this paper were to compare the RLE model with Haber's Rule using available toxicity data and to evaluate the strengths and weaknesses of each approach. When LT50 is relatively short and LC50 is high, Haber's Rule is consistent with the RLE model. The difference between the two becomes evident when LT50 is relatively long and LC50 is low, where the RLE model departs markedly from Haber's Rule. The RLE model can thus be used to evaluate long-term effects of exposure appropriately. - Highlights: • An assessment of the strengths and weaknesses of Haber's Rule relative to the RLE model is needed. • Haber's Rule, now used universally, is quite inappropriate for environmental toxicity. • Normal life expectancy, a new parameter, is used to evaluate effects of exposure time on toxicity. • According to the RLE model, when LC50 approaches zero, LT50 = normal life expectancy. • The RLE model can be used to evaluate long-term effects of exposure accurately and easily. - The RLE approach is a more appropriate alternative, particularly for evaluating long-term effects of exposure, and can easily be used to estimate them.
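    The two dose-time relationships can be compared directly from their formulas. A short sketch using the RLE equation quoted above and the classical Haber form C·t = k; the constants below are illustrative, not fitted values from the paper.

    ```python
    import numpy as np

    def lc50_rle(lt50, nle, d):
        """RLE model: LC50 = [ln(NLE) - ln(LT50)] / d."""
        return (np.log(nle) - np.log(lt50)) / d

    def lc50_haber(lt50, k):
        """Haber's Rule: C * t = k, so LC50 = k / LT50."""
        return k / lt50

    # Key qualitative difference: as LT50 approaches the normal life
    # expectancy NLE, the RLE prediction of LC50 goes to zero, whereas
    # Haber's Rule still predicts a positive lethal concentration.
    ```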

  20. Methods for significance testing of categorical covariates in logistic regression models after multiple imputation: power and applicability analysis

    NARCIS (Netherlands)

    Eekhout, I.; Wiel, M.A. van de; Heymans, M.W.

    2017-01-01

    Background. Multiple imputation is a recommended method to handle missing data. For significance testing after multiple imputation, Rubin’s Rules (RR) are easily applied to pool parameter estimates. In a logistic regression model, to consider whether a categorical covariate with more than two levels
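    The Rubin's Rules pooling step mentioned above can be sketched directly: the pooled point estimate is the mean of the imputation-specific estimates, and the total variance combines within- and between-imputation variance. This is only the standard pooling step, not the multi-level significance tests the paper compares.

    ```python
    import numpy as np

    def pool_rubin(estimates, variances):
        """Pool m imputation-specific parameter estimates and their variances
        with Rubin's Rules; returns (pooled estimate, total variance)."""
        est = np.asarray(estimates, dtype=float)
        var = np.asarray(variances, dtype=float)
        m = len(est)
        qbar = est.mean()            # pooled point estimate
        w = var.mean()               # within-imputation variance
        b = est.var(ddof=1)          # between-imputation variance
        t = w + (1 + 1 / m) * b      # total variance
        return qbar, t
    ```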

  1. Inspection system performance test procedure

    International Nuclear Information System (INIS)

    Jensen, C.E.

    1995-01-01

    This procedure establishes the requirements for administering a performance demonstration test. The test is to demonstrate that the double-shell tank inspection system (DSTIS) supplied by the contractor performs in accordance with WHC-S-4108, Double-Shell Tank Ultrasonic Inspection Performance Specification, Rev. 2-A, January 1995. The inspection system is intended to provide ultrasonic (UT) and visual data to determine the integrity of the Westinghouse Hanford Company (WHC) site underground waste tanks. The robotic inspection system consists of the following major sub-systems (modules) and components: mobile control center; deployment module; cable management assembly; robot mechanism; ultrasonic testing system; visual testing system; pneumatic system; electrical system; and control system.

  2. WhalePower tubercle blade power performance test report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2008-07-15

    Toronto-based WhalePower Corporation has developed turbine blades that are modeled after humpback whale flippers. The blades, which incorporate tubercles along the leading edge of the blade, have been fitted to a Wenvor 25 kW turbine installed in North Cape, Prince Edward Island at a test site for the Wind Energy Institute of Canada (WEICan). A test was conducted to characterize the power performance of the prototype wind turbine. This report described the wind turbine configuration with particular reference to turbine information, power rating, blade information, tower information, control systems and grid connections. The test site was also described along with test equipment and measurement procedures. Information regarding power output as a function of wind speed was included along with power curves, power coefficient and annual energy production. The results for the power curve and annual energy production contain a level of uncertainty. While measurements for this test were collected and analyzed in accordance with International Electrotechnical Commission (IEC) standards for performance measurements of electricity producing wind turbines (IEC 61400-12-1), the comparative performance data between the prototype WhalePower wind turbine blade and the Wenvor standard blade was not gathered to IEC data standards. Deviations from IEC-61400-12-1 procedures were listed. 6 tabs., 16 figs., 3 appendices.
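    The power-curve characterization cited above follows IEC 61400-12-1, whose core step is the "method of bins": averaging measured power within fixed-width wind-speed bins. A simplified sketch of that step only; the standard also prescribes air-density normalization and data-rejection criteria omitted here.

    ```python
    import numpy as np

    def power_curve_bins(wind, power, width=0.5):
        """Method of bins: average power in fixed-width wind-speed bins.
        Returns bin centers and mean power per occupied bin."""
        wind = np.asarray(wind, dtype=float)
        power = np.asarray(power, dtype=float)
        idx = np.floor(wind / width).astype(int)
        centers, means = [], []
        for b in np.unique(idx):
            mask = idx == b
            centers.append((b + 0.5) * width)   # center of the bin
            means.append(power[mask].mean())
        return np.array(centers), np.array(means)
    ```

    Annual energy production is then estimated by weighting this binned curve with a site wind-speed distribution.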

  3. Decision Mining Revisited – Discovering Overlapping Rules

    NARCIS (Netherlands)

    Mannhardt, F.; de Leoni, M.; Reijers, H.A.; van der Aalst, W.M.P.

    2016-01-01

    Decision mining enriches process models with rules underlying decisions in processes using historical process execution data. Choices between multiple activities are specified through rules defined over process data. Existing decision mining methods focus on discovering mutually-exclusive rules,

  4. Assessing The Performance of Hydrological Models

    Science.gov (United States)

    van der Knijff, Johan

    The performance of hydrological models is often characterized using the coefficient of efficiency, E. The sensitivity of E to extreme streamflow values, and the difficulty of deciding what value of E should be used as a threshold to identify 'good' models or model parameterizations, have proven to be serious shortcomings of this index. This paper reviews some alternative performance indices that have appeared in the literature. Legates and McCabe (1999) suggested a more generalized form of E, E'(j,B). Here, j is a parameter that controls how much emphasis is put on extreme streamflow values, and B defines a benchmark or 'null hypothesis' against which the results of the model are tested. E'(j,B) was used to evaluate a large number of parameterizations of a conceptual rainfall-runoff model, using 6 different combinations of j and B. First, the effect of j and B is explained. Second, it is demonstrated how the index can be used to explicitly test hypotheses about the model and the data. This approach appears to be particularly attractive if the index is used as a likelihood measure within a GLUE-type analysis.
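    The generalized index can be written down compactly. A sketch following the Legates and McCabe form E'(j,B) = 1 − Σ|O−P|^j / Σ|O−B|^j; the exact notation in the paper may differ, and the default benchmark (the observed mean) is one common choice.

    ```python
    import numpy as np

    def efficiency(obs, sim, j=2.0, benchmark=None):
        """Generalized coefficient of efficiency E'(j, B):
        E' = 1 - sum(|obs - sim|^j) / sum(|obs - B|^j).
        With j=2 and B = mean(obs) this is the classic Nash-Sutcliffe E."""
        obs = np.asarray(obs, dtype=float)
        sim = np.asarray(sim, dtype=float)
        if benchmark is None:
            benchmark = np.full_like(obs, obs.mean())
        num = np.abs(obs - sim) ** j
        den = np.abs(obs - np.asarray(benchmark, dtype=float)) ** j
        return 1.0 - num.sum() / den.sum()
    ```

    Lowering j (e.g. j=1) reduces the weight of extreme flows; supplying a non-trivial benchmark series B turns the index into an explicit test against that null model.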

  5. Bilingualism trains specific brain circuits involved in flexible rule selection and application.

    Science.gov (United States)

    Stocco, Andrea; Prat, Chantel S

    2014-10-01

    Bilingual individuals have been shown to outperform monolinguals on a variety of tasks that measure non-linguistic executive functioning, suggesting that some facets of the bilingual experience give rise to generalized improvements in cognitive performance. The current study investigated the hypothesis that such advantage in executive functioning arises from the need to flexibly select and apply rules when speaking multiple languages. Such flexible behavior may strengthen the functioning of the fronto-striatal loops that direct signals to the prefrontal cortex. To test this hypothesis, we compared behavioral and brain data from proficient bilinguals and monolinguals who performed a Rapid Instructed Task Learning paradigm, which requires behaving according to ever-changing rules. Consistent with our hypothesis, bilinguals were faster than monolinguals when executing novel rules, and this improvement was associated with greater modulation of activity in the basal ganglia. The implications of these findings for language and executive function research are discussed herein. Published by Elsevier Inc.

  6. EM Transition Sum Rules Within the Framework of sdg Proton-Neutron Interacting Boson Model, Nuclear Pair Shell Model and Fermion Dynamical Symmetry Model

    Science.gov (United States)

    Zhao, Yumin

    1997-07-01

    Using the techniques of the Wick theorem for coupled clusters, non-energy-weighted electromagnetic sum-rule calculations are presented for the sdg neutron-proton interacting boson model, the nuclear pair shell model, and the fermion dynamical symmetry model. The project was supported by the Development Project Foundation of China, the National Natural Science Foundation of China, the Doctoral Education Fund of the National Education Committee, and the Fundamental Research Fund of Southeast University.

  7. Modeling Ignition of a Heptane Isomer: Improved Thermodynamics, Reaction Pathways, Kinetic, and Rate Rule Optimizations for 2-Methylhexane

    KAUST Repository

    Mohamed, Samah; Cai, Liming; Khaled, Fathi; Banyon, Colin; Wang, Zhandong; Rachidi, Mariam El; Pitsch, Heinz; Curran, Henry J.; Farooq, Aamir; Sarathy, Mani

    2016-01-01

    Accurate chemical kinetic combustion models of lightly branched alkanes (e.g., 2-methylalkanes) are important to investigate the combustion behavior of real fuels. Improving the fidelity of existing kinetic models is a necessity, as new experiments and advanced theories show inaccuracies in certain portions of the models. This study focuses on updating thermodynamic data and the kinetic reaction mechanism for a gasoline surrogate component, 2-methylhexane, based on recently published thermodynamic group values and rate rules derived from quantum calculations and experiments. Alternative pathways for the isomerization of peroxy-alkylhydroperoxide (OOQOOH) radicals are also investigated. The effects of these updates are compared against new high-pressure shock tube and rapid compression machine ignition delay measurements. It is shown that rate constant modifications are required to improve agreement between kinetic modeling simulations and experimental data. We further demonstrate the ability to optimize the kinetic model using both manual and automated techniques for rate parameter tunings to improve agreement with the measured ignition delay time data. Finally, additional low temperature chain branching reaction pathways are shown to improve the model’s performance. The present approach to model development provides better performance across extended operating conditions while also strengthening the fundamental basis of the model.

  9. Double-shell tank integrity assessments ultrasonic test equipment performance test

    Energy Technology Data Exchange (ETDEWEB)

    Pfluger, D.C.

    1996-09-26

    A double-shell tank (DST) inspection (DSTI) system was performance tested over three months until August 1995 at Pittsburgh, Pennsylvania, completing a contract initiated in February 1993 to design, fabricate, and test an ultrasonic inspection system intended to provide ultrasonic test (UT) and visual data to determine the integrity of 28 DSTs at Hanford. The DSTs are approximately one-million-gallon underground radioactive-waste storage tanks. The test was performed in accordance with a procedure (Jensen 1995) that included requirements described in the contract specification (Pfluger 1995). This report documents the results of tests conducted to evaluate the performance of the DSTI system against the requirements of the contract specification. The test of the DSTI system also reflects the performance of qualified personnel and operating procedures.

  10. A Template Model for Multidimensional Inter-Transactional Association Rules

    NARCIS (Netherlands)

    Feng, L.; Yu, J.X.; Lu, H.J.; Han, J.W.

    2002-01-01

    Multidimensional inter-transactional association rules extend traditional association rules to describe more general associations among items with multiple properties across transactions. "After McDonald's and Burger King open branches, KFC will open a branch two months later and one mile away" is an example of such a rule.

  11. Rule Systems for Runtime Verification: A Short Tutorial

    Science.gov (United States)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

In this tutorial, we introduce two rule-based systems for on-line and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to more concise specifications while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR that adds a simple, user-friendly temporal logic. It was developed in Python, specifically to support testing of spacecraft flight software for NASA's 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, so this approach adds no instrumentation overhead. While post-mortem log analysis precludes the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
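The flavor of such conditional rule systems can be conveyed with a toy trace monitor (this is not RuleR's or LogScope's actual API; event names and rules are hypothetical): each rule fires on a trigger event and posts an obligation that a later event must discharge before the trace ends.

```python
def monitor(trace, rules):
    """Check a finite trace against response rules (trigger -> required follow-up).

    rules: dict mapping a trigger event to the event that must occur later.
    Returns the list of undischarged obligations (empty list = trace accepted).
    """
    pending = []  # obligations raised but not yet discharged
    for event in trace:
        pending = [ob for ob in pending if ob != event]  # discharge matches
        if event in rules:
            pending.append(rules[event])  # raise a new obligation
    return pending

rules = {"open": "close", "send": "ack"}  # hypothetical response rules
print(monitor(["open", "send", "ack", "close"], rules))  # → []
print(monitor(["open", "send", "close"], rules))         # → ['ack']
```

Post-mortem log analysis in the LogScope style corresponds to running such a checker over a completed log file rather than a live event stream.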

  12. Performance enhanced headgear: a scientific approach to the development of protective headgear.

    Science.gov (United States)

    McIntosh, A; McCrory, P; Finch, C F

    2004-02-01

There is a continuing debate about the performance of protective headgear in rugby union, rugby league, and Australian rules football. The aims were to examine the impact energy attenuation performance of foam that could be incorporated into headgear and to examine the performance of prototypes of modified headgear. Impact tests were conducted on polyethylene foams and protective headgear. Free-fall drop tests with a rigid headform on to a flat rigid anvil were conducted, and resultant headform acceleration was measured. Means of the headform acceleration maxima for repeat tests were calculated. Tests on polyethylene foam indicated that an increase in thickness from 10 mm to 16 mm would improve headgear performance. These modifications were incorporated in part into two headgear models: the Albion Headpro and the Canterbury brand Body Armour honeycomb headgear. The headgear tests showed that significant reductions in headform acceleration were achieved by increasing the foam density and thickness. Mean headform acceleration maxima for the 16 mm thick modified rugby headgear were about 25% of those of standard headgear for the lateral impact 0.3 and 0.4 m drop heights and 27% for the centre front 0.3 m drop tests. At these impacts, the headform acceleration for the modified rugby headgear was below 200 g. Significant improvements in impact energy attenuation performance are possible with small design changes. Whether these are sufficient to reduce the rate or severity of concussion in rugby and Australian rules football can only be shown by formal prospective studies on the field.
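The drop heights quoted above map directly to impact speeds and energies via free-fall kinematics; a short sketch (the headform mass below is a hypothetical value, not from the study):

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

def impact_speed(drop_height_m):
    """Speed at impact for a free-fall drop: v = sqrt(2*g*h)."""
    return math.sqrt(2 * g * drop_height_m)

def impact_energy(mass_kg, drop_height_m):
    """Impact energy delivered to the anvil: E = m*g*h."""
    return mass_kg * g * drop_height_m

m_headform = 5.0  # kg, hypothetical rigid-headform mass
for h in (0.3, 0.4):
    print(f"h={h} m  v={impact_speed(h):.2f} m/s  E={impact_energy(m_headform, h):.1f} J")
```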

  13. Comparison of two approaches for establishing performance criteria related to Maintenance Rule

    International Nuclear Information System (INIS)

    Jerng, Dong-Wook; Kim, Man Cheol

    2015-01-01

    Probabilistic safety assessment (PSA) serves as a tool for systemically analyzing the safety of nuclear power plants. This paper explains and compares two approaches for the establishment of performance criteria related to the Maintenance Rule: (1) the individual reliability-based approach, and (2) the PSA importance measure-based approach. Different characteristics of the two approaches were compared in a qualitative manner, while a quantitative comparison was performed through application of the two approaches to a nuclear power plant. It was observed that the individual reliability-based approach resulted in more conservative performance criteria, compared to the PSA importance measure-based approach. It is thus expected that the PSA importance measure-based approach will allow for more flexible maintenance policy under conditions of limited resources, while providing for a macroscopic view of overall plant safety. Based on insights derived through this analysis, we emphasize the importance of a balance between reliability and safety significance, and propose a balance measure accordingly. The conclusions of this analysis are likely to be applicable to other types of nuclear power plants. (author)
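The PSA importance-measure approach can be illustrated with Fussell-Vesely (FV) and Risk Achievement Worth (RAW) computed from minimal cut sets under the rare-event approximation; the cut sets and basic-event probabilities below are hypothetical, not from the referenced plant study:

```python
def top_probability(cutsets, p):
    """Rare-event approximation: P(top) ≈ sum over minimal cut sets
    of the product of basic-event probabilities."""
    total = 0.0
    for cs in cutsets:
        prod = 1.0
        for e in cs:
            prod *= p[e]
        total += prod
    return total

def fussell_vesely(event, cutsets, p):
    """FV = 1 - P(top | event probability set to 0) / P(top)."""
    base = top_probability(cutsets, p)
    p0 = dict(p, **{event: 0.0})
    return 1.0 - top_probability(cutsets, p0) / base

def raw(event, cutsets, p):
    """RAW = P(top | event probability set to 1) / P(top)."""
    base = top_probability(cutsets, p)
    p1 = dict(p, **{event: 1.0})
    return top_probability(cutsets, p1) / base

# Hypothetical system: a redundant pair {A, B} and a single failure C
cutsets = [("A", "B"), ("C",)]
p = {"A": 0.01, "B": 0.02, "C": 0.001}

print(round(fussell_vesely("C", cutsets, p), 3))  # C dominates current risk
print(round(raw("C", cutsets, p), 1))             # and has a large RAW
```

Performance criteria derived from such measures target the components that matter most for overall plant risk, which is the flexibility the abstract attributes to this approach.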

  14. Allele-sharing models: LOD scores and accurate linkage tests.

    Science.gov (United States)

    Kong, A; Cox, N J

    1997-11-01

    Starting with a test statistic for linkage analysis based on allele sharing, we propose an associated one-parameter model. Under general missing-data patterns, this model allows exact calculation of likelihood ratios and LOD scores and has been implemented by a simple modification of existing software. Most important, accurate linkage tests can be performed. Using an example, we show that some previously suggested approaches to handling less than perfectly informative data can be unacceptably conservative. Situations in which this model may not perform well are discussed, and an alternative model that requires additional computations is suggested.
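A LOD score in this setting is the base-10 log likelihood ratio against the null recombination fraction θ = 0.5. A minimal sketch for fully informative meioses with a binomial likelihood (the counts below are hypothetical):

```python
import math

def lod(recombinants, total, theta):
    """LOD = log10( L(theta) / L(0.5) ) with binomial likelihood
    L(theta) = theta**r * (1 - theta)**(n - r)."""
    r, n = recombinants, total
    l_theta = theta**r * (1 - theta) ** (n - r)
    l_null = 0.5**n
    return math.log10(l_theta / l_null)

# 2 recombinants observed in 10 informative meioses, tested at theta = 0.2
print(round(lod(2, 10, 0.2), 3))  # positive -> evidence for linkage
```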

  15. Rule base system in developing groundwater pollution expert system: predicting model

    International Nuclear Information System (INIS)

    Mongkon Ta-oun; Mohamed Daud; Mohd Zohadie Bardaie; Shamshuddin Jusop

    2000-01-01

New techniques are now available for use in the protection of the environment. One of these techniques is the use of an expert system for predicting groundwater pollution potential. The Groundwater Pollution Expert System (GWPES) rules are a collection of principles and procedures that capture the knowledge needed for groundwater pollution prediction. The rules of the groundwater pollution expert system, presented in the form of questions, choices, radio-boxes, slide rules, buttons or frames, are translated into IF-THEN rules. The rules, including their variables, types, domains and descriptions, were used by the functions of the wxCLIPS (C Language Integrated Production System) expert system shell. (author)

  16. Can the Immune System Perform a t-Test?

    Science.gov (United States)

    Faria, Bruno Filipe; Mostardinha, Patricia

    2017-01-01

The self-nonself discrimination hypothesis remains a landmark concept in immunology. It proposes that tolerance breaks down in the presence of nonself antigens. In stark contrast, in statistics, the occurrence of nonself elements in a sample (i.e., outliers) is not obligatory to violate the null hypothesis. Very often, what is crucial is the combination of (self) elements in a sample. The two views on how to detect a change seem challengingly different, and it could seem difficult to conceive how immunological cellular interactions could trigger responses with a precision comparable to some statistical tests. Here it is shown that frustrated cellular interactions reconcile the two views within a plausible immunological setting. It is proposed that the adaptive immune system can be promptly activated either when nonself ligands are detected or when self-ligands occur in abnormal combinations. In particular, we show that cellular populations behaving in this way could perform location statistical tests, with performances comparable to t or KS tests, or even more general data mining tests such as support vector machines or random forests. In more general terms, this work claims that plausible immunological models should provide accurate detection mechanisms for host protection and, furthermore, that investigation of mechanisms leading to improved detection in “in silico” models can help unveil how the real immune system works. PMID:28046042
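As a reference point for the location tests mentioned (t, KS), Welch's two-sample t statistic can be computed in a few lines of pure Python (the sample data are hypothetical):

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic:
    (mean_a - mean_b) / sqrt(s_a^2/n_a + s_b^2/n_b)."""
    def mean(x):
        return sum(x) / len(x)
    def var(x):  # unbiased sample variance
        m = mean(x)
        return sum((v - m) ** 2 for v in x) / (len(x) - 1)
    return (mean(a) - mean(b)) / math.sqrt(var(a) / len(a) + var(b) / len(b))

self_sample = [1.0, 2.0, 3.0]  # hypothetical "self" measurements
shifted     = [2.0, 3.0, 4.0]  # same spread, shifted location
print(round(welch_t(self_sample, shifted), 4))  # → -1.2247
```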

  17. Temperature Buffer Test. Final THM modelling

    Energy Technology Data Exchange (ETDEWEB)

    Aakesson, Mattias; Malmberg, Daniel; Boergesson, Lennart; Hernelind, Jan [Clay Technology AB, Lund (Sweden); Ledesma, Alberto; Jacinto, Abel [UPC, Universitat Politecnica de Catalunya, Barcelona (Spain)

    2012-01-15

The Temperature Buffer Test (TBT) is a joint project between SKB/ANDRA and supported by ENRESA (modelling) and DBE (instrumentation), which aims at improving the understanding and modelling of the thermo-hydro-mechanical behavior of buffers made of swelling clay submitted to high temperatures (over 100 deg C) during the water saturation process. The test has been carried out in a KBS-3 deposition hole at Aespoe HRL. It was installed during the spring of 2003. Two heaters (3 m long, 0.6 m diameter) and two buffer arrangements have been investigated: the lower heater was surrounded by bentonite only, whereas the upper heater was surrounded by a composite barrier, with a sand shield between the heater and the bentonite. The test was dismantled and sampled during the winter of 2009/2010. This report presents the final THM modelling which was resumed subsequent to the dismantling operation. The main part of this work has been numerical modelling of the field test. Three different modelling teams have presented several model cases for different geometries and different degrees of process complexity. Two different numerical codes, Code_Bright and Abaqus, have been used. The modelling performed by UPC-Cimne using Code_Bright has been divided into three subtasks: i) analysis of the response observed in the lower part of the test, by inclusion of a number of considerations: (a) the use of the Barcelona Expansive Model for MX-80 bentonite; (b) updated parameters in the vapour diffusive flow term; (c) the use of a non-conventional water retention curve for MX-80 at high temperature; ii) assessment of a possible relation between the cracks observed in the bentonite blocks in the upper part of TBT, and the cycles of suction and stresses registered in that zone at the start of the experiment; and iii) analysis of the performance, observations and interpretation of the entire test. It was however not possible to carry out a full THM analysis until the end of the test due to

  18. Temperature Buffer Test. Final THM modelling

    International Nuclear Information System (INIS)

    Aakesson, Mattias; Malmberg, Daniel; Boergesson, Lennart; Hernelind, Jan; Ledesma, Alberto; Jacinto, Abel

    2012-01-01

The Temperature Buffer Test (TBT) is a joint project between SKB/ANDRA and supported by ENRESA (modelling) and DBE (instrumentation), which aims at improving the understanding and modelling of the thermo-hydro-mechanical behavior of buffers made of swelling clay submitted to high temperatures (over 100 deg C) during the water saturation process. The test has been carried out in a KBS-3 deposition hole at Aespoe HRL. It was installed during the spring of 2003. Two heaters (3 m long, 0.6 m diameter) and two buffer arrangements have been investigated: the lower heater was surrounded by bentonite only, whereas the upper heater was surrounded by a composite barrier, with a sand shield between the heater and the bentonite. The test was dismantled and sampled during the winter of 2009/2010. This report presents the final THM modelling which was resumed subsequent to the dismantling operation. The main part of this work has been numerical modelling of the field test. Three different modelling teams have presented several model cases for different geometries and different degrees of process complexity. Two different numerical codes, Code_Bright and Abaqus, have been used. The modelling performed by UPC-Cimne using Code_Bright has been divided into three subtasks: i) analysis of the response observed in the lower part of the test, by inclusion of a number of considerations: (a) the use of the Barcelona Expansive Model for MX-80 bentonite; (b) updated parameters in the vapour diffusive flow term; (c) the use of a non-conventional water retention curve for MX-80 at high temperature; ii) assessment of a possible relation between the cracks observed in the bentonite blocks in the upper part of TBT, and the cycles of suction and stresses registered in that zone at the start of the experiment; and iii) analysis of the performance, observations and interpretation of the entire test. It was however not possible to carry out a full THM analysis until the end of the test due to

  19. Development of Future Rule Curves for Multipurpose Reservoir Operation Using Conditional Genetic and Tabu Search Algorithms

    Directory of Open Access Journals (Sweden)

    Anongrit Kangrang

    2018-01-01

Optimal rule curves are necessary guidelines in reservoir operation, used to assess the performance of a reservoir in satisfying water supply, irrigation, industrial, hydropower, and environmental conservation requirements. This study applied the conditional genetic algorithm (CGA) and the conditional tabu search algorithm (CTSA), connected to a reservoir simulation model, to search for optimal reservoir rule curves. The Ubolrat Reservoir located in the northeast region of Thailand was an illustrative application, including historic monthly inflow, future inflow generated by the SWAT hydrological model using 50-year future climate data from the PRECIS regional climate model under the IPCC SRES B2 emission scenario, water demand, hydrologic data, and physical reservoir data. The future and synthetic inflow data of the reservoir were used to simulate the reservoir system for evaluating the water situation. Situations of water shortage and excess water were expressed in terms of frequency, magnitude and duration. The results show that the optimal rule curves from the CGA and CTSA connected with the simulation model mitigate drought and flood situations better than the existing rule curves, and that the optimal future rule curves were more suitable for future situations than the other rule curves.

  20. Assessing Performance of Multipurpose Reservoir System Using Two-Point Linear Hedging Rule

    Science.gov (United States)

    Sasireka, K.; Neelakantan, T. R.

    2017-07-01

Reservoir operation is one of the important fields of water resource management. Innovative techniques in water resource management are focused on optimizing the available water and on decreasing the environmental impact of water utilization on the natural environment. In the operation of a multi-reservoir system, efficient regulation of releases to satisfy demands for various purposes such as domestic supply, irrigation and hydropower can increase the benefit from the reservoir as well as significantly reduce damage due to floods. Hedging rules are an emerging technique in reservoir operation that reduces the severity of drought by accepting a number of smaller shortages. The key objective of this paper is to maximize the minimum power production and improve the reliability of water supply for municipal and irrigation purposes by using a hedging rule. In this paper, a Type II two-point linear hedging rule is applied to improve the operation of the Bargi reservoir in the Narmada basin in India. The results obtained from simulation of the hedging rule are compared with results from the Standard Operating Policy; the results show that the application of the hedging rule significantly improved the reliability of water supply, the reliability of irrigation releases and firm power production.
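A two-point linear hedging rule of this general type delivers full demand when water availability is high and ramps the release down linearly between two trigger points. The sketch below uses hypothetical trigger volumes, demand and minimum release (not the Bargi reservoir parameters):

```python
def hedged_release(water_available, demand, v1, v2, r_min):
    """Two-point linear hedging: full demand above v2, minimum release
    below v1, linear interpolation in between (requires v1 < v2)."""
    if water_available >= v2:
        return demand
    if water_available <= v1:
        return min(r_min, water_available)
    frac = (water_available - v1) / (v2 - v1)
    return r_min + frac * (demand - r_min)

# Hypothetical monthly figures (million m^3)
demand, v1, v2, r_min = 100.0, 150.0, 400.0, 40.0
for wa in (120.0, 150.0, 275.0, 400.0, 500.0):
    print(wa, hedged_release(wa, demand, v1, v2, r_min))
```

Accepting the partial deliveries in the middle band is exactly the trade the abstract describes: many small shortages instead of a few severe ones.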

  1. Fast Flux Test Facility core restraint system performance

    International Nuclear Information System (INIS)

    Hecht, S.L.; Trenchard, R.G.

    1990-02-01

    Characterizing Fast Flux Test Facility (FFTF) core restraint system performance has been ongoing since the first operating cycle. Characterization consists of prerun analysis for each core load, in-reactor and postirradiation measurements of subassembly withdrawal loads and deformations, and using measurement data to fine tune predictive models. Monitoring FFTF operations and performing trend analysis has made it possible to gain insight into core restraint system performance and head off refueling difficulties while maximizing component lifetimes. Additionally, valuable information for improved designs and operating methods has been obtained. Focus is on past operating experience, emphasizing performance improvements and avoidance of potential problems. 4 refs., 12 figs., 2 tabs

  2. Microbiopsy a first-level diagnostic test to rule out oral dysplasia or carcinoma in general dental practice.

    Science.gov (United States)

    Pentenero, M; Val, M; Rosso, S; Gandolfo, S

    2018-03-01

    Diagnostic delay in oral oncology could be improved if general dentists had a reliable and easy-to-use first-level diagnostic test to rule out the presence of oral dysplasia or carcinoma. Microbiopsy has been proved to have high sensitivity and high negative predictive value in a clinical setting characterized by high prevalence of disease. Moreover, it has been proved to be easily performed by general dentists. This study aimed to determine the negative predictive value of microbiopsy in routine dental practice: a clinical setting characterized by low prevalence of disease. Within the frame of a previous study, general dentists from the Metropolitan Area of Turin performed microbiopsy for each oral mucosal lesion detected during their practice. The clinical outcome of 129 lesions negative at microbiopsy was checked by a query performed through the database of the Piedmont Cancer Registry, covering the population of the Metropolitan Area of Turin, with particular reference to cancer involving the mouth (ICD-10:C03-06). This allowed us to define "true negative" cases and to calculate the negative predictive value of microbiopsy. In a mean follow-up of 7.5 years (range 7-9 years), with a dropout rate of 7.7%, no case of tumour involving the mouth was observed, thus revealing a negative predictive value approaching 100%. Microbiopsy represents an easy-to-use and reliable first-level test able to aid general dentists to select patients requiring an oral medicine assessment in a short time and definitely to avoid diagnostic delay in oncologically relevant oral mucosal lesions. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd. All rights reserved.

  3. Automatic Learning of Fine Operating Rules for Online Power System Security Control.

    Science.gov (United States)

    Sun, Hongbin; Zhao, Feng; Wang, Hao; Wang, Kang; Jiang, Weiyong; Guo, Qinglai; Zhang, Boming; Wehenkel, Louis

    2016-08-01

    Fine operating rules for security control and an automatic system for their online discovery were developed to adapt to the development of smart grids. The automatic system uses the real-time system state to determine critical flowgates, and then a continuation power flow-based security analysis is used to compute the initial transfer capability of critical flowgates. Next, the system applies the Monte Carlo simulations to expected short-term operating condition changes, feature selection, and a linear least squares fitting of the fine operating rules. The proposed system was validated both on an academic test system and on a provincial power system in China. The results indicated that the derived rules provide accuracy and good interpretability and are suitable for real-time power system security control. The use of high-performance computing systems enables these fine operating rules to be refreshed online every 15 min.
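The final step described above, a linear least-squares fit of the fine operating rule to Monte Carlo samples, can be sketched with the closed-form normal equations for a single feature (the sample data below are hypothetical):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x via the normal equations."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    return a, b

# Hypothetical samples: flowgate transfer capability vs. a selected feature
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.1, 4.9, 7.1, 8.9]  # roughly y = 1 + 2x
a, b = fit_line(xs, ys)
print(round(a, 2), round(b, 2))
```

A fitted rule of this form is trivially cheap to evaluate, which is what makes a 15-minute online refresh cycle feasible.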

  4. The spatial distance rule in the moving and classical rubber hand illusions.

    Science.gov (United States)

    Kalckert, Andreas; Ehrsson, H Henrik

    2014-11-01

The rubber hand illusion (RHI) is a perceptual illusion in which participants perceive a model hand as part of their own body. Here, through the use of one questionnaire experiment and two proprioceptive drift experiments, we investigated the effect of distance (12, 27.5, and 43 cm) in the vertical plane on both the moving and classical RHI. In both versions of the illusion, we found an effect of distance on ownership of the rubber hand for both measures tested. Our results further suggested that the moving RHI might follow a narrower spatial rule. Finally, whereas ownership of the moving rubber hand was affected by distance, this was not the case for agency, which was present at all distances tested. In sum, the present results generalize the spatial distance rule for ownership to the vertical plane of space and demonstrate that the moving RHI also obeys this rule. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  5. Radioactive material packaging performance testing

    International Nuclear Information System (INIS)

    Romano, T.; Cruse, J.M.

    1991-02-01

To provide uniform packaging of hazardous materials on an international level, the United Nations has developed packaging recommendations that have been implemented worldwide. The United Nations packaging recommendations are performance oriented, allowing for a wide variety of package materials and systems. As a result of this international standard, efforts in the United States are being directed toward the use of performance-oriented packaging and the elimination of specification (designed) packaging. This presentation focuses on trends, design evaluation, and performance testing of radioactive material packaging. The impacts of US Department of Transportation Dockets HM-181 and HM-169A on specification and low-specific-activity radioactive material packaging requirements are briefly discussed. The US Department of Energy's program for evaluating radioactive material packagings per US Department of Transportation Specification 7A Type A requirements is used as the basis for discussing low-activity packaging performance test requirements. High-activity package testing requirements are presented with examples of testing performed at the Hanford Site, which is operated by Westinghouse Hanford Company for the US Department of Energy. 5 refs., 2 tabs

  6. Performance of predictive models in phase equilibria of complex associating systems: PC-SAFT and CEOS/GE

    Directory of Open Access Journals (Sweden)

    N. Bender

    2013-03-01

Cubic equations of state combined with excess Gibbs energy predictive models (like UNIFAC), and equations of state based on applied statistical mechanics, are among the main alternatives for phase equilibria prediction involving polar substances over wide temperature and pressure ranges. In this work, the predictive performances of PC-SAFT with an association contribution and of Peng-Robinson (PR) combined with UNIFAC (Do) through mixing rules are compared. Binary and multi-component systems involving polar and non-polar substances were analyzed, and results were compared to experimental data available in the literature. The results show a similar predictive performance for PC-SAFT with association and cubic equations combined with UNIFAC (Do) through mixing rules. Although PC-SAFT with association requires fewer parameters, it is more complex and requires more computation time.

  7. Influence of fuzzy norms and other heuristics on "Mixed fuzzy rule formation" - [Corrigendum

    OpenAIRE

    Gabriel, Thomas R.; Berthold, Michael R.

    2008-01-01

    We hereby correct an error in Ref. [2], in which we studied the influence of various parameters that affect the generalization performance of fuzzy models constructed using the mixed fuzzy rule formation method [1].

  8. A wave model test bed study for wave energy resource characterization

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Zhaoqing; Neary, Vincent S.; Wang, Taiping; Gunawan, Budi; Dallman, Annie R.; Wu, Wei-Cheng

    2017-12-01

    This paper presents a test bed study conducted to evaluate best practices in wave modeling to characterize energy resources. The model test bed off the central Oregon Coast was selected because of the high wave energy and available measured data at the site. Two third-generation spectral wave models, SWAN and WWIII, were evaluated. A four-level nested-grid approach—from global to test bed scale—was employed. Model skills were assessed using a set of model performance metrics based on comparing six simulated wave resource parameters to observations from a wave buoy inside the test bed. Both WWIII and SWAN performed well at the test bed site and exhibited similar modeling skills. The ST4 package with WWIII, which represents better physics for wave growth and dissipation, out-performed ST2 physics and improved wave power density and significant wave height predictions. However, ST4 physics tended to overpredict the wave energy period. The newly developed ST6 physics did not improve the overall model skill for predicting the six wave resource parameters. Sensitivity analysis using different wave frequencies and direction resolutions indicated the model results were not sensitive to spectral resolutions at the test bed site, likely due to the absence of complex bathymetric and geometric features.
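Skill metrics of the kind used in such model-buoy comparisons typically include bias, RMSE, and the scatter index (RMSE normalized by the observed mean). A compact sketch with hypothetical observed and modeled significant wave heights:

```python
import math

def skill_metrics(obs, mod):
    """Bias, RMSE, and scatter index (RMSE / observed mean)."""
    n = len(obs)
    bias = sum(m - o for o, m in zip(obs, mod)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for o, m in zip(obs, mod)) / n)
    si = rmse / (sum(obs) / n)
    return bias, rmse, si

obs = [2.0, 2.5, 3.0, 3.5]  # buoy significant wave height, m (hypothetical)
mod = [2.1, 2.4, 3.2, 3.4]  # simulated significant wave height, m
bias, rmse, si = skill_metrics(obs, mod)
print(round(bias, 3), round(rmse, 3), round(si, 3))
```

The same machinery applies to each of the six wave resource parameters mentioned (e.g., wave power density, energy period), one observed/modeled series per parameter.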

  9. How Politics Shapes the Growth of Rules

    DEFF Research Database (Denmark)

    Jakobsen, Mads Leth Felsager; Mortensen, Peter Bjerre

    2015-01-01

This article examines the impact of politics on governmental rule production. Traditionally, explanations of rule dynamics have focused on nonpolitical factors such as the self-evolvement of rules, environmental factors, and decision maker attributes. This article develops a set of hypotheses about when, why, and how political factors shape changes in the stock of rules, and tests these hypotheses on a unique, new data set based on all Danish primary legislation and administrative rules from 1989 to 2011, categorized into 20 different policy domains. The analysis shows that the traditional Weberian “rules breed rules” explanations must be supplemented with political explanations that take party ideology and changes in the political agenda into account. Moreover, the effect of political factors is indistinguishable across changes in primary laws and changes in administrative rules.

  10. Play Behavior in Wolves: Using the '50:50' Rule to Test for Egalitarian Play Styles.

    Directory of Open Access Journals (Sweden)

    Jennifer L Essler

Social play is known as a cooperative interaction between individuals involving multiple mechanisms. However, the extent to which the equality of individuals' play styles affects the interaction has not been studied in many species. Dyadic play between wolf puppies, as well as between puppies and adults, was studied to investigate both self-handicapping and offensive behaviors to determine the extent to which wolves engage in play styles where one individual does not dominate the play. Our results did not support the hypothesized '50:50' rule, which suggests that more advantaged individuals should show higher rates of self-handicapping behaviors in order to facilitate play with others. Adult wolves performed significantly less self-handicapping behaviors than their puppy partners, and they performed significantly more offensive behaviors than their puppy partners. While the '50:50' rule was not supported at any time during our study period, dyads consisting of two puppies had significantly more equal play than dyads consisting of one puppy and one adult. These results suggest that wolf puppies are more likely to play on equal terms with similarly-aged play partners, while the dominance status of the partners dictates offensive and self-handicapping behaviors between animals of different ages.

  11. Play Behavior in Wolves: Using the '50:50' Rule to Test for Egalitarian Play Styles.

    Science.gov (United States)

    Essler, Jennifer L; Cafazzo, Simona; Marshall-Pescini, Sarah; Virányi, Zsófia; Kotrschal, Kurt; Range, Friederike

    2016-01-01

    Social play is known as a cooperative interaction between individuals involving multiple mechanisms. However, the extent to which the equality of individuals' play styles affects the interaction has not been studied in many species. Dyadic play between wolf puppies, as well as between puppies and adults, was studied to investigate both self-handicapping and offensive behaviors to determine the extent to which wolves engage in play styles where one individual does not dominate the play. Our results did not support the hypothesized '50:50' rule, which suggests that more advantaged individuals should show higher rates of self-handicapping behaviors in order to facilitate play with others. Adult wolves performed significantly less self-handicapping behaviors than their puppy partners, and they performed significantly more offensive behaviors than their puppy partners. While the '50:50' rule was not supported at any time during our study period, dyads consisting of two puppies had significantly more equal play than dyads consisting of one puppy and one adult. These results suggest that wolf puppies are more likely to play on equal terms with similarly-aged play partners, while the dominance status of the partners dictates offensive and self-handicapping behaviors between animals of different ages.

  12. Uniform peanut performance test 2017

    Science.gov (United States)

The Uniform Peanut Performance Tests (UPPT) are designed to evaluate the commercial potential of advanced breeding peanut lines not formally released. The tests are performed in ten locations across the peanut production belt. In this study, 2 controls and 14 entries were evaluated at 8 locations.

  13. The effect of membership rules and voting schemes on the success of international climate agreements

    NARCIS (Netherlands)

    Finus, M.; Altamirano-Cabrera, J.C.; Ierland, van E.C.

    2005-01-01

    We empirically test the role of membership rules and voting schemes for climate change coalitions with the STAbility of COalitions model (STACO). The model comprises twelve world regions and captures long-run effects of greenhouse gas accumulation. We apply three stability concepts that capture the

  14. 12 CFR 621.7 - Rule of aggregation.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Rule of aggregation. 621.7 Section 621.7 Banks... Performance and Valuation Assessment § 621.7 Rule of aggregation. (a) When one loan to a borrower is placed in... risk and are fully collectible, then they may remain in their current performance category. (c) When an...

  15. Using fuzzy association rule mining in cancer classification

    International Nuclear Information System (INIS)

    Mahmoodian, Hamid; Marhaban, M.H.; Abdulrahim, Raha; Rosli, Rozita; Saripan, Iqbal

    2011-01-01

Full text: The classification of cancer tumors based on gene expression profiles has been extensively studied. A wide variety of cancer datasets have been analyzed with various methods of gene selection and classification to identify the behavior of the genes in tumors and to find the relationships between them and disease outcomes. Interpretability of the model, which is developed here with fuzzy rules and linguistic variables, has rarely been considered. In addition, creating a fuzzy classifier with high classification performance that uses a subset of significant genes selected by different types of gene selection methods is another goal of this study. A new algorithm has been developed to identify the fuzzy rules and significant genes based on fuzzy association rule mining. At first, different subsets of genes selected by different methods were used to generate primary fuzzy classifiers separately, and then the proposed algorithm was implemented to combine the genes associated in the primary classifiers and generate a new classifier. The results show that the fuzzy classifier can classify the tumors with high performance while presenting the relationships between the genes through linguistic variables
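The core quantities in fuzzy association rule mining are fuzzy support and confidence, with crisp expression values mapped to linguistic terms (e.g., "high") by membership functions. A toy sketch, not the paper's algorithm; the genes, samples and membership parameters are hypothetical:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_support(rows, antecedent, consequent):
    """Fuzzy support of rule A -> C: mean of min(memberships) over samples."""
    total = sum(min(min(antecedent(r)), consequent(r)) for r in rows)
    return total / len(rows)

# Hypothetical samples: (expression of geneA, expression of geneB)
rows = [(0.8, 0.7), (0.9, 0.9), (0.2, 0.1)]
high = lambda x: tri(x, 0.5, 1.0, 1.5)   # linguistic term "high"

ante = lambda r: [high(r[0])]            # IF geneA is high
cons = lambda r: high(r[1])              # THEN geneB is high

sup_rule = fuzzy_support(rows, ante, cons)
sup_ante = sum(min(ante(r)) for r in rows) / len(rows)
confidence = sup_rule / sup_ante
print(round(sup_rule, 3), round(confidence, 3))
```

Rules whose fuzzy support and confidence clear chosen thresholds become the IF-THEN rules of the classifier, which is what keeps the model interpretable.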

  16. A dynamic analysis of moving average rules

    NARCIS (Netherlands)

    Chiarella, C.; He, X.Z.; Hommes, C.H.

    2006-01-01

    The use of various moving average (MA) rules remains popular with financial market practitioners. These rules have recently become the focus of a number of empirical studies, but there have been very few studies of financial market models in which some agents employ technical trading rules of this type
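
    A minimal sketch of a moving-average trading rule of the kind these studies examine; the window lengths and the crossover logic are a common textbook variant, not the specific rules analysed in the paper:

```python
def moving_average(prices, window):
    """Trailing moving average; None until enough observations exist."""
    return [
        sum(prices[i - window + 1 : i + 1]) / window if i >= window - 1 else None
        for i in range(len(prices))
    ]

def ma_crossover_signals(prices, short=3, long=5):
    """+1 (buy) when the short MA crosses above the long MA, -1 (sell) when it
    crosses below, 0 otherwise."""
    s, l = moving_average(prices, short), moving_average(prices, long)
    signals = []
    for i in range(len(prices)):
        if i == 0 or None in (s[i], l[i], s[i - 1], l[i - 1]):
            signals.append(0)  # not enough history for both averages yet
        elif s[i - 1] <= l[i - 1] and s[i] > l[i]:
            signals.append(+1)
        elif s[i - 1] >= l[i - 1] and s[i] < l[i]:
            signals.append(-1)
        else:
            signals.append(0)
    return signals
```

On a price series that falls and then recovers, the rule emits a single buy signal at the upward crossover.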

  17. Experimental testing of the thermal performance of finned air coolers

    International Nuclear Information System (INIS)

    Imhof, A.; Keller, J.; Koelliker, A.

    1988-05-01

    Finned heat exchangers are often used as regenerators in heat recovery systems or as a heat source for heat pump installations. These exchangers usually operate as air coolers: heat is extracted from the air flowing through the heat exchanger. If the fin temperature lies below the dew point at the air inlet, water vapour may condense, increasing the thermal performance of the cooler. If the air/water heat exchanger is installed outdoors, the blower is usually mounted directly on the exchanger's case, which in general leads to non-ideal air flow conditions. For the sizing of such components the manufacturers have design rules which are based either on theoretical models or on experiments using a uniform air stream. These rules, which are mostly internal codes of the individual companies, presumably do not take into account non-ideal conditions such as an inhomogeneous air flow, a poorly sized blower or an increased pressure drop between the fins due to condensed water vapour. Moreover, these codes are possibly not sophisticated enough to enable correct sizing of the products for every operating condition, especially in heat pumps operating under condensation conditions. Therefore, the Swiss Federal Institute for Reactor Research (EIR) carried out a research program on the thermal performance of commercially available finned air coolers. The results give strong evidence that the sizing of finned air coolers involving a phase change in one of the heat transfer fluids is not yet a procedure belonging to the common knowledge of most manufacturers. Moreover, the correct sizing of the blower is at least as important as the sizing of the finned exchanger itself. However, it is evident that some companies on the Swiss market already use reliable design tools. 25 refs., 81 figs., 12 tabs

  18. Superconducting solenoid model magnet test results

    International Nuclear Information System (INIS)

    Carcagno, R.; Dimarco, J.; Feher, S.; Ginsburg, C.M.; Hess, C.; Kashikhin, V.V.; Orris, D.F.; Pischalnikov, Y.; Sylvester, C.; Tartaglia, M.A.; Terechkine, I.; Tompkins, J.C.; Wokas, T.; Fermilab

    2006-01-01

    Superconducting solenoid magnets suitable for the room temperature front end of the Fermilab High Intensity Neutrino Source (formerly known as Proton Driver), an 8 GeV superconducting H- linac, have been designed and fabricated at Fermilab, and tested in the Fermilab Magnet Test Facility. We report here results of studies on the first model magnets in this program, including the mechanical properties during fabrication and testing in liquid helium at 4.2 K, quench performance, and magnetic field measurements. We also describe new test facility systems and instrumentation that have been developed to accomplish these tests

  19. Superconducting solenoid model magnet test results

    Energy Technology Data Exchange (ETDEWEB)

    Carcagno, R.; Dimarco, J.; Feher, S.; Ginsburg, C.M.; Hess, C.; Kashikhin, V.V.; Orris, D.F.; Pischalnikov, Y.; Sylvester, C.; Tartaglia, M.A.; Terechkine, I.; /Fermilab

    2006-08-01

    Superconducting solenoid magnets suitable for the room temperature front end of the Fermilab High Intensity Neutrino Source (formerly known as Proton Driver), an 8 GeV superconducting H- linac, have been designed and fabricated at Fermilab, and tested in the Fermilab Magnet Test Facility. We report here results of studies on the first model magnets in this program, including the mechanical properties during fabrication and testing in liquid helium at 4.2 K, quench performance, and magnetic field measurements. We also describe new test facility systems and instrumentation that have been developed to accomplish these tests.

  20. Horizontal crash testing and analysis of model flatrols

    International Nuclear Information System (INIS)

    Dowler, H.J.; Soanes, T.P.T.

    1985-01-01

    To assess the behaviour of a full-scale flask and flatrol during a proposed demonstration impact into a tunnel abutment, a mathematical modelling technique was developed and validated. The work was performed at quarter scale and comprised both scale-model tests and mathematical analysis in one and two dimensions. Good agreement between the model test results of the 26.8 m/s (60 mph) abutment impacts and the mathematical analysis validated the modelling techniques. The modelling method may be used with confidence to predict the outcome of the proposed full-scale demonstration. (author)

  1. Mate choice when males are in patches: optimal strategies and good rules of thumb.

    Science.gov (United States)

    Hutchinson, John M C; Halupka, Konrad

    2004-11-07

    In standard mate-choice models, females encounter males sequentially and decide whether to inspect the quality of another male or to accept a male already inspected. What changes when males are clumped in patches and there is a significant cost to travel between patches? We use stochastic dynamic programming to derive optimum strategies under various assumptions. With zero costs to returning to a male in the current patch, the optimal strategy accepts males above a quality threshold which is constant whenever one or more males in the patch remain uninspected; this threshold drops when inspecting the last male in the patch, so returns may occur only then and are never to a male in a previously inspected patch. With non-zero within-patch return costs, such a two-threshold rule still performs extremely well, but a more gradual decline in acceptance threshold is optimal. Inability to return at all need not decrease performance by much. The acceptance threshold should also decline if it gets harder to discover the last males in a patch. Optimal strategies become more complex when mean male quality varies systematically between patches or years, and females estimate this in a Bayesian manner through inspecting male qualities. It can then be optimal to switch patch before inspecting all males on a patch, or, exceptionally, to return to an earlier patch. We compare performance of various rules of thumb in these environments and in ones without a patch structure. A two-threshold rule performs excellently, as do various simplifications of it. The best-of-N rule outperforms threshold rules only in non-patchy environments with between-year quality variation. The cutoff rule performs poorly.
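
    The two-threshold rule described above can be sketched roughly as follows; the zero-return-cost variant is assumed, and the thresholds are free parameters rather than values derived from the dynamic program:

```python
def two_threshold_choice(patch, high, low):
    """Sketch of a two-threshold mate-choice rule: accept the first male above
    `high` while uninspected males remain in the patch; on the last male, drop
    the threshold to `low` and allow a return to the best male already seen.
    Returns the quality of the accepted male, or None if none is acceptable."""
    best_seen = None
    for i, quality in enumerate(patch):
        if best_seen is None or quality > best_seen:
            best_seen = quality
        if i < len(patch) - 1:
            if quality >= high:
                return quality  # accept immediately under the high threshold
        else:
            # Last male: threshold drops; returning within the patch is free.
            candidate = max(quality, best_seen)
            return candidate if candidate >= low else None
```

With non-zero within-patch return costs the optimal threshold declines more gradually, but the paper reports that this two-threshold simplification still performs extremely well.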

  2. Domain XML semantic integration based on extraction rules and ontology mapping

    Directory of Open Access Journals (Sweden)

    Huayu LI

    2016-08-01

    Full Text Available A wealth of XML documents exists in the petroleum engineering field, but traditional XML integration solutions cannot support semantic queries, which leads to low data-use efficiency. In light of the semantic integration and query requirements of WeXML (oil & gas well XML data), this paper proposes a semantic integration method based on extraction rules and ontology mapping. The method first defines a series of extraction rules by which elements and properties of the WeXML Schema are mapped to classes and properties in the WeOWL ontology, respectively; second, an algorithm is used to transform WeXML documents into WeOWL instances. Because WeOWL provides limited semantics, ontology mappings between the two ontologies are then built to explain the classes and properties of the global ontology in WeOWL terms, and semantic query based on a global domain concept model is provided. By constructing a WeXML data semantic integration prototype system, the proposed transformation rules, transfer algorithm and mapping rules are tested.
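
    A rough sketch of rule-driven XML-to-triple extraction in the spirit of the method above; the element and attribute names and the ontology terms are invented stand-ins, not the actual WeXML/WeOWL vocabulary:

```python
import xml.etree.ElementTree as ET

# Hypothetical extraction rules: XML element tag -> ontology class,
# XML attribute name -> ontology property.
CLASS_RULES = {"well": "ont:Well", "bore": "ont:Borehole"}
PROPERTY_RULES = {"depth": "ont:hasDepth", "name": "ont:hasName"}

def xml_to_triples(xml_text):
    """Apply the extraction rules to an XML document, producing
    (subject, predicate, object) triples as ontology instances."""
    triples = []
    root = ET.fromstring(xml_text)
    for elem in root.iter():  # walks the root and all descendants
        cls = CLASS_RULES.get(elem.tag)
        if cls is None:
            continue  # no rule for this element: skip it
        subject = f"inst:{elem.get('id', elem.tag)}"
        triples.append((subject, "rdf:type", cls))
        for attr, value in elem.attrib.items():
            prop = PROPERTY_RULES.get(attr)
            if prop:
                triples.append((subject, prop, value))
    return triples
```

The real method additionally maps the resulting instances to a global ontology so that queries can be phrased in domain concepts rather than document structure.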

  3. Examining the Rule of Thumb of Not Using Multilevel Modeling: The "Design Effect Smaller than Two" Rule

    Science.gov (United States)

    Lai, Mark H. C.; Kwok, Oi-man

    2015-01-01

    Educational researchers commonly use the rule of thumb of "design effect smaller than 2" as the justification of not accounting for the multilevel or clustered structure in their data. The rule, however, has not yet been systematically studied in previous research. In the present study, we generated data from three different models…

  4. Structural-performance testing of titanium-shell lead-matrix container MM2

    Energy Technology Data Exchange (ETDEWEB)

    Hosaluk, L. J.; Barrie, J. N.

    1992-05-15

    This report describes the hydrostatic structural-performance testing of a half-scale, titanium-shell, lead-matrix container (MM2) with a large, simulated volumetric casting defect. Mechanical behaviour of the container is assessed from extensive surface-strain measurements and post-test non-destructive and destructive examinations. Measured strain data are compared briefly with analytical results from a finite-element model of a previous test prototype, MM1, and with data generated by a finite-difference computer code. Finally, procedures are recommended for more detailed analytical modelling. (auth)

  5. 77 FR 52977 - Regulatory Capital Rules: Advanced Approaches Risk-Based Capital Rule; Market Risk Capital Rule

    Science.gov (United States)

    2012-08-30

    ...-weighted assets for residential mortgages, securitization exposures, and counterparty credit risk. The.... Risk-Weighted Assets--Proposed Modifications to the Advanced Approaches Rules A. Counterparty Credit... Margin Period of Risk 3. Changes to the Internal Models Methodology (IMM) 4. Credit Valuation Adjustments...

  6. Preliminary Test for Constitutive Models of CAP

    Energy Technology Data Exchange (ETDEWEB)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Lee, Keo Hyung; Kim, Min Ki; Lee, Byung Chul [FNC Tech., Seoul (Korea, Republic of); Ha, Sang Jun; Choi, Hoon [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    2010-05-15

    The development project for the domestic design code was launched for use in the safety and performance analysis of pressurized light water reactors. As part of this project, the CAP (Containment Analysis Package) code has been under development for containment safety and performance analysis, side by side with SPACE. The CAP code treats three fields (vapor, continuous liquid and dispersed drops) for the assessment of containment-specific phenomena, and features assessment capabilities in multi-dimensional and lumped-parameter thermal hydraulic cells. The thermal hydraulics solver has been developed and shows significant progress. Implementation of well-proven constitutive models and correlations is essential for a containment code to be used for generalized or optimized purposes. Generally, constitutive equations are composed of interfacial and wall transport models and correlations; these equations enter the source terms of the governing field equations. In order to develop the best model and correlation package for the CAP code, various models currently used in major containment analysis codes, such as GOTHIC, CONTAIN 2.0 and CONTEMPT-LT, were reviewed. Several models and correlations were incorporated for the preliminary test of CAP's performance; the test results and future plans for improving the level of execution are discussed in this paper

  7. Preliminary Test for Constitutive Models of CAP

    International Nuclear Information System (INIS)

    Choo, Yeon Joon; Hong, Soon Joon; Hwang, Su Hyun; Lee, Keo Hyung; Kim, Min Ki; Lee, Byung Chul; Ha, Sang Jun; Choi, Hoon

    2010-01-01

    The development project for the domestic design code was launched for use in the safety and performance analysis of pressurized light water reactors. As part of this project, the CAP (Containment Analysis Package) code has been under development for containment safety and performance analysis, side by side with SPACE. The CAP code treats three fields (vapor, continuous liquid and dispersed drops) for the assessment of containment-specific phenomena, and features assessment capabilities in multi-dimensional and lumped-parameter thermal hydraulic cells. The thermal hydraulics solver has been developed and shows significant progress. Implementation of well-proven constitutive models and correlations is essential for a containment code to be used for generalized or optimized purposes. Generally, constitutive equations are composed of interfacial and wall transport models and correlations; these equations enter the source terms of the governing field equations. In order to develop the best model and correlation package for the CAP code, various models currently used in major containment analysis codes, such as GOTHIC, CONTAIN 2.0 and CONTEMPT-LT, were reviewed. Several models and correlations were incorporated for the preliminary test of CAP's performance; the test results and future plans for improving the level of execution are discussed in this paper

  8. Decision tables and rule engines in organ allocation systems for optimal transparency and flexibility.

    Science.gov (United States)

    Schaafsma, Murk; van der Deijl, Wilfred; Smits, Jacqueline M; Rahmel, Axel O; de Vries Robbé, Pieter F; Hoitsma, Andries J

    2011-05-01

    Organ allocation systems have become complex and difficult to comprehend. We introduced decision tables to specify the rules of allocation systems for different organs. A rule engine with decision tables as input was tested for the Kidney Allocation System (ETKAS). We compared this rule engine with the currently used ETKAS by running 11,000 historical match runs and by running the rule engine in parallel with the ETKAS on our allocation system. Decision tables were easy to implement and successful in verifying correctness, completeness, and consistency. The outcomes of the 11,000 historical matches in the rule engine and the ETKAS were exactly the same. Running the rule engine simultaneously in parallel and in real time with the ETKAS also produced no differences. Specifying organ allocation rules in decision tables is already a great step forward in enhancing the clarity of the systems. Yet, using these tables as rule engine input for matches optimizes the flexibility, simplicity and clarity of the whole process, from specification to the performed matches, and in addition this new method allows well controlled simulations. © 2011 The Authors. Transplant International © 2011 European Society for Organ Transplantation.
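
    A toy illustration of a decision table driving a rule engine, as in the approach above; the fields, predicates and actions are invented for illustration and bear no relation to the real ETKAS rules:

```python
# A decision table as data: each row pairs a set of conditions with an action.
# Conditions map a field name to a predicate; the first fully matching row wins.
# Facts are assumed to supply every field the table tests.
def make_table():
    return [
        ({"blood_group_match": lambda v: v is True,
          "waiting_years": lambda v: v >= 5}, "priority_offer"),
        ({"blood_group_match": lambda v: v is True}, "standard_offer"),
        ({}, "no_offer"),  # catch-all row keeps the table complete
    ]

def run_decision_table(table, facts):
    """First-match evaluation: rows are checked top to bottom."""
    for conditions, action in table:
        if all(pred(facts.get(field)) for field, pred in conditions.items()):
            return action
    raise ValueError("incomplete decision table: no row matched")
```

Keeping the rules as a table rather than as code is what makes completeness and consistency checks (and controlled simulations) straightforward.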

  9. Balancing Model Performance and Simplicity to Predict Postoperative Primary Care Blood Pressure Elevation.

    Science.gov (United States)

    Schonberger, Robert B; Dai, Feng; Brandt, Cynthia A; Burg, Matthew M

    2015-09-01

    Because of uncertainty regarding the reliability of perioperative blood pressures and traditional notions downplaying the role of anesthesiologists in longitudinal patient care, there is no consensus for anesthesiologists to recommend postoperative primary care blood pressure follow-up for patients presenting for surgery with an increased blood pressure. The decision of whom to refer should ideally be based on a predictive model that balances performance with ease-of-use. If an acceptable decision rule was developed, a new practice paradigm integrating the surgical encounter into broader public health efforts could be tested, with the goal of reducing long-term morbidity from hypertension among surgical patients. Using national data from US veterans receiving surgical care, we determined the prevalence of poorly controlled outpatient clinic blood pressures ≥140/90 mm Hg, based on the mean of up to 4 readings in the year after surgery. Four increasingly complex logistic regression models were assessed to predict this outcome. The first included the mean of 2 preoperative blood pressure readings; other models progressively added a broad array of demographic and clinical data. After internal validation, the C-statistics and the Net Reclassification Index between the simplest and most complex models were assessed. The performance characteristics of several simple blood pressure referral thresholds were then calculated. Among 215,621 patients, poorly controlled outpatient clinic blood pressure was present postoperatively in 25.7% (95% confidence interval [CI], 25.5%-25.9%) including 14.2% (95% CI, 13.9%-14.6%) of patients lacking a hypertension history. The most complex prediction model demonstrated statistically significant, but clinically marginal, improvement in discrimination over a model based on preoperative blood pressure alone (C-statistic, 0.736 [95% CI, 0.734-0.739] vs 0.721 [95% CI, 0.718-0.723]; P for difference 1 of 4 patients (95% CI, 25

  10. Forecasting Urban Air Quality via a Back-Propagation Neural Network and a Selection Sample Rule

    Directory of Open Access Journals (Sweden)

    Yonghong Liu

    2015-07-01

    Full Text Available In this paper, based on a sample selection rule and a Back-Propagation (BP) neural network, a new model for forecasting daily SO2, NO2, and PM10 concentrations at seven sites in Guangzhou was developed using data from January 2006 to April 2012. A meteorological similarity principle was applied in developing the sample selection rule. The key meteorological factors influencing daily SO2, NO2, and PM10 concentrations, as well as weight matrices and threshold matrices, were determined. A basic model was then developed based on the improved BP neural network. To improve the basic model, identification of factor variation consistency was added to the rule, and seven sets of sensitivity experiments at one of the seven sites were conducted to obtain the selected model. A comparison with the basic model from May 2011 to April 2012 at one site showed that the selected model for PM10 displayed better forecasting performance, with Mean Absolute Percentage Error (MAPE) values decreasing by 4% and R2 values increasing from 0.53 to 0.68. Evaluations conducted at the six other sites revealed similar performance. On the whole, the analysis showed that the models presented here could provide local authorities with reliable and precise predictions and alarms about air quality if used at an operational scale.
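
    The meteorological-similarity sample selection can be sketched as a nearest-neighbour filter over historical days; the feature names and the plain Euclidean distance are simplifying assumptions, not the paper's weighting scheme:

```python
import math

def select_similar_days(history, today, k=3):
    """Pick the k historical days most similar to today's meteorology,
    by Euclidean distance over the features present in `today`.
    Each historical day is a dict carrying both the meteorological
    features and the observed pollutant value used for training."""
    def distance(day):
        return math.sqrt(sum((day[f] - today[f]) ** 2 for f in today))
    return sorted(history, key=distance)[:k]
```

The selected days would then form the training sample for the BP network, instead of training on the full, heterogeneous history.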

  11. Analysing model fit of psychometric process models: An overview, a new test and an application to the diffusion model.

    Science.gov (United States)

    Ranger, Jochen; Kuhn, Jörg-Tobias; Szardenings, Carsten

    2017-05-01

    Cognitive psychometric models embed cognitive process models into a latent trait framework in order to allow for individual differences. Due to their close relationship to the response process, the models allow for profound conclusions about the test takers. However, before such a model can be used, its fit has to be checked carefully. In this manuscript we give an overview of existing tests of model fit and show their relation to the generalized moment test of Newey (Econometrica, 53, 1985, 1047) and Tauchen (J. Econometrics, 30, 1985, 415). We also present a new test, the Hausman test of misspecification (Hausman, Econometrica, 46, 1978, 1251). The Hausman test consists of a comparison of two estimates of the same item parameters, which should be similar if the model holds. The performance of the Hausman test is evaluated in a simulation study. In this study we illustrate its application to two popular models in cognitive psychometrics, the Q-diffusion model and the D-diffusion model (van der Maas, Molenaar, Maris, Kievit, & Borsboom, Psychol. Rev., 118, 2011, 339; Molenaar, Tuerlinckx, & van der Maas, J. Stat. Softw., 66, 2015, 1). We also compare the performance of the test to alternative tests of model fit, namely the M2 test (Molenaar et al., J. Stat. Softw., 66, 2015, 1), the moment test (Ranger et al., Br. J. Math. Stat. Psychol., 2016) and the test for binned time (Ranger & Kuhn, Psychol. Test. Assess., 56, 2014b, 370). The simulation study indicates that the Hausman test is superior to the latter tests. The test closely adheres to the nominal Type I error rate and has higher power in most simulation conditions. © 2017 The British Psychological Society.

  12. [Application of association rule in mental health test for employees in a petrochemical enterprise].

    Science.gov (United States)

    Zhang, L F; Zhang, D N; Wang, Z P

    2017-10-20

    Objective: To investigate the occurrence rule of common psychological abnormalities in petrochemical workers using association rules. Methods: From July to September 2014, the Symptom Checklist-90 (SCL-90) was used for a general survey of mental health among all employees in a petrochemical enterprise. The association rule Apriori algorithm was used to analyze the SCL-90 data and investigate the occurrence rule of psychological abnormalities in petrochemical workers of different sexes, ages, and nationalities. Results: A total of 8 248 usable questionnaires were collected. The SCL-90 analysis showed that 1 623 petrochemical workers (19.68%) had positive results, among whom 567 (34.94%) had one positive factor and 1 056 (65.06%) had two or more positive factors. A total of 7 strong association rules were identified, and all of them included obsessive-compulsive symptoms and depression. Male ({obsessive-compulsive symptom, anxiety} => {depression}) and female workers ({somatization, depression} => {obsessive-compulsive symptom}) had their own special association rules. Workers aged 35-44 years had 17 special association rules, and ethnic minorities had 5 special association rules. Conclusion: Employees in the petrochemical enterprise have multiple positive factors in SCL-90, and employees aged 35-44 years and ethnic minorities have a rich combination of psychological symptoms and need special attention during mental health intervention.
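
    A brute-force sketch of the support/confidence rule mining that Apriori performs on such symptom data; real Apriori adds level-wise candidate pruning, which is omitted here for brevity, and the symptom names are illustrative:

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Count all itemsets up to size 3 across the transactions and keep
    those whose support (fraction of transactions containing them)
    meets min_support."""
    n = len(transactions)
    counts = {}
    for t in transactions:
        items = sorted(t)  # canonical order so identical sets share a key
        for size in (1, 2, 3):
            for combo in combinations(items, size):
                counts[combo] = counts.get(combo, 0) + 1
    return {s: c / n for s, c in counts.items() if c / n >= min_support}

def rules_from(freq, min_confidence):
    """Emit X -> Y rules with confidence = support(X ∪ Y) / support(X)."""
    out = []
    for itemset, supp in freq.items():
        if len(itemset) < 2:
            continue
        for i in range(1, len(itemset)):
            for lhs in combinations(itemset, i):
                if lhs in freq and supp / freq[lhs] >= min_confidence:
                    rhs = tuple(x for x in itemset if x not in lhs)
                    out.append((lhs, rhs))
    return out
```

On transactions built from each worker's set of positive SCL-90 factors, the surviving rules are exactly the "strong association rules" the study reports.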

  13. Criterial noise effects on rule-based category learning: the impact of delayed feedback.

    Science.gov (United States)

    Ell, Shawn W; Ing, A David; Maddox, W Todd

    2009-08-01

    Variability in the representation of the decision criterion is assumed in many category-learning models, yet few studies have directly examined its impact. On each trial, criterial noise should result in drift in the criterion and will negatively impact categorization accuracy, particularly in rule-based categorization tasks, where learning depends on the maintenance and manipulation of decision criteria. In three experiments, we tested this hypothesis and examined the impact of working memory on slowing the drift rate. In Experiment 1, we examined the effect of drift by inserting a 5-sec delay between the categorization response and the delivery of corrective feedback, and working memory demand was manipulated by varying the number of decision criteria to be learned. Delayed feedback adversely affected performance, but only when working memory demand was high. In Experiment 2, we built on a classic finding in the absolute identification literature and demonstrated that distributing the criteria across multiple dimensions decreases the impact of drift during the delay. In Experiment 3, we confirmed that the effect of drift during the delay is moderated by working memory. These results provide important insights into the interplay between criterial noise and working memory, as well as providing important constraints for models of rule-based category learning.

  14. Top management turnover and organizational performance: A test of a contingency model

    OpenAIRE

    Boyne, George A.; James, Oliver; John, Peter; Petrovsky, Nicolai

    2011-01-01

    A crucial test of whether "management matters" is whether changes in the team at the top of an organization make a difference. Focusing on turnover in the collective senior team rather than successions of individual chief executives, this article argues that the impact of leadership succession is contingent upon prior organizational performance. The evidence on English local government shows that changes in the top management team lead to improvements when initial performance is bad, but resu...

  15. Performance test results of mock-up test facility of HTTR hydrogen production system

    International Nuclear Information System (INIS)

    Ohashi, Hirofumi; Inaba, Yoshitomo; Nishihara, Tetsuo

    2004-01-01

    To demonstrate the effectiveness of high-temperature nuclear heat utilization, the Japan Atomic Energy Research Institute has been developing a hydrogen production system and has planned to connect it to the High Temperature Engineering Test Reactor (HTTR). Prior to construction of the HTTR hydrogen production system, a mock-up test facility was constructed to investigate the transient behavior of the hydrogen production system and to establish system controllability. The mock-up test facility, with a full-scale reaction tube, is an approximately 1/30-scale model of the HTTR hydrogen production system, and an electric heater is used as a heat source instead of a reactor. After its construction, a performance test of the facility was carried out under the same pressure and temperature conditions as those of the HTTR hydrogen production system to investigate its performance, such as hydrogen production capacity and controllability. It was confirmed that hydrogen was produced stably with hot helium gas at about 120 m³/h, which satisfies the design value, and that the thermal disturbance of the helium gas during start-up could be mitigated to within the design value by using a steam generator. Mock-up testing of the HTTR hydrogen production system using this facility will continue until 2004. (author)

  16. Modeling within-word and cross-word pronunciation variation to improve the performance of a Dutch CSR

    OpenAIRE

    Kessens, J.M.; Wester, M.; Strik, H.

    1999-01-01

    This paper describes how the performance of a continuous speech recognizer for Dutch has been improved by modeling within-word and cross-word pronunciation variation. Within-word variants were automatically generated by applying five phonological rules to the words in the lexicon. For the within-word method, a significant improvement is found compared to the baseline. Cross-word pronunciation variation was modeled using two different methods: 1) adding cross-word variants directly to the lexi...

  17. Utilidade de regras booleanas aplicadas à liberação de resultados de exames hormonais e marcadores tumorais Usefulness of Boolean rules applied on the release of hormonal and tumor markers tests results

    Directory of Open Access Journals (Sweden)

    Murilo Rezende Melo

    2006-08-01

    review of results for adequate laboratory section test release. Analysis of these results using Boolean rules is an interesting alternative to reduce the number of results that require manual review. MATERIAL AND METHOD: We evaluated the use of Boolean rules using the Instrument Manager software and an Architect analyzer, mainly performing sex and thyroid hormone measurements. The intervention was evaluated on: a) the number of rules and their ease of construction; b) a blind comparison, over 940 consecutive tests, of results evaluation by a clinical pathologist (printed results) and by the set of rules. RESULTS: Rule creation was a complex and arduous task, especially due to hormonal profiles with several different request patterns. It was necessary to use a set of 153 Boolean (if…then) rules, in a specific order. This set of rules agreed with expert opinion in 97.9% (920 tests). The rules held 25 tests (2.7%) and the clinical pathologist only nine tests. There was discordance in 20 cases; the rules failed to hold only two cases: a beta-hCG in a male patient (which prompted the creation of a new rule) and a complete thyroid profile lacking only the TSH request (the pathologist opted to review the original request). CONCLUSION: Creation of an efficient set of Boolean rules proved to be a complex task requiring both technical and logical knowledge, but allowed optimization of the laboratory workload. We achieved excellent concordance between the set of rules and clinical pathologist manual review, in a safe, fast and low-cost system.
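
    An ordered if-then rule set for autoverification can be sketched as data plus a first-match evaluator; the analytes, reference ranges and hold conditions below are illustrative placeholders, not the paper's 153 rules:

```python
# Rules are applied in order; the first whose condition matches decides whether
# the result is held for manual review or auto-released. Values are invented.
RULES = [
    ("hold", lambda r: r["test"] == "beta_hCG" and r["sex"] == "M" and r["value"] > 2),
    ("hold", lambda r: r["test"] == "TSH" and not 0.4 <= r["value"] <= 4.0),
    ("release", lambda r: True),  # default: auto-release anything unflagged
]

def evaluate(result):
    """Apply the ordered rule set; the first matching rule decides."""
    for action, condition in RULES:
        if condition(result):
            return action
```

The ordering matters: specific exception rules must precede the catch-all release rule, mirroring the "specific order" the study emphasizes.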

  18. Ruled-based control of off-grid desalination powered by renewable energies

    Directory of Open Access Journals (Sweden)

    Alvaro Serna

    2015-08-01

    Full Text Available A rule-based control is presented for desalination plants operating under variable, renewable power availability. This control algorithm is based on two sets of rules: first, a list that prioritizes the reverse osmosis (RO) units of the plant is created, based on the current state and the expected water demand; secondly, the available energy is dispatched to these units following this prioritized list. The selected strategy is tested on a specific case study: a reverse osmosis plant designed for the production of desalinated water, powered by wind and wave energy. Simulation results illustrate the correct performance of the plant under this control.
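
    The second rule set above (dispatch the available energy down the prioritized list) can be sketched greedily; the unit names and power ratings are illustrative:

```python
def dispatch(available_kw, units):
    """Greedy dispatch sketch: walk the prioritized RO-unit list and switch on
    each unit whose rated power still fits in the remaining energy budget.
    `units` is a list of (name, rated_kw) already sorted by priority."""
    on, remaining = [], available_kw
    for name, rated_kw in units:
        if rated_kw <= remaining:
            on.append(name)
            remaining -= rated_kw
    return on, remaining
```

In the full scheme the priority list itself is recomputed from plant state and expected water demand before each dispatch step.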

  19. An Efficient Inductive Genetic Learning Algorithm for Fuzzy Relational Rules

    Directory of Open Access Journals (Sweden)

    Antonio

    2012-04-01

    Full Text Available Fuzzy modelling research has traditionally focused on certain types of fuzzy rules. However, the use of alternative rule models could improve the ability of fuzzy systems to represent a specific problem. In this proposal, an extended fuzzy rule model that can include relations between variables in the antecedent of rules is presented. Furthermore, a learning algorithm based on the iterative genetic approach, able to represent knowledge using this model, is proposed as well. On the other hand, potential relations among the initial variables imply exponential growth in the feasible rule search space. Consequently, two filters for detecting relevant potential relations are added to the learning algorithm. These filters decrease the search-space complexity and increase the algorithm's efficiency. Finally, we also present an experimental study to demonstrate the benefits of using fuzzy relational rules.

  20. Modeling of a Parabolic Trough Solar Field for Acceptance Testing: A Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, M. J.; Mehos, M. S.; Kearney, D. W.; McMahan, A. C.

    2011-01-01

    As deployment of parabolic trough concentrating solar power (CSP) systems ramps up, the need for reliable and robust performance acceptance test guidelines for the solar field is also amplified. Project owners and/or EPC contractors often require extensive solar field performance testing as part of the plant commissioning process in order to ensure that actual solar field performance satisfies both technical specifications and performance guarantees between the involved parties. Performance test code work is currently underway at the National Renewable Energy Laboratory (NREL) in collaboration with the SolarPACES Task-I activity, and within the ASME PTC-52 committee. One important aspect of acceptance testing is the selection of a robust technology performance model. NREL has developed a detailed parabolic trough performance model within the SAM software tool. This model is capable of predicting solar field, sub-system, and component performance. It has further been modified for this work to support calculation at sub-hourly time steps. This paper presents the methodology and results of a case study comparing actual performance data for a parabolic trough solar field to the predicted results using the modified SAM trough model. Due to data limitations, the methodology is applied to a single collector loop, though it applies equally to larger subfields and entire solar fields. Special consideration is given to the model formulation, improvements to the model formulation based on comparison with the collected data, and uncertainty associated with the measured data. Additionally, this paper identifies modeling considerations that are of particular importance in the solar field acceptance testing process and uses the model to provide preliminary recommendations regarding acceptable steady-state testing conditions at the single-loop level.

  1. 40 CFR 63.344 - Performance test requirements and test methods.

    Science.gov (United States)

    2010-07-01

    ... electroplating tanks or chromium anodizing tanks. The sampling time and sample volume for each run of Methods 306... Chromium Anodizing Tanks § 63.344 Performance test requirements and test methods. (a) Performance test... Emissions From Decorative and Hard Chromium Electroplating and Anodizing Operations,” appendix A of this...

  2. Development of a smart key performance indicator for in-situ load tests

    NARCIS (Netherlands)

    Dieteren, G.; Bigaj-van Vliet, A.J.; Yang, Y.; Sangers, A.

    2017-01-01

    In-situ load testing of reinforced concrete (RC) structures is often performed to confirm the presence of the required resistance for the intended use (Conformity Load Testing) or to support the assessments of the residual capacity by models (Supplementary Load Testing for Condition Assessment).

  3. A new in silico classification model for ready biodegradability, based on molecular fragments.

    Science.gov (United States)

    Lombardo, Anna; Pizzo, Fabiola; Benfenati, Emilio; Manganaro, Alberto; Ferrari, Thomas; Gini, Giuseppina

    2014-08-01

Regulations such as the European REACH (Registration, Evaluation, Authorization and restriction of Chemicals) often require chemicals to be evaluated for ready biodegradability, to assess the potential risk for environmental and human health. Because not all chemicals can be tested, there is an increasing demand for tools for quick and inexpensive biodegradability screening, such as computer-based (in silico) theoretical models. We developed an in silico model starting from a dataset of 728 chemicals with ready biodegradability data (MITI test, Ministry of International Trade and Industry). We used the novel software SARpy to automatically extract, through a structural fragmentation process, a set of substructures statistically related to ready biodegradability. Then, we analysed these substructures in order to build some general rules. The model consists of a rule set made up of the combination of the statistically relevant fragments and of the expert-based rules. The model gives good statistical performance, with 92%, 82% and 76% accuracy on the training, test and external sets, respectively. These results are comparable with other in silico models such as BIOWIN, developed by the United States Environmental Protection Agency (EPA); moreover, this new model includes an easily understandable explanation. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. A test of inflated zeros for Poisson regression models.

    Science.gov (United States)

    He, Hua; Zhang, Hui; Ye, Peng; Tang, Wan

    2017-01-01

Excessive zeros are common in practice and may cause overdispersion and invalidate inference when fitting Poisson regression models. There is a large body of literature on zero-inflated Poisson models. However, methods for testing whether there are excessive zeros are less well developed. The Vuong test comparing a Poisson and a zero-inflated Poisson model is commonly applied in practice. However, the type I error of the test often deviates seriously from the nominal level, casting serious doubt on the validity of the test in such applications. In this paper, we develop a new approach for testing inflated zeros under the Poisson model. Unlike the Vuong test for inflated zeros, our method does not require a zero-inflated Poisson model to perform the test. Simulation studies show that, compared with the Vuong test, our approach not only better controls the type I error rate but also yields more power.
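The idea of testing for excess zeros without fitting a zero-inflated model can be illustrated with a simple score-style check: compare the observed zero count against the count expected under the fitted Poisson. The sketch below is only illustrative, not the paper's exact test statistic; the normal approximation and the binomial variance (which ignores the estimation error in the rate) are simplifying assumptions.

```python
import numpy as np
from math import erfc, sqrt

def zero_inflation_test(y):
    """Compare the observed zero count with the count expected under a
    fitted Poisson. Illustrative sketch only: the binomial variance
    ignores the uncertainty in the estimated rate."""
    y = np.asarray(y)
    n = len(y)
    lam = y.mean()                       # Poisson MLE of the rate
    p0 = np.exp(-lam)                    # P(Y = 0) under Poisson(lam)
    observed = np.sum(y == 0)
    expected = n * p0
    z = (observed - expected) / np.sqrt(n * p0 * (1 - p0))
    pval = 0.5 * erfc(z / sqrt(2))       # one-sided: too many zeros
    return z, pval

rng = np.random.default_rng(0)
# Zero-inflated sample: ~30% structural zeros mixed into Poisson(2) counts
y = rng.poisson(2, 1000) * (rng.random(1000) > 0.3)
z, p = zero_inflation_test(y)            # z is large, p is tiny
```

On genuinely Poisson data the statistic fluctuates around zero, while structural zeros push it sharply positive.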

  5. Multi-scale inference of interaction rules in animal groups using Bayesian model selection.

    Directory of Open Access Journals (Sweden)

    Richard P Mann

Full Text Available Inference of interaction rules of animals moving in groups usually relies on an analysis of large scale system behaviour. Models are tuned through repeated simulation until they match the observed behaviour. More recent work has used the fine scale motions of animals to validate and fit the rules of interaction of animals in groups. Here, we use a Bayesian methodology to compare a variety of models to the collective motion of glass prawns (Paratya australiensis). We show that these exhibit a stereotypical 'phase transition', whereby an increase in density leads to the onset of collective motion in one direction. We fit models to this data, which range from: a mean-field model where all prawns interact globally; to a spatial Markovian model where prawns are self-propelled particles influenced only by the current positions and directions of their neighbours; up to non-Markovian models where prawns have 'memory' of previous interactions, integrating their experiences over time when deciding to change behaviour. We show that the mean-field model fits the large scale behaviour of the system, but does not capture the observed locality of interactions. Traditional self-propelled particle models fail to capture the fine scale dynamics of the system. The most sophisticated model, the non-Markovian model, provides a good match to the data at both the fine scale and in terms of reproducing global dynamics, while maintaining a biologically plausible perceptual range. We conclude that prawns' movements are influenced by not just the current direction of nearby conspecifics, but also those encountered in the recent past. Given the simplicity of prawns as a study system, our research suggests that self-propelled particle models of collective motion should, if they are to be realistic at multiple biological scales, include memory of previous interactions and other non-Markovian effects.

  6. Decision models for use with criterion-referenced tests

    NARCIS (Netherlands)

    van der Linden, Willem J.

    1980-01-01

The problem of mastery decisions and optimizing cutoff scores on criterion-referenced tests is considered. This problem can be formalized as an (empirical) Bayes problem with decision rules of a monotone shape. Next, the derivation of optimal cutoff scores for threshold, linear, and normal ogive

  7. Functional Testing Protocols for Commercial Building Efficiency Baseline Modeling Software

    Energy Technology Data Exchange (ETDEWEB)

    Jump, David; Price, Phillip N.; Granderson, Jessica; Sohn, Michael

    2013-09-06

This document describes procedures for testing and validating proprietary baseline energy modeling software accuracy in predicting energy use over the period of interest, such as a month or a year. The procedures are designed according to the methodology used for public domain baselining software in another LBNL report that was (like the present report) prepared for Pacific Gas and Electric Company: "Commercial Building Energy Baseline Modeling Software: Performance Metrics and Method Testing with Open Source Models and Implications for Proprietary Software Testing Protocols" (referred to here as the "Model Analysis Report"). The test procedure focuses on the quality of the software's predictions rather than on the specific algorithms used to predict energy use. In this way the software vendor is not required to divulge or share proprietary information about how their software works, while enabling stakeholders to assess its performance.
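Prediction quality in this kind of baseline-model testing is commonly summarized with the normalized mean bias error (NMBE) and the coefficient of variation of the RMSE (CV-RMSE), as in ASHRAE Guideline 14. A minimal sketch of those two metrics follows; the report's exact metrics and acceptance thresholds are not given in this abstract, so these are assumptions.

```python
import numpy as np

def cv_rmse(actual, predicted):
    """Coefficient of variation of the RMSE: RMSE normalized by the
    mean of the measured data (lower is better)."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    return rmse / actual.mean()

def nmbe(actual, predicted):
    """Normalized mean bias error: detects systematic over- or
    under-prediction (ideally near zero)."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return np.sum(predicted - actual) / (len(actual) * actual.mean())

# Toy monthly energy-use series (hypothetical values)
actual = [10.0, 10.0, 10.0, 10.0]
predicted = [11.0, 9.0, 11.0, 9.0]
cv = cv_rmse(actual, predicted)   # 0.10: 10% scatter
nb = nmbe(actual, predicted)      # 0.0: errors cancel, no bias
```

A model can show zero bias yet large scatter, which is why both metrics are usually reported together.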

  8. Effectiveness of emergency nurses' use of the Ottawa Ankle Rules to initiate radiographic tests on improving healthcare outcomes for patients with ankle injuries: A systematic review.

    Science.gov (United States)

    Ho, Jonathan Ka-Ming; Chau, Janita Pak-Chun; Cheung, Nancy Man-Ching

    2016-11-01

The Ottawa Ankle Rules provide guidelines for clinicians on the recommendation of radiographic tests to verify fractures in patients with ankle injuries. The use of the Ottawa Ankle Rules by emergency nurses has been suggested to minimise unnecessary radiographic-test requests and reduce patients' length of stay in emergency departments. However, the findings of studies in this area are inconsistent. A systematic review was conducted to synthesise the most accurate evidence available on the extent to which emergency nurses' use of the Ottawa Ankle Rules to initiate radiographic tests improves healthcare outcomes for patients with ankle injuries. The systematic review attempted to identify all relevant published and unpublished studies in English and Chinese from databases such as Ovid MEDLINE, EMBASE, ProQuest Health and Medical Complete, EBM Reviews, SPORTDiscus, CINAHL Plus, the British Nursing Index, Scopus, the Chinese Biomedical Literature Database, China Journal Net, WanFang Data, the National Central Library Periodical Literature System, HyRead, the Digital Dissertation Consortium, MedNar and Google Scholar. Two reviewers independently assessed the eligibility of all of the studies identified during the search, based on their titles and abstracts. If a study met the criteria for inclusion, or inconclusive information was available in its title and abstract, the full text was retrieved for further analysis. The methodological quality of all of the eligible studies was assessed independently by the two reviewers. The search of databases and other sources yielded 1603 records. The eligibility of 17 full-text articles was assessed, and nine studies met the inclusion criteria. All nine studies were subjected to narrative analysis, and five were meta-analysed. All of the studies investigated the use of the refined Ottawa Ankle Rules. The results indicated that emergency nurses' use of the refined Ottawa Ankle Rules minimised unnecessary radiographic-test requests

  9. Stereotype Threat, Test Anxiety, and Mathematics Performance

    Science.gov (United States)

    Tempel, Tobias; Neumann, Roland

    2014-01-01

    We investigated the combined effects of stereotype threat and trait test anxiety on mathematics test performance. Stereotype threat and test anxiety interacted with each other in affecting performance. Trait test anxiety predicted performance only in a diagnostic condition that prevented stereotype threat by stereotype denial. A state measure of…

  10. Matrix diffusion model. In situ tests using natural analogues

    International Nuclear Information System (INIS)

    Rasilainen, K.

    1997-11-01

    Matrix diffusion is an important retarding and dispersing mechanism for substances carried by groundwater in fractured bedrock. Natural analogues provide, unlike laboratory or field experiments, a possibility to test the model of matrix diffusion in situ over long periods of time. This thesis documents quantitative model tests against in situ observations, done to support modelling of matrix diffusion in performance assessments of nuclear waste repositories

  11. Systemic reaction after performing a food prick-to-prick test. A case report

    Directory of Open Access Journals (Sweden)

    Karen Estefanía Hernández-Moreno

    2017-02-01

Full Text Available Background: The skin prick test is the most widely used test for the diagnosis of IgE-mediated conditions. Commercial extracts are used for its performance, but in the case of fruits and vegetables it is preferable to use fresh food. Although both tests possess a good safety profile, hypersensitivity reactions have been recorded. Clinical case: A forty-seven-year-old woman with a history of persistent allergic rhinitis, sensitized to the pollen of grasses, olive and salsola, was referred to an allergology department due to anaphylaxis triggered by the consumption of avocado, cantaloupe, carrots and watermelon. Minutes after skin prick testing with standardized extract and prick-to-prick testing with fresh foods, she developed dyspnea, pruritus, erythema, dizziness and wheezing; she was administered 0.5 mg of intramuscular adrenaline and 4 salbutamol inhalations and placed in the Trendelenburg position. Dyspnea persisted, and vital signs monitoring showed increased heart and respiratory rates; hence, salbutamol was administered again, together with 2 L/min of oxygen delivered by nasal cannula, intravenous fluids and 100 mg of intravenous hydrocortisone; improvement was observed at 40 minutes. The patient was hospitalized for 48 hours. Conclusions: Although skin tests are safe, the risk of hypersensitivity and anaphylactic reactions should not be ruled out, especially in susceptible patients.

  12. Sepsis and meningitis in hospitalized children: performance of clinical signs and their prediction rules in a case-control study.

    Science.gov (United States)

    Verbakel, Jan Y; MacFaul, Roderick; Aertgeerts, Bert; Buntinx, Frank; Thompson, Matthew

    2014-06-01

    Feverish illness is a common presentation to acute pediatric services. Clinical staff faces the challenge of differentiating the few children with meningitis or sepsis from the majority with self-limiting illness. We aimed to determine the diagnostic value of clinical features and their prediction rules (CPR) for identifying children with sepsis or meningitis among those children admitted to a District General Hospital with acute febrile illness. Acutely ill children admitted to a District General Hospital in England were included in this case-control study between 2000 and 2005. We examined the diagnostic accuracy of individual clinical signs and 6 CPRs, including the National Institute for Clinical Excellence "traffic light" system, to determine clinical utility in identifying children with a diagnosis of sepsis or meningitis. Loss of consciousness, prolonged capillary refill, decreased alertness, respiratory effort, and the physician's illness assessment had high positive likelihood ratios (9-114), although with wide confidence intervals, to rule in sepsis or meningitis. The National Institute for Clinical Excellence traffic light system, the modified Yale Observation Scale, and the Pediatric Advanced Warning Score performed poorly with positive likelihood ratios ranging from 1 to 3. The pediatrician's overall illness assessment was the most useful feature to rule in sepsis or meningitis in these hospitalized children. Clinical prediction rules did not effectively rule in sepsis or meningitis. The modified Yale Observation Scale should be used with caution. Single clinical signs could complement these scores to rule in sepsis or meningitis. Further research is needed to validate these CPRs.
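The positive likelihood ratios quoted above combine a sign's sensitivity and specificity; applied to the pre-test odds, they give the post-test odds of disease. A minimal sketch of that arithmetic follows, with hypothetical numbers rather than values from the study.

```python
def positive_lr(sensitivity, specificity):
    """LR+ = sensitivity / (1 - specificity): how strongly a positive
    finding raises the odds of disease."""
    return sensitivity / (1 - specificity)

def post_test_prob(pretest_prob, lr):
    """Convert a pre-test probability to a post-test probability by
    passing through the odds scale."""
    odds = pretest_prob / (1 - pretest_prob) * lr
    return odds / (1 + odds)

# Hypothetical clinical sign: 90% sensitive, 99% specific
lr_plus = positive_lr(0.90, 0.99)          # LR+ of 90
# Hypothetical 1% pre-test probability of sepsis/meningitis
p_post = post_test_prob(0.01, lr_plus)
```

A likelihood ratio near 1, like those found here for several scoring systems, leaves the post-test probability essentially unchanged, which is why such rules fail to rule the diagnosis in.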

  13. A system for nonmonotonic rules on the web

    NARCIS (Netherlands)

    Antoniou, G.; Bikakis, A.; Wagner, G.R.

    2004-01-01

    Defeasible reasoning is a rule-based approach for efficient reasoning with incomplete and inconsistent information. Such reasoning is, among others, useful for ontology integration, where conflicting information arises naturally; and for the modeling of business rules and policies, where rules with

  14. A Model on the Contribution of School Assets to the Achievement of Adolescents' Well-Being and Academic Performance.

    Science.gov (United States)

    Pertegal, Miguel-Ángel; Oliva, Alfredo

    2017-10-10

The aim of this study was to examine a model of the contribution of school assets to the development of adolescents' well-being and school success. The sample comprised 1944 adolescents (893 girls and 1051 boys) aged between 12 and 17 years (M = 14.4; SD = 1.13) from secondary schools in Western Andalusia, who completed several self-report questionnaires. The results of structural equation modeling showed the goodness of fit of the initial theoretical model. This model confirmed the importance of school connectedness as a key factor in the relationships between other school assets (social climate; clarity of the rules and values; and positive opportunities and empowerment) and commitment to learning, academic performance and life satisfaction. However, the re-specification of the initial model considered two complementary paths with theoretical sense: first, a direct influence of clarity of the rules and values on commitment to learning, and second, of academic performance on life satisfaction. This model obtained better goodness-of-fit indices than the first one: χ2 = 16.32; df = 8; p = .038; χ2/df = 2.04; SRMR = .018; RMSEA = .023 (95% C.I. = .005, .040); NNFI = .98; CFI = .99. Our findings highlight the need to invest in initiatives that strengthen adolescents' links with their school as a key goal for promoting both good academic performance and greater life satisfaction.

  15. Assessing predation risk: optimal behaviour and rules of thumb.

    Science.gov (United States)

    Welton, Nicky J; McNamara, John M; Houston, Alasdair I

    2003-12-01

    We look at a simple model in which an animal makes behavioural decisions over time in an environment in which all parameters are known to the animal except predation risk. In the model there is a trade-off between gaining information about predation risk and anti-predator behaviour. All predator attacks lead to death for the prey, so that the prey learns about predation risk by virtue of the fact that it is still alive. We show that it is not usually optimal to behave as if the current unbiased estimate of the predation risk is its true value. We consider two different ways to model reproduction; in the first scenario the animal reproduces throughout its life until it dies, and in the second scenario expected reproductive success depends on the level of energy reserves the animal has gained by some point in time. For both of these scenarios we find results on the form of the optimal strategy and give numerical examples which compare optimal behaviour with behaviour under simple rules of thumb. The numerical examples suggest that the value of the optimal strategy over the rules of thumb is greatest when there is little current information about predation risk, learning is not too costly in terms of predation, and it is energetically advantageous to learn about predation. We find that for the model and parameters investigated, a very simple rule of thumb such as 'use the best constant control' performs well.
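The survival-based learning in this model has a simple conjugate illustration: if attacks arrive as a Poisson process with unknown rate mu, surviving for time t with no attack contributes a likelihood factor exp(-mu * t), so a Gamma prior on mu updates in closed form. The sketch below uses assumed prior parameters and is not the paper's model.

```python
# Gamma(alpha, beta) prior on the attack rate mu. Survival for time t
# without attack multiplies the likelihood by exp(-mu * t), giving a
# Gamma(alpha, beta + t) posterior: estimated risk falls the longer
# the prey stays alive. Prior parameters here are assumptions.
alpha, beta = 1.0, 1.0                         # prior mean attack rate = 1
prior_mean = alpha / beta

t_survived = 5.0                               # hypothetical survival time
posterior_mean = alpha / (beta + t_survived)   # 1/6, well below the prior
```

This is the information/anti-predator trade-off in miniature: time spent alive is itself evidence that the predation rate is low, which is what a rule of thumb such as 'use the best constant control' forgoes.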

  16. Comparison of Natural Language Processing Rules-based and Machine-learning Systems to Identify Lumbar Spine Imaging Findings Related to Low Back Pain.

    Science.gov (United States)

    Tan, W Katherine; Hassanpour, Saeed; Heagerty, Patrick J; Rundell, Sean D; Suri, Pradeep; Huhdanpaa, Hannu T; James, Kathryn; Carrell, David S; Langlotz, Curtis P; Organ, Nancy L; Meier, Eric N; Sherman, Karen J; Kallmes, David F; Luetmer, Patrick H; Griffith, Brent; Nerenz, David R; Jarvik, Jeffrey G

    2018-03-28

To evaluate a natural language processing (NLP) system built with open-source tools for identification of lumbar spine imaging findings related to low back pain on magnetic resonance and x-ray radiology reports from four health systems. We used a limited data set (de-identified except for dates) sampled from lumbar spine imaging reports of a prospectively assembled cohort of adults. From N = 178,333 reports, we randomly selected N = 871 to form a reference-standard dataset, consisting of N = 413 x-ray reports and N = 458 MR reports. Using standardized criteria, four spine experts annotated the presence of 26 findings, where 71 reports were annotated by all four experts and 800 were each annotated by two experts. We calculated inter-rater agreement and finding prevalence from annotated data. We randomly split the annotated data into development (80%) and testing (20%) sets. We developed an NLP system from both rule-based and machine-learned models. We validated the system using accuracy metrics such as sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). The multirater annotated dataset achieved inter-rater agreement of Cohen's kappa > 0.60 (substantial agreement) for 25 of 26 findings, with finding prevalence ranging from 3% to 89%. In the testing sample, rule-based and machine-learned predictions both had comparable average specificity (0.97 and 0.95, respectively). The machine-learned approach had a higher average sensitivity (0.94, compared to 0.83 for rule-based), and a higher overall AUC (0.98, compared to 0.90 for rule-based). Our NLP system performed well in identifying the 26 lumbar spine findings, as benchmarked by reference-standard annotation by medical experts. Machine-learned models provided substantial gains in model sensitivity with slight loss of specificity, and overall higher AUC. Copyright © 2018 The Association of University Radiologists. All rights reserved.
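The AUC reported for the NLP system has a useful probabilistic reading: it is the probability that a randomly chosen positive report receives a higher model score than a randomly chosen negative one. A minimal sketch of that rank-based computation follows (illustrative only, not the study's evaluation code).

```python
def auc(scores_pos, scores_neg):
    """Rank-based AUC: fraction of (positive, negative) score pairs
    ranked correctly, counting ties as half a win."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Toy scores: one positive report is out-ranked by one negative report
a = auc([0.9, 0.4], [0.5, 0.3])   # 3 of 4 pairs correct -> 0.75
```

Unlike sensitivity and specificity, the AUC does not depend on choosing a decision threshold, which is why it is often reported alongside them.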

  17. Rules in School. Strategies for Teachers Series.

    Science.gov (United States)

    Brady, Kathryn; Forton, Mary Beth; Porter, Deborah; Wood, Chip

    This book offers an approach for helping K-8 students become invested in creating and living by classroom rules. It provides techniques for: helping students articulate their hopes and dreams for school; involving students in generating classroom rules that grow out of their hopes and dreams; modeling, practicing, and role playing the rules; using…

  18. Exploratory analysis of methods for automated classification of laboratory test orders into syndromic groups in veterinary medicine.

    Directory of Open Access Journals (Sweden)

    Fernanda C Dórea

Full Text Available BACKGROUND: Recent focus on earlier detection of pathogen introduction in human and animal populations has led to the development of surveillance systems based on automated monitoring of health data. Real- or near real-time monitoring of pre-diagnostic data requires automated classification of records into syndromes--syndromic surveillance--using algorithms that incorporate medical knowledge in a reliable and efficient way, while remaining comprehensible to end users. METHODS: This paper describes the application of two machine learning methods (Naïve Bayes and Decision Trees) and rule-based methods to extract syndromic information from laboratory test requests submitted to a veterinary diagnostic laboratory. RESULTS: High performance (F1-macro = 0.9995) was achieved through the use of a rule-based syndrome classifier, based on rule induction followed by manual modification during the construction phase, which also resulted in clear interpretability of the resulting classification process. An unmodified rule induction algorithm achieved an F1-micro score of 0.979, though this fell to 0.677 when performance for individual classes was averaged in an unweighted manner (F1-macro), due to the fact that the algorithm failed to learn 3 of the 16 classes from the training set. Decision Trees showed equal interpretability to the rule-based approaches, but achieved an F1-micro score of 0.923 (falling to 0.311 when classes are given equal weight). A Naïve Bayes classifier learned all classes and achieved high performance (F1-micro = 0.994 and F1-macro = 0.955); however, the classification process is not transparent to the domain experts. CONCLUSION: The use of a manually customised rule set allowed for the development of a system for classification of laboratory tests into syndromic groups with very high performance, and high interpretability by the domain experts. Further research is required to develop internal validation rules in order to establish
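The gap between F1-micro and F1-macro reported above arises exactly when a classifier ignores rare classes: micro-averaging weights every record equally, while macro-averaging weights every class equally. A minimal sketch of the two averages on toy data follows (illustrative only, not the study's code).

```python
def f1_scores(y_true, y_pred, classes):
    """Per-class F1 plus the macro (unweighted class mean) and micro
    averages; for single-label data, micro-F1 equals accuracy."""
    per_class = {}
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        per_class[c] = 2 * tp / (2 * tp + fp + fn) if (tp + fp + fn) else 0.0
    macro = sum(per_class.values()) / len(classes)
    micro = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    return per_class, macro, micro

# Toy data: the classifier never learns the rare class "C"
y_true = ["A"] * 9 + ["B"] * 9 + ["C"] * 2
y_pred = ["A"] * 9 + ["B"] * 9 + ["A"] * 2
per_class, macro, micro = f1_scores(y_true, y_pred, ["A", "B", "C"])
# micro = 0.90, but macro ~ 0.63: the missed class drags the macro down
```

This mirrors the rule-induction result above, where failing to learn 3 of 16 classes cut the F1-macro to 0.677 while the F1-micro stayed at 0.979.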

  19. A comparison between model and rule based control of a periodic activated sludge process

    DEFF Research Database (Denmark)

    Isaacs, Steven Howard; Thornberg, D.

    1997-01-01

    Two strategies for control of nitrogen removal in an alternating activated sludge plant are compared. One is based on simple model predictions determining the cycle length at the beginning of each cycle. The other is based on simple rules relating present ammonia and nitrate concentrations. Both ...

  20. Feynman rules for the Standard Model Effective Field Theory in R ξ -gauges

    Science.gov (United States)

    Dedes, A.; Materkowska, W.; Paraskevas, M.; Rosiek, J.; Suxho, K.

    2017-06-01

    We assume that New Physics effects are parametrized within the Standard Model Effective Field Theory (SMEFT) written in a complete basis of gauge invariant operators up to dimension 6, commonly referred to as "Warsaw basis". We discuss all steps necessary to obtain a consistent transition to the spontaneously broken theory and several other important aspects, including the BRST-invariance of the SMEFT action for linear R ξ -gauges. The final theory is expressed in a basis characterized by SM-like propagators for all physical and unphysical fields. The effect of the non-renormalizable operators appears explicitly in triple or higher multiplicity vertices. In this mass basis we derive the complete set of Feynman rules, without resorting to any simplifying assumptions such as baryon-, lepton-number or CP conservation. As it turns out, for most SMEFT vertices the expressions are reasonably short, with a noticeable exception of those involving 4, 5 and 6 gluons. We have also supplemented our set of Feynman rules, given in an appendix here, with a publicly available Mathematica code working with the FeynRules package and producing output which can be integrated with other symbolic algebra or numerical codes for automatic SMEFT amplitude calculations.

  1. Design, modeling and testing of data converters

    CERN Document Server

    Kiaei, Sayfe; Xu, Fang

    2014-01-01

This book presents a scientific discussion of state-of-the-art techniques and designs for the modeling, testing and performance analysis of data converters. The focus is put on sustainable data conversion. Sustainability has become a public issue that industries and users cannot ignore. Devising environmentally friendly solutions for data converter design, modeling and testing is nowadays a requirement that researchers and practitioners must consider in their activities. This book presents the outcome of the IWADC workshop 2011, held in Orvieto, Italy.

  2. A Designer’s Guide to Human Performance Modelling (La Modelisation des Performances Humaines: Manuel du Concepteur).

    Science.gov (United States)

    1998-12-01

...into the Systems Engineering Process; 5.3 Validation of HPMs; 5.4 Commercialisation of human performance modelling software; 5.5 Model Tool... budget) so that inappropriate models/tools are not offered. The WG agreed that another form of 'educating' designers in the use of models was by means... Commercialisation of human performance modelling software. 5.2.8 Include human performance in system test. More and more, customers are mandating the provision

  3. DKIST enclosure modeling and verification during factory assembly and testing

    Science.gov (United States)

    Larrakoetxea, Ibon; McBride, William; Marshall, Heather K.; Murga, Gaizka

    2014-08-01

    The Daniel K. Inouye Solar Telescope (DKIST, formerly the Advanced Technology Solar Telescope, ATST) is unique as, apart from protecting the telescope and its instrumentation from the weather, it holds the entrance aperture stop and is required to position it with millimeter-level accuracy. The compliance of the Enclosure design with the requirements, as of Final Design Review in January 2012, was supported by mathematical models and other analyses which included structural and mechanical analyses (FEA), control models, ventilation analysis (CFD), thermal models, reliability analysis, etc. During the Enclosure Factory Assembly and Testing the compliance with the requirements has been verified using the real hardware and the models created during the design phase have been revisited. The tests performed during shutter mechanism subsystem (crawler test stand) functional and endurance testing (completed summer 2013) and two comprehensive system-level factory acceptance testing campaigns (FAT#1 in December 2013 and FAT#2 in March 2014) included functional and performance tests on all mechanisms, off-normal mode tests, mechanism wobble tests, creation of the Enclosure pointing map, control system tests, and vibration tests. The comparison of the assumptions used during the design phase with the properties measured during the test campaign provides an interesting reference for future projects.

  4. Multitasking TORT Under UNICOS: Parallel Performance Models and Measurements

    International Nuclear Information System (INIS)

    Azmy, Y.Y.; Barnett, D.A.

    1999-01-01

The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed. A parallel overhead model was also derived for the new algorithm. The results of the comparison of parallel performance models were compared to applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead

  5. Multitasking TORT under UNICOS: Parallel performance models and measurements

    International Nuclear Information System (INIS)

    Barnett, A.; Azmy, Y.Y.

    1999-01-01

    The existing parallel algorithms in the TORT discrete ordinates code were updated to function in a UNICOS environment. A performance model for the parallel overhead was derived for the existing algorithms. The largest contributors to the parallel overhead were identified and a new algorithm was developed. A parallel overhead model was also derived for the new algorithm. The results of the comparison of parallel performance models were compared to applications of the code to two TORT standard test problems and a large production problem. The parallel performance models agree well with the measured parallel overhead

  6. Using fuzzy rule-based knowledge model for optimum plating conditions search

    Science.gov (United States)

    Solovjev, D. S.; Solovjeva, I. A.; Litovka, Yu V.; Arzamastsev, A. A.; Glazkov, V. P.; L’vov, A. A.

    2018-03-01

The paper discusses existing approaches to modeling the plating process with the aim of reducing the unevenness of the coating thickness distribution over the plated surface. However, these approaches do not take into account the experience, knowledge, and intuition of the decision-makers when searching for the optimal conditions of the electroplating technological process. An original approach to the search for optimal conditions for applying electroplated coatings, which uses a rule-based model of knowledge and allows one to reduce the unevenness of the product thickness distribution, is proposed. The block diagrams of a conventional control system of a galvanic process as well as a system based on the production model of knowledge are considered. It is shown that the fuzzy production model of knowledge in the control system makes it possible to obtain galvanic coatings of a given thickness unevenness with a high degree of adequacy to the experimental data. The described experimental results confirm the theoretical conclusions.

  7. Biclustering Learning of Trading Rules.

    Science.gov (United States)

    Huang, Qinghua; Wang, Ting; Tao, Dacheng; Li, Xuelong

    2015-10-01

Technical analysis with numerous indicators and patterns has been regarded as important evidence for making trading decisions in financial markets. However, it is extremely difficult for investors to find useful trading rules based on numerous technical indicators. This paper innovatively proposes the use of biclustering mining to discover effective technical trading patterns that contain a combination of indicators from historical financial data series. This is the first attempt to use a biclustering algorithm on trading data. The mined patterns are regarded as trading rules and can be classified as three trading actions (i.e., the buy, the sell, and no-action signals) with respect to the maximum support. A modified K-nearest-neighbor (K-NN) method is applied to the classification of trading days in the testing period. The proposed method [called biclustering algorithm and K-nearest-neighbor (BIC-K-NN)] was implemented on four historical datasets and the average performance was compared with the conventional buy-and-hold strategy and three previously reported intelligent trading systems. Experimental results demonstrate that the proposed trading system outperforms its counterparts and will be useful for investment in various financial markets.
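The classification step can be illustrated with a plain majority-vote K-NN over indicator vectors. The paper uses a modified K-NN applied to biclustering-derived patterns, so the sketch below is only the generic baseline, and all data in it are hypothetical.

```python
import numpy as np

def knn_signal(train_X, train_y, x, k=3):
    """Label a trading day by majority vote among its k nearest
    neighbours in indicator space (generic K-NN, not BIC-K-NN)."""
    d = np.linalg.norm(np.asarray(train_X) - np.asarray(x), axis=1)
    nearest = np.argsort(d)[:k]
    votes = [train_y[i] for i in nearest]
    return max(set(votes), key=votes.count)

# Toy indicator vectors for past trading days and their signals
train_X = [[0.1, 0.0], [0.0, 0.2], [0.9, 1.0], [1.0, 0.8]]
train_y = ["sell", "sell", "buy", "buy"]
signal = knn_signal(train_X, train_y, [0.95, 0.9], k=3)   # "buy"
```

The current day's indicator vector lands near the two historical "buy" days, so two of its three nearest neighbours vote "buy" and that signal wins the vote.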

  8. Thermal Model Predictions of Advanced Stirling Radioisotope Generator Performance

    Science.gov (United States)

    Wang, Xiao-Yen J.; Fabanich, William Anthony; Schmitz, Paul C.

    2014-01-01

    This paper presents recent thermal model results for the Advanced Stirling Radioisotope Generator (ASRG). The three-dimensional (3D) ASRG thermal power model was built using the Thermal Desktop(trademark) thermal analyzer. The model was correlated with ASRG engineering unit test data and with ASRG flight unit predictions from Lockheed Martin's (LM's) I-deas(trademark) TMG thermal model. The auxiliary cooling system (ACS) of the ASRG is also included in the ASRG thermal model. The ACS is designed to remove waste heat from the ASRG so that it can be used to heat spacecraft components. The performance of the ACS is reported under nominal conditions and during a Venus flyby scenario. The results for the nominal case are validated with data from Lockheed Martin. Transient thermal analysis results for an ASRG Venus flyby with a representative trajectory are also presented. In addition, model results for an ASRG mounted on a Cassini-like spacecraft with a sunshade are presented to show a way to mitigate the high temperatures of a Venus flyby. It was predicted that the sunshade can lower the temperature of the ASRG alternator by 20 degrees C for the representative Venus flyby trajectory. The 3D model was also modified to predict generator performance after a single Advanced Stirling Convertor failure. The geometry of the Microtherm HT insulation block on the outboard side was modified to match the deformation and shrinkage observed during testing of a prototypic ASRG test fixture by LM. Test conditions and test data were used to correlate the model by adjusting the thermal conductivity of the deformed insulation to match the post-heat-dump steady-state temperatures. Results for these conditions showed that the performance of the still-functioning inboard ACS was unaffected.

  9. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    Science.gov (United States)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and of decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater-extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used, including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling, and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
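    Of the methods listed, block bootstrap time-series sampling is straightforward to sketch. This minimal Python example (the flow data are invented) resamples a series by drawing contiguous blocks with replacement, so that short-range serial dependence within each block is preserved:

```python
import random

def block_bootstrap(series, block_len, n_samples, seed=0):
    """Resample a time series by concatenating randomly chosen contiguous
    blocks (drawn with replacement) until the original length is reached."""
    rng = random.Random(seed)
    starts = range(len(series) - block_len + 1)
    out = []
    for _ in range(n_samples):
        sample = []
        while len(sample) < len(series):
            s = rng.choice(starts)
            sample.extend(series[s:s + block_len])
        out.append(sample[:len(series)])
    return out

# Toy annual inflow series; 100 bootstrap replicates of the same length.
flows = [3.1, 2.8, 2.6, 3.0, 3.4, 3.2, 2.9, 2.7]
replicates = block_bootstrap(flows, block_len=3, n_samples=100)
```

    Each replicate can then be fed through the coupled model to propagate time-series uncertainty into the model outputs.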

  10. Analysis of diffractive pd to Xd and pp to Xp interactions and test of the finite-mass sum rule

    CERN Document Server

    Akimov, Y; Golovanov, L B; Goulianos, K; Gross, D; Malamud, E; Melissinos, A C; Mukhin, S; Nitz, D; Olsen, S; Sticker, H; Tsarev, V A; Yamada, R; Zimmerman, P

    1976-01-01

    The first-moment finite-mass sum rule is tested by utilising cross-sections for pp to Xp extracted from recent Fermilab data on pd to Xd, and also by comparing with CERN ISR data. The dependences on M_x^2, t and s are also discussed. (11 refs).

  11. On-line detection of apnea/hypopnea events using SpO2 signal: a rule-based approach employing binary classifier models.

    Science.gov (United States)

    Koley, Bijoy Laxmi; Dey, Debangshu

    2014-01-01

    This paper presents an online method for the automatic detection of apnea/hypopnea events with the help of the oxygen saturation (SpO2) signal, measured at the fingertip by a Bluetooth nocturnal pulse oximeter. Event detection is performed by identifying abnormal data segments in the recorded SpO2 signal, employing a binary classifier model based on a support vector machine (SVM). Thereafter, each abnormal segment is further analyzed to detect the different states within the segment, i.e., steady, desaturation, and resaturation, with the help of another SVM-based binary ensemble classifier model. Finally, a heuristically obtained rule-based system is used to identify the apnea/hypopnea events from the time-sequenced decisions of these classifier models. In the development phase, a set of 34 time-domain features was extracted from the segmented SpO2 signal using an overlapped windowing technique. Later, an optimal set of features was selected on the basis of a recursive feature elimination technique. A total of 34 subjects were included in the study. The results show average event detection accuracies of 96.7% and 93.8% for the offline and the online tests, respectively. The proposed system provides a direct estimate of the apnea/hypopnea index with the help of a relatively inexpensive and widely available pulse oximeter. Moreover, the system can be monitored and accessed by physicians through LAN/WAN/Internet and can be extended for deployment in Bluetooth-enabled mobile phones.
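    The overlapped windowing step, followed by a per-window labeling rule, can be sketched as follows. The window sizes, baseline, and drop threshold are invented for illustration; the paper's actual segmentation uses SVM classifiers rather than this toy threshold rule:

```python
def windows(signal, width, overlap):
    """Split a signal into fixed-width windows that overlap by `overlap`
    samples (an overlapped windowing technique)."""
    step = width - overlap
    return [signal[i:i + width] for i in range(0, len(signal) - width + 1, step)]

def label_window(win, baseline=97.0, drop=3.0):
    """Toy rule: flag a window as a desaturation candidate when its mean
    SpO2 falls at least `drop` points below the resting baseline."""
    mean = sum(win) / len(win)
    return "desaturation" if mean <= baseline - drop else "steady"

# Invented SpO2 samples (percent) showing a brief desaturation dip.
spo2 = [97, 97, 96, 93, 91, 90, 92, 95, 97, 97]
print([label_window(w) for w in windows(spo2, width=4, overlap=2)])
# → ['steady', 'desaturation', 'desaturation', 'steady']
```

    In the paper, features computed on each window feed the SVM models, and the rule-based layer then fuses the time-sequenced decisions into events.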

  12. Rule-following as an Anticipatory Act: Interaction in Second Person and an Internal Measurement Model of Dialogue

    Science.gov (United States)

    Takahashi, Tatsuji; Gunji, Yukio-Pegio

    2008-10-01

    We pursue anticipation in the second person, or normative anticipation. As a first step, we clarify the three concepts of second person, internal measurement, and asynchroneity by introducing the velocity of logic νl and the velocity of communication νc in the context of social communication. After proving the anticipatory nature of rule-following, or of language use in general, via Kripke's "rule-following paradox," we present a mathematical model expressing the internality essential to the second person, taking advantage of equivalences and differences in formal language theory. As a consequence, we show some advantages of negatively considered concepts and arguments by concretizing them into an elementary and explicit formal model. The time development of the model shows a self-organizing property that never results if we adopt a third-person stance.

  13. Can Failure Succeed? Using Racial Subgroup Rules to Analyze the Effect of School Accountability Failure on Student Performance

    Science.gov (United States)

    Sims, David P.

    2013-01-01

    Many school accountability programs are built on the premise that the sanctions attached to failure will produce higher future student achievement. Furthermore, such programs often include subgroup achievement rules that attempt to hold schools accountable for the performance of all demographic classes of students. This paper looks at two issues:…

  14. Effort, symptom validity testing, performance validity testing and traumatic brain injury.

    Science.gov (United States)

    Bigler, Erin D

    2014-01-01

    To understand the neurocognitive effects of brain injury, valid neuropsychological test findings are paramount. This review examines the research on what has been referred to as symptom validity testing (SVT). Performance above a designated cut-score signifies a 'passing' SVT result, which is likely the best indicator of valid neuropsychological test findings. Likewise, performance substantially below the cut-point, at or near chance, signifies invalid test performance. Performance significantly below chance is the sine qua non neuropsychological indicator of malingering. However, the interpretative problems with SVT performance below the cut-point yet far above chance are substantial, as pointed out in this review. This intermediate, border-zone performance on SVT measures is where substantial interpretative challenges exist. Case studies are used to highlight the many areas where additional research is needed. Historical perspectives are reviewed along with the neurobiology of effort. Reasons why the term performance validity testing (PVT) may be better than the term SVT are reviewed. Advances in neuroimaging techniques may be key to better understanding the meaning of border-zone SVT failure. The review demonstrates the problems of rigid interpretation with established cut-scores. A better understanding of how certain types of neurological, neuropsychiatric, and/or test conditions may affect SVT performance is needed.

  15. Prefrontal and parietal activity is modulated by the rule complexity of inductive reasoning and can be predicted by a cognitive model.

    Science.gov (United States)

    Jia, Xiuqin; Liang, Peipeng; Shi, Lin; Wang, Defeng; Li, Kuncheng

    2015-01-01

    In neuroimaging studies, increased task complexity can lead to increased activation in task-specific regions or to activation of additional regions. How the brain adapts to increased rule complexity during inductive reasoning remains unclear. In the current study, three types of problems were created: simple rule induction (i.e., SI, with rule complexity of 1), complex rule induction (i.e., CI, with rule complexity of 2), and perceptual control. Our findings revealed that increased activations accompany increased rule complexity in the right dorsal lateral prefrontal cortex (DLPFC) and medial posterior parietal cortex (precuneus). A cognitive model predicted both the behavioral and brain imaging results. The current findings suggest that neural activity in frontal and parietal regions is modulated by rule complexity, which may shed light on the neural mechanisms of inductive reasoning. Copyright © 2014. Published by Elsevier Ltd.

  16. Thermal Testing and Model Correlation of the Magnetospheric Multiscale (MMS) Observatories

    Science.gov (United States)

    Kim, Jong S.; Teti, Nicholas M.

    2015-01-01

    The Magnetospheric Multiscale (MMS) mission is a Solar Terrestrial Probes mission comprising four identically instrumented spacecraft that will use Earth's magnetosphere as a laboratory to study the microphysics of three fundamental plasma processes: magnetic reconnection, energetic particle acceleration, and turbulence. This paper presents the complete thermal balance (TB) test performed on the first of the four observatories to go through thermal vacuum (TV) testing, and the mini-balance testing performed on the subsequent observatories to provide a comparison of all four. The TV and TB tests were conducted in a thermal vacuum chamber at the Naval Research Laboratory (NRL) in Washington, D.C., with the vacuum better than 1.3 x 10^-4 pascals (10^-6 torr) and the surrounding temperature reaching -180 degrees Centigrade. Three TB test cases were performed: hot operational science, cold operational science, and a cold survival case. In addition to the three balance cases, a two-hour eclipse and a four-hour eclipse simulation were performed during the TV test to provide additional transient data points representing the orbit in eclipse (Earth's shadow). The goal was to perform testing such that the flight orbital environments could be simulated as closely as possible. A thermal model correlation between the thermal analysis and the test results was completed. Over 400 1-Wire temperature sensors, 200 thermocouples, and 125 flight thermistor temperature sensors recorded data during TV and TB testing. These temperature-versus-time profiles and their agreement with the analytical results obtained using Thermal Desktop and SINDA/FLUINT are discussed. The model correlation for the thermal mathematical model (TMM) is conducted based on the numerical analysis results and the test data. The philosophy of model correlation was to correlate the model to within 3 degrees Centigrade of the test data, using the standard deviation and mean deviation error.
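    The correlation criterion described here — agreement within 3 degrees Centigrade, judged by the mean deviation error and its standard deviation — can be checked with a short script. The sensor temperatures below are invented for illustration:

```python
import statistics

def correlation_stats(model_temps, test_temps):
    """Mean deviation error and its standard deviation between predicted
    and measured sensor temperatures (degrees C)."""
    errors = [m - t for m, t in zip(model_temps, test_temps)]
    return statistics.mean(errors), statistics.stdev(errors)

# Hypothetical predicted vs. measured temperatures at four sensors.
model = [21.0, -15.2, 48.3, 5.1]
test = [19.5, -14.0, 47.0, 6.0]

mean_err, std_err = correlation_stats(model, test)
print(abs(mean_err) <= 3.0 and std_err <= 3.0)  # criterion check → True
```

    In practice this computation would run over the hundreds of sensor channels recorded during the TV and TB tests, and the insulation conductivity would be adjusted until the criterion holds.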

  17. DEVELOP-FPS: a First Person Shooter Development Tool for Rule-based Scripts

    Directory of Open Access Journals (Sweden)

    Bruno Correia

    2012-09-01

    Full Text Available We present DEVELOP-FPS, a software tool specially designed for the development of First Person Shooter (FPS) players controlled by rule-based scripts. DEVELOP-FPS may be used by FPS developers to create, debug, maintain, and compare rule-based player behaviours, providing a set of useful functionalities: (i) easy preparation of the right scenarios for game debugging and testing; (ii) control of the game execution: users can stop and resume the game at any instant, monitoring and controlling every player in the game, monitoring the state of each player and their rule-base activation, and issuing commands to control their behaviour; and (iii) automatic execution of a given number of game runs, collecting data in order to evaluate and compare the players' performance over a sufficient number of similar experiments.
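    A rule-based script of the kind such players run can be reduced to a first-match rule interpreter. This minimal Python sketch, with invented conditions and actions, is illustrative only and is not DEVELOP-FPS's actual script format:

```python
def run_rules(state, rules):
    """Fire the first rule whose condition matches the game state:
    a minimal rule-based script interpreter."""
    for condition, action in rules:
        if condition(state):
            return action
    return "idle"  # default action when no rule fires

# Hypothetical bot behaviour: ordered (condition, action) pairs.
bot_rules = [
    (lambda s: s["health"] < 30, "retreat"),
    (lambda s: s["enemy_visible"], "shoot"),
    (lambda s: s["ammo"] == 0, "reload"),
]

print(run_rules({"health": 80, "enemy_visible": True, "ammo": 5}, bot_rules))
# → shoot
```

    A debugging tool like the one described can expose which rule fired on each tick ("rule-base activation") and let the user pause the loop and inject state changes.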

  18. Business rules formalisation for information systems

    Directory of Open Access Journals (Sweden)

    Ivana Rábová

    2010-01-01

    Full Text Available The article deals with the relation between business rules and business applications and describes a number of structures that support information system implementation and customization. The particular structure formats differ according to the type of business rule. We start from the model of enterprise architecture, a significant document of everything that happens in the business, which serves as a blueprint and facilitates managers' decisions. The most complicated part of enterprise architecture is the business rule. Once we obtain its accurate formulation, and once we manage to formalize the business rule and store it in a special repository, we can manage it, keep it up to date, and use it for many purposes. The article emphasizes formats of business rule formalization and their reference to business application implementation.

  19. A Simulation Model for Tensile Fracture Procedure Analysis of Graphite Material based on Damage Evolution

    International Nuclear Information System (INIS)

    Zhao Erqiang; Ma Shaopeng; Wang Hongtao

    2014-01-01

    Graphite is generally easily damaged by widely distributed micro-cracks when subjected to load. For numerical analysis of a structure made of graphite, the degradation of the material in damaged areas needs to be considered. In this paper, an axial tension test method is proposed to obtain the dynamic damage evolution rule of the material. Using the degradation rule (the variation of the elastic modulus), a finite element model is then constructed to analyze the tensile fracture process of an L-shaped graphite specimen. An axial tension test of graphite is performed to obtain the stress-strain curve, and from the variation of the measured curve the damage evolution rule of the material is fitted. A simulation model based on the above measured results is then constructed in ABAQUS via a user subroutine. Using this simulation model, the tension failure process of an L-shaped graphite specimen with a fillet is simulated. The calculated and experimental fracture loads are in good agreement. The damage simulation model based on the stress-strain curve of the axial tensile test can be used in other tensile fracture analyses. (author)
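    The damage-evolution idea — degrading the elastic modulus as E = E0(1 - D) once strain exceeds a threshold — can be sketched as follows. The linear damage law and all numerical constants here are invented for illustration; the paper fits its rule from the measured stress-strain curve:

```python
def damage(strain, strain_0=0.001, strain_f=0.004):
    """Toy linear damage evolution D(strain): zero below strain_0,
    complete (D = 1) at strain_f."""
    if strain <= strain_0:
        return 0.0
    if strain >= strain_f:
        return 1.0
    return (strain - strain_0) / (strain_f - strain_0)

def stress(strain, e0=10.8e9):
    """Secant stress with the degraded modulus E = E0 * (1 - D)."""
    return e0 * (1.0 - damage(strain)) * strain

print(stress(0.0005))  # undamaged elastic response, ~5.4e6 Pa
```

    In a finite element setting, a user subroutine would evaluate this degradation at each integration point from the local strain, which is how the L-shaped specimen's fracture process is driven.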

  20. Labeling and effectiveness testing; sunscreen drug products for over-the-counter human use; delay of compliance dates. Final rule; delay of compliance dates; request for comments.

    Science.gov (United States)

    2012-05-11

    The Food and Drug Administration (FDA) is delaying the compliance dates for the final rule for over-the-counter (OTC) sunscreen drug products that published in the Federal Register of June 17, 2011 (76 FR 35620). The final rule establishes labeling and effectiveness testing for certain OTC sunscreen products containing specified active ingredients and marketed without approved applications. It also amends labeling claims that are not currently supported by data and lifts the previously-published delay of implementation of the Drug Facts labeling requirements for OTC sunscreens. The 2011 final rule's compliance dates are being delayed because information received after publication of the 2011 final rule indicates that full implementation of the 2011 final rule's requirements for all affected products will require an additional 6 months. This final rule is part of FDA's ongoing review of OTC drug products.