WorldWideScience

Sample records for rule-based down-scaling methodology

  1. Bioclim deliverable D8a: development of the rule-based down-scaling methodology for BIOCLIM Work-package 3

    International Nuclear Information System (INIS)

    2003-01-01

    The BIOCLIM project on modelling sequential Biosphere systems under Climate change for radioactive waste disposal is part of the EURATOM fifth European framework programme. The project was launched in October 2000 for a three-year period. The project aims at providing a scientific basis and practical methodology for assessing the possible long-term impacts on the safety of radioactive waste repositories in deep formations due to climate and environmental change. Five work packages (WP) have been identified to fulfill the project objectives. One of the tasks of BIOCLIM WP3 was to develop a rule-based approach for down-scaling from the MoBidiC model of intermediate complexity in order to provide consistent estimates of monthly temperature and precipitation for the specific regions of interest to BIOCLIM (Central Spain, Central England and Northeast France, together with Germany and the Czech Republic). A statistical down-scaling methodology has been developed by Philippe Marbaix of CEA/LSCE for use with the second climate model of intermediate complexity used in BIOCLIM, CLIMBER-GREMLINS. The rule-based methodology assigns climate states or classes to a point on the time continuum of a region according to a combination of simple threshold values which can be determined from the coarse-scale climate model. Once climate states or classes have been defined, monthly temperature and precipitation climatologies are constructed using analogue stations identified from a database of present-day climate observations. The most appropriate climate classification for BIOCLIM purposes is the Koeppen/Trewartha scheme. This scheme has the advantage of being empirical while requiring only monthly averages of temperature and precipitation as input variables. Section 2 of this deliverable (D8a) outlines how each of the eight methodological steps has been undertaken for each of the three main BIOCLIM study regions (Central England, Northeast France and Central Spain) using MoBidiC.
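
    The core of such a rule-based classification is a cascade of threshold tests on monthly climate statistics. The sketch below illustrates the idea with simplified thresholds loosely inspired by the Koeppen/Trewartha letter classes; the exact BIOCLIM rules and thresholds are those documented in the deliverable itself.

```python
# Minimal sketch of a rule-based climate classification from monthly means.
# Thresholds are simplified illustrations loosely inspired by the
# Koeppen/Trewartha scheme, not the exact BIOCLIM rules.

def classify_climate(temps_c, precip_mm):
    """Assign a coarse climate class from 12 monthly mean temperatures
    (deg C) and 12 monthly precipitation totals (mm)."""
    assert len(temps_c) == 12 and len(precip_mm) == 12
    warmest, coldest = max(temps_c), min(temps_c)
    annual_precip = sum(precip_mm)

    if warmest < 10:
        return "E (polar)"               # no month above 10 C
    if annual_precip < 250:
        return "B (dry)"                 # crude aridity threshold
    if coldest > 18:
        return "A (tropical)"            # all months warm
    if coldest > 0:
        return "C (subtropical/temperate)"
    return "D (continental/boreal)"      # cold winters, warm summers

# Example: a mid-latitude station with cold winters
print(classify_climate(
    [-3, -1, 4, 9, 14, 18, 20, 19, 15, 9, 3, -2],
    [40, 35, 45, 50, 60, 70, 75, 70, 55, 50, 45, 40]))  # -> D (...)
```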

  2. Fuzzy-Rule-Based Object Identification Methodology for NAVI System

    Directory of Open Access Journals (Sweden)

    Yaacob Sazali

    2005-01-01

    We present an object identification methodology applied in a navigation assistance for the visually impaired (NAVI) system. The NAVI system has a single board processing system (SBPS), a headgear-mounted digital video camera, and a pair of stereo earphones. The image captured by the camera is processed by the SBPS to generate a specially structured stereo sound that helps visually impaired people understand the presence of objects or obstacles in front of them. The image processing stage is designed to identify the objects in the captured image. Edge detection and edge-linking procedures are applied in processing the image. A concept of object preference is included in the image processing scheme, and this concept is realized using a fuzzy rule base. Blind users are trained with the stereo sound produced by NAVI to achieve collision-free autonomous navigation.

  3. Fuzzy-Rule-Based Object Identification Methodology for NAVI System

    Science.gov (United States)

    Nagarajan, R.; Sainarayanan, G.; Yaacob, Sazali; Porle, Rosalyn R.

    2005-12-01

    We present an object identification methodology applied in a navigation assistance for the visually impaired (NAVI) system. The NAVI system has a single board processing system (SBPS), a headgear-mounted digital video camera, and a pair of stereo earphones. The image captured by the camera is processed by the SBPS to generate a specially structured stereo sound that helps visually impaired people understand the presence of objects or obstacles in front of them. The image processing stage is designed to identify the objects in the captured image. Edge detection and edge-linking procedures are applied in processing the image. A concept of object preference is included in the image processing scheme, and this concept is realized using a fuzzy rule base. Blind users are trained with the stereo sound produced by NAVI to achieve collision-free autonomous navigation.
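
    To illustrate how a fuzzy rule base can encode object preference, the sketch below scores an object from two hypothetical image features (normalized size and centrality) using four hand-written rules and weighted-average defuzzification; the actual NAVI features and rule base are those described in the paper.

```python
# Hypothetical sketch of a fuzzy rule base assigning an object-preference
# score from two image features (object size and distance from the image
# centre); not the NAVI system's actual rules.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def preference(size, centrality):
    """size, centrality in [0, 1]; returns a preference score in [0, 1]."""
    big = tri(size, 0.3, 1.0, 1.7)          # wide triangles act as shoulders
    small = tri(size, -0.7, 0.0, 0.7)
    central = tri(centrality, 0.3, 1.0, 1.7)
    peripheral = tri(centrality, -0.7, 0.0, 0.7)

    # Rules: IF big AND central THEN high preference; etc.
    rules = [
        (min(big, central), 1.0),       # high preference
        (min(big, peripheral), 0.6),
        (min(small, central), 0.5),
        (min(small, peripheral), 0.1),  # low preference
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(round(preference(0.8, 0.9), 2))  # large, central object -> high score
```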

  4. Bioclim Deliverable D8b: development of the physical/statistical down-scaling methodology and application to climate model Climber for BIOCLIM Work-package 3

    International Nuclear Information System (INIS)

    2003-01-01

    too coarse and simplified. This is why we first need to find these 'physically based' relations between large-scale model outputs and regional-scale predictors. This is a solution to the specific problem of down-scaling from an intermediate-complexity model such as CLIMBER. There are several other types of down-scaling methodologies, such as the dynamical and rule-based methods presented in other BIOCLIM deliverables. A distinctive feature of the present method is that it attempts to use physical considerations in the down-scaling, whereas a detailed 'dynamical' approach is out of reach because CLIMBER mainly provides the average climate; a more dynamical approach would require time variability at various scales as input. This report is organised as follows: Section 2 relates to the design and validation of the method, while Section 3 reports the application to BIOCLIM simulations. We first present the data sources employed, which are the model results and the observed climatology. We then present the principles of the down-scaling method, the formulation of the predictors and the calibration of the statistical model, including results for the Last Glacial Maximum. In Section 3, the results are first presented as time series for each site, then as maps at specific times, or snapshots.
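
    The statistical core of such a method is a transfer function calibrated between large-scale model outputs and local observations, then applied to new model states. A minimal sketch, assuming two illustrative predictors and synthetic calibration data (not the BIOCLIM predictors):

```python
# Sketch of the statistical step in such a down-scaling method: calibrate
# a linear model linking coarse-model predictors (e.g. large-scale
# temperature, a circulation index) to observed local monthly temperature,
# then apply it to a new coarse-model state. Data are synthetic.

import numpy as np

rng = np.random.default_rng(0)

# Calibration data: rows = months/years, cols = large-scale predictors
X_cal = rng.normal(size=(240, 2))                  # e.g. T_large, circ_index
beta_true = np.array([2.5, -0.8])
y_cal = 10.0 + X_cal @ beta_true + rng.normal(0.0, 0.5, size=240)  # local obs

# Least-squares calibration (with intercept)
A = np.column_stack([np.ones(len(X_cal)), X_cal])
coef, *_ = np.linalg.lstsq(A, y_cal, rcond=None)

# Application to a new coarse-model state (e.g. a glacial snapshot)
x_new = np.array([1.0, -4.0, 1.2])                 # [1, T_large, circ_index]
print("downscaled local temperature:", x_new @ coef)
```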

  5. Sensor-based activity recognition using extended belief rule-based inference methodology.

    Science.gov (United States)

    Calzada, A; Liu, J; Nugent, C D; Wang, H; Martinez, L

    2014-01-01

    The recently developed extended belief rule-based inference methodology (RIMER+) recognizes the need for modeling different types of information and uncertainty that usually coexist in real environments. A home setting with sensors located in different rooms and on different appliances can be considered a particularly relevant example of such an environment, which brings a range of challenges for sensor-based activity recognition. Although RIMER+ has been designed as a generic decision model that could be applied in a wide range of situations, this paper discusses how the methodology can be adapted to recognize human activities using binary sensors within smart environments. An evaluation of RIMER+ against other state-of-the-art classifiers showed significant advantages in terms of accuracy, efficiency and applicability, especially in situations of input-data incompleteness; the results demonstrate the potential of the methodology and lay the basis for further research on the topic.

  6. Intelligent wear mode identification system for marine diesel engines based on multi-level belief rule base methodology

    Science.gov (United States)

    Yan, Xinping; Xu, Xiaojian; Sheng, Chenxing; Yuan, Chengqing; Li, Zhixiong

    2018-01-01

    Wear faults are among the chief causes of main-engine damage, significantly influencing the secure and economical operation of ships. It is difficult for engineers to utilize multi-source information to identify wear modes, so an intelligent wear mode identification model needs to be developed to assist engineers in diagnosing wear faults in diesel engines. For this purpose, a multi-level belief rule base (BBRB) system is proposed in this paper. The BBRB system consists of two-level belief rule bases, and the 2D and 3D characteristics of wear particles are used as antecedent attributes on each level. Quantitative and qualitative wear information with uncertainties can be processed simultaneously by the BBRB system. In order to enhance the efficiency of the BBRB, the silhouette value is adopted to determine referential points and the fuzzy c-means clustering algorithm is used to transform input wear information into belief degrees. In addition, the initial parameters of the BBRB system are constructed on the basis of expert-domain knowledge and then optimized by the genetic algorithm to ensure the robustness of the system. To verify the validity of the BBRB system, experimental data acquired from real-world diesel engines are analyzed. Five-fold cross-validation is conducted on the experimental data, and the BBRB is compared with four other models in the cross-validation. In addition, a verification dataset containing different wear particles is used to highlight the effectiveness of the BBRB system in wear mode identification. The verification results demonstrate that the proposed BBRB is effective and efficient for wear mode identification, with better performance and stability than competing systems.
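
    A common building block of belief-rule-base systems is the transformation of a crisp reading into belief degrees over referential points. The sketch below shows the standard equivalence-rule transformation with illustrative referential values; the paper itself determines its referential points with the silhouette value and fuzzy c-means.

```python
# Sketch of the standard input transformation used in belief-rule-base
# systems: a crisp reading is distributed over neighbouring referential
# points as belief degrees. Referential values here are illustrative.

def belief_distribution(x, refs):
    """Transform crisp input x into belief degrees over the sorted
    referential points refs (equivalence-rule transformation)."""
    beliefs = [0.0] * len(refs)
    if x <= refs[0]:
        beliefs[0] = 1.0
    elif x >= refs[-1]:
        beliefs[-1] = 1.0
    else:
        for i in range(len(refs) - 1):
            lo, hi = refs[i], refs[i + 1]
            if lo <= x <= hi:
                beliefs[i] = (hi - x) / (hi - lo)
                beliefs[i + 1] = 1.0 - beliefs[i]
                break
    return beliefs

# A wear-particle feature reading of 6.5 against referential points [0, 5, 10]:
print(belief_distribution(6.5, [0.0, 5.0, 10.0]))  # [0.0, 0.7, 0.3]
```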

  7. A rule-based phase control methodology for a slider-crank wave energy converter power take-off system

    Energy Technology Data Exchange (ETDEWEB)

    Sang, Yuanrui; Karayaka, H. Bora; Yan, Yanjun; Zhang, James Z.; Bogucki, Darek; Yu, Yi-Hsiang

    2017-09-01

    The slider crank is a proven mechanical linkage system with a long history of successful applications, and the slider-crank ocean wave energy converter (WEC) is a type of WEC that converts linear motion into rotation. This paper presents a control algorithm for a slider-crank WEC. In this study, a time-domain hydrodynamic analysis is adopted, and an AC synchronous machine is used in the power take-off system to achieve relatively high system performance. Also, a rule-based phase control strategy is applied to maximize energy extraction, making the system suitable not only for regular sinusoidal waves but also for irregular waves. Simulations are carried out under regular sinusoidal wave and synthetically produced irregular wave conditions; performance validations are also presented with high-precision, real ocean wave surface elevation data. The influences of significant wave height and peak period on the energy extraction of the system are studied. Energy extraction results using the proposed method are compared to those of the passive loading and complex conjugate control strategies; the results show that the level of energy extraction lies between those of the passive loading and complex conjugate control strategies, and the suboptimal nature of this control strategy is verified.
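
    As a rough illustration of what rule-based phase control means, the toy simulation below applies two threshold rules (latch the device at a velocity zero-crossing, release after a fixed hold time) to a one-degree-of-freedom oscillator. This is a generic latching sketch with made-up parameters, not the paper's slider-crank model or control law.

```python
# Generic illustration of rule-based phase control (latching) on a
# one-degree-of-freedom oscillator; NOT the paper's slider-crank WEC.

import math

m, k, c_pto = 1.0, 4.0, 1.5        # mass, stiffness, PTO damping (toy values)
dt, t_end = 0.001, 20.0
latch_time = 0.5                    # rule parameter: hold duration [s]

x, v, t = 0.0, 0.0, 0.0
latched, release_at = False, 0.0
energy = 0.0

while t < t_end:
    wave_force = 2.0 * math.sin(1.2 * t)   # toy regular wave excitation
    if latched:
        if t >= release_at:                # Rule 2: release after hold time
            latched = False
    else:
        a = (wave_force - k * x - c_pto * v) / m
        v_new = v + a * dt
        if v * v_new < 0.0:                # Rule 1: latch at a zero-crossing
            latched, release_at, v_new = True, t + latch_time, 0.0
        energy += c_pto * v * v * dt       # power absorbed by the PTO
        v = v_new
        x += v * dt
    t += dt

print(f"absorbed energy over {t_end:.0f} s: {energy:.2f} (toy units)")
```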

  8. Bioclim Deliverable D6b: application of statistical down-scaling within the BIOCLIM hierarchical strategy: methods, data requirements and underlying assumptions

    International Nuclear Information System (INIS)

    2004-01-01

    -study regions were identified, together with the additional issues which arise in applying these techniques to output from the BIOCLIM simulations. This preliminary work is described in this BIOCLIM technical note. It provides an overview of statistical down-scaling methods, together with their underlying assumptions and advantages/disadvantages. Specific issues relating to their application within the BIOCLIM context (i.e., application to the IPSL_CM4_D snapshot simulations) are identified, for example, the stationarity issue. The predictor and predictand data sets that would be required to implement these methods within the BIOCLIM hierarchical strategy are also outlined, together with the methodological steps involved. Implementation of these techniques was delayed in order to give priority to the application of the rule-based down-scaling method developed in WP3 to WP2 EMIC output (see Deliverable D8a). This task was not originally planned, but has allowed a more comprehensive comparison and evaluation of the BIOCLIM scenarios and down-scaling methods to be undertaken.

  9. Rule-based decision making model

    International Nuclear Information System (INIS)

    Sirola, Miki

    1998-01-01

    A rule-based decision making model is designed in the G2 environment. A theoretical and methodological frame for the model is composed and motivated. The rule-based decision making model is based on object-oriented modelling, knowledge engineering and decision theory. The idea of a safety objective tree is utilized. Advanced rule-based methodologies are applied. A general decision making model, the 'decision element', is constructed. The strategy planning of the decision element is based on, e.g., value theory and utility theory. A hypothetical process model is built to give input data for the decision element. The basic principle of the object model in decision making is division into tasks. Probability models are used in characterizing component availabilities. Bayes' theorem is used to recalculate the probability figures when new information is obtained. The model includes simple learning features to save the solution path. A decision analytic interpretation is given to the decision making process. (author)
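
    The Bayes update mentioned above can be stated in a few lines: the sketch below revises a component's failure probability after imperfect evidence (an alarm) arrives, with illustrative numbers.

```python
# Sketch of a Bayes update for component availability: revising a failure
# probability when new (imperfect) evidence arrives. Numbers are illustrative.

def bayes_update(prior_fail, p_evidence_given_fail, p_evidence_given_ok):
    """Posterior P(fail | evidence) via Bayes' theorem."""
    p_evidence = (p_evidence_given_fail * prior_fail
                  + p_evidence_given_ok * (1.0 - prior_fail))
    return p_evidence_given_fail * prior_fail / p_evidence

# Prior failure probability 5%; an alarm fires that is 90% sensitive
# and has a 10% false-alarm rate:
posterior = bayes_update(0.05, 0.90, 0.10)
print(f"P(fail | alarm) = {posterior:.3f}")  # ~0.321
```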

  10. Bioclim Deliverable D6a: regional climatic characteristics for the European sites at specific times: the dynamical down-scaling

    International Nuclear Information System (INIS)

    2003-01-01

    The overall aim of BIOCLIM is to assess the possible long-term impacts due to climate change on the safety of radioactive waste repositories in deep formations. This aim is addressed through the following specific objectives: - Development of practical and innovative strategies for representing sequential climatic changes to the geosphere-biosphere system for existing sites over central Europe, addressing the timescale of one million years, which is relevant to the geological disposal of radioactive waste. - Exploration and evaluation of the potential effects of climate change on the nature of the biosphere systems used to assess the environmental impact. - Dissemination of information on the new methodologies and the results obtained from the project among the international waste management community for use in performance assessments of potential or planned radioactive waste repositories. The BIOCLIM project is designed to advance the state-of-the-art of biosphere modelling for use in Performance Assessments. Therefore, two strategies are developed for representing sequential climatic changes to geosphere-biosphere systems. The hierarchical strategy successively uses a hierarchy of climate models. These models vary from simple 2-D models, which simulate interactions between a few aspects of the Earth system at a rough surface resolution, through a General Circulation Model (GCM) coupled with a vegetation model, which simulate in great detail the dynamics and physics of the atmosphere, ocean and biosphere, to regional models, which focus on the European regions and sites of interest. Moreover, rule-based and statistical down-scaling procedures are also considered. Comparisons are provided in terms of climate and vegetation cover at the selected times and for the study regions. The integrated strategy consists of using integrated climate models, representing all the physical mechanisms important for long-term continuous climate variations, to simulate the climate evolution over

  11. A Belief Rule-Based Expert System to Diagnose Influenza

    DEFF Research Database (Denmark)

    Hossain, Mohammad Shahadat; Khalid, Md. Saifuddin; Akter, Shamima

    2014-01-01

    , development and application of an expert system to diagnose influenza under uncertainty. The recently developed generic belief rule-based inference methodology using the evidential reasoning (RIMER) approach is employed to develop this expert system, termed the Belief Rule Based Expert System (BRBES......). The RIMER approach can handle different types of uncertainties, both in knowledge representation and in inference procedures. The knowledge base of this system was constructed using real patient records and consultation with the influenza specialists of Bangladesh. Practical case...

  12. A Constructivist Approach to Rule Bases

    NARCIS (Netherlands)

    Sileno, G.; Boer, A.; van Engers, T.; Loiseau, S.; Filipe, J.; Duval, B.; van den Herik, J.

    2015-01-01

    The paper presents a set of algorithms for the conversion of rule bases between priority-based and constraint-based representations. Inspired by research in precedential reasoning in law, such algorithms can be used for the analysis of a rule base, and for the study of the impact of the introduction

  13. A Belief Rule-Based Expert System to Assess Bronchiolitis Suspicion from Signs and Symptoms Under Uncertainty

    DEFF Research Database (Denmark)

    Karim, Rezuan; Hossain, Mohammad Shahadat; Khalid, Md. Saifuddin

    2017-01-01

    developed generic belief rule-based inference methodology using evidential reasoning (RIMER) acts as the inference engine of this BRBES, while the belief rule base serves as the knowledge representation schema. The knowledge base of the system is constructed by using real patient data and expert opinion from...

  14. Genetic Programming for the Generation of Crisp and Fuzzy Rule Bases in Classification and Diagnosis of Medical Data

    DEFF Research Database (Denmark)

    Dounias, George; Tsakonas, Athanasios; Jantzen, Jan

    2002-01-01

    This paper demonstrates two methodologies for the construction of rule-based systems in medical decision making. The first approach consists of a method combining genetic programming and heuristic hierarchical rule-base construction. The second model is composed of a strongly-typed genetic...

  15. Personalization of Rule-based Web Services.

    Science.gov (United States)

    Choi, Okkyung; Han, Sang Yong

    2008-04-04

    Nowadays Web users have clearly expressed their wish to receive personalized services directly. Personalization is the way to tailor services directly to the immediate requirements of the user. However, the current Web Services System does not provide features supporting this, such as personalization of services and intelligent matchmaking. This research proposes a flexible, personalized Rule-based Web Services System that addresses these problems and enables efficient search, discovery and construction across general Web documents and Semantic Web documents. The system performs matchmaking among service requesters', service providers' and users' preferences using a Rule-based Search Method, and subsequently ranks the search results. A prototype of efficient Web Services search and construction for the suggested system has been developed based on the current work.

  16. Rule based deterioration identification and management system

    International Nuclear Information System (INIS)

    Kataoka, S.; Pavinich, W.; Lapides, M.

    1993-01-01

    Under the sponsorship of IHI and EPRI, a rule-based screening system has been developed that can be used by utility engineers to determine which deterioration mechanisms are acting on specific LWR components, and to evaluate the efficacy of an age-related deterioration management program. The screening system was developed using the rule-based shell NEXPERT, which provides traceability to the data sources used in the logic development. The system addresses all the deterioration mechanisms of the specific metals encountered in either BWRs or PWRs. Deterioration mechanisms are listed with the reasons why they may occur during the design life of LWRs, considering the plant environment, manufacturing process, service history, material chemical composition, etc. of components in a specific location of an LWR. To quickly eliminate inactive deterioration mechanisms from the evaluation, a tier structure is applied to the rules. The reasons why deterioration will occur are extracted automatically by backward chaining. To reduce the amount of user input, plant environmental data are stored in files as default environmental data. (author)
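
    Backward chaining of this kind can be sketched compactly: starting from a deterioration goal, the engine recursively proves antecedents and records the rules fired as the "reasons". The rules and facts below are hypothetical stand-ins, not the actual NEXPERT knowledge base.

```python
# Minimal backward-chaining sketch for extracting "reasons why deterioration
# will occur"; rules and facts are hypothetical illustrations.

RULES = {
    # goal: list of alternative antecedent sets that establish it
    "IGSCC_active": [["sensitized_material", "tensile_stress", "BWR_water"]],
    "sensitized_material": [["stainless_steel", "weld_heat_affected_zone"]],
    "tensile_stress": [["residual_weld_stress"]],
}
FACTS = {"stainless_steel", "weld_heat_affected_zone",
         "residual_weld_stress", "BWR_water"}

def prove(goal, trace=None):
    """Backward-chain to prove a goal; record each rule fired in trace."""
    if trace is None:
        trace = []
    if goal in FACTS:
        return True, trace
    for antecedents in RULES.get(goal, []):
        if all(prove(a, trace)[0] for a in antecedents):
            trace.append(f"{goal} <- {antecedents}")
            return True, trace
    return False, trace

ok, reasons = prove("IGSCC_active")
print(ok)
for r in reasons:
    print(" ", r)
```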

  17. Evolving Rule-Based Systems in two Medical Domains using Genetic Programming

    DEFF Research Database (Denmark)

    Tsakonas, A.; Dounias, G.; Jantzen, Jan

    2004-01-01

    We demonstrate, compare and discuss the application of two genetic programming methodologies for the construction of rule-based systems in two medical domains: the diagnosis of Aphasia's subtypes and the classification of Pap-Smear Test examinations. The first approach consists of a scheme...

  18. Quasistatic zooming of FDTD E-field computations: the impact of down-scaling techniques

    Energy Technology Data Exchange (ETDEWEB)

    Van de Kamer, J.B.; Kroeze, H.; De Leeuw, A.A.C.; Lagendijk, J.J.W. [Department of Radiotherapy, University Medical Center Utrecht, Heidelberglaan 100, 3584 CX, Utrecht (Netherlands)

    2001-05-01

    Due to current computer limitations, regional hyperthermia treatment planning (HTP) is practically limited to a resolution of 1 cm, whereas a millimetre resolution is desired. Using the centimetre-resolution E-vector-field distribution, computed with, for example, the finite-difference time-domain (FDTD) method, and the millimetre-resolution patient anatomy, it is possible to obtain a millimetre-resolution SAR distribution in a volume of interest (VOI) by means of quasistatic zooming. To compute the required low-resolution E-vector-field distribution, a low-resolution dielectric geometry is needed, which is constructed by down-scaling the millimetre-resolution dielectric geometry. In this study we have investigated which down-scaling technique results in a dielectric geometry that yields the best low-resolution E-vector-field distribution as input for quasistatic zooming. A segmented 2 mm resolution CT data set of a patient was down-scaled to 1 cm resolution using three different techniques: 'winner-takes-all', 'volumetric averaging' and 'anisotropic volumetric averaging'. The E-vector-field distributions computed for these low-resolution dielectric geometries were used as input for quasistatic zooming. The resulting zoomed-resolution SAR distributions were compared with a reference: the 2 mm resolution SAR distribution computed with the FDTD method. For both a simple phantom and the complex partial patient geometry, down-scaling using 'anisotropic volumetric averaging' resulted in the zoomed-resolution SAR distributions that best approximate the corresponding high-resolution SAR distribution (correlations of 97% and 96%, and absolute averaged differences of 6% and 14%, respectively). (author)
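
    The first two down-scaling techniques are easy to state in code. The sketch below applies 'winner-takes-all' (most frequent tissue label per block) and 'volumetric averaging' (mean conductivity per block) to a toy 1D segmented profile; tissue labels and conductivity values are illustrative.

```python
# Sketch of two of the down-scaling techniques compared above, on a toy
# 1D dielectric profile with a block size of 5 voxels. Values illustrative.

import numpy as np

SIGMA = {0: 0.0, 1: 0.5, 2: 2.0}   # toy conductivity per tissue label [S/m]
labels = np.array([1, 1, 1, 2, 2, 1, 0, 0, 0, 0])  # 2 blocks of 5 voxels

def winner_takes_all(lbl, block):
    out = []
    for i in range(0, len(lbl), block):
        chunk = lbl[i:i + block]
        out.append(np.bincount(chunk).argmax())    # most frequent label
    return np.array(out)

def volumetric_average(lbl, block):
    sigma = np.vectorize(SIGMA.get)(lbl).astype(float)
    return sigma.reshape(-1, block).mean(axis=1)   # mean conductivity

print(winner_takes_all(labels, 5))    # [1 0]
print(volumetric_average(labels, 5))  # [1.1 0.1]
```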

  19. A Belief Rule-Based (BRB) Decision Support System for Assessing Clinical Asthma Suspicion

    DEFF Research Database (Denmark)

    Hossain, Mohammad Shahadat; Hossain, Emran; Khalid, Md. Saifuddin

    2014-01-01

    conditions of uncertainty. The Belief Rule-Based Inference Methodology Using the Evidential Reasoning (RIMER) approach was adopted to develop this expert system; which is named the Belief Rule-Based Expert System (BRBES). The system can handle various types of uncertainty in knowledge representation...... and inference procedures. The knowledge base of this system was constructed by using real patient data and expert opinion. Practical case studies were used to validate the system. The system-generated results are more effective and reliable in terms of accuracy than the results generated by a manual system....

  20. A rule-based software test data generator

    Science.gov (United States)

    Deason, William H.; Brown, David B.; Chang, Kai-Hsiung; Cross, James H., II

    1991-01-01

    Rule-based software test data generation is proposed as an alternative to either path/predicate analysis or random data generation. A prototype rule-based test data generator for Ada programs is constructed and compared to a random test data generator. Four Ada procedures are used in the comparison. Approximately 2000 rule-based test cases and 100,000 randomly generated test cases are automatically generated and executed. The success of the two methods is compared using standard coverage metrics. Simple statistical tests show that even the primitive rule-based test data generation prototype is significantly better than random data generation. This result demonstrates that rule-based test data generation is feasible and shows great promise in assisting test engineers, especially when the rule base is developed further.

  1. A C++ Class for Rule-Base Objects

    Directory of Open Access Journals (Sweden)

    William J. Grenney

    1992-01-01

    A C++ class, called Tripod, was created as a tool to assist with the development of rule-base decision support systems. The Tripod class contains data structures for the rule-base and member functions for operating on the data. The rule-base is defined by three ASCII files. These files are translated by a preprocessor into a single file that is loaded when a rule-base object is instantiated. The Tripod class was tested as part of a prototype decision support system (DSS) for winter highway maintenance in the Intermountain West. The DSS is composed of two principal modules: the main program, called the wrapper, and a Tripod rule-base object. The wrapper is a procedural module that interfaces with remote sensors and an external meteorological database. The rule-base contains the logic for advising an inexperienced user and for assisting with the decision making process.

  2. Integrated Case Based and Rule Based Reasoning for Decision Support

    OpenAIRE

    Eshete, Azeb Bekele

    2009-01-01

    This project is a continuation of my specialization project, which focused on studying theoretical concepts related to the case-based reasoning method, the rule-based reasoning method and their integration. The integration of rule-based and case-based reasoning methods has shown a substantial improvement in performance over the individual methods. Verdande Technology AS wants to try integrating the rule-based reasoning method with an existing case-based system. This project focu...

  3. Horizontal and Vertical Rule Bases Method in Fuzzy Controllers

    OpenAIRE

    Aminifar, Sadegh; bin Marzuki, Arjuna

    2013-01-01

    The concept of horizontal and vertical rule bases is introduced. Using this method enables designers to identify the main behaviors of a system and describe them with coarse approximations. The rules which describe the system in the first stage are called the horizontal rule base. In the second stage, the designer modulates the obtained surface by describing the changes needed on the first surface to handle the real behaviors of the system. The rules used in the second stage are called the vertical rule base. Horizontal...

  4. Evolving rule-based systems in two medical domains using genetic programming.

    Science.gov (United States)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan; Axer, Hubertus; Bjerregaard, Beth; von Keyserlingk, Diedrich Graf

    2004-11-01

    To demonstrate and compare the application of different genetic programming (GP) based intelligent methodologies for the construction of rule-based systems in two medical domains: the diagnosis of aphasia's subtypes and the classification of pap-smear examinations. Past data represent (a) successful diagnoses of aphasia's subtypes, obtained from collaborating medical experts through a free interview per patient, and (b) smears (images of cells) correctly classified by cyto-technologists, previously stained using the Papanicolaou method. Initially a hybrid approach is proposed, which combines standard genetic programming and heuristic hierarchical crisp rule-base construction. Then, genetic programming for the production of crisp rule-based systems is attempted. Finally, another hybrid intelligent model is composed of a grammar-driven genetic programming system for the generation of fuzzy rule-based systems. The results denote the effectiveness of the proposed systems, which are also compared, in terms of efficiency, accuracy and comprehensibility, to an inductive machine learning approach and to a standard genetic programming symbolic expression approach. The proposed GP-based intelligent methodologies are able to produce accurate and comprehensible results for medical experts, performing competitively with other intelligent approaches. The aim of the authors was the production of accurate but also sensible decision rules that could potentially help medical doctors to extract conclusions, even at the expense of a higher classification score.

  5. Rule-based energy management strategies for hybrid vehicles

    NARCIS (Netherlands)

    Hofman, T.; Druten, van R.M.; Serrarens, A.F.A.; Steinbuch, M.

    2007-01-01

    The highest control layer of a (hybrid) vehicular drive train is termed the Energy Management Strategy (EMS). In this paper an overview of different control methods is given, and a new rule-based EMS is introduced based on the combination of Rule-Based
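
    A rule-based EMS is typically a small set of threshold rules on the battery state of charge and the driver's power demand. A hedged sketch with illustrative thresholds, not the paper's strategy:

```python
# Hypothetical rule-based energy management strategy: thresholds and rules
# are illustrative, not taken from the paper.

def ems_rule(power_demand_kw, soc):
    """Split demand between engine and electric machine by simple rules.
    Returns (engine_kw, motor_kw); negative motor power means charging."""
    if soc < 0.3:                            # battery low: engine also charges
        return power_demand_kw + 5.0, -5.0
    if power_demand_kw < 10.0 and soc > 0.5:
        return 0.0, power_demand_kw          # electric-only driving
    if power_demand_kw > 60.0:
        return 60.0, power_demand_kw - 60.0  # boost: engine at its limit
    return power_demand_kw, 0.0              # engine-only cruising

for demand, soc in [(8, 0.7), (40, 0.6), (80, 0.6), (20, 0.2)]:
    engine, motor = ems_rule(demand, soc)
    print(f"demand={demand:3.0f} kW, soc={soc:.1f} -> "
          f"engine={engine:5.1f} kW, motor={motor:5.1f} kW")
```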

  6. Constructing rule-based models using the belief functions framework

    NARCIS (Netherlands)

    Almeida, R.J.; Denoeux, T.; Kaymak, U.; Greco, S.; Bouchon-Meunier, B.; Coletti, G.; Fedrizzi, M.; Matarazzo, B.; Yager, R.R.

    2012-01-01

    We study a new approach to regression analysis. We propose a new rule-based regression model using the theoretical framework of belief functions. For this purpose we use the recently proposed Evidential c-means (ECM) to derive rule-based models solely from data. ECM allocates, for each

  7. A Fuzzy Rule-based Controller For Automotive Vehicle Guidance

    OpenAIRE

    Hessburg, Thomas; Tomizuka, Masayoshi

    1991-01-01

    A fuzzy rule-based controller is applied to lateral guidance of a vehicle for an automated highway system. The fuzzy rules, based on human drivers' experiences, are developed to track the center of a lane in the presence of external disturbances and over a range of vehicle operating conditions.

  8. Rule based systems for big data a machine learning approach

    CERN Document Server

    Liu, Han; Cocea, Mihaela

    2016-01-01

    The ideas introduced in this book explore the relationships among rule based systems, machine learning and big data. Rule based systems are seen as a special type of expert systems, which can be built by using expert knowledge or learning from real data. The book focuses on the development and evaluation of rule based systems in terms of accuracy, efficiency and interpretability. In particular, a unified framework for building rule based systems, which consists of the operations of rule generation, rule simplification and rule representation, is presented. Each of these operations is detailed using specific methods or techniques. In addition, this book also presents some ensemble learning frameworks for building ensemble rule based systems.

  9. Direct Down-scale Experiments of Concentration Column Designs for SHINE Process

    Energy Technology Data Exchange (ETDEWEB)

    Youker, Amanda J. [Argonne National Lab. (ANL), Argonne, IL (United States); Stepinski, Dominique C. [Argonne National Lab. (ANL), Argonne, IL (United States); Vandegrift, George F. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-05-01

    Argonne is assisting SHINE Medical Technologies in their efforts to become a domestic Mo-99 producer. The SHINE accelerator-driven process uses a uranyl-sulfate target solution for the production of fission-product Mo-99. Argonne has developed a molybdenum recovery and purification process for this target solution. The process includes an initial Mo recovery column followed by a concentration column to reduce the product volume from 15-25 L to < 1 L prior to entry into the LEU Modified Cintichem (LMC) process for purification. This report discusses direct down-scale experiments of the plant-scale concentration column design, where the effects of loading velocity and temperature were investigated.

  10. Regional and urban down scaling of global climate scenarios for health impact assessments

    Energy Technology Data Exchange (ETDEWEB)

    San Jose, R.; Perez, J. L.; Perez, L.; Gonzalez, R. M.; Pecci, J.; Garzon, A.; Palacios, M.

    2015-07-01

    In this contribution we have used global climate RCP IPCC scenarios to produce climate and air pollution maps at regional (25 km resolution) and urban (200 m resolution) scales over Europe and five European cities, in order to investigate the impact on meteorological variables and pollutant concentrations. We have used the well-known mesoscale meteorological model WRF-Chem (NOAA, US). We have used 2011 as the past control year and two RCP scenarios from the CCSM global climate model, with radiative forcings of 4.5 W/m2 and 8.5 W/m2, for the years 2030, 2050 and 2100. After running the WRF-Chem model, using the boundary conditions provided by the RCP scenarios with the 2011 emissions, we have performed a detailed down-scaling with the CALMET diagnostic model to obtain full 200 m spatial resolution maps of five European cities (London, Antwerp, Madrid, Milan and Helsinki). We show the results and the health impacts for the future RCP IPCC climate scenarios in comparison with the 2011 control-year information for climate and health indicators. Finally, we have also investigated the impact of aerosol effects on the mean short-wave radiation. Two simulations with the WRF-Chem model have been performed over Europe in 2010: a baseline simulation without any feedback effects, and a second simulation including the direct effects on the solar radiation reaching the surface as well as the indirect aerosol effect, with potential impacts on increasing or decreasing the precipitation rates. Aerosol effects produce an increase of incoming radiation over the Atlantic Ocean (up to 70%) because the prescribed aerosol concentrations in WRF-Chem without feedbacks are substantially higher than the aerosol concentrations produced when the feedback effects are activated. The decrease in solar radiation in the Sahara area (10%) occurs because the prescribed aerosol concentration in the no-feedback simulation is lower than when the feedback effects are activated. (Author)

  11. Enhancing reliable online transaction with intelligent rule-based ...

    African Journals Online (AJOL)

    Enhancing reliable online transaction with intelligent rule-based fraud detection technique. ... This is in a bid to reduce, among other things, the cost of production and to discourage the poor handling of Nigerian currency. The CBN pronouncement has necessitated an upsurge in transactions completed with credit ...

  12. Rule-based emergency action level monitor prototype

    International Nuclear Information System (INIS)

    Touchton, R.A.; Gunter, A.D.; Cain, D.

    1985-01-01

    In late 1983, the Electric Power Research Institute (EPRI) began a program to encourage and stimulate the development of artificial intelligence (AI) applications for the nuclear industry. The development of a rule-based emergency action level classification system prototype is discussed. The paper describes both the full prototype currently under development and the completed, simplified prototype.

  13. Rule-based Test Generation with Mind Maps

    Directory of Open Access Journals (Sweden)

    Dimitry Polivaev

    2012-02-01

    This paper introduces the basic concepts of rule-based test generation with mind maps, and reports experience gained from the industrial application of this technique in the domain of smart card testing by Giesecke & Devrient GmbH over recent years. It describes the formalization of test selection criteria used by our test generator, our test generation architecture and our test generation framework.

  14. Horizontal and Vertical Rule Bases Method in Fuzzy Controllers

    Directory of Open Access Journals (Sweden)

    Sadegh Aminifar

    2013-01-01

    The concept of horizontal and vertical rule bases is introduced. Using this method enables designers to identify the main behaviors of a system and describe them with coarse approximations. The rules which describe the system in the first stage are called the horizontal rule base. In the second stage, the designer modulates the obtained surface by describing the changes needed on the first surface to handle the real behaviors of the system. The rules used in the second stage are called the vertical rule base. The horizontal and vertical rule bases method greatly eases the extraction of the optimum control surface, using far fewer rules than traditional fuzzy systems. This research involves the control of a highly nonlinear system that is difficult to model with classical methods. As a case study for testing the proposed method under real conditions, the designed controller is applied to a steaming room with uncertain data and variable parameters. A comparison between a PID controller, a traditional fuzzy counterpart and our proposed system shows that our proposed system outperforms both in terms of the number of valve switchings and surface following. The evaluations were carried out both in model simulation and in a DSP implementation.

  15. Comparison of two down-scaling methods for climate study and climate change on the mountain areas in France

    International Nuclear Information System (INIS)

    Piazza, Marie; Page, Christian; Sanchez-Gomez, Emilia; Terray, Laurent; Deque, Michel

    2013-01-01

    Mountain regions are highly vulnerable to climate change and are likely to be among the areas most impacted by global warming. But climate projections for the end of the 21st century are developed with general circulation models of climate, which do not have sufficient horizontal resolution to accurately evaluate the impacts of warming on these regions. Several techniques are therefore used to perform a spatial down-scaling (on the order of 10 km). There are two categories of down-scaling methods: dynamical methods, which require significant computational resources to carry out regional climate simulations at high resolution, and statistical methods, which require few resources but need a long, good-quality observation dataset. In this study, climate projections over France from the global atmospheric model ARPEGE are down-scaled with a dynamical method, performed with the ALADIN-Climate regional model, and a statistical method, performed with the DSClim software developed at CERFACS. The two down-scaling methods are presented and their results for the climate of the French mountains are evaluated for the current climate. Both methods give similar results for average snowfall. However, extreme events of total precipitation (droughts, intense precipitation events) are largely underestimated by the statistical method. Then, the results of both methods are compared for two future climate projections, according to the IPCC A1B greenhouse gas emissions scenario. The two methods agree on fewer frost days, a significant decrease in the amounts of solid precipitation and an average increase in the percentage of dry days of more than 10%. The results obtained for Corsica are more heterogeneous, but they are questionable because the reduced spatial domain is probably not very relevant for statistical sampling. (authors)

  16. A low-jitter RF PLL frequency synthesizer with high-speed mixed-signal down-scaling circuits

    International Nuclear Information System (INIS)

    Tang Lu; Wang Zhigong; Xue Hong; He Xiaohu; Xu Yong; Sun Ling

    2010-01-01

    A low-jitter RF phase locked loop (PLL) frequency synthesizer with high-speed mixed-signal down-scaling circuits is proposed. Several techniques are proposed to reduce the design complexity and improve the performance of the mixed-signal down-scaling circuit in the PLL. An improved D-latch is proposed to increase the speed and the driving capability of the dual-modulus prescaler (DMP) in the down-scaling circuit. By integrating the D-latch with 'OR' logic for dual-modulus operation, the delays associated with both the 'OR' and D-flip-flop (DFF) operations are reduced, and the complexity of the circuit is also decreased. The programmable frequency divider of the down-scaling circuit is realized with a new method based on deep submicron CMOS technology standard cells and a more accurate wire-load model. The charge pump in the PLL is also realized with a novel architecture to improve the current-matching characteristic and thereby reduce the jitter of the system. The proposed RF PLL frequency synthesizer is realized in a TSMC 0.18-μm CMOS process. The measured phase noise of the PLL frequency synthesizer output at 100 kHz offset from the center frequency is only -101.52 dBc/Hz. The circuit exhibits a low RMS jitter of 3.3 ps. The power consumption of the PLL frequency synthesizer is as low as 36 mW at a 1.8 V power supply.

  17. A rule-based smart automated fertilization and irrigation systems

    Science.gov (United States)

    Yousif, Musab El-Rashid; Ghafar, Khairuddin; Zahari, Rahimi; Lim, Tiong Hoo

    2018-04-01

    Smart automation in industries has become very important as it can improve the reliability and efficiency of systems. The use of smart technologies in agriculture has increased over the years to ensure and control crop production and address food security. However, it is important to use a proper irrigation system to avoid water wastage and overfeeding of the plants. In this paper, a smart rule-based automated fertilization and irrigation system is proposed and evaluated. We propose a rule-based decision making algorithm to monitor and control the food supply to the plant and the soil quality. A built-in alert system is also used to update the farmer with a text message. The system is developed and evaluated using real hardware.
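
    The decision logic of such a system reduces to a handful of sensor-threshold rules mapped to actuator actions and alerts. The sketch below is hypothetical: sensor names, thresholds and the alert action are assumptions, not taken from the paper.

```python
# Hypothetical sketch of rule-based irrigation/fertilization logic;
# sensors, thresholds and actions are illustrative only.

def irrigation_rules(soil_moisture_pct, nutrient_ppm, temp_c):
    actions = []
    if soil_moisture_pct < 30:
        actions.append("open water valve")
    elif soil_moisture_pct > 70:
        actions.append("close water valve")   # avoid water wastage
    if nutrient_ppm < 150:
        actions.append("dose fertilizer")     # dose only on a low reading
    if temp_c > 40:
        actions.append("send SMS alert to farmer")
    return actions or ["no action"]

print(irrigation_rules(soil_moisture_pct=25, nutrient_ppm=120, temp_c=42))
```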

  18. GRAMMAR RULE BASED INFORMATION RETRIEVAL MODEL FOR BIG DATA

    Directory of Open Access Journals (Sweden)

    T. Nadana Ravishankar

    2015-07-01

    Though Information Retrieval (IR) in big data has been an active field of research for the past few years, the popularity of native languages presents a unique challenge in big data information retrieval systems. There is a need to retrieve information which is present in English and display it in the native language for users. This aim of cross-language information retrieval is complicated by unique features of the native languages, such as morphology, compound word formation, word spelling variations, ambiguity, word synonymy and the influence of other languages. To overcome some of these issues, the native language is modeled using a grammar rule based approach in this work. The advantage of this approach is that the native language is modeled and its unique features are encoded using a set of inference rules. This rule base, coupled with the customized ontological system, shows considerable potential and is found to give better precision and recall.

  19. Designing Fuzzy Rule Based Expert System for Cyber Security

    OpenAIRE

    Goztepe, Kerim

    2016-01-01

    The state of cyber security has begun to attract more attention and interest outside the community of computer security experts. Cyber security is not a single problem, but rather a group of highly different problems involving different sets of threats. A fuzzy rule-based system for cyber security consists of a rule depository and a mechanism for accessing and running the rules. The depository is usually constructed with a collection of related rule sets. The aim of this study is to...

  20. An Embedded Rule-Based Diagnostic Expert System in Ada

    Science.gov (United States)

    Jones, Robert E.; Liberman, Eugene M.

    1992-01-01

    Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have also assumed a growing role in providing human-like reasoning capability for computer systems. The integration of expert system technology with the Ada programming language is discussed, especially a rule-based expert system using an ART-Ada (Automated Reasoning Tool for Ada) system shell. NASA Lewis was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components (the rule-based expert system, a graphics user interface, and communications software) make up SMART-Ada (Systems fault Management with ART-Ada). The rules were written in the ART-Ada development environment and converted to Ada source code. The graphics interface was developed with the Transportable Application Environment (TAE) Plus, which generates Ada source code to control graphics images. SMART-Ada communicates with a remote host to obtain either simulated or real data. The Ada source code generated with ART-Ada, TAE Plus, and communications code was incorporated into an Ada expert system that reads the data from a power distribution test bed, applies the rules to determine whether a fault exists, and graphically displays it on the screen. The main objective, to conduct a beta test of the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.

  1. Guidelines for visualizing and annotating rule-based models

    Science.gov (United States)

    Chylek, Lily A.; Hu, Bin; Blinov, Michael L.; Emonet, Thierry; Faeder, James R.; Goldstein, Byron; Gutenkunst, Ryan N.; Haugh, Jason M.; Lipniacki, Tomasz; Posner, Richard G.; Yang, Jin; Hlavacek, William S.

    2011-01-01

    Rule-based modeling provides a means to represent cell signaling systems in a way that captures site-specific details of molecular interactions. For rule-based models to be more widely understood and (re)used, conventions for model visualization and annotation are needed. We have developed the concepts of an extended contact map and a model guide for illustrating and annotating rule-based models. An extended contact map represents the scope of a model by providing an illustration of each molecule, molecular component, direct physical interaction, post-translational modification, and enzyme-substrate relationship considered in a model. A map can also illustrate allosteric effects, structural relationships among molecular components, and compartmental locations of molecules. A model guide associates elements of a contact map with annotation and elements of an underlying model, which may be fully or partially specified. A guide can also serve to document the biological knowledge upon which a model is based. We provide examples of a map and guide for a published rule-based model that characterizes early events in IgE receptor (FcεRI) signaling. We also provide examples of how to visualize a variety of processes that are common in cell signaling systems but not considered in the example model, such as ubiquitination. An extended contact map and an associated guide can document knowledge of a cell signaling system in a form that is visual as well as executable. As a tool for model annotation, a map and guide can communicate the content of a model clearly and with precision, even for large models. PMID:21647530

  2. Guidelines for visualizing and annotating rule-based models.

    Science.gov (United States)

    Chylek, Lily A; Hu, Bin; Blinov, Michael L; Emonet, Thierry; Faeder, James R; Goldstein, Byron; Gutenkunst, Ryan N; Haugh, Jason M; Lipniacki, Tomasz; Posner, Richard G; Yang, Jin; Hlavacek, William S

    2011-10-01

    Rule-based modeling provides a means to represent cell signaling systems in a way that captures site-specific details of molecular interactions. For rule-based models to be more widely understood and (re)used, conventions for model visualization and annotation are needed. We have developed the concepts of an extended contact map and a model guide for illustrating and annotating rule-based models. An extended contact map represents the scope of a model by providing an illustration of each molecule, molecular component, direct physical interaction, post-translational modification, and enzyme-substrate relationship considered in a model. A map can also illustrate allosteric effects, structural relationships among molecular components, and compartmental locations of molecules. A model guide associates elements of a contact map with annotation and elements of an underlying model, which may be fully or partially specified. A guide can also serve to document the biological knowledge upon which a model is based. We provide examples of a map and guide for a published rule-based model that characterizes early events in IgE receptor (FcεRI) signaling. We also provide examples of how to visualize a variety of processes that are common in cell signaling systems but not considered in the example model, such as ubiquitination. An extended contact map and an associated guide can document knowledge of a cell signaling system in a form that is visual as well as executable. As a tool for model annotation, a map and guide can communicate the content of a model clearly and with precision, even for large models.

  3. Rule-bases construction through self-learning for a table-based Sugeno-Takagi fuzzy logic control system

    Directory of Open Access Journals (Sweden)

    C. Boldisor

    2009-12-01

    A self-learning based methodology for building the rule-base of a fuzzy logic controller (FLC) is presented and verified, aiming to bring intelligent characteristics to fuzzy logic control systems. The methodology is a simplified version of those presented in today's literature. Some aspects are intentionally ignored, since they rarely appear in control system engineering, and a SISO process is considered here. The fuzzy inference system obtained is of the table-based Sugeno-Takagi type. The system's desired performance is defined by a reference model, and rules are extracted from recorded data after the correct control actions are learned. The presented algorithm is tested by constructing the rule-base of a fuzzy controller for a DC drive application. The system's performance and the method's viability are analyzed.

  4. Post-decomposition optimizations using pattern matching and rule-based clustering for multi-patterning technology

    Science.gov (United States)

    Wang, Lynn T.-N.; Madhavan, Sriram

    2018-03-01

    A pattern matching and rule-based polygon clustering methodology with DFM scoring is proposed to detect decomposition-induced manufacturability detractors and fix the layout designs prior to manufacturing. A pattern matcher scans the layout for pre-characterized patterns from a library. If a pattern is detected, rule-based clustering identifies the neighboring polygons that interact with those captured by the pattern. Then, DFM scores are computed for the possible layout fixes, and the fix with the best score is applied. The proposed methodology was applied to two 20 nm products with a chip area of 11 mm2 on the metal 2 layer. All the hotspots were resolved, and the number of DFM spacing violations decreased by 7-15%.

  5. Connecting clinical and actuarial prediction with rule-based methods.

    Science.gov (United States)

    Fokkema, Marjolein; Smits, Niels; Kelderman, Henk; Penninx, Brenda W J H

    2015-06-01

    Meta-analyses comparing the accuracy of clinical versus actuarial prediction have shown actuarial methods to outperform clinical methods, on average. However, actuarial methods are still not widely used in clinical practice, and there has been a call for the development of actuarial prediction methods for clinical practice. We argue that rule-based methods may be more useful than the linear main-effect models usually employed in prediction studies, from a data and decision analytic as well as a practical perspective. In addition, decision rules derived with rule-based methods can be represented as fast and frugal trees, which, unlike main-effect models, can be used in a sequential fashion, reducing the number of cues that have to be evaluated before making a prediction. We illustrate the usability of rule-based methods by applying RuleFit, an algorithm for deriving decision rules for classification and regression problems, to a dataset on the prediction of the course of depressive and anxiety disorders from Penninx et al. (2011). The RuleFit algorithm provided a model consisting of 2 simple decision rules, requiring evaluation of only 2 to 4 cues. The predictive accuracy of the 2-rule model was very similar to that of a logistic regression model incorporating 20 predictor variables, originally applied to the dataset; moreover, the 2-rule model required, on average, evaluation of only 3 cues. The RuleFit algorithm therefore appears to be a promising method for creating decision tools that are less time consuming and easier to apply in psychological practice, with accuracy comparable to traditional actuarial methods.
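
    A fast and frugal tree is just an ordered list of cue checks, each of which may exit with a decision immediately, so later cues often go unread. A sketch with hypothetical cues and cut-offs, not the rules derived by RuleFit in the study:

```python
# Illustration of the fast and frugal tree idea: cues are checked one at a
# time and any hit exits immediately. Cues and cut-offs are hypothetical.

# Ordered cue checks: (cue, cut-off, decision if the cue exceeds it)
TREE = [
    ("baseline_severity", 0.7, "chronic course"),
    ("symptom_duration_months", 24, "chronic course"),
]
DEFAULT = "favourable course"

def predict(case):
    """Return (decision, number of cues actually evaluated)."""
    for n, (cue, cutoff, decision) in enumerate(TREE, start=1):
        if case[cue] > cutoff:
            return decision, n     # early exit: remaining cues go unread
    return DEFAULT, len(TREE)

print(predict({"baseline_severity": 0.9, "symptom_duration_months": 6}))
# -> ('chronic course', 1): only one cue was needed
```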

  6. Hierarchical graphs for rule-based modeling of biochemical systems

    Directory of Open Access Journals (Sweden)

    Hu Bin

    2011-02-01

    Full Text Available Abstract Background In rule-based modeling, graphs are used to represent molecules: a colored vertex represents a component of a molecule, a vertex attribute represents the internal state of a component, and an edge represents a bond between components. Components of a molecule share the same color. Furthermore, graph-rewriting rules are used to represent molecular interactions. A rule that specifies addition (removal of an edge represents a class of association (dissociation reactions, and a rule that specifies a change of a vertex attribute represents a class of reactions that affect the internal state of a molecular component. A set of rules comprises an executable model that can be used to determine, through various means, the system-level dynamics of molecular interactions in a biochemical system. Results For purposes of model annotation, we propose the use of hierarchical graphs to represent structural relationships among components and subcomponents of molecules. We illustrate how hierarchical graphs can be used to naturally document the structural organization of the functional components and subcomponents of two proteins: the protein tyrosine kinase Lck and the T cell receptor (TCR complex. We also show that computational methods developed for regular graphs can be applied to hierarchical graphs. In particular, we describe a generalization of Nauty, a graph isomorphism and canonical labeling algorithm. The generalized version of the Nauty procedure, which we call HNauty, can be used to assign canonical labels to hierarchical graphs or more generally to graphs with multiple edge types. The difference between the Nauty and HNauty procedures is minor, but for completeness, we provide an explanation of the entire HNauty algorithm. Conclusions Hierarchical graphs provide more intuitive formal representations of proteins and other structured molecules with multiple functional components than do the regular graphs of current languages for

  7. Rule-Based Event Processing and Reaction Rules

    Science.gov (United States)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
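
    A reaction rule is often written as an event-condition-action (ECA) triple. The minimal sketch below runs a few such rules over a toy event stream; event types, conditions and actions are illustrative.

```python
# Minimal sketch of reaction (event-condition-action) rules over an event
# stream; events and rules are illustrative.

from collections import deque

events = deque([
    {"type": "order_placed", "amount": 120},
    {"type": "payment_failed", "amount": 120},
    {"type": "order_placed", "amount": 15},
])

# Reaction rules: ON event-type IF condition DO action
RULES = [
    ("payment_failed", lambda e: e["amount"] > 100,
     lambda e: print("notify fraud team:", e)),
    ("order_placed", lambda e: e["amount"] < 20,
     lambda e: print("auto-approve small order:", e)),
]

while events:
    event = events.popleft()
    for etype, condition, action in RULES:
        if event["type"] == etype and condition(event):
            action(event)   # invoke the action in response to the event
```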

  8. Biometric image enhancement using decision rule based image fusion techniques

    Science.gov (United States)

    Sagayee, G. Mary Amirtha; Arumugam, S.

    2010-02-01

    Introducing biometrics into information systems may yield considerable benefits. Most researchers confirm that the fingerprint is more widely used than the iris or face, and it is the primary choice for most privacy-sensitive applications. For fingerprint applications, choosing a proper sensor is a challenge. The proposed work addresses how image quality can be improved by introducing an image fusion technique at the sensor level. The resulting images, after applying the decision rule based image fusion technique, are evaluated and analyzed with respect to their entropy levels and root mean square error.
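
    The two reported evaluation metrics are straightforward to compute. The sketch below, using numpy only and toy images in place of real fingerprint data, shows one common way to measure the entropy of a fused image and its root mean square error against a reference.

        import numpy as np

        def entropy(img):
            """Shannon entropy of an 8-bit grayscale image, in bits."""
            hist, _ = np.histogram(img, bins=256, range=(0, 256))
            p = hist / hist.sum()
            p = p[p > 0]
            return float(-(p * np.log2(p)).sum())

        def rmse(reference, fused):
            diff = reference.astype(float) - fused.astype(float)
            return float(np.sqrt(np.mean(diff ** 2)))

        rng = np.random.default_rng(0)
        ref = rng.integers(0, 256, (64, 64))
        fused = np.clip(ref + rng.normal(0, 5, ref.shape), 0, 255).astype(int)
        print(f"entropy = {entropy(fused):.2f} bits, rmse = {rmse(ref, fused):.2f}")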

  9. ConGEMs: Condensed Gene Co-Expression Module Discovery Through Rule-Based Clustering and Its Application to Carcinogenesis

    Directory of Open Access Journals (Sweden)

    Saurav Mallik

    2017-12-01

    For transcriptomic analysis, there are numerous microarray-based genomic data, especially those generated for cancer research. The typical analysis measures the difference between a cancer sample-group and a matched control group for each transcript or gene. Association rule mining is used to discover interesting item sets through rule-based methodology. Thus, it has advantages in finding causal effect relationships between the transcripts. In this work, we introduce two new rule-based similarity measures, the weighted rank-based Jaccard and Cosine measures, and then propose a novel computational framework to detect condensed gene co-expression modules (ConGEMs) through the association rule-based learning system and the weighted similarity scores. In practice, the list of evolved condensed markers that consists of both singular and complex markers in nature depends on the corresponding condensed gene sets in either the antecedent or the consequent of the rules of the resultant modules. In our evaluation, these markers could be supported by literature evidence, KEGG (Kyoto Encyclopedia of Genes and Genomes) pathway and Gene Ontology annotations. Specifically, we preliminarily identified differentially expressed genes using an empirical Bayes test. A recently developed algorithm, RANWAR, was then utilized to determine the association rules from these genes. Based on that, we computed the integrated similarity scores of these rule-based similarity measures between each rule-pair, and the resultant scores were used for clustering to identify the co-expressed rule-modules. We applied our method to a gene expression dataset for lung squamous cell carcinoma and a genome methylation dataset for uterine cervical carcinogenesis. Our proposed module discovery method produced better results than the traditional gene-module discovery measures. In summary, our proposed rule-based method is useful for exploring biomarker modules from transcriptomic data.
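
    One plausible reading of a weighted rank-based Jaccard measure between two rules is sketched below: each rule is treated as the set of genes it mentions, each gene is weighted by the reciprocal of its differential-expression rank, and the similarity is the weighted overlap over the weighted union. This illustrates the general idea only; it is not the exact formula of the paper.

        def weighted_jaccard(rule_a, rule_b, rank):
            """rule_a, rule_b: sets of genes; rank: gene -> rank (1 = most significant)."""
            union = rule_a | rule_b
            w = {g: 1.0 / rank[g] for g in union}  # smaller rank -> larger weight
            inter = sum(w[g] for g in rule_a & rule_b)
            total = sum(w.values())
            return inter / total if total else 0.0

        rank = {"TP53": 1, "EGFR": 2, "KRAS": 3}
        print(weighted_jaccard({"TP53", "EGFR"}, {"TP53", "KRAS"}, rank))  # ~0.55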

  10. ConGEMs: Condensed Gene Co-Expression Module Discovery Through Rule-Based Clustering and Its Application to Carcinogenesis.

    Science.gov (United States)

    Mallik, Saurav; Zhao, Zhongming

    2017-12-28

    For transcriptomic analysis, there are numerous microarray-based genomic data, especially those generated for cancer research. The typical analysis measures the difference between a cancer sample-group and a matched control group for each transcript or gene. Association rule mining is used to discover interesting item sets through rule-based methodology. Thus, it has advantages in finding causal effect relationships between the transcripts. In this work, we introduce two new rule-based similarity measures, the weighted rank-based Jaccard and Cosine measures, and then propose a novel computational framework to detect condensed gene co-expression modules (ConGEMs) through the association rule-based learning system and the weighted similarity scores. In practice, the list of evolved condensed markers that consists of both singular and complex markers in nature depends on the corresponding condensed gene sets in either the antecedent or the consequent of the rules of the resultant modules. In our evaluation, these markers could be supported by literature evidence, KEGG (Kyoto Encyclopedia of Genes and Genomes) pathway and Gene Ontology annotations. Specifically, we preliminarily identified differentially expressed genes using an empirical Bayes test. A recently developed algorithm, RANWAR, was then utilized to determine the association rules from these genes. Based on that, we computed the integrated similarity scores of these rule-based similarity measures between each rule-pair, and the resultant scores were used for clustering to identify the co-expressed rule-modules. We applied our method to a gene expression dataset for lung squamous cell carcinoma and a genome methylation dataset for uterine cervical carcinogenesis. Our proposed module discovery method produced better results than the traditional gene-module discovery measures. In summary, our proposed rule-based method is useful for exploring biomarker modules from transcriptomic data.

  11. A hierarchical fuzzy rule-based approach to aphasia diagnosis.

    Science.gov (United States)

    Akbarzadeh-T, Mohammad-R; Moshtagh-Khorasani, Majid

    2007-10-01

    Aphasia diagnosis is a particularly challenging medical diagnostic task due to the linguistic uncertainty and vagueness, inconsistencies in the definition of aphasic syndromes, the large number of measurements with imprecision, and the natural diversity and subjectivity in test subjects as well as in the opinions of experts who diagnose the disease. To efficiently address this diagnostic process, a hierarchical fuzzy rule-based structure is proposed here that considers the effect of different features of aphasia through statistical analysis in its construction. This approach can be efficient for the diagnosis of aphasia, and possibly other medical diagnostic applications, due to its fuzzy and hierarchical reasoning construction. Initially, the symptoms of the disease, each of which consists of different features, are analyzed statistically. The statistical parameters measured from the training set are then used to define the membership functions and the fuzzy rules. The resulting two-layered fuzzy rule-based system is then compared with a back-propagation feed-forward neural network for the diagnosis of four aphasia types: Anomic, Broca, Global and Wernicke. In order to reduce the number of required inputs, the technique is applied and compared on both comprehensive and spontaneous speech tests. Statistical t-test analysis confirms that the proposed approach uses fewer aphasia features while also presenting a significant improvement in terms of accuracy.
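
    The statistics-to-membership-function step can be sketched simply: each linguistic term receives a Gaussian membership function whose centre and width are the mean and standard deviation of a feature over the training set. The feature, syndromes, and numbers below are illustrative, not the study's measured values.

        import math

        def gaussian_mf(mean, std):
            return lambda x: math.exp(-0.5 * ((x - mean) / std) ** 2)

        # Hypothetical per-syndrome statistics for one speech-fluency feature.
        mf_broca = gaussian_mf(mean=2.1, std=0.8)     # low fluency, typical of Broca
        mf_wernicke = gaussian_mf(mean=7.4, std=1.1)  # fluent but paraphasic speech

        score = 3.0
        print(f"Broca: {mf_broca(score):.2f}, Wernicke: {mf_wernicke(score):.2f}")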

  12. Rule-based expert system for maritime anomaly detection

    Science.gov (United States)

    Roy, Jean

    2010-04-01

    Maritime domain operators/analysts have a mandate to be aware of all that is happening within their areas of responsibility. This mandate derives from the needs to defend sovereignty, protect infrastructures, counter terrorism, detect illegal activities, etc., and it has become more challenging in the past decade, as commercial shipping turned into a potential threat. In particular, a huge portion of the data and information made available to the operators/analysts is mundane, from maritime platforms going about normal, legitimate activities, and it is very challenging for them to detect and identify the non-mundane. To achieve such anomaly detection, they must establish numerous relevant situational facts from a variety of sensor data streams. Unfortunately, many of the facts of interest just cannot be observed; the operators/analysts thus use their knowledge of the maritime domain and their reasoning faculties to infer these facts. As they are often overwhelmed by the large amount of data and information, automated reasoning tools could be used to support them by inferring the necessary facts, ultimately providing indications and warning on a small number of anomalous events worthy of their attention. Along this line of thought, this paper describes a proof-of-concept prototype of a rule-based expert system implementing automated rule-based reasoning in support of maritime anomaly detection.
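
    The kind of automated fact inference described here can be sketched as forward chaining over situational facts: rules fire repeatedly until no new facts can be derived. The facts and rules below are invented for illustration; the prototype's actual rule base is not given in the abstract.

        def infer(facts, rules):
            """Forward chaining: apply (premises, conclusion) rules to a set of facts."""
            derived = set(facts)
            changed = True
            while changed:
                changed = False
                for premises, conclusion in rules:
                    if premises <= derived and conclusion not in derived:
                        derived.add(conclusion)
                        changed = True
            return derived

        rules = [
            ({"ais_transponder_off", "inside_shipping_lane"}, "possible_dark_target"),
            ({"possible_dark_target", "night_time"}, "raise_indication_and_warning"),
        ]
        facts = {"ais_transponder_off", "inside_shipping_lane", "night_time"}
        print(infer(facts, rules))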

  13. Uncertain rule-based fuzzy systems introduction and new directions

    CERN Document Server

    Mendel, Jerry M

    2017-01-01

    The second edition of this textbook provides a fully updated approach to fuzzy sets and systems that can model uncertainty — i.e., “type-2” fuzzy sets and systems. The author demonstrates how to overcome the limitations of classical fuzzy sets and systems, enabling a wide range of applications from time-series forecasting to knowledge mining to control. In this new edition, a bottom-up approach is presented that begins by introducing classical (type-1) fuzzy sets and systems, and then explains how they can be modified to handle uncertainty. The author covers fuzzy rule-based systems – from type-1 to interval type-2 to general type-2 – in one volume. For hands-on experience, the book provides information on accessing MATLAB and Java software to complement the content. The book features a full suite of classroom material. Presents fully updated material on new breakthroughs in human-inspired rule-based techniques for handling real-world uncertainties; Allows those already familiar with type-1 fuzzy se...

  14. Fuzzy rule-based model for hydropower reservoirs operation

    Energy Technology Data Exchange (ETDEWEB)

    Moeini, R.; Afshar, A.; Afshar, M.H. [School of Civil Engineering, Iran University of Science and Technology, Tehran (Iran, Islamic Republic of)

    2011-02-15

    Real-time hydropower reservoir operation is a continuous decision-making process of determining the water level of a reservoir or the volume of water released from it. Hydropower operation is usually based on operating policies and rules defined and decided upon in strategic planning. This paper presents a fuzzy rule-based model for the operation of hydropower reservoirs. The proposed fuzzy rule-based model presents a set of suitable operating rules for release from the reservoir based on ideal or target storage levels. The model operates on an 'if-then' principle, in which the 'if' is a vector of fuzzy premises and the 'then' is a vector of fuzzy consequences. In this paper, reservoir storage, inflow, and period are used as premises and the release as the consequence. The steps involved in the development of the model include construction of membership functions for the inflow, storage and the release, formulation of fuzzy rules, implication, aggregation and defuzzification. The required knowledge base for the formulation of the fuzzy rules is obtained from a stochastic dynamic programming (SDP) model with a steady-state policy. The proposed model is applied to the hydropower operation of the 'Dez' reservoir in Iran and the results are presented and compared with those of the SDP model. The results indicate the ability of the method to solve hydropower reservoir operation problems. (author)
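
    The 'if-then' machinery can be illustrated with a minimal Mamdani-style sketch: triangular membership functions for storage and inflow, rule firing strengths computed with min (fuzzy AND), and a weighted-average defuzzification over crisp release consequents. All membership parameters and consequents below are invented; they are not the Dez reservoir rules derived from the SDP model.

        def tri(a, b, c):
            """Triangular membership function with support [a, c] and peak at b."""
            return lambda x: max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

        storage_low, storage_high = tri(0, 10, 60), tri(40, 90, 140)
        inflow_low, inflow_high = tri(0, 5, 30), tri(20, 45, 70)

        def release(storage, inflow):
            # (firing strength, crisp release consequent) for four fuzzy rules
            rules = [
                (min(storage_low(storage), inflow_low(inflow)), 5.0),    # conserve water
                (min(storage_low(storage), inflow_high(inflow)), 15.0),
                (min(storage_high(storage), inflow_low(inflow)), 25.0),
                (min(storage_high(storage), inflow_high(inflow)), 40.0), # avoid spill
            ]
            total = sum(w for w, _ in rules)
            return sum(w * r for w, r in rules) / total if total else 0.0

        print(f"release = {release(storage=70, inflow=35):.1f}")  # 40.0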

  15. Rule-based detection of intrathoracic airway trees

    International Nuclear Information System (INIS)

    Sonka, M.; Park, W.; Hoffman, E.A.

    1996-01-01

    New sensitive and reliable methods for assessing alterations in regional lung structure and function are critically important for the investigation and treatment of pulmonary diseases. Accurate identification of the airway tree will provide an assessment of airway structure and will provide a means by which multiple volumetric images of the lung at the same lung volume over time can be used to assess regional parenchymal changes. The authors describe a novel rule-based method for the segmentation of airway trees from three-dimensional (3-D) sets of computed tomography (CT) images, and its validation. The presented method takes advantage of a priori anatomical knowledge about pulmonary airway and vascular trees and their interrelationships. The method is based on a combination of 3-D seeded region growing that is used to identify large airways, rule-based two-dimensional (2-D) segmentation of individual CT slices to identify probable locations of smaller diameter airways, and merging of airway regions across the 3-D set of slices resulting in a tree-like airway structure. The method was validated in 40 3-mm-thick CT sections from five data sets of canine lungs scanned via electron beam CT in vivo with lung volume held at a constant pressure. The method's performance was compared with that of the conventional 3-D region growing method. The method substantially outperformed an existing conventional approach to airway tree detection
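
    The 3-D seeded region growing component can be sketched in a few lines: starting from a seed voxel, the region expands into 6-connected neighbours whose intensity stays below an air-like threshold. The toy volume and threshold below are illustrative; the paper's method adds rule-based 2-D analysis and cross-slice merging on top of this step.

        import numpy as np
        from collections import deque

        def region_grow_3d(volume, seed, threshold):
            grown = np.zeros(volume.shape, dtype=bool)
            queue = deque([seed])
            grown[seed] = True
            while queue:
                z, y, x = queue.popleft()
                for dz, dy, dx in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
                    n = (z + dz, y + dy, x + dx)
                    if all(0 <= c < s for c, s in zip(n, volume.shape)) \
                            and not grown[n] and volume[n] < threshold:
                        grown[n] = True
                        queue.append(n)
            return grown

        vol = np.zeros((5, 5, 5))          # air-like intensities everywhere...
        vol[:, :, 2:] = 100.0              # ...except a denser block that stops growth
        mask = region_grow_3d(vol, seed=(2, 2, 0), threshold=50.0)
        print(mask.sum(), "voxels grown")  # 50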

  16. A high-level language for rule-based modelling.

    Science.gov (United States)

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages.

  17. A rule-based automatic sleep staging method.

    Science.gov (United States)

    Liang, Sheng-Fu; Kuo, Chin-En; Hu, Yu-Han; Cheng, Yu-Shian

    2012-03-30

    In this paper, a rule-based automatic sleep staging method was proposed. Twelve features including temporal and spectral analyses of the EEG, EOG, and EMG signals were utilized. Normalization was applied to each feature to eliminate individual differences. A hierarchical decision tree with fourteen rules was constructed for sleep stage classification. Finally, a smoothing process considering the temporal contextual information was applied to ensure continuity. The overall agreement and kappa coefficient of the proposed method, applied to the all-night polysomnography (PSG) recordings of seventeen healthy subjects and compared with the manual scorings by R&K rules, reach 86.68% and 0.79, respectively. This method could be integrated with a portable PSG system for at-home sleep evaluation in the near future. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Design Transformations for Rule-based Procedural Modeling

    KAUST Repository

    Lienhard, Stefan; Lau, Cheryl; Müller, Pascal; Wonka, Peter; Pauly, Mark

    2017-01-01

    We introduce design transformations for rule-based procedural models, e.g., for buildings and plants. Given two or more procedural designs, each specified by a grammar, a design transformation combines elements of the existing designs to generate new designs. We introduce two technical components to enable design transformations. First, we extend the concept of discrete rule switching to rule merging, leading to a very large shape space for combining procedural models. Second, we propose an algorithm to jointly derive two or more grammars, called grammar co-derivation. We demonstrate two applications of our work: we show that our framework leads to a larger variety of models than previous work, and we show fine-grained transformation sequences between two procedural models.


  20. Genetic learning in rule-based and neural systems

    Science.gov (United States)

    Smith, Robert E.

    1993-01-01

    The design of neural networks and fuzzy systems can involve complex, nonlinear, and ill-conditioned optimization problems. Often, traditional optimization schemes are inadequate or inapplicable for such tasks. Genetic algorithms (GAs) are a class of optimization procedures whose mechanics are based on those of natural genetics. Mathematical arguments show how GAs bring substantial computational leverage to search problems, without requiring the mathematical characteristics often necessary for traditional optimization schemes (e.g., modality, continuity, availability of derivative information, etc.). GAs have proven effective in a variety of search tasks that arise in neural networks and fuzzy systems. This presentation begins by introducing the mechanism and theoretical underpinnings of GAs. GAs are then related to a class of rule-based machine learning systems called learning classifier systems (LCSs). An LCS implements a low-level production system that uses a GA as its primary rule discovery mechanism. This presentation illustrates how, despite its rule-based framework, an LCS can be thought of as a competitive neural network. Neural network simulator code for an LCS is presented. In this context, the GA is doing more than optimizing an objective function. It is searching for an ecology of hidden nodes with limited connectivity. The GA attempts to evolve this ecology such that effective neural network performance results. The GA is particularly well adapted to this task, given its naturally inspired basis. The LCS/neural network analogy extends itself to other, more traditional neural networks. Conclusions to the presentation discuss the implications of using GAs in ecological search problems that arise in neural and fuzzy systems.

  1. Frontal impact response of a virtual low percentile six years old human thorax developed by automatic down-scaling

    Directory of Open Access Journals (Sweden)

    Špička J.

    2015-06-01

    Traffic accidents cause some of the highest numbers of severe injuries across the whole population spectrum. The numbers of deaths and seriously injured citizens prove that traffic accidents and their consequences are still a serious problem to be solved. This paper contributes to the field of vehicle safety technology with a virtual approach. Exploitation of a previously developed scaling algorithm enables the creation of a specific anthropometric model based on a validated reference model. The aim of the paper is to prove the biofidelity in frontal impact of the low-percentile six-year-old virtual human model developed by automatic down-scaling. For this automatically developed six-year-old virtual anthropometric model, the Kroell impact test is simulated and the results are compared to the experimental data. The chosen approach shows good correspondence of the scaled model's performance to the experimental corridors.

  2. A rule-based stemmer for Arabic Gulf dialect

    Directory of Open Access Journals (Sweden)

    Belal Abuata

    2015-04-01

    Arabic dialects have been widely used instead of Modern Standard Arabic for many years in many fields. The presence of dialects in any language is a big challenge. Dialects add a new set of variational dimensions in fields like natural language processing, information retrieval and even Arabic chatting between different Arab nationals. Spoken dialects have no standard morphology, phonology or lexicon like Modern Standard Arabic. Hence, the objective of this paper is to describe a procedure or algorithm by which a stem for the Arabian Gulf dialect can be defined. The algorithm is rule based. Special rules are created to remove the suffixes and prefixes of dialect words, and the algorithm also applies rules related to word size and the relation between adjacent letters. The algorithm was tested on a number of words and gave a good correct-stem ratio. It was also compared with two Modern Standard Arabic algorithms; the results showed that Modern Standard Arabic stemmers perform poorly on the Arabic Gulf dialect, and that our algorithm performs poorly when applied to Modern Standard Arabic words.
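
    The core of such a stemmer is a pass of affix-stripping rules guarded by a word-size constraint. The sketch below uses a small generic subset of Arabic affixes and a minimum stem length of three letters; it is an illustration of the approach, not the paper's full Gulf-dialect rule set.

        PREFIXES = ["وال", "بال", "ال", "و", "ب"]       # conjunction, article, etc.
        SUFFIXES = ["هم", "ها", "كم", "ون", "ات", "ه"]  # pronoun and plural endings
        MIN_STEM = 3  # size rule: never reduce a word below three letters

        def stem(word):
            for p in PREFIXES:
                if word.startswith(p) and len(word) - len(p) >= MIN_STEM:
                    word = word[len(p):]
                    break
            for s in SUFFIXES:
                if word.endswith(s) and len(word) - len(s) >= MIN_STEM:
                    word = word[:-len(s)]
                    break
            return word

        print(stem("والكتاب"))  # strips the prefix, leaving the stem كتاب ("book")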

  3. Rule-Based Storytelling Text-to-Speech (TTS) Synthesis

    Directory of Open Access Journals (Sweden)

    Ramli Izzad

    2016-01-01

    In recent years, various real-life applications such as talking books, gadgets and humanoid robots have drawn attention to research in the area of expressive speech synthesis. Speech synthesis is widely used in various applications, but there is a growing need for expressive speech synthesis, especially for communication and robotics. In this paper, global and local rules are developed to convert neutral speech to storytelling-style speech for the Malay language. To generate the rules, modifications of prosodic parameters such as pitch, intensity, duration, tempo and pauses are considered. The modifications of prosodic parameters are derived by performing prosodic analysis on a story collected from an experienced female and a male storyteller. The global and local rules are applied at the sentence level and synthesized using HNM. Subjective tests are conducted to evaluate the quality of the synthesized storytelling speech for both rule sets, based on naturalness, intelligibility, and similarity to the original storytelling speech. The results showed that the global rules give better results than the local rules.

  4. A Novel Rules Based Approach for Estimating Software Birthmark

    Science.gov (United States)

    Binti Alias, Norma; Anwar, Sajid

    2015-01-01

    Software birthmark is a unique quality of software to detect software theft. Comparing birthmarks of software can tell us whether a program or software is a copy of another. Software theft and piracy are rapidly increasing problems of copying, stealing, and misusing the software without proper permission, as mentioned in the desired license agreement. The estimation of birthmark can play a key role in understanding the effectiveness of a birthmark. In this paper, a new technique is presented to evaluate and estimate software birthmark based on the two most sought-after properties of birthmarks, that is, credibility and resilience. For this purpose, the concept of soft computing such as probabilistic and fuzzy computing has been taken into account and fuzzy logic is used to estimate properties of birthmark. The proposed fuzzy rule based technique is validated through a case study and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. PMID:25945363

  5. A Rules-Based Simulation of Bacterial Turbulence

    Science.gov (United States)

    Mikel-Stites, Maxwell; Staples, Anne

    2015-11-01

    In sufficiently dense bacterial populations (>40% bacteria by volume), unusual collective swimming behaviors have been consistently observed, resembling von Karman vortex streets. The source of this collective swimming behavior has yet to be fully determined, and no research has yet established whether the behavior derives predominantly from the properties of the surrounding media or is an emergent behavior resulting from the "rules" governing the behavior of individual bacteria. The goal of this research is to ascertain whether it is possible to design a simulation that can replicate the qualitative behavior of densely packed bacterial populations using only behavioral rules to govern the actions of each bacterium, with the physical properties of the media neglected. The results of the simulation will address whether it is possible for the system's overall behavior to be driven exclusively by these rule-based dynamics. To examine this, the behavioral simulation was written in MATLAB on a fixed grid and updated sequentially with the bacterial behavior, including randomized tumbling, gathering and perceptual sub-functions. If the simulation is successful, it will serve as confirmation that it is possible to generate these qualitatively vortex-like behaviors without a specific physical medium (i.e., that the phenomenon arises in an emergent fashion from behavioral rules), or as evidence that the observed behavior requires some specific set of physical parameters.

  6. Automated rule-base control for nuclear power plants

    International Nuclear Information System (INIS)

    Colley, R.W.

    1983-09-01

    An effort is underway to optimize the roles of man and machine in the control of liquid-metal-cooled fast breeder reactors. The work reported here describes: (1) a methodology for the decomposition of a process into a hierarchical structure; (2) an explicit methodology, Sequencing Established States, to limit the state space search for process control; and (3) the Procedure Prompting System which demonstrates the use of the above methodologies for automatically generating instructions to provide guidance to an operator for both normal and off-normal plant conditions

  7. A new type of simplified fuzzy rule-based system

    Science.gov (United States)

    Angelov, Plamen; Yager, Ronald

    2012-02-01

    Over the last quarter of a century, two types of fuzzy rule-based (FRB) systems have dominated, namely the Mamdani and Takagi-Sugeno types. They use the same type of scalar fuzzy sets defined per input variable in their antecedent part, which are aggregated at the inference stage by t-norms or co-norms representing logical AND/OR operations. In this paper, we propose a significantly simplified alternative that defines the antecedent part of FRB systems by data Clouds and density distribution. This new type of FRB system goes further in conceptual and computational simplification while preserving the best features (flexibility, modularity, and human intelligibility) of its predecessors. The proposed concept offers an alternative non-parametric form of the rule antecedents, which fully reflects the real data distribution and does not require any explicit aggregation operations and scalar membership functions to be imposed. Instead, it derives the fuzzy membership of a particular data sample to a Cloud from the density distribution of the data associated with that Cloud. Contrast this with clustering, which is a parametric data-space decomposition/partitioning in which the fuzzy membership to a cluster is measured by the distance to the cluster centre/prototype, ignoring all the data that form that cluster or approximating their distribution. The proposed new approach takes into account fully and exactly the spatial distribution and similarity of all the real data through an innovative and much simplified form of the antecedent part. In this paper, we provide several numerical examples aiming to illustrate the concept.
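
    The density-based membership idea can be sketched as follows: the degree to which a sample belongs to a Cloud is its local density relative to the Cloud's own data, here via a Cauchy-type kernel around the Cloud mean. This is a simplified offline illustration of the concept, not the paper's recursive formulation.

        import numpy as np

        def cloud_membership(x, cloud):
            """x: query sample; cloud: array of samples associated with the Cloud."""
            mu = cloud.mean(axis=0)                    # focal point of the Cloud
            mean_sq = (cloud ** 2).sum(axis=1).mean()  # mean squared norm of the data
            # Cauchy-type local density: 1 at the mean, decaying with distance,
            # normalised by the scatter of the Cloud itself.
            return float(1.0 / (1.0 + ((x - mu) ** 2).sum() + mean_sq - (mu ** 2).sum()))

        cloud = np.array([[1.0, 1.0], [1.2, 0.9], [0.8, 1.1]])
        print(cloud_membership(np.array([1.0, 1.0]), cloud))  # close to 1 at the centre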

  8. Rule-based model of vein graft remodeling.

    Directory of Open Access Journals (Sweden)

    Minki Hwang

    When vein segments are implanted into the arterial system for use in arterial bypass grafting, adaptation to the higher pressure and flow of the arterial system is accomplished through wall thickening and expansion. These early remodeling events have been found to be closely coupled to the local hemodynamic forces, such as shear stress and wall tension, and are believed to be the foundation for later vein graft failure. To further our mechanistic understanding of the cellular and extracellular interactions that lead to global changes in tissue architecture, a rule-based modeling method is developed through the application of basic rules of behavior for these molecular and cellular activities. In the current method, smooth muscle cells (SMC), extracellular matrix (ECM), and monocytes are selected as the three components that occupy the elements of a grid system that comprises the developing vein graft intima. The probabilities of the cellular behaviors are developed based on data extracted from in vivo experiments. At each time step, the various probabilities are computed and applied to the SMC and ECM elements to determine their next physical state and behavior. One- and two-dimensional models are developed to test and validate the computational approach. The importance of monocyte infiltration, and the associated effect in augmenting extracellular matrix deposition, was evaluated and found to be an important component in model development. Final model validation is performed using an independent set of experiments, where model predictions of intimal growth are evaluated against experimental data obtained from the complex geometry and shear stress patterns offered by a mid-graft focal stenosis; simulation results show good agreement with the experimental data.

  9. Design of a Fuzzy Rule Base Expert System to Predict and Classify ...

    African Journals Online (AJOL)

    The main objective of designing a rule-based expert system using a fuzzy logic approach is to predict and forecast the risk level of cardiac patients in order to avoid sudden death. In the proposed system, uncertainty is captured using a rule base, and classification using fuzzy c-means clustering is discussed to overcome the risk level, ...

  10. Functional networks inference from rule-based machine learning models.

    Science.gov (United States)

    Lazzarini, Nicola; Widera, Paweł; Williamson, Stuart; Heer, Rakesh; Krasnogor, Natalio; Bacardit, Jaume

    2016-01-01

    Functional networks play an important role in the analysis of biological processes and systems. The inference of these networks from high-throughput (-omics) data is an area of intense research. So far, the similarity-based inference paradigm (e.g. gene co-expression) has been the most popular approach. It assumes a functional relationship between genes which are expressed at similar levels across different samples. An alternative to this paradigm is the inference of relationships from the structure of machine learning models. These models are able to capture complex relationships between variables, which are often different from or complementary to those found by similarity-based methods. We propose a protocol to infer functional networks from machine learning models, called FuNeL. It assumes that genes used together within a rule-based machine learning model to classify the samples might also be functionally related at a biological level. The protocol is first tested on synthetic datasets and then evaluated on a test suite of 8 real-world datasets related to human cancer. The networks inferred from the real-world data are compared against gene co-expression networks of equal size, generated with 3 different methods. The comparison is performed from two different points of view. We analyse the enriched biological terms in the set of network nodes and the relationships between known disease-associated genes in the context of the network topology. The comparison confirms both the biological relevance and the complementary character of the knowledge captured by the FuNeL networks in relation to similarity-based methods and demonstrates its potential to identify known disease associations as core elements of the network. Finally, using a prostate cancer dataset as a case study, we confirm that the biological knowledge captured by our method is relevant to the disease and consistent with the specialised literature and with an independent dataset not used in the inference process. The

  11. Selecting Tanker Steaming Speeds under Uncertainty: A Rule-Based Bayesian Reasoning Approach

    Directory of Open Access Journals (Sweden)

    N.S.F. Abdul Rahman

    2015-06-01

    In the tanker industry, there are many uncertain conditions that tanker companies have to deal with, for example the global financial crisis and economic recession, increases in bunker fuel prices, and global climate change. Such conditions have forced tanker companies to change tanker speeds from full speed to slow speed, extra slow speed and super slow speed. Given such conditions, the objective of this paper is to present a methodology for determining the vessel speeds of tankers that minimize the cost of the vessels. The four levels of vessel speed in the tanker industry are investigated, incorporating a number of uncertain conditions, by developing a scientific model using a rule-based Bayesian reasoning method. The proposed model has produced 96 rules that can be used as guidance in the decision-making process. Such results help tanker companies to determine the appropriate vessel speed to be used in a dynamic operational environment.

  12. Rough set and rule-based multicriteria decision aiding

    Directory of Open Access Journals (Sweden)

    Roman Slowinski

    2012-08-01

    The aim of multicriteria decision aiding is to give the decision maker a recommendation concerning a set of objects evaluated from multiple points of view called criteria. Since a rational decision maker acts with respect to his/her value system, in order to recommend the most-preferred decision, one must identify the decision maker's preferences. In this paper, we focus on preference discovery from data concerning some past decisions of the decision maker. We consider the preference model in the form of a set of "if..., then..." decision rules discovered from the data by inductive learning. To structure the data prior to the induction of rules, we use the Dominance-based Rough Set Approach (DRSA). DRSA is a methodology for reasoning about data which handles ordinal evaluations of objects on the considered criteria and monotonic relationships between these evaluations and the decision. We review applications of DRSA to a large variety of multicriteria decision problems.

  13. Associations between rule-based parenting practices and child screen viewing: A cross-sectional study

    Directory of Open Access Journals (Sweden)

    Joanna M. Kesten

    2015-01-01

    Conclusions: Limit setting is associated with greater screen viewing (SV). Collaborative rule setting may be effective for managing boys' game-console use. More research is needed to understand rule-based parenting practices.

  14. Exploration of SWRL Rule Bases through Visualization, Paraphrasing, and Categorization of Rules

    Science.gov (United States)

    Hassanpour, Saeed; O'Connor, Martin J.; Das, Amar K.

    Rule bases are increasingly being used as repositories of knowledge content on the Semantic Web. As the size and complexity of these rule bases increases, developers and end users need methods of rule abstraction to facilitate rule management. In this paper, we describe a rule abstraction method for Semantic Web Rule Language (SWRL) rules that is based on lexical analysis and a set of heuristics. Our method results in a tree data structure that we exploit in creating techniques to visualize, paraphrase, and categorize SWRL rules. We evaluate our approach by applying it to several biomedical ontologies that contain SWRL rules, and show how the results reveal rule patterns within the rule base. We have implemented our method as a plug-in tool for Protégé-OWL, the most widely used ontology modeling software for the Semantic Web. Our tool can allow users to rapidly explore content and patterns in SWRL rule bases, enabling their acquisition and management.

  15. Design and performance of a rule-based controller in a naturally ventilated room

    OpenAIRE

    Marjanovic-Halburd, Ljiljana; Angelov, P.; Eftekhari, M. M.

    2003-01-01

    This paper reflects the final phase of the EPSRC project, and the PhD work of Marjanovic, on rule-based control in naturally ventilated buildings. Marjanovic is the second author. Eftekhari was her PhD supervisor.

  16. Rule-Based Analytic Asset Management for Space Exploration Systems (RAMSES), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Payload Systems Inc. (PSI) and the Massachusetts Institute of Technology (MIT) were selected to jointly develop the Rule-based Analytic Asset Management for Space...

  17. Strategy-Driven Exploration for Rule-Based Models of Biochemical Systems with Porgy

    OpenAIRE

    Andrei, Oana; Fernández, Maribel; Kirchner, Hélène; Pinaud, Bruno

    2016-01-01

    This paper presents Porgy – an interactive visual environment for rule-based modelling of biochemical systems. We model molecules and molecule interactions as port graphs and port graph rewrite rules, respectively. We use rewriting strategies to control which rules to apply, and where and when to apply them. Our main contributions to rule-based modelling of biochemical systems lie in the strategy language and the associated visual and interactive features offered by Porgy. These features faci...

  18. A Fuzzy Rule-Based Expert System for Evaluating Intellectual Capital

    Directory of Open Access Journals (Sweden)

    Mohammad Hossein Fazel Zarandi

    2012-01-01

    A fuzzy rule-based expert system is developed for evaluating intellectual capital. A fuzzy linguistic approach assists managers to understand and evaluate the level of each intellectual capital item. The proposed fuzzy rule-based expert system applies fuzzy linguistic variables to express the level of qualitative evaluation and criteria of experts. Feasibility of the proposed model is demonstrated by the result of intellectual capital performance evaluation for a sample company.

  19. Analysis and minimization of overtraining effect in rule-based classifiers for computer-aided diagnosis

    International Nuclear Information System (INIS)

    Li Qiang; Doi Kunio

    2006-01-01

    Computer-aided diagnostic (CAD) schemes have been developed to assist radiologists detect various lesions in medical images. In CAD schemes, classifiers play a key role in achieving a high lesion detection rate and a low false-positive rate. Although many popular classifiers such as linear discriminant analysis and artificial neural networks have been employed in CAD schemes for reduction of false positives, a rule-based classifier has probably been the simplest and most frequently used one since the early days of development of various CAD schemes. However, with existing rule-based classifiers, there are major disadvantages that significantly reduce their practicality and credibility. The disadvantages include manual design, poor reproducibility, poor evaluation methods such as resubstitution, and a large overtraining effect. An automated rule-based classifier with a minimized overtraining effect can overcome or significantly reduce the extent of the above-mentioned disadvantages. In this study, we developed an 'optimal' method for the selection of cutoff thresholds and a fully automated rule-based classifier. Experimental results performed with Monte Carlo simulation and a real lung nodule CT data set demonstrated that the automated threshold selection method can completely eliminate overtraining effect in the procedure of cutoff threshold selection, and thus can minimize overall overtraining effect in the constructed rule-based classifier. We believe that this threshold selection method is very useful in the construction of automated rule-based classifiers with minimized overtraining effect

  20. Scalable rule-based modelling of allosteric proteins and biochemical networks.

    Directory of Open Access Journals (Sweden)

    Julien F Ollivier

    2010-11-01

    Much of the complexity of biochemical networks comes from the information-processing abilities of allosteric proteins, be they receptors, ion channels, signalling molecules or transcription factors. An allosteric protein can be uniquely regulated by each combination of input molecules that it binds. This "regulatory complexity" causes a combinatorial increase in the number of parameters required to fit experimental data as the number of protein interactions increases. It therefore challenges the creation, updating, and re-use of biochemical models. Here, we propose a rule-based modelling framework that exploits the intrinsic modularity of protein structure to address regulatory complexity. Rather than treating proteins as "black boxes", we model their hierarchical structure and their internal dynamics as conformational changes. By modelling the regulation of allosteric proteins through these conformational changes, we often decrease the number of parameters required to fit data, and so reduce over-fitting and improve the predictive power of a model. Our method is thermodynamically grounded, imposes detailed balance, and also includes molecular cross-talk and the background activity of enzymes. We use our Allosteric Network Compiler to examine how allostery can facilitate macromolecular assembly and how competitive ligands can change the observed cooperativity of an allosteric protein. We also develop a parsimonious model of G protein-coupled receptors that explains functional selectivity and can predict the rank order of potency of agonists acting through a receptor. Our methodology should provide a basis for scalable, modular and executable modelling of biochemical networks in systems and synthetic biology.

  1. A two-stage stochastic rule-based model to determine pre-assembly buffer content

    Science.gov (United States)

    Gunay, Elif Elcin; Kula, Ufuk

    2018-01-01

    This study considers the instant decision-making needs of automobile manufacturers for resequencing vehicles before final assembly (FA). We propose a rule-based two-stage stochastic model to determine the number of spare vehicles that should be kept in the pre-assembly buffer to restore the sequence altered by paint defects and upstream department constraints. The first stage of the model decides the spare vehicle quantities, while the second stage recovers the scrambled sequence with respect to pre-defined rules. The problem is solved by a sample average approximation (SAA) algorithm. We conduct a numerical study to compare the solutions of the heuristic model with optimal ones and provide the following insights: (i) as the mismatch between the paint entrance and scheduled sequences decreases, the rule-based heuristic model recovers the scrambled sequence as well as the optimal resequencing model; (ii) the rule-based model is more sensitive to the mismatch between the paint entrance and scheduled sequences when recovering the scrambled sequence; (iii) as the defect rate increases, the difference in recovery effectiveness between the rule-based heuristic and optimal solutions increases; (iv) as buffer capacity increases, the recovery effectiveness of the optimization model outperforms the heuristic model; and (v) as expected, the rule-based model holds more inventory than the optimization model.

  2. DEVELOP-FPS: a First Person Shooter Development Tool for Rule-based Scripts

    Directory of Open Access Journals (Sweden)

    Bruno Correia

    2012-09-01

    We present DEVELOP-FPS, a software tool specially designed for the development of First Person Shooter (FPS) players controlled by rule-based scripts. DEVELOP-FPS may be used by FPS developers to create, debug, maintain and compare rule-based player behaviours, providing a set of useful functionalities: (i) easy preparation of the right scenarios for game debugging and testing; (ii) control of the game execution: users can stop and resume the game execution at any instant, monitor and control every player in the game, monitor the state of each player and their rule base activation, and issue commands to control their behaviour; and (iii) automatic running of a certain number of game executions and collection of data in order to evaluate and compare the players' performance over a sufficient number of similar experiments.

  3. Automatic detection of esophageal pressure events. Is there an alternative to rule-based criteria?

    DEFF Research Database (Denmark)

    Kruse-Andersen, S; Rütz, K; Kolberg, Jens Godsk

    1995-01-01

    of relevant pressure peaks at the various recording levels. Until now, this selection has been performed entirely by rule-based systems, requiring each pressure deflection to fit within predefined rigid numerical limits in order to be detected. However, due to great variations in the shapes of the pressure...... curves generated by muscular contractions, rule-based criteria do not always select the pressure events most relevant for further analysis. We have therefore been searching for a new concept for automatic event recognition. The present study describes a new system, based on the method of neurocomputing.......79-0.99 and accuracies of 0.89-0.98, depending on the recording level within the esophageal lumen. The neural networks often recognized peaks that clearly represented true contractions but that had been rejected by a rule-based system. We conclude that neural networks have potential for automatic detection

  4. Diversity of Rule-based Approaches: Classic Systems and Recent Applications

    Directory of Open Access Journals (Sweden)

    Grzegorz J. Nalepa

    2016-11-01

    Rules are a common symbolic model of knowledge. Rule-based systems share roots in cognitive science and artificial intelligence. In the former, they are mostly used in cognitive architectures; in the latter, they are developed in several domains including knowledge engineering and machine learning. This paper aims to give an overview of these issues with a focus on the current research perspective of artificial intelligence. Moreover, in this setting we discuss our results in the design of rule-based systems and their applications in context-aware and business intelligence systems.

  5. Research on Fault Diagnosis Method Based on Rule Base Neural Network

    Directory of Open Access Journals (Sweden)

    Zheng Ni

    2017-01-01

    The relationship between fault phenomena and fault causes is always nonlinear, which influences the accuracy of fault location, and neural networks are effective in dealing with nonlinear problems. In order to improve the efficiency of uncertain fault diagnosis based on neural networks, a neural network fault diagnosis method based on a rule base is put forward. First, the structure of a BP neural network is built and the learning rule is given. Then, the rule base is built using fuzzy theory. An improved fuzzy neural construction model is designed, in which the calculation methods for the node function and membership function are also given. Simulation results confirm the effectiveness of this method.

  6. A Belief Rule Based Expert System to Assess Mental Disorder under Uncertainty

    DEFF Research Database (Denmark)

    Hossain, Mohammad Shahadat; Afif Monrat, Ahmed; Hasan, Mamun

    2016-01-01

    Mental disorder is a change of mental or behavioral pattern that causes suffering and impairs the ability to function in ordinary life. In psychopathology, the assessment methods of mental disorder contain various types of uncertainties associated with signs and symptoms. This study identifies... to ignorance, incompleteness, and randomness. So, a belief rule-based expert system (BRBES) has been designed and developed with the capability of handling the uncertainties mentioned. Evidential reasoning works as the inference engine and the belief rule base as the knowledge representation schema...

  7. Annotation of rule-based models with formal semantics to enable creation, analysis, reuse and visualization

    Science.gov (United States)

    Misirli, Goksel; Cavaliere, Matteo; Waites, William; Pocock, Matthew; Madsen, Curtis; Gilfellon, Owen; Honorato-Zimmer, Ricardo; Zuliani, Paolo; Danos, Vincent; Wipat, Anil

    2016-01-01

    Motivation: Biological systems are complex and challenging to model and therefore model reuse is highly desirable. To promote model reuse, models should include both information about the specifics of simulations and the underlying biology in the form of metadata. The availability of computationally tractable metadata is especially important for the effective automated interpretation and processing of models. Metadata are typically represented as machine-readable annotations which enhance programmatic access to information about models. Rule-based languages have emerged as a modelling framework to represent the complexity of biological systems. Annotation approaches have been widely used for reaction-based formalisms such as SBML. However, rule-based languages still lack a rich annotation framework to add semantic information, such as machine-readable descriptions, to the components of a model. Results: We present an annotation framework and guidelines for annotating rule-based models, encoded in the commonly used Kappa and BioNetGen languages. We adapt widely adopted annotation approaches to rule-based models. We initially propose a syntax to store machine-readable annotations and describe a mapping between rule-based modelling entities, such as agents and rules, and their annotations. We then describe an ontology to both annotate these models and capture the information contained therein, and demonstrate annotating these models using examples. Finally, we present a proof of concept tool for extracting annotations from a model that can be queried and analyzed in a uniform way. The uniform representation of the annotations can be used to facilitate the creation, analysis, reuse and visualization of rule-based models. Although examples are given, using specific implementations the proposed techniques can be applied to rule-based models in general. Availability and implementation: The annotation ontology for rule-based models can be found at http

  8. Down-scaling wind energy resource from mesoscale to local scale by nesting and data assimilation with a CFD model

    International Nuclear Information System (INIS)

    Duraisamy Jothiprakasam, Venkatesh

    2014-01-01

    The development of wind energy generation requires precise and well-established methods for wind resource assessment, which is the initial step in every wind farm project. During the last two decades, linear flow models were widely used in the wind industry for wind resource assessment and micro-siting. But the linear models' inaccuracies in predicting wind speeds in very complex terrain are well known and led to the use of CFD, capable of modeling the complex flow in detail around specific geographic features. Mesoscale numerical weather prediction (NWP) models are able to predict the wind regime at resolutions of several kilometers, but are not well suited to resolve the wind speed and turbulence induced by topographic features on the scale of a few hundred meters. CFD has proven successful in capturing flow details at smaller scales, but needs an accurate specification of the inlet conditions. Thus coupling NWP and CFD models is a better modeling approach for wind energy applications. A one-year field measurement campaign carried out in complex terrain in southern France during 2007-2008 provides a well-documented data set for both input and validation. The proposed new methodology aims to address two problems: the high spatial variation of the topography on the domain lateral boundaries, and the prediction errors of the mesoscale model. It is applied in this work using the open source CFD code Code-Saturne, coupled with the mesoscale forecast model of Meteo-France (ALADIN). The improvement is obtained by combining the mesoscale data as inlet conditions and field measurement data assimilation into the CFD model. The Newtonian relaxation (nudging) data assimilation technique is used to incorporate the measurement data into the CFD simulations. The methodology to reconstruct long-term averages uses a clustering process to group similar meteorological conditions and to reduce the number of CFD simulations needed to reproduce 1 year of atmospheric flow over the site. The assimilation
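
    The nudging term itself is simple: at observation points, the model tendency is augmented by (observation - model)/tau, pulling the simulated state toward the measurements with relaxation time tau. The toy one-dimensional example below is illustrative only; the stand-in 'model' is a simple decay term, not Code-Saturne.

        import numpy as np

        def nudged_step(u, model_tendency, obs, obs_mask, tau, dt):
            """One explicit step of du/dt = F(u) + mask * (u_obs - u) / tau."""
            relaxation = np.where(obs_mask, (obs - u) / tau, 0.0)
            return u + dt * (model_tendency(u) + relaxation)

        u = np.array([8.0, 9.0, 10.0])        # wind speed at three cells
        obs = np.array([0.0, 7.5, 0.0])       # a measurement exists at the middle cell
        mask = np.array([False, True, False])
        decay = lambda v: -0.01 * v           # stand-in for the flow solver's tendency
        for _ in range(100):
            u = nudged_step(u, decay, obs, mask, tau=50.0, dt=1.0)
        print(u.round(2))  # the observed cell is pulled toward 7.5; the others decay freely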

  9. Generation of facial expressions from emotion using a fuzzy rule based system

    NARCIS (Netherlands)

    Bui, T.D.; Heylen, Dirk K.J.; Poel, Mannes; Nijholt, Antinus; Stumptner, Markus; Corbett, Dan; Brooks, Mike

    2001-01-01

    We propose a fuzzy rule-based system to map representations of the emotional state of an animated agent onto muscle contraction values for the appropriate facial expressions. Our implementation pays special attention to the way in which continuous changes in the intensity of emotions can be

  10. Automated implementation of rule-based expert systems with neural networks for time-critical applications

    Science.gov (United States)

    Ramamoorthy, P. A.; Huang, Song; Govind, Girish

    1991-01-01

    In fault diagnosis, control, and real-time monitoring, both timing and accuracy are critical for operators or machines to reach proper solutions or take appropriate actions. Expert systems are becoming more popular in the manufacturing community for dealing with such problems. In recent years, neural networks have seen a revival and their applications have spread to many areas of science and engineering. A method of using neural networks to implement rule-based expert systems for time-critical applications is discussed here. This method can convert a given rule-based system into a neural network with fixed weights and thresholds. The rules governing the translation are presented along with some examples. We also present the results of automated machine implementation of such networks from a given rule base. This significantly simplifies the translation from conventional rule-based systems to neural network expert systems. Results comparing the performance of the proposed neural network approach with the classical approach are given. The possibility of very large scale integration (VLSI) realization of such neural network expert systems is also discussed.
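
    The translation idea for conjunctive rules can be made concrete: an AND of n binary conditions becomes a threshold unit with unit weights and threshold n, and the OR over rules becomes a unit with threshold 1, giving a fixed-weight two-layer network. This generic construction is a sketch in the spirit of the paper, not its exact compiler.

        import numpy as np

        step = lambda z: (z >= 0).astype(float)  # hard threshold activation

        # Rules over binary inputs (x1, x2, x3):
        #   R1: IF x1 AND x2 THEN fault;  R2: IF x3 THEN fault
        W_hidden = np.array([[1.0, 1.0, 0.0],    # R1: AND of x1 and x2
                             [0.0, 0.0, 1.0]])   # R2: x3 alone
        b_hidden = np.array([-2.0, -1.0])        # thresholds: 2 for the AND, 1 for x3
        W_out = np.array([[1.0, 1.0]])           # OR over the rule activations
        b_out = np.array([-1.0])                 # fires if at least one rule fired

        def rule_network(x):
            h = step(W_hidden @ x + b_hidden)    # one hidden neuron per rule
            return step(W_out @ h + b_out)[0]

        print(rule_network(np.array([1.0, 1.0, 0.0])))  # 1.0 (R1 fires)
        print(rule_network(np.array([1.0, 0.0, 0.0])))  # 0.0 (no rule fires)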

  11. Control of Angra 1' PZR by a fuzzy rule base build through genetic programming

    International Nuclear Information System (INIS)

    Caldas, Gustavo Henrique Flores; Schirru, Roberto

    2002-01-01

    There is an optimum pressure for the normal operation of nuclear power plant reactors, and there are thresholds that must be respected during transients, which makes the pressurizer an important control mechanism. Inside a pressurizer there are heaters and a shower. Through their actuation levels, they control the vapor pressure inside the pressurizer and, consequently, inside the primary circuit. Therefore, control of the pressurizer consists in controlling the actuation levels of the heaters and of the shower. In the present work this function is implemented through a fuzzy controller. Besides being an efficient way of exerting control, this approach offers the possibility of extracting knowledge of how this control is being made. A fuzzy controller consists basically of an inference machine and a rule base, the latter constructed from specialized knowledge. In some circumstances, however, this knowledge is not accurate, and may lead to inefficient results. With the development of artificial intelligence techniques, methods were found to substitute for specialists, simulating their knowledge. Genetic programming is an evolutionary algorithm particularly efficient in manipulating rule base structures. In this work genetic programming was used as a substitute for the specialist. The goal is to test whether an irrational object, a computer, is capable, by itself, of finding a rule base that reproduces a pre-established actuation-level profile. The result is positive: a fuzzy rule base was discovered with insignificant error, a remarkable result that proves the efficiency of the approach. (author)

  12. Rule-based emotion detection on social media : putting tweets on Plutchik's wheel

    NARCIS (Netherlands)

    Tromp, E.; Pechenizkiy, M.

    2014-01-01

    We study sentiment analysis beyond the typical granularity of polarity and instead use Plutchik's wheel of emotions model. We introduce RBEM-Emo as an extension to the Rule-Based Emission Model algorithm to deduce such emotions from human-written messages. We evaluate our approach on two different

  13. Knowledge rule base for the beam optics program TRACE 3-D

    International Nuclear Information System (INIS)

    Gillespie, G.H.; Van Staagen, P.K.; Hill, B.W.

    1993-01-01

    An expert system type of knowledge rule base has been developed for the input parameters used by the particle beam transport program TRACE 3-D. The goal has been to provide the program's user with adequate on-screen information to allow him to initially set up a problem with minimal "off-line" calculations. The focus of this work has been in developing rules for the parameters which define the beam line transport elements. Ten global parameters, the particle mass and charge, beam energy, etc., are used to provide "expert" estimates of lower and upper limits for each of the transport element parameters. For example, the limits for the field strength of the quadrupole element are based on a water-cooled, iron-core electromagnet with dimensions derived from practical engineering constraints, and the upper limit for the effective length is scaled with the particle momentum so that initially parallel trajectories do not cross the axis inside the magnet. Limits for the quadrupole doublet and triplet parameters incorporate these rules and additional rules based on stable FODO lattices and bidirectional focusing requirements. The structure of the rule base is outlined and examples for the quadrupole singlet, doublet and triplet are described. The rule base has been implemented within the Shell for Particle Accelerator Related Codes (SPARC) graphical user interface (GUI).
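
    The actual TRACE 3-D rules are not reproduced in the record; the sketch below only illustrates the flavor of the effective-length rule, assuming the textbook relation that an initially parallel ray in a focusing quadrupole of strength k = qB'/p crosses the axis once the phase advance sqrt(k)*L reaches pi/2:

```python
import math

# Illustrative limit rule in the spirit of the TRACE 3-D knowledge base
# (not the actual rules from the paper). For a focusing quadrupole with
# strength k = q*B'/p [1/m^2], an initially parallel trajectory crosses
# the axis when the phase advance sqrt(k)*L reaches pi/2, so the rule
# caps the effective length below that value.

def max_effective_length(gradient_T_per_m, momentum_GeV_per_c, charge=1):
    # Magnetic rigidity: B*rho [T*m] ~= 3.3356 * p [GeV/c] for unit charge
    b_rho = 3.3356 * momentum_GeV_per_c / charge
    k = gradient_T_per_m / b_rho          # quadrupole strength [1/m^2]
    return (math.pi / 2) / math.sqrt(k)   # length where a parallel ray reaches the axis

print(f"{max_effective_length(10.0, 1.0):.3f} m")  # e.g. 10 T/m quad, 1 GeV/c beam
```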

  14. Techniques and implementation of the embedded rule-based expert system using Ada

    Science.gov (United States)

    Liberman, Eugene M.; Jones, Robert E.

    1991-01-01

    Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have assumed a growing role in providing human-like reasoning capability and expertise for computer systems. The integration of expert system technology with the Ada programming language, specifically a rule-based expert system using the ART-Ada (Automated Reasoning Tool for Ada) system shell, is discussed. The NASA Lewis Research Center was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components, the rule-based expert system, a graphics user interface, and communications software, make up SMART-Ada (Systems fault Management with ART-Ada). The main objective, to conduct a beta test of the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.

  15. Rule-based versus probabilistic selection for active surveillance using three definitions of insignificant prostate cancer

    NARCIS (Netherlands)

    L.D.F. Venderbos (Lionne); M.J. Roobol-Bouts (Monique); C.H. Bangma (Chris); R.C.N. van den Bergh (Roderick); L.P. Bokhorst (Leonard); D. Nieboer (Daan); Godtman, R; J. Hugosson (Jonas); van der Kwast, T; E.W. Steyerberg (Ewout)

    2016-01-01

    textabstractTo study whether probabilistic selection by the use of a nomogram could improve patient selection for active surveillance (AS) compared to the various sets of rule-based AS inclusion criteria currently used. We studied Dutch and Swedish patients participating in the European Randomized

  16. Evaluation of Rule-based Modularization in Model Transformation Languages illustrated with ATL

    NARCIS (Netherlands)

    Ivanov, Ivan; van den Berg, Klaas; Jouault, Frédéric

    This paper studies ways for modularizing transformation definitions in current rule-based model transformation languages. Two scenarios are shown in which the modular units are identified on the basis of the relations between source and target metamodels and on the basis of generic transformation

  17. Rule-based modularization in model transformation languages illustrated with ATL

    NARCIS (Netherlands)

    Ivanov, Ivan; van den Berg, Klaas; Jouault, Frédéric

    2007-01-01

    This paper studies ways for modularizing transformation definitions in current rule-based model transformation languages. Two scenarios are shown in which the modular units are identified on the basis of relations between source and target metamodels and on the basis of generic transformation

  18. Rule-based category learning in children: the role of age and executive functioning.

    Directory of Open Access Journals (Sweden)

    Rahel Rabi

    Full Text Available Rule-based category learning was examined in 4-11 year-olds and adults. Participants were asked to learn a set of novel perceptual categories in a classification learning task. Categorization performance improved with age, with younger children showing the strongest rule-based deficit relative to older children and adults. Model-based analyses provided insight regarding the type of strategy being used to solve the categorization task, demonstrating that the use of the task appropriate strategy increased with age. When children and adults who identified the correct categorization rule were compared, the performance deficit was no longer evident. Executive functions were also measured. While both working memory and inhibitory control were related to rule-based categorization and improved with age, working memory specifically was found to marginally mediate the age-related improvements in categorization. When analyses focused only on the sample of children, results showed that working memory ability and inhibitory control were associated with categorization performance and strategy use. The current findings track changes in categorization performance across childhood, demonstrating at which points performance begins to mature and resemble that of adults. Additionally, findings highlight the potential role that working memory and inhibitory control may play in rule-based category learning.

  19. Using Rule-Based Computer Programming to Unify Communication Rules Research.

    Science.gov (United States)

    Sanford, David L.; Roach, J. W.

    This paper proposes the use of a rule-based computer programming language as a standard for the expression of rules, arguing that the adoption of a standard would enable researchers to communicate about rules in a consistent and significant way. Focusing on the formal equivalence of artificial intelligence (AI) programming to different types of…

  20. Rule-based conversion of closely-related languages: a Dutch-to-Afrikaans convertor

    CSIR Research Space (South Africa)

    Van Huyssteen, GB

    2009-11-01

    Full Text Available and performance of a rule-based Dutch-to-Afrikaans converter, with the aim of transforming Dutch text so that it looks more like an Afrikaans text (even though it might not even be a good Dutch translation). The rules we used are based on systematic orthographic...

  1. Belief-rule-based expert systems for evaluation of e-government

    DEFF Research Database (Denmark)

    Hossain, Mohammad Shahadat; Zander, Pär-Ola Mikael; Kamal, Md Sarwar

    2015-01-01

    , known as the Belief Rule Based Expert System (BRBES) and implemented in the local e-government of Bangladesh. The results have been compared with a recently developed method of evaluating e-government, and it is demonstrated that the results of the BRBES are more accurate and reliable. The BRBES can...

  2. Flavours of XChange, a Rule-Based Reactive Language for the (Semantic) Web

    OpenAIRE

    Bailey, James; Bry, François; Eckert, Michael; Patrânjan, Paula Lavinia

    2005-01-01

    This article introduces XChange, a rule-based reactive language for the Web. Stressing application scenarios, it first argues that high-level reactive languages are needed for both Web and Semantic Web applications. Then, it discusses technologies and paradigms relevant to high-level reactive languages for the (Semantic) Web. Finally, it presents the Event-Condition-Action rules of XChange.

  3. Capacities and overlap indexes with an application in fuzzy rule-based classification systems

    Czech Academy of Sciences Publication Activity Database

    Paternain, D.; Bustince, H.; Pagola, M.; Sussner, P.; Kolesárová, A.; Mesiar, Radko

    2016-01-01

    Roč. 305, č. 1 (2016), s. 70-94 ISSN 0165-0114 Institutional support: RVO:67985556 Keywords : Capacity * Overlap index * Overlap function * Choquet integral * Fuzzy rule-based classification systems Subject RIV: BA - General Mathematics Impact factor: 2.718, year: 2016 http://library.utia.cas.cz/separaty/2016/E/mesiar-0465739.pdf

  4. Evaluation of Machine Learning and Rules-Based Approaches for Predicting Antimicrobial Resistance Profiles in Gram-negative Bacilli from Whole Genome Sequence Data.

    Science.gov (United States)

    Pesesky, Mitchell W; Hussain, Tahir; Wallace, Meghan; Patel, Sanket; Andleeb, Saadia; Burnham, Carey-Ann D; Dantas, Gautam

    2016-01-01

    incomplete genome assembly confounded the rules-based algorithm, resulting in predictions based on gene family, rather than on knowledge of the specific variant found. Low-frequency resistance caused errors in the machine-learning algorithm because those genes were not seen or seen infrequently in the test set. We also identified an example of variability in the phenotype-based results that led to disagreement with both genotype-based methods. Genotype-based antimicrobial susceptibility testing shows great promise as a diagnostic tool, and we outline specific research goals to further refine this methodology.

  5. Evaluation of Machine Learning and Rules-Based Approaches for Predicting Antimicrobial Resistance Profiles in Gram-negative Bacilli from Whole Genome Sequence Data

    Directory of Open Access Journals (Sweden)

    Mitchell Pesesky

    2016-11-01

    factors and incomplete genome assembly confounded the rules-based algorithm, resulting in predictions based on gene family, rather than on knowledge of the specific variant found. Low-frequency resistance caused errors in the machine-learning algorithm because those genes were not seen or seen infrequently in the test set. We also identified an example of variability in the phenotype-based results that led to disagreement with both genotype-based methods. Genotype-based antimicrobial susceptibility testing shows great promise as a diagnostic tool, and we outline specific research goals to further refine this methodology.

  6. A rule-based computer control system for PBX-M neutral beams

    International Nuclear Information System (INIS)

    Frank, K.T.; Kozub, T.A.; Kugel, H.W.

    1987-01-01

    The Princeton Beta Experiment (PBX) neutral beams have been routinely operated under automatic computer control. A major upgrade of the computer configuration was undertaken to coincide with the PBX machine modification. The primary tasks included in the computer control system are data acquisition, waveform reduction, automatic control and data storage. The portion of the system which will remain intact is the rule-based approach to automatic control. Increased computational and storage capability will allow the expansion of the knowledge base previously used. The hardware configuration supported by the PBX Neutral Beam (XNB) software includes a dedicated Microvax with five CAMAC crates and four process controllers. The control algorithms are rule-based and goal-driven. The automatic control system raises ion source electrical parameters to selected energy goals and maintains these levels until new goals are requested or faults are detected.

  7. A self-learning rule base for command following in dynamical systems

    Science.gov (United States)

    Tsai, Wei K.; Lee, Hon-Mun; Parlos, Alexander

    1992-01-01

    In this paper, a self-learning Rule Base for command following in dynamical systems is presented. The learning is accomplished through reinforcement learning using an associative memory called SAM. The main advantage of SAM is that it is a function approximator with explicit storage of training samples. A learning algorithm patterned after dynamic programming is proposed. Two artificially created, unstable dynamical systems are used for testing, and the Rule Base was used to generate a feedback control to improve the command following ability of the otherwise uncontrolled systems. The numerical results are very encouraging. The controlled systems exhibit a more stable behavior and a better capability to follow reference commands. The rules resulting from the reinforcement learning are explicitly stored and they can be modified or augmented by human experts. Due to the overlapping storage scheme of SAM, the stored rules are similar to fuzzy rules.
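
    The abstract does not detail SAM's storage scheme; the following stand-in sketch only illustrates the general idea of a rule base backed by explicitly stored training samples, using a plain nearest-neighbour lookup:

```python
import math

# Sketch of a rule base backed by explicit sample storage, loosely in the
# spirit of SAM (details of SAM's overlapping storage are not in the
# abstract; this is a plain nearest-neighbour stand-in).

class SampleMemory:
    def __init__(self):
        self.samples = []  # (state, action, value) triples, i.e. stored "rules"

    def store(self, state, action, value):
        self.samples.append((state, action, value))

    def best_action(self, state, candidate_actions, k=3):
        """Pick the action whose k nearest stored samples have the highest value."""
        def score(a):
            dists = sorted(
                (abs(s - state), v) for s, act, v in self.samples if act == a
            )[:k]
            return sum(v for _, v in dists) / len(dists) if dists else -math.inf
        return max(candidate_actions, key=score)

mem = SampleMemory()
mem.store(0.9, "brake", 1.0)        # rewarded experience near state 0.9
mem.store(0.1, "accelerate", 1.0)
mem.store(0.8, "accelerate", -1.0)  # punished experience
print(mem.best_action(0.85, ["brake", "accelerate"]))  # -> "brake"
```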

  8. Fifty years of computer analysis in chest imaging: rule-based, machine learning, deep learning.

    Science.gov (United States)

    van Ginneken, Bram

    2017-03-01

    Half a century ago, the term "computer-aided diagnosis" (CAD) was introduced in the scientific literature. Pulmonary imaging, with chest radiography and computed tomography, has always been one of the focus areas in this field. In this study, I describe how machine learning became the dominant technology for tackling CAD in the lungs, generally producing better results than do classical rule-based approaches, and how the field is now rapidly changing: in the last few years, we have seen how even better results can be obtained with deep learning. The key differences among rule-based processing, machine learning, and deep learning are summarized and illustrated for various applications of CAD in the chest.

  9. Rule Based System for Medicine Inventory Control Using Radio Frequency Identification (RFID)

    Directory of Open Access Journals (Sweden)

    Ardhyanti Mita Nugraha Joanna

    2018-01-01

    Full Text Available A rule-based system is an efficient way to keep drug stock available, using Radio Frequency Identification (RFID) as an automatic input mechanism. The method ensures that the stock of drugs remains available by analyzing users' drug needs. The research data comprised one year of drug usage in a hospital. The data were processed using ABC classification to identify drugs with fast, medium and slow movement. For each class, a rule-based algorithm determined the safety stock and reorder point (ROP). The research yielded safety stock and ROP values that vary with the class of each drug. Validation was done by comparing the manual and system calculations of safety stock and reorder point; the mean deviation was 0.03 for safety stock and 0.08 for ROP.

  10. Towards a framework for threaded inference in rule-based systems

    Directory of Open Access Journals (Sweden)

    Luis Casillas Santillan

    2013-11-01

    Full Text Available Information and communication technologies have shown a significant advance and fast pace in their performance and pervasiveness. Knowledge has become a significant asset for organizations, which need to deal with large amounts of data and information to produce valuable knowledge. Dealing with knowledge is becoming a pivotal activity for organizations in the new economy. One way to achieve the goal of knowledge management is the use of rule-based systems. This kind of approach represents a new opportunity for expert-system technology. Modern languages and cheap computing allow the implementation of concurrent systems for dealing with huge volumes of information in organizations. The present work proposes the use of contemporary programming elements, such as readily exploitable threading, when implementing rule-based treatment over huge data volumes.
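
    As a minimal sketch of the threading idea (the paper's framework is not reproduced; rule contents and data are invented), rules can be applied concurrently over chunks of a large fact table:

```python
# Minimal sketch of threaded rule evaluation (illustrative; not the
# framework proposed in the paper). Each rule is a predicate over a row
# of data; chunks of a large fact table are processed concurrently.
# (For CPU-bound matching in CPython one would use processes instead,
# but the structure is the same.)
from concurrent.futures import ThreadPoolExecutor

RULES = {
    "high_value": lambda row: row["amount"] > 10_000,
    "foreign":    lambda row: row["country"] != "US",
}

def apply_rules(chunk):
    return [
        (row["id"], name)
        for row in chunk
        for name, pred in RULES.items()
        if pred(row)
    ]

facts = [{"id": i, "amount": i * 3_000, "country": "US" if i % 2 else "FR"}
         for i in range(8)]
chunks = [facts[i:i + 4] for i in range(0, len(facts), 4)]

with ThreadPoolExecutor() as pool:
    matches = [m for part in pool.map(apply_rules, chunks) for m in part]
print(matches)
```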

  11. Rule Based System for Medicine Inventory Control Using Radio Frequency Identification (RFID)

    Science.gov (United States)

    Nugraha, Joanna Ardhyanti Mita; Suryono; Suseno, dan Jatmiko Endro

    2018-02-01

    A rule-based system is an efficient way to keep drug stock available, using Radio Frequency Identification (RFID) as an automatic input mechanism. The method ensures that the stock of drugs remains available by analyzing users' drug needs. The research data comprised one year of drug usage in a hospital. The data were processed using ABC classification to identify drugs with fast, medium and slow movement. For each class, a rule-based algorithm determined the safety stock and reorder point (ROP). The research yielded safety stock and ROP values that vary with the class of each drug. Validation was done by comparing the manual and system calculations of safety stock and reorder point; the mean deviation was 0.03 for safety stock and 0.08 for ROP.
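
    The records do not spell out the rules; a common textbook formulation, assumed here rather than taken from the paper, computes safety stock from demand variability over the lead time and the reorder point as expected lead-time demand plus safety stock:

```python
import math

# Common textbook safety-stock / reorder-point formulation (an assumption;
# the paper's own rules are not given in the record): safety stock buffers
# demand variability over the lead time, and the reorder point adds the
# expected demand during the lead time.

def safety_stock(z, demand_std_per_day, lead_time_days):
    return z * demand_std_per_day * math.sqrt(lead_time_days)

def reorder_point(avg_demand_per_day, lead_time_days, ss):
    return avg_demand_per_day * lead_time_days + ss

# Fast-moving ("A"-class) drug: a higher service level implies a larger z.
ss = safety_stock(z=1.65, demand_std_per_day=4.0, lead_time_days=7)
rop = reorder_point(avg_demand_per_day=20.0, lead_time_days=7, ss=ss)
print(f"safety stock = {ss:.1f} units, reorder point = {rop:.1f} units")
```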

  12. Changing from a Rules-based to a Principles-based Accounting Logic: A Review

    Directory of Open Access Journals (Sweden)

    Marta Silva Guerreiro

    2014-06-01

    Full Text Available We explore influences on unlisted companies when Portugal moved from a code law, rules-based accounting system, to a principles-based accounting system of adapted International Financial Reporting Standards (IFRS. Institutionalisation of the new principles-based system was generally facilitated by a socio-economic and political context that increasingly supported IFRS logic. This helped central actors gain political opportunity, mobilise important allies, and accommodate major protagonists. The preparedness of unlisted companies to adopt the new IFRS-based accounting system voluntarily was explained by their desire to maintain social legitimacy. However, it was affected negatively by the embeddedness of rule-based practices in the ‘old’ prevailing institutional logic.

  13. RULE-BASE METHOD FOR ANALYSIS OF QUALITY E-LEARNING IN HIGHER EDUCATION

    Directory of Open Access Journals (Sweden)

    darsih darsih darsih

    2016-04-01

    Full Text Available Assessing the quality of e-learning courses is essential to measure the success of e-learning systems in online learning, and the assessment can be used to improve education. The study analyzes the quality of e-learning courses on the website www.kulon.undip.ac.id using a questionnaire whose questions are based on the variables of ISO 9126. A Likert-scale assessment was administered through a web application, and a rule-based reasoning method was used to grade the quality of the e-learning courses. A case study was conducted on four e-learning courses with 133 respondents as users. Good quality scores were obtained for each of the courses tested. In addition, each e-learning course has different strengths depending on certain variables. Keywords: E-Learning, Rule-Base, Questionnaire, Likert, Measuring.

  14. Fuzzy Rule-based Analysis of Promotional Efficiency in Vietnam’s Tourism Industry

    OpenAIRE

    Nguyen Quang VINH; Dam Van KHANH; Nguyen Viet ANH

    2015-01-01

    This study aims to determine an effective method of measuring the efficiency of promotional strategies for tourist destinations. Complicating factors that influence promotional efficiency (PE), such as promotional activities (PA), destination attribute (DA), and destination image (DI), make it difficult to evaluate the effectiveness of PE. This study develops a rule-based decision support mechanism using fuzzy set theory and the Analytic Hierarchy Process (AHP) to evaluate the effectiveness o...

  15. A knowledge representation meta-model for rule-based modelling of signalling networks

    Directory of Open Access Journals (Sweden)

    Adrien Basso-Blandin

    2016-03-01

    Full Text Available The study of cellular signalling pathways and their deregulation in disease states, such as cancer, is a large and extremely complex task. Indeed, these systems involve many parts and processes but are studied piecewise and their literatures and data are consequently fragmented, distributed and sometimes—at least apparently—inconsistent. This makes it extremely difficult to build significant explanatory models with the result that effects in these systems that are brought about by many interacting factors are poorly understood. The rule-based approach to modelling has shown some promise for the representation of the highly combinatorial systems typically found in signalling where many of the proteins are composed of multiple binding domains, capable of simultaneous interactions, and/or peptide motifs controlled by post-translational modifications. However, the rule-based approach requires highly detailed information about the precise conditions for each and every interaction which is rarely available from any one single source. Rather, these conditions must be painstakingly inferred and curated, by hand, from information contained in many papers—each of which contains only part of the story. In this paper, we introduce a graph-based meta-model, attuned to the representation of cellular signalling networks, which aims to ease this massive cognitive burden on the rule-based curation process. This meta-model is a generalization of that used by Kappa and BNGL which allows for the flexible representation of knowledge at various levels of granularity. In particular, it allows us to deal with information which has either too little, or too much, detail with respect to the strict rule-based meta-model. Our approach provides a basis for the gradual aggregation of fragmented biological knowledge extracted from the literature into an instance of the meta-model from which we can define an automated translation into executable Kappa programs.

  16. Moving from Rule-based to Principle-based in Public Sector: Preparers' Perspective

    OpenAIRE

    Roshayani Arshad; Normah Omar; Siti Fatimah Awang

    2013-01-01

    The move from cash accounting to accrual accounting, or rule-based to principle-based accounting, by many governments is part of an ongoing efforts in promoting a more business-like and performance-focused public sector. Using questionnaire responses from preparers of financial statements of public universities in Malaysia, this study examines the implementation challenges and benefits of principle-based accounting. Results from these responses suggest that most respondents perceived signific...

  17. LPS: a rule-based, schema-oriented knowledge representation system

    Energy Technology Data Exchange (ETDEWEB)

    Anzai, Y; Mitsuya, Y; Nakajima, S; Ura, S

    1981-01-01

    A new knowledge representation system called LPS is presented. The global control structure of LPS is rule-based, but the local representational structure is schema-oriented. The present version of LPS was designed to increase the understandability of the representation while keeping time efficiency reasonable. Among the implemented facilities of LPS, pattern matching through slot networks and meta-actions are described in detail. 7 references.

  18. Development of a rule-based diagnostic platform on an object-oriented expert system shell

    International Nuclear Information System (INIS)

    Wang, Wenlin; Yang, Ming; Seong, Poong Hyun

    2016-01-01

    Highlights: • Multilevel Flow Model represents system knowledge as a domain map in the expert system. • The rule-based fault diagnostic expert system can identify root causes via a causal chain. • The rule-based fault diagnostic expert system can be used for fault simulation training. - Abstract: This paper presents the development and implementation of a real-time rule-based diagnostic platform. The knowledge was acquired from domain experts and textbooks, and the design of the fault diagnosis expert system was performed in the following ways: (i) establishing corresponding classes and instances to build the domain map, (ii) creating generic fault models based on events, and (iii) building diagnostic reasoning based on rules. Knowledge representation is a complicated issue for expert systems. One highlight of this paper is that the Multilevel Flow Model has been used to represent the knowledge, which composes the domain map within the expert system as well as providing a concise description of the system. The developed platform is illustrated using the pressure safety system of a pressurized water reactor as the simulation test bed; the platform is developed using the commercial and industrially validated software G2. An emulation test was conducted and demonstrated that the fault diagnosis expert system identifies faults correctly and in a timely way; the system can be used as a simulation-based training tool to assist operators in making better decisions.

  19. Quality Assurance Evaluation System Using a Rule-Based System for Monitoring the Quality of Higher Education

    Directory of Open Access Journals (Sweden)

    Sri Hartono

    2017-05-01

    Full Text Available The need for continuous quality improvement has made quality monitoring increasingly complex. The research aims to develop a quality assurance evaluation system using a rule-based system to monitor the quality of higher education. The research begins by documenting the daily activity of a study program, consisting of lecturer data, research data, service data, staff data, student data, and infrastructure data, in a database. The data were evaluated by a rule-based system that adopts the study-program quality standards of the National Accreditation Board for Higher Education as its knowledge base. The evaluation process used forward chaining, matching the existing data against the knowledge base to determine the quality status of each quality standard, while the recommendation process used backward chaining, matching the resulting quality status against the desired quality status projection to determine the nearest achievable target. The result of the research is a quality assurance evaluation system with a rule-based engine that produces an internal evaluation report and recommendations that can be used to monitor the quality of higher education.
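
    As a minimal illustration of the forward- and backward-chaining pattern the abstract describes (the rule contents here are invented placeholders, not the accreditation standards):

```python
# Tiny forward/backward-chaining sketch (illustrative; the actual
# accreditation rules are not reproduced). Rules map a set of known
# facts to a conclusion about a quality standard.

RULES = [
    ({"lecturers_ok", "research_ok"}, "academic_standard_met"),
    ({"academic_standard_met", "infrastructure_ok"}, "accreditation_ready"),
]

def forward_chain(facts):
    """Derive every conclusion supported by the current facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in RULES:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

def backward_chain(goal, facts, seen=frozenset()):
    """Check whether a target quality status is reachable from the facts."""
    if goal in facts:
        return True
    return any(
        all(backward_chain(p, facts, seen | {goal}) for p in premises)
        for premises, conclusion in RULES
        if conclusion == goal and goal not in seen
    )

known = {"lecturers_ok", "research_ok", "infrastructure_ok"}
print(forward_chain(known))
print(backward_chain("accreditation_ready", known))  # -> True
```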

  20. Strategies for adding adaptive learning mechanisms to rule-based diagnostic expert systems

    Science.gov (United States)

    Stclair, D. C.; Sabharwal, C. L.; Bond, W. E.; Hacke, Keith

    1988-01-01

    Rule-based diagnostic expert systems can be used to perform many of the diagnostic chores necessary in today's complex space systems. These expert systems typically take a set of symptoms as input and produce diagnostic advice as output. The primary objective of such expert systems is to provide accurate and comprehensive advice which can be used to help return the space system in question to nominal operation. The development and maintenance of diagnostic expert systems is time and labor intensive, since the services of both knowledge engineer(s) and domain expert(s) are required. The use of adaptive learning mechanisms to incrementally evaluate and refine rules promises to reduce both the time and labor costs associated with such systems. This paper describes the basic adaptive learning mechanisms of strengthening, weakening, generalization, discrimination, and discovery. Next, basic strategies are discussed for adding these learning mechanisms to rule-based diagnostic expert systems. These strategies support the incremental evaluation and refinement of rules in the knowledge base by comparing the set of advice given by the expert system (A) with the correct diagnosis (C). Techniques are described for selecting those rules in the knowledge base which should participate in adaptive learning. The strategies presented may be used with a wide variety of learning algorithms. Further, these strategies are applicable to a large number of rule-based diagnostic expert systems. They may be used to provide either immediate or deferred updating of the knowledge base.

  1. An evaluation and implementation of rule-based Home Energy Management System using the Rete algorithm.

    Science.gov (United States)

    Kawakami, Tomoya; Fujita, Naotaka; Yoshihisa, Tomoki; Tsukamoto, Masahiko

    2014-01-01

    In recent years, sensors have become popular and the Home Energy Management System (HEMS) has taken an important role in saving energy without decreasing QoL (Quality of Life). Many rule-based HEMSs have been proposed, and almost all of them assume "IF-THEN" rules. The Rete algorithm is a typical pattern-matching algorithm for IF-THEN rules. Previously, we proposed a rule-based Home Energy Management System (HEMS) using the Rete algorithm. In the proposed system, rules for managing energy are processed by networked smart taps, and the loads of processing rules and collecting data are distributed among the smart taps. In addition, the number of processes and the amount of collected data are reduced by processing rules with the Rete algorithm. In this paper, we evaluated the proposed system by simulation. In the simulation environment, each rule is processed by the smart tap that relates to the action part of that rule. We also implemented the proposed system as a HEMS using smart taps.
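
    As a toy illustration of the Rete intuition the system builds on, shared condition tests are cached so that a fact change is evaluated once rather than once per rule (this is not the paper's implementation):

```python
# Toy illustration of the Rete intuition (not the paper's implementation):
# condition tests are shared across rules and re-evaluated only when the
# facts they mention change, instead of re-running every rule every time.

class TinyRete:
    def __init__(self):
        self.facts = {}
        self.tests = {}   # name -> (fact keys, predicate)
        self.memo = {}    # cached test results ("alpha memory")
        self.rules = []   # (list of test names, action)

    def add_test(self, name, keys, pred):
        self.tests[name] = (keys, pred)

    def add_rule(self, test_names, action):
        self.rules.append((test_names, action))

    def assert_fact(self, key, value):
        self.facts[key] = value
        # Re-evaluate only the tests that mention the changed fact.
        for name, (keys, pred) in self.tests.items():
            if key in keys and all(k in self.facts for k in keys):
                self.memo[name] = pred(*(self.facts[k] for k in keys))
        for names, action in self.rules:
            if all(self.memo.get(n) for n in names):
                action()

net = TinyRete()
net.add_test("hot", ("temp",), lambda t: t > 28)
net.add_test("occupied", ("people",), lambda p: p > 0)
net.add_rule(["hot", "occupied"], lambda: print("rule fired: turn on AC"))
net.assert_fact("temp", 30)
net.assert_fact("people", 2)   # -> rule fired: turn on AC
```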

  2. Spatial Rule-Based Modeling: A Method and Its Application to the Human Mitotic Kinetochore

    Directory of Open Access Journals (Sweden)

    Jan Huwald

    2013-07-01

    Full Text Available A common problem in the analysis of biological systems is the combinatorial explosion that emerges from the complexity of multi-protein assemblies. Conventional formalisms, like differential equations, Boolean networks and Bayesian networks, are unsuitable for dealing with the combinatorial explosion, because they are designed for a restricted state space with fixed dimensionality. To overcome this problem, the rule-based modeling language, BioNetGen, and the spatial extension, SRSim, have been developed. Here, we describe how to apply rule-based modeling to integrate experimental data from different sources into a single spatial simulation model and how to analyze the output of that model. The starting point for this approach can be a combination of molecular interaction data, reaction network data, proximities, binding and diffusion kinetics and molecular geometries at different levels of detail. We describe the technique and then use it to construct a model of the human mitotic inner and outer kinetochore, including the spindle assembly checkpoint signaling pathway. This allows us to demonstrate the utility of the procedure, show how a novel perspective for understanding such complex systems becomes accessible and elaborate on challenges that arise in the formulation, simulation and analysis of spatial rule-based models.

  3. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems

    Science.gov (United States)

    Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.

    2013-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887

  4. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems.

    Science.gov (United States)

    Chylek, Lily A; Harris, Leonard A; Tung, Chang-Shung; Faeder, James R; Lopez, Carlos F; Hlavacek, William S

    2014-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and posttranslational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). © 2013 Wiley Periodicals, Inc.
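
    As a concrete, generic illustration of this coarse graining (not tied to Kappa or BNGL specifically), one rule that constrains a single site implicitly stands for every reaction over the unconstrained sites:

```python
from itertools import product

# Generic illustration of rule-based coarse graining: the single rule
# "phosphorylate site s1, regardless of the other sites" stands for a
# whole family of reactions, one per configuration of the other sites,
# all inheriting the same rate constant k.

sites = ["s1", "s2", "s3", "s4"]
states = ["U", "P"]  # unphosphorylated / phosphorylated

# Reactions implied by the single rule  X(s1~U) -> X(s1~P)  @ k
implied = [
    dict(zip(sites, combo))
    for combo in product(states, repeat=len(sites))
    if combo[0] == "U"          # rule only requires s1 to be unphosphorylated
]
print(len(implied))  # 8 reactions from one rule (2^(4-1))
```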

  5. Compensatory Processing During Rule-Based Category Learning in Older Adults

    Science.gov (United States)

    Bharani, Krishna L.; Paller, Ken A.; Reber, Paul J.; Weintraub, Sandra; Yanar, Jorge; Morrison, Robert G.

    2016-01-01

    Healthy older adults typically perform worse than younger adults at rule-based category learning, but better than patients with Alzheimer's or Parkinson's disease. To further investigate aging's effect on rule-based category learning, we monitored event-related potentials (ERPs) while younger and neuropsychologically typical older adults performed a visual category-learning task with a rule-based category structure and trial-by-trial feedback. Using these procedures, we previously identified ERPs sensitive to categorization strategy and accuracy in young participants. In addition, previous studies have demonstrated the importance of neural processing in the prefrontal cortex and the medial temporal lobe for this task. In this study, older adults showed lower accuracy and longer response times than younger adults, but there were two distinct subgroups of older adults. One subgroup showed near-chance performance throughout the procedure, never categorizing accurately. The other subgroup reached asymptotic accuracy that was equivalent to that in younger adults, although they categorized more slowly. These two subgroups were further distinguished via ERPs. Consistent with the compensation theory of cognitive aging, older adults who successfully learned showed larger frontal ERPs when compared with younger adults. Recruitment of prefrontal resources may have improved performance while slowing response times. Additionally, correlations of feedback-locked P300 amplitudes with category-learning accuracy differentiated successful younger and older adults. Overall, the results suggest that the ability to adapt one's behavior in response to feedback during learning varies across older individuals, and that the failure of some to adapt their behavior may reflect inadequate engagement of prefrontal cortex. PMID:26422522

  6. Rule-based control of off-grid desalination powered by renewable energies

    Directory of Open Access Journals (Sweden)

    Alvaro Serna

    2015-08-01

    Full Text Available A rule-based control is presented for desalination plants operating under variable, renewable power availability. The control algorithm is based on two sets of rules: first, a list that prioritizes the reverse osmosis (RO) units of the plant is created, based on the current state and the expected water demand; secondly, the available energy is dispatched to these units following this prioritized list. The selected strategy is tested on a specific case study: a reverse osmosis plant designed for the production of desalinated water powered by wind and wave energy. Simulation results illustrate the correct performance of the plant under this control.
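
    As a sketch of the two-step scheme (the unit priorities, power and flow figures are invented; the paper's rules are not given in the record):

```python
# Sketch of the rule-based dispatch idea: RO units are ranked, then the
# available renewable power is granted down the ranked list. In the real
# scheme the ranking would also weigh the expected water demand.

def prioritize(units):
    # e.g. keep running units first (avoid start/stop cycling), then larger ones
    return sorted(units, key=lambda u: (not u["running"], -u["flow_m3h"]))

def dispatch(units, available_kw):
    on = []
    for u in prioritize(units):
        if u["power_kw"] <= available_kw:
            available_kw -= u["power_kw"]
            on.append(u["name"])
    return on

ro_units = [
    {"name": "RO-1", "power_kw": 40, "flow_m3h": 8, "running": True},
    {"name": "RO-2", "power_kw": 40, "flow_m3h": 8, "running": False},
    {"name": "RO-3", "power_kw": 25, "flow_m3h": 4, "running": False},
]
print(dispatch(ro_units, available_kw=70))  # -> ['RO-1', 'RO-3']
```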

  7. Improving Rule Based Stemmers to Solve Some Special Cases of Arabic Language

    Directory of Open Access Journals (Sweden)

    Soufiane Farrah

    2017-04-01

    Full Text Available Analysis of the Arabic language has become a necessity because of its rapid evolution; we propose in this paper a rule-based root-extraction method for Arabic text that addresses some weaknesses found in previous research works. Our approach is divided into a preprocessing phase, in which the text is tokenized and formatted by removing punctuation, diacritics and non-letter characters; a treatment phase based on the elimination of several sets of affixes (diacritics, prefixes, and suffixes) and on the application of several patterns; and a check phase that verifies whether the extracted root is correct by searching for it in root dictionaries.
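
    As a minimal sketch of the preprocessing and affix-elimination steps (the affix lists below are a small illustrative subset, not the paper's rule set):

```python
import re

# Minimal affix-stripping sketch in the spirit of rule-based Arabic
# stemmers; the affix lists are a small illustrative subset.

PREFIXES = ["ال", "وال", "بال", "كال", "فال", "و", "ف", "ب", "ك", "ل"]
SUFFIXES = ["ها", "ان", "ات", "ون", "ين", "ية", "ه", "ة", "ي"]

def normalize(token):
    token = re.sub(r"[\u064B-\u0652]", "", token)  # strip diacritics (tashkeel)
    return re.sub(r"[^\u0621-\u064A]", "", token)  # keep Arabic letters only

def strip_affixes(token):
    # Longest matching affix first; never shrink the stem below 3 letters.
    for p in sorted(PREFIXES, key=len, reverse=True):
        if token.startswith(p) and len(token) - len(p) >= 3:
            token = token[len(p):]
            break
    for s in sorted(SUFFIXES, key=len, reverse=True):
        if token.endswith(s) and len(token) - len(s) >= 3:
            token = token[:-len(s)]
            break
    return token

print(strip_affixes(normalize("والمدرسة")))  # -> مدرس (approximate stem)
```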

  8. Weighted Evidence Combination Rule Based on Evidence Distance and Uncertainty Measure: An Application in Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Lei Chen

    2018-01-01

    Full Text Available Conflict management in Dempster-Shafer theory (D-S theory) is a hot topic in information fusion. In this paper, a novel weighted evidence combination rule based on evidence distance and an uncertainty measure is proposed. The proposed approach consists of two steps. First, the weight is determined based on the evidence distance. Then, the weight value obtained in the first step is modified by taking advantage of the uncertainty measure. Our proposed method can efficiently handle highly conflicting evidence with better convergence performance. A numerical example and an application based on sensor fusion in fault diagnosis are given to demonstrate the efficiency of the proposed method.
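
    As a sketch of the ingredients involved, the following shows classical Dempster combination plus simple reliability discounting as a stand-in for the paper's distance-based weighting and uncertainty correction:

```python
from itertools import product

# Sketch: classical Dempster combination plus Shafer discounting (the
# paper's specific distance-based weighting and uncertainty correction
# are not reproduced here).

def discount(m, alpha, frame):
    """Shafer discounting: scale masses by alpha, move the rest to the frame."""
    out = {A: alpha * v for A, v in m.items()}
    out[frame] = out.get(frame, 0.0) + (1 - alpha)
    return out

def dempster(m1, m2):
    combined, conflict = {}, 0.0
    for (A, v1), (B, v2) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            combined[C] = combined.get(C, 0.0) + v1 * v2
        else:
            conflict += v1 * v2      # mass assigned to the empty set
    return {A: v / (1 - conflict) for A, v in combined.items()}

frame = frozenset({"fault_a", "fault_b"})
m1 = {frozenset({"fault_a"}): 0.9, frame: 0.1}
m2 = {frozenset({"fault_b"}): 0.8, frame: 0.2}   # conflicting sensor
m2 = discount(m2, alpha=0.5, frame=frame)        # down-weight the unreliable source
print(dempster(m1, m2))
```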

  9. A rule-based approach to model checking of UML state machines

    Science.gov (United States)

    Grobelna, Iwona; Grobelny, Michał; Stefanowicz, Łukasz

    2016-12-01

    In the paper a new approach to formal verification of control process specifications expressed by means of UML state machines in version 2.x is proposed. In contrast to other approaches from the literature, we use an abstract and universal rule-based logical model suitable both for model checking (using the nuXmv model checker) and for logical synthesis in the form of rapid prototyping. Hence, a prototype implementation in the hardware description language VHDL can be obtained that fully reflects the primary, already formally verified specification in the form of UML state machines. The presented approach increases the assurance that the implemented system meets the user-defined requirements.

  10. Rule-Based and Case-Based Reasoning in Housing Prices

    OpenAIRE

    Gabrielle Gayer; Itzhak Gilboa; Offer Lieberman

    2004-01-01

    People reason about real-estate prices both in terms of general rules and in terms of analogies to similar cases. We propose to empirically test which mode of reasoning fits the data better. To this end, we develop the statistical techniques required for the estimation of the case-based model. It is hypothesized that case-based reasoning will have relatively more explanatory power in databases of rental apartments, whereas rule-based reasoning will have a relative advantage in sales data. We ...

  11. The development of cause analysis system for CPCS trip using the rule-base deduction

    International Nuclear Information System (INIS)

    Park, Hee Seok; Kim, Dong Hoon; Seo, Ho Joon; Koo, In Soo; Park, Suk Joon

    1992-01-01

    The Core Protection Calculator System (CPCS) was developed by Combustion Engineering Company to initiate a reactor trip under certain transient circumstances. The major function of the CPCS is to generate contact outputs for the Departure from Nucleate Boiling Ratio (DNBR) trip and the Local Power Density (LPD) trip. In the CPCS, however, the trip causes cannot be identified; only the trip status is displayed. It may take plant operators considerable time and effort to analyse the trip causes of the CPCS. The Cause Analysis System for CPCS (CASCPCS) has therefore been developed, using a rule-based deduction method, to aid operators in nuclear power plants.

  12. A RULE-BASED SYSTEM APPROACH FOR SAFETY MANAGEMENT IN HAZARDOUS WORK SYSTEMS

    Directory of Open Access Journals (Sweden)

    Ercüment N. DİZDAR

    1998-03-01

    Full Text Available Developments in technology have increased the importance of safety management in work life. These developments have also increased the investment in, and the demands placed on, humans in work systems. Here we face this problem: can we forecast the accidents that workers may face, and prevent these accidents by taking the necessary precautions? In this study, we aimed to develop a rule-based system to forecast occupational accidents in coming periods in the departments of facilities in hazardous work systems. The validity of the developed system was demonstrated by putting it into practice in hazardous work systems in the manufacturing industry.

  13. A Cross-Wavelet Transform Aided Rule Based Approach for Early Prediction of Lean Blow-out in Swirl-Stabilized Dump Combustor

    Directory of Open Access Journals (Sweden)

    Debangshu Dey

    2015-03-01

    Full Text Available Lean or ultralean combustion is one of the popular strategies to achieve very low emission levels. However, it is extremely susceptible to lean blow-out (LBO). The present work explores a cross-wavelet transform (XWT) aided rule-based scheme for early prediction of lean blow-out. XWT can be considered an extension of wavelet analysis that gives the correlation between two waveforms in time-frequency space. In the present scheme a swirl-stabilized dump combustor is used as a laboratory-scale model of a generic gas turbine combustor, with LPG as fuel. Time series of the CH chemiluminescence signal are recorded for different flame conditions by varying the equivalence ratio, flow rate and level of air-fuel premixing. Features extracted from the cross-wavelet spectrum of the recorded waveforms and a reference wave are observed to classify the flame condition into three major classes: near-LBO, moderate and healthy. Moreover, a rough-set-based technique is applied to the extracted features to generate a rule base that can be fed to a real-time controller or expert system to take the necessary control action to prevent LBO. Results show that the proposed methodology performs with an acceptable degree of accuracy.
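
    The record gives no formulas; under the standard definition (assumed here) that the cross-wavelet spectrum of x and y is W_x times the complex conjugate of W_y, candidate features can be computed with PyWavelets:

```python
import numpy as np
import pywt

# Sketch of a cross-wavelet computation under the standard definition
# XWT = W_x * conj(W_y) (an assumption; the paper's exact feature
# extraction and rough-set rules are not reproduced). High modulus marks
# shared time-frequency power; the phase gives the relative timing.

fs = 1000.0                                    # sampling rate, Hz (hypothetical)
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 50 * t)                 # recorded CH signal (stand-in)
y = np.sin(2 * np.pi * 50 * t + np.pi / 4)     # reference wave, phase-shifted

scales = np.arange(1, 64)
wx, _ = pywt.cwt(x, scales, "cmor1.5-1.0", sampling_period=1 / fs)
wy, _ = pywt.cwt(y, scales, "cmor1.5-1.0", sampling_period=1 / fs)

xwt = wx * np.conj(wy)
print(np.abs(xwt).max())   # feature candidates: modulus and phase statistics
```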

  14. An investigation of care-based vs. rule-based morality in frontotemporal dementia, Alzheimer's disease, and healthy controls.

    Science.gov (United States)

    Carr, Andrew R; Paholpak, Pongsatorn; Daianu, Madelaine; Fong, Sylvia S; Mather, Michelle; Jimenez, Elvira E; Thompson, Paul; Mendez, Mario F

    2015-11-01

    Behavioral changes in dementia, especially behavioral variant frontotemporal dementia (bvFTD), may result in alterations in moral reasoning. Investigators have not clarified whether these alterations reflect differential impairment of care-based vs. rule-based moral behavior. This study investigated 18 bvFTD patients, 22 early onset Alzheimer's disease (eAD) patients, and 20 healthy age-matched controls on care-based and rule-based items from the Moral Behavioral Inventory and the Social Norms Questionnaire, neuropsychological measures, and magnetic resonance imaging (MRI) regions of interest. There were significant group differences with the bvFTD patients rating care-based morality transgressions less severely than the eAD group and rule-based moral behavioral transgressions more severely than controls. Across groups, higher care-based morality ratings correlated with phonemic fluency on neuropsychological tests, whereas higher rule-based morality ratings correlated with increased difficulty set-shifting and learning new rules to tasks. On neuroimaging, severe care-based reasoning correlated with cortical volume in right anterior temporal lobe, and rule-based reasoning correlated with decreased cortical volume in the right orbitofrontal cortex. Together, these findings suggest that frontotemporal disease decreases care-based morality and facilitates rule-based morality possibly from disturbed contextual abstraction and set-shifting. Future research can examine whether frontal lobe disorders and bvFTD result in a shift from empathic morality to the strong adherence to conventional rules. Published by Elsevier Ltd.

  15. A Rule Based Approach to ISS Interior Volume Control and Layout

    Science.gov (United States)

    Peacock, Brian; Maida, Jim; Fitts, David; Dory, Jonathan

    2001-01-01

    Traditional human factors design involves the development of human factors requirements based on a desire to accommodate a certain percentage of the intended user population. As the product is developed human factors evaluation involves comparison between the resulting design and the specifications. Sometimes performance metrics are involved that allow leniency in the design requirements given that the human performance result is satisfactory. Clearly such approaches may work but they give rise to uncertainty and negotiation. An alternative approach is to adopt human factors design rules that articulate a range of each design continuum over which there are varying outcome expectations and interactions with other variables, including time. These rules are based on a consensus of human factors specialists, designers, managers and customers. The International Space Station faces exactly this challenge in interior volume control, which is based on anthropometric, performance and subjective preference criteria. This paper describes the traditional approach and then proposes a rule-based alternative. The proposed rules involve spatial, temporal and importance dimensions. If successful this rule-based concept could be applied to many traditional human factors design variables and could lead to a more effective and efficient contribution of human factors input to the design process.

  16. Rule-based topology system for spatial databases to validate complex geographic datasets

    Science.gov (United States)

    Martinez-Llario, J.; Coll, E.; Núñez-Andrés, M.; Femenia-Ribera, C.

    2017-06-01

    A rule-based topology software system providing a highly flexible and fast procedure to enforce integrity in spatial relationships among datasets is presented. This improved topology rule system is built over the spatial extension Jaspa. Both projects are open source, freely available software developed by the corresponding author of this paper. Currently, there is no spatial DBMS that implements a rule-based topology engine (considering that the topology rules are designed and performed in the spatial backend). If the topology rules are applied in the frontend (as in many GIS desktop programs), ArcGIS is the most advanced solution. The system presented in this paper has several major advantages over the ArcGIS approach: it can be extended with new topology rules, it has a much wider set of rules, and it can mix feature attributes with topology rules as filters. In addition, the topology rule system can work with various DBMSs, including PostgreSQL, H2 or Oracle, and the logic is performed in the spatial backend. The proposed topology system allows users to check the complex spatial relationships among features (from one or several spatial layers) that require some complex cartographic datasets, such as the data specifications proposed by INSPIRE in Europe and the Land Administration Domain Model (LADM) for Cadastral data.
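
    As a rough illustration of what such a topology rule looks like when evaluated (a generic sketch using Shapely, not Jaspa's engine or its rule syntax):

```python
# Sketch of a rule-based topology check (illustrative; not Jaspa's engine).
# Each rule names an attribute filter and a pairwise predicate, in the
# spirit of "must not overlap" / "must be covered by" topology rules that
# can mix feature attributes with spatial relationships.
from shapely.geometry import Polygon

parcels = [
    {"id": 1, "kind": "cadastral", "geom": Polygon([(0, 0), (2, 0), (2, 2), (0, 2)])},
    {"id": 2, "kind": "cadastral", "geom": Polygon([(1, 1), (3, 1), (3, 3), (1, 3)])},
]

RULES = [
    # (rule name, attribute filter, pairwise predicate that must hold)
    ("parcels must not overlap",
     lambda f: f["kind"] == "cadastral",
     lambda a, b: not a.overlaps(b)),
]

def validate(features, rules):
    errors = []
    for name, keep, ok in rules:
        fs = [f for f in features if keep(f)]
        for i, a in enumerate(fs):
            for b in fs[i + 1:]:
                if not ok(a["geom"], b["geom"]):
                    errors.append((name, a["id"], b["id"]))
    return errors

print(validate(parcels, RULES))  # -> [('parcels must not overlap', 1, 2)]
```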

  17. A Rule-Based Data Transfer Protocol for On-Demand Data Exchange in Vehicular Environment

    Directory of Open Access Journals (Sweden)

    Liao Hsien-Chou

    2009-01-01

    Full Text Available The purpose of an Intelligent Transport System (ITS) is mainly to increase driving safety and efficiency. Data exchange is an important way to achieve this purpose. On-demand data exchange is especially useful for helping a driver avoid emergency events. In order to handle data exchange under dynamic situations, a rule-based data transfer protocol is proposed in this paper. A set of rules is designed according to the request-forward-reply (RFR) principle; that is, the rules determine the timing of data broadcasting, forwarding, and replying automatically. Two typical situations are used to demonstrate the operation of the rules. One is a driver's front view being occluded by other vehicles; the other is a traffic jam. The proposed protocol is flexible and extensible to unforeseen situations. Three simulation tools were also implemented to demonstrate the feasibility of the protocol and to measure the network transmission under high vehicle density. The simulation results show that the rule-based protocol exchanges data efficiently, increasing driving safety.

  18. Organizational Knowledge Transfer Using Ontologies and a Rule-Based System

    Science.gov (United States)

    Okabe, Masao; Yoshioka, Akiko; Kobayashi, Keido; Yamaguchi, Takahira

    In recent automated and integrated manufacturing, so-called intelligence skill is becoming more and more important, and its efficient transfer to next-generation engineers is one of the urgent issues. In this paper, we propose a new approach without costly OJT (on-the-job training), namely the combined use of a domain ontology, a rule ontology and a rule-based system. Intelligence skill can be decomposed into pieces of simple engineering rules. A rule ontology consists of these engineering rules as primitives and the semantic relations among them. A domain ontology consists of the technical terms in the engineering rules and the semantic relations among them. A rule ontology helps novices get the total picture of the intelligence skill, and a domain ontology helps them understand the exact meanings of the engineering rules. A rule-based system helps domain experts externalize their tacit intelligence skill to ontologies and also helps novices internalize them. As a case study, we applied our proposal to an actual job at a remote control and maintenance office of hydroelectric power stations at Tokyo Electric Power Co., Inc. We also carried out an evaluation experiment for this case study, and the result supports our proposal.

  19. Integration of object-oriented knowledge representation with the CLIPS rule based system

    Science.gov (United States)

    Logie, David S.; Kamil, Hasan

    1990-01-01

    The paper describes a portion of the work aimed at developing an integrated, knowledge based environment for the development of engineering-oriented applications. An Object Representation Language (ORL) was implemented in C++ which is used to build and modify an object-oriented knowledge base. The ORL was designed in such a way so as to be easily integrated with other representation schemes that could effectively reason with the object base. Specifically, the integration of the ORL with the rule based system C Language Production Systems (CLIPS), developed at the NASA Johnson Space Center, will be discussed. The object-oriented knowledge representation provides a natural means of representing problem data as a collection of related objects. Objects are comprised of descriptive properties and interrelationships. The object-oriented model promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects. Data is inherited through an object network via the relationship links. Together, the two schemes complement each other in that the object-oriented approach efficiently handles problem data while the rule based knowledge is used to simulate the reasoning process. Alone, the object based knowledge is little more than an object-oriented data storage scheme; however, the CLIPS inference engine adds the mechanism to directly and automatically reason with that knowledge. In this hybrid scheme, the expert system dynamically queries for data and can modify the object base with complete access to all the functionality of the ORL from rules.

  20. Implementation of Rule-Based Expert Systems for Real-Time Monitoring of Criminal Case Settlement Using Radio Frequency Identification Technology

    Directory of Open Access Journals (Sweden)

    Mar Fuah

    2017-05-01

    Full Text Available One of the problems in criminal case settlement is the difficulty of estimating when the processing of a case file will be completed, caused by the number of case files handled and by changing detention times; fast and accurate information is therefore needed. The research aims to develop a monitoring system for the tracking and tracing of scheduling rules, using a rule-based expert system with 17 rules, supported by Radio Frequency Identification (RFID) technology, in the form of a computer application. Based on the output of the system, an analysis of the criminal case settlement process is performed with a set of IF-THEN rules. The RFID reader reads the data of case files through radio wave signals emitted by the antenna toward active tags attached to the criminal case files. The system is designed to monitor the tracking and tracing of RFID-based scheduling rules in real time, built as a computer application in accordance with the system design. The study found no failures in the reading of active tags by the RFID reader when detecting the criminal case files examined. Case files were handled in three different locations, the constabulary, the prosecutor's office, and the district court, and RFID was able to identify them simultaneously. RFID thus strongly supports the implementation of rule-based expert systems for real-time monitoring of criminal case settlement.

  1. A Rule-Based Model for Bankruptcy Prediction Based on an Improved Genetic Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yudong Zhang

    2013-01-01

    Full Text Available In this paper, we propose a hybrid system to predict corporate bankruptcy. The whole procedure consists of the following four stages: first, sequential forward selection was used to extract the most important features; second, a rule-based model was chosen to fit the given dataset, since it can present physical meaning; third, a genetic ant colony algorithm (GACA) was introduced; the fitness scaling strategy and the chaotic operator were incorporated with GACA, forming a new algorithm - fitness-scaling chaotic GACA (FSCGACA) - which was used to seek the optimal parameters of the rule-based model; and finally, the stratified K-fold cross-validation technique was used to enhance the generalization of the model. Simulation experiments on 1000 corporations’ data collected from 2006 to 2009 demonstrated that the proposed model was effective. It selected the 5 most important factors as “net income to stockholder’s equity,” “quick ratio,” “retained earnings to total assets,” “stockholders’ equity to total assets,” and “financial expenses to sales.” The total misclassification error of the proposed FSCGACA was only 7.9%, better than the results of the genetic algorithm (GA), the ant colony algorithm (ACA), and GACA. The average computation time of the model is 2.02 s.
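
    The following toy loop (our own minimal sketch, not the paper's FSCGACA; the objective function, scaling exponent and map constant are assumptions) illustrates the two named ingredients, power-law fitness scaling and a logistic-map chaotic operator:

        # Toy search loop showing fitness scaling and a chaotic operator.
        import random

        def fitness(x):                      # toy objective: maximize -(x-2)^2
            return -(x - 2.0) ** 2

        def scaled(fits, k=2.0):
            # Power-law fitness scaling: sharpens selection pressure as k grows.
            lo = min(fits)
            return [(f - lo + 1e-9) ** k for f in fits]

        def chaotic_step(c):
            # Logistic map in its chaotic regime; drives non-repeating perturbations.
            return 4.0 * c * (1.0 - c)

        pop, c = [random.uniform(-5, 5) for _ in range(20)], 0.37
        for _ in range(100):
            w = scaled([fitness(x) for x in pop])
            parents = random.choices(pop, weights=w, k=len(pop))
            c = chaotic_step(c)
            pop = [p + (c - 0.5) * 0.5 for p in parents]   # chaos-driven mutation
        print(round(max(pop, key=fitness), 2))             # converges near 2.0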

  2. Criterial noise effects on rule-based category learning: the impact of delayed feedback.

    Science.gov (United States)

    Ell, Shawn W; Ing, A David; Maddox, W Todd

    2009-08-01

    Variability in the representation of the decision criterion is assumed in many category-learning models, yet few studies have directly examined its impact. On each trial, criterial noise should result in drift in the criterion and will negatively impact categorization accuracy, particularly in rule-based categorization tasks, where learning depends on the maintenance and manipulation of decision criteria. In three experiments, we tested this hypothesis and examined the impact of working memory on slowing the drift rate. In Experiment 1, we examined the effect of drift by inserting a 5-sec delay between the categorization response and the delivery of corrective feedback, and working memory demand was manipulated by varying the number of decision criteria to be learned. Delayed feedback adversely affected performance, but only when working memory demand was high. In Experiment 2, we built on a classic finding in the absolute identification literature and demonstrated that distributing the criteria across multiple dimensions decreases the impact of drift during the delay. In Experiment 3, we confirmed that the effect of drift during the delay is moderated by working memory. These results provide important insights into the interplay between criterial noise and working memory, as well as providing important constraints for models of rule-based category learning.

  3. Automated detection of pain from facial expressions: a rule-based approach using AAM

    Science.gov (United States)

    Chen, Zhanli; Ansari, Rashid; Wilkie, Diana J.

    2012-02-01

    In this paper, we examine the problem of using video analysis to assess pain, an important problem especially for critically ill, non-communicative patients, and people with dementia. We propose and evaluate an automated method to detect the presence of pain manifested in patient videos, using a unique and large collection of cancer patient videos captured in patient homes. The method is based on detecting pain-related facial action units defined in the Facial Action Coding System (FACS), which is widely used for objective assessment in pain analysis. In our research, a person-specific Active Appearance Model (AAM) based on the Project-Out Inverse Compositional Method is trained individually for each patient for modeling purposes. A flexible representation of the shape model is used in a rule-based method that is better suited than the more commonly used classifier-based methods for application to the cancer patient videos, in which pain-related facial actions occur infrequently and more subtly. The rule-based method relies on feature points that provide facial action cues and are extracted from the shape vertices of the AAM, which have a natural correspondence to facial muscle movement. In this paper, we investigate the detection of a commonly used set of pain-related action units in both the upper and lower face. Our detection results show good agreement with the results obtained by three trained FACS coders who independently reviewed and scored the action units in the cancer patient videos.

  4. Rule-based land cover classification from very high-resolution satellite image with multiresolution segmentation

    Science.gov (United States)

    Haque, Md. Enamul; Al-Ramadan, Baqer; Johnson, Brian A.

    2016-07-01

    Multiresolution segmentation and rule-based classification techniques are used to classify objects from very high-resolution satellite images of urban areas. Custom rules are developed using different spectral, geometric, and textural features with five scale parameters, which yield varying classification accuracy. Principal component analysis is used to select the most important features out of a total of 207 different features. In particular, seven different object types are considered for classification. The overall classification accuracy achieved for the rule-based method is 95.55% and 98.95% for seven and five classes, respectively. Other classifiers that do not use rules perform at 84.17% and 97.3% accuracy for seven and five classes, respectively. The results show coarse segmentation at higher scale parameters and fine segmentation at lower ones. The major contribution of this research is the development of rule sets and the identification of major features for satellite image classification, where the rule sets are transferable and the parameters are tunable for different types of imagery. Additionally, the individual object-wise classification and principal component analysis help to identify the required object from an arbitrary number of objects within images, given ground truth data for training.
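
    A minimal sketch of the feature-selection-plus-rules idea might look as follows (Python; the feature names and thresholds are hypothetical, and the random matrix stands in for the 207 object features):

        # PCA compresses/ranks the large feature set; hand-written rules then
        # classify objects on a few interpretable features.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 207))      # 500 segmented objects, 207 features
        pca = PCA(n_components=5).fit(X)
        Z = pca.transform(X)                 # most informative components

        def classify(ndvi, brightness, elongation):
            # Hypothetical rule set; real rules use spectral/geometric/textural cues.
            if ndvi > 0.4:
                return "vegetation"
            if brightness > 0.7 and elongation > 3.0:
                return "road"
            return "built-up"

        print(Z.shape, classify(0.55, 0.2, 1.0))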

  5. A QUALITY BASED ENHANCEMENT OF USER DATA PROTECTION VIA FUZZY RULE BASED SYSTEMS IN CLOUD ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    R Poorva Devi

    2016-04-01

    Full Text Available In cloud computing, distinct customers access and consume enormous amounts of services through the web, offered by a cloud service provider (CSP). Although one of the services clouds provide is security-as-a-service, people are still afraid to use services from cloud vendors. Many solutions, security components and measurements have been proposed for the cloud security issue, yet only about a 79.2% security outcome has been obtained by different scientists, researchers and other members of the cloud-based academic community. To overcome this cloud security problem, the proposed model, “Quality based enhancing the user data protection via fuzzy rule based systems in cloud environment”, helps cloud clients access cloud resources through remote monitoring management (RMMM); the services currently requested and consumed by cloud users can be better analyzed with a managed service provider (MSP) than with a traditional CSP. Normally, people try to secure their own private data by applying key management and cryptographic computations, which again leads to security problems. The model aims to provide a good-quality security target result by making use of fuzzy rule-based systems (constraint and conclusion segments) in the cloud environment. By using this technique, users may obtain an efficient security outcome through the Apache CloudStack cloud simulation tool.

  6. An expert system design to diagnose cancer by using a new method reduced rule base.

    Science.gov (United States)

    Başçiftçi, Fatih; Avuçlu, Emre

    2018-04-01

    A Medical Expert System (MES) was developed which uses a Reduced Rule Base to diagnose cancer risk according to the symptoms in an individual. A total of 13 symptoms were used. With the new MES, the reduced rules are checked instead of all possibilities (2^13 = 8192 different combinations), so results are found more quickly. The method of two-level simplification of Boolean functions was used to obtain the Reduced Rule Base. Thanks to the developed application, with dynamic numbers of inputs and outputs on different platforms, anyone can easily test their own cancer risk. More accurate results were obtained by considering all the possibilities related to cancer. Thirteen different risk factors were used to determine the type of cancer. The truth table produced in our study has 13 inputs and 4 outputs. The Boolean function minimization method is used to obtain fewer cases by simplifying the logical functions, allowing quick diagnosis through control of the 4 simplified output functions. Diagnosis made with the 4 output values obtained using the Reduced Rule Base was found to be quicker than diagnosis made by screening all 2^13 = 8192 possibilities. With the improved MES, more probabilities were added to the process and more accurate diagnostic results were obtained. The simplification process yielded a diagnosis speed gain of 100% in breast and renal cancer diagnosis and of 99% in cervical and lung cancer diagnosis. With Boolean function minimization, a smaller number of rules is evaluated instead of a large one. Reducing the number of rules allows the designed system to work more efficiently and to save time, and facilitates transferring the rules to the designed expert systems. Interfaces were developed on different software platforms to enable users to test the accuracy of the application. Anyone is able to diagnose the cancer themselves using the determinative risk factors.
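
    The simplification step can be illustrated with SymPy's SOPform, which performs the same kind of two-level Boolean minimization; the three-symptom truth table below is a toy stand-in for the paper's 13-input, 4-output table:

        # Instead of checking all 2^3 = 8 symptom combinations here (2^13 in
        # the paper), the minimized expression encodes the output with fewer rules.
        from sympy import symbols
        from sympy.logic import SOPform

        s1, s2, s3 = symbols("s1 s2 s3")       # three binary symptoms
        # Minterms where the (toy) truth table outputs "high risk":
        high_risk = [[1, 1, 0], [1, 1, 1], [1, 0, 1]]
        reduced = SOPform([s1, s2, s3], high_risk)
        print(reduced)                          # e.g. (s1 & s2) | (s1 & s3)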

  7. Residents' surgical performance during the laboratory years: an analysis of rule-based errors.

    Science.gov (United States)

    Nathwani, Jay N; Wise, Brett J; Garren, Margaret E; Mohamadipanah, Hossein; Van Beek, Nicole; DiMarco, Shannon M; Pugh, Carla M

    2017-11-01

    Nearly one-third of surgical residents will enter into academic development during their surgical residency by dedicating time to a research fellowship for 1-3 y. Major interest lies in understanding how laboratory residents' surgical skills are affected by minimal clinical exposure during academic development. A widely held concern is that the time away from clinical exposure results in surgical skills decay. This study examines the impact of the academic development years on residents' operative performance. We hypothesize that the use of repeated, annual assessments may result in learning even without individual feedback on participants' simulated performance. Surgical performance data were collected from laboratory residents (postgraduate years 2-5) during the summers of 2014, 2015, and 2016. Residents had 15 min to complete a shortened, simulated laparoscopic ventral hernia repair procedure. Final hernia repair skins from all participants were scored using a previously validated checklist. An analysis of variance test compared the mean performance scores of repeat participants to those of first-time participants. Twenty-seven (37% female) laboratory residents provided 2-year assessment data over the 3-year span of the study. Second-time performance revealed improvement from a mean score of 14 (standard error = 1.0) in the first year to 17.2 (SD = 0.9) in the second year, (F[1, 52] = 5.6, P = 0.022). Detailed analysis demonstrated improvement in performance for 3 grading criteria that were considered to be rule-based errors. There was no improvement in operative strategy errors. Analysis of longitudinal performance of laboratory residents shows higher scores for repeat participants in the category of rule-based errors. These findings suggest that laboratory residents can learn from rule-based mistakes when provided with annual performance-based assessments. This benefit was not seen with operative strategy errors and has important implications for

  8. A General Attribute and Rule Based Role-Based Access Control Model

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Growing numbers of users and many access control policies involving many different resource attributes in service-oriented environments create various problems in protecting resources. This paper analyzes the relationships of resource attributes to user attributes in all policies, and proposes a general attribute and rule based role-based access control (GAR-RBAC) model to meet these security needs. The model can dynamically assign users to roles via rules to cope with growing numbers of users. The rules use different attribute expressions and permissions as part of the authorization constraints, and are defined by analyzing the relations of resource attributes to user attributes in the many access policies defined by the enterprise. The model is a general access control model, can support many access control policies, and can also be applied more widely to services. The paper also describes how to use the GAR-RBAC model in Web service environments.
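
    A minimal sketch of the idea, with hypothetical attributes, roles and permissions rather than the paper's formal model, could look like this:

        # Rules map user attributes to roles; roles map to permission sets.
        RULES = [
            (lambda u: u["dept"] == "radiology" and u["level"] >= 3, "senior_analyst"),
            (lambda u: u["dept"] == "radiology",                     "analyst"),
            (lambda u: True,                                         "guest"),
        ]
        PERMS = {"senior_analyst": {"read", "write", "approve"},
                 "analyst":        {"read", "write"},
                 "guest":          {"read"}}

        def assign_role(user):
            # First matching rule wins; ordering encodes rule priority.
            return next(role for cond, role in RULES if cond(user))

        u = {"name": "kim", "dept": "radiology", "level": 2}
        print(assign_role(u), PERMS[assign_role(u)])  # -> analyst and its permission set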

  9. Association Rule-based Predictive Model for Machine Failure in Industrial Internet of Things

    Science.gov (United States)

    Kwon, Jung-Hyok; Lee, Sol-Bee; Park, Jaehoon; Kim, Eui-Jik

    2017-09-01

    This paper proposes an association rule-based predictive model for machine failure in the industrial Internet of things (IIoT), which can accurately predict machine failure in a real manufacturing environment by investigating the relationship between the cause and the type of machine failure. To develop the predictive model, we consider three major steps: 1) binarization, 2) rule creation, and 3) visualization. The binarization step translates item values in a dataset into one or zero, then the rule creation step creates association rules as IF-THEN structures using the Lattice model and the Apriori algorithm. Finally, the created rules are visualized in various ways for users’ understanding. An experimental implementation was conducted using R Studio version 3.3.2. The results show that the proposed predictive model realistically predicts machine failure based on association rules.
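
    The rule-creation step can be sketched compactly; the following toy miner (our own simplification of the Lattice/Apriori machinery, with invented sensor events) emits IF-THEN failure rules with their support and confidence:

        # Find frequent cause itemsets Apriori-style, then emit failure rules.
        from itertools import combinations

        logs = [{"overheat", "vibration", "failure"},
                {"overheat", "failure"},
                {"vibration"},
                {"overheat", "vibration", "failure"}]
        MIN_SUP = 0.5

        def support(itemset):
            return sum(itemset <= row for row in logs) / len(logs)

        items = {i for row in logs for i in row} - {"failure"}
        for r in (1, 2):
            for lhs in combinations(sorted(items), r):
                cause = set(lhs)
                if support(cause) >= MIN_SUP:
                    conf = support(cause | {"failure"}) / support(cause)
                    print(f"IF {' & '.join(lhs)} THEN failure  (conf={conf:.2f})")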

  10. Increasing Supply-Chain Visibility with Rule-Based RFID Data Analysis

    DEFF Research Database (Denmark)

    Ilic, A.; Andersen, Thomas; Michahelles, F.

    2009-01-01

    RFID technology tracks the flow of physical items and goods in supply chains to help users detect inefficiencies, such as shipment delays, theft, or inventory problems. An inevitable consequence, however, is that it generates huge numbers of events. To exploit these large amounts of data, the Supply Chain Visualizer increases supply-chain visibility by analyzing RFID data, using a mix of automated analysis techniques and human effort. The tool's core concepts include rule-based analysis techniques and a map-based representation interface. With these features, it lets users visualize the flow of goods across a supply chain.

  11. An Enhanced Rule-Based Web Scanner Based on Similarity Score

    Directory of Open Access Journals (Sweden)

    LEE, M.

    2016-08-01

    Full Text Available This paper proposes an enhanced rule-based web scanner in order to achieve better accuracy in detecting web vulnerabilities than existing tools, which have relatively high false alarm rates when web pages are installed in unconventional directory paths. Using the proposed matching method based on a similarity score, the scheme can determine whether two pages have the same vulnerabilities or not. With this method, the proposed scheme is able to figure out whether target web pages are vulnerable by comparing them to web pages that are known to have vulnerabilities. We show that the proposed scanner reduces the false alarm rate by 12% compared to an existing well-known scanner through performance evaluation via various experiments. The proposed scheme is especially helpful in detecting vulnerabilities of web applications derived from well-known open-source web applications after small customization, which happens frequently in many small-sized companies.
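
    One plausible reading of the similarity-score matching (the exact scoring function is not given in the abstract, so token-set Jaccard is our assumption) is:

        # Two pages scoring above a threshold are treated as the same page
        # moved to a new path, inheriting its known vulnerabilities.
        def jaccard(a: str, b: str) -> float:
            ta, tb = set(a.split()), set(b.split())
            return len(ta & tb) / len(ta | tb)

        known_vulnerable = "login.php form action=process.php input name=user"
        candidate = "auth/login.php form action=process.php input name=user id=u"
        score = jaccard(known_vulnerable, candidate)
        print(f"{score:.2f}",
              "-> same page, flag known vulnerabilities" if score > 0.5
              else "-> different page")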

  12. Using fuzzy rule-based knowledge model for optimum plating conditions search

    Science.gov (United States)

    Solovjev, D. S.; Solovjeva, I. A.; Litovka, Yu V.; Arzamastsev, A. A.; Glazkov, V. P.; L’vov, A. A.

    2018-03-01

    The paper discusses existing approaches to plating process modeling aimed at decreasing the thickness distribution of the plated surface coating. However, these approaches do not take into account the experience, knowledge, and intuition of the decision-makers when searching for the optimal conditions of the electroplating technological process. An original approach to the search for optimal conditions for applying electroplating coatings is proposed, which uses a rule-based knowledge model and allows one to reduce uneven product thickness distribution. The block diagrams of a conventional control system of a galvanic process, as well as of a system based on the production model of knowledge, are considered. It is shown that the fuzzy production model of knowledge in the control system makes it possible to obtain galvanic coatings of a given thickness unevenness with a high degree of adequacy to the experimental data. The described experimental results confirm the theoretical conclusions.

  13. Fuzzy Rule-based Analysis of Promotional Efficiency in Vietnam’s Tourism Industry

    Directory of Open Access Journals (Sweden)

    Nguyen Quang VINH

    2015-06-01

    Full Text Available This study aims to determine an effective method of measuring the efficiency of promotional strategies for tourist destinations. Complicating factors that influence promotional efficiency (PE), such as promotional activities (PA), destination attributes (DA), and destination image (DI), make it difficult to evaluate the effectiveness of PE. This study develops a rule-based decision support mechanism using fuzzy set theory and the Analytic Hierarchy Process (AHP) to evaluate the effectiveness of promotional strategies. Additionally, a statistical analysis is conducted using SPSS (Statistical Package for the Social Sciences) to confirm the results of the fuzzy AHP analysis. This study finds that government policy is the most important factor for PE and that service staff (internal beauty) are more important than tourism infrastructure (external beauty) in terms of customer satisfaction and long-term strategy in PE. With respect to DI, experts are concerned first with tourist perceived value, second with tourist satisfaction, and finally with tourist loyalty.

  14. A rule-based expert system for generating control displays at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Coulter, K.J.

    1993-01-01

    The integration of a rule-based expert system for generating screen displays for controlling and monitoring instrumentation under the Experimental Physics and Industrial Control System (EPICS) is presented. The expert system is implemented using CLIPS, an expert system shell from the Software Technology Branch at Lyndon B. Johnson Space Center. The user selects the hardware input and output to be displayed and the expert system constructs a graphical control screen appropriate for the data. Such a system provides a method for implementing a common look and feel for displays created by several different users and reduces the amount of time required to create displays for new hardware configurations. Users are able to modify the displays as needed using the EPICS display editor tool

  15. Analysis of Rules for Islamic Inheritance Law in Indonesia Using Hybrid Rule Based Learning

    Science.gov (United States)

    Khosyi'ah, S.; Irfan, M.; Maylawati, D. S.; Mukhlas, O. S.

    2018-01-01

    Along with the development of human civilization in Indonesia, changes and reform of Islamic inheritance law to conform to current conditions and culture cannot be denied. The distribution of inheritance in Indonesia can be done automatically by storing the rules of Islamic inheritance law in an expert system. In this study, we analyze the knowledge of experts in Islamic inheritance in Indonesia and represent it in the form of rules using rule-based Forward Chaining (FC) and Davis-Putnam-Logemann-Loveland (DPLL) algorithms. By hybridizing the FC and DPLL algorithms, the rules of Islamic inheritance law in Indonesia are clearly defined and measured. The rules were conceptually validated by experts in Islamic law and informatics. The results revealed that generally all rules were ready for use in an expert system.
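
    A minimal forward-chaining engine of the kind described, with invented inheritance facts and rules rather than the validated rule base, fits in a few lines:

        # Rules fire repeatedly until no new facts can be derived.
        RULES = [
            ({"deceased_has_son"},                 "children_inherit"),
            ({"children_inherit", "has_daughter"}, "daughter_gets_half_of_son"),
            ({"no_children", "has_brother"},       "brother_inherits"),
        ]

        def forward_chain(facts):
            facts = set(facts)
            changed = True
            while changed:
                changed = False
                for premises, conclusion in RULES:
                    if premises <= facts and conclusion not in facts:
                        facts.add(conclusion)   # fire the rule
                        changed = True
            return facts

        print(forward_chain({"deceased_has_son", "has_daughter"}))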

  16. Method for automatic control rod operation using rule-based control

    International Nuclear Information System (INIS)

    Kinoshita, Mitsuo; Yamada, Naoyuki; Kiguchi, Takashi

    1988-01-01

    An automatic control rod operation method using rule-based control is proposed. Its features are as follows: (1) a production system to recognize plant events, determine control actions and realize fast inference (fast selection of a suitable production rule); (2) use of the fuzzy control technique to determine quantitative control variables. The method's performance was evaluated by simulation tests on automatic control rod operation at a BWR plant start-up. The results were as follows: (1) the performance, in terms of stabilization of controlled variables and the time required for reactor start-up, was superior to that of other methods such as PID control and program control; (2) the processing time to select and interpret a suitable production rule, which was the same as that required for event recognition or determination of a control action, was short enough (below 1 s) for real-time control. The results showed that the method is effective for automatic control rod operation. (author)

  17. A rule-based expert system for generating control displays at the advanced photon source

    International Nuclear Information System (INIS)

    Coulter, K.J.

    1994-01-01

    The integration of a rule-based expert system for generating screen displays for controlling and monitoring instrumentation under the Experimental Physics and Industrial Control System (EPICS) is presented. The expert system is implemented using CLIPS, an expert system shell from the Software Technology Branch at Lyndon B. Johnson Space Center. The user selects the hardware input and output to be displayed and the expert system constructs a graphical control screen appropriate for the data. Such a system provides a method for implementing a common look and feel for displays created by several different users and reduces the amount of time required to create displays for new hardware configurations. Users are able to modify the displays as needed using the EPICS display editor tool. ((orig.))

  18. Rule-Based vs. Behavior-Based Self-Deployment for Mobile Wireless Sensor Networks.

    Science.gov (United States)

    Urdiales, Cristina; Aguilera, Francisco; González-Parada, Eva; Cano-García, Jose; Sandoval, Francisco

    2016-07-07

    In mobile wireless sensor networks (MWSN), nodes are allowed to move autonomously for deployment. This process is meant: (i) to achieve good coverage; and (ii) to distribute the communication load as homogeneously as possible. Rather than optimizing deployment, reactive algorithms are based on a set of rules or behaviors, so nodes can determine when to move. This paper presents an experimental evaluation of both reactive deployment approaches: rule-based and behavior-based ones. Specifically, we compare a backbone dispersion algorithm with a social potential fields algorithm. Most tests are done under simulation for a large number of nodes in environments with and without obstacles. Results are validated using a small robot network in the real world. Our results show that behavior-based deployment tends to provide better coverage and communication balance, especially for a large number of nodes in areas with obstacles.

  19. Research on key technology of the verification system of steel rule based on vision measurement

    Science.gov (United States)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, brings low precision and low efficiency. A machine vision based verification system for steel rules is designed with reference to JJG1-1999, the Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results strongly prove that these methods not only meet the precision requirements of the verification regulation, but also improve the reliability and efficiency of the verification system.

  20. Neural Substrates of Similarity and Rule-based Strategies in Judgment

    Directory of Open Access Journals (Sweden)

    Bettina von Helversen

    2014-10-01

    Full Text Available Making accurate judgments is a core human competence and a prerequisite for success in many areas of life. Plenty of evidence exists that people can employ different judgment strategies to solve identical judgment problems. In categorization, it has been demonstrated that similarity-based and rule-based strategies are associated with activity in different brain regions. Building on this research, the present work tests whether solving two identical judgment problems recruits different neural substrates depending on people's judgment strategies. Combining cognitive modeling of judgment strategies at the behavioral level with functional magnetic resonance imaging (fMRI, we compare brain activity when using two archetypal judgment strategies: a similarity-based exemplar strategy and a rule-based heuristic strategy. Using an exemplar-based strategy should recruit areas involved in long-term memory processes to a larger extent than a heuristic strategy. In contrast, using a heuristic strategy should recruit areas involved in the application of rules to a larger extent than an exemplar-based strategy. Largely consistent with our hypotheses, we found that using an exemplar-based strategy led to relatively higher BOLD activity in the anterior prefrontal and inferior parietal cortex, presumably related to retrieval and selective attention processes. In contrast, using a heuristic strategy led to relatively higher activity in areas in the dorsolateral prefrontal and the temporal-parietal cortex associated with cognitive control and information integration. Thus, even when people solve identical judgment problems, different neural substrates can be recruited depending on the judgment strategy involved.

  1. Accurate crop classification using hierarchical genetic fuzzy rule-based systems

    Science.gov (United States)

    Topaloglou, Charalampos A.; Mylonas, Stelios K.; Stavrakoudis, Dimitris G.; Mastorocostas, Paris A.; Theocharis, John B.

    2014-10-01

    This paper investigates the effectiveness of an advanced classification system for accurate crop classification using very high resolution (VHR) satellite imagery. Specifically, a recently proposed genetic fuzzy rule-based classification system (GFRBCS) is employed, namely, the Hierarchical Rule-based Linguistic Classifier (HiRLiC). HiRLiC's model comprises a small set of simple IF-THEN fuzzy rules, easily interpretable by humans. One of its most important attributes is that its learning algorithm requires minimal user interaction, since the most important learning parameters affecting the classification accuracy are determined by the learning algorithm automatically. HiRLiC is applied to a challenging crop classification task, using a SPOT5 satellite image over an intensively cultivated area in a lake-wetland ecosystem in northern Greece. A rich set of higher-order spectral and textural features is derived from the initial bands of the (pan-sharpened) image, resulting in an input space comprising 119 features. The experimental analysis proves that HiRLiC compares favorably to other interpretable classifiers in the literature, both in terms of structural complexity and classification accuracy. Its testing accuracy was very close to that obtained by complex state-of-the-art classification systems, such as support vector machine (SVM) and random forest (RF) classifiers. Nevertheless, visual inspection of the derived classification maps shows that HiRLiC is characterized by higher generalization properties, providing more homogeneous classifications than the competitors. Moreover, the runtime requirements for producing the thematic map were orders of magnitude lower than those of the competitors.

  2. Collaborative Working e-Learning Environments Supported by Rule-Based e-Tutor

    Directory of Open Access Journals (Sweden)

    Salaheddin Odeh

    2007-10-01

    Full Text Available Collaborative working environments for distance education set a goal of convenience and adaptation to our technologically advanced societies. To achieve this revolutionary new way of learning, environments must allow the different participants to communicate and coordinate with each other in a productive manner. Productivity and efficiency are obtained through synchronized communication between the different coordinating partners, which means that multiple users can execute an experiment simultaneously. Within this process, coordination can be accomplished by voice communication and chat tools. In recent times, multi-user environments have been successfully applied in many applications such as air traffic control systems, team-oriented military systems, chat text tools, and multi-player games. Thus, understanding the ideas and techniques behind these systems can be of great significance for contributing newer ideas to collaborative working e-learning environments. However, many problems still exist in distance learning and tele-education, such as not finding proper assistance while performing a remote experiment, so that students become overwhelmed and the experiment fails. In this paper, we discuss a solution that enables students to obtain automated help from either a human tutor or a rule-based e-tutor (an embedded rule-based system) for the purpose of student support in complex remote experimental environments. The technical implementation of the system can be realized by using Microsoft .NET, which offers a complete integrated development environment (IDE) with a wide collection of products and technologies. Once the system is developed, groups of students are independently able to coordinate and execute the experiment at any time and from any place, organizing the work between them productively.

  3. Prioritized rule based load management technique for residential building powered by PV/battery system

    Directory of Open Access Journals (Sweden)

    T.R. Ayodele

    2017-06-01

    Full Text Available In recent years, the Solar Photovoltaic (PV) system has presented itself as one of the main solutions to the electricity poverty plaguing the majority of buildings in rural communities with solar energy potential. However, the stochasticity associated with solar PV power output, owing to vagaries in weather conditions, is a major challenge in the deployment of these systems. This study investigates an approach for maximizing the benefits of a Stand-Alone Photovoltaic-Battery (SAPVB) system via techniques that provide for optimum energy harvesting and management. A rule-based load management scheme is developed and tested for a residential building. The approach allows load prioritizing and shifting based on certain rules. To achieve this, the residential loads are classified into Critical Loads (CLs) and Uncritical Loads (ULs). The CLs are given higher priority and are therefore allowed to operate at their scheduled time, while the ULs have lower priority and hence can be shifted to a time when there is enough electric power generation from the PV arrays, rather than being operated at the time period set by the user. Four scenarios were created to give insight into the applicability of the proposed rule-based load management scheme. The results revealed that when the load management technique is not utilized, as in the case of scenario 1 (base case), the percentage satisfaction of the critical and uncritical loads by the PV system is 49.8% and 23.7%. With the implementation of the load management scheme, the percentage satisfaction of the loads (CLs, ULs) is (93.8%, 74.2%), (90.9%, 70.1%) and (87.2%, 65.4%) for scenarios 2, 3 and 4, respectively.
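
    The prioritised rule can be sketched as follows (invented loads and PV values; the paper's scheme also accounts for details such as battery state that are omitted here):

        # Critical loads run at their scheduled hour; uncritical loads shift
        # to the hour with the largest PV surplus still available.
        pv_forecast = {8: 0.4, 10: 1.8, 12: 2.5, 14: 2.1, 18: 0.2}  # kW per hour

        loads = [("fridge", 0.3, 8,  True),   # (name, kW, scheduled hour, critical?)
                 ("pump",   1.0, 18, False),
                 ("washer", 1.5, 8,  False)]

        surplus = dict(pv_forecast)
        schedule = {}
        for name, kw, hour, critical in sorted(loads, key=lambda l: not l[3]):
            if not critical:
                hour = max(surplus, key=surplus.get)  # shift UL to best PV hour
            schedule[name] = hour
            surplus[hour] -= kw

        print(schedule)   # -> {'fridge': 8, 'pump': 12, 'washer': 14}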

  4. Improving the anesthetic process by a fuzzy rule based medical decision system.

    Science.gov (United States)

    Mendez, Juan Albino; Leon, Ana; Marrero, Ayoze; Gonzalez-Cava, Jose M; Reboso, Jose Antonio; Estevez, Jose Ignacio; Gomez-Gonzalez, José F

    2018-01-01

    The main objective of this research is the design and implementation of a new fuzzy logic tool for automatic drug delivery in patients undergoing general anesthesia. The aim is to adjust the drug dose to the real patient needs using heuristic knowledge provided by clinicians. A two-level computer decision system is proposed. The idea is to release the clinician from routine tasks so that he can focus on other variables of the patient. The controller uses the Bispectral Index (BIS) to assess the hypnotic state of the patient. The fuzzy controller was included in a closed-loop system to reach the BIS target and reject disturbances. BIS was measured using a BIS VISTA monitor, a device capable of calculating the hypnosis level of the patient from EEG information. An infusion pump with propofol 1% is used to supply the drug to the patient. The inputs to the fuzzy inference system are the BIS error and the BIS rate. The output is the infusion rate increment. The mapping from the input information to the appropriate output is given by a rule base built on the knowledge of clinicians. To evaluate the performance of the proposed fuzzy closed-loop system, an observational study was carried out. Eighty-one patients scheduled for ambulatory surgery were randomly distributed into 2 groups: one group using a fuzzy logic based closed-loop system (FCL) to automate the administration of propofol (42 cases); the second group using manual delivery of the drug (39 cases). In both groups, the BIS target was 50. The FCL, designed with intuitive logic rules based on clinician experience, performed satisfactorily and outperformed manual administration in terms of accuracy throughout the maintenance stage.
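
    A rough sketch of the two-input fuzzy mapping is given below; the membership shapes, rule outputs and sign conventions are our assumptions, not the clinical rule base used in the study:

        # Sugeno-style shortcut: rule strengths weight crisp output actions.
        def tri(x, a, b, c):
            # Triangular membership function.
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def infusion_increment(bis_error, bis_rate):
            # bis_error = measured BIS - target; positive -> patient too "awake".
            neg = tri(bis_error, -30, -15, 0)
            zero = tri(bis_error, -10, 0, 10)
            pos = tri(bis_error, 0, 15, 30)
            rising = tri(bis_rate, 0, 5, 10)
            rules = [(pos, +2.0), (zero, 0.0), (neg, -2.0), (min(pos, rising), +1.0)]
            num = sum(w * out for w, out in rules)
            den = sum(w for w, _ in rules) or 1.0
            return num / den        # increment applied to the propofol pump rate

        print(round(infusion_increment(bis_error=12, bis_rate=6), 2))  # -> 1.5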

  5. SPATKIN: a simulator for rule-based modeling of biomolecular site dynamics on surfaces.

    Science.gov (United States)

    Kochanczyk, Marek; Hlavacek, William S; Lipniacki, Tomasz

    2017-11-15

    Rule-based modeling is a powerful approach for studying biomolecular site dynamics. Here, we present SPATKIN, a general-purpose simulator for rule-based modeling in two spatial dimensions. The simulation algorithm is a lattice-based method that tracks the Brownian motion of individual molecules and the stochastic firing of rule-defined reaction events. Because rules are used as event generators, the algorithm is network-free, meaning that it does not require generating the complete reaction network implied by the rules prior to simulation. In a simulation, each molecule (or complex of molecules) is taken to occupy a single lattice site that cannot be shared with another molecule (or complex). SPATKIN is capable of simulating a wide array of membrane-associated processes, including adsorption, desorption and crowding. Models are specified using an extension of the BioNetGen language, which allows spatial features of the simulated process to be taken into account. The C++ source code for SPATKIN is distributed freely under the terms of the GNU GPLv3 license. The source code can be compiled for execution on popular platforms (Windows, Mac and Linux). An installer for 64-bit Windows and a macOS app are available. The source code and precompiled binaries are available at the SPATKIN Web site (http://pmbm.ippt.pan.pl/software/spatkin). Contact: spatkin.simulator@gmail.com. Supplementary data are available at Bioinformatics online.

  6. PRINCIPLES- AND RULES-BASED ACCOUNTING DEBATE. IMPLICATIONS FOR AN EMERGENT COUNTRY

    Directory of Open Access Journals (Sweden)

    Deaconu Adela

    2011-07-01

    Full Text Available Through a qualitative analysis, this research examines whether a principles-based system, or a mixed version of it with the rules-based system, applied in Romania - an emergent country - is appropriate, taking into account the mentalities, traditions, and other cultural elements that were typical of a rules-based system. We support the statement that, even if certain contextual variables are common to other developed countries, their environments differ significantly. To be effective, financial reporting must reflect the context of the firm in which it functions. The research takes a deductive approach based on the analysis of cultural factors and their influence in recent years. For Romania, we argue that lower accounting professionalism is associated with a low level of ambiguity tolerance. For the period analysed in this study (after the year 2005), professional reasoning - a proxy for accounting professional behaviour - took into consideration fiscal and legal requirements rather than accounting principles and judgments. The research suggests that Romanian accounting practice and professionals are not fully prepared for a principles-based system environment, which is associated with the ability to find undisclosed events, face ambiguity, identify inferred relationships, use intuition, and work with uncertainty. We therefore reach the conclusion that in Romania institutional amendments affecting professional expertise would be needed. Accounting regulations must be chosen with great caution and must answer and/or be adjusted, even if the process is delayed, to national values, the behaviour of companies, and individual expertise and beliefs. Secondly, the benefits of applying accounting reasoning in this country may be enhanced through a better understanding of its content and through practical exercise. Here regulatory bodies may intervene by organizing professional training programs and acting

  7. Fuzzylot: a novel self-organising fuzzy-neural rule-based pilot system for automated vehicles.

    Science.gov (United States)

    Pasquier, M; Quek, C; Toh, M

    2001-10-01

    This paper presents part of our research work concerned with the realisation of an Intelligent Vehicle and the technologies required for its routing, navigation, and control. An automated driver prototype has been developed using a self-organising fuzzy rule-based system (POPFNN-CRI(S)) to model and subsequently emulate human driving expertise. The ability of fuzzy logic to represent vague information using linguistic variables makes it a powerful tool for developing rule-based control systems when an exact working model is not available, as is the case in any vehicle-driving task. Designing a fuzzy system, however, is a complex endeavour, due to the need to define the variables and their associated fuzzy sets, and to determine a suitable rule base. Many efforts have thus been devoted to automating this process, yielding the development of learning and optimisation techniques. One of them is the family of POP-FNNs, or Pseudo-Outer Product Fuzzy Neural Networks (TVR, AARS(S), AARS(NS), CRI, Yager). These generic self-organising neural networks developed at the Intelligent Systems Laboratory (ISL/NTU) are based on formal fuzzy mathematical theory and are able to objectively extract a fuzzy rule base from training data. In this application, a driving simulator has been developed that integrates a detailed model of the car dynamics, complete with engine characteristics and environmental parameters, and an OpenGL-based 3D simulation interface coupled with a driving wheel and accelerator/brake pedals. The simulator has been used on various road scenarios to record, from a human pilot, driving data consisting of steering and speed control actions associated with road features. Specifically, the POPFNN-CRI(S) system is used to cluster the data and extract a fuzzy rule base modelling the human driving behaviour. Finally, the effectiveness of the generated rule base has been validated using the simulator in autopilot mode.

  8. ML-Space: Hybrid Spatial Gillespie and Particle Simulation of Multi-Level Rule-Based Models in Cell Biology.

    Science.gov (United States)

    Bittig, Arne T; Uhrmacher, Adelinde M

    2017-01-01

    Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial Stochastic Simulation Algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically and hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact model descriptions and easy adaptation of a model's spatial resolution.

  9. Rule Based Expert System for Monitoring Real Time Drug Supply in Hospital Using Radio Frequency Identification Technology

    Science.gov (United States)

    Driandanu, Galih; Surarso, Bayu; Suryono

    2018-02-01

    Radio frequency identification (RFID) has obtained increasing attention with the emergence of various applications. This study examines the implementation of a rule-based expert system, supported by RFID technology, in an information system for monitoring drug supply in a hospital. The research facilitates real-time monitoring of the drug supply using sample data from the hospital pharmacy. The system is able to identify and count drugs and to provide warnings and reports in real time. The conclusion is that the rule-based expert system and RFID technology can facilitate fast and precise monitoring of the drug supply.

  10. Intelligent Recognition of Lung Nodule Combining Rule-based and C-SVM Classifiers

    Directory of Open Access Journals (Sweden)

    Bin Li

    2011-10-01

    Full Text Available Computer-aided detection (CAD) systems for lung nodules play an important role in the diagnosis of lung cancer. In this paper, an improved intelligent recognition method for lung nodules in HRCT, combining rule-based and cost-sensitive support vector machine (C-SVM) classifiers, is proposed for detecting both solid nodules and ground-glass opacity (GGO) nodules (part solid and nonsolid). This method consists of several steps. First, segmentation of regions of interest (ROIs), including pulmonary parenchyma and lung nodule candidates, is a difficult task: on one side, the presence of noise lowers the visibility of low-contrast objects; on the other side, different types of nodules, including small nodules, nodules connecting to vasculature or other structures, and part-solid or nonsolid nodules, are complex, noisy, weak-edged or have boundaries that are difficult to define. In order to overcome the problems of boundary leaks and slow evolution speed in the segmentation of weak edges, an overall segmentation method is proposed: the lung parenchyma is extracted with a threshold-based and morphological segmentation method; image denoising and enhancement is realized by nonlinear anisotropic diffusion filtering (NADF); and candidate pulmonary nodules are segmented by an improved C-V level set method, in which the segmentation result of an EM-based fuzzy threshold method is used as the initial contour of the active contour model and a constrained energy term is added to the PDE of the level set function. Then, lung nodules are classified using intelligent classifiers combining rules and C-SVM. Rule-based classification is first used to remove easily dismissible non-nodule objects; C-SVM classification is then used to further classify nodule candidates and reduce the number of false-positive (FP) objects. In order to increase the efficiency of the SVM, an improved training method is used, which applies grid search to find the optimal parameters.
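
    The combined classifier can be sketched as a two-stage cascade (synthetic features, labels and thresholds; not the paper's pipeline):

        # Cheap rules discard obvious non-nodules first, then a cost-sensitive
        # SVM classifies the remaining candidates.
        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 3))                # [diameter, roundness, contrast]
        y = (X[:, 0] + X[:, 1] > 0.5).astype(int)    # toy nodule labels

        def rule_filter(x):
            # Rule stage: tiny or very irregular candidates are dismissed outright.
            return x[0] > -1.0 and x[1] > -1.5

        svm = SVC(class_weight={0: 1, 1: 5}).fit(X, y)  # penalise missed nodules more

        for c in rng.normal(size=(5, 3)):
            verdict = "non-nodule (rule)" if not rule_filter(c) else \
                      ("nodule" if svm.predict([c])[0] else "non-nodule (SVM)")
            print(np.round(c, 2), verdict)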

  12. Ensemble Classifiers for Predicting HIV-1 Resistance from Three Rule-Based Genotypic Resistance Interpretation Systems.

    Science.gov (United States)

    Raposo, Letícia M; Nobre, Flavio F

    2017-08-30

    Resistance to antiretrovirals (ARVs) is a major problem faced by HIV-infected individuals. Different rule-based algorithms were developed to infer HIV-1 susceptibility to antiretrovirals from genotypic data. However, there is discordance between them, resulting in difficulties for clinical decisions about which treatment to use. Here, we developed ensemble classifiers integrating three interpretation algorithms: Agence Nationale de Recherche sur le SIDA (ANRS), Rega, and the genotypic resistance interpretation system from the Stanford HIV Drug Resistance Database (HIVdb). Three approaches were applied to develop a classifier with a single resistance profile: stacked generalization, a simple plurality vote scheme, and the selection of the interpretation system with the best performance. The strategies were compared with Friedman's test, and the performance of the classifiers was evaluated using the F-measure, sensitivity and specificity values. We found that the three strategies had similar performances for the selected antiretrovirals. For some cases, the stacking technique with naïve Bayes as the learning algorithm showed a statistically superior F-measure. This study demonstrates that ensemble classifiers can be an alternative tool for clinical decision-making, since they provide a single resistance profile from the most commonly used resistance interpretation systems.
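
    The simple plurality vote scheme is the easiest of the three strategies to sketch (the interpreter names follow the abstract; the call format is our invention, since ANRS/Rega/HIVdb each expose their own output formats in practice):

        # Majority label wins; a tie across interpreters is flagged for review.
        from collections import Counter

        def plurality(calls):
            counts = Counter(calls.values())
            label, n = counts.most_common(1)[0]
            return label if n > 1 else "discordant"

        print(plurality({"ANRS": "resistant",
                         "Rega": "susceptible",
                         "HIVdb": "resistant"}))   # -> resistant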

  13. Evolution of Collective Behaviour in an Artificial World Using Linguistic Fuzzy Rule-Based Systems.

    Directory of Open Access Journals (Sweden)

    Jure Demšar

    Full Text Available Collective behaviour is a fascinating and easily observable phenomenon, attractive to a wide range of researchers. In biology, computational models have been extensively used to investigate various properties of collective behaviour, such as the transfer of information across the group, the benefits of grouping (defence against predation, foraging), the group decision-making process, and group behaviour types. The question 'why', however, remains largely unanswered. Here the interest lies in which pressures led to the evolution of such behaviour, and evolutionary computational models have already been used to test various biological hypotheses. Most of these models use genetic algorithms to tune the parameters of previously presented non-evolutionary models, but very few attempt to evolve collective behaviour from scratch. Of these last, the successful attempts display clumping or swarming behaviour. Empirical evidence suggests that in fish schools there exist three classes of behaviour: swarming, milling and polarized. In this paper we present a novel, artificial-life-like evolutionary model, where individual agents are governed by linguistic fuzzy rule-based systems, which is capable of evolving all three classes of behaviour.

  14. RFID sensor-tags feeding a context-aware rule-based healthcare monitoring system.

    Science.gov (United States)

    Catarinucci, Luca; Colella, Riccardo; Esposito, Alessandra; Tarricone, Luciano; Zappatore, Marco

    2012-12-01

    Along with the growth of the aging population and the necessity of efficient wellness systems, there is a mounting demand for new technological solutions able to support remote and proactive healthcare. An answer to this need could be provided by the joint use of the emerging Radio Frequency Identification (RFID) technologies and advanced software choices. This paper presents a proposal for a context-aware infrastructure for ubiquitous and pervasive monitoring of heterogeneous healthcare-related scenarios, fed by RFID-based wireless sensor nodes. The software framework is based on a general-purpose architecture exploiting three key implementation choices: ontology representation, the multi-agent paradigm and rule-based logic. From the hardware point of view, the sensing and gathering of context data is entrusted to a new Enhanced RFID Sensor-Tag. This new device, de facto, makes possible the easy integration of RFID with generic sensors, guaranteeing flexibility and preserving the benefits in terms of simplicity of use and low cost of UHF RFID technology. The system is very efficient and versatile, and its customization to new scenarios requires very reduced effort, substantially limited to the update/extension of the ontology codification. Its effectiveness is demonstrated by reporting both the customization effort and the performance results obtained from validation in two different healthcare monitoring contexts.

  15. Fuzzy rule-based forecast of meteorological drought in western Niger

    Science.gov (United States)

    Abdourahamane, Zakari Seybou; Acar, Reşat

    2018-01-01

    Understanding the causes of rainfall anomalies in the West African Sahel in order to effectively predict drought events remains a challenge. The physical mechanisms that influence precipitation in this region are complex, uncertain, and imprecise in nature. Fuzzy logic techniques are renowned for being highly efficient in modeling such dynamics. This paper attempts to forecast meteorological drought in Western Niger using fuzzy rule-based modeling techniques. The 3-month-scale standardized precipitation index (SPI-3) of four rainfall stations was used as the predictand. Monthly data for the southern oscillation index (SOI), South Atlantic sea surface temperature (SST), relative humidity (RH), and Atlantic sea level pressure (SLP), sourced from the National Oceanic and Atmospheric Administration (NOAA), were used as predictors. Fuzzy rules and membership functions were generated using a fuzzy c-means clustering approach, expert decision, and literature review. For a minimum lead time of 1 month, the model has a coefficient of determination R² between 0.80 and 0.88, a mean square error (MSE) below 0.17, and a Nash-Sutcliffe efficiency (NSE) ranging between 0.79 and 0.87. The empirical frequency distributions of the predicted and the observed drought classes are equal at the 99% confidence level based on a two-sample t test. Results also revealed a discrepancy in the influence of SOI and SLP on drought occurrence at the four stations, while the effects of SST and RH are space independent, both being significantly correlated with drought occurrence; the fuzzy rule-based model thus shows better forecast skill.

  16. Knowledge Representation and Inference for Analysis and Design of Database and Tabular Rule-Based Systems

    Directory of Open Access Journals (Sweden)

    Antoni Ligeza

    2001-01-01

    Full Text Available Rule-based systems constitute a powerful tool for the specification of knowledge in the design and implementation of knowledge-based systems. They also provide a universal programming paradigm for domains such as intelligent control, decision support, situation classification and operational knowledge encoding. In order to assure safe and reliable performance, such systems should satisfy certain formal requirements, including completeness and consistency. This paper addresses the issue of analysis and verification of selected properties of a class of such systems in a systematic way. A uniform, tabular scheme of single-level rule-based systems is considered. Such systems can be applied as a generalized form of databases for the specification of data patterns (unconditional knowledge), or can be used for defining attributive decision tables (conditional knowledge in the form of rules). They can also serve as lower-level components of hierarchical multi-level control and decision-support knowledge-based systems. An algebraic knowledge representation paradigm using an extended tabular representation, similar to relational database tables, is presented, and algebraic bases for system analysis, verification and design support are outlined.
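
    Two of the formal requirements named above, completeness and consistency (determinism), can be checked mechanically for a small tabular rule base; the following sketch is our own simplification, using an invented two-attribute table:

        # Check every input combination: uncovered cases break completeness,
        # multiply covered cases with different decisions break determinism.
        from itertools import product

        ATTRS = {"temp": ["low", "high"], "pressure": ["low", "high"]}
        # Each rule: ({attribute: required value, or None for "any"}, decision)
        TABLE = [({"temp": "high", "pressure": None},   "open_valve"),
                 ({"temp": "low",  "pressure": "high"}, "vent"),
                 ({"temp": "low",  "pressure": "low"},  "idle")]

        def matches(cond, case):
            return all(v is None or case[k] == v for k, v in cond.items())

        for case_vals in product(*ATTRS.values()):
            case = dict(zip(ATTRS, case_vals))
            hits = [d for cond, d in TABLE if matches(cond, case)]
            if not hits:
                print("incomplete:", case)
            elif len(hits) > 1:
                print("non-deterministic:", case, hits)
        # (silent output here: the toy table is complete and deterministic)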

  17. A Servqual Rule Base Model of the Asean University Network for Study Program Quality Assessment

    Directory of Open Access Journals (Sweden)

    Esti Wijayanti

    2016-05-01

    Full Text Available As is well known, AUN (Asean University Network) and ABET (Accreditation Board for Engineering and Technology) are non-profit accreditation organizations. AUN assessment uses fifteen criteria: Expected Learning Outcomes, Programme Specification, Programme Structure and Content, Teaching and Learning Strategy, Student Assessment, Academic Staff Quality, Support Staff Quality, Student Quality, Student Advice and Support, Facilities and Infrastructure, Quality Assurance of Teaching/Learning Process, Staff Development Activities, Stakeholders Feedback, Output, and Stakeholders Satisfaction, scored on an adopted 7-point scale. Here, we discuss the fifteen AUN criteria and group them into the five servqual dimensions - assurance, empathy, responsiveness, reliability and facility - in order to make the assessment process easier. The research outcome indicates that the proposed method can be used to evaluate an education program. Validation using AUN data shows that the servqual rule base of the Asean University Network produces almost the same pattern, with a correlation value of 0.985; this can be accepted because its validity reaches 97%.

  18. Application of rule-based data mining techniques to real time ATLAS Grid job monitoring data

    CERN Document Server

    Ahrens, R; The ATLAS collaboration; Kalinin, S; Maettig, P; Sandhoff, M; dos Santos, T; Volkmer, F

    2012-01-01

    The Job Execution Monitor (JEM) is a job-centric grid job monitoring software developed at the University of Wuppertal and integrated into the pilot-based “PanDA” job brokerage system leveraging physics analysis and Monte Carlo event production for the ATLAS experiment on the Worldwide LHC Computing Grid (WLCG). With JEM, job progress and grid worker node health can be supervised in real time by users, site admins and shift personnel. Imminent error conditions can be detected early and countermeasures can be initiated by the job's owner immediately. Grid site admins can access aggregated data of all monitored jobs to infer the site status and to detect job and grid worker node misbehaviour. Shifters can use the same aggregated data to react quickly to site error conditions and broken production tasks. In this work, the application of novel data-centric rule-based methods and data-mining techniques to the real-time monitoring data is discussed. The usage of such automatic inference techniques on monitorin...

  19. A Web-Based Rice Plant Expert System Using Rule-Based Reasoning

    Directory of Open Access Journals (Sweden)

    Anton Setiawan Honggowibowo

    2009-12-01

    Full Text Available Rice plants can be attacked by various kinds of diseases, which can often be determined from their symptoms. However, identifying the exact type of disease requires an agricultural expert's opinion, while the number of agricultural experts is limited and there are too many problems to be solved at the same time. This makes a system with the capability of an expert necessary. Such a system must contain the knowledge of the diseases and symptoms of rice plants that an agricultural expert has. This research designs a web-based expert system using rule-based reasoning. The rules combine forward chaining and backward chaining inference in order to help farmers diagnose rice plant diseases. The web-based rice plant disease diagnosis expert system has the advantage of being easy to access and use; with its web-based features, farmers are expected to be able to access the expert system anywhere to diagnose rice diseases.
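
    A minimal sketch of the forward-chaining half of such an engine (illustrative only; the symptom and disease names are invented, not taken from the paper's knowledge base):

    ```python
    # Each rule: if all symptoms in the premise are observed, conclude the disease.
    RULES = [
        ({"brown_spots_on_leaves", "yellow_halo"}, "brown_spot"),
        ({"white_powder_on_panicle"}, "false_smut"),
        ({"stunted_growth", "yellow_leaves"}, "tungro"),
    ]

    def forward_chain(observed):
        """Return every disease whose premise is fully contained in the observed facts."""
        return [disease for premise, disease in RULES if premise <= observed]

    print(forward_chain({"stunted_growth", "yellow_leaves", "yellow_halo"}))  # ['tungro']
    ```

    Backward chaining would instead start from a candidate disease and ask the farmer only about the symptoms needed to confirm or reject it.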

  20. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Science.gov (United States)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-based system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  1. Rule-based learning of regular past tense in children with specific language impairment.

    Science.gov (United States)

    Smith-Lock, Karen M

    2015-01-01

    The treatment of children with specific language impairment was used as a means to investigate whether a single- or dual-mechanism theory best conceptualizes the acquisition of English past tense. The dual-mechanism theory proposes that regular English past-tense forms are produced via a rule-based process whereas past-tense forms of irregular verbs are stored in the lexicon. Single-mechanism theories propose that both regular and irregular past-tense verbs are stored in the lexicon. Five 5-year-olds with specific language impairment received treatment for regular past tense. The children were tested on regular past-tense production and third-person singular "s" twice before treatment and once after treatment, at eight-week intervals. Treatment consisted of one-hour play-based sessions, once weekly, for eight weeks. Crucially, treatment focused on different lexical items from those in the test. Each child demonstrated significant improvement on the untreated past-tense test items after treatment, but no improvement on the untreated third-person singular "s". Generalization to untreated past-tense verbs could not be attributed to a frequency effect or to phonological similarity of trained and tested items. It is argued that the results are consistent with a dual-mechanism theory of past-tense inflection.

  2. A rule-based verification and control framework in ATLAS Trigger-DAQ

    CERN Document Server

    Kazarov, A; Lehmann-Miotto, G; Sloper, J E; Ryabov, Yu; Computing In High Energy and Nuclear Physics

    2007-01-01

    In order to meet the requirements of ATLAS data taking, the ATLAS Trigger-DAQ system is composed of O(1000) applications running on more than 2600 computers in a network. At this scale, software and hardware failures are frequent. To minimize system downtime, the Trigger-DAQ control system includes advanced verification and diagnostics facilities. The operator can use tests and the expertise of the TDAQ and detector developers in order to diagnose and recover from errors, where possible automatically. The TDAQ control system is built as a distributed tree of controllers, where the behavior of each controller is defined in a rule-based language allowing easy customization. The control system also includes a verification framework which allows users to develop and configure tests for any component in the system with different levels of complexity. It can be used as a stand-alone test facility for a small detector installation, as part of the general TDAQ initialization procedure, and for diagnosing the problems ...

  3. Mining association rule based on the diseases population for recommendation of medicine need

    Science.gov (United States)

    Harahap, M.; Husein, A. M.; Aisyah, S.; Lubis, F. R.; Wijaya, B. A.

    2018-04-01

    Inappropriate selection of medicines can lead to stock-outs, which affects both medical services and economic value in the hospital. An appropriate medicine selection process therefore requires an automated way to estimate needs based on the development of patients' illnesses. In this study, we analyzed patient prescriptions to identify the relationship between the diseases and the medicines used by physicians in treating them. The analytical framework includes: (1) patient prescription data collection, (2) applying k-means clustering to identify the top 10 diseases, and (3) applying the Apriori algorithm to find association rules based on support, confidence and lift values. In tests on patient prescription datasets from 2015-2016, applying the k-means algorithm to cluster the 10 dominant diseases significantly affects the support and confidence values of all association rules found by the Apriori algorithm, making it more consistent at finding association rules between diseases and related medicines. The support, confidence and lift values of disease-medicine rules can be used as recommendations for appropriate medicine selection and, based on the progression of diseases seen at the hospital, for more optimal medicine procurement.
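
    For readers unfamiliar with the three rule metrics, this toy sketch computes support, confidence, and lift for a candidate disease-to-medicine rule over invented prescriptions (not the hospital dataset):

    ```python
    prescriptions = [
        {"hypertension", "amlodipine"},
        {"hypertension", "amlodipine", "paracetamol"},
        {"dyspepsia", "antacid"},
        {"hypertension", "captopril"},
    ]

    def support(itemset):
        """Fraction of transactions containing every item in the itemset."""
        return sum(itemset <= t for t in prescriptions) / len(prescriptions)

    def confidence(lhs, rhs):
        return support(lhs | rhs) / support(lhs)

    def lift(lhs, rhs):
        return confidence(lhs, rhs) / support(rhs)

    rule = ({"hypertension"}, {"amlodipine"})
    print(support(rule[0] | rule[1]),    # 0.5
          confidence(*rule),             # 0.666...
          lift(*rule))                   # 1.333... -> positive association
    ```

    Apriori makes this tractable on real data by extending only those itemsets whose subsets already meet the minimum support threshold.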

  4. Analysis of QCD sum rule based on the maximum entropy method

    International Nuclear Information System (INIS)

    Gubler, Philipp

    2012-01-01

    QCD sum rules were developed about thirty years ago and have been used up to the present to calculate various physical quantities of hadrons. Conventional analyses, however, have had to assume a 'pole + continuum' form for the spectral function, and the method therefore runs into difficulties when this assumption is not satisfied. In order to avoid this difficulty, an analysis making use of the maximum entropy method (MEM) has been developed by the present author. It is reported here how far this new method can be successfully applied. In the first section, the general features of the QCD sum rule are introduced. In section 2, it is discussed why the analysis by the QCD sum rule based on the MEM is so effective. In section 3, the MEM analysis process is described: in subsection 3.1 the likelihood function and prior probability are considered, and in subsection 3.2 numerical analyses are picked up. In section 4, some applications are described, starting with ρ mesons, then charmonia at finite temperature and finally recent developments. Some figures of the spectral functions are shown. In section 5, a summary of the present analysis method and a future outlook are given. (S. Funahashi)
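
    For orientation, the assumption being relaxed can be written explicitly; the LaTeX below restates the textbook Borel-transformed dispersion relation and the 'pole + continuum' ansatz in standard sum rule notation (generic formulas, not equations quoted from this report):

    ```latex
    % Borel-transformed dispersion relation relating the OPE side to the spectral function
    \Pi(M^2) = \int_0^\infty ds\, e^{-s/M^2}\, \rho(s)
    % Conventional ansatz that the MEM analysis avoids: sharp pole plus perturbative continuum
    \rho(s) \simeq \lambda^2\, \delta(s - m^2) + \theta(s - s_{\mathrm{th}})\, \rho_{\mathrm{pert}}(s)
    ```

    The MEM analysis instead extracts ρ(s) directly, using the prior probability to regularize the ill-posed inversion.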

  5. An Expert System for Diagnosis of Sleep Disorder Using Fuzzy Rule-Based Classification Systems

    Science.gov (United States)

    Septem Riza, Lala; Pradini, Mila; Fitrajaya Rahman, Eka; Rasim

    2017-03-01

    Sleep disorder is an anomaly that can disturb someone's sleeping pattern. It has become an issue nowadays, since people are busy with their own affairs and have no time to visit a doctor. Therefore, this research aims to develop a system for diagnosis of sleep disorders using a Fuzzy Rule-Based Classification System (FRBCS). FRBCS is a method based on fuzzy set concepts. It consists of two steps: (i) constructing a model/knowledge base involving a rule base and a database, and (ii) prediction over new data. In this case, the knowledge is obtained from experts, whereas in the prediction stage we perform fuzzification, inference, and classification. A platform implementing the method was then built with a combination of PHP and the R programming language using the “Shiny” package. To validate the system, experiments were performed using data from a psychiatric hospital in West Java, Indonesia. The accuracy of the results and the computation time are 84.85% and 0.0133 seconds, respectively.
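
    A compressed sketch of the three prediction steps named above (fuzzification, inference, classification), with made-up membership functions and rules rather than the experts' knowledge base:

    ```python
    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    # Fuzzification: map a crisp input (hours of sleep) to fuzzy terms.
    def fuzzify_sleep(hours):
        return {"short": tri(hours, 0, 3, 6), "normal": tri(hours, 4, 7, 10)}

    # Rule base: (required term, rule weight, class label).
    RULES = [("short", 1.0, "insomnia"), ("normal", 1.0, "no_disorder")]

    def classify(hours):
        terms = fuzzify_sleep(hours)
        # Inference: each rule fires with the degree of its premise;
        # classification picks the class with the highest firing strength.
        scores = {}
        for term, w, label in RULES:
            scores[label] = max(scores.get(label, 0.0), w * terms[term])
        return max(scores, key=scores.get)

    print(classify(4.5))   # 'insomnia'
    ```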

  6. Auto-control of pumping operations in sewerage systems by rule-based fuzzy neural networks

    Directory of Open Access Journals (Sweden)

    Y.-M. Chiang

    2011-01-01

    Full Text Available Pumping stations play an important role in flood mitigation in metropolitan areas. The existing sewerage systems, however, are facing the great challenge of fast-rising peak flows resulting from urbanization and climate change. It is imperative to construct an efficient and accurate operating prediction model for pumping stations to simulate the drainage mechanism for discharging the rainwater in advance. In this study, we propose two rule-based fuzzy neural networks, an adaptive neuro-fuzzy inference system (ANFIS) and a counterpropagation fuzzy neural network (CFNN), for on-line prediction of the number of open and closed pumps at a pivotal pumping station in Taipei city up to a lead time of 20 min. ANFIS outperforms CFNN in terms of model efficiency, accuracy, and correctness. Furthermore, the results not only show that the predicted water levels contribute to successful operation of pumping stations but also demonstrate the applicability and reliability of ANFIS in automatically controlling urban sewerage systems.

  7. A study on development of a rule based expert system for steam generator life extension

    International Nuclear Information System (INIS)

    Park, Jin Kyun

    1994-02-01

    The need to predict the integrity of steam generator (SG) tubes and the environmental conditions that affect their integrity is growing, in order to secure nuclear power plant (NPP) safety and enhance plant availability. To achieve these objectives it is important to diagnose the integrity of the SG tubes. An expert system called FEMODES (failure mode diagnosis expert system) has been developed for diagnosis of tube degradation phenomena such as denting, intergranular attack (IGA) and stress corrosion cracking (SCC) in the secondary side of the SG. With FEMODES it is possible to estimate the likelihood of SG tube degradation and to diagnose the environmental conditions that influence such degradation. Certainty factor theory (CFT) and a rule-based backward-chaining inference strategy are used in FEMODES. The information required for diagnosis was acquired from SG tube degradation experiences at two local reference plants and a limited number of overseas plants, and from technical reports and research papers on tube degradation. Overall results estimated with FEMODES are in reasonable agreement with actual SG tube degradation. Some discrepancy observed in several estimated values of SG tube degradation appears to be due to insufficient heuristic knowledge in the knowledge base of FEMODES.
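
    Certainty factor theory combines evidence with a simple closed form; the sketch below shows the classic MYCIN-style combination rule (an illustration of CFT in general, not FEMODES's actual rules; the example values are invented):

    ```python
    def combine_cf(cf1, cf2):
        """MYCIN-style combination of two certainty factors in [-1, 1]."""
        if cf1 >= 0 and cf2 >= 0:
            return cf1 + cf2 * (1 - cf1)
        if cf1 < 0 and cf2 < 0:
            return cf1 + cf2 * (1 + cf1)
        return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

    # Two rules each suggest stress corrosion cracking with moderate certainty:
    print(combine_cf(0.6, 0.5))   # 0.8 -- combined belief exceeds either rule alone
    ```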

  8. Under What Conditions Do Rules-Based and Capability-Based Management Modes Dominate?

    Directory of Open Access Journals (Sweden)

    Lukas Michel

    2018-04-01

    Full Text Available Despite real changes in the workplace and the negative consequences of prevailing hierarchical structures with rigid management systems, little attention has yet been paid to shifting management modes to accommodate the dynamics of the external environment, particularly when a firm's operating environment demands a high degree of flexibility. Building on the resource-based view as a basis for competitive advantage, we posit that differences in the stability of an organization's environment and in the degree of managerial control explain variations in the management mode used in firms. Unlike other studies, which mainly focus on either the dynamics of the external environment or management control, we have developed a theoretical model combining both streams of research in a contextual frame to describe under what conditions firms engage in rules-based, change-based, engagement-based and capability-based management modes. To test our theoretical framework, we conducted a survey of 54 firms in various industries and nations on how their organizations cope with a dynamic environment and what management style they use in response. Our study reveals that the appropriate mode can be determined by analyzing purpose, motivation, knowledge and information, as well as the degree of complexity, volatility and uncertainty the firm is exposed to. With our framework, we attempt to advance the understanding of when organizations should adapt their management style to the changing business environment.

  9. Battery sizing and rule-based operation of grid-connected photovoltaic-battery system: A case study in Sweden

    International Nuclear Information System (INIS)

    Zhang, Yang; Lundblad, Anders; Campana, Pietro Elia; Benavente, F.; Yan, Jinyue

    2017-01-01

    Highlights: • Battery sizing and rule-based operation are achieved concurrently. • A hybrid operation strategy that combines different strategies is proposed. • Three operation strategies are compared through multi-objective optimization. • High Net Present Value and Self Sufficiency Ratio are achieved at the same time. - Abstract: The optimal component design for grid-connected photovoltaic-battery systems should be determined with consideration of system operation. This study proposes a method to simultaneously optimize the battery capacity and the rule-based operation strategy. The investigated photovoltaic-battery system is modeled using a single-diode photovoltaic model and the Improved Shepherd battery model. Three rule-based operation strategies, namely the conventional operation strategy, the dynamic price load shifting strategy, and the hybrid operation strategy, are designed and evaluated. The rule-based operation strategies introduce different operation parameters that govern system operation. A multi-objective genetic algorithm is employed to optimize the decision variables, including the battery capacity and the operation parameters, towards maximizing the system's Self Sufficiency Ratio and Net Present Value. The results indicate that employing a battery with the conventional operation strategy is not profitable, although it increases the Self Sufficiency Ratio. The dynamic price load shifting strategy performs similarly to the conventional operation strategy because the electricity price variation is not large enough. The proposed hybrid operation strategy outperforms the other investigated strategies. When the battery capacity is lower than 72 kW h, the Self Sufficiency Ratio and Net Present Value increase simultaneously with the battery capacity.
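
    The flavor of the conventional operation strategy can be conveyed in a few lines; this sketch (a simplification with hypothetical parameter names, not the paper's Improved Shepherd model) stores PV surplus and discharges to cover deficits, clipped by the state of charge:

    ```python
    def conventional_dispatch(pv_kw, load_kw, soc_kwh, capacity_kwh, dt_h=1.0):
        """One time step of a conventional PV-battery rule: store surplus, cover deficit.
        Returns (new_soc_kwh, grid_kw): grid_kw > 0 means import, < 0 means export."""
        surplus = pv_kw - load_kw
        if surplus >= 0:                             # charge with PV surplus, export the rest
            charge = min(surplus * dt_h, capacity_kwh - soc_kwh)
            return soc_kwh + charge, -(surplus - charge / dt_h)
        discharge = min(-surplus * dt_h, soc_kwh)    # discharge to cover the deficit
        return soc_kwh - discharge, -surplus - discharge / dt_h

    soc, grid = conventional_dispatch(pv_kw=5.0, load_kw=2.0, soc_kwh=10.0, capacity_kwh=12.0)
    print(soc, grid)   # 12.0, -1.0 -> battery takes 2 kWh, 1 kW is exported
    ```

    The dynamic price and hybrid strategies add further conditions (price thresholds, forecast windows) on top of rules like these, and the genetic algorithm tunes those thresholds together with the battery capacity.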

  10. Rule Based Reasoning Untuk Monitoring Distribusi Bahan Bakar Minyak Secara Online dan Realtime menggunakan Radio Frequency Identification

    Directory of Open Access Journals (Sweden)

    Mokhamad Iklil Mustofa

    2017-05-01

    Full Text Available Fuel oil scarcity in Indonesia often occurs due to delivery delays caused by natural factors or transportation constraints. The aim of this research is to develop an online, real-time fuel distribution monitoring system using a rule-based reasoning method and radio frequency identification (RFID) technology. Rule-based reasoning is used both to monitor distribution and to determine a rule-based safety stock. The monitoring program runs as a web-based application. RFID technology uses radio waves as the identification medium, providing automatic tracking and information gathering from objects. The research data consist of records of delayed fuel distribution from the fuel terminal to consumers. Monitoring is based on the departure time, the estimated time of arrival, and the route taken by a fuel tanker fitted with an RFID tag, which is read by RFID readers connected online at gas stations or other designated positions, with a case study in Semarang. The rule-based reasoning component reports one of five statuses: on time and on route, on time and off route, late and on route, late and off route, and tank lost. The monitoring system is also used to determine warehouse safety stock, with the safety stock value determined by rules on the warehouse stock condition.
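
    The five statuses reduce to two boolean tests plus a timeout; a sketch with hypothetical field names (not the system's actual code):

    ```python
    def delivery_status(arrived_minutes_late, checkpoints_seen, checkpoints_expected,
                        last_seen_minutes_ago, lost_after_minutes=120):
        """Classify a tanker trip into the five rule-based statuses described above."""
        if last_seen_minutes_ago > lost_after_minutes:
            return "tank lost"
        on_time = arrived_minutes_late <= 0
        on_route = checkpoints_seen == checkpoints_expected
        if on_time:
            return "on time, on route" if on_route else "on time, off route"
        return "late, on route" if on_route else "late, off route"

    print(delivery_status(15, ["A", "B"], ["A", "B", "C"], 10))   # 'late, off route'
    ```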

  11. A comparison of rule-based and machine learning approaches for classifying patient portal messages.

    Science.gov (United States)

    Cronin, Robert M; Fabbri, Daniel; Denny, Joshua C; Rosenbloom, S Trent; Jackson, Gretchen Purcell

    2017-09-01

    Secure messaging through patient portals is an increasingly popular way for consumers to interact with healthcare providers. The increasing burden of secure messaging can affect clinic staffing and workflows, and manual management of portal messages is costly and time consuming. Automated classification of portal messages could potentially expedite message triage and delivery of care. We developed automated patient portal message classifiers with rule-based and machine learning techniques using bag-of-words and natural language processing (NLP) approaches. To evaluate classifier performance, we used a gold standard of 3253 portal messages manually categorized using a taxonomy of communication types (i.e., main categories of informational, medical, logistical, social, and other communications, and subcategories including prescriptions, appointments, problems, tests, follow-up, contact information, and acknowledgement). We evaluated our classifiers' accuracy in identifying individual communication types within portal messages with the area under the receiver operating characteristic curve (AUC). Portal messages often contain more than one type of communication; to assess predictions of all communication types within single messages, we used the Jaccard Index. We also extracted the variables of importance for the random forest classifiers. The best performing approaches for the major communication types were: logistic regression for medical communications (AUC: 0.899); basic (rule-based) for informational communications (AUC: 0.842); and random forests for social communications and logistical communications (AUCs: 0.875 and 0.925, respectively). The best performing approach for individual communication subtypes was random forests for Logistical-Contact Information (AUC: 0.963). The Jaccard Indices by approach were: basic classifier, Jaccard Index: 0.674; Naïve Bayes, Jaccard Index: 0.799; random forests, Jaccard Index: 0.859; and logistic regression, Jaccard Index: ...
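
    The Jaccard Index used to score multi-label predictions is easy to reproduce; the sketch below computes it per message and averages (illustrative labels, not the study's taxonomy):

    ```python
    def jaccard(pred, gold):
        """|intersection| / |union| of two label sets; defined as 1.0 when both are empty."""
        if not pred and not gold:
            return 1.0
        return len(pred & gold) / len(pred | gold)

    messages = [
        ({"medical", "logistical"}, {"medical"}),     # one extra label predicted
        ({"informational"}, {"informational"}),       # exact match
    ]
    score = sum(jaccard(p, g) for p, g in messages) / len(messages)
    print(score)   # (0.5 + 1.0) / 2 = 0.75
    ```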

  12. Omega version 2.2: Rule-based deterioration identification and management system. Final report

    International Nuclear Information System (INIS)

    Kataoka, S.; Kojima, T.; Pavinich, W.A.; Andrews, J.D.

    1996-06-01

    This report presents the Omega Version 2.2 (Ωs) rule-based computer program for identifying material deterioration in the metallic structures, systems and components of LWR nuclear power units. The basis of Ωs is that understanding what material deterioration might occur as a function of service life is fundamental to: (1) the development and optimization of preventive maintenance programs, (2) ensuring that current maintenance programs recognize applicable degradations, and (3) demonstrating the adequacy of deterioration management to safety regulatory authorities. The system was developed to assist utility engineers in determining which aging degradation mechanisms are acting on specific components. Direction is also provided to extend this system to manage deterioration and evaluate the efficacy of existing age-related degradation mitigation programs. The system can provide support for justification for continued operation and license renewal, and it provides traceability to the data sources used in the logic development. A tiered approach is used to quickly isolate potential age-related degradation for components in a particular location; a potential degradation mechanism is then screened by additional rules to establish its plausibility. Ωs includes a user-friendly system interface and provides default environmental data and materials in the event they are unknown to the user. Ωs produces a report, with references, that validates the elimination of a degradation mechanism from further consideration or the determination that a specific degradation mechanism is acting on a specific material. This report also describes logic for identifying deterioration caused by intrusions and inspection-based deteriorations, along with future plans to program and integrate these features with Ωs.

  13. Ketamine alters lateral prefrontal oscillations in a rule-based working memory task.

    Science.gov (United States)

    Ma, Liya; Skoblenick, Kevin; Johnston, Kevin; Everling, Stefan

    2018-02-02

    Acute administration of N-methyl-D-aspartate receptor (NMDAR) antagonists in healthy humans and animals produces working memory deficits similar to those observed in schizophrenia. However, it is unclear whether they also lead to altered low-frequency neural oscillations. Here, oscillatory activity in the lateral prefrontal cortex (LPFC) of non-human primates was recorded during a rule-based prosaccade and antisaccade working memory task, both before and after systemic injections of a subanesthetic dose of ketamine. Ketamine strengthened gamma-band and weakened beta-band oscillations during delay periods and inter-trial intervals. It also increased task-related alpha-band activity, likely reflecting compromised attention. Beta-band oscillations may be especially relevant to working memory processes, as stronger beta power weakly but significantly predicted shorter saccadic reaction times. Also in the beta band, ketamine reduced the performance-related oscillation as well as the rule information encoded in the spectral power. Ketamine also reduced rule information in the spike-field phase consistency at almost all frequencies up to 60 Hz. Our findings support NMDAR antagonists in non-human primates as a meaningful model for altered neural oscillations and synchrony, which reflect a disorganized network underlying the working memory deficits in schizophrenia. SIGNIFICANCE STATEMENT Low doses of ketamine, an NMDA receptor blocker, produce working memory deficits similar to those observed in schizophrenia. In the LPFC, a key brain region for working memory, we found that ketamine altered neural oscillatory activities, in ways similar to those that differentiate schizophrenic patients from healthy subjects, during both task and non-task periods. Ketamine induced stronger gamma (30-60 Hz) and weaker beta (13-30 Hz) oscillations, reflecting local hyperactivity and reduced long-range communication. Furthermore, ketamine reduced performance-related oscillatory activities, as well as the rule information encoded in the oscillations and in the synchrony between single-cell activities and oscillations. The ketamine model helps link the molecular and cellular basis of neural oscillatory changes to the working memory deficits in schizophrenia.

  14. Rule-based Approach on Extraction of Malay Compound Nouns in Standard Malay Document

    Science.gov (United States)

    Abu Bakar, Zamri; Kamal Ismail, Normaly; Rawi, Mohd Izani Mohamed

    2017-08-01

    A Malay compound noun is defined as a form of words that exists when two or more words are combined into a single syntactic unit with a specific meaning. A compound noun acts as one unit and is spelled as separate words unless it is an established compound noun written as one word. The basic characteristics of compound nouns can be observed in Malay sentences, including the frequency of the words in the text itself. Extraction of compound nouns is thus significant for downstream research such as text summarization, grammar checking, sentiment analysis, machine translation and word categorization. Many research efforts have proposed extracting Malay compound nouns using linguistic approaches. Most existing methods address the extraction of bi-gram noun+noun compounds, but their results still leave room for improvement. This paper explores a linguistic method for extracting compound nouns from a standard Malay corpus. A standard dataset is used to provide a common platform for evaluating research on the recognition of compound nouns in Malay sentences. This study therefore proposes a modification of the linguistic approach in order to enhance compound noun extraction. Several pre-processing steps are involved, including normalization, tokenization and tagging; the first step that uses the linguistic approach in this study is Part-of-Speech (POS) tagging. Finally, we describe several rules and modify them to capture the most relevant relation between the first word and the second word. The effectiveness of the relations used in our study is measured using recall, precision and F1-score. Comparison against baseline values is essential because it establishes whether there has been an improvement.
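
    A minimal version of the core rule (accept a bigram when both adjacent tokens are tagged as nouns) together with the three evaluation measures; the tagged sentence and gold list are invented examples, not the paper's dataset:

    ```python
    # Tokens paired with their POS tags, as produced by an upstream tagger.
    tagged = [("pisau", "NN"), ("cukur", "NN"), ("yang", "SC"), ("tajam", "JJ"),
              ("kapal", "NN"), ("terbang", "NN")]

    # Rule: two adjacent nouns form a candidate compound noun.
    candidates = {f"{w1} {w2}" for (w1, t1), (w2, t2) in zip(tagged, tagged[1:])
                  if t1 == "NN" and t2 == "NN"}

    gold = {"pisau cukur", "kapal terbang"}
    tp = len(candidates & gold)
    precision = tp / len(candidates)
    recall = tp / len(gold)
    f1 = 2 * precision * recall / (precision + recall)
    print(candidates, precision, recall, f1)   # both found: P = R = F1 = 1.0
    ```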

  15. Simulation of operating rules and discretional decisions using a fuzzy rule-based system integrated into a water resources management model

    Science.gov (United States)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel

    2013-04-01

    The fuzzy rule-based system takes the state of the system (water stored in the reservoirs) and the month of the year as inputs, and gives the demand deliveries as outputs. The developed simulation management model integrates the fuzzy rule-based representation of the operation of the two main reservoirs of the basin with the corresponding mass balance equations, the physical or boundary conditions and the water allocation rules among the competing demands. Historical inflow time series are used as inputs to the model simulation, and the system is trained and validated using historical information on reservoir storage levels and flows in several streams of the Mijares river. This methodology provides a more flexible approach that is closer to real policies. The model is easy to develop and to understand due to its rule-based structure, which mimics the human way of thinking. This can improve cooperation and negotiation between managers, decision-makers and stakeholders. The approach can also be applied to analyze the historical operation of the reservoir (what we have called a reservoir operation "audit").

  16. Multi-arrhythmias detection with an XML rule-based system from 12-Lead Electrocardiogram.

    Science.gov (United States)

    Khelassi, Abdeldjalil; Yelles-Chaouche, Sarra-Nassira; Benais, Faiza

    2017-05-01

    The computer-aided detection of cardiac arrhythmias remains a crucial application of medical technology. Rule-based systems (RBS) ensure a high level of transparency and interpretability of the obtained results, which helps cardiologists with their diagnoses and reduces diagnostic uncertainty. In this research article, we realize classification and automatic recognition of cardiac arrhythmias using XML rules that represent cardiologist knowledge. Thirteen experiments with different knowledge bases were carried out to improve the performance of the method in the detection of 13 cardiac arrhythmias. In the first 12 experiments, we designed a specialized knowledge base for each cardiac arrhythmia, containing just one arrhythmia detection rule. In the last experiment, we applied a knowledge base containing the rules for 12 arrhythmias. For the experiments, we used an international data set with 279 features and 452 records characterizing the 12 leads of the ECG signal and social information of patients; the data sets were constructed and published at Bilkent University of Ankara, Turkey. In addition, the second version of the self-developed software "XMLRULE" was used; the software can infer more than one class and facilitates the interpretability of the obtained results. The first 12 experiments give 82.80% correct detection as the mean over all experiments; the results were between 19% and 100%, with a low rate in just one experiment. In the last experiment, in which all arrhythmias are considered together, correct detection was 38.33%, with 90.55% sensitivity and 46.24% specificity. These results clearly show that a good choice of classification model is very beneficial in terms of performance. The obtained results were better than the published results of other computational methods for mono-class detection, but worse for multi-class detection. The RBS is the most transparent method for...
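
    One way to picture an XML-encoded detection rule and its evaluation (the schema, feature names, and threshold are invented for illustration; the article's XMLRULE format is not described in detail here):

    ```python
    import xml.etree.ElementTree as ET

    RULE_XML = """
    <rule arrhythmia="sinus_bradycardia">
      <condition feature="heart_rate" op="lt" value="60"/>
      <condition feature="rhythm_regular" op="eq" value="1"/>
    </rule>
    """

    OPS = {"lt": lambda a, b: a < b, "gt": lambda a, b: a > b, "eq": lambda a, b: a == b}

    def fires(rule_xml, record):
        """A rule fires when every <condition> holds for the feature record."""
        rule = ET.fromstring(rule_xml)
        ok = all(OPS[c.get("op")](record[c.get("feature")], float(c.get("value")))
                 for c in rule.findall("condition"))
        return rule.get("arrhythmia") if ok else None

    print(fires(RULE_XML, {"heart_rate": 48.0, "rhythm_regular": 1.0}))  # sinus_bradycardia
    ```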

  17. Comparing a rule based vs. statistical system for automatic categorization of MEDLINE documents according to biomedical specialty

    OpenAIRE

    Humphrey, Susanne M.; Névéol, Aurélie; Browne, Allen; Gobeill, Julien; Ruch, Patrick; Darmoni, Stéfan J.

    2010-01-01

    Automatic document categorization is an important research problem in Information Science and Natural Language Processing. Many applications, including Word Sense Disambiguation and Information Retrieval in large collections, can benefit from such categorization. This paper focuses on automatic categorization of documents from the biomedical literature into broad discipline-based categories. Two different systems are described and contrasted: CISMeF, which uses rules based on human indexing o...

  18. Oil palm fresh fruit bunch ripeness classification based on rule- based expert system of ROI image processing technique results

    International Nuclear Information System (INIS)

    Alfatni, M S M; Shariff, A R M; Marhaban, M H; Shafie, S B; Saaed, O M B; Abdullah, M Z; BAmiruddin, M D

    2014-01-01

    There is a need for a fast, easy and accurate classification system for oil palm fruit ripeness. Such a system will be invaluable to farmers and plantation managers who need to sell their oil palm fresh fruit bunches (FFB) to the mill, as this will avoid disputes. In this paper, a new approach was developed under the name of an expert rule-based system, based on the image processing results of three different oil palm FFB regions of interest (ROIs), namely ROI1 (300x300 pixels), ROI2 (50x50 pixels) and ROI3 (100x100 pixels). The results show that the best rule-based ROI for statistical colour feature extraction with a k-nearest neighbours (KNN) classifier, at 94%, was chosen, and ROIs that gave results matching or exceeding the rule-based outcome, such as statistical colour feature extraction with an artificial neural network (ANN) classifier at 94%, were also selected for the further FFB ripeness inspection system.

  19. A simple rule based model for scheduling farm management operations in SWAT

    Science.gov (United States)

    Schürz, Christoph; Mehdi, Bano; Schulz, Karsten

    2016-04-01

    For many interdisciplinary questions at the watershed scale, the Soil and Water Assessment Tool (SWAT; Arnold et al., 1998) has become an accepted and widely used tool. Despite its flexibility, the model is highly demanding when it comes to input data. At SWAT's core, the water balance and the modeled nutrient cycles are plant-growth driven (implemented with the EPIC crop growth model). Therefore, land use and crop data with high spatial and thematic resolution, as well as detailed information on cultivation and farm management practices, are required. For many applications of the model, however, these data are unavailable. In order to meet these requirements, SWAT offers the option to trigger scheduled farm management operations by applying the Potential Heat Unit (PHU) concept. The PHU concept takes into account solely the accumulation of daily mean temperature for management scheduling. Hence, it contradicts several farming strategies that occur in reality: i) planting and harvesting dates are set much too early or too late, because the PHU concept is strongly sensitive to inter-annual temperature fluctuations; ii) fertilizer application in SWAT often occurs simultaneously on the same date in each field; and iii) it can coincide with precipitation events. The latter two in particular can lead to strong peaks in modeled nutrient loads. To cope with these shortcomings we propose a simple rule-based model (RBM) to schedule management operations according to realistic farmer management practices in SWAT. The RBM involves simple strategies requiring only data that are input into the SWAT model initially, such as temperature and precipitation data. The user provides the boundaries of the time periods within which operation schedules may take place for all crops in the model; these data are readily available from the literature or from crop variety trials. The RBM applies the dates by complying with the following rules: i) Operations scheduled in the...
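
    A sketch of the kind of rule such an RBM can apply: choose the first day inside the user-supplied window that is warm enough and dry (the window bounds and thresholds below are placeholders, not values from the study):

    ```python
    import datetime as dt

    def schedule_planting(daily, window_start, window_end, t_min=10.0, max_rain=1.0):
        """Return the first date in [window_start, window_end] whose mean temperature
        is at least t_min (deg C) and whose rainfall is below max_rain (mm)."""
        for date, (temp, rain) in sorted(daily.items()):
            if window_start <= date <= window_end and temp >= t_min and rain < max_rain:
                return date
        return None   # no suitable day: fall back, e.g. to the end of the window

    # Hypothetical daily (temperature, rainfall) records for April 2015.
    daily = {dt.date(2015, 4, d): (8.0 + d, 0.0 if d % 3 else 5.0) for d in range(1, 31)}
    print(schedule_planting(daily, dt.date(2015, 4, 5), dt.date(2015, 4, 25)))  # 2015-04-05
    ```

    Because the rule consumes only the weather forcing SWAT already reads, it avoids the PHU artifacts of implausible dates and fertilizer applications coinciding with rain.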

  20. Decision support system for triage management: A hybrid approach using rule-based reasoning and fuzzy logic.

    Science.gov (United States)

    Dehghani Soufi, Mahsa; Samad-Soltani, Taha; Shams Vahdati, Samad; Rezaei-Hachesu, Peyman

    2018-06-01

    Fast and accurate patient triage for the response process is a critical first step in emergency situations. This process is often performed using a paper-based mode, which intensifies workload and difficulty, wastes time, and is prone to human error. This study aims to design and evaluate a decision support system (DSS) to determine the triage level. A combination of the Rule-Based Reasoning (RBR) and Fuzzy Logic Classifier (FLC) approaches was used to predict the triage level of patients according to the triage specialists' opinions and Emergency Severity Index (ESI) guidelines. RBR was applied for modeling the first to fourth decision points of the ESI algorithm. The data relating to vital signs were used as input variables and modeled using fuzzy logic. Narrative knowledge was converted to If-Then rules using XML. The extracted rules were then used to create the rule-based engine and predict the triage levels. Fourteen RBR and 27 fuzzy rules were extracted and used in the rule-based engine. The performance of the system was evaluated using three methods with real triage data. The accuracy of the clinical decision support system on the test data was 99.44%. The evaluation of the error rate revealed that, when using the traditional method, 13.4% of the patients were mis-triaged, which is statistically significant. The completeness of the documentation also improved from 76.72% to 98.5%. The designed system was effective in determining the triage level of patients and proved helpful for nurses as they made decisions and generated nursing diagnoses based on triage guidelines. The hybrid approach can reduce triage misdiagnosis in a highly accurate manner and improve triage outcomes.

  1. An Investigation of Care-Based vs. Rule-Based Morality in Frontotemporal Dementia, Alzheimer’s Disease, and Healthy Controls

    Science.gov (United States)

    Carr, Andrew R.; Paholpak, Pongsatorn; Daianu, Madelaine; Fong, Sylvia S.; Mather, Michelle; Jimenez, Elvira E.; Thompson, Paul; Mendez, Mario F.

    2015-01-01

    Behavioral changes in dementia, especially behavioral variant frontotemporal dementia (bvFTD), may result in alterations in moral reasoning. Investigators have not clarified whether these alterations reflect differential impairment of care-based vs. rule-based moral behavior. This study investigated 18 bvFTD patients, 22 early onset Alzheimer’s disease (eAD) patients, and 20 healthy age-matched controls on care-based and rule-based items from the Moral Behavioral Inventory and the Social Norms Questionnaire, neuropsychological measures, and magnetic resonance imaging (MRI) regions of interest. There were significant group differences with the bvFTD patients rating care-based morality transgressions less severely than the eAD group and rule-based moral behavioral transgressions more severely than controls. Across groups, higher care-based morality ratings correlated with phonemic fluency on neuropsychological tests, whereas higher rule-based morality ratings correlated with increased difficulty set-shifting and learning new rules to tasks. On neuroimaging, severe care-based reasoning correlated with cortical volume in right anterior temporal lobe, and rule-based reasoning correlated with decreased cortical volume in the right orbitofrontal cortex. Together, these findings suggest that frontotemporal disease decreases care-based morality and facilitates rule-based morality possibly from disturbed contextual abstraction and set-shifting. Future research can examine whether frontal lobe disorders and bvFTD result in a shift from empathic morality to the strong adherence to conventional rules. PMID:26432341

  2. Conditioning of high voltage radio frequency cavities by using fuzzy logic in connection with rule based programming

    CERN Document Server

    Perréard, S

    1993-01-01

    Many processes are controlled by experts who use some kind of mental model to decide actions and draw conclusions. This model, based on heuristic knowledge, can often be conveniently represented in rules and need not be particularly accurate. This is the case for the problem of conditioning high voltage radio-frequency cavities: the expert has to decide, by observing some criteria, whether the voltage can be increased or has to be decreased, and by how much. A program has been implemented which can be applied to a class of similar problems. The kernel of the program is a small rule base, which is independent of the kind of cavity. To model a specific cavity, we use fuzzy logic, which is implemented as a separate routine called by the rule base; we use fuzzy logic to translate from numeric to symbolic information. The example we chose for applying this kind of technique can also be implemented by sequential programming, and the two versions exist for comparison. However, we believe that this kind of programming can be powerf...

  3. Using rule-based machine learning for candidate disease gene prioritization and sample classification of cancer gene expression data.

    Directory of Open Access Journals (Sweden)

    Enrico Glaab

    Full Text Available Microarray data analysis has been shown to provide an effective tool for studying cancer and genetic diseases. Although classical machine learning techniques have successfully been applied to find informative genes and to predict class labels for new samples, common restrictions of microarray analysis such as small sample sizes, a large attribute space and high noise levels still limit its scientific and clinical applications. Increasing the interpretability of prediction models while retaining a high accuracy would help to exploit the information content in microarray data more effectively. For this purpose, we evaluate our rule-based evolutionary machine learning systems, BioHEL and GAssist, on three public microarray cancer datasets, obtaining simple rule-based models for sample classification. A comparison with other benchmark microarray sample classifiers based on three diverse feature selection algorithms suggests that these evolutionary learning techniques can compete with state-of-the-art methods like support vector machines. The obtained models reach accuracies above 90% in two-level external cross-validation, with the added value of facilitating interpretation by using only combinations of simple if-then-else rules. As a further benefit, a literature mining analysis reveals that prioritizations of informative genes extracted from BioHEL's classification rule sets can outperform gene rankings obtained from a conventional ensemble feature selection in terms of the pointwise mutual information between relevant disease terms and the standardized names of top-ranked genes.

  4. Using rule-based machine learning for candidate disease gene prioritization and sample classification of cancer gene expression data.

    Science.gov (United States)

    Glaab, Enrico; Bacardit, Jaume; Garibaldi, Jonathan M; Krasnogor, Natalio

    2012-01-01

    Microarray data analysis has been shown to provide an effective tool for studying cancer and genetic diseases. Although classical machine learning techniques have successfully been applied to find informative genes and to predict class labels for new samples, common restrictions of microarray analysis such as small sample sizes, a large attribute space and high noise levels still limit its scientific and clinical applications. Increasing the interpretability of prediction models while retaining a high accuracy would help to exploit the information content in microarray data more effectively. For this purpose, we evaluate our rule-based evolutionary machine learning systems, BioHEL and GAssist, on three public microarray cancer datasets, obtaining simple rule-based models for sample classification. A comparison with other benchmark microarray sample classifiers based on three diverse feature selection algorithms suggests that these evolutionary learning techniques can compete with state-of-the-art methods like support vector machines. The obtained models reach accuracies above 90% in two-level external cross-validation, with the added value of facilitating interpretation by using only combinations of simple if-then-else rules. As a further benefit, a literature mining analysis reveals that prioritizations of informative genes extracted from BioHEL's classification rule sets can outperform gene rankings obtained from a conventional ensemble feature selection in terms of the pointwise mutual information between relevant disease terms and the standardized names of top-ranked genes.

  5. Real-time Geographic Information System (GIS) for Monitoring the Area of Potential Water Level Using Rule Based System

    Science.gov (United States)

    Anugrah, Wirdah; Suryono; Suseno, Jatmiko Endro

    2018-02-01

    GIS-based management of water resources can provide substantial benefits for managing water availability. Monitoring the potential water level is needed in the development sector, agriculture, energy and other areas. This research develops a web-based water resource information system using a real-time Geographic Information System concept for monitoring the potential water level of an area by applying a rule-based system method. GIS consists of hardware, software, and a database. Based on the web-based GIS architecture, this study uses a set of computers connected to a network, running the Apache web server and the PHP programming language with a MySQL database. An ultrasound wireless sensor system is used as the water level data input, which also includes time and geographic location information; the GIS maps the five sensor locations. The sensor data are processed through a rule-based system to determine the area's potential water level. Monitoring results can be displayed on thematic maps by overlaying more than one layer, as tables generated from the database, and as graphs based on the timing of events and the water level values.

  6. Developing a fuzzy rule based cognitive map for total system safety assessment

    International Nuclear Information System (INIS)

    Lemos, Francisco Luiz de; Sullivan, Terry

    2007-01-01

    Total System Performance Assessment (TSPA) for radioactive waste disposal is a multi- and interdisciplinary task that is characterized by complex interactions between parameters and processes, lack of data, and ignorance regarding natural processes and conditions. The vagueness in the determination of ranges of parameter values and in the identification of interacting processes poses further difficulties for analysts in establishing the relations between processes and parameters. More specifically, the vagueness makes uncertainty propagation and sensitivity analysis challenging. To cope with these difficulties, experts often use simplifications and linguistic terms to express their state of knowledge about a certain situation; for example, experts use terms such as 'low pH' or 'very unlikely' to describe their perception of natural processes or conditions. In this work we propose the use of Fuzzy Cognitive Maps (FCM) for representing the interrelations between processes and parameters, as well as for promoting a better understanding of system performance. Fuzzy cognitive maps are suited to cases where the causal relations are not clearly defined and therefore cannot be represented by crisp values. In other words, instead of representing the quality of the interactions by crisp values, they are assigned degrees of truth. For example, we can assign values to the effect of one process on another such that +1 corresponds to a positive, -1 to a negative, and 0 to a neutral effect. In this case, the effect of a process A on a process B can be depicted as a function of the membership of the causal process in the fuzzy set 'causal effect' on the target process. One of the main advantages of this methodology is that it allows one to aggregate linguistic expressions as descriptions of processes. For example, a process can be known to have a 'very strong' positive effect on another one, or, using fuzzy set terminology...
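
    In practice an FCM is a signed weight matrix iterated to a fixed point; a minimal sketch with invented concepts and weights (not the TSPA map itself):

    ```python
    import numpy as np

    # Concepts: 0 = low pH, 1 = container corrosion rate, 2 = radionuclide release.
    # W[i, j] is the causal weight of concept i on concept j, in [-1, 1].
    W = np.array([[0.0, 0.8, 0.0],
                  [0.0, 0.0, 0.7],
                  [0.0, 0.0, 0.0]])

    def run_fcm(W, state, steps=20):
        """Iterate the map; each concept's activation is squashed into (0, 1)."""
        for _ in range(steps):
            # 'state @ W' propagates causal influence along the weighted edges;
            # adding 'state' keeps each concept's own memory before squashing.
            state = 1.0 / (1.0 + np.exp(-(state @ W + state)))
        return state

    print(run_fcm(W, np.array([1.0, 0.0, 0.0])))   # activation spreads from 'low pH'
    ```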

  7. Three dimensional pattern recognition using feature-based indexing and rule-based search

    Science.gov (United States)

    Lee, Jae-Kyu

    This database organization according to object features facilitates machine learning in the context of a knowledge-base-driven recognition algorithm. Lastly, feature-based indexing permits the recognition of 3D objects based on a comparatively small number of stored views, further limiting the size of the feature database. Experiments with real images as well as synthetic images, including occluded (partially visible) objects, are presented. The experiments show almost perfect recognition with feature-based indexing if the detected features in the test scene are viewed from the same angle as the view on which the model is based. The experiments also show that the knowledge base is a highly effective and efficient search tool: recognition performance is improved without increasing the database size requirements. The experimental results indicate that feature-based indexing in combination with a knowledge-based system will be a useful methodology for automatic target recognition (ATR).

  8. Climatic forecast: down-scaling and extremes

    International Nuclear Information System (INIS)

    Deque, M.; Li, L.

    2007-01-01

    There is a strong demand for specifying the future climate at local scale and about extreme events. New methods, allowing a better output from the climate models, are currently being developed and French laboratories involved in the Escrime project are actively participating. (authors)

  9. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    Science.gov (United States)

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

    We present a new intuitionistic fuzzy rule-based decision-making system, based on intuitionistic fuzzy sets, for the process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm takes as input the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers the appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of every process in the ready queue. Once the dp values are calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and compare it with some standard non-preemptive process schedulers.
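
    To give the flavor of the intuitionistic step: each input carries a membership degree mu and a non-membership degree nu, with hesitation pi = 1 - mu - nu. The sketch below derives a toy dynamic priority from such pairs (hypothetical memberships and score function; the paper's actual rule base is not reproduced):

    ```python
    def dp(nice, burst, max_nice=19.0, max_burst=100.0):
        """Toy dynamic priority from intuitionistic memberships of 'urgent'."""
        # Membership/non-membership that the nice value marks the process as urgent
        # (mu + nu <= 1, so the pair is a valid intuitionistic fuzzy value).
        mu_n, nu_n = 1 - nice / max_nice, 0.8 * (nice / max_nice)
        # Short bursts are favoured, as in shortest-job-first.
        mu_b, nu_b = 1 - burst / max_burst, 0.8 * (burst / max_burst)
        # Intuitionistic conjunction (min for mu, max for nu); score = mu - nu.
        mu, nu = min(mu_n, mu_b), max(nu_n, nu_b)
        return mu - nu

    ready_queue = {"p1": (0, 30), "p2": (10, 5), "p3": (19, 80)}
    ranked = sorted(ready_queue, key=lambda p: dp(*ready_queue[p]), reverse=True)
    print(ranked)   # ['p1', 'p2', 'p3'] -> the process with the highest dp runs first
    ```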

  10. A rule-based expert system for control rod pattern of boiling water reactors by hovering around haling exposure shape

    International Nuclear Information System (INIS)

    Kao, P.-W.; Lin, L.-S.; Yang, J.-T.

    2004-01-01

    Feasible strategies for automatic BWR control rod pattern generation have been implemented in a rule-based expert system. These strategies are based mainly on keeping exposure distributions hovering around the Haling exposure distribution through a cycle, while radial and axial power distributions are controlled by abstracted factors indicating the desired distributions. The system can either automatically generate expert-level control rod patterns or search for criteria-satisfying patterns starting from the user's input. It has been successfully demonstrated by generating control rod patterns for the 1775 MWth Chinshan plant for the Unit 1 Cycle 13 alternate loading pattern and for Unit 2 Cycle 8 with a longer cycle length. All rod patterns for the two cycles result in all-rods-out at EOC with no violation of the four criteria. The demonstrations show that the system performs well in choosing initial trial rod patterns and in adjusting rod patterns to satisfy the design criteria. (author)

  11. Desain Sistem Semantic Data Warehouse dengan Metode Ontology dan Rule Based untuk Mengolah Data Akademik Universitas XYZ di Bali

    Directory of Open Access Journals (Sweden)

    Made Pradnyana Ambara

    2016-06-01

    Full Text Available Conventional data warehouses, commonly known as traditional data warehouses, have several weaknesses that make the resulting data quality neither specific nor effective. A semantic data warehouse system is a solution to the problems of the traditional data warehouse, with advantages that include specific data quality management with a uniform data format to support good OLAP reporting, and more effective information retrieval using natural-language keywords. Modeling the semantic data warehouse system with an ontology method produces a resource description framework schema (RDFS) logic model, which is transformed into a snowflake schema. The required academic reports are generated through Kimball's nine-step method, and semantic search uses a rule-based method. Testing was carried out using two methods, black-box testing and a checklist questionnaire. From the results of this study it can be concluded that the semantic data warehouse system can support academic data processing, producing quality reports for decision making.

  12. Earthquake hazard assessment in the Zagros Orogenic Belt of Iran using a fuzzy rule-based model

    Science.gov (United States)

    Farahi Ghasre Aboonasr, Sedigheh; Zamani, Ahmad; Razavipour, Fatemeh; Boostani, Reza

    2017-08-01

    Producing accurate seismic hazard maps and predicting hazardous areas are necessary for risk mitigation strategies. In this paper, a fuzzy logic inference system is utilized to estimate the earthquake potential and seismic zoning of the Zagros Orogenic Belt. In addition to their interpretability, fuzzy predictors can capture both nonlinear and chaotic behavior of data, even when data are limited. The earthquake pattern in the Zagros has been assessed for intervals of 10 and 50 years using a fuzzy rule-based model, and the Molchan statistical procedure has been used to show that our forecasting model is reliable. The earthquake hazard maps for this area reveal some remarkable features that cannot be observed on conventional maps. According to our results, some areas in the southern (Bandar Abbas), southwestern (Bandar Kangan) and western (Kermanshah) parts of Iran display high earthquake severity even though they are geographically far apart.

  13. Transcranial infrared laser stimulation improves rule-based, but not information-integration, category learning in humans.

    Science.gov (United States)

    Blanco, Nathaniel J; Saucedo, Celeste L; Gonzalez-Lima, F

    2017-03-01

    This is the first randomized, controlled study comparing the cognitive effects of transcranial laser stimulation on category learning tasks. Transcranial infrared laser stimulation is a new non-invasive form of brain stimulation that shows promise for wide-ranging experimental and neuropsychological applications. It involves using infrared laser to enhance cerebral oxygenation and energy metabolism through upregulation of the respiratory enzyme cytochrome oxidase, the primary infrared photon acceptor in cells. Previous research found that transcranial infrared laser stimulation aimed at the prefrontal cortex can improve sustained attention, short-term memory, and executive function. In this study, we directly investigated the influence of transcranial infrared laser stimulation on two neurobiologically dissociable systems of category learning: a prefrontal cortex mediated reflective system that learns categories using explicit rules, and a striatally mediated reflexive learning system that forms gradual stimulus-response associations. Participants (n=118) received either active infrared laser to the lateral prefrontal cortex or sham (placebo) stimulation, and then learned one of two category structures-a rule-based structure optimally learned by the reflective system, or an information-integration structure optimally learned by the reflexive system. We found that prefrontal rule-based learning was substantially improved following transcranial infrared laser stimulation as compared to placebo (treatment X block interaction: F(1, 298)=5.117, p=0.024), while information-integration learning did not show significant group differences (treatment X block interaction: F(1, 288)=1.633, p=0.202). These results highlight the exciting potential of transcranial infrared laser stimulation for cognitive enhancement and provide insight into the neurobiological underpinnings of category learning.

  14. A Rules-Based Algorithm to Prioritize Poor Prognosis Cancer Patients in Need of Advance Care Planning.

    Science.gov (United States)

    Bestvina, Christine M; Wroblewski, Kristen E; Daly, Bobby; Beach, Brittany; Chow, Selina; Hantel, Andrew; Malec, Monica; Huber, Michael T; Polite, Blase N

    2018-03-13

    Accurate understanding of the prognosis of an advanced cancer patient can lead to decreased aggressive care at the end of life and earlier hospice enrollment. Our goal was to determine the association between high-risk clinical events identified by a simple, rules-based algorithm and decreased overall survival, to target poor prognosis cancer patients who would urgently benefit from advance care planning. A retrospective analysis was performed on outpatient oncology patients with an index visit from April 1, 2015, through June 30, 2015. We examined a three-month window for "high-risk events," defined as (1) change in chemotherapy, (2) emergency department (ED) visit, and (3) hospitalization. Patients were followed until January 31, 2017. A total of 219 patients receiving palliative chemotherapy at the University of Chicago Medicine with a prognosis of ≤12 months were included. The main outcome was overall survival, and each "high-risk event" was treated as a time-varying covariate in a Cox proportional hazards regression model to calculate a hazard ratio (HR) of death. A change in chemotherapy regimen, ED visit, hospitalization, and at least one high-risk event occurred in 54% (118/219), 10% (22/219), 26% (57/219), and 67% (146/219) of patients, respectively. The adjusted HR of death for patients with a high-risk event was 1.72 (95% confidence interval [CI] 1.19-2.46, p = 0.003), with hospitalization reaching significance (HR 2.74, 95% CI 1.84-4.09). The rules-based algorithm identified those with the greatest risk of death among a poor prognosis patient group. Implementation of this algorithm in the electronic health record can identify patients with increased urgency to address goals of care.
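
    The screening logic described here reduces to flagging any of three event types in a look-back window from the index visit. A minimal sketch of that rule follows; the field names and record layout are hypothetical, since the abstract does not reproduce the paper's data model.

```python
# Sketch of the three-rule high-risk screen (invented field names).
from datetime import date, timedelta

HIGH_RISK_EVENTS = {"chemo_change", "ed_visit", "hospitalization"}

def high_risk(events, index_visit, window_days=90):
    """Return True if any high-risk event falls within the look-back window.

    events: iterable of (event_type, event_date) tuples from the record.
    """
    start = index_visit - timedelta(days=window_days)
    return any(kind in HIGH_RISK_EVENTS and start <= when <= index_visit
               for kind, when in events)

record = [("chemo_change", date(2015, 5, 2)), ("clinic_visit", date(2015, 6, 1))]
print(high_risk(record, index_visit=date(2015, 6, 30)))  # True
```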

  15. Fuzzy rule-based landslide susceptibility mapping in Yığılca Forest District (Northwest of Turkey

    Directory of Open Access Journals (Sweden)

    Abdurrahim Aydın

    2016-07-01

    Full Text Available A landslide susceptibility map of Yığılca Forest District was formed based on fuzzy rules developed in the GIS-based FuzzyCell software. An inventory of 315 landslides, previously mapped by the authors, was updated through fieldwork. For comparability with the landslide susceptibility mapping study previously carried out in the same area, the same 8 landslide conditioning parameters were selected and then fuzzified: land use, lithology, elevation, slope, aspect, distance to streams, distance to roads, and plan curvature. The Mamdani model was selected as the fuzzy inference system. After the fuzzy rules were defined, Center of Area (COA) was selected as the defuzzification method. The output of the developed model was normalized between 0 and 1 and then divided into five classes: very low, low, moderate, high, and very high. According to the developed model based on the 8 conditioning parameters, landslide susceptibility in Yığılca Forest District varies between 32 and 67 (in a range of 0-100) with an Area Under the Curve (AUC) value of 0.703. According to the classified landslide susceptibility map, 32.89% of the total area of Yığılca Forest District has high or very high susceptibility, 29.59% has low or very low susceptibility, and the rest is of moderate susceptibility. The result of the developed fuzzy rule-based model was compared with the landslide map previously generated with logistic regression (LR). The comparison of the two studies shows considerable differences in terms of AUC value and the dispersion of susceptibility classes. This is because a fuzzy rule-based model depends entirely on how the parameters are classified and fuzzified, and on how accurately the expert composed the rules. Even so, GIS-based fuzzy applications provide very valuable facilities for reasoning, making it possible to take inaccuracies and uncertainties into account.

  16. Evaluation of a rule-based method for epidemiological document classification towards the automation of systematic reviews.

    Science.gov (United States)

    Karystianis, George; Thayer, Kristina; Wolfe, Mary; Tsafnat, Guy

    2017-06-01

    Most data extraction efforts in epidemiology are focused on obtaining targeted information from clinical trials. In contrast, limited research has been conducted on the identification of information from observational studies, a major source for human evidence in many fields, including environmental health. The recognition of key epidemiological information (e.g., exposures) through text mining techniques can assist in the automation of systematic reviews and other evidence summaries. We designed and applied a knowledge-driven, rule-based approach to identify targeted information (study design, participant population, exposure, outcome, confounding factors, and the country where the study was conducted) from abstracts of epidemiological studies included in several systematic reviews of environmental health exposures. The rules were based on common syntactical patterns observed in text and are thus not specific to any systematic review. To validate the general applicability of our approach, we compared the data extracted using our approach versus hand curation for 35 epidemiological study abstracts manually selected for inclusion in two systematic reviews. The returned F-score, precision, and recall ranged from 70% to 98%, 81% to 100%, and 54% to 97%, respectively. The highest precision was observed for exposure, outcome and population (100%) while recall was best for exposure and study design with 97% and 89%, respectively. The lowest recall was observed for the population (54%), which also had the lowest F-score (70%). The performance of our text-mining approach demonstrated encouraging results for the identification of targeted information from observational epidemiological study abstracts related to environmental exposures. We have demonstrated that rules based on generic syntactic patterns in one corpus can be applied to other observational study designs by simply interchanging the dictionaries that aim to identify certain characteristics (i.e., outcomes).
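
    The abstract does not reproduce the actual rules, but the general recipe it describes (syntactic text patterns plus interchangeable dictionaries) can be sketched as below; the pattern and dictionary entries are toy stand-ins, not the authors' rule set.

```python
# Illustrative rule-based extraction from an epidemiological abstract.
import re

DESIGN_TERMS = ["cohort study", "case-control study", "cross-sectional study"]
EXPOSURE_CUE = re.compile(r"exposure to ([a-z\- ]+?)(?:,|\.| and )", re.I)

def extract(abstract):
    found = {"design": None, "exposures": []}
    for term in DESIGN_TERMS:                 # dictionary-lookup rule
        if term in abstract.lower():
            found["design"] = term
            break
    found["exposures"] = EXPOSURE_CUE.findall(abstract)  # syntactic pattern rule
    return found

text = ("We conducted a cohort study of workers with occupational "
        "exposure to benzene, and assessed leukemia incidence.")
print(extract(text))  # {'design': 'cohort study', 'exposures': ['benzene']}
```

    Swapping DESIGN_TERMS for an outcome dictionary is the kind of "interchanging the dictionaries" the authors describe.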

  17. Optimal Rule-Based Power Management for Online, Real-Time Applications in HEVs with Multiple Sources and Objectives: A Review

    Directory of Open Access Journals (Sweden)

    Bedatri Moulik

    2015-08-01

    Full Text Available The field of hybrid vehicles has undergone intensive research and development, primarily due to the increasing concern over depleting resources and increasing pollution. In order to investigate further options to optimize the performance of hybrid vehicles with regard to different criteria, such as fuel economy and battery aging, a detailed state-of-the-art review is presented in this contribution. Different power management and optimization techniques are discussed, focusing on rule-based power management and multi-objective optimization techniques. The extent to which rule-based power management and optimization can address battery aging issues is investigated, along with an implementation in real-time driving scenarios where no pre-defined drive cycle is followed. The goal of this paper is to illustrate the significance and applications of rule-based power management optimization based on previous contributions.
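
    A rule-based power management strategy of the kind reviewed here is, at its core, a small set of if-then decisions over power demand and battery state of charge. The sketch below is a generic illustration with invented thresholds, not a strategy from any reviewed contribution.

```python
# Toy rule-based power split for a parallel HEV (all thresholds invented).
def power_split(p_demand_kw, soc):
    """Return (engine_kw, motor_kw) for a traction power request and a
    battery state of charge in 0..1; negative motor power = charging."""
    if p_demand_kw <= 0:                       # braking: regenerate
        return 0.0, max(p_demand_kw, -30.0)
    if soc > 0.6 and p_demand_kw < 15.0:       # light load, healthy battery
        return 0.0, p_demand_kw                # electric-only
    if soc < 0.3:                              # protect the battery
        return p_demand_kw + 5.0, -5.0         # engine also recharges
    return 0.7 * p_demand_kw, 0.3 * p_demand_kw  # blended operation

for demand, soc in [(10, 0.8), (40, 0.5), (25, 0.2), (-20, 0.5)]:
    print(demand, soc, power_split(demand, soc))
```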

  18. Development of a cause analysis system for a CPCS trip by using the rule-base deduction method.

    Science.gov (United States)

    Park, Je-Yun; Koo, In-Soo; Sohn, Chang-Ho; Kim, Jung-Seon; Cho, Gi-Ho; Park, Hee-Seok

    2009-07-01

    A Core Protection Calculator System (CPCS) was developed by the Combustion Engineering Company to initiate a reactor trip under certain transient conditions. The major function of the Core Protection Calculator System is to generate contact outputs for the Departure from Nucleate Boiling Ratio (DNBR) trip and the Local Power Density (LPD) trip. However, a Core Protection Calculator System cannot identify the cause of a trip: only trip signals are transferred to the Plant Protection System (PPS) and only the trip status is displayed. It can take a considerable amount of time and effort for a plant operator to analyze the trip causes of a Core Protection Calculator System. A Cause Analysis System for a Core Protection Calculator System (CASCPCS) has therefore been developed by using the rule-base deduction method to assist operators in a nuclear power plant. CASCPCS consists of three major parts. The inference engine controls the search of the knowledge base, executes the rules, and tracks the inference process using the depth-first search method. The knowledge base consists of four major parts: rules, database constants, trip buffer variables and causes. The user interface is implemented using menu-driven and window display techniques. The advantages of CASCPCS are that it saves the time and effort needed to diagnose the trip causes of a Core Protection Calculator System, it increases a plant's availability and reliability, and it is easy to operate because only cursor control is used.
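
    CASCPCS itself is not public, but the mechanism the abstract names (deduction over a rule base with depth-first search) can be sketched generically. The rules and signal names below are illustrative placeholders, not actual CPCS trip logic.

```python
# Generic depth-first, backward-chaining deduction over a small rule base.
# Goals and facts are invented stand-ins for trip-buffer variables and causes.
RULES = {
    "low_flow_cause": [["dnbr_trip", "flow_low"]],
    "overpower_cause": [["dnbr_trip", "power_high"]],
    "power_high": [["neutron_flux_high"], ["t_delta_high", "pump_speed_ok"]],
}

def prove(goal, facts):
    """Depth-first proof: a goal holds if it is a fact, or if every premise
    of one of its rules can itself be proven."""
    if goal in facts:
        return True
    return any(all(prove(p, facts) for p in premises)
               for premises in RULES.get(goal, []))

facts = {"dnbr_trip", "neutron_flux_high"}
causes = [g for g in ("low_flow_cause", "overpower_cause") if prove(g, facts)]
print(causes)  # ['overpower_cause']
```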

  19. Rule-based Mamdani-type fuzzy modelling of thermal performance of fintube evaporator under frost conditions

    Directory of Open Access Journals (Sweden)

    Ozen Dilek Nur

    2016-01-01

    Full Text Available Frost formation brings about insulating effects over the surface of a heat exchanger, thereby deteriorating the total heat transfer of the heat exchanger. In this study, a fin-tube evaporator is modeled by making use of Rule-based Mamdani-Type Fuzzy (RBMTF) logic, where the total heat transfer, air inlet temperatures of 2 °C to 7 °C and four different fluid speed groups (ua1 = 1; 1.44; 1.88 m s-1, ua2 = 2.32; 2.76 m s-1, ua3 = 3.2; 3.64 m s-1, ua4 = 4.08; 4.52; 4.96 m s-1) for the evaporator were taken into consideration. In the developed RBMTF system, the outlet parameter UA was determined using the inlet parameters Ta and ua. The RBMTF was trained and tested by using the MATLAB® fuzzy logic toolbox. R² (%) for the training data and test data was found to be 99.91%. With this study, it has been shown that the RBMTF model can be reliably used in the determination of the total heat transfer of a fin-tube evaporator.

  20. ALPHABET SIGN LANGUAGE RECOGNITION USING LEAP MOTION TECHNOLOGY AND RULE BASED BACKPROPAGATION-GENETIC ALGORITHM NEURAL NETWORK (RBBPGANN

    Directory of Open Access Journals (Sweden)

    Wijayanti Nurul Khotimah

    2017-01-01

    Full Text Available Sign language recognition is used to help people with normal hearing communicate effectively with the deaf and hearing-impaired. Based on a survey conducted by a Multi-Center Study in Southeast Asia, Indonesia ranks in the top four in the number of people with hearing disability (4.6%). Therefore, the existence of sign language recognition is important. Some research has been conducted in this field, and many neural network types have been used for recognizing many kinds of sign languages. However, their performance still needs to be improved. This work focuses on the ASL (Alphabet Sign Language) in SIBI (Sign System of Indonesian Language), which uses one hand and 26 gestures. Here, thirty-four features were extracted by using Leap Motion. Further, a new method, Rule Based-Backpropagation Genetic Algorithm Neural Network (RB-BPGANN), was used to recognize these sign languages. This method is a combination of rules and a Backpropagation Genetic Algorithm Neural Network (BPGANN). Based on experiments, this proposed application can recognize sign language with up to 93.8% accuracy. It is very good at recognizing large multiclass instances and can be a solution to the overfitting problem in neural network algorithms.

  1. Fuzzy rule-based modelling for human health risk from naturally occurring radioactive materials in produced water

    International Nuclear Information System (INIS)

    Shakhawat, Chowdhury; Tahir, Husain; Neil, Bose

    2006-01-01

    Produced water, discharged from offshore oil and gas operations, contains chemicals from formation water, condensed water, and any chemical added down hole or during the oil/water separation process. Although most of the contaminants fall below detection limits within a short distance from the discharge port, a few of the remaining contaminants, including naturally occurring radioactive materials (NORM), are of concern due to their bioavailability in the media and their bioaccumulation in finfish and shellfish species used for human consumption. In the past, several initiatives have been taken to model human health risk from NORM in produced water. The parameters of the available risk assessment models are imprecise and sparse in nature. In this study, a fuzzy possibilistic evaluation using fuzzy rule-based modeling is presented. Being conservative in nature, the possibilistic approach considers all possible input parameter values and thus provides better environmental prediction than the Monte Carlo (MC) calculation. The uncertainties of the input parameters were captured with fuzzy triangular membership functions (TFNs). Fuzzy if-then rules were applied to the input concentrations of two isotopes of radium available in produced water, namely 226Ra and 228Ra, and to the bulk dilution, to evaluate the radium concentration in fish tissue used for human consumption. The bulk dilution was predicted using four input parameters: produced water discharge rate, ambient seawater velocity, depth of discharge port and density gradient. The evaluated cancer risk shows compliance with the regulatory guidelines; thus minimal risk to human health is expected from NORM components in produced water.
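
    One common way to realize such a possibilistic calculation is interval arithmetic on the alpha-cuts of triangular fuzzy numbers. The sketch below shows that mechanic for a toy concentration-times-transfer product; all numeric values are invented, not the paper's parameters.

```python
# Possibilistic propagation of triangular fuzzy numbers (TFNs) via alpha-cuts.
def alpha_cut(tfn, alpha):
    """Interval [lo, hi] of a TFN (a, b, c) at membership level alpha."""
    a, b, c = tfn
    return (a + alpha * (b - a), c - alpha * (c - b))

def multiply(t1, t2, levels=(0.0, 0.5, 1.0)):
    """Interval-multiply two positive TFNs at a few alpha levels."""
    out = {}
    for alpha in levels:
        (l1, h1), (l2, h2) = alpha_cut(t1, alpha), alpha_cut(t2, alpha)
        out[alpha] = (l1 * l2, h1 * h2)   # valid for positive intervals
    return out

radium = (0.1, 0.5, 1.2)       # hypothetical concentration TFN
transfer = (0.01, 0.05, 0.08)  # hypothetical dilution/transfer-factor TFN
for alpha, interval in multiply(radium, transfer).items():
    print(f"alpha={alpha}: product interval {interval}")
```

    At alpha = 1 the interval collapses to the product of the most-possible values; at alpha = 0 it spans the full support, which is the conservative envelope a possibilistic approach reports.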

  2. Application of a rule-based model to estimate mercury exchange for three background biomes in the continental United States

    Science.gov (United States)

    Hartman, J.S.; Weisberg, P.J.; Pillai, R.; Ericksen, J.A.; Kuiken, T.; Lindberg, S.E.; Zhang, H.; Rytuba, J.J.; Gustin, M.S.

    2009-01-01

    Ecosystems that have low mercury (Hg) concentrations (i.e., not enriched or impacted by geologic or anthropogenic processes) cover most of the terrestrial surface area of the earth, yet their role as a net source or sink for atmospheric Hg is uncertain. Here we use empirical data to develop a rule-based model implemented within a geographic information system framework to estimate the spatial and temporal patterns of Hg flux for semiarid deserts, grasslands, and deciduous forests representing 45% of the continental United States. This exercise provides an indication of whether these ecosystems are a net source or sink for atmospheric Hg as well as a basis for recommendation of data to collect in future field sampling campaigns. Results indicated that soil alone was a small net source of atmospheric Hg and that emitted Hg could be accounted for based on Hg input by wet deposition. When foliar assimilation and wet deposition are added to the area estimate of soil Hg flux these biomes are a sink for atmospheric Hg. © 2009 American Chemical Society.

  3. Differential impact of relevant and irrelevant dimension primes on rule-based and information-integration category learning.

    Science.gov (United States)

    Grimm, Lisa R; Maddox, W Todd

    2013-11-01

    Research has identified multiple category-learning systems with each being "tuned" for learning categories with different task demands and each governed by different neurobiological systems. Rule-based (RB) classification involves testing verbalizable rules for category membership while information-integration (II) classification requires the implicit learning of stimulus-response mappings. In the first study to directly test rule priming with RB and II category learning, we investigated the influence of the availability of information presented at the beginning of the task. Participants viewed lines that varied in length, orientation, and position on the screen, and were primed to focus on stimulus dimensions that were relevant or irrelevant to the correct classification rule. In Experiment 1, we used an RB category structure, and in Experiment 2, we used an II category structure. Accuracy and model-based analyses suggested that a focus on relevant dimensions improves RB task performance later in learning while a focus on an irrelevant dimension improves II task performance early in learning. © 2013.

  4. MODEL OF MOBILE TRANSLATOR APPLICATION OF ENGLISH TO BAHASA INDONESIA WITH RULE-BASED AND J2ME

    Directory of Open Access Journals (Sweden)

    Dian Puspita Tedjosurya

    2014-05-01

    Full Text Available Along with the development of information technology in recent years, a number of new applications have emerged, especially on mobile phones. Mobile phones are used not only as communication media but also as learning media, for example through translator applications. A translator application can be a tool for learning a language, such as an English to Bahasa Indonesia translator. The purpose of this research is to allow users to translate English to Bahasa Indonesia easily on a mobile phone. The translator application in this research was developed in the Java programming language (specifically J2ME) because it can run on various operating systems and is open source, so it can be easily developed and distributed. Data collection was done through literature study, observation, and a survey of similar applications. Development of the system used object-oriented analysis and design, described with use case diagrams, class diagrams, sequence diagrams, and activity diagrams. The translation process used a rule-based method. The result of this research is a Java-based translator application that can translate English sentences into Indonesian. The application can be accessed using a mobile phone with an Internet connection. It has a spell-check feature that detects misspelled words and suggests alternatives close to the input word. The conclusion of this research is that the application translates everyday conversational sentences quite well, with sentence structure that corresponds to and stays close to the original meaning.
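
    A rule-based translator of this kind combines dictionary lookup with syntactic reordering rules; one well-known English/Indonesian difference is that Indonesian places adjectives after the noun and has no articles. The sketch below illustrates only that pattern, with a tiny invented lexicon rather than the application's dictionary.

```python
# Toy rule-based English -> Bahasa Indonesia translation: dictionary lookup
# plus one reordering rule (the adjective follows the noun in Indonesian).
LEXICON = {"red": ("merah", "ADJ"), "big": ("besar", "ADJ"),
           "car": ("mobil", "N"), "house": ("rumah", "N"),
           "the": (None, "DET"), "a": (None, "DET")}  # articles are dropped

def translate(sentence):
    tagged = [LEXICON.get(w.lower(), (w, "UNK")) for w in sentence.split()]
    words = [(w, tag) for w, tag in tagged if w is not None]
    out, i = [], 0
    while i < len(words):
        # Rule: ADJ + N in English becomes N + ADJ in Indonesian.
        if i + 1 < len(words) and words[i][1] == "ADJ" and words[i + 1][1] == "N":
            out += [words[i + 1][0], words[i][0]]
            i += 2
        else:
            out.append(words[i][0])
            i += 1
    return " ".join(out)

print(translate("the red car"))  # mobil merah
```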

  5. Using a down-scaled bioclimate envelope model to determine long-term temporal connectivity of Garry oak (Quercus garryana) habitat in western North America: implications for protected area planning.

    Science.gov (United States)

    Pellatt, Marlow G; Goring, Simon J; Bodtker, Karin M; Cannon, Alex J

    2012-04-01

    Under the Canadian Species at Risk Act (SARA), Garry oak (Quercus garryana) ecosystems are listed as "at-risk" and act as an umbrella for over one hundred species that are endangered to some degree. Understanding Garry oak responses to future climate scenarios at scales relevant to protected area managers is essential to effectively manage existing protected area networks and to guide the selection of temporally connected migration corridors, additional protected areas, and to maintain Garry oak populations over the next century. We present Garry oak distribution scenarios using two random forest models calibrated with down-scaled bioclimatic data for British Columbia, Washington, and Oregon based on 1961-1990 climate normals. The suitability models are calibrated using either both precipitation and temperature variables or using only temperature variables. We compare suitability predictions from four General Circulation Models (GCMs) and present CGCM2 model results under two emissions scenarios. For each GCM and emissions scenario we apply the two Garry oak suitability models and use the suitability models to determine the extent and temporal connectivity of climatically suitable Garry oak habitat within protected areas from 2010 to 2099. The suitability models indicate that while 164 km² of the total protected area network in the region (47,990 km²) contains recorded Garry oak presence, 1635 and 1680 km² of climatically suitable Garry oak habitat is currently under some form of protection. Of this suitable protected area, only between 6.6 and 7.3% will be "temporally connected" between 2010 and 2099 based on the CGCM2 model. These results highlight the need for public and private protected area organizations to work cooperatively in the development of corridors to maintain temporal connectivity in climatically suitable areas for the future of Garry oak ecosystems.

  6. Rule-Based Reasoning Is Fast and Belief-Based Reasoning Can Be Slow: Challenging Current Explanations of Belief-Bias and Base-Rate Neglect

    Science.gov (United States)

    Newman, Ian R.; Gibb, Maia; Thompson, Valerie A.

    2017-01-01

    It is commonly assumed that belief-based reasoning is fast and automatic, whereas rule-based reasoning is slower and more effortful. Dual-Process theories of reasoning rely on this speed-asymmetry explanation to account for a number of reasoning phenomena, such as base-rate neglect and belief-bias. The goal of the current study was to test this…

  7. Accounting standards and earnings management : The role of rules-based and principles-based accounting standards and incentives on accounting and transaction decisions

    NARCIS (Netherlands)

    Beest, van F.

    2012-01-01

    This book examines the effect that rules-based and principles-based accounting standards have on the level and nature of earnings management decisions. A cherry-picking experiment is conducted to test the hypothesis that a substitution effect is expected from accounting decisions to transaction decisions.

  8. The Balance-Scale Task Revisited: A Comparison of Statistical Models for Rule-Based and Information-Integration Theories of Proportional Reasoning.

    Directory of Open Access Journals (Sweden)

    Abe D Hofman

    Full Text Available We propose and test three statistical models for the analysis of children's responses to the balance scale task, a seminal task to study proportional reasoning. We use a latent class modelling approach to formulate a rule-based latent class model (RB LCM) following from a rule-based perspective on proportional reasoning and a new statistical model, the Weighted Sum Model, following from an information-integration approach. Moreover, a hybrid LCM using item covariates is proposed, combining aspects of both a rule-based and an information-integration perspective. These models are applied to two different datasets, a standard paper-and-pencil test dataset (N = 779), and a dataset collected within an online learning environment that included direct feedback, time-pressure, and a reward system (N = 808). For the paper-and-pencil dataset the RB LCM resulted in the best fit, whereas for the online dataset the hybrid LCM provided the best fit. The standard paper-and-pencil dataset yielded more evidence for distinct solution rules than the online dataset, in which quantitative item characteristics are more prominent in determining responses. These results shed new light on the discussion on sequential rule-based and information-integration perspectives of cognitive development.

  9. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  10. Life-threatening false alarm rejection in ICU: using the rule-based and multi-channel information fusion method.

    Science.gov (United States)

    Liu, Chengyu; Zhao, Lina; Tang, Hong; Li, Qiao; Wei, Shoushui; Li, Jianqing

    2016-08-01

    False alarm (FA) rates as high as 86% have been reported in intensive care unit monitors. High FA rates decrease quality of care by slowing staff response times while increasing patient burdens and stresses. In this study, we proposed a rule-based and multi-channel information fusion method for accurately classifying true or false alarms for five life-threatening arrhythmias: asystole (ASY), extreme bradycardia (EBR), extreme tachycardia (ETC), ventricular tachycardia (VTA) and ventricular flutter/fibrillation (VFB). The proposed method consisted of five steps: (1) signal pre-processing, (2) feature detection and validation, (3) true/false alarm determination for each channel, (4) 'real-time' true/false alarm determination and (5) 'retrospective' true/false alarm determination (if needed). Up to four signal channels, that is, two electrocardiogram signals, one arterial blood pressure signal and/or one photoplethysmogram signal, were included in the analysis. Two events were set for the method validation: event 1 for 'real-time' and event 2 for 'retrospective' alarm classification. The results showed that 100% true positive ratios (i.e. sensitivity) on the training set were obtained for the ASY, EBR, ETC and VFB types, and 94% for the VTA type, accompanied by corresponding true negative ratios (i.e. specificity) of 93%, 81%, 78%, 85% and 50% respectively, resulting in score values of 96.50, 90.70, 88.89, 92.31 and 64.90, as well as a final score of 80.57 for event 1 and 79.12 for event 2. For the test set, the proposed method obtained scores of 88.73 for ASY, 77.78 for EBR, 89.92 for ETC, 67.74 for VFB and 61.04 for VTA, with a final score of 71.68 for event 1 and 75.91 for event 2.

  11. Development & optimization of a rule-based energy management strategy for fuel economy improvement in hybrid electric vehicles

    Science.gov (United States)

    Asfoor, Mostafa

    The gradual decline of oil reserves and the increasing demand for energy over the past decades have resulted in automotive manufacturers seeking alternative solutions to reduce the dependency on fossil-based fuels for transportation. A viable technology that enables significant improvements in the overall energy conversion efficiency is the hybridization of conventional vehicle drive systems. This dissertation builds on prior hybrid powertrain development at the University of Idaho. Advanced vehicle models of a passenger car with a conventional powertrain and three different hybrid powertrain layouts were created using GT-Suite. These different powertrain models were validated against a variety of standard driving cycles. The overall fuel economy, energy consumption, and losses were monitored, and a comprehensive energy analysis was performed to compare energy sources and sinks. The GT-Suite model was then used to predict the Formula Hybrid SAE vehicle performance. Inputs to this model were a numerically predicted engine performance map, an electric motor torque curve, vehicle geometry, and road load parameters derived from a roll-down test. In this case study, the vehicle had a supervisory controller that followed a rule-based energy management strategy to ensure a proper power split during hybrid mode operation. The supervisory controller parameters were optimized using a discrete grid optimization method that minimized the total amount of fuel consumed during a specific urban driving cycle with an average speed of approximately 30 [mph]. More than a 15% increase in fuel economy was achieved by adding supervisory control and managing the power split. The vehicle configuration without the supervisory controller displayed a fuel economy of 25 [mpg]. With the supervisory controller this rose to 29 [mpg]. Wider applications of this research include hybrid vehicle controller designs that can extend the range and survivability of military combat platforms. Furthermore, the

  12. Estimation of Tree Cover in an Agricultural Parkland of Senegal Using Rule-Based Regression Tree Modeling

    Directory of Open Access Journals (Sweden)

    Stefanie M. Herrmann

    2013-10-01

    Full Text Available Field trees are an integral part of the farmed parkland landscape in West Africa and provide multiple benefits to the local environment and livelihoods. While field trees have received increasing interest in the context of strengthening resilience to climate variability and change, the actual extent of farmed parkland and spatial patterns of tree cover are largely unknown. We used the rule-based predictive modeling tool Cubist® to estimate field tree cover in the west-central agricultural region of Senegal. A collection of rules and associated multiple linear regression models was constructed from (1) a reference dataset of percent tree cover, derived from very high spatial resolution data (2 m Orbview), as the dependent variable, and (2) ten years of 10-day 250 m Moderate Resolution Imaging Spectrometer (MODIS) Normalized Difference Vegetation Index (NDVI) composites and derived phenological metrics as independent variables. Correlation coefficients between modeled and reference percent tree cover of 0.88 and 0.77 were achieved for training and validation data respectively, with absolute mean errors of 1.07 and 1.03 percent tree cover. The resulting map shows a west-east gradient from high tree cover in the peri-urban areas of horticulture and arboriculture to low tree cover in the more sparsely populated eastern part of the study area. A comparison of current (2000s) tree cover along this gradient with historic cover as seen on Corona images reveals dynamics of change but also areas of remarkable stability of field tree cover since 1968. The proposed modeling approach can help to identify locations of high and low tree cover in dryland environments and guide ground studies and management interventions aimed at promoting the integration of field trees in agricultural systems.

  13. A noninvasive method for coronary artery diseases diagnosis using a clinically-interpretable fuzzy rule-based system

    Directory of Open Access Journals (Sweden)

    Hamid Reza Marateb

    2015-01-01

    Full Text Available Background: Coronary heart disease/coronary artery disease (CHD/CAD), the most common form of cardiovascular disease (CVD), is a major cause of death and disability in developing and developed countries. CAD risk factors can be detected by physicians to prevent CAD occurrence in the near future. Invasive coronary angiography, a current diagnosis method, is costly and associated with morbidity and mortality in CAD patients. The aim of this study was to design a computer-based noninvasive CAD diagnosis system with clinically interpretable rules. Materials and Methods: In this study, the Cleveland CAD dataset from the University of California, Irvine (UCI) was used. The interval-scale variables were discretized, with cut points taken from the literature. A fuzzy rule-based system was then formulated based on a neuro-fuzzy classifier (NFC) whose learning procedure was sped up by the scaled conjugate gradient algorithm. Two feature selection (FS) methods, multiple logistic regression (MLR) and sequential FS, were used to reduce the required attributes. The performance of the NFC (without/with FS) was then assessed in a hold-out validation framework. Further cross-validation was performed on the best classifier. Results: In this dataset, 16 complete attributes along with the binary CHD diagnosis (gold standard) for 272 subjects (68% male) were analyzed. MLR + NFC showed the best performance. Its overall sensitivity, specificity, accuracy, type I error (α) and statistical power were 79%, 89%, 84%, 0.1 and 79%, respectively. The selected features were "age and ST/heart rate slope categories," "exercise-induced angina status," fluoroscopy, and thallium-201 stress scintigraphy results. Conclusion: The proposed method showed "substantial agreement" with the gold standard. This algorithm is thus a promising tool for screening CAD patients.

  14. Negation handling in sentiment classification using rule-based adapted from Indonesian language syntactic for Indonesian text in Twitter

    Science.gov (United States)

    Amalia, Rizkiana; Arif Bijaksana, Moch; Darmantoro, Dhinta

    2018-03-01

    The presence of a negation word can change the polarity of a text; if it is not handled properly, it will affect the performance of sentiment classification. Negation words in Indonesian are ‘tidak’, ‘bukan’, ‘belum’ and ‘jangan’. There is also a conjunction that can reverse the stated values, namely ‘tetapi’ or ‘tapi’. Unigram features have shortcomings in dealing with negation because they treat the negation word and the negated words as separate words. A general approach for negation handling in English text gives the tag ‘NEG_’ to the words following a negation until the first punctuation, but this may tag un-negated words, and it does not handle negation and conjunction within one sentence. A rule-based method to determine which words are negated, adapting the rules of Indonesian syntax to determine the scope of negation, is proposed in this study. Adapting syntactic rules and tagging “NEG_”, an SVM classifier with an RBF kernel gives better performance results than the other experiments. Considering the average F1-score, the proposed method improves on the baseline by 1.79% (baseline without negation handling) and 5% (baseline with existing negation handling) for a dataset in which all tweets contain negation words, and by 2.69% (without negation handling) and 3.17% (with existing negation handling) for a second dataset with a varying number of negation words per tweet.
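
    A simplified version of such scope handling can be written directly: prefix ‘NEG_’ to tokens after a negation cue and close the scope at punctuation or at the contrastive conjunction tapi/tetapi. This is a stand-in for the paper's fuller syntactic rules, not a reproduction of them.

```python
# Sketch of negation-scope tagging for Indonesian text.
NEGATIONS = {"tidak", "bukan", "belum", "jangan"}
SCOPE_BREAKERS = {"tetapi", "tapi"}   # contrastive conjunctions end the scope
PUNCT = ".,!?;"

def tag_negation(text):
    out, in_scope = [], False
    for token in text.split():
        word = token.strip(PUNCT).lower()
        if word in NEGATIONS:
            in_scope = True
            out.append(token)
        elif word in SCOPE_BREAKERS:
            in_scope = False
            out.append(token)
        else:
            out.append("NEG_" + token if in_scope else token)
            if any(ch in PUNCT for ch in token):
                in_scope = False          # punctuation closes the scope
    return " ".join(out)

print(tag_negation("makanannya tidak enak tapi pelayanannya bagus"))
# makanannya tidak NEG_enak tapi pelayanannya bagus
```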

  15. RISMA: A Rule-based Interval State Machine Algorithm for Alerts Generation, Performance Analysis and Monitoring Real-Time Data Processing

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2013-04-01

    The monitoring of real-time systems is a challenging and complicated process, so there is a continuous need to improve the monitoring process through the use of new intelligent techniques and algorithms for detecting exceptions and anomalous behaviours and for generating the necessary alerts during the workflow monitoring of such systems. Interval-based or period-based theorems have been discussed, analysed, and used by many researchers in Artificial Intelligence (AI), philosophy, and linguistics. As explained by Allen, there are 13 relations between any two intervals. There have also been many studies of interval-based temporal reasoning and logics over the past decades. Interval-based theorems can be used for monitoring real-time interval-based data processing. However, increasing the number of processed intervals makes the implementation of such theorems a complex and time-consuming process, as the number of relationships between intervals increases exponentially. To overcome this problem, this paper presents a Rule-based Interval State Machine Algorithm (RISMA) for processing, monitoring, and analysing the behaviour of interval-based data received from real-time sensors. The proposed intelligent algorithm uses the Interval State Machine (ISM) approach to model any number of interval-based data into well-defined states as well as inferring them. An interval-based state transition model and methodology are presented to identify the relationships between the different states of the proposed algorithm. By using such a model, the unlimited number of relationships between similar large numbers of intervals can be reduced to only 18 direct relationships using the proposed well-defined states. For testing the proposed algorithm, the necessary inference rules and code have been designed and applied to the continuous data received in near real-time from the stations of the International Monitoring System (IMS) by the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO).
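
    Allen's 13 relations between two proper intervals can be decided purely from endpoint comparisons, which is the foundation such interval state machines build on. A compact classifier is sketched below; it does not reproduce RISMA's states or inference rules.

```python
# Classify the Allen relation between two proper intervals (start < end).
# Seven base relations are tested; the six inverses come from swapping.
def allen(x, y):
    x1, x2 = x
    y1, y2 = y
    if x2 < y1:
        return "before"
    if x2 == y1:
        return "meets"
    if x1 < y1 < x2 < y2:
        return "overlaps"
    if x1 == y1 and x2 < y2:
        return "starts"
    if y1 < x1 and x2 < y2:
        return "during"
    if y1 < x1 and x2 == y2:
        return "finishes"
    if x1 == y1 and x2 == y2:
        return "equals"
    return allen(y, x) + "-inverse"

print(allen((1, 3), (3, 6)))  # meets
print(allen((2, 4), (1, 8)))  # during
print(allen((1, 8), (2, 4)))  # during-inverse
```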

  16. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  17. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  18. A decision support system and rule-based algorithm to augment the human interpretation of the 12-lead electrocardiogram.

    Science.gov (United States)

    Cairns, Andrew W; Bond, Raymond R; Finlay, Dewar D; Guldenring, Daniel; Badilini, Fabio; Libretti, Guido; Peace, Aaron J; Leslie, Stephen J

    The 12-lead Electrocardiogram (ECG) has been used to detect cardiac abnormalities in the same format for more than 70 years. However, due to the complex nature of 12-lead ECG interpretation, there is a significant cognitive workload required from the interpreter. This complexity in ECG interpretation often leads to errors in diagnosis and subsequent treatment. We have previously reported on the development of an ECG interpretation support system designed to augment the human interpretation process. This computerised decision support system has been named 'Interactive Progressive based Interpretation' (IPI). In this study, a decision support algorithm was built into the IPI system to suggest potential diagnoses based on the interpreter's annotations of the 12-lead ECG. We hypothesise that semi-automatic interpretation using a digital assistant can be an optimal man-machine model for ECG interpretation, improving interpretation accuracy and reducing missed co-abnormalities. The Differential Diagnoses Algorithm (DDA) was developed using web technologies, with diagnostic ECG criteria defined in an open storage format, JavaScript Object Notation (JSON), which is queried using a rule-based reasoning algorithm to suggest diagnoses. To test our hypothesis, a counterbalanced trial was designed in which subjects interpreted ECGs using the conventional approach and using the IPI+DDA approach. A total of 375 interpretations were collected. The IPI+DDA approach was shown to improve diagnostic accuracy by 8.7% (although not statistically significant, p-value = 0.1852), and the IPI+DDA suggested the correct interpretation more often than the human interpreter in 7/10 cases (varying statistical significance). Human interpretation accuracy increased to 70% when seven suggestions were generated. Although the results were not found to be statistically significant, we found that: 1) our decision support tool increased the number of correct interpretations, and 2) the DDA algorithm suggested the correct interpretation more often than the human interpreter.
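
    The design the abstract describes (diagnostic criteria stored as JSON and queried by a rule-based reasoner) can be sketched as follows; the criteria shown are simplified textbook-style placeholders, not the actual DDA knowledge base.

```python
# Sketch of diagnosis suggestion from JSON-encoded criteria (placeholder rules).
import json

CRITERIA = json.loads("""
[
  {"diagnosis": "Sinus bradycardia",
   "requires": {"rhythm": "sinus", "rate_max": 59}},
  {"diagnosis": "Sinus tachycardia",
   "requires": {"rhythm": "sinus", "rate_min": 101}},
  {"diagnosis": "Atrial fibrillation",
   "requires": {"rhythm": "irregularly irregular", "p_waves": false}}
]
""")

def suggest(annotations):
    """Return every diagnosis whose criteria match the interpreter's annotations."""
    hits = []
    for rule in CRITERIA:
        ok = all(
            annotations.get("rate", 0) <= v if k == "rate_max" else
            annotations.get("rate", 0) >= v if k == "rate_min" else
            annotations.get(k) == v
            for k, v in rule["requires"].items())
        if ok:
            hits.append(rule["diagnosis"])
    return hits

print(suggest({"rhythm": "sinus", "rate": 48, "p_waves": True}))
# ['Sinus bradycardia']
```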

  19. Transfer between local and global processing levels by pigeons (Columba livia) and humans (Homo sapiens) in exemplar- and rule-based categorization tasks.

    Science.gov (United States)

    Aust, Ulrike; Braunöder, Elisabeth

    2015-02-01

    The present experiment investigated pigeons' and humans' processing styles (local or global) in an exemplar-based visual categorization task, in which category membership of every stimulus had to be learned individually, and in a rule-based task, in which category membership was defined by a perceptual rule. Group Intact was trained with the original pictures (providing both intact local and global information), Group Scrambled was trained with scrambled versions of the same pictures (impairing global information), and Group Blurred was trained with blurred versions (impairing local information). Subsequently, all subjects were tested for transfer to the two untrained presentation modes. Humans outperformed pigeons regarding learning speed and accuracy as well as transfer performance and showed good learning irrespective of group assignment, whereas the pigeons of Group Blurred needed longer to learn the training tasks than the pigeons of Groups Intact and Scrambled. Also, whereas humans generalized equally well to any novel presentation mode, pigeons' transfer from and to blurred stimuli was impaired. Both species showed faster learning and, for the most part, better transfer in the rule-based than in the exemplar-based task, but there was no evidence of the processing mode used depending on the type of task (exemplar- or rule-based). Whereas pigeons relied on local information throughout, humans did not show a preference for either processing level. Additional tests with grayscale versions of the training stimuli, with versions that were both blurred and scrambled, and with novel instances of the rule-based task confirmed and further extended these findings. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  20. MIRD methodology

    International Nuclear Information System (INIS)

    Rojo, Ana M.; Gomez Parada, Ines

    2004-01-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in estimating the dose to organs and tissues due to the incorporation of radioactive materials. Since then, the 'MIRD Dose Estimate Reports' (numbers 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for absorbed dose calculations in different tissues is explained.

  1. PSA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Magne, L

    1997-12-31

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to a more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs.

  2. PSA methodology

    International Nuclear Information System (INIS)

    Magne, L.

    1996-01-01

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to a more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs

  3. A study on the optimal fuel loading pattern design in pressurized water reactor using the artificial neural network and the fuzzy rule based system

    International Nuclear Information System (INIS)

    Kim, Han Gon; Chang, Soon Heung; Lee, Byung

    2004-01-01

    The Optimal Fuel Shuffling System (OFSS) is developed for the optimal design of PWR fuel loading patterns. In this paper, an optimal loading pattern is defined as one in which the local power peaking factor is lower than a predetermined value during one cycle and the effective multiplication factor is maximized in order to extract maximum energy. OFSS is a hybrid system in which a rule-based system, fuzzy logic, and an artificial neural network are connected to each other. The rule-based system classifies loading patterns into two classes using several heuristic rules and a fuzzy rule. A fuzzy rule is introduced to achieve more effective and faster searching. Its membership function is automatically updated in accordance with the prediction results. The artificial neural network predicts core parameters for the patterns generated by the rule-based system. A back-propagation network is used for fast prediction of core parameters. The artificial neural network and the fuzzy logic can be used as tools for improving the capabilities of existing algorithms. OFSS was demonstrated and validated for cycle 1 of the Kori unit 1 PWR. (author)

  4. A study on the optimal fuel loading pattern design in pressurized water reactor using the artificial neural network and the fuzzy rule based system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Han Gon; Chang, Soon Heung; Lee, Byung [Department of Nuclear Engineering, Korea Advanced Institute of Science and Technology, Yusong-gu, Taejon (Korea, Republic of)

    2004-07-01

    The Optimal Fuel Shuffling System (OFSS) is developed for the optimal design of PWR fuel loading patterns. In this paper, an optimal loading pattern is defined as one in which the local power peaking factor is lower than a predetermined value during one cycle and the effective multiplication factor is maximized in order to extract maximum energy. OFSS is a hybrid system in which a rule-based system, fuzzy logic, and an artificial neural network are connected to each other. The rule-based system classifies loading patterns into two classes using several heuristic rules and a fuzzy rule. A fuzzy rule is introduced to achieve more effective and faster searching. Its membership function is automatically updated in accordance with the prediction results. The artificial neural network predicts core parameters for the patterns generated by the rule-based system. A back-propagation network is used for fast prediction of core parameters. The artificial neural network and the fuzzy logic can be used as tools for improving the capabilities of existing algorithms. OFSS was demonstrated and validated for cycle 1 of the Kori unit 1 PWR. (author)

  5. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  6. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs

  7. A study on the optimal fuel loading pattern design in pressurized water reactors using the artificial neural network and the fuzzy rule based system

    International Nuclear Information System (INIS)

    Kim, Han Gon

    1993-02-01

    In pressurized water reactors, the fuel reloading problem has significant meaning in terms of both safety and economics. The general problem of in-core fuel management for a PWR therefore consists of determining the fuel reloading policy for each cycle that minimizes unit energy cost under the constraints imposed on various core parameters, e.g., the local power peaking factor and assembly burnup. This is equivalent to maximizing cycle length for a given energy cost under the various constraints. Existing optimization methods do not ensure the globally optimal solution because of the essential limitations of their search algorithms; they only find near-optimal solutions. To address this limitation, a hybrid artificial neural network system is developed for optimal fuel loading pattern design using a fuzzy rule-based system and artificial neural networks. This system finds patterns for which Pmax is lower than the predetermined value and keff is larger than the reference value. Back-propagation networks are developed to predict PWR core parameters. The reference PWR is a typical 121-assembly PWR. The local power peaking factor and the effective multiplication factor at the BOC condition are predicted. To obtain target values of these two parameters, the QCC code is used. Using this code, 1000 training patterns are obtained randomly. Two networks are constructed, one for Pmax and another for keff. Both networks have 21 input-layer neurons and 18 output-layer neurons, with 120 and 393 hidden-layer neurons, respectively. A new learning algorithm, called the advanced adaptive learning algorithm, is proposed. The weight change step size of this algorithm is varied inversely proportional to the average difference between the actual output value and the ideal target value. This algorithm greatly enhances the convergence speed of a BPN. In the case of Pmax prediction, 98% of the untrained patterns are predicted within 6% error, and in case
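
    One reading of the adaptive step-size rule described above (a step size varied inversely with the average output error) can be sketched in a toy back-propagation loop. The network sizes and training data below are invented, not the thesis's core patterns.

```python
# Toy backprop with an error-dependent step size (illustrative reading only).
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((100, 4))                    # toy inputs, not core patterns
T = X.sum(axis=1, keepdims=True) / 4.0      # toy targets in [0, 1]

W1 = rng.normal(0.0, 0.5, (4, 8))           # input -> hidden weights
W2 = rng.normal(0.0, 0.5, (8, 1))           # hidden -> output weights
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

eta0 = 1.0
for epoch in range(2000):
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    err = Y - T
    eta = eta0 / (1.0 + np.abs(err).mean())  # step size shrinks as error grows
    dY = err * Y * (1 - Y)                   # standard sigmoid backprop terms
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= eta * (H.T @ dY) / len(X)
    W1 -= eta * (X.T @ dH) / len(X)

print("final MAE:", float(np.abs(sigmoid(sigmoid(X @ W1) @ W2) - T).mean()))
```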

  8. Oil drilling rig diesel power-plant fuel efficiency improvement potentials through rule-based generator scheduling and utilization of battery energy storage system

    International Nuclear Information System (INIS)

    Pavković, Danijel; Sedić, Almir; Guzović, Zvonimir

    2016-01-01

    Highlights: • Isolated oil drilling rig microgrid power flows are analyzed over 30 days. • Rule-based diesel generator scheduling is proposed to reduce fuel consumption. • A battery energy storage is parameterized and used for peak load leveling. • The effectiveness of the proposed hybrid microgrid is verified by simulations. • Return on investment might be expected within 20% of battery system lifetime. - Abstract: This paper presents the development of a rule-based energy management control strategy suitable for isolated diesel power-plants equipped with a battery energy storage system for peak load shaving. The proposed control strategy includes a generator scheduling strategy and a peak load leveling scheme based on current microgrid active and reactive power requirements. In order to investigate the potential for fuel expenditure reduction, 30 days' worth of microgrid power flow data was collected on an isolated land-based oil drilling rig powered by a diesel generator power-plant, characterized by highly variable active and reactive load profiles due to intermittent engagements and disengagements of high-power electric machinery such as top-drive, draw-works and mud-pump motors. The analysis indicated that by avoiding the low-power operation of individual generators and by providing the peak power requirements (peak shaving) from a dedicated energy storage system, the power-plant fuel efficiency may be notably improved. An averaged power flow simulation model has been built, comprising the proposed rule-based power flow control strategy and the averaged model of a suitably sized battery energy storage system equipped with a grid-tied power converter and state-of-charge control system. The effectiveness of the proposed rule-based strategy has been evaluated by means of computer simulation analysis based on drilling rig microgrid active and reactive power data recorded during the 30-day period. The analysis indicated that fuel consumption of
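
    The two rules the highlights emphasize (avoiding low-power generator operation and shaving peaks from the battery) can be sketched as a small dispatch function; all ratings, thresholds and loads below are invented for illustration.

```python
# Toy rule-based generator scheduling with battery peak shaving.
GEN_RATED_KW = 1000.0   # per-generator rating (invented)
N_AVAILABLE = 2         # generators on the rig (invented)
MIN_LOAD_FRAC = 0.4     # efficiency floor for a running generator

def dispatch(load_kw, soc):
    """Return (n_generators, battery_kw); battery_kw > 0 means discharging."""
    n = min(N_AVAILABLE, max(1, int(load_kw // (GEN_RATED_KW * 0.8)) + 1))
    # Rule 1: shed generators rather than run them below the efficiency floor.
    while n > 1 and load_kw / n < GEN_RATED_KW * MIN_LOAD_FRAC:
        n -= 1
    surplus = load_kw - n * GEN_RATED_KW
    # Rule 2: shave load above committed capacity from the battery.
    if surplus > 0 and soc > 0.2:
        return n, surplus
    # Rule 3: use spare headroom to recharge a depleted battery.
    if surplus < 0 and soc < 0.8:
        return n, max(surplus, -200.0)    # limited charging power
    return n, 0.0

for load in (300.0, 1500.0, 2600.0):
    print(load, dispatch(load, soc=0.5))
```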

  9. Comparison of Natural Language Processing Rules-based and Machine-learning Systems to Identify Lumbar Spine Imaging Findings Related to Low Back Pain.

    Science.gov (United States)

    Tan, W Katherine; Hassanpour, Saeed; Heagerty, Patrick J; Rundell, Sean D; Suri, Pradeep; Huhdanpaa, Hannu T; James, Kathryn; Carrell, David S; Langlotz, Curtis P; Organ, Nancy L; Meier, Eric N; Sherman, Karen J; Kallmes, David F; Luetmer, Patrick H; Griffith, Brent; Nerenz, David R; Jarvik, Jeffrey G

    2018-03-28

    To evaluate a natural language processing (NLP) system built with open-source tools for identification of lumbar spine imaging findings related to low back pain on magnetic resonance and x-ray radiology reports from four health systems. We used a limited data set (de-identified except for dates) sampled from lumbar spine imaging reports of a prospectively assembled cohort of adults. From N = 178,333 reports, we randomly selected N = 871 to form a reference-standard dataset, consisting of N = 413 x-ray reports and N = 458 MR reports. Using standardized criteria, four spine experts annotated the presence of 26 findings, where 71 reports were annotated by all four experts and 800 were each annotated by two experts. We calculated inter-rater agreement and finding prevalence from annotated data. We randomly split the annotated data into development (80%) and testing (20%) sets. We developed an NLP system from both rule-based and machine-learned models. We validated the system using accuracy metrics such as sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). The multirater annotated dataset achieved inter-rater agreement of Cohen's kappa > 0.60 (substantial agreement) for 25 of 26 findings, with finding prevalence ranging from 3% to 89%. In the testing sample, rule-based and machine-learned predictions both had comparable average specificity (0.97 and 0.95, respectively). The machine-learned approach had a higher average sensitivity (0.94, compared to 0.83 for rules-based), and a higher overall AUC (0.98, compared to 0.90 for rules-based). Our NLP system performed well in identifying the 26 lumbar spine findings, as benchmarked by reference-standard annotation by medical experts. Machine-learned models provided substantial gains in model sensitivity with slight loss of specificity, and overall higher AUC. Copyright © 2018 The Association of University Radiologists. All rights reserved.

  10. Automatic Laser Pointer Detection Algorithm for Environment Control Device Systems Based on Template Matching and Genetic Tuning of Fuzzy Rule-Based Systems

    Directory of Open Access Journals (Sweden)

    F.

    2012-04-01

    Full Text Available In this paper we propose a new approach for laser-based environment device control systems based on the automatic design of a Fuzzy Rule-Based System for laser pointer detection. The idea is to improve on previous approaches by decreasing as much as possible the false offs, i.e., the detection of a false laser spot (since this could lead to dangerous situations), and by increasing the success rate in images containing a laser spot. To this end, we propose to analyze both the morphology and the color of a laser spot image together, thus developing a new robust algorithm. Genetic Fuzzy Systems have also been employed to improve the laser spot detection system by means of a fine tuning of the involved membership functions, thus reducing the system false offs, which is the main objective in this problem. The system presented in this paper makes use of a Fuzzy Rule-Based System adjusted by a Genetic Algorithm which, based on laser morphology and color analysis, shows a better success rate than previous approaches.
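
    A toy version of the fuzzy detection idea is sketched below, with triangular membership functions whose breakpoints would be the parameters a genetic algorithm tunes; the feature names, set shapes, and the single rule are illustrative assumptions, not the paper's actual rule base:

        def tri(x, a, b, c):
            """Triangular membership function; (a, b, c) are GA-tunable breakpoints."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        # Assumed fuzzy sets over two pixel features (brightness, redness in [0, 1]).
        def laser_spot_degree(brightness, redness, params):
            high_b = tri(brightness, *params["bright"])
            high_r = tri(redness, *params["red"])
            # Single illustrative rule: IF brightness is HIGH AND redness is HIGH
            # THEN laser spot (min used as the AND operator).
            return min(high_b, high_r)

        params = {"bright": (0.6, 0.9, 1.01), "red": (0.5, 0.8, 1.01)}  # GA would tune these
        print(laser_spot_degree(0.95, 0.85, params))  # firing degree in [0, 1]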

  11. Does GEM-Encoding Clinical Practice Guidelines Improve the Quality of Knowledge Bases? A Study with the Rule-Based Formalism

    Science.gov (United States)

    Georg, Gersende; Séroussi, Brigitte; Bouaud, Jacques

    2003-01-01

    The aim of this work was to determine whether the GEM-encoding step could improve the representation of clinical practice guidelines as formalized knowledge bases. We used the 1999 Canadian recommendations for the management of hypertension, chosen as the knowledge source in the ASTI project. We first clarified semantic ambiguities of therapeutic sequences recommended in the guideline by proposing an interpretative framework of therapeutic strategies. Then, after a formalization step to standardize the terms used to characterize clinical situations, we created the GEM-encoded instance of the guideline. We developed a module for the automatic derivation of a rule base, BR-GEM, from the instance. BR-GEM was then compared to the rule base, BR-ASTI, embedded within the critic mode of ASTI, and manually built by two physicians from the same Canadian guideline. As compared to BR-ASTI, BR-GEM is more specific and covers more clinical situations. When evaluated on 10 patient cases, the GEM-based approach led to promising results. PMID:14728173

  13. An Association Rule Based Method to Integrate Metro-Public Bicycle Smart Card Data for Trip Chain Analysis

    Directory of Open Access Journals (Sweden)

    De Zhao

    2018-01-01

    Full Text Available Smart card data provide valuable insights and massive samples for enhancing the understanding of transfer behavior between metro and public bicycle. However, smart cards for metro and public bicycle are often issued and managed by independent companies, and this results in the same commuter having different identity tags in the metro and public bicycle smart card systems. The primary objective of this study is to develop a data fusion methodology for matching metro and public bicycle smart cards for the same commuter using historical smart card data. A novel method with association rules to match the data derived from the two systems is proposed, and validation was performed. The results showed that our proposed method successfully matched 573 pairs of smart cards with an accuracy of 100%. We also validated the association rules method through visualization of individual metro and public bicycle trips. Based on the matched cards, interesting findings on metro-bicycle transfer have been derived, including the spatial pattern of the public bicycle as a first/last-mile solution as well as the duration of a metro trip chain.
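
    A simplified sketch of the matching idea: treat "metro exit followed by a bicycle rental at the same station within a short window" as a co-occurring transaction, then pair the metro card and bicycle card whose joint occurrences give the highest support and confidence. The record layout, thresholds, and the quadratic scan are all assumptions kept simple for illustration:

        from collections import Counter
        from itertools import product

        # Each record: (card_id, station, minute_of_study_period); layout is assumed.
        def match_cards(metro_exits, bike_rentals, window_min=10,
                        min_support=5, min_conf=0.8):
            pair_counts = Counter()
            metro_counts = Counter(card for card, _, _ in metro_exits)
            for (m_card, m_sta, m_t), (b_card, b_sta, b_t) in product(metro_exits,
                                                                      bike_rentals):
                if m_sta == b_sta and 0 <= b_t - m_t <= window_min:
                    pair_counts[(m_card, b_card)] += 1
            matches, best = {}, {}
            for (m_card, b_card), support in pair_counts.items():
                conf = support / metro_counts[m_card]   # confidence of the rule
                if (support >= min_support and conf >= min_conf
                        and conf > best.get(m_card, 0.0)):
                    best[m_card] = conf
                    matches[m_card] = b_card            # rule: metro card -> bike card
            return matches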

  14. The CARPEDIEM Algorithm: A Rule-Based System for Identifying Heart Failure Phenotype with a Precision Public Health Approach

    Directory of Open Access Journals (Sweden)

    Michela Franchini

    2018-01-01

    clustering of markers of chronic conditions by adding additional methodologies, e.g., Social Network Analysis. CARPEDIEM establishes the potential that an algorithmic approach, based on integrating administrative data with other public data sources, can enable the development of low cost and high value population-based evaluations for improving public health and impacting public health policies.

  15. Quantitative evaluation of variations in rule-based classifications of land cover in urban neighbourhoods using WorldView-2 imagery

    Science.gov (United States)

    Belgiu, Mariana; Drăguţ, Lucian; Strobl, Josef

    2014-01-01

    The increasing availability of high resolution imagery has triggered the need for automated image analysis techniques, with reduced human intervention and reproducible analysis procedures. The knowledge gained in the past might be of use in achieving this goal, if systematically organized into libraries which would guide the image analysis procedure. In this study we aimed at evaluating the variability of digital classifications carried out by three experts who were all assigned the same interpretation task. Besides the three classifications performed by independent operators, we developed an additional rule-based classification that relied on the image classification best practices found in the literature, and used it as a surrogate for libraries of object characteristics. The results showed statistically significant differences among all operators who classified the same reference imagery. The classifications carried out by the experts achieved satisfactory results when transferred to another area for extracting the same classes of interest, without modification of the developed rules.

  16. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  17. LipidMatch: an automated workflow for rule-based lipid identification using untargeted high-resolution tandem mass spectrometry data.

    Science.gov (United States)

    Koelmel, Jeremy P; Kroeger, Nicholas M; Ulmer, Candice Z; Bowden, John A; Patterson, Rainey E; Cochran, Jason A; Beecher, Christopher W W; Garrett, Timothy J; Yost, Richard A

    2017-07-10

    Lipids are ubiquitous and serve numerous biological functions; thus lipids have been shown to have great potential as candidates for elucidating biomarkers and pathway perturbations associated with disease. Methods expanding coverage of the lipidome increase the likelihood of biomarker discovery and could lead to more comprehensive understanding of disease etiology. We introduce LipidMatch, an R-based tool for lipid identification for liquid chromatography tandem mass spectrometry workflows. LipidMatch currently has over 250,000 lipid species spanning 56 lipid types contained in in silico fragmentation libraries. Unique fragmentation libraries, compared to other open source software, include oxidized lipids, bile acids, sphingosines, and previously uncharacterized adducts, including ammoniated cardiolipins. LipidMatch uses rule-based identification. For each lipid type, the user can select which fragments must be observed for identification. Rule-based identification allows for correct annotation of lipids based on the fragments observed, unlike typical identification based solely on spectral similarity scores, where over-reporting structural details that are not conferred by fragmentation data is common. Another unique feature of LipidMatch is ranking lipid identifications for a given feature by the sum of fragment intensities. For each lipid candidate, the intensities of experimental fragments with exact mass matches to expected in silico fragments are summed. The lipid identifications with the greatest summed intensity using this ranking algorithm were comparable to other lipid identification software annotations, MS-DIAL and Greazy. For example, for features with identifications from all 3 software, 92% of LipidMatch identifications by fatty acyl constituents were corroborated by at least one other software in positive mode and 98% in negative ion mode. LipidMatch allows users to annotate lipids across a wide range of high resolution tandem mass spectrometry
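
    The rule-based identification and intensity-ranking steps can be sketched as below; the lipid names, required fragment masses, and tolerance are invented examples, not LipidMatch's actual in silico libraries:

        # Hypothetical library: lipid -> m/z of fragments that MUST all be observed.
        LIBRARY = {
            "PC(16:0_18:1)": [184.0733, 255.2330, 281.2486],
            "PE(16:0_18:1)": [140.0118, 255.2330, 281.2486],
        }
        TOL = 0.005  # Da; assumed exact-mass tolerance

        def identify(spectrum):
            """spectrum: list of (mz, intensity). Keep candidates whose required
            fragments are all present; rank by summed matched-fragment intensity."""
            candidates = []
            for lipid, required in LIBRARY.items():
                matched = []
                for frag_mz in required:
                    hits = [i for mz, i in spectrum if abs(mz - frag_mz) <= TOL]
                    if not hits:
                        break
                    matched.append(max(hits))
                else:  # every required fragment was observed
                    candidates.append((sum(matched), lipid))
            return sorted(candidates, reverse=True)

        spec = [(184.0735, 9e5), (255.2331, 4e5), (281.2484, 3e5)]
        print(identify(spec))  # PC matches; PE fails on its 140.0118 fragment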

  18. A study on the development of the on-line operator aid system using rule based expert system and fuzzy logic for nuclear power plants

    International Nuclear Information System (INIS)

    Kang, Ki Sig

    1995-02-01

    The on-line Operator Aid SYStem (OASYS) has been developed to support the operator's decision making process and to ensure the safety of nuclear power plants (NPPs) by timely providing operators with proper guidelines according to the plant operation mode. The OASYS consists of four systems: the signal validation and management system (SVMS), the plant monitoring system (PMS), the alarm filtering and diagnostic system (AFDS), and the dynamic emergency procedure tracking system (DEPTS). The SVMS and the PMS help operators to maintain the plant in a normal operating condition. The AFDS covers abnormal events until they result in exceeding the limit range of reactor trip signals, while after a reactor trip, the DEPTS aids operators with proper guidelines so as to shut down safely. The OASYS uses a rule based expert system and fuzzy logic. The rule based expert system is used to classify the pre-defined events and track the emergency operating procedures (EOPs) through data processing. The fuzzy logic is used to generate the conceptual high level alarms for prognostic diagnosis and to evaluate the qualitative fuzzy criteria used in EOPs. Performance assessment of the OASYS demonstrates that it is capable of diagnosing plant abnormal conditions and providing operators with appropriate guidelines with fast response time and consistency. The developed technology for OASYS will be used to design the Integrated Advanced Control Room in which a plant can be operated by one operator during normal operation. The advanced EOP for emergency operation has been developed by focusing attention on the importance of the operators' role in emergency conditions. To overcome the complexity of current EOPs and maintain the consistency of operators' actions according to plant emergency conditions, operators' tasks were allocated according to their duties in the advanced EOP and the computerized operator aid system (COAS) has been developed as an alternative to reduce operator

  19. Rule-based Information Integration

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    In this report, we show the process of information integration. We specifically discuss the language used for integration. We show that integration consists of two phases, the schema mapping phase and the data integration phase. We formally define transformation rules, conversion, evolution and

  20. Transforming Adverse Cognition on the Path of Bhakti: Rule-Based Devotion, “My-Ness,” and the Existential Condition of Bondage

    Directory of Open Access Journals (Sweden)

    Travis Chilcott

    2016-05-01

    Full Text Available Early Gauḍīya Vaiṣṇava theologians developed a unique path of Hindu devotion during the 16th century through which an aspirant cultivates a rapturous form of selfless love (premā) for Kṛṣṇa, who is recognized as the supreme and personal deity. In the course and consequence of cultivating this selfless love, the recommended practices of devotion are claimed to free one from the basic existential condition of bondage that is of concern for a wide range of South Asian religious and philosophical traditions. One of the principal cognitive tendencies characterizing this condition is to have thoughts and feelings of possessiveness over objects of the world, or what is referred to as the state of “my-ness” (mamatā), e.g., my home, my children, or my wealth. Using the therapeutic model of schema therapy as a heuristic analogue, this article explores the relationship between recommended practices of rule-based devotion (vaidhi-bhakti) and the modulation of thoughts and feelings of possessiveness towards mundane objects. I argue that such practices function as learning strategies that can systematically rework and modulate how one relates to and responds to these objects in theologically desirable ways. I conclude by suggesting that connectionist theories of cognition and learning may offer a promising explanatory framework for understanding the dynamics of this kind of relationship.

  1. Identifying influenza-like illness presentation from unstructured general practice clinical narrative using a text classifier rule-based expert system versus a clinical expert.

    Science.gov (United States)

    MacRae, Jayden; Love, Tom; Baker, Michael G; Dowell, Anthony; Carnachan, Matthew; Stubbe, Maria; McBain, Lynn

    2015-10-06

    We designed and validated a rule-based expert system to identify influenza-like illness (ILI) from routinely recorded general practice clinical narrative to aid a larger retrospective research study into the impact of the 2009 influenza pandemic in New Zealand. Rules were assessed using pattern matching heuristics on routine clinical narrative. The system was trained using data from 623 clinical encounters and validated using a clinical expert as a gold standard against a mutually exclusive set of 901 records. We calculated a 98.2 % specificity and 90.2 % sensitivity across an ILI incidence of 12.4 % measured against clinical expert classification. Peak problem list identification of ILI by clinical coding in any month was 9.2 % of all detected ILI presentations. Our system addressed an unusual problem domain for clinical narrative classification: notational, unstructured, clinician-entered information in a community care setting. It performed well compared with other approaches and domains. It has potential applications in real-time surveillance of disease, and in assisted problem list coding for clinicians. Our system identified ILI presentation with sufficient accuracy for use at a population level in the wider research study. The peak coding of 9.2 % illustrated the need for automated coding of unstructured narrative in our study.
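
    A cut-down illustration of pattern-matching heuristics over clinical narrative is sketched below; the keyword lists and the fever-plus-respiratory-symptom rule are invented and far simpler than the validated rule set:

        import re

        # Invented heuristic: ILI ~ fever plus a respiratory symptom, minus negations.
        FEVER = re.compile(r"\b(fever|febrile|temp(?:erature)?\s*3[89])\b", re.I)
        RESP = re.compile(r"\b(cough|sore throat|coryza|myalgia)\b", re.I)
        NEG = re.compile(r"\b(no|denies|afebrile)\b", re.I)

        def is_ili(note: str) -> bool:
            sentences = re.split(r"(?<=[.;])\s+", note)
            fever = any(FEVER.search(s) and not NEG.search(s) for s in sentences)
            resp = any(RESP.search(s) and not NEG.search(s) for s in sentences)
            return fever and resp

        print(is_ili("Febrile overnight; dry cough and myalgia."))  # True
        print(is_ili("Afebrile. Denies cough."))                    # False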

  2. Development of Near Optimal Rule-Based Control for Plug-In Hybrid Electric Vehicles Taking into Account Drivetrain Component Losses

    Directory of Open Access Journals (Sweden)

    Hanho Son

    2016-05-01

    Full Text Available A near-optimal rule-based mode control (RBC) strategy was proposed for a target plug-in hybrid electric vehicle (PHEV), taking into account the drivetrain losses. Individual loss models were developed for drivetrain components including the gears, planetary gear (PG), bearings, and oil pump, based on experimental data and mathematical governing equations. Also, a loss model for the power electronic system was constructed, including loss from the motor-generator while rotating in the unloaded state. To evaluate the effect of the drivetrain losses on the operating mode control strategy, backward simulations were performed using dynamic programming (DP). DP selects the operating mode that provides the highest efficiency for given driving conditions. It was found that the operating mode selection changes when drivetrain losses are included, depending on driving conditions. An operating mode schedule was developed with respect to the wheel power and vehicle speed, and based on the operating mode schedule, an RBC was obtained, which can be implemented in an on-line application. To evaluate the performance of the RBC, a forward simulator was constructed for the target PHEV. The simulation results show near-optimal performance of the RBC compared with dynamic-programming-based mode control in terms of the mode operation time and fuel economy. The RBC developed with drivetrain losses taken into account showed a 4%–5% improvement of the fuel economy over a similar RBC that neglected the drivetrain losses.
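
    An operating mode schedule of this kind can be implemented on-line as a simple lookup over wheel power and vehicle speed, as sketched below; the band edges and mode labels are placeholders, not the paper's DP-derived schedule:

        import bisect

        # Placeholder grid: rows = speed bands (km/h), cols = wheel power bands (kW).
        SPEED_EDGES = [30.0, 70.0]          # -> 3 speed bands (assumed)
        POWER_EDGES = [10.0, 30.0]          # -> 3 power bands (assumed)
        MODE_TABLE = [                      # in the paper's approach, derived offline from DP
            ["EV",  "EV",  "HEV"],          # low speed
            ["EV",  "HEV", "HEV"],          # mid speed
            ["HEV", "HEV", "HEV"],          # high speed
        ]

        def select_mode(speed_kmh: float, wheel_power_kw: float) -> str:
            row = bisect.bisect_right(SPEED_EDGES, speed_kmh)
            col = bisect.bisect_right(POWER_EDGES, wheel_power_kw)
            return MODE_TABLE[row][col]

        print(select_mode(50.0, 25.0))  # 'HEV' under these assumed breakpoints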

  3. A Rule-Based Energy Management Strategy for a Plug-in Hybrid School Bus Based on a Controller Area Network Bus

    Directory of Open Access Journals (Sweden)

    Jiankun Peng

    2015-06-01

    Full Text Available This paper presents a rule-based energy management strategy for a plug-in hybrid school bus (PHSB). In order to verify the effectiveness and rationality of the proposed energy management strategy, the powertrain and control models were built with MATLAB/Simulink. The PHSB powertrain model includes an engine model, an ISG (integrated starter and generator) model, a drive motor model, a power battery pack model, a driver model, and a vehicle longitudinal dynamics model. To evaluate controller area network (CAN) bus performance features such as the bus load and signal hysteresis, and to verify the reliability and real-time performance of the CAN bus multi-node control method, a co-simulation platform was built with CANoe and MATLAB/Simulink. The co-simulation results show that the control strategy can meet the requirements of the PHSB's dynamic performance. Meanwhile, the charge-depleting mode (CD) and charge-sustaining mode (CS) can switch between each other and maintain a state-of-charge (SoC) of around 30%, indicating that the energy management strategy effectively extends the working period of the CD mode and further improves the fuel economy. The energy consumption per 100 km includes 13.7 L diesel and 10.5 kW·h electricity with an initial SoC of 75%. The CANoe simulation results show that the bus communication performs well without error frames.
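
    The CD/CS switching described can be sketched as a simple SoC hysteresis rule; the 30% target comes from the abstract, while the hysteresis band width is an assumption:

        SOC_TARGET = 0.30   # CS target from the abstract
        BAND = 0.02         # hysteresis half-width (assumed)

        def update_mode(mode: str, soc: float) -> str:
            """Charge-depleting (CD) until SoC falls to the target, then
            charge-sustaining (CS); return to CD only when SoC rises well
            above the target (e.g., after a plug-in recharge)."""
            if mode == "CD" and soc <= SOC_TARGET:
                return "CS"
            if mode == "CS" and soc >= SOC_TARGET + 2 * BAND:
                return "CD"
            return mode

        print(update_mode("CD", 0.29), update_mode("CS", 0.35))  # -> CS CD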

  4. A Rule-Based Spatial Reasoning Approach for OpenStreetMap Data Quality Enrichment; Case Study of Routing and Navigation

    Science.gov (United States)

    2017-01-01

    Finding relevant geospatial information is increasingly critical because of the growing volume of geospatial data available within the emerging “Big Data” era. Users are expecting that the availability of massive datasets will create more opportunities to uncover hidden information and answer more complex queries. This is especially the case with routing and navigation services, where the ability to retrieve points of interest and landmarks makes the routing service personalized, precise, and relevant. In this paper, we propose a new geospatial information approach that enables the retrieval of implicit information, i.e., geospatial entities that do not exist explicitly in the available source. We present an information broker that uses a rule-based spatial reasoning algorithm to detect topological relations. The information broker is embedded into a framework where annotations and mappings between OpenStreetMap data attributes and external resources, such as taxonomies, support the enrichment of queries to improve the ability of the system to retrieve information. Our method is tested with two case studies that lead to enriching the completeness of OpenStreetMap data with footway crossing points-of-interest as well as building entrances for routing and navigation purposes. It is concluded that the proposed approach can uncover implicit entities and contribute to extracting the required information from the existing datasets. PMID:29088125

  5. Usage-based vs. rule-based learning: the acquisition of word order in wh-questions in English and Norwegian.

    Science.gov (United States)

    Westergaard, Marit

    2009-11-01

    This paper discusses different approaches to language acquisition in relation to children's acquisition of word order in wh-questions in English and Norwegian. While generative models assert that children set major word order parameters and thus acquire a rule of subject-auxiliary inversion or generalized verb second (V2) at an early stage, some constructivist work argues that English-speaking children are simply reproducing frequent wh-word+auxiliary combinations in the input. The paper questions both approaches, re-evaluates some previous work, and provides some further data, concluding that the acquisition of wh-questions must be the result of a rule-based process. Based on variation in adult grammars, a cue-based model to language acquisition is presented, according to which children are sensitive to minor cues in the input, called micro-cues. V2 is not considered to be one major parameter, but several smaller-scale cues, which are responsible for children's lack of syntactic (over-)generalization in the acquisition process.

  6. On-line detection of apnea/hypopnea events using SpO2 signal: a rule-based approach employing binary classifier models.

    Science.gov (United States)

    Koley, Bijoy Laxmi; Dey, Debangshu

    2014-01-01

    This paper presents an online method for automatic detection of apnea/hypopnea events, with the help of the oxygen saturation (SpO2) signal, measured at the fingertip by a Bluetooth nocturnal pulse oximeter. Event detection is performed by identifying abnormal data segments from the recorded SpO2 signal, employing a binary classifier model based on a support vector machine (SVM). Thereafter the abnormal segment is further analyzed to detect different states within the segment, i.e., steady, desaturation, and resaturation, with the help of another SVM-based binary ensemble classifier model. Finally, a heuristically obtained rule-based system is used to identify the apnea/hypopnea events from the time-sequenced decisions of these classifier models. In the developmental phase, a set of 34 time domain-based features was extracted from the segmented SpO2 signal using an overlapped windowing technique. Later, an optimal set of features was selected on the basis of a recursive feature elimination technique. A total of 34 subjects were included in the study. The results show average event detection accuracies of 96.7% and 93.8% for the offline and the online tests, respectively. The proposed system provides direct estimation of the apnea/hypopnea index with the help of a relatively inexpensive and widely available pulse oximeter. Moreover, the system can be monitored and accessed by physicians through LAN/WAN/Internet and can be extended to deploy in Bluetooth-enabled mobile phones.
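
    The final rule-based stage can be illustrated as a check over the time-sequenced segment labels produced by the classifiers; the state names follow the abstract, while the minimum run lengths are assumptions:

        # Segment labels over time, e.g. one per window: 'steady', 'desat', 'resat'.
        MIN_DESAT, MIN_RESAT = 2, 1   # minimum run lengths in windows (assumed)

        def count_events(states):
            """Heuristic rule: an apnea/hypopnea event is a run of desaturation
            followed immediately by a run of resaturation."""
            events, i = 0, 0
            while i < len(states):
                if states[i] == "desat":
                    j = i
                    while j < len(states) and states[j] == "desat":
                        j += 1
                    k = j
                    while k < len(states) and states[k] == "resat":
                        k += 1
                    if j - i >= MIN_DESAT and k - j >= MIN_RESAT:
                        events += 1
                    i = k
                else:
                    i += 1
            return events

        print(count_events(["steady", "desat", "desat", "resat", "steady"]))  # 1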

  7. A Rule-Based Spatial Reasoning Approach for OpenStreetMap Data Quality Enrichment; Case Study of Routing and Navigation

    Directory of Open Access Journals (Sweden)

    Amin Mobasheri

    2017-10-01

    Full Text Available Finding relevant geospatial information is increasingly critical because of the growing volume of geospatial data available within the emerging “Big Data” era. Users are expecting that the availability of massive datasets will create more opportunities to uncover hidden information and answer more complex queries. This is especially the case with routing and navigation services, where the ability to retrieve points of interest and landmarks makes the routing service personalized, precise, and relevant. In this paper, we propose a new geospatial information approach that enables the retrieval of implicit information, i.e., geospatial entities that do not exist explicitly in the available source. We present an information broker that uses a rule-based spatial reasoning algorithm to detect topological relations. The information broker is embedded into a framework where annotations and mappings between OpenStreetMap data attributes and external resources, such as taxonomies, support the enrichment of queries to improve the ability of the system to retrieve information. Our method is tested with two case studies that lead to enriching the completeness of OpenStreetMap data with footway crossing points-of-interest as well as building entrances for routing and navigation purposes. It is concluded that the proposed approach can uncover implicit entities and contribute to extracting the required information from the existing datasets.
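
    A minimal sketch of detecting a topological relation of the kind such an information broker infers - here, "a footway crosses a road, so infer an implicit crossing point" - using the Shapely library; the coordinates are invented toy geometries:

        from shapely.geometry import LineString

        # Toy OSM-like geometries (lon, lat pairs are invented).
        road = LineString([(13.400, 52.520), (13.410, 52.520)])
        footway = LineString([(13.405, 52.519), (13.405, 52.521)])

        # Rule: IF a footway crosses a road THEN infer an implicit crossing POI.
        if footway.crosses(road):
            crossing = footway.intersection(road)
            print("implicit crossing POI at", (crossing.x, crossing.y))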

  8. Methods for testing of geometrical down-scaled rotor blades

    DEFF Research Database (Denmark)

    Branner, Kim; Berring, Peter

    further developed since then. Structures in composite materials are generally difficult and time consuming to test for fatigue resistance. Therefore, several methods for testing of blades have been developed and exist today. Those methods are presented in [1]. Current experimental test performed on full...

  9. RED: A Java-MySQL Software for Identifying and Visualizing RNA Editing Sites Using Rule-Based and Statistical Filters.

    Directory of Open Access Journals (Sweden)

    Yongmei Sun

    Full Text Available RNA editing is one of the post- or co-transcriptional processes that can lead to amino acid substitutions in protein sequences, alternative pre-mRNA splicing, and changes in gene expression levels. Although several methods have been suggested to identify RNA editing sites, there remain challenges to be addressed in distinguishing true RNA editing sites from their counterparts on the genome and from technical artifacts. In addition, a software framework to identify and visualize potential RNA editing sites has been lacking. Here, we present a software package - 'RED' (RNA Editing sites Detector) - for the identification of RNA editing sites by integrating multiple rule-based and statistical filters. The potential RNA editing sites can be visualized at the genome and the site levels through a graphical user interface (GUI). To improve performance, we used the MySQL database management system (DBMS) for high-throughput data storage and query. We demonstrated the validity and utility of RED by identifying the presence and absence of experimentally validated C→U RNA-editing sites, in comparison with REDItools, a command line tool to perform high-throughput investigation of RNA editing. In an analysis of a sample data-set with 28 experimentally validated C→U RNA editing sites, RED had sensitivity and specificity of 0.64 and 0.5. In comparison, REDItools had a better sensitivity (0.75) but similar specificity (0.5). RED is an easy-to-use, platform-independent Java-based software package, and can be applied to RNA-seq data with or without DNA sequencing data. The package is freely available under the GPLv3 license at http://github.com/REDetector/RED or https://sourceforge.net/projects/redetector.

  10. RED: A Java-MySQL Software for Identifying and Visualizing RNA Editing Sites Using Rule-Based and Statistical Filters.

    Science.gov (United States)

    Sun, Yongmei; Li, Xing; Wu, Di; Pan, Qi; Ji, Yuefeng; Ren, Hong; Ding, Keyue

    2016-01-01

    RNA editing is one of the post- or co-transcriptional processes that can lead to amino acid substitutions in protein sequences, alternative pre-mRNA splicing, and changes in gene expression levels. Although several methods have been suggested to identify RNA editing sites, there remain challenges to be addressed in distinguishing true RNA editing sites from their counterparts on the genome and from technical artifacts. In addition, a software framework to identify and visualize potential RNA editing sites has been lacking. Here, we present a software package - 'RED' (RNA Editing sites Detector) - for the identification of RNA editing sites by integrating multiple rule-based and statistical filters. The potential RNA editing sites can be visualized at the genome and the site levels through a graphical user interface (GUI). To improve performance, we used the MySQL database management system (DBMS) for high-throughput data storage and query. We demonstrated the validity and utility of RED by identifying the presence and absence of experimentally validated C→U RNA-editing sites, in comparison with REDItools, a command line tool to perform high-throughput investigation of RNA editing. In an analysis of a sample data-set with 28 experimentally validated C→U RNA editing sites, RED had sensitivity and specificity of 0.64 and 0.5. In comparison, REDItools had a better sensitivity (0.75) but similar specificity (0.5). RED is an easy-to-use, platform-independent Java-based software package, and can be applied to RNA-seq data with or without DNA sequencing data. The package is freely available under the GPLv3 license at http://github.com/REDetector/RED or https://sourceforge.net/projects/redetector.
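
    The combination of rule-based and statistical filters can be sketched as a small pipeline; the specific filters shown (an editing-type rule, a quality/coverage rule, and a binomial test against an assumed sequencing-error rate) are generic illustrations of the idea, not RED's actual filter set:

        from math import comb

        P_ERR = 0.01  # assumed per-base sequencing error rate

        def binom_sf(k, n, p):
            """P(X >= k) for X ~ Binomial(n, p)."""
            return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

        def is_candidate_site(ref, alt, depth, alt_count, base_qual):
            # Rule 1: keep only the editing type of interest, e.g. C->U editing
            # observed as C->T (or G->A on the reverse strand).
            if (ref, alt) not in {("C", "T"), ("G", "A")}:
                return False
            # Rule 2: basic coverage and base-quality thresholds (values assumed).
            if depth < 10 or base_qual < 20:
                return False
            # Statistical filter: alt reads unlikely to be sequencing error alone.
            return binom_sf(alt_count, depth, P_ERR) < 0.05

        print(is_candidate_site("C", "T", depth=50, alt_count=6, base_qual=30))  # True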

  11. Nodule Detection in a Lung Region that's Segmented with Using Genetic Cellular Neural Networks and 3D Template Matching with Fuzzy Rule Based Thresholding

    International Nuclear Information System (INIS)

    Ozekes, Serhat; Osman, Onur; Ucan, N.

    2008-01-01

    The purpose of this study was to develop a new method for automated lung nodule detection in serial section CT images, using the characteristics of the 3D appearance of the nodules that distinguish them from the vessels. Lung nodules were detected in four steps. First, to reduce the number of regions of interest (ROIs) and the computation time, the lung regions of the CTs were segmented using Genetic Cellular Neural Networks (G-CNN). Then, for each lung region, ROIs were specified using an 8-directional search; +1 or -1 values were assigned to each voxel. The 3D ROI image was obtained by combining all the 2-dimensional (2D) ROI images. A 3D template was created to find the nodule-like structures in the 3D ROI image. Convolution of the 3D ROI image with the proposed template strengthens the shapes that are similar to those of the template and weakens the others. Finally, fuzzy rule based thresholding was applied and the ROIs were found. To test the system's efficiency, we used 16 cases with a total of 425 slices, which were taken from the Lung Image Database Consortium (LIDC) dataset. The computer aided diagnosis (CAD) system achieved 100% sensitivity with 13.375 FPs per case when the nodule thickness was greater than or equal to 5.625 mm. Our results indicate that the detection performance of our algorithm is satisfactory, and this may well improve the performance of computer aided detection of lung nodules

  12. Validity of the accuracy of right ventricular volume measurement with single plane Simpson's rule based on a new half-circle model

    International Nuclear Information System (INIS)

    Cui Wei

    2003-01-01

    Objective: Right ventricular volume calculation is the basis of right ventricular function evaluation. The complexity of the right ventricular shape makes it more difficult to measure right ventricular volume, and therefore limits the widespread use of right ventricular function assessment. However, the short-axis view of the right ventricle is usually crescent-shaped, and might be assumed to be half of a circle. This hypothesis can be applied to calculate right ventricular volume using the single-plane Simpson's rule, but the final right ventricular volume should be about half of the original calculated value. The aim of this study was to test the accuracy of right ventricular volume measurement based on this new assumption in human right ventricular casts. Methods: Fifteen human right ventricular casts were obtained within 24 hours after death from 15 subjects who did not have overt cardiac disease. These casts were scanned with a Toshiba multislice helical CT scanner (Aquilion) using a scanning protocol similar to clinical practice. Right ventricular sagittal MPR images corresponding to the right anterior oblique view were reconstructed later and the single-plane Simpson's method was used to calculate right ventricular volumes. The calculated right ventricular volume was defined as the original calculated value divided by 2. The actual right ventricular cast volumes were determined by the water displacement method. Results: The actual right ventricular volume was (64.23±24.51) ml, while the calculated volume was (53.18±26.22) ml. The calculated right ventricular volume significantly underestimated the actual volume by (11.05±13.09) ml (t=3.27, P=0.006). However, there was a close correlation between the actual cast volume and the calculated volume (r=0.869, P<0.001), with a regression equation of RV actual volume=21.04+0.812 x RV calculated volume. Conclusion: Right ventricular volume could be calculated accurately with the single-plane Simpson's rule based on our new proposed half-circle model.
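
    A worked sketch of the proposed computation: apply the standard single-plane Simpson's summation assuming circular cross-sections, halve the result per the half-circle model, and optionally apply the reported regression correction. The slice diameters and thickness below are invented for illustration:

        from math import pi

        def rv_volume_half_circle(diameters_cm, slice_thickness_cm):
            """Single-plane Simpson's rule under the half-circle model:
            sum circular slice volumes, then take half."""
            full = sum(pi * (d / 2) ** 2 * slice_thickness_cm for d in diameters_cm)
            return full / 2.0

        def regression_corrected(v_calc_ml):
            # Reported fit: RV actual volume = 21.04 + 0.812 * RV calculated volume.
            return 21.04 + 0.812 * v_calc_ml

        d = [2.0, 3.0, 3.5, 3.2, 2.1]        # invented short-axis diameters (cm)
        v = rv_volume_half_circle(d, 1.0)    # 1 cm slices (assumed)
        print(round(v, 1), round(regression_corrected(v), 1))  # -> 15.7 33.8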

  13. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each of the methodologies is explained in this report, along with examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely Influence diagrams and the RES methodology. In conclusion, a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  14. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  15. Introduction to LCA Methodology

    DEFF Research Database (Denmark)

    Hauschild, Michael Z.

    2018-01-01

    In order to offer the reader an overview of the LCA methodology in preparation for the more detailed description of its different phases, a brief introduction is given to the methodological framework according to the ISO 14040 standard and the main elements of each of its phases. Emphasis

  16. Methodologies, languages and tools

    International Nuclear Information System (INIS)

    Amako, Katsuya

    1994-01-01

    This is a summary of the “Methodologies, Languages and Tools” session in the CHEP'94 conference. All the contributions to methodologies and languages are relevant to the object-oriented approach. Other topics presented are related to various software tools in the down-sized computing environment

  17. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although many experiences of using archetypes have been reported in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIM, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can also be a reference for CIM development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Menopause and Methodological Doubt

    Science.gov (United States)

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  19. VEM: Virtual Enterprise Methodology

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a virtual enterprise methodology (VEM) that outlines activities to consider when setting up and managing virtual enterprises (VEs). As a methodology the VEM helps companies to ask the right questions when preparing for and setting up an enterprise network, which works...

  20. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  1. The Methodology of Magpies

    Science.gov (United States)

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  2. Westinghouse loading pattern search methodology for complex core designs

    International Nuclear Information System (INIS)

    Chao, Y.A.; Alsop, B.H.; Johansen, B.J.; Morita, T.

    1991-01-01

    Pressurized water reactor core designs have become more complex and must meet a plethora of design constraints. Trends have been toward longer cycles with increased discharge burnup, increased burnable absorber (BA) number, mixed BA types, reduced radial leakage, axially blanketed fuel, and multiple-batch feed fuel regions. Obtaining economical reload core loading patterns (LPs) that meet design criteria is a difficult task to do manually. Automated LP search tools are needed. An LP search tool cannot possibly perform an exhaustive search because of the sheer size of the combinatorial problem. On the other hand, evolving complexity of the design features and constraints often invalidates expert rules based on past design experiences. Westinghouse has developed a sophisticated loading pattern search methodology. This methodology is embodied in the LPOP code, which Westinghouse nuclear designers use extensively. The LPOP code generates a variety of LPs meeting design constraints and performs a two-cycle economic evaluation of the generated LPs. The designer selects the most appropriate patterns for fine tuning and evaluation by the design codes. This paper describes the major features of the LPOP methodology that are relevant to fulfilling the aforementioned requirements. Data and examples are also provided to demonstrate the performance of LPOP in meeting the complex design needs

  3. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly forced by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers. Design Methodology shall be seen as our understanding of how to design; it is an early (emerging late 60ies) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...

  4. GPS system simulation methodology

    Science.gov (United States)

    Ewing, Thomas F.

    1993-01-01

    The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.

  5. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  6. Nonlinear Image Denoising Methodologies

    National Research Council Canada - National Science Library

    Yufang, Bao

    2002-01-01

    In this thesis, we propose a theoretical as well as practical framework to combine geometric prior information with a statistical/probabilistic methodology in the investigation of a denoising problem...

  7. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    "Now viewed as its own scientific discipline, clinical trial methodology encompasses the methods required for the protection of participants in a clinical trial and the methods necessary to provide...

  8. Methodology of sustainability accounting

    Directory of Open Access Journals (Sweden)

    O.H. Sokil

    2017-03-01

    Full Text Available Modern challenges of the theory and methodology of accounting are realized through the formation and implementation of new concepts, the purpose of which is to meet users' needs for standard and unique information. The development of a methodology for sustainability accounting is a key aspect of the management of an economic entity. The purpose of the article is to form the methodological bases of accounting for sustainable development and to determine its goals, objectives, object, subject, methods, functions and key aspects. The author analyzes the theoretical bases of the definition and considers the components of the traditional accounting methodology. A generalized structural diagram of the methodology for accounting for sustainable development is offered in the article. The complex of methods and principles of accounting for sustainable development, covering both standard and non-standard provisions, has been systematized. The new system of theoretical and methodological provisions of accounting for sustainable development is justified in the context of determining its purpose, objective, subject, object, methods, functions and key aspects.

  9. The impact of category structure and training methodology on learning and generalizing within-category representations.

    Science.gov (United States)

    Ell, Shawn W; Smith, David B; Peralta, Gabriela; Hélie, Sébastien

    2017-08-01

    When interacting with categories, representations focused on within-category relationships are often learned, but the conditions promoting within-category representations and their generalizability are unclear. We report the results of three experiments investigating the impact of category structure and training methodology on the learning and generalization of within-category representations (i.e., correlational structure). Participants were trained on either rule-based or information-integration structures using classification (Is the stimulus a member of Category A or Category B?), concept (e.g., Is the stimulus a member of Category A, Yes or No?), or inference (infer the missing component of the stimulus from a given category) and then tested on either an inference task (Experiments 1 and 2) or a classification task (Experiment 3). For the information-integration structure, within-category representations were consistently learned, could be generalized to novel stimuli, and could be generalized to support inference at test. For the rule-based structure, extended inference training resulted in generalization to novel stimuli (Experiment 2) and inference training resulted in generalization to classification (Experiment 3). These data help to clarify the conditions under which within-category representations can be learned. Moreover, these results make an important contribution in highlighting the impact of category structure and training methodology on the generalization of categorical knowledge.

  10. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
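
    The core combination step - weighting the protection that buildings provide by where people actually are - reduces to a population-weighted expected dose. The sketch below illustrates this with invented shelter categories, protection factors, and population fractions:

        # Invented example: fraction of population in each shelter category and the
        # dose reduction factor (fraction of outdoor dose received) per category.
        POPULATION = {"outdoors": 0.05, "wood_frame_house": 0.60,
                      "basement": 0.25, "large_concrete_building": 0.10}
        DOSE_FACTOR = {"outdoors": 1.00, "wood_frame_house": 0.40,
                       "basement": 0.10, "large_concrete_building": 0.05}

        def expected_dose(outdoor_dose_gy: float) -> float:
            """Population-weighted mean dose given an outdoor (unsheltered) dose."""
            return outdoor_dose_gy * sum(
                POPULATION[k] * DOSE_FACTOR[k] for k in POPULATION)

        print(expected_dose(2.0))  # -> 0.64 Gy versus 2.0 Gy unsheltered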

  11. The policy trail methodology

    DEFF Research Database (Denmark)

    Holford, John; Larson, Anne; Melo, Susana

    In recent years, the “policy trail” has been proposed as a methodology appropriate to the shifting and fluid governance of lifelong learning in the late modern world (Holford et al. 2013, Holford et al. 2013, Cort 2014). The contemporary environment is marked by multi-level governance (global … of ‘policy trail’, arguing that it can overcome ‘methodological nationalism’ and link structure and agency in research on the ‘European educational space’. The ‘trail’ metaphor, she suggests, captures the intentionality and the erratic character of policy. The trail connects sites and brings about change, but – although policy may be intended to be linear, with specific outcomes – policy often has to bend, and sometimes meets insurmountable obstacles. This symposium outlines and develops the methodology, but also reports on research undertaken within a major FP7 project (LLLIght’in’Europe, 2012-15) which made use …

  12. Changing methodologies in TESOL

    CERN Document Server

    Spiro, Jane

    2013-01-01

    Covering core topics from vocabulary and grammar to teaching, writing speaking and listening, this textbook shows you how to link research to practice in TESOL methodology. It emphasises how current understandings have impacted on the language classroom worldwide and investigates the meaning of 'methods' and 'methodology' and the importance of these for the teacher: as well as the underlying assumptions and beliefs teachers bring to bear in their practice. By introducing you to language teaching approaches, you will explore the way these are influenced by developments in our understanding of l

  13. Creativity in phenomenological methodology

    DEFF Research Database (Denmark)

    Dreyer, Pia; Martinsen, Bente; Norlyk, Annelise

    2014-01-01

    Nursing research is often concerned with lived experiences in human life using phenomenological and hermeneutic approaches. These empirical studies may use different creative expressions and art-forms to describe and enhance an embodied and personalised understanding of lived experiences. Drawing on the methodologies of van Manen, Dahlberg, Lindseth & Norberg, the aim of this paper is to argue that the increased focus on creativity and arts in research methodology is valuable to gain a deeper insight into lived experiences. We illustrate this point through examples from empirical nursing studies, and discuss how this may support a respectful renewal of phenomenological research traditions in nursing research.

  14. Utilización de Sistemas Basados en Reglas y en Casos para diseñar transmisiones por tornillo sinfín // Use of rules based systems and cases based systems for worm gear design

    Directory of Open Access Journals (Sweden)

    Jorge Laureano Moya‐Rodríguez

    2012-01-01

    Full Text Available Artificial Intelligence techniques, especially Knowledge Based Systems, are applied nowadays to different engineering problems. Among the latter, the most common are Frame Based Systems, Rule Based Systems, Case Based Systems and Hybrid Systems. Case Based Systems start from problems already solved in an application domain and, by means of an adaptation process, find the solution to a new problem. These systems can be used successfully for the design of gears, particularly for the design of worm gear transmissions; nevertheless, this constitutes a still unexplored field of Artificial Intelligence applications. In the present work, a comparison is made between the use of Rule Based Systems and Case Based Systems for the design of worm gear transmissions, and the results of applying rule based systems to the design of a particular worm gear transmission are shown. Keywords: worm gear, gears, case based systems, rule based systems, artificial intelligence.

  15. Computer Network Operations Methodology

    Science.gov (United States)

    2004-03-01

    means of their computer information systems. Disrupt - This type of attack focuses on disrupting as “attackers might surreptitiously reprogram enemy...by reprogramming the computers that control distribution within the power grid. A disruption attack introduces disorder and inhibits the effective...between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that

  16. SCI Hazard Report Methodology

    Science.gov (United States)

    Mitchell, Michael S.

    2010-01-01

    This slide presentation reviews the methodology in creating a Source Control Item (SCI) Hazard Report (HR). The SCI HR provides a system safety risk assessment for the following Ares I Upper Stage Production Contract (USPC) components (1) Pyro Separation Systems (2) Main Propulsion System (3) Reaction and Roll Control Systems (4) Thrust Vector Control System and (5) Ullage Settling Motor System components.

  17. A Functional HAZOP Methodology

    DEFF Research Database (Denmark)

    Liin, Netta; Lind, Morten; Jensen, Niels

    2010-01-01

    A HAZOP methodology is presented where a functional plant model assists in a goal oriented decomposition of the plant purpose into the means of achieving the purpose. This approach leads to nodes with simple functions from which the selection of process and deviation variables follow directly...

  18. Complicating Methodological Transparency

    Science.gov (United States)

    Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.

    2016-01-01

    A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…

  19. Methodological Advances in Dea

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    textabstractWe survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools

  20. NUSAM Methodology for Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Snell, Mark K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-07-01

    This document provides a methodology for the performance-based assessment of security systems designed for the protection of nuclear and radiological materials and the processes that produce and/or involve them. It is intended for use with both relatively simple installations and with highly regulated complex sites with demanding security requirements.

  1. MIRD methodology. Part 1

    International Nuclear Information System (INIS)

    Rojo, Ana M.

    2004-01-01

    This lecture develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this first part, the basic concepts and the main equations are presented. The ICRP Dosimetric System is also explained. (author)

  2. Response Surface Methodology

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.

    2014-01-01

    Abstract: This chapter first summarizes Response Surface Methodology (RSM), which started with Box and Wilson’s article in 1951 on RSM for real, non-simulated systems. RSM is a stepwise heuristic that uses first-order polynomials to approximate the response surface locally. An estimated polynomial
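
    For reference, the locally fitted first-order model at the heart of RSM can be written in standard textbook notation (not a quotation from this chapter) as

        \hat{y} = \hat{\beta}_0 + \sum_{i=1}^{k} \hat{\beta}_i x_i ,

    after which experimentation proceeds along the estimated steepest-ascent direction, which is proportional to the coefficient vector (\hat{\beta}_1, ..., \hat{\beta}_k).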

  3. MIRD methodology. Part 2

    International Nuclear Information System (INIS)

    Gomez Parada, Ines

    2004-01-01

    This paper develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this second part, different methods for the calculation of the accumulated activity are presented, together with the effective half life definition. Different forms of Retention Activity curves are also shown. (author)
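
    Two standard MIRD relations underpin this part (textbook formulas, stated here for reference): the effective half-life combines physical decay and biological elimination, and, for mono-exponential retention with initial activity A_0, the accumulated activity follows directly:

        \frac{1}{T_{\mathrm{eff}}} = \frac{1}{T_{\mathrm{phys}}} + \frac{1}{T_{\mathrm{biol}}}, \qquad \tilde{A} = \int_0^{\infty} A(t)\,dt = \frac{A_0 \, T_{\mathrm{eff}}}{\ln 2} \approx 1.443 \, A_0 \, T_{\mathrm{eff}}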

  4. Soft Systems Methodology

    Science.gov (United States)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation, to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques, core tenets described through a wide range of settings.

  5. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    As part of learning at the Nordic Workshop of Evidence-based Medicine, we have read with interest the practice guidelines for central venous access, published in your Journal in 2012.1 We appraised the quality of this guideline using the checklist developed by The Evidence-Based Medicine Working Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest.

  6. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Vasja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  7. Steganography: LSB Methodology

    Science.gov (United States)

    2012-08-02

    In computer science, steganography is the science... This progress report on LSB (least significant bit) methodology draws on the J. Fridrich, M. Goljan and R. Du paper titled “Reliable detection of LSB steganography in grayscale and color images” (in J. Dittmann, K. Nahrstedt, and P. Wohlmacher, editors, Proceedings of the ACM, Special...).

  8. Soil Radiological Characterisation Methodology

    International Nuclear Information System (INIS)

    Attiogbe, Julien; Aubonnet, Emilie; De Maquille, Laurence; De Moura, Patrick; Desnoyers, Yvon; Dubot, Didier; Feret, Bruno; Fichet, Pascal; Granier, Guy; Iooss, Bertrand; Nokhamzon, Jean-Guy; Ollivier Dehaye, Catherine; Pillette-Cousin, Lucien; Savary, Alain

    2014-12-01

    This report presents the general methodology and best-practice approaches which combine proven existing techniques for sampling and characterisation to assess the contamination of soils prior to remediation. It is based on feedback from projects conducted by the main French nuclear stakeholders involved in the field of remediation and dismantling (EDF, CEA, AREVA and IRSN). The application of this methodology will enable project managers to obtain the elements necessary for drawing up the files associated with remediation operations, as required by the regulatory authorities. It is applicable to each of the steps necessary for the piloting of remediation work-sites, depending on the objectives targeted (release into the public domain, re-use, etc.). The main part describes the applied statistical methodology, with exploratory analysis and variogram data and the identification of singular points and their location. The results obtained permit the construction of a map identifying the contaminated surface and subsurface areas. It sets out the path for radiological site characterisation, from the initial investigations based on historical and functional analysis through to checking that the remediation objectives have been met. An example application follows, drawn from the feedback of the remediation of a contaminated site at the Fontenay-aux-Roses facility. The report is supplemented by a glossary of the main terms used in the field, taken from different publications and international standards. This technical report supports the ISO standard ISO/TC 85/SC 5 N 18557, 'Sampling and characterisation principles for soils, buildings and infrastructures contaminated by radionuclides for remediation purposes'. (authors) [fr
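
    The variogram analysis mentioned above rests on the empirical semivariogram (standard geostatistical definition, not a formula quoted from the report): for a lag distance h, with N(h) sample pairs separated by h,

        \gamma(h) = \frac{1}{2\,N(h)} \sum_{i=1}^{N(h)} \left[ z(x_i) - z(x_i + h) \right]^2 .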

  9. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    Full Text Available This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  10. Case Study Research Methodology

    Directory of Open Access Journals (Sweden)

    Mark Widdowson

    2011-01-01

    Full Text Available Commenting on the lack of case studies published in modern psychotherapy publications, the author reviews the strengths of case study methodology and responds to common criticisms, before providing a summary of types of case studies including clinical, experimental and naturalistic. Suggestions are included for developing systematic case studies and brief descriptions are given of a range of research resources relating to outcome and process measures. Examples of a pragmatic case study design and a hermeneutic single-case efficacy design are given and the paper concludes with some ethical considerations and an exhortation to the TA community to engage more widely in case study research.

  11. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
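
    A representative example of the kind of result covered (a standard formula, not a quotation from the book): to estimate a population mean with margin of error E at confidence level 1 - \alpha when the standard deviation \sigma is known, the required sample size is

        n = \left( \frac{z_{1-\alpha/2}\,\sigma}{E} \right)^{2} ,

    rounded up to the next integer.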

  12. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    The reports comprising this volume concern the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in granites in Cornwall, with particular reference to the effect of structures imposed by discontinuities on the engineering behaviour of rock masses. The topics covered are: in-situ stress measurements using (a) the hydraulic fracturing method, or (b) the US Bureau of Mines deformation probe; scanline discontinuity survey - coding form and instructions, and data; applicability of geostatistical estimation methods to scalar rock properties; comments on in-situ stress at the Carwynnen test mine and the state of stress in the British Isles. (U.K.)

  13. Microphysics evolution and methodology

    International Nuclear Information System (INIS)

    Dionisio, J.S.

    1990-01-01

    A few general features of the evolution of microphysics and their relationship with microphysics methodology are briefly surveyed. Several pluri-disciplinary and interdisciplinary aspects of microphysics research are also discussed in the present scientific context. The need for an equilibrium between individual tendencies and the collective constraints required by team work, already formulated thirty years ago by Frederic Joliot, is particularly stressed in the present conjuncture of nuclear research, which favours very large team projects and discourages individual initiatives. The increasing importance of the science of science (due to its multiple social, economical and ecological aspects) and the stronger competition between national and international tendencies of scientific (and technical) cooperation are also discussed. (author)

  14. MIRD methodology; Metodologia MIRD

    Energy Technology Data Exchange (ETDEWEB)

    Rojo, Ana M [Autoridad Regulatoria Nuclear, Buenos Aires (Argentina); Gomez Parada, Ines [Sociedad Argentina de Radioproteccion, Buenos Aires (Argentina)

    2004-07-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in the estimation of the dose to organs and tissues due to the incorporation of radioactive materials. Since then, the 'MIRD Dose Estimate Reports' (Nos. 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for the calculation of absorbed doses in different tissues is explained.
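
    The central relation of the methodology is the MIRD absorbed dose equation in its standard form, in which the mean dose to a target region r_T sums, over all source regions r_S, the accumulated activity times the S value:

        \bar{D}(r_T) = \sum_{r_S} \tilde{A}(r_S) \, S(r_T \leftarrow r_S)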

  15. Beam optimization: improving methodology

    International Nuclear Information System (INIS)

    Quinteiro, Guillermo F.

    2004-01-01

    Different optimization techniques commonly used in biology and food technology allow a systematic and complete analysis of response functions. In spite of the great interest in medical and nuclear physics in the problem of optimizing mixed beams, little attention has been given to sophisticated mathematical tools. Indeed, many techniques are perfectly suited to the typical problem of beam optimization. This article is intended as a guide to the use of two methods, namely Response Surface Methodology and Simplex, that are expected to speed up the optimization process and, at the same time, give more insight into the relationships among the variables controlling the response
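
    As a hedged illustration of the second method named above, the Python sketch below minimizes a made-up placeholder response over two beam weights using the Nelder-Mead simplex method via SciPy; nothing here is specific to the article's beams.

        # Illustrative only: minimize a hypothetical "response" over two beam
        # weights using the Nelder-Mead simplex method.
        import numpy as np
        from scipy.optimize import minimize

        def response(w):
            # Placeholder smooth response surface with optimum at w = (0.3, 0.7)
            return (w[0] - 0.3) ** 2 + (w[1] - 0.7) ** 2

        result = minimize(response, x0=np.array([0.5, 0.5]), method="Nelder-Mead")
        print(result.x)  # approximately [0.3, 0.7]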

  16. Literacy research methodologies

    CERN Document Server

    Duke, Nell K

    2012-01-01

    The definitive reference on literacy research methods, this book serves as a key resource for researchers and as a text in graduate-level courses. Distinguished scholars clearly describe established and emerging methodologies, discuss the types of questions and claims for which each is best suited, identify standards of quality, and present exemplary studies that illustrate the approaches at their best. The book demonstrates how each mode of inquiry can yield unique insights into literacy learning and teaching and how the methods can work together to move the field forward.   New to This Editi

  17. Methodology of site studies

    International Nuclear Information System (INIS)

    Caries, J.C.; Hugon, J.; Grauby, A.

    1980-01-01

    This methodology consists of an essentially dynamic analysis, both predictive and follow-up, of the impact of discharges on all the environment compartments, whether natural or not, that play a part in the protection of man and his environment. It applies at two levels, to wit: the choice of site, or the detailed study of the site selected. Two examples of its application will be developed: one at the site-choice level, in the case of marine sites, and one at the detailed-study level of the chosen site, in the case of a riverside site [fr

  18. Alternative pricing methodologies

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    With the increased interest in competitive market forces and growing recognition of the deficiencies in current practices, FERC and others are exploring alternatives to embedded cost pricing. A number of these alternatives are discussed in this chapter. Marketplace pricing, discussed briefly here, is the subject of the next chapter. Obviously, the pricing formula may combine several of these methodologies. One utility of which the authors are aware is seeking a price equal to the sum of embedded costs, opportunity costs, line losses, value of service, FERC's percentage adder formula and a contract service charge

  19. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  1. RHIC Data Correlation Methodology

    International Nuclear Information System (INIS)

    Michnoff, R.; D'Ottavio, T.; Hoff, L.; MacKay, W.; Satogata, T.

    1999-01-01

    A requirement for RHIC data plotting software and physics analysis is the correlation of data from all accelerator data gathering systems. Data correlation provides the capability for a user to request a plot of multiple data channels vs. time, and to make meaningful time-correlated data comparisons. The task of data correlation for RHIC requires careful consideration because data acquisition triggers are generated from various asynchronous sources including events from the RHIC Event Link, events from the two Beam Sync Links, and other unrelated clocks. In order to correlate data from asynchronous acquisition systems a common time reference is required. The RHIC data correlation methodology will allow all RHIC data to be converted to a common wall clock time, while still preserving native acquisition trigger information. A data correlation task force team, composed of the authors of this paper, has been formed to develop data correlation design details and provide guidelines for software developers. The overall data correlation methodology will be presented in this paper
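
    A minimal sketch of the kind of conversion described, assuming each acquisition system reports ticks of a local clock with a known epoch and tick period; the system names, offsets and periods below are hypothetical, not RHIC calibration data.

        # Convert trigger timestamps from asynchronous local clocks to a common
        # wall-clock time base while preserving the native trigger ticks.
        CLOCKS = {
            "event_link": {"epoch_s": 931_000_000.0, "tick_s": 1e-6},   # assumed
            "beam_sync":  {"epoch_s": 931_000_010.5, "tick_s": 78e-9},  # assumed
        }

        def to_wall_clock(system, ticks):
            c = CLOCKS[system]
            return {"system": system, "native_ticks": ticks,
                    "wall_clock_s": c["epoch_s"] + ticks * c["tick_s"]}

        a = to_wall_clock("event_link", 2_500_000)
        b = to_wall_clock("beam_sync", 32_000_000)
        print(abs(a["wall_clock_s"] - b["wall_clock_s"]))  # time-correlated offset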

  2. Intelligent systems engineering methodology

    Science.gov (United States)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  3. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of the specified analysis methodologies for the performance related design basis events(PRDBE). The PRDBE is an occurrence(event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than normal service effects of the plant equipment. The performance analysis methodology which systematizes the methods and procedures to analyze the PRDBEs is as follows. Based on the operation mode suitable to the characteristics of the SMART NSSS, the corresponding PRDBEs and allowable range of process parameters for these events are deduced. With the developed control logic for each operation mode, the system thermalhydraulics are analyzed for the chosen PRDBEs using the system analysis code. Particularly, because of different system characteristics of SMART from the existing commercial nuclear power plants, the operation mode, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of the PRDBEs chosen based on each operation mode and the transition among these and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report in which the overall details for SMART performance analysis are specified based on the current SMART design, would be utilized as a guide for the detailed performance analysis

  4. Relative Hazard Calculation Methodology

    International Nuclear Information System (INIS)

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-01-01

    The methodology presented in this document was developed to provide a means of calculating the RH ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation)

  5. Insights into PRA methodologies

    International Nuclear Information System (INIS)

    Gallagher, D.; Lofgren, E.; Atefi, B.; Liner, R.; Blond, R.; Amico, P.

    1984-08-01

    Probabilistic Risk Assessments (PRAs) for six nuclear power plants were examined to gain insight into how the choice of analytical methods can affect the results of PRAs. The PRA scope considered was limited to internally initiated accident sequences through core melt. For twenty methodological topic areas, a baseline or minimal methodology was specified. The choice of methods for each topic in the six PRAs was characterized in terms of the incremental level of effort above the baseline. A higher level of effort generally reflects a higher level of detail or a higher degree of sophistication in the analytical approach to a particular topic area. The impact on results was measured in terms of how additional effort beyond the baseline level changed the relative importance and ordering of dominant accident sequences compared to what would have been observed had methods corresponding to the baseline level of effort been employed. This measure of impact is a more useful indicator of how methods affect perceptions of plant vulnerabilities than changes in core melt frequency would be. However, the change in core melt frequency was used as a secondary measure of impact for nine topics where availability of information permitted. Results are presented primarily in the form of effort-impact matrices for each of the twenty topic areas. A suggested effort-impact profile for future PRAs is presented

  6. Scrum methodology in banking environment

    OpenAIRE

    Strihová, Barbora

    2015-01-01

    Bachelor thesis "Scrum methodology in banking environment" is focused on one of the agile methodologies, called Scrum, and on a description of the methodology as used in a banking environment. Its main goal is to introduce the Scrum methodology and outline, through a case study, a real project placed in a bank and focused on software development, address the problems of the project, propose solutions to the addressed problems and identify anomalies of Scrum in software development constrained by the banking environment...

  7. Experimental Economics: Some Methodological Notes

    OpenAIRE

    Fiore, Annamaria

    2009-01-01

    The aim of this work is to present, in a self-contained paper, some methodological aspects as they are received in the current experimental literature. The purpose has been to make a critical review of some very influential papers dealing with methodological issues. In other words, the idea is to have a single paper where people first approaching experimental economics can find summarised some of the most important methodological issues. In particular, the focus is on some methodological prac...

  8. Multimodal hybrid reasoning methodology for personalized wellbeing services.

    Science.gov (United States)

    Ali, Rahman; Afzal, Muhammad; Hussain, Maqbool; Ali, Maqbool; Siddiqi, Muhammad Hameed; Lee, Sungyoung; Ho Kang, Byeong

    2016-02-01

    A wellness system provides wellbeing recommendations to support experts in promoting a healthier lifestyle and inducing individuals to adopt healthy habits. Adopting physical activity effectively promotes a healthier lifestyle. A physical activity recommendation system assists users to adopt daily routines to form a best practice of life by involving themselves in healthy physical activities. Traditional physical activity recommendation systems focus on general recommendations applicable to a community of users rather than specific individuals. These recommendations are general in nature and are fit for the community at a certain level, but they are not relevant to every individual based on specific requirements and personal interests. To cover this aspect, we propose a multimodal hybrid reasoning methodology (HRM) that generates personalized physical activity recommendations according to the user's specific needs and personal interests. The methodology integrates the rule-based reasoning (RBR), case-based reasoning (CBR), and preference-based reasoning (PBR) approaches in a linear combination that enables personalization of recommendations. RBR uses explicit knowledge rules from physical activity guidelines, CBR uses implicit knowledge from experts' past experiences, and PBR uses users' personal interests and preferences. To validate the methodology, a weight management scenario is considered and experimented with. The RBR part of the methodology generates goal, weight status, and plan recommendations, the CBR part suggests the top three relevant physical activities for executing the recommended plan, and the PBR part filters out irrelevant recommendations from the suggested ones using the user's personal preferences and interests. To evaluate the methodology, a baseline-RBR system is developed, which is improved first using ranged rules and ultimately using a hybrid-CBR. A comparison of the results of these systems shows that hybrid-CBR outperforms the
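
    A minimal sketch of the linear combination described, with made-up candidate scores and weights; in the actual HRM the three scores come from guideline rules (RBR), retrieved cases (CBR) and user preferences (PBR), and the weights below are assumptions, not values from the paper.

        # Hypothetical linear combination of RBR, CBR and PBR scores.
        WEIGHTS = {"rbr": 0.4, "cbr": 0.4, "pbr": 0.2}  # assumed, not from the paper

        def hybrid_score(scores):
            return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

        candidates = {
            "brisk_walking": {"rbr": 0.9, "cbr": 0.8, "pbr": 1.0},
            "swimming":      {"rbr": 0.9, "cbr": 0.7, "pbr": 0.2},  # low preference
        }
        ranked = sorted(candidates, key=lambda c: hybrid_score(candidates[c]),
                        reverse=True)
        print(ranked)  # preference filtering demotes 'swimming'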

  9. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    A final report summarizing the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in Cornwall. The geological setting of the test site in the Cornubian granite batholith is described. The effect of structure imposed by discontinuities on the engineering behaviour of rock masses is discussed and the scanline survey method of obtaining data on discontinuities in the rock mass is described. The applicability of some methods of statistical analysis for discontinuity data is reviewed. The requirement for remote geophysical methods of characterizing the mass is discussed and experiments using seismic and ultrasonic velocity measurements are reported. Methods of determining the in-situ stresses are described and the final results of a programme of in-situ stress measurements using the overcoring and hydrofracture methods are reported. (author)

  10. UNCOMMON SENSORY METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladimír Vietoris

    2015-02-01

    Full Text Available Sensory science is a young but rapidly developing field of the food industry. Currently, great emphasis is placed on the development of rapid data-collection techniques, the difference between consumers and trained panels is becoming blurred, and the role of sensory methodologists is to prepare ways of evaluation by which a lay panel (consumers) can achieve results identical to those of a trained panel. There are several conventional methods of sensory evaluation of food (ISO standards), but more and more sensory laboratories are developing methodologies that are less strict in the selection of evaluators, whose mechanism is easily understandable and whose results are easily interpretable. This paper deals with the mapping of marginal methods used in the sensory evaluation of food (new types of profiles, CATA, TDS, napping).

  11. Safety class methodology

    International Nuclear Information System (INIS)

    Donner, E.B.; Low, J.M.; Lux, C.R.

    1992-01-01

    DOE Order 6430.1A, General Design Criteria (GDC), requires that DOE facilities be evaluated with respect to ''safety class items.'' Although the GDC defines safety class items, it does not provide a methodology for selecting them. The methodology described in this paper was developed to assure that safety class items at the Savannah River Site (SRS) are selected in a consistent and technically defensible manner. Safety class items are those in the highest of four categories determined to be of special importance to nuclear safety and merit appropriately higher-quality design, fabrication, and industrial test standards and codes. The identification of safety class items is approached using a cascading strategy that begins at the 'safety function' level (i.e., a cooling function, ventilation function, etc.) and proceeds down to the system, component, or structure level. Thus, the items that are required to support a safety function are safety class items (SCIs). The basic steps in this procedure apply to the determination of SCIs both for new project activities and for operating facilities. The GDC lists six characteristics of SCIs to be considered as a starting point for safety item classification. They are as follows: 1. Those items whose failure would produce exposure consequences that would exceed the guidelines in Section 1300-1.4, ''Guidance on Limiting Exposure of the Public,'' at the site boundary or nearest point of public access. 2. Those items required to maintain operating parameters within the safety limits specified in the Operational Safety Requirements during normal operations and anticipated operational occurrences. 3. Those items required for nuclear criticality safety. 4. Those items required to monitor the release of radioactive material to the environment during and after a Design Basis Accident. 5. Those items required to achieve and maintain the facility in a safe shutdown condition. 6. Those items that control the Safety Class Items listed above

  12. Situating methodology within qualitative research.

    Science.gov (United States)

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  13. Methodological Problems of Nanotechnoscience

    Science.gov (United States)

    Gorokhov, V. G.

    Recently, we have reported on the definitions of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline; it also evolves in a “nonclassical” way. Nanoontology, or the nano scientific world view, serves as a methodological orientation for choosing the theoretical means and methods toward a solution of scientific and engineering problems. This makes it possible to change from one explanation and scientific world view to another without difficulty. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to Systems Engineering as the analysis and design of large-scale, complex, man/machine systems, but of micro- and nanosystems. Nano systems engineering, like macro systems engineering, includes not only systems design but also complex research. The design orientation influences the change of priorities in complex research and of the relation to knowledge, not only to “knowledge about something” but also to knowledge as a means of activity: from the beginning, control and restructuring of matter at the nano-scale is a necessary element of nanoscience.

  14. Methodological themes and variations

    International Nuclear Information System (INIS)

    Tetlock, P.E.

    1989-01-01

    This paper reports on the tangible progress that has been made in clarifying the underlying processes that affect both the likelihood of war in general and of nuclear war in particular. It also illustrates how difficult it is to make progress in this area. Nonetheless, what has been achieved should not be minimized. We have learned a good deal on both the theoretical and the methodological fronts and, perhaps, most important, we have learned a good deal about the limits of our knowledge. Knowledge of our ignorance---especially in a policy domain where confident, even glib, causal assertions are so common---can be a major contribution in itself. The most important service the behavioral and social sciences can currently provide to the policy making community may well be to make thoughtful skepticism respectable: to sensitize those who make key decisions to the uncertainty surrounding our understanding of international conflict and to the numerous qualifications that now need to be attached to simple causal theories concerning the origins of war

  15. Engineering radioecology: Methodological considerations

    International Nuclear Information System (INIS)

    Nechaev, A.F.; Projaev, V.V.; Sobolev, I.A.; Dmitriev, S.A.

    1995-01-01

    The term ''radioecology'' has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise, generally acknowledged structure, a unified methodical basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and by some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the termination of the Cold War and because of remarkable political changes in the world, it has become possible to move the problem of environmental restoration from the scientific sphere into particularly practical terms. Already the first steps clearly showed the imperfection of existing technologies and managerial and regulatory schemes; the lack of qualified specialists, relevant methods and techniques; uncertainties in the methodology of decision-making, etc. Thus, building up (or maybe structuring) a special scientific and technological basis, which the authors call ''engineering radioecology'', seems to be an important task. In this paper they endeavored to substantiate this thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology

  16. Dosimetric methodology of the ICRP

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    1994-01-01

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples
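
    Two elements of that methodology can be stated compactly in the standard ICRP notation: the equivalent dose to a tissue T weights the absorbed dose by radiation type R, and the effective dose weights the equivalent doses by tissue:

        H_T = \sum_R w_R \, D_{T,R}, \qquad E = \sum_T w_T \, H_T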

  17. Transmission pricing: paradigms and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Shirmohammadi, Dariush [Pacific Gas and Electric Co., San Francisco, CA (United States); Vieira Filho, Xisto; Gorenstin, Boris [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, Mario V.P. [Power System Research, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    In this paper we describe the principles of several paradigms and methodologies for pricing transmission services. The paper outlines some of the main characteristics of these paradigms and methodologies such as where they may be used for best results. Due to their popularity, power flow based MW-mile and short run marginal cost pricing methodologies will be covered in some detail. We conclude the paper with examples of the application of these two pricing methodologies for pricing transmission services in Brazil. (author) 25 refs., 2 tabs.

  18. Analytical methodology for nuclear safeguards

    International Nuclear Information System (INIS)

    Ramakumar, K.L.

    2011-01-01

    This paper attempts to briefly describe the analytical methodologies available and also highlight some of the challenges, expectations from nuclear material accounting and control (NUMAC) point of view

  19. Country report: a methodology

    International Nuclear Information System (INIS)

    Colin, A.

    2013-01-01

    This paper describes a methodology which could be applied to establish a country report. In the framework of nuclear non-proliferation appraisal and IAEA safeguards implementation, it is important to be able to assess the potential existence of undeclared nuclear materials and activities, as well as undeclared facilities, in the country under review. In our view, a country report should aim at providing detailed information on nuclear-related activities for each country examined taken 'as a whole', such as nuclear development, scientific and technical capabilities, etc. In order to study a specific country, we need to know whether it already operates a civil nuclear programme or not. In the first case, we have to check carefully whether it could divert nuclear material, whether declared facilities are misused, and whether undeclared facilities are operated and undeclared activities conducted with the aim of manufacturing a nuclear weapon. In the second case, we should pay attention to the development of a civil nuclear project. A country report is based on a wide span of information (most of the time coming from open sources, but sometimes also from confidential or private ones). Therefore, it is important to carefully check the nature and the credibility (reliability) of these sources through cross-check examination. Finally, it is necessary to merge information from different sources and apply an expertise filter. We have at our disposal many powerful tools to help us assess, understand and evaluate the situation (cartography, imagery, bibliometry, etc.). These tools allow us to offer the best possible conclusions. The paper is followed by the slides of the presentation. (author)

  20. Microbiological Methodology in Astrobiology

    Science.gov (United States)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered by future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. Antarctic glacier and Earth permafrost habitats, where living microbial cells have preserved viability for millennia by entering the anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is of great importance to ensure the authenticity of microorganisms (if any are present in studied samples) and to standardize the protocols used, to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind the scenario that living microorganisms were not preserved and underwent mineralization. Under laboratory conditions, processes that accompany the fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble in their morphological properties those found in natural Earth habitats. Given the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use some previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ and comparison with the analogous data obtained for laboratory microbial cultures and

  1. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    Science.gov (United States)

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa, and those wishing to use it as their research methodology can find support through the writing of a number of Maori academics. What is not so well articulated is the experiential voice of those who have used Kaupapa Maori as research methodology. My identity as a Maori woman…

  2. A simple and accurate rule-based modeling framework for simulation of autocrine/paracrine stimulation of glioblastoma cell motility and proliferation by L1CAM in 2-D culture.

    Science.gov (United States)

    Caccavale, Justin; Fiumara, David; Stapf, Michael; Sweitzer, Liedeke; Anderson, Hannah J; Gorky, Jonathan; Dhurjati, Prasad; Galileo, Deni S

    2017-12-11

    Glioblastoma multiforme (GBM) is a devastating brain cancer for which there is no known cure. Its malignancy is due to rapid cell division along with high motility and invasiveness of cells into the brain tissue. Simple 2-dimensional laboratory assays (e.g., a scratch assay) commonly are used to measure the effects of various experimental perturbations, such as treatment with chemical inhibitors. Several mathematical models have been developed to aid the understanding of the motile behavior and proliferation of GBM cells. However, many are mathematically complicated, look at multiple interdependent phenomena, and/or use modeling software not freely available to the research community. These attributes make the adoption of models and simulations of even simple 2-dimensional cell behavior an uncommon practice by cancer cell biologists. Herein, we developed an accurate, yet simple, rule-based modeling framework to describe the in vitro behavior of GBM cells that are stimulated by the L1CAM protein using freely available NetLogo software. In our model L1CAM is released by cells to act through two cell surface receptors and a point of signaling convergence to increase cell motility and proliferation. A simple graphical interface is provided so that changes can be made easily to several parameters controlling cell behavior, and behavior of the cells is viewed both pictorially and with dedicated graphs. We fully describe the hierarchical rule-based modeling framework, show simulation results under several settings, describe the accuracy compared to experimental data, and discuss the potential usefulness for predicting future experimental outcomes and for use as a teaching tool for cell biology students. It is concluded that this simple modeling framework and its simulations accurately reflect much of the GBM cell motility behavior observed experimentally in vitro in the laboratory. Our framework can be modified easily to suit the needs of investigators interested in other
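
    The paper's model is written in NetLogo; as a language-neutral sketch of the same rule-based style, the Python fragment below lets an L1CAM "signal" parameter scale per-step motility and division probabilities. All numbers are placeholders, not the paper's calibrated values.

        # Minimal rule-based cell sketch (placeholder parameters).
        import random

        L1CAM_SIGNAL = 1.5                # >1 models autocrine/paracrine stimulation
        BASE_SPEED, BASE_DIV = 1.0, 0.02  # assumed per-step motility / division rates

        def step(cells):
            daughters = []
            for c in cells:
                c["x"] += random.uniform(-1, 1) * BASE_SPEED * L1CAM_SIGNAL  # motility rule
                if random.random() < BASE_DIV * L1CAM_SIGNAL:                # division rule
                    daughters.append(dict(c))
            cells.extend(daughters)

        cells = [{"x": 0.0} for _ in range(100)]
        for _ in range(24):   # 24 hypothetical time steps
            step(cells)
        print(len(cells), "cells after 24 steps")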

  3. Autonomous Rule Based Robot Navigation In Orchards

    DEFF Research Database (Denmark)

    Andersen, Jens Christian; Ravn, Ole; Andersen, Nils Axel

    2010-01-01

    Orchard navigation using sensor-based localization and flexible mission management facilitates successful missions independent of the Global Positioning System (GPS). This is especially important while driving between tight tree rows where the GPS coverage is poor. This paper suggests localization ...

  4. Rule-based machine translation for Aymara

    NARCIS (Netherlands)

    Coler, Matthew; Homola, Petr; Jones, Mari

    2014-01-01

    This paper presents the ongoing result of an approach developed by the collaboration of a computational linguist with a field linguist that addresses one of the oft-overlooked keys to language maintenance: the development of modern language-learning tools. Although machine translation isn’t commonly

  5. Association-rule-based tuberculosis disease diagnosis

    Science.gov (United States)

    Asha, T.; Natarajan, S.; Murthy, K. N. B.

    2010-02-01

    Tuberculosis (TB) is a disease caused by bacteria called Mycobacterium tuberculosis. It usually spreads through the air and attacks low-immunity bodies such as patients with Human Immunodeficiency Virus (HIV). This work focuses on finding close association rules, a promising Data Mining technique, within TB data. The proposed method first normalizes the raw data from medical records, which include categorical, nominal and continuous attributes, and then determines association rules from the normalized data with different support and confidence values. The association rules are applied to a real data set containing medical records of patients with TB obtained from a state hospital. The rules determined describe close associations between one symptom and another; for example, the likelihood that an occurrence of sputum is closely associated with blood cough and HIV.
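
    For readers unfamiliar with the two thresholds mentioned, support and confidence for a rule X -> Y can be computed directly from the records; the toy transactions below are fabricated for illustration, not hospital data.

        # Support and confidence of an association rule over toy symptom records.
        records = [
            {"sputum", "blood_cough", "hiv"},
            {"sputum", "blood_cough"},
            {"fever", "sputum"},
            {"blood_cough", "hiv"},
        ]

        def support(itemset):
            return sum(itemset <= r for r in records) / len(records)

        def confidence(lhs, rhs):
            return support(lhs | rhs) / support(lhs)

        print(support({"sputum", "blood_cough"}))       # 0.5
        print(confidence({"sputum"}, {"blood_cough"}))  # ~0.67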

  6. A methodology for social experimentation

    DEFF Research Database (Denmark)

    Ravn, Ib

    A methodology is outlined whereby one may improve the performance of a social system to the satisfaction of its stakeholders, that is, facilitate desirable social and organizational transformations.

  7. Workshops as a Research Methodology

    Science.gov (United States)

    Ørngreen, Rikke; Levinsen, Karin

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on…

  8. Methodological Pluralism and Narrative Inquiry

    Science.gov (United States)

    Michie, Michael

    2013-01-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on "what meaning is being made" rather than "what is happening here" (quadrant 2 rather than…

  9. Building ASIPS the Mescal methodology

    CERN Document Server

    Gries, Matthias

    2006-01-01

    A number of system designers use ASIPs rather than ASICs to implement their system solutions. This book gives a comprehensive methodology for the design of these application-specific instruction-set processors (ASIPs). It includes demonstrations of applications of the methodologies using the Tipi research framework.

  10. A methodology for software documentation

    OpenAIRE

    Torres Júnior, Roberto Dias; Ahlert, Hubert

    2000-01-01

    With the growing complexity of window-based software and the use of object orientation, the development of software is getting more complex than ever. Based on that, this article intends to present a methodology for software documentation and to analyze our experience and how this methodology can aid software maintenance

  11. Rule-based programming and strategies for automated generation of detailed kinetic models for gas phase combustion of polycyclic hydrocarbon molecules; Programmation par regles et strategies pour la generation automatique de mecanismes de combustion d'hydrocarbures polycycliques

    Energy Technology Data Exchange (ETDEWEB)

    Ibanescu, L.

    2004-06-15

    The primary objective of this thesis is to explore the approach of using rule-based systems and strategies, for a complex problem of chemical kinetic: the automated generation of reaction mechanisms. The chemical reactions are naturally expressed as conditional rewriting rules. The control of the chemical reactions chaining is easy to describe using a strategies language, such as the one of the ELAN system, developed in the Protheo team. The thesis presents the basic concepts of the chemical kinetics, the chemical and computational problems related to the conception and validation of a reaction mechanism, and gives a general structure for the generator of reaction mechanisms called GasEI. Our research focuses on the primary mechanism generator. We give solutions for encoding the chemical species, the reactions and their chaining, and we present the prototype developed in ELAN. The representation of the chemical species uses the notion of molecular graphs, encoded by a term structure called GasEI terms. The chemical reactions are expressed by rewriting rules on molecular graphs, encoded by a set of conditional rewriting rules on GasEI terms. The strategies language of the ELAN system is used to express the reactions chaining in the primary mechanism generator. This approach is illustrated by coding ten generic reactions of the oxidizing pyrolysis. Qualitative chemical validations of the prototype show that our approach gives, for acyclic molecules, the same results as the existing mechanism generators, and for polycyclic molecules produces original results.
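
    To make the idea of conditional rewriting rules plus a chaining strategy concrete, here is a deliberately tiny Python sketch in the same spirit; it rewrites species strings, whereas GasEI works on molecular graphs encoded as terms, so this is an analogy, not the thesis' representation.

        # Toy rewriting system: apply reaction rules to a state string until
        # no rule matches (a crude "repeat" strategy, schematic chemistry only).
        RULES = [
            ("CH4 + OH", "CH3 + H2O"),  # hydrogen abstraction (schematic)
            ("CH3 + O2", "CH3O2"),      # oxygen addition (schematic)
        ]

        def rewrite_once(state):
            for lhs, rhs in RULES:
                if lhs in state:                 # rule condition: pattern present
                    return state.replace(lhs, rhs, 1)
            return state                         # no rule applies

        def repeat_strategy(state):
            while True:
                nxt = rewrite_once(state)
                if nxt == state:
                    return state
                state = nxt

        print(repeat_strategy("CH4 + OH, then CH3 + O2"))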

  13. AEGIS methodology and a perspective from AEGIS methodology demonstrations

    International Nuclear Information System (INIS)

    Dove, F.H.

    1981-03-01

    Objectives of AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) are to develop the capabilities needed to assess the post-closure safety of waste isolation in geologic formations; demonstrate these capabilities on reference sites; apply the assessment methodology to assist the NWTS program in site selection, waste package and repository design; and perform repository site analyses for the licensing needs of NWTS. This paper summarizes the AEGIS methodology, the experience gained from methodology demonstrations, and provides an overview in the following areas: estimation of the response of a repository to perturbing geologic and hydrologic events; estimation of the transport of radionuclides from a repository to man; and assessment of uncertainties

  14. Methodological practicalities in analytical generalization

    DEFF Research Database (Denmark)

    Halkier, Bente

    2011-01-01

    In this article, I argue that the existing literature on qualitative methodologies tends to discuss analytical generalization at a relatively abstract and general theoretical level. It is, however, not particularly straightforward to “translate” such abstract epistemological principles into more operative methodological strategies for producing analytical generalizations in research practices. Thus, the aim of the article is to contribute to the discussions among qualitatively working researchers about generalizing by way of exemplifying some of the methodological practicalities in analytical generalization. Theoretically, the argumentation in the article is based on practice theory. The main part of the article describes three different examples of ways of generalizing on the basis of the same qualitative data material, with a particular focus on describing the methodological strategies.

  15. Nanotoxicology materials, methodologies, and assessments

    CERN Document Server

    Durán, Nelson; Alves, Oswaldo L; Zucolotto, Valtencir

    2014-01-01

    This book begins with a detailed introduction to engineered nanostructures, followed by a section on methodologies used in research on cytotoxicity and genotoxicity, and concluding with evidence for the cyto- and genotoxicity of specific nanoparticles.

  16. Reflective Methodology: The Beginning Teacher

    Science.gov (United States)

    Templeton, Ronald K.; Siefert, Thomas E.

    1970-01-01

    Offers a variety of specific techniques which will help the beginning teacher to implement reflective methodology and create an inquiry-centered classroom atmosphere, at the same time meeting the many more pressing demands of first-year teaching. (JES)

  17. Methodologies used in Project Management

    OpenAIRE

    UNGUREANU, Adrian; UNGUREANU, Anca

    2014-01-01

    Undoubtedly, a methodology properly defined and strictly followed for project management provides a firm guarantee that the work will be done on time, on budget and according to specifications. A project management methodology, in simple terms, is a “must-have” to avoid failure and reduce risks, because it is one of the critical success factors, alongside the basic skills of the management team. It is the simple way to guide the team through the design and execution phases, processes and tasks throughout...

  18. Methodology for ranking restoration options

    DEFF Research Database (Denmark)

    Jensen, Per Hedemann

    1999-01-01

    The project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated...

  19. A modified GO-FLOW methodology with common cause failure based on Discrete Time Bayesian Network

    International Nuclear Information System (INIS)

    Fan, Dongming; Wang, Zili; Liu, Linlin; Ren, Yi

    2016-01-01

    Highlights: • Identification of particular causes of failure for common cause failure analysis. • Comparison of two formalisms (GO-FLOW and Discrete Time Bayesian Network) and establishment of the correlation between them. • Mapping of the GO-FLOW model into a Bayesian network model. • Calculation of the GO-FLOW model with common cause failures based on DTBN. - Abstract: The GO-FLOW methodology is a success-oriented system reliability modelling technique for multi-phase missions involving complex time-dependent, multi-state and common cause failure (CCF) features. However, the analysis algorithm cannot easily handle multiple shared signals and CCFs. In addition, the simulative algorithm is time consuming when vast numbers of multi-state components exist in the model, and the multiple time points of phased mission problems increase the difficulty of the analysis method. In this paper, the Discrete Time Bayesian Network (DTBN) and the GO-FLOW methodology are integrated by unified mapping rules. Based on these rules, the GO-FLOW operators can be mapped into DTBN; a complete GO-FLOW model with complex characteristics (e.g. phased mission, multi-state, and CCF) can then be converted to the isomorphic DTBN and easily analyzed by utilizing the DTBN. With mature algorithms and tools, the multi-phase mission reliability parameters can be efficiently obtained via the proposed approach without special consideration of the shared signals and the various complex logic operations. Meanwhile, CCF can also be handled in the computing process.
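
    For flavour, here is a minimal discrete-time reliability calculation with a common cause failure in the beta-factor style; this is not the paper's GO-FLOW-to-DTBN mapping rules, only the kind of per-time-slice arithmetic that a discrete-time Bayesian treatment rests on (all rates hypothetical):

    ```python
    import math

    lam, beta = 1e-3, 0.1            # total failure rate (1/h), CCF fraction
    lam_i = (1.0 - beta) * lam       # independent part of the failure rate
    lam_c = beta * lam               # common cause part
    dt, n_steps = 10.0, 100          # discretization: 100 slices of 10 h

    p_fail_i = 1.0 - math.exp(-lam_i * dt)   # per-slice independent failure prob.
    p_fail_c = 1.0 - math.exp(-lam_c * dt)   # per-slice CCF probability

    # P(2-train parallel system survives) at each discrete time point:
    surv = []
    p_up_i, p_up_c = 1.0, 1.0
    for k in range(1, n_steps + 1):
        p_up_i *= (1.0 - p_fail_i)   # each train survives its independent failures
        p_up_c *= (1.0 - p_fail_c)   # no common cause event so far
        p_sys = (1.0 - (1.0 - p_up_i) ** 2) * p_up_c
        surv.append((k * dt, p_sys))

    print(surv[-1])   # system survival probability at 1000 h
    ```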

  20. Managerial Methodology in Public Institutions

    Directory of Open Access Journals (Sweden)

    Ion VERBONCU

    2010-10-01

    One of the most important ways of making public institutions more efficient is applying managerial methodology, embodied in the promotion of management tools and of modern and sophisticated methodologies, as well as in the design/redesign and maintenance of the management process and its components. Their implementation bears the imprint of the structural and functional particularities of public institutions, decentralized and devolved, and, of course, of the expertise of these organizations' managers. Managerial methodology is addressed through three important instruments: diagnosis, management by objectives and the scoreboard. Its presence in the performance management process should be mandatory, given its favorable influence on managerial and economic performance and on the rigour of the managers' approach to performance.

  1. Blanket safety by GEMSAFE methodology

    International Nuclear Information System (INIS)

    Sawada, Tetsuo; Saito, Masaki

    2001-01-01

    The General Methodology of Safety Analysis and Evaluation for Fusion Energy Systems (GEMSAFE) has been applied to a number of fusion system designs, such as the R-tokamak, the Fusion Experimental Reactor (FER), and the International Thermonuclear Experimental Reactor (ITER) in both its Conceptual Design Activities (CDA) and Engineering Design Activities (EDA) stages. Though the major objective of GEMSAFE is to select design basis events (DBEs) in a reasoned way, it is also useful for elucidating the related safety functions and the requirements for ensuring safety. In this paper, we apply the methodology to fusion systems with future tritium breeding blankets and clarify which points of the system should be of concern from the standpoint of ensuring safety. In this context, we have obtained five DBEs that are related to the blanket system. We have also clarified the safety functions required to prevent accident propagation initiated by those blanket-specific DBEs. The outline of the methodology is also reviewed. (author)

  2. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  3. Methodology, theoretical framework and scholarly significance: An ...

    African Journals Online (AJOL)

    Methodology, theoretical framework and scholarly significance: An overview ... Keywords: Legal Research, Methodology, Theory, Pedagogy, Legal Training, Scholarship ...

  4. Methodological Guidelines for Advertising Research

    DEFF Research Database (Denmark)

    Rossiter, John R.; Percy, Larry

    2017-01-01

    In this article, highly experienced advertising academics and advertising research consultants John R. Rossiter and Larry Percy present and discuss what they believe to be the seven most important methodological guidelines that need to be implemented to improve the practice of advertising research. Their focus is on methodology, defined as first choosing a suitable theoretical framework to guide the research study and then identifying the advertising responses that need to be studied. Measurement of those responses is covered elsewhere in this special issue in the article by Bergkvist and Langner. Most...

  5. Acoustic emission methodology and application

    CERN Document Server

    Nazarchuk, Zinoviy; Serhiyenko, Oleh

    2017-01-01

    This monograph analyses in detail the physical aspects of elastic wave radiation during the deformation or fracture of materials. It presents the methodological bases for the practical use of acoustic emission devices, and describes the results of theoretical and experimental research on evaluating the crack growth resistance of materials and on selecting the useful AE signals. The efficiency of this methodology is shown through the diagnostics of industrial objects of various purposes. The authors obtained their experimental results with the help of new methods and facilities.

  6. An LWR design decision Methodology

    International Nuclear Information System (INIS)

    Leahy, T.J.; Rees, D.C.; Young, J.

    1982-01-01

    While all parties involved in nuclear plant regulation endeavor to make decisions which optimize the considerations of plant safety and financial impacts, these decisions are generally made without the benefit of a systematic and rigorous approach to the questions confronting the decision makers. A Design Decision Methodology has been developed which provides such a systematic approach. By employing this methodology, which makes use of currently accepted probabilistic risk assessment techniques and cost estimation, informed decisions may be made against a background of comparisons between the relative levels of safety and costs associated with various design alternatives
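
    A toy sketch of the kind of comparison such a methodology supports: ranking design alternatives by modification cost plus expected accident cost over the plant life. All names and numbers below are hypothetical, not drawn from the paper:

    ```python
    alternatives = [
        {"name": "baseline",      "mod_cost": 0.0, "accident_freq": 5e-5},
        {"name": "extra train",   "mod_cost": 8e6, "accident_freq": 1e-5},
        {"name": "passive valve", "mod_cost": 2e6, "accident_freq": 3e-5},
    ]
    CONSEQUENCE_COST = 2e9   # cost of a severe accident (hypothetical)
    PLANT_LIFE_YEARS = 40

    def expected_total_cost(alt):
        # accident_freq is per reactor-year, so integrate over the plant life
        return alt["mod_cost"] + alt["accident_freq"] * PLANT_LIFE_YEARS * CONSEQUENCE_COST

    for alt in sorted(alternatives, key=expected_total_cost):
        print(alt["name"], f"{expected_total_cost(alt):.3g}")
    ```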

  7. Qualitative methodology in developmental psychology

    DEFF Research Database (Denmark)

    Demuth, Carolin; Mey, Günter

    2015-01-01

    Qualitative methodology presently is gaining increasing recognition in developmental psychology. Although the founders of developmental psychology to a large extent already used qualitative procedures, the field was long dominated by a (post)positivistic quantitative paradigm. The increasing recognition ... in qualitative research offers a promising avenue to advance the field in this direction.

  8. Qualitative Methodology in Unfamiliar Cultures

    DEFF Research Database (Denmark)

    Svensson, Christian Franklin

    2014-01-01

    This case study discusses qualitative fieldwork in Malaysia. The trends in higher education led to investigating how and why young Indians and Chinese in Malaysia are using the university to pursue a life strategy. Given the importance of field context in designing and analysing research based on a qualitative methodology, conscious reflection on research design and objectivity is important when doing fieldwork. This case study discusses such reflections. Emphasis throughout is given to applied qualitative methodology and its contributions to the social sciences, in particular having to do...

  9. Observational methodology in sport sciences

    Directory of Open Access Journals (Sweden)

    M. Teresa Anguera

    2013-11-01

    This paper reviews the conceptual framework, the key literature and the methods (observation tools, such as category systems and field formats, and coding software, etc.) that should be followed when conducting research from the perspective of observational methodology. The observational designs used by the authors’ research group over the last twenty years are discussed, and the procedures for analysing data and assessing their quality are described. Mention is also made of the latest methodological trends in this field, such as the use of mixed methods.

  10. Reflections on Design Methodology Research

    DEFF Research Database (Denmark)

    2011-01-01

    We shall reflect on the results of Design Methodology research and their impact on design practice. In the past 50 years the number of researchers in the field has expanded enormously – as has the number of publications. During the same period design practice and its products have changed ... and produced are also now far more complex and distributed, putting designers under ever increasing pressure. We shall address the question: Are the results of Design Methodology research appropriate, and are they delivering the expected results in design practice? In our attempt to answer this question we...

  11. Methodologic frontiers in environmental epidemiology.

    OpenAIRE

    Rothman, K J

    1993-01-01

    Environmental epidemiology comprises the epidemiologic study of those environmental factors that are outside the immediate control of the individual. Exposures of interest to environmental epidemiologists include air pollution, water pollution, and occupational exposure to physical and chemical agents, as well as psychosocial elements of environmental concern. The main methodologic problem in environmental epidemiology is exposure assessment, a problem that extends through all of epidemiologic research...

  12. IMSF: Infinite Methodology Set Framework

    Science.gov (United States)

    Ota, Martin; Jelínek, Ivan

    Software development is usually an integration task in an enterprise environment - few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is a lack of resources, a popular result being outsourcing, ‘body shopping’, and, indirectly, team and team member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face the problems of quality assurance and enterprise know-how management. The methodology used is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less connected with a set of task types. When the task type is not suitable, it causes problems that usually result in an undocumented ad-hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. The Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments on its meta-model.

  13. Unattended Monitoring System Design Methodology

    International Nuclear Information System (INIS)

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-01-01

    A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy-based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. The data analysis then enables a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to those critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real world situations.

  14. Environmental Zoning: Some methodological implications

    NARCIS (Netherlands)

    Ike, Paul; Voogd, Henk

    1991-01-01

    The purpose of this article is to discuss some methodological problems of environmental zoning. The principle of environmental zoning will be elaborated. In addition an overview is given of a number of approaches that have been followed in practice to arrive at an integral judgement. Finally some

  15. A methodology for string resolution

    International Nuclear Information System (INIS)

    Karonis, N.T.

    1992-11-01

    In this paper we present a methodology, not a tool. We present this methodology with the intent that it be adopted, on a case by case basis, by each of the existing tools in EPICS. In presenting this methodology, we describe each of its two components in detail and conclude with an example depicting how the methodology can be used across a pair of tools. The task of any control system is to provide access to the various components of the machine being controlled, for example, the Advanced Photon Source (APS). By access, we mean the ability to monitor the machine's status (reading) as well as the ability to explicitly change its status (writing). The Experimental Physics and Industrial Control System (EPICS) is a set of tools, designed to act in concert, that allows one to construct a control system. EPICS provides the ability to construct a control system that allows reading and writing access to the machine. It does this through the notion of databases. Each of the components of the APS that is accessed by the control system is represented in EPICS by a set of named database records. Once this abstraction is made, from physical device to named database records, the process of monitoring and changing the state of that device becomes the simple process of reading and writing information from and to its associated named records

  16. Counting stem cells : methodological constraints

    NARCIS (Netherlands)

    Bystrykh, Leonid V.; Verovskaya, Evgenia; Zwart, Erik; Broekhuis, Mathilde; de Haan, Gerald

    The number of stem cells contributing to hematopoiesis has been a matter of debate. Many studies use retroviral tagging of stem cells to measure clonal contribution. Here we argue that methodological factors can impact such clonal analyses. Whereas early studies had low resolution, leading to

  17. Test reactor risk assessment methodology

    International Nuclear Information System (INIS)

    Jennings, R.H.; Rawlins, J.K.; Stewart, M.E.

    1976-04-01

    A methodology has been developed for the identification of accident initiating events and the fault modeling of systems, including common mode identification, as these methods are applied in overall test reactor risk assessment. The methods are exemplified by a determination of the risks due to a loss of primary coolant flow in the Engineering Test Reactor.

  18. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    The article deals with the investigation of the theoretical and methodological principles of situational analysis. The necessity of situational analysis in modern conditions is demonstrated, and the notion of “situational analysis” is defined. We have concluded that situational analysis is a continuous system study whose purpose is to identify the signs of a dangerous situation, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the situation on the system now and in the future, and to develop the managerial actions needed to bring the system back to norm. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of its diagnostic, evaluative and search functions is demonstrated. The basic methodological elements of situational analysis are grounded. Substantiating the principal methodological elements of system analysis will enable the analyst to develop adaptive methods able to take into account the peculiar features of a unique object, that is, a situation that has emerged in a complex system; to diagnose such a situation and subject it to systematic and in-depth analysis; to identify risks and opportunities; and to make timely management decisions as required by a particular period.

  19. 16 Offsetting deficit conceptualisations: methodological ...

    African Journals Online (AJOL)

    uses the concepts of literacy practices and knowledge recontextualisation to ... 1996, 2000) theory of 'knowledge recontextualisation' in the development of curricula .... cognitive, social and cultural abilities needed to fit in and thrive in the HE learning .... this argument, that a methodology and analytic framework that seeks to ...

  20. Methodology for the case studies

    NARCIS (Netherlands)

    Smits, M.J.W.; Woltjer, G.B.

    2017-01-01

    This document is about the methodology and selection of the case studies. It is meant as a guideline for the case studies, and together with the other reports in this work package can be a source of information for policy officers, interest groups and researchers evaluating or performing impact

  1. Safety at Work : Research Methodology

    NARCIS (Netherlands)

    Beurden, van K. (Karin); Boer, de J. (Johannes); Brinks, G. (Ger); Goering-Zaburnenko, T. (Tatiana); Houten, van Y. (Ynze); Teeuw, W. (Wouter)

    2012-01-01

    In this document, we provide the methodological background for the Safety at Work project. This document combines several project deliverables as defined in the overall project plan: validation techniques and methods (D5.1.1), performance indicators for safety at work (D5.1.2), personal protection

  2. Environmental impact statement analysis: dose methodology

    International Nuclear Information System (INIS)

    Mueller, M.A.; Strenge, D.L.; Napier, B.A.

    1981-01-01

    Standardized sections and methodologies are being developed for use in environmental impact statements (EIS) for activities to be conducted on the Hanford Reservation. Five areas for standardization have been identified: routine operations dose methodologies, accident dose methodology, Hanford Site description, health effects methodology, and socioeconomic environment for Hanford waste management activities

  3. A Critique of Methodological Dualism in Education

    Science.gov (United States)

    Yang, Jeong A.; Yoo, Jae-Bong

    2018-01-01

    This paper aims to critically examine the paradigm of methodological dualism and explore whether methodologies in social science currently are appropriate for educational research. There are two primary methodologies for studying education: quantitative and qualitative methods. This is what we mean by "methodological dualism". Is…

  4. Feminist Methodologies and Engineering Education Research

    Science.gov (United States)

    Beddoes, Kacey

    2013-01-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory.…

  5. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principles and practical difficulties of uncertainty propagation calculation, the author discussed how to propagate input uncertainties. He distinguished two kinds of input uncertainty: variability (uncertainty due to heterogeneity) and lack of knowledge (uncertainty due to ignorance). It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example, which he then generalised, treating variability with probability theory and lack of knowledge with fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiably and illegitimately precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability from lack of knowledge increases as the problem becomes more complex in terms of the number of parameters or time steps, and that it is necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory.
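
    A minimal sketch of the distinction, assuming a toy model y = a*b that is monotone in b: aleatory variability in a is propagated by Monte Carlo sampling, while the poorly known b is carried as an interval rather than a (falsely precise) distribution. This is an illustration of the general idea, not the author's specific fuzzy-theoretic treatment:

    ```python
    import random

    def model(a, b):
        return a * b

    # Variability (heterogeneity): a ~ Normal, propagated by Monte Carlo.
    # Lack of knowledge (ignorance): b only known to lie in [b_lo, b_hi],
    # propagated by its interval bounds (valid here because y is monotone in b).
    b_lo, b_hi = 1.5, 2.5
    lo_samples, hi_samples = [], []
    random.seed(0)
    for _ in range(10_000):
        a = random.gauss(2.0, 0.1)
        lo_samples.append(model(a, b_lo))   # lower envelope
        hi_samples.append(model(a, b_hi))   # upper envelope

    lo_samples.sort(); hi_samples.sort()
    # Result: a band of distributions -- e.g. the 95th percentile of y is an
    # interval, not a single number.
    print(lo_samples[9500], hi_samples[9500])
    ```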

  6. Multicriteria methodology for decision aiding

    CERN Document Server

    Roy, Bernard

    1996-01-01

    This is the first comprehensive book to present, in English, the multicriteria methodology for decision aiding. In the foreword the distinctive features and main ideas of the European School of MCDA are outlined. The twelve chapters are essentially expository in nature, but scholarly in treatment. Some questions which are too often neglected in the literature on decision theory, such as how is a decision made, who are the actors, what is a decision aiding model, and how to define the set of alternatives, are discussed. Examples are used throughout the book to illustrate the various concepts. Ways to model the consequences of each alternative and to build criteria taking into account the inevitable imprecisions, uncertainties and indeterminations are described and illustrated. The three classical operational approaches of MCDA: synthesis in one criterion (including MAUT), synthesis by outranking relations, and interactive local judgements, are studied. This methodology tries to be a theoretical or intellectual framework dire...

  7. AGR core safety assessment methodologies

    International Nuclear Information System (INIS)

    McLachlan, N.; Reed, J.; Metcalfe, M.P.

    1996-01-01

    To demonstrate the safety of its gas-cooled, graphite-moderated AGR reactors, nuclear safety assessments of the cores are based upon a methodology which demonstrates the absence of component failures, the geometrical stability of the structure, and material properties bounded by a database. All AGRs continue to meet these three criteria. However, predictions of future core behaviour indicate that the safety case methodology will eventually need to be modified to deal with new phenomena. A new approach to the safety assessment of the cores is currently under development which can take account of these factors while at the same time providing the same level of protection for the cores. This approach will be based on the functionality of the core: unhindered movement of control rods, continued adequate cooling of the fuel and the core, and continued ability to charge and discharge fuel. (author). 5 figs

  8. Methodological update in Medicina Intensiva.

    Science.gov (United States)

    García Garmendia, J L

    2018-04-01

    Research in the critically ill is complex owing to the heterogeneity of the patients, the difficulty of achieving representative sample sizes, and the number of variables simultaneously involved. However, the quantity and quality of records are high, as is the relevance of the variables used, such as survival. The methodological tools have evolved, offering new perspectives and analysis models that allow relevant information to be extracted from the data that accompany the critically ill patient. The need for training in methodology and the interpretation of results is an important challenge for intensivists who wish to stay abreast of research developments and clinical advances in Intensive Medicine. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.

  9. Methodology for building confidence measures

    Science.gov (United States)

    Bramson, Aaron L.

    2004-04-01

    This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements in determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
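
    One simple, standard way to combine independent source reliabilities into a single confidence level is a noisy-OR; the paper's actual combination rules are not reproduced here, so treat this as an illustrative assumption:

    ```python
    def combined_confidence(reliabilities):
        """P(at least one of several independent sources is correct)."""
        p_all_wrong = 1.0
        for p in reliabilities:
            p_all_wrong *= (1.0 - p)
        return 1.0 - p_all_wrong

    # Three corroborating documents with individual truth reliabilities:
    print(combined_confidence([0.6, 0.7, 0.5]))  # 0.94
    ```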

  10. Mo(ve)ment-methodology

    DEFF Research Database (Denmark)

    Mørck, Line Lerche; Christian Celosse-Andersen, Martin

    2018-01-01

    This paper describes the theoretical basis for and development of a moment-movement research methodology, based on an integration of critical psychological practice research and critical ethnographic social practice theory. Central theoretical conceptualizations, such as human agency, life conditions and identity formation, are discussed in relation to criminological theories of gang desistance. The paper illustrates how the mo(ve)ment methodology was applied in a study of comprehensive processes of identity (re)formation and gang exit processes. This study was conducted with Martin, a former... This is a moment which captures Martin’s complex and ambiguous feelings of conflictual concerns, frustration, anger, and a new feeling of insecurity in his masculinity, as well as engagement and a sense of deep meaningfulness as he becomes a more reflective academic. All these conflicting feelings also give...

  11. Design methodology of Dutch banknotes

    Science.gov (United States)

    de Heij, Hans A. M.

    2000-04-01

    Since the introduction of a design methodology for Dutch banknotes, the quality of Dutch paper currency has improved in more than one way. The methodology in question provides for (i) a design policy, which helps fix clear objectives; (ii) design management, to ensure smooth cooperation between the graphic designer, printer, papermaker and central bank; and (iii) a program of requirements, a banknote development guideline for all parties involved. This systematic approach enables an objective selection of design proposals, including security features. Furthermore, the project manager obtains regular feedback from the public by conducting market surveys. Each new design of a Netherlands Guilder banknote issued by the Nederlandsche Bank over the past 50 years has been an improvement on its predecessor in terms of value recognition, security and durability.

  12. Workshops as a Research Methodology

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Levinsen, Karin Tweddell

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on the latter, this paper presents five studies on upper secondary and higher education teachers’ professional development and on teaching and learning through video conferencing. Through analysis and discussion of these studies’ findings, we argue that workshops provide a platform that can aid researchers in identifying and exploring relevant factors in a given domain by providing means for understanding complex work and knowledge processes that are supported by technology (for example, e-learning). The approach supports identifying factors...

  13. Methodological challenges and lessons learned

    DEFF Research Database (Denmark)

    Nielsen, Poul Erik; Gustafsson, Jessica

    2017-01-01

    Taking as point of departure three recently conducted empirical studies, the aim of this article is to theoretically and empirically discuss methodological challenges in studying the interrelations between media and social reality and to critically reflect on the methodologies used in the studies. By deconstructing the studies, the article draws attention to the fact that different methods are able to grasp different elements of social reality. Moreover, by analysing the power relations at play, the article demonstrates that the interplay between interviewer and interviewee, and how both parties fit into present power structures, greatly influences the narratives that are co-produced during interviews. The article thus concludes that in order to fully understand complex phenomena it is not just enough to use a mixture of methods; the makeup of the research team is also imperative, as a diverse team...

  14. Implementation impacts of PRL methodology

    International Nuclear Information System (INIS)

    Caudill, J.A.; Krupa, J.F.; Meadors, R.E.; Odum, J.V.; Rodrigues, G.C.

    1993-02-01

    This report responds to a DOE-SR request to evaluate the impacts from implementation of the proposed Plutonium Recovery Limit (PRL) methodology. The PRL Methodology is based on cost minimization for decisions to discard or recover plutonium contained in scrap, residues, and other plutonium bearing materials. Implementation of the PRL methodology may result in decisions to declare as waste certain plutonium bearing materials originally considered to be a recoverable plutonium product. Such decisions may have regulatory impacts, because any material declared to be waste would immediately be subject to provisions of the Resource Conservation and Recovery Act (RCRA). The decision to discard these materials will have impacts on waste storage, treatment, and disposal facilities. Current plans for the de-inventory of plutonium processing facilities have identified certain materials as candidates for discard based upon the economic considerations associated with extending the operating schedules for recovery of the contained plutonium versus potential waste disposal costs. This report evaluates the impacts of discarding those materials as proposed by the F Area De-Inventory Plan and compares the De-Inventory Plan assessments with conclusions from application of the PRL. The impact analysis was performed for those materials proposed as potential candidates for discard by the De-Inventory Plan. The De-Inventory Plan identified 433 items, containing approximately 1% of the current SRS Pu-239 inventory, as not appropriate for recovery as the site moves to complete the mission of F-Canyon and FB-Line. The materials were entered into storage awaiting recovery as product under the Department's previous Economic Discard Limit (EDL) methodology which valued plutonium at its incremental cost of production in reactors. An application of Departmental PRLs to the subject 433 items revealed that approximately 40% of them would continue to be potentially recoverable as product plutonium

  15. ISE System Development Methodology Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hayhoe, G.F.

    1992-02-17

    The Information Systems Engineering (ISE) System Development Methodology Manual (SDM) is a framework of life cycle management guidelines that provide ISE personnel with direction, organization, consistency, and improved communication when developing and maintaining systems. These guidelines were designed to allow ISE to build and deliver Total Quality products, and to meet the goals and requirements of the US Department of Energy (DOE), Westinghouse Savannah River Company, and Westinghouse Electric Corporation.

  16. Environmental Testing Methodology in Biometrics

    OpenAIRE

    Fernández Saavedra, Belén; Sánchez Reíllo, Raúl; Alonso Moreno, Raúl; Miguel Hurtado, Óscar

    2010-01-01

    8-page document + 5-slide presentation. Contributed to: 1st International Biometric Performance Conference (IBPC 2010, NIST, Gaithersburg, MD, US, Mar 1-5, 2010). Recently, biometrics has come to be used in many security systems, and these systems can be located in different environments. As many experts claim and previous works have demonstrated, environmental conditions influence biometric performance. Nevertheless, there is at present no specific methodology for testing this influence...

  17. Soft systems methodology: other voices

    OpenAIRE

    Holwell, Sue

    2000-01-01

    This issue of Systemic Practice and Action Research, celebrating the work of Peter Checkland and in particular the nature and development of soft systems methodology (SSM), would not have happened had the work not been seen by others as important. No significant contribution to thinking happens without a secondary literature developing. Not surprisingly, many commentaries have accompanied the ongoing development of SSM. Some of these are insightful, some full of errors, and some include both...

  18. Systems engineering agile design methodologies

    CERN Document Server

    Crowder, James A

    2013-01-01

    This book examines the paradigm of the engineering design process. The authors discuss agile systems and engineering design. The book captures the entire design process (function bases), context, and requirements to effect real reuse. It provides a methodology for an engineering design process foundation for modern and future systems design. This book captures design patterns with context for actual Systems Engineering Design Reuse and contains a new paradigm in Design Knowledge Management.

  19. Methodological remarks on contraction theory

    DEFF Research Database (Denmark)

    Jouffroy, Jerome; Slotine, Jean-Jacques E.

    Because contraction analysis stems from a differential and incremental framework, the nature and methodology of contraction-based proofs are significantly different from those of their Lyapunov-based counterparts. This paper specifically studies this issue, and illustrates it by revisiting some classical examples traditionally addressed using Lyapunov theory. Even in these cases, contraction tools can often yield significantly simplified analysis. The examples include adaptive control, robotics, and a proof of convergence of the deterministic Extended Kalman Filter.

  20. Artificial Intelligence Techniques and Methodology

    OpenAIRE

    Carbonell, Jaime G.; Sleeman, Derek

    1982-01-01

    Two closely related aspects of artificial intelligence that have received comparatively little attention in the recent literature are research methodology, and the analysis of computational techniques that span multiple application areas. We believe both issues to be increasingly significant as Artificial Intelligence matures into a science and spins off major application efforts. It is imperative to analyze the repertoire of AI methods with respect to past experience, utility in new domains,...

  1. Developing Foucault's Discourse Analytic Methodology

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2006-01-01

    A methodological position for a FOUCAULTian discourse analysis is presented. A sequence of analytical steps is introduced and an illustrating example is offered. It is emphasized that discourse analysis has to discover the system-level of discursive rules and the deeper structure of the discursive formation; otherwise the analysis will remain unfinished. Michel FOUCAULT's work is theoretically grounded in French structuralism and so-called post-structuralism. In this paper, post-structuralism is conceived not as a means of overcoming structuralism, but as a way of critically continuing the structural perspective. In this way, discursive structures can be related to discursive practices and the concept of structure can be disclosed (e.g. to inter-discourse or DERRIDA's concept of structurality). The structural methodology is thereby continued and radicalized, but not given up. In this paper, FOUCAULT's theory is combined with the works of Michel PÊCHEUX and (especially for the sociology of knowledge and the sociology of culture) Pierre BOURDIEU. The practice of discourse analysis is theoretically grounded: this practice can be conceived as a reflexive coupling of deconstruction and reconstruction in the material to be analyzed. This methodology can therefore be characterized as a reconstructive qualitative methodology. At the end of the article, forms of discourse analysis are criticized that do not intend to recover the system level of discursive rules and the deeper structure of the discursive formation (i.e. episteme, socio-episteme). These forms are merely commentaries on discourses (not analyses of them); they remain phenomenological and are therefore pre-structuralist. URN: urn:nbn:de:0114-fqs060168

  2. Energy Efficiency Indicators Methodology Booklet

    Energy Technology Data Exchange (ETDEWEB)

    Sathaye, Jayant; Price, Lynn; McNeil, Michael; de la rue du Can, Stephane

    2010-05-01

    This Methodology Booklet provides a comprehensive review of, and guiding principles for, constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers to assess changes in energy efficiency over time. Building on past OECD experience and best practices, and on the knowledge of these countries' institutions, relevant sources of information for constructing an energy indicator database are identified. A framework based on levels of a hierarchy of indicators -- spanning from aggregate, macro-level to disaggregated, end-use-level metrics -- is presented to help shape the understanding and assessment of energy efficiency. For each sector of activity (industry, commercial, residential, agriculture and transport), indicators are presented and recommendations for distinguishing the different factors affecting energy use are highlighted. The methodology booklet specifically addresses issues that are relevant to developing indicators where activity is a major factor driving energy demand. A companion spreadsheet tool is available upon request.

  3. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  4. Methodology for technical risk assessment

    International Nuclear Information System (INIS)

    Waganer, L.M.; Zuckerman, D.S.

    1983-01-01

    A methodology has been developed for and applied to the assessment of the technical risks associated with an evolving technology. This methodology, originally developed for fusion by K. W. Billman and F. R. Scott at EPRI, has been applied to assess the technical risk of a fuel system for a fusion reactor. Technical risk is defined as the risk that a particular technology or component which is currently under development will not achieve a set of required technical specifications (i.e. probability of failure). The individual steps in the technical risk assessment are summarized. The first step in this methodology is to clearly and completely quantify the technical requirements for the particular system being examined. The next step is to identify and define subsystems and various options which appear capable of achieving the required technical performance. The subsystem options are then characterized regarding subsystem functions, interface requirements with the subsystems and systems, important components, developmental obstacles and technical limitations. Key technical subsystem performance parameters are identified which directly or indirectly relate to the system technical specifications. Past, existing and future technical performance data from subsystem experts are obtained by using a Bayesian Interrogation technique. The input data is solicited in the form of probability functions. Thus the output performance of the system is expressed as probability functions

  5. Methodology for astronaut reconditioning research.

    Science.gov (United States)

    Beard, David J; Cook, Jonathan A

    2017-01-01

    Space medicine offers some unique challenges, especially in terms of research methodology. A specific challenge for astronaut reconditioning involves identifying which aspects of terrestrial research methodology hold and which require modification. This paper reviews this area and presents appropriate solutions where possible. It is concluded that spaceflight rehabilitation research should remain question/problem driven and is broadly similar to its terrestrial equivalent in small populations, such as those with rare diseases, and in various sports. Astronauts and Medical Operations personnel should be involved at all levels to ensure the feasibility of research protocols. There is room for creative and hybrid methodology, but careful systematic observation is likely to be more achievable and fruitful than complex trial-based comparisons. Collaboration among space agencies will be critical to pool data from small groups of astronauts, with the accepted use of standardised outcome measures across all agencies. Systematic reviews will be an essential component. Most limitations relate to the inherently small sample size available for human spaceflight research. Early adoption of a co-operative model for spaceflight rehabilitation research is therefore advised. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Methodology for ranking restoration options

    International Nuclear Information System (INIS)

    Hedemann Jensen, Per

    1999-04-01

    The work described in this report has been performed as a part of the RESTRAT Project FI4P-CT95-0021a (PL 950128) co-funded by the Nuclear Fission Safety Programme of the European Commission. The RESTRAT project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated with radioactive materials as a result of the operation of these installations. The areas considered for remedial measures include contaminated land areas, rivers and sediments in rivers, lakes, and sea areas. Five contaminated European sites have been studied. Various remedial measures have been envisaged with respect to the optimisation of the protection of the populations being exposed to the radionuclides at the sites. Cost-benefit analysis and multi-attribute utility analysis have been applied for optimisation. Health, economic and social attributes have been included and weighting factors for the different attributes have been determined by the use of scaling constants. (au)
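
    A minimal sketch of additive multi-attribute utility ranking of the kind described above; the weights (scaling constants), options and scores below are hypothetical, not the project's actual values:

    ```python
    WEIGHTS = {"health": 0.5, "economic": 0.3, "social": 0.2}  # scaling constants

    options = {
        "soil removal": {"health": 0.9, "economic": 0.2, "social": 0.6},
        "capping":      {"health": 0.6, "economic": 0.7, "social": 0.5},
        "no action":    {"health": 0.1, "economic": 1.0, "social": 0.3},
    }

    def utility(scores):
        # additive multi-attribute utility: weighted sum of attribute utilities
        return sum(WEIGHTS[a] * scores[a] for a in WEIGHTS)

    # Rank restoration options from best to worst overall utility
    for name, scores in sorted(options.items(), key=lambda kv: -utility(kv[1])):
        print(f"{name}: {utility(scores):.2f}")
    ```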

  7. Survey of Dynamic PSA Methodologies

    International Nuclear Information System (INIS)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung; Kim, Taewan

    2015-01-01

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic developments in digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so risk analysis can be conducted with the large amount of data actually available. One method which can take advantage of these technological developments is dynamic PSA, in which conventional ET/FT models can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are considerable, it has seemed less interesting from the industrial and regulatory viewpoint. The authors expect this work can contribute to a better understanding of dynamic PSA in terms of algorithms, practice, and applicability. In this paper, an overview of dynamic PSA is provided. Most of the methodologies share similar concepts; among them, DDET seems to be a backbone for most of the methodologies, since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering
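
    A minimal sketch of the discrete dynamic event tree (DDET) idea mentioned above: branch on component outcomes at each discrete time point and truncate low-probability sequences. The branching model and numbers are illustrative only, not any specific tool's algorithm:

    ```python
    def expand(t, state, prob, horizon, branch_fn, results, cutoff=1e-8):
        """Recursively enumerate sequences; prune below the probability cutoff."""
        if t == horizon or prob < cutoff:
            results.append((tuple(state), prob))
            return
        for outcome, p in branch_fn(t, state):
            expand(t + 1, state + [outcome], prob * p, horizon, branch_fn, results, cutoff)

    def branch_fn(t, state):
        # at every time step the modelled component either keeps working or fails
        return [("ok", 0.999), ("fail", 0.001)]

    results = []
    expand(0, [], 1.0, horizon=3, branch_fn=branch_fn, results=results)
    print(len(results), sum(p for _, p in results))  # 8 sequences, total prob 1.0
    ```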

  8. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic developments in digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so risk analysis can be conducted with the large amount of data actually available. One method which can take advantage of these technological developments is dynamic PSA, in which conventional ET/FT models can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are considerable, it has seemed less interesting from the industrial and regulatory viewpoint. The authors expect this work can contribute to a better understanding of dynamic PSA in terms of algorithms, practice, and applicability. In this paper, an overview of dynamic PSA is provided. Most of the methodologies share similar concepts; among them, DDET seems to be a backbone for most of the methodologies, since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  9. Prioritization methodology for chemical replacement

    Science.gov (United States)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States federal legislation has required ozone-depleting chemicals (Class 1 and 2) to be banned from production, the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated to develop a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. The workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritizing replacement technology. The main objective is to provide a GUIDELINE to help direct the research for replacement technology. The approach to prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used to determine numerical values corresponding to the concerns raised and their respective importance to the process. The workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research on chemical replacement technology. Originally, the workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology
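
    A minimal sketch of a QFD-style scoring matrix of the kind the workbook describes: each concern carries an importance weight, each process gets a relationship strength per concern, and the priority score is the weighted sum. All concerns, weights and strengths below are hypothetical:

    ```python
    concerns = ["ozone regulation", "worker safety", "process criticality"]
    importance = [9, 5, 7]                      # importance weight per concern

    # relationship strengths per process, in the common QFD 0/1/3/9 scale
    processes = {
        "vapor degreasing (CFC-113)": [9, 3, 9],
        "hand-wipe cleaning (TCA)":   [9, 9, 3],
        "foam blowing (HCFC)":        [3, 1, 3],
    }

    def priority(strengths):
        return sum(w * s for w, s in zip(importance, strengths))

    # Higher score = more urgent candidate for replacement research
    for name, s in sorted(processes.items(), key=lambda kv: -priority(kv[1])):
        print(f"{name}: {priority(s)}")
    ```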

  10. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: A quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of the various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level, this scaling determines the effects various processes have on a state variable, and it ranks the processes according to their importance by the magnitude of the fractional change they cause in that state variable. At the system level, the scaling determines the governing processes and corresponding components, ranking these in order of importance according to their effect on the fractional change of system-wide state variables. The scaling methodology reveals on all levels the fractional change of state variables and is therefore called the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of the process transport rate to the content of the preserved quantity in a component. The effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering the effect metrics of a component provides the hierarchy of processes in that component, then in all components and the system. FSM separates quantitatively dominant from minor processes and components and
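
    The effect metric lends itself to a worked example. A minimal sketch computing ω = transport rate / content and Ω = ωt, then ranking processes by |Ω|; the process names and numbers are hypothetical, only the arithmetic follows the definition in the abstract:

    ```python
    t_char = 50.0  # characteristic time of the scenario phase, s

    # (process name, transport rate of the conserved quantity, content of that
    #  quantity in the component) -- consistent units, e.g. kg/s and kg
    processes = [
        ("break flow",    -40.0, 2000.0),
        ("ECC injection",  25.0, 2000.0),
        ("flashing",      -10.0, 2000.0),
    ]

    ranked = sorted(
        ((name, (rate / content) * t_char) for name, rate, content in processes),
        key=lambda x: -abs(x[1]),   # dominant processes first
    )
    for name, Omega in ranked:
        print(f"{name}: Omega = {Omega:+.2f}")
    ```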

  11. Radioisotope methodology course radioprotection aspects

    International Nuclear Information System (INIS)

    Bergoc, R.M.; Caro, R.A.; Menossi, C.A.

    1996-01-01

    The advancement of knowledge in molecular and cell biology, biochemistry, medicine and pharmacology, which has taken place during the last 50 years, since the end of World War II, is truly outstanding. It can safely be said that this is principally due to the application of radioisotope techniques. Research on metabolism, the biodistribution of pharmaceuticals, pharmacodynamics, etc., is mostly carried out by means of techniques employing radioactive materials. Radioisotopes and radiation are frequently used in medicine, both as diagnostic and therapeutic tools. Radioimmunoassay is today a routine method in endocrinology and in general clinical medicine. Receptor determination and characterization is a steadily growing methodology used in clinical biochemistry, pharmacology and medicine. The use of radiopharmaceuticals and radiation of different origins for therapeutic purposes should not be overlooked. For these reasons, the importance of teaching radioisotope methodology is steadily growing. This is principally the case for specialization at the post-graduate level, but in the pre-graduate curriculum it is also worthwhile to give some elementary theoretical and practical notions on this subject. These observations are justified by more than 30 years of teaching experience at both levels at the School of Pharmacy and Biochemistry of the University of Buenos Aires, Argentina. In 1960 we began to teach Physics III, an obligatory pre-graduate course for biochemistry students, in which some elementary notions of radioactivity and measurement techniques were given. Successive modifications of the biochemistry pre-graduate curriculum incorporated radiochemistry as an elective subject and, since 1978, radioisotope methodology as an obligatory subject for biochemistry students. This subject is given at the radioisotope laboratory during the first semester of each year and its objective is to provide theoretical and practical knowledge to the biochemistry students, even

  12. Methodologies for tracking learning paths

    DEFF Research Database (Denmark)

    Frølunde, Lisbeth; Gilje, Øystein; Lindstrand, Fredrik

    2009-01-01

    filmmakers: what furthers their interest and/or hinders it, and what learning patterns emerge. The aim of this article is to present and discuss issues regarding the methodology and methods of the study, such as developing a relationship with interviewees when conducting interviews online (using MSN). We suggest two considerations about using online interviews: how the interviewees value the given subject of conversation and their familiarity with being online. The benefit of online communication with the young filmmakers is ease, because it is both practical and appropriates a meeting......

  13. Continuous culture apparatus and methodology

    International Nuclear Information System (INIS)

    Conway, H.L.

    1975-01-01

    At present, we are investigating the sorption of potentially toxic trace elements by phytoplankton under controlled laboratory conditions. Continuous culture techniques were used to study the mechanism of the sorption of the trace elements by unialgal diatom populations and the factors influencing this sorption. Continuous culture methodology has been used extensively to study bacterial kinetics. It is an excellent technique for obtaining a known physiological state of phytoplankton populations. An automated method for the synthesis of continuous culture medium for use in these experiments is described

  14. Chapter three: methodology of exposure modeling

    CSIR Research Space (South Africa)

    Moschandreas, DJ

    2002-12-01

    Full Text Available methodologies and models are reviewed. Three exposure/measurement methodologies are assessed. Estimation methods focus on source evaluation and attribution; sources include those outdoors and indoors, as well as in occupational and in-transit environments. Fate...

  15. Methodological Issues and Practices in Qualitative Research.

    Science.gov (United States)

    Bradley, Jana

    1993-01-01

    Discusses methodological issues concerning qualitative research and describes research practices that qualitative researchers use to address these methodological issues. Topics discussed include the researcher as interpreter, the emergent nature of qualitative research, understanding the experience of others, trustworthiness in qualitative…

  16. Audit Methodology for IT Governance

    Directory of Open Access Journals (Sweden)

    Mirela GHEORGHE

    2010-01-01

    Full Text Available The continuous development of new IT technologies has been followed by their rapid integration at the organization level. The management of organizations faces a new challenge: structural redefinition of the IT component in order to create added value and to minimize IT risks through efficient management of all IT resources of the organization. These changes have had a great impact on the governance of the IT component. The paper proposes an audit methodology for IT Governance at the organization level. From this point of view, the proposed audit strategy is risk-based, enabling the IT auditor to examine the efficiency and effectiveness of the IT Governance structure from the best angle. The evaluation of the risks associated with IT Governance is a key process in planning the audit mission, as it allows the identification of the segments with increased risk. With no ambition for completeness, the proposed methodology provides the auditor a useful tool for accomplishing his mission.

  17. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of the research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application to lending operations in the Republic of Serbia. With the developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction presents the bank's analysis of the loan applicant, carried out in order to minimize and manage credit risk. Examining the subject matter, the processing of the credit application is described, along with the procedure for analyzing the financial statements in order to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework is presented as applied in a concrete company. In the third part, models are presented which banks should use to protect against risk exposure, i.e. to reduce losses on loan operations in our country, as well as to adjust to market conditions in an optimal way.

  18. Safeguarding the fuel cycle: Methodologies

    International Nuclear Information System (INIS)

    Gruemm, H.

    1984-01-01

    The effectiveness of IAEA safeguards is characterized by the extent to which they achieve their basic purpose - credible verification that no nuclear material is diverted from peaceful uses. This effectiveness depends, among other things, significantly on manpower in terms of the number and qualifications of inspectors. Staff increases will be required to improve effectiveness further, if this is requested by Member States, as well as to take into account new facilities expected to come under safeguards in the future. However, they are difficult to achieve due to financial constraints set by the IAEA budget. As a consequence, much has been done and is being undertaken to improve the utilization of available manpower, including standardization of inspection procedures; improvement of management practices and training; rationalization of planning, reporting, and evaluation of inspection activities; and development of new equipment. This article focuses on certain aspects of the verification methodology presently used and asks: are any modifications of this methodology conceivable that would lead to economies of manpower without loss of effectiveness? It has been stated in this context that present safeguards approaches are 'facility-oriented' and that the adoption of a 'fuel cycle-oriented approach' might bring about the desired savings. Many studies have been devoted to this very interesting suggestion. Up to this moment, no definite answer is available and further studies will be necessary to come to a conclusion. In what follows, the essentials of the problem are explained and some possible paths to a solution are discussed

  19. Methodology for combining dynamic responses

    International Nuclear Information System (INIS)

    Cudlin, R.; Hosford, S.; Mattu, R.; Wichman, K.

    1978-09-01

    The NRC has historically required that the structural/mechanical responses due to various accident loads and loads caused by natural phenomena (such as earthquakes) be combined when analyzing structures, systems, and components important to safety. Several approaches to account for the potential interaction of loads resulting from accidents and natural phenomena have been used. One approach, the so-called absolute or linear summation (ABS) method, linearly adds the peak structural responses due to the individual dynamic loads. In general, the ABS method has also reflected the staff's conservative preference for the combination of dynamic load responses. A second approach, referred to as SRSS, yields a combined response equal to the square root of the sum of the squares of the peak responses due to the individual dynamic loads. The lack of a physical relationship between some of the loads has raised questions as to the proper methodology to be used in the design of nuclear power plants. An NRR Working Group was constituted to examine load combination methodologies and to develop a recommendation concerning criteria or conditions for their application. Evaluations of and recommendations on the use of the ABS and SRSS methods are provided in the report
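
    The two combination rules reduce to one-line formulas: ABS adds the absolute peak responses, R = Σ|R_i|, while SRSS takes R = √(Σ R_i²). A minimal sketch with made-up peak responses shows why ABS is the more conservative choice:

    ```python
    # Hedged sketch of the two dynamic-response combination rules discussed
    # above; the individual peak responses are made-up numbers.
    import math

    peaks = [120.0, 80.0, 35.0]  # peak responses from individual dynamic loads

    abs_combined  = sum(abs(r) for r in peaks)            # linear (ABS) sum
    srss_combined = math.sqrt(sum(r * r for r in peaks))  # square root of sum of squares

    print(f"ABS:  {abs_combined:.1f}")    # 235.0 - the more conservative result
    print(f"SRSS: {srss_combined:.1f}")   # ~148.4
    ```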

  20. Information technology security system engineering methodology

    Science.gov (United States)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  1. Reconciling Anti-essentialism and Quantitative Methodology

    DEFF Research Database (Denmark)

    Jensen, Mathias Fjællegaard

    2017-01-01

    Quantitative methodology has a contested role in feminist scholarship which remains almost exclusively qualitative. Considering Irigaray’s notion of mimicry, Spivak’s strategic essentialism, and Butler’s contingent foundations, the essentialising implications of quantitative methodology may prove...... the potential to reconcile anti-essentialism and quantitative methodology, and thus, to make peace in the quantitative/qualitative Paradigm Wars....

  2. 42 CFR 441.472 - Budget methodology.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Budget methodology. 441.472 Section 441.472 Public... Self-Directed Personal Assistance Services Program § 441.472 Budget methodology. (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets the...

  3. Application of agile methodologies in software development

    Directory of Open Access Journals (Sweden)

    Jovanović Aca D.

    2016-01-01

    Full Text Available The paper presents the potentials for the development of software using agile methodologies. Special consideration is devoted to the potentials and advantages of use of the Scrum methodology in the development of software and the relationship between the implementation of agile methodologies and the software development projects.

  4. The epistemology of mathematical and statistical modeling: a quiet methodological revolution.

    Science.gov (United States)

    Rodgers, Joseph Lee

    2010-01-01

    A quiet methodological revolution, a modeling revolution, has occurred over the past several decades, almost without discussion. In contrast, the 20th century ended with contentious argument over the utility of null hypothesis significance testing (NHST). The NHST controversy may have been at least partially irrelevant, because in certain ways the modeling revolution obviated the NHST argument. I begin with a history of NHST and modeling and their relation to one another. Next, I define and illustrate principles involved in developing and evaluating mathematical models. Following, I discuss the difference between using statistical procedures within a rule-based framework and building mathematical models from a scientific epistemology. Only the former is treated carefully in most psychology graduate training. The pedagogical implications of this imbalance and the revised pedagogy required to account for the modeling revolution are described. To conclude, I discuss how attention to modeling implies shifting statistical practice in certain progressive ways. The epistemological basis of statistics has moved away from being a set of procedures, applied mechanistically, and moved toward building and evaluating statistical and scientific models. Copyright 2009 APA, all rights reserved.

  5. Monitoring of metabolites and by-products in a down-scaled industrial lager beer fermentation

    OpenAIRE

    Sjöström, Fredrik

    2013-01-01

    The sugar composition of the wort and how these sugars are utilised by the yeast affect the organoleptic properties of the beer. Monitoring the saccharides in the wort before inoculation and during fermentation is important in the modern brewing industry. Reducing the duration of the brewing process is valuable and can be achieved by reducing the fermentation time through an increase in temperature. However, this must be done without changing the quality and characteristics of the end product, anothe...

  6. Parametric analysis of a down-scaled turbo jet engine suitable for drone and UAV propulsion

    Science.gov (United States)

    Wessley, G. Jims John; Chauhan, Swati

    2018-04-01

    This paper presents a detailed study of the need for downscaling gas turbine engines for UAV and drone propulsion, together with the downscaling procedure and a parametric analysis of a downscaled engine using the Gas Turbine Simulation Program software GSP 11. The goal is a micro gas turbine engine in the thrust range of 0.13 to 4.45 kN to power UAVs and drones weighing 4.5 to 25 kg; to meet this requirement, a parametric analysis of a scaled-down Allison J33-A-35 turbojet engine is performed. The analysis shows that the thrust developed by the scaled engine and its Thrust Specific Fuel Consumption (TSFC) depend on the pressure ratio, the air mass flow rate and the Mach number. A scaling factor of 0.195, corresponding to an air mass flow rate of 7.69 kg/s, produces a thrust in the range of 4.57 to 5.6 kN when operating at a Mach number of 0.3 at altitudes of 5000 to 9000 m. The thermal and overall efficiencies of the scaled engine are found to be 67% and 75%, respectively, for a pressure ratio of 2. The outcomes of this analysis form a strong base for further analysis, design and fabrication of micro gas turbine engines to propel future UAVs and drones.
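
    To first order, thrust scales in proportion to air mass flow at fixed cycle conditions, so a scaling factor applied to the baseline mass flow gives a quick thrust estimate. The sketch below illustrates that first-order reasoning only; the baseline mass flow is implied by the abstract (7.69 kg/s at scale 0.195), while the baseline thrust is an assumed round figure, not an output of the paper's GSP 11 analysis.

    ```python
    # First-order downscaling sketch: thrust assumed proportional to air mass
    # flow at fixed cycle conditions. Baseline figures are assumptions, not
    # outputs of the GSP 11 analysis in the paper.

    baseline_mass_flow = 39.4   # kg/s, implied by 7.69 kg/s at scale 0.195
    baseline_thrust    = 20.5   # kN, assumed sea-level static thrust for the J33-A-35

    def scaled_engine(scale):
        mass_flow = scale * baseline_mass_flow
        thrust    = scale * baseline_thrust   # thrust ~ mass flow (first order)
        return mass_flow, thrust

    for scale in (0.1, 0.195, 0.3):
        m_dot, f = scaled_engine(scale)
        print(f"scale {scale}: {m_dot:.2f} kg/s -> ~{f:.2f} kN")
    ```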

  7. Methodology for flammable gas evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Hopkins, J.D., Westinghouse Hanford

    1996-06-12

    There are 177 radioactive waste storage tanks at the Hanford Site. The waste generates flammable gases. The waste releases gas continuously, but in some tanks the waste has shown a tendency to trap these flammable gases. When enough gas is trapped in a tank's waste matrix, it may be released in a way that renders part or all of the tank atmosphere flammable for a period of time. Tanks must be evaluated against previously defined criteria to determine whether they can present a flammable gas hazard. This document presents the methodology for evaluating tanks in two areas of concern in the tank headspace: the steady-state flammable-gas concentration resulting from continuous release, and the concentration resulting from an episodic gas release.
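
    For the steady-state concern, a generic well-mixed headspace balance (a textbook simplification, not the Hanford evaluation criteria) gives C_ss = Q_gas / (Q_gas + Q_vent): the equilibrium gas fraction is the release rate divided by the total flow leaving the headspace. A hedged sketch with invented numbers:

    ```python
    # Generic well-mixed headspace balance (illustrative only; not the
    # Hanford criteria): steady-state flammable gas fraction from a
    # continuous release into a ventilated headspace.

    def steady_state_fraction(q_gas, q_vent):
        """q_gas: flammable gas release rate; q_vent: ventilation flow (same units)."""
        return q_gas / (q_gas + q_vent)

    q_gas  = 0.02   # m^3/h, assumed continuous hydrogen release
    q_vent = 10.0   # m^3/h, assumed headspace ventilation flow
    lfl_h2 = 0.04   # hydrogen lower flammability limit (~4 vol%)

    c_ss = steady_state_fraction(q_gas, q_vent)
    print(f"steady-state fraction: {c_ss:.4%} ({c_ss / lfl_h2:.1%} of the LFL)")
    ```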

  8. Methodological Reflections: Inter- ethnic Research

    DEFF Research Database (Denmark)

    Singla, Rashmi

    2010-01-01

    This article reflects on the methodological and epistemological aspects of the ethical issues involved in encounters between researcher and research participants with ethnic minority background in contexts with diversity. Specific challenges involved in longitudinal research (10-15 years...) are also considered, as are issues related to the social relevance of the research deriving from psycho-political validity, implying consideration of power dynamics in the personal, relational and collective domains. The primary basis for these reflections is a follow-up study concerning young... with both youth and the parental generation with ethnic minority background in Denmark. These reflections include implications and challenges related to the researcher's national, ethnic background and educational, professional position in encounters with diverse 'researched persons' such as youth...

  9. Simulation enabled safeguards assessment methodology

    International Nuclear Information System (INIS)

    Bean, Robert; Bjornard, Trond; Larson, Tom

    2007-01-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime, will be needed. There is good reason to believe virtual engineering, applied to the facility design as well as to the safeguards system design, will reduce total project cost and improve efficiency in the design cycle. The Simulation Enabled Safeguards Assessment MEthodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wire-frame construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed. (authors)

  10. Simulation Enabled Safeguards Assessment Methodology

    International Nuclear Information System (INIS)

    Robert Bean; Trond Bjornard; Thomas Larson

    2007-01-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment Methodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wireframe construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed

  11. Sustainable Innovation and Entrepreneurship Methodology

    DEFF Research Database (Denmark)

    Celik, Sine; Joore, Peter; Christodoulou, Panayiotis

    The objective of the InnoLabs project is to facilitate cross-sectoral, multidisciplinary solutions to complex social problems in various European settings. InnoLabs are university-driven physical and/or organizational spaces that function as student innovation laboratories and operate as a local or regional “co-creation platform for sustainable solutions” to promote structural innovation. In this manual, the Sustainable Innovation and Entrepreneurship Methodology is described. The organisational guidelines mainly take their point of departure in how Aalborg University (AAU) in Denmark has organised this in daily practice. In line with the objectives of the InnoLabs project (output 05), partners in the InnoLabs project have reflected on, evaluated and concluded the project experiences, which are described in this report. The InnoLabs project was developed for the 2014 call of Erasmus+ funds KA2- Cooperation...

  12. Methodologies for 2011 economic reports

    DEFF Research Database (Denmark)

    Nielsen, Rasmus

    STECF’s Expert Working Group 11-03 convened in Athens (28th March – 1st April, 2011) to discuss and seek agreement on the content, indicators, methodologies and format of the 2011 Annual Economic Reports (AER) on the EU fishing fleet, the fish processing and the aquaculture sectors. Proposals...... for improved contents and the overall structure were discussed. Templates for the national and EU overview chapters for the EU the fish processing and the aquaculture sectors were produced. Indicators for the EU fishing fleet and fish processing reports were reviewed; new indicators for the fish processing...... and the aquaculture sector reports were proposed. And topics of special interest were proposed for all three reports....

  13. Neural Networks Methodology and Applications

    CERN Document Server

    Dreyfus, Gérard

    2005-01-01

    Neural networks represent a powerful data processing technique that has reached maturity and broad application. When clearly understood and appropriately used, they are a mandatory component in the toolbox of any engineer who wants to make the best use of the available data in order to build models, make predictions, mine data, recognize shapes or signals, etc. Ranging from theoretical foundations to real-life applications, this book is intended to provide engineers and researchers with clear methodologies for taking advantage of neural networks in industrial, financial or banking applications, many instances of which are presented in the book. For the benefit of readers wishing to gain deeper knowledge of the topics, the book features appendices that provide theoretical details for greater insight, and algorithmic details for efficient programming and implementation. The chapters have been written by experts and seamlessly edited to present a coherent and comprehensive, yet not redundant, practically-oriented...

  14. Stakeholder analysis methodologies resource book

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, W.M.; Farhar, B.C.

    1994-03-01

    Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and the stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased, and adverse social impacts mitigated, when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, 'uncooperative' stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.

  15. Inventory differences: An evaluation methodology

    International Nuclear Information System (INIS)

    Heinberg, C.L.; Roberts, N.J.

    1987-01-01

    This paper discusses an evaluation methodology used for inventory differences at the Los Alamos National Laboratory. It is recognized that there are various methods which can be, and are being, used to evaluate process inventory differences at DOE facilities. The purpose of this paper is to share our thoughts on the subject, and our techniques, with those who are responsible for the evaluation of inventory differences at their facility. One of the most dangerous aspects of any evaluation technique, especially one as complex as most inventory difference evaluations tend to be, is failing to regard the tools being used as mere indicators. There is a tendency to treat the result of an evaluation by one technique as an absolute. At the Los Alamos National Laboratory, several tools are used, and the final evaluation is based on a combination of the observed results of a many-faceted evaluation. The tools used and some examples are presented

  16. Methodology of formal software evaluation

    International Nuclear Information System (INIS)

    Tuszynski, J.

    1998-01-01

    Sydkraft AB, the major Swedish utility, owner of ca 6000 MW(el) installed in nuclear (NPP Barsebaeck and NPP Oskarshamn), fossil fuel and hydro power plants, is facing modernization of the control systems of its plants. Applicable standards require structured, formal methods for implementation of the control functions in modern, real-time software systems. This presentation introduces the implementation methodology as presently discussed at the Sydkraft organisation. The suggested approach is based upon a process of co-operation between three parties taking part in the implementation: the owner of the plant, the vendor, and the Quality Assurance (QA) organisation. QA will be based on tools for formal software validation and on systematic gathering by the owner of validated and proved-by-operation control modules for concern-wide utilisation. (author)

  17. Butterfly valve torque prediction methodology

    International Nuclear Information System (INIS)

    Eldiwany, B.H.; Sharma, V.; Kalsi, M.S.; Wolfe, K.

    1994-01-01

    As part of the Motor-Operated Valve (MOV) Performance Prediction Program, the Electric Power Research Institute has sponsored the development of methodologies for predicting thrust and torque requirements of gate, globe, and butterfly MOVs. This paper presents the methodology that will be used by utilities to calculate the dynamic torque requirements for butterfly valves. The total dynamic torque at any disc position is the sum of the hydrodynamic torque, the bearing torque (which is induced by the hydrodynamic force), and other small torque components (such as packing torque). The hydrodynamic torque on the valve disc, caused by the fluid flow through the valve, depends on the disc angle, flow velocity, upstream flow disturbances, disc shape, and the disc aspect ratio. The butterfly valve model provides sets of nondimensional flow and torque coefficients that can be used to predict flow rate and hydrodynamic torque throughout the disc stroke and to calculate the required actuation torque and the maximum transmitted torque throughout the opening and closing stroke. The scope of the model includes symmetric and nonsymmetric discs of different shapes and aspect ratios in compressible and incompressible fluid applications under both choked and nonchoked flow conditions. The model features were validated against test data from a comprehensive flow-loop and in situ test program. These tests were designed to systematically address the effect of the following parameters on the required torque: valve size, disc shapes and disc aspect ratios, upstream elbow orientation and its proximity, and flow conditions. The applicability of the nondimensional coefficients to valves of different sizes was validated by performing tests on a 42-in. valve and a precisely scaled 6-in. model. The butterfly valve model torque predictions were found to bound test data from the flow-loop and in situ testing, as shown in the examples provided in this paper
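
    One common way to apply such nondimensional coefficients, sketched below with invented numbers, is to recover the hydrodynamic torque as T_h = C_t·ΔP·D³ and add the bearing torque induced by the hydrodynamic force acting at the shaft radius. The coefficient values and the exact nondimensionalization of the EPRI model are not reproduced here; this is a generic illustration only.

    ```python
    # Illustrative use of nondimensional torque coefficients for a butterfly
    # valve (invented numbers; not the EPRI MOV model's actual coefficients).
    # One common form: T_hydro = Ct * dP * D**3, with the bearing torque from
    # the hydrodynamic force acting at the shaft radius.

    def total_torque(ct, cf, d_p, disc_dia, shaft_dia, mu, t_packing):
        t_hydro   = ct * d_p * disc_dia**3          # hydrodynamic torque [N*m]
        f_hydro   = cf * d_p * disc_dia**2          # hydrodynamic force on disc [N]
        t_bearing = mu * f_hydro * shaft_dia / 2.0  # friction torque at the shaft [N*m]
        return t_hydro + t_bearing + t_packing

    # Hypothetical 6-in. (0.152 m) valve at a mid-stroke disc angle:
    print(total_torque(ct=0.08, cf=0.9, d_p=5.0e5, disc_dia=0.152,
                       shaft_dia=0.025, mu=0.15, t_packing=15.0))
    ```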

  18. System Anthropological Psychology: Methodological Foundations

    Directory of Open Access Journals (Sweden)

    Vitaliy Y. Klochko

    2012-01-01

    Full Text Available The article considers the methodological foundations of system anthropological psychology (SAP) as a scientific branch developed by a well-represented group of Siberian scientists. SAP is a theory based on the axiomatics of the cultural-historical psychology of L.S. Vygotsky and on transspective analysis as a specially developed means to define the tendencies of science developing as a self-organizing system. Transspective analysis has revealed regularities in the constantly growing complexity of professional-psychological thinking along the course of emergence of scientific cognition. It has shown that the field of modern psychology is shaped by theories constructed with ideation of different grades of complexity. The concept of 'dynamics of the paradigm of science' is introduced; it allows transitions to be acknowledged from the ordinary-binary logic characteristic of classical science to a binary-ternary logic, adequate to non-classical science, and then to a ternary-multidimensional logic, which is now at the stage of emergence. The latter is employed in the construction of SAP. It involves the following basic methodological principles: the principle of directed (selective) interaction and the principle of the generative effect of selective interaction. The concept of 'complementary interaction', applied in the natural as well as the humanitarian sciences, is reconsidered in the context of psychology. The conclusion is made that the principle of selectivity and directedness of interaction is relevant to the whole Universe, embracing all kinds of systems including living ones. Different levels of matter organization, representing semantic structures of various complexity, use one and the same principle of meaning making through which the Universe ensures its sustainability as a self-developing phenomenon. This methodology provides an explanation for the nature and stages of emergence of the multidimensional life space of an individual, which comes as a foundation for the generation of such features of

  19. CIAU methodology and BEPU applications

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.

    2009-01-01

    Best-estimate calculation results from complex thermal-hydraulic system codes (like Relap5, Cathare, Athlet, Trace, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore the use of best-estimate codes within reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and deficiencies of those codes. Uncertainties may have different origins, ranging from the approximation of the models, to the approximation of the numerical solution, to the lack of precision in the values adopted for boundary and initial conditions. The amount of uncertainty that affects a calculation may strongly depend upon the codes and the modeling techniques (i.e. the code's users). A consistent and robust uncertainty methodology must be developed taking into consideration all the above aspects. The CIAU (Code with the capability of Internal Assessment of Uncertainty) and the UMAE (Uncertainty Methodology based on Accuracy Evaluation) methods have been developed by the University of Pisa (UNIPI) in the framework of long-lasting research activities started in the 1980s and involving several researchers. CIAU is extensively discussed in the available technical literature, Refs. [1, 2, 3, 4, 5, 6, 7], and tens of additional relevant papers, which provide comprehensive details about the method, can be found in the bibliography lists of the above references. Therefore, the present paper supplies only 'spot information' about CIAU and focuses mostly on applications to some cases of industrial interest. In particular, the application of CIAU to the OECD BEMUSE (Best Estimate Methods Uncertainty and Sensitivity Evaluation, [8, 9]) project is discussed, together with a critical comparison with other uncertainty methods (in relation to items like: sources of uncertainties, selection of the input parameters and quantification of

  20. The Methodological Dynamism of Grounded Theory

    Directory of Open Access Journals (Sweden)

    Nicholas Ralph

    2015-11-01

    Full Text Available Variations in grounded theory (GT) interpretation are the subject of ongoing debate. Divergences of opinion, genres, approaches, methodologies, and methods exist, resulting in disagreement on what GT methodology is and how it comes to be. From the postpositivism of Glaser and Strauss, to the symbolic interactionist roots of Strauss and Corbin, through to the constructivism of Charmaz, the field of GT methodology is distinctive in the sense that those using it offer new ontological, epistemological, and methodological perspectives at specific moments in time. We explore the unusual dynamism attached to GT’s underpinnings. Our view is that through a process of symbolic interactionism, in which generations of researchers interact with their context, moments are formed and philosophical perspectives are interpreted in a manner congruent with GT’s essential methods. We call this methodological dynamism, a process characterized by contextual awareness and moment formation, contemporaneous translation, generational methodology, and methodological consumerism.

  1. Climate index for Switzerland - Methodology

    International Nuclear Information System (INIS)

    2006-01-01

    According to the U.S. Department of Energy, an estimated 25% of the GNP is affected by weather-related events. Variations in temperature - even small ones - can also have long-lasting effects on the operational results of a company. Among others, the energy supply sector is sensitive to weather risks: a milder or harsher than usual winter leads to a decrease or increase in energy consumption. The price of electricity on power trading facilities like Powernext is especially sensitive to unusual changes in temperature. Powernext and Meteo-France (the French meteorological agency) have joined expertise to promote the use of weather indices for decision making or as underlyings of hedging tools for energy actors, end users from any other sector of activity, and specialists in weather risk hedging. The Powernext Weather indices are built from information collected by Meteo-France's main observation network, according to the norms of international meteorology, in carefully selected areas. The raw data are submitted to a thorough review allowing the correction of abnormalities and the reconstitution of missing data. Each index is fashioned to take into account the economic activity in the various regions of the country, as represented by each region's population. This demographic information is a fair approximation of the weight of regional economic activity. This document presents the calculation methodology of the average, minimum and maximum weather indexes, with the winter and summer regression equations, for the different economic regions of Switzerland. (J.S.)
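
    The population weighting described above can be illustrated with a short sketch: each regional temperature series enters a national index with a weight proportional to regional population. The regions, weights, and readings below are invented; the actual Powernext index construction and regression equations are not reproduced in this abstract.

    ```python
    # Population-weighted national temperature index; a hedged illustration of
    # the weighting idea described above, with invented regions and readings.

    regions = {
        # region: (population_weight, daily mean temperature [degrees C])
        "Geneva": (0.50, 4.2),
        "Zurich": (0.35, 2.8),
        "Lugano": (0.15, 7.1),
    }

    total_weight = sum(w for w, _ in regions.values())
    index = sum(w * t for w, t in regions.values()) / total_weight
    print(f"population-weighted temperature index: {index:.2f} C")
    ```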

  2. Methodology of site protection studies

    International Nuclear Information System (INIS)

    Farges, L.

    1980-01-01

    Preliminary studies preceding the building of a nuclear facility aim at assessing the choice of a site and establishing operating and control procedures. These studies are of two types: studies on the impact of the environment on the nuclear facility to be constructed form one type, and studies on the impact of the nuclear facility on the environment form the second type. A methodology giving a framework to studies of the second type is presented. These studies are undertaken to choose suitable sites for nuclear facilities. After a preliminary selection of a site based on a first estimate, a detailed site study is undertaken. The procedure consists of five successive phases, namely: (1) an inquiry assessing the initial state of the site; (2) an initial synthesis of accumulated information for assessing the health and safety consequences of releases; (3) laboratory and field studies simulating the movement of waste products for a quantitative assessment of effects; (4) a final synthesis for laying down the release limits and radiological control methods; and (5) conclusions based on comparing the data of the final synthesis to the limits prescribed by regulations. These five phases are outlined. The role of periodic reassessments after the facility has been in operation for some time is explained. (M.G.B.)

  3. Climate index for Belgium - Methodology

    International Nuclear Information System (INIS)

    2006-01-01

    According to the U.S. Department of Energy, an estimated 25% of the GNP is affected by weather-related events. Variations in temperature - even small ones - can also have long-lasting effects on the operational results of a company. Among others, the energy supply sector is sensitive to weather risks: a milder or harsher than usual winter leads to a decrease or increase in energy consumption. The price of electricity on power trading facilities like Powernext is especially sensitive to unusual changes in temperature. Powernext and Meteo-France (the French meteorological agency) have joined expertise to promote the use of weather indices for decision making or as underlyings of hedging tools for energy actors, end users from any other sector of activity, and specialists in weather risk hedging. The Powernext Weather indices are built from information collected by Meteo-France's main observation network, according to the norms of international meteorology, in carefully selected areas. The raw data are submitted to a thorough review allowing the correction of abnormalities and the reconstitution of missing data. Each index is fashioned to take into account the economic activity in the various regions of the country, as represented by each region's population. This demographic information is a fair approximation of the weight of regional economic activity. This document presents the calculation methodology of the average, minimum and maximum weather indexes, with the winter and summer regression equations, for the different economic regions of Belgium. (J.S.)

  4. Zika detection: comparison of methodologies

    Directory of Open Access Journals (Sweden)

    Tatiana Elias Colombo

    Full Text Available ABSTRACT Many countries in the Americas have detected local transmission of multiple arboviruses that cause febrile illnesses. Therefore, laboratory testing has become an important tool for confirming the etiology of these diseases. The present study aimed to compare the sensitivity and specificity of three different Zika virus detection assays. One hundred serum samples from patients presenting with acute febrile symptoms were tested using a previously reported TaqMan® RT-qPCR assay. We used SYBR® Green RT-qPCR and conventional PCR methodologies to compare the results. Of the samples that were determined to be negative by the TaqMan® RT-qPCR assay, 100% (Kappa = 0.670) were also found to be negative by SYBR® Green RT-qPCR based on Tm comparison; however, 14% (Kappa = 0.035) were found to be positive by conventional PCR followed by agarose gel electrophoresis. The differences between the ZIKV strains circulating worldwide and the low-viremia period can compromise diagnostic accuracy and thereby the accuracy of outbreak data. Therefore, improved assays are required to improve the diagnosis and surveillance of arboviruses.
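
    The agreement statistic quoted here is Cohen's kappa, κ = (p_o − p_e)/(1 − p_e), where p_o is the observed agreement between two assays and p_e the agreement expected by chance. A minimal sketch with a hypothetical 2×2 confusion matrix (not the study's counts):

    ```python
    # Cohen's kappa for agreement between two assays, computed from a 2x2
    # confusion matrix. The counts below are hypothetical, not the study's data.

    def cohens_kappa(both_pos, pos_neg, neg_pos, both_neg):
        n = both_pos + pos_neg + neg_pos + both_neg
        p_o = (both_pos + both_neg) / n                       # observed agreement
        p_pos = ((both_pos + pos_neg) / n) * ((both_pos + neg_pos) / n)
        p_neg = ((neg_pos + both_neg) / n) * ((pos_neg + both_neg) / n)
        p_e = p_pos + p_neg                                   # chance agreement
        return (p_o - p_e) / (1 - p_e)

    # Hypothetical counts: assay1 x assay2 (pos/pos, pos/neg, neg/pos, neg/neg)
    print(f"kappa = {cohens_kappa(20, 5, 9, 66):.3f}")
    ```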

  5. Climate index for Portugal - Methodology

    International Nuclear Information System (INIS)

    2006-01-01

    According to the U.S. Department of Energy, an estimated 25% of the GNP is affected by weather-related events. Variations in temperature - even small ones - can also have long-lasting effects on the operational results of a company. Among others, the energy supply sector is sensitive to weather risks: a milder or harsher than usual winter leads to a decrease or increase in energy consumption. The price of electricity on power trading facilities like Powernext is especially sensitive to unusual changes in temperature. Powernext and Meteo-France (the French meteorological agency) have joined expertise to promote the use of weather indices for decision making or as underlyings of hedging tools for energy actors, end users from any other sector of activity, and specialists in weather risk hedging. The Powernext Weather indices are built from information collected by Meteo-France's main observation network, according to the norms of international meteorology, in carefully selected areas. The raw data are submitted to a thorough review allowing the correction of abnormalities and the reconstitution of missing data. Each index is fashioned to take into account the economic activity in the various regions of the country, as represented by each region's population. This demographic information is a fair approximation of the weight of regional economic activity. This document presents the calculation methodology of the average, minimum and maximum weather indexes, with the winter and summer regression equations, for the different economic regions of Portugal. (J.S.)

  6. Methodological Aspects of Architectural Documentation

    Directory of Open Access Journals (Sweden)

    Arivaldo Amorim

    2011-12-01

    Full Text Available This paper discusses the methodological approach that has been developed in the state of Bahia, Brazil, since 2003 for the documentation of architectural and urban sites using extensive digital technologies. Bahia has a vast territory with important architectural ensembles ranging from the sixteenth century to the present day. As part of this heritage is constructed of raw earth and wood, it is very sensitive to various deleterious agents. It is therefore critical to document this collection, which is under threat. To conduct these activities, diverse digital technologies that could be used in the documentation process are being tested. The task is being developed as academic research, with few financial resources, by scholarship students and some volunteers. Several technologies are tested, ranging from the simplest to the more sophisticated ones, used in the main stages of the documentation project, as follows: overall work planning, data acquisition, processing and management and, ultimately, control and evaluation of the work. The activities that motivated this paper are being conducted in the cities of Rio de Contas and Lençóis in the Chapada Diamantina, located 420 km and 750 km from Salvador respectively; in the city of Cachoeira in the Recôncavo Baiano area, 120 km from Salvador, the capital of Bahia state; and in the Pelourinho neighbourhood, located in the historic capital. Part of the material produced can be consulted on the website: <www.lcad.ufba.br>.

  7. Methodology of a systematic review.

    Science.gov (United States)

    Linares-Espinós, E; Hernández, V; Domínguez-Escrig, J L; Fernández-Pello, S; Hevia, V; Mayor, J; Padilla-Fernández, B; Ribal, M J

    2018-05-03

    The objective of evidence-based medicine is to employ the best scientific information available and apply it to clinical practice. Understanding and interpreting the scientific evidence involves understanding the available levels of evidence, where systematic reviews and meta-analyses of clinical trials are at the top of the levels-of-evidence pyramid. The review process should be well developed and planned to reduce biases and eliminate irrelevant and low-quality studies. The steps for implementing a systematic review include (i) correctly formulating the clinical question to answer (PICO), (ii) developing a protocol (inclusion and exclusion criteria), (iii) performing a detailed and broad literature search and (iv) screening the abstracts of the studies identified in the search and subsequently the selected complete texts (PRISMA). Once the studies have been selected, we need to (v) extract the necessary data into a form designed in the protocol to summarise the included studies, (vi) assess the biases of each study, identifying the quality of the available evidence, and (vii) develop tables and text that synthesise the evidence. A systematic review involves a critical and reproducible summary of the results of the available publications on a particular topic or clinical question. To improve scientific writing, the methodology for implementing a systematic review is shown in a structured manner. Copyright © 2018 AEU. Published by Elsevier España, S.L.U. All rights reserved.

  8. Climate index for France - Methodology

    International Nuclear Information System (INIS)

    2006-01-01

    According to the U.S. Department of Energy, an estimated 25% of the GNP is affected by weather-related events. Variations in temperature - even small ones - can also have long-lasting effects on the operational results of a company. Among others, the energy supply sector is sensitive to weather risks: a milder or harsher than usual winter leads to a decrease or increase in energy consumption. The price of electricity on power trading facilities like Powernext is especially sensitive to unusual changes in temperature. Powernext and Meteo-France (the French meteorological agency) have joined expertise to promote the use of weather indices for decision making or as underlyings of hedging tools for energy actors, end users from any other sector of activity, and specialists in weather risk hedging. The Powernext Weather indices are built from information collected by Meteo-France's main observation network, according to the norms of international meteorology, in carefully selected areas. The raw data are submitted to a thorough review allowing the correction of abnormalities and the reconstitution of missing data. Each index is fashioned to take into account the economic activity in the various regions of the country, as represented by each region's population. This demographic information is a fair approximation of the weight of regional economic activity. This document presents the calculation methodology of the average, minimum and maximum weather indexes, with the winter and summer regression equations, for the different economic regions of France. (J.S.)

  9. Waste Package Design Methodology Report

    Energy Technology Data Exchange (ETDEWEB)

    D.A. Brownson

    2001-09-28

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report.

  10. Methodological Issues in Questionnaire Design.

    Science.gov (United States)

    Song, Youngshin; Son, Youn Jung; Oh, Doonam

    2015-06-01

    The process of designing a questionnaire is complicated. Many questionnaires on nursing phenomena have been developed and used by nursing researchers. The purpose of this paper was to discuss questionnaire design and the factors that should be considered when using existing scales. Methodological issues were discussed, such as factors in the design of questions, steps in developing questionnaires, wording and formatting methods for items, and administration methods. How to use existing scales, how to facilitate cultural adaptation, and how to prevent socially desirable responding were discussed. Moreover, the triangulation method in questionnaire development was introduced. Steps were recommended for designing questions, such as appropriately operationalizing key concepts for the target population, clearly formatting response options, generating items and confirming final items through face or content validity, sufficiently piloting the questionnaire using item analysis, demonstrating reliability and validity, finalizing the scale, and training the administrator. Psychometric properties and cultural equivalence should be evaluated prior to administration when using an existing questionnaire and performing cultural adaptation. In the context of well-defined nursing phenomena, logical and systematic methods will contribute to the development of simple and precise questionnaires.

  11. Waste Package Design Methodology Report

    International Nuclear Information System (INIS)

    D.A. Brownson

    2001-01-01

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report

  12. Climate index for Spain - Methodology

    International Nuclear Information System (INIS)

    2006-01-01

    According to the U.S. Department of Energy, an estimated 25% of the GNP is affected by weather-related events. Variations in temperature - even small ones - can also have long-lasting effects on the operational results of a company. Among others, the energy supply sector is sensitive to weather risks: a milder or harsher than usual winter leads to a decrease or increase in energy consumption. The price of electricity on power trading facilities like Powernext is especially sensitive to unusual changes in temperature. Powernext and Meteo-France (the French meteorological agency) have joined expertise to promote the use of weather indices for decision making or as underlyings of hedging tools for energy actors, end users from any other sector of activity, and specialists in weather risk hedging. The Powernext Weather indices are built from information collected by Meteo-France's main observation network, according to the norms of international meteorology, in carefully selected areas. The raw data are submitted to a thorough review allowing the correction of abnormalities and the reconstitution of missing data. Each index is fashioned to take into account the economic activity in the various regions of the country, as represented by each region's population. This demographic information is a fair approximation of the weight of regional economic activity. This document presents the calculation methodology of the average, minimum and maximum weather indexes, with the winter and summer regression equations, for the different economic regions of Spain. (J.S.)

  13. Open Government Data Publication Methodology

    Directory of Open Access Journals (Sweden)

    Jan Kucera

    2015-04-01

    Full Text Available Public sector bodies hold a significant amount of data that might be of potential interest to citizens and businesses. However, the re-use potential of this data is still largely untapped because the data is not always published in a way that would allow its easy discovery, understanding and re-use. Open Government Data (OGD) initiatives aim at increasing the availability of machine-readable data provided under an open license, and therefore these initiatives might facilitate re-use of government data, which in turn might lead to increased transparency and economic growth. However, recent studies show that still only a portion of the data provided by public sector bodies is truly open. Public sector bodies face a number of challenges when publishing OGD and need to address the relevant risks. Therefore there is a need for best practices and methodologies for the publication of OGD that would provide the responsible persons with clear guidance on how OGD initiatives should be implemented and how the known challenges and risks should be addressed.

  14. Safety analysis methodology for OPR 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun

    2005-01-01

    Full text: Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology, based on the codes available to KEPRI, to overcome the problems arising from the currently used vendor-oriented methodologies. For the Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for Westinghouse 3-loop plants by the Korean regulatory organization, and the project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. Also, for the non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical cases of the design basis accidents mentioned in the final safety analysis report (FSAR) were analyzed. (author)

  15. Tourism Methodologies - New Perspectives, Practices and Procedures

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different... in interview and field work situations, and how do we engage with the performative aspects of tourism as a field of study? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore. This is an issue dealt... codings and analysis, and tapping into the global network of social media...

  16. Constructivism: a naturalistic methodology for nursing inquiry.

    Science.gov (United States)

    Appleton, J V; King, L

    1997-12-01

    This article will explore the philosophical underpinnings of the constructivist research paradigm. Despite its increasing popularity in evaluative health research studies there is limited recognition of constructivism in popular research texts. Lincoln and Guba's original approach to constructivist methodology is outlined and a detailed framework for nursing research is offered. Fundamental issues and concerns surrounding this methodology are debated and differences between method and methodology are highlighted.

  17. The GPT methodology. New fields of application

    International Nuclear Information System (INIS)

    Gandini, A.; Gomit, J.M.; Abramytchev, V.

    1996-01-01

    The GPT (Generalized Perturbation Theory) methodology is described, and a new application is discussed. The results obtained for a simple model (zero dimension, six parameters of interest) show that the expressions obtained using the GPT methodology lead to results close to those obtained through direct calculations. The GPT methodology is useful for radioactive waste disposal problems. The potential of the method, demonstrated with the zero-dimension model, can be extended to radionuclide migration problems with a spatial description. (K.A.)

  18. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  19. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on…

  20. Formalizing the ISDF Software Development Methodology

    OpenAIRE

    Mihai Liviu DESPA

    2015-01-01

    The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative…

  1. THE FUTURE OF LANGUAGE TEACHING METHODOLOGY

    OpenAIRE

    Ted Rodgers

    1998-01-01

    Abstract: This paper reviews the current state of ELT methodology, particularly in respect to a number of current views suggesting that the profession is now in a "post-methods" era in which previous attention to Methods (Total Physical Response, Silent Way, Natural Approach, etc.) has given way to a more generic approach to ELT methodology. Ten potential future courses of ELT methodology are outlined and three of these are considered in some detail. Particular consideration is given as to ho…

  2. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user, project owner and project manager’s point of view. The main components of a software development methodology are identified. Thus a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation oriented software development methodology is emphasized by highlighting shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component a dedicated indicator is built. A template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.
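
    As an aside on the "dedicated indicator per quality component" idea above, the sketch below scores the six components the abstract names and flags those falling below an interpretation threshold. The scores, threshold and aggregation are invented for illustration; the paper's actual indicator templates are not reproduced here.

        # The six quality components named in the abstract, with hypothetical 0-1 scores.
        indicators = {
            "compliance": 0.92, "usability": 0.78, "reliability": 0.85,
            "repeatability": 0.66, "availability": 0.97, "security": 0.88,
        }
        THRESHOLD = 0.70  # assumed interpretation threshold per indicator

        def assess(indicators, threshold):
            """Return an overall score and the components needing attention."""
            overall = sum(indicators.values()) / len(indicators)
            failing = [name for name, score in indicators.items() if score < threshold]
            return overall, failing

        overall, failing = assess(indicators, THRESHOLD)
        print(f"Overall quality score: {overall:.2f}; below threshold: {failing or 'none'}")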

  3. Probabilistic methodology for turbine missile risk analysis

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.; Frank, R.A.

    1984-01-01

    A methodology has been developed for estimation of the probabilities of turbine-generated missile damage to nuclear power plant structures and systems. Mathematical models of the missile generation, transport, and impact events have been developed and sequenced to form an integrated turbine missile simulation methodology. Probabilistic Monte Carlo techniques are used to estimate the plant impact and damage probabilities. The methodology has been coded in the TURMIS computer code to facilitate numerical analysis and plant-specific turbine missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and probabilities have been estimated for a hypothetical nuclear power plant case study. (orig.)
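
    The abstract gives no equations, but the sequenced simulation it describes (generation, transport, impact) has a natural Monte Carlo shape. The sketch below chains three stand-in models; every probability and the strike criterion are invented placeholders, not TURMIS values, and real generation frequencies are far smaller (rare-event problems of this kind typically need variance reduction).

        import random

        def missile_damages_plant(rng):
            """One Monte Carlo trial of the generation -> transport -> impact sequence."""
            if rng.random() > 1e-2:                   # assumed missile-generation probability
                return False
            ejection_angle = rng.uniform(0.0, 360.0)  # crude transport model stand-in
            if not (30.0 <= ejection_angle <= 45.0):  # assumed strike window, degrees
                return False
            return rng.random() < 0.2                 # assumed conditional damage probability

        def estimate_damage_probability(trials=1_000_000, seed=42):
            rng = random.Random(seed)
            return sum(missile_damages_plant(rng) for _ in range(trials)) / trials

        print(f"Estimated damage probability: {estimate_damage_probability():.2e}")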

  4. A development methodology for scientific software

    International Nuclear Information System (INIS)

    Cort, G.; Barrus, D.M.; Goldstone, J.A.; Miller, L.; Nelson, R.O.; Poore, R.V.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed

  5. Methodological issues of genetic association studies.

    Science.gov (United States)

    Simundic, Ana-Maria

    2010-12-01

    Genetic association studies explore the association between genetic polymorphisms and a certain trait, disease or predisposition to disease. It has long been acknowledged that many genetic association studies fail to replicate their initial positive findings. This raises concern about the methodological quality of these reports. Case-control genetic association studies often suffer from various methodological flaws in study design and data analysis, and are often reported poorly. Flawed methodology and poor reporting leads to distorted results and incorrect conclusions. Many journals have adopted guidelines for reporting genetic association studies. In this review, some major methodological determinants of genetic association studies will be discussed.

  6. Evolution of aging assessment methodologies

    International Nuclear Information System (INIS)

    McCrea, L.; Dam, R.; Gold, R.

    2011-01-01

    Under the influence of organizations like the IAEA and INPO, the expectations of the regulator and plant operators alike are driving the evolution of aging assessment methodologies. The key result is that these assessments need to be executed more efficiently while supporting risk-informed thinking within a living process. Some recent trends impacting aging assessments include new standards from the regulator requiring more frequent updates of aging assessments (RD-334), and broader component coverage driven by equipment reliability program demands (INPO AP-913). These trends point to the need to be able to do aging assessment more efficiently, and to manage the configuration. Some of the challenges include increasing efficiency while maintaining completeness and minimizing error, employing a systematic, well defined approach while maintaining the flexibility to apply the right level of effort to achieve desired results, and in particular, assuring that Aging Related Degradation Mechanisms (ARDMs) are sufficiently addressed. Meeting these needs creates a natural synergy with the Preventive Maintenance living program and therefore lends itself to a more integrated approach. To support this program, the SYSTMS™ software has been enhanced to accommodate the various facets of an integrated program while meeting the needs described above. The systematic processes in SYSTMS are built with the vision of supporting risk-informed decision making as part of a larger risk-based functional tools suite. This paper intends to show how utilities can benefit from the cost savings associated with increased assessment efficiency, and from utilizing Candu Energy Inc.'s CANDU-specific knowledge base and experience in aging assessment to get it right the first time. (author)

  7. Regional indexes for France. Methodology

    International Nuclear Information System (INIS)

    2006-01-01

    According to the U.S. Department of Energy, an estimated 25% of the GNP is affected by weather-related events. Variations in temperature - even small ones - can also have long-lasting effects on the operational results of a company. Among others, the energy supply sector is sensitive to weather risks: a milder or harsher than usual winter leads to a decrease or increase in energy consumption. The price of electricity on power trading facilities like Powernext is especially sensitive to odd changes in temperatures. Powernext and Meteo-France (the French meteorological agency) have joined expertise in order to promote the use of weather indices for decision making or as underlyings of hedging tools, addressing energy actors, end users from any other sector of activity, and specialists in weather risk hedging. The Powernext Weather indices are made from information collected by Meteo-France's main observation network according to the norms of international meteorology, in carefully selected areas. The gross data are submitted to a thorough review allowing the correction of abnormalities and the reconstitution of missing data. Each index is fashioned to take into account the economic activity in the various regions of the country as represented by each region's population. This demographic information represents a fair approximation of the weight of the regional economic activity. This document presents the calculation methodology of the regional weather indexes for France. Each regional index covers an area that is homogeneous in terms of temperature, represented by its biggest city. Five reference cities are used to represent the 22 economic regions of France (Paris Orly, Lyon, Marignane, Bordeaux and Strasbourg). (J.S.)
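
    As a toy illustration of the population weighting described above, the sketch below combines station temperatures into a single index using population shares. City populations and temperatures are invented, and the actual Powernext/Meteo-France weighting rules are not reproduced in this document.

        # Hypothetical populations (millions) and daily mean temperatures (deg C)
        # for the five reference cities named in the abstract; all numbers invented.
        regions = {
            "Paris Orly": {"population": 12.0, "temperature": 7.2},
            "Lyon":       {"population":  2.3, "temperature": 6.1},
            "Marignane":  {"population":  1.9, "temperature": 10.4},
            "Bordeaux":   {"population":  1.0, "temperature": 9.0},
            "Strasbourg": {"population":  0.8, "temperature": 3.5},
        }

        def population_weighted_index(regions):
            """Weight each reference city's temperature by its population share."""
            total_pop = sum(r["population"] for r in regions.values())
            return sum(r["temperature"] * r["population"] / total_pop
                       for r in regions.values())

        print(f"National weather index: {population_weighted_index(regions):.2f} deg C")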

  8. Evolution of aging assessment methodologies

    Energy Technology Data Exchange (ETDEWEB)

    McCrea, L.; Dam, R.; Gold, R. [Candu Energy Inc., Mississauga, Ontario (Canada)

    2011-07-01

    Under the influence of organizations like the IAEA and INPO, the expectations of the regulator and plant operators alike are driving the evolution of aging assessment methodologies. The key result is that these assessments need to be executed more efficiently while supporting risk-informed thinking within a living process. Some recent trends impacting aging assessments include new standards from the regulator requiring more frequent updates of aging assessments (RD-334), and broader component coverage driven by equipment reliability program demands (INPO AP-913). These trends point to the need to be able to do aging assessment more efficiently, and to manage the configuration. Some of the challenges include increasing efficiency while maintaining completeness and minimizing error, employing a systematic, well defined approach while maintaining the flexibility to apply the right level of effort to achieve desired results, and in particular, assuring that Aging Related Degradation Mechanisms (ARDMs) are sufficiently addressed. Meeting these needs creates a natural synergy with the Preventive Maintenance living program and therefore lends itself to a more integrated approach. To support this program, the SYSTMS™ software has been enhanced to accommodate the various facets of an integrated program while meeting the needs described above. The systematic processes in SYSTMS are built with the vision of supporting risk-informed decision making as part of a larger risk-based functional tools suite. This paper intends to show how utilities can benefit from the cost savings associated with increased assessment efficiency, and from utilizing Candu Energy Inc.'s CANDU-specific knowledge base and experience in aging assessment to get it right the first time. (author)

  9. Relational and Object-Oriented Methodology in Data Bases Systems

    Directory of Open Access Journals (Sweden)

    Marian Pompiliu CRISTESCU

    2006-01-01

    Full Text Available Database programming languages integrate concepts of databases and programming languages to provide both implementation tools for data-intensive applications and high-level user interfaces to databases. Frequently, database programs contain a large amount of application knowledge which is hidden in the procedural code and thus difficult to maintain with changing data and user views. This paper presents a first attempt to improve the situation by supporting the integrated definition and management of data and rules based on a set-oriented and predicative approach. The use of database technology for integrated fact and rule base management is shown to have some important advantages in terms of fact and rule integrity, question-answering, and explanation of results.

  10. Causal Meta-Analysis : Methodology and Applications

    NARCIS (Netherlands)

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological)

  11. Systematic Review Methodology in Higher Education

    Science.gov (United States)

    Bearman, Margaret; Smith, Calvin D.; Carbone, Angela; Slade, Susan; Baik, Chi; Hughes-Warrington, Marnie; Neumann, David L.

    2012-01-01

    Systematic review methodology can be distinguished from narrative reviews of the literature through its emphasis on transparent, structured and comprehensive approaches to searching the literature and its requirement for formal synthesis of research findings. There appears to be relatively little use of the systematic review methodology within the…

  12. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Horton, D.G.

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  13. PSA methodology development and application in Japan

    International Nuclear Information System (INIS)

    Kazuo Sato; Toshiaki Tobioka; Kiyoharu Abe

    1987-01-01

    The outlines of Japanese activities on the development and application of probabilistic safety assessment (PSA) methodologies are described. First, the activities on methodology development are described for system reliability analysis, operational data analysis, core melt accident analysis, environmental consequence analysis and seismic risk analysis. Then, methodology application examples from the regulatory side and the industry side are described. (author)

  14. Developing educational hypermedia applications: a methodological approach

    Directory of Open Access Journals (Sweden)

    Jose Miguel Nunes

    1996-01-01

    Full Text Available This paper proposes a hypermedia development methodology with the aim of integrating the work of educators, who will be primarily responsible for the instructional design, with that of software experts, responsible for the software design and development. Hence, it is proposed that the educators and programmers should interact in an integrated and systematic manner following a methodological approach.

  15. Design Methodologies: Industrial and Educational Applications

    NARCIS (Netherlands)

    Tomiyama, T.; Gul, P.; Jin, Y.; Lutters, Diederick; Kind, Ch.; Kimura, F.

    2009-01-01

    The field of Design Theory and Methodology has a rich collection of research results that has been taught at educational institutions as well as applied to design practices. First, this keynote paper describes some methods to classify them. It then illustrates individual theories and methodologies

  16. Screening radon risks: A methodology for policymakers

    International Nuclear Information System (INIS)

    Eisinger, D.S.; Simmons, R.A.; Lammering, M.; Sotiros, R.

    1991-01-01

    This paper provides an easy-to-use screening methodology to estimate potential excess lifetime lung cancer risk resulting from indoor radon exposure. The methodology was developed under U.S. EPA Office of Policy, Planning, and Evaluation sponsorship of the agency's Integrated Environmental Management Projects (IEMP) and State/Regional Comparative Risk Projects. These projects help policymakers understand and use scientific data to develop environmental problem-solving strategies. This research presents the risk assessment methodology, discusses its basis, and identifies appropriate applications. The paper also identifies assumptions built into the methodology and qualitatively addresses methodological uncertainties, the direction in which these uncertainties could bias analyses, and their relative importance. The methodology draws from several sources, including risk assessment formulations developed by the U.S. EPA's Office of Radiation Programs, the EPA's Integrated Environmental Management Project (Denver), the International Commission on Radiological Protection, and the National Institute for Occupational Safety and Health. When constructed as a spreadsheet program, the methodology easily facilitates analyses and sensitivity studies (the paper includes several sensitivity study options). The methodology will be most helpful to those who need to make decisions concerning radon testing, public education, and exposure prevention and mitigation programs. 26 references
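
    The spreadsheet-style screening calculation described above has the generic linear form risk = concentration x occupancy x duration x unit risk. The sketch below shows that shape together with a simple sensitivity loop; the unit-risk coefficient and exposure parameters are deliberately rough placeholders, not the EPA/ICRP/NIOSH values the methodology actually draws on.

        def screening_excess_risk(radon_pci_per_l,
                                  occupancy_fraction=0.75,
                                  years_exposed=70,
                                  unit_risk_per_pci_l_year=5e-5):
            """Linear screening estimate of excess lifetime lung cancer risk.

            unit_risk_per_pci_l_year is a placeholder, not a published coefficient.
            """
            return (radon_pci_per_l * occupancy_fraction *
                    years_exposed * unit_risk_per_pci_l_year)

        # The kind of sensitivity study the spreadsheet version facilitates:
        for conc in (1.0, 4.0, 20.0):  # indoor radon concentrations, pCi/L
            print(f"{conc:5.1f} pCi/L -> excess lifetime risk ~ {screening_excess_risk(conc):.1%}")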

  17. A new Methodology for Operations Strategy

    DEFF Research Database (Denmark)

    Koch, Christian; Rytter, Niels Gorm; Boer, Harry

    2005-01-01

    This paper proposes a new methodology for developing and implementing Operations Strategy (OS). It encompasses both content and process aspects of OS and thereby differs from many of the present OS methodologies. The paper outlines its paradigmatic foundation and presents aim, process, dimensions...

  18. Comparative study on software development methodologies

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2014-12-01

    Full Text Available This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation orientated IT projects. The paper starts by depicting specific characteristics in software development project management. Managing software development projects involves techniques and skills that are proprietary to the IT industry. Also the software development project manager handles challenges and risks that are predominantly encountered in business and research areas that involve state of the art technology. Conventional software development stages are defined and briefly described. Development stages are the building blocks of any software development methodology so it is important to properly research this aspect. Current software development methodologies are presented. Development stages are defined for every showcased methodology. For each methodology a graphic representation is illustrated in order to better individualize its structure. Software development methodologies are compared by highlighting strengths and weaknesses from the stakeholder's point of view. Conclusions are formulated and a research direction aimed at formalizing a software development methodology dedicated to innovation orientated IT projects is enunciated.

  19. Solid Waste Management Planning--A Methodology

    Science.gov (United States)

    Theisen, Hilary M.; And Others

    1975-01-01

    This article presents a twofold solid waste management plan consisting of a basic design methodology and a decision-making methodology. The former provides a framework for the developing plan while the latter builds flexibility into the design so that there is a model for use during the planning process. (MA)

  20. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    D.G. Horton

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  1. Generalized Response Surface Methodology : A New Metaheuristic

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2006-01-01

    Generalized Response Surface Methodology (GRSM) is a novel general-purpose metaheuristic based on Box and Wilson's Response Surface Methodology (RSM). Both GRSM and RSM estimate local gradients to search for the optimal solution. These gradients use local first-order polynomials. GRSM, however, uses…
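
    To make the "local first-order polynomial" idea concrete, the sketch below estimates the gradient from a 2^2 factorial design (for which the least-squares slopes reduce to simple contrasts) and repeatedly steps along the steepest descent direction. This is the textbook RSM building block on a toy function, not Kleijnen's GRSM itself.

        def objective(x, y):
            """Toy function standing in for an expensive simulation response."""
            return (x - 3.0) ** 2 + 2.0 * (y + 1.0) ** 2

        def estimate_gradient(f, x, y, delta=0.1):
            """Least-squares slopes of a first-order polynomial fitted on a 2^2 design."""
            design = [(-delta, -delta), (-delta, delta), (delta, -delta), (delta, delta)]
            obs = [f(x + dx, y + dy) for dx, dy in design]
            b1 = sum(dx * z for (dx, _), z in zip(design, obs)) / (4 * delta ** 2)
            b2 = sum(dy * z for (_, dy), z in zip(design, obs)) / (4 * delta ** 2)
            return b1, b2

        def rsm_search(f, x, y, step=0.2, iters=25):
            """Repeated first-order RSM moves along the estimated steepest descent."""
            for _ in range(iters):
                g1, g2 = estimate_gradient(f, x, y)
                x, y = x - step * g1, y - step * g2
            return x, y

        print(rsm_search(objective, 0.0, 0.0))  # approaches the optimum (3, -1)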

  2. 24 CFR 904.205 - Training methodology.

    Science.gov (United States)

    2010-04-01

    ... Training methodology. Equal in importance to the content of the pre- and post-occupancy training is the training methodology. Because groups vary, there should be adaptability in the communication and learning experience. Methods to be utilized may include group presentations, small discussion groups, special classes...

  3. Identification and systems methodologies for territorial delimitation

    Directory of Open Access Journals (Sweden)

    Iván Montoya R

    2010-12-01

    Full Text Available This document identifies the main issues affecting the delimitation of territories and explores conceptual approaches for describing the relationship between territories, understood as organizations, and their environment. Subsequently, we study the systems methodologies known as soft systems methodology (SSM) and complex adaptive systems (CAS). Finally, the advantages of systemic approaches to territorial delimitation are shown.

  4. Discipline and Methodology in Higher Education Research

    Science.gov (United States)

    Tight, Malcolm

    2013-01-01

    Higher education research is a multidisciplinary field, engaging researchers from across the academy who make use of a wide range of methodological approaches. This article examines the relation between discipline and methodology in higher education research, analysing a database of 567 articles published in 15 leading higher education journals…

  5. Embodied Writing: Choreographic Composition as Methodology

    Science.gov (United States)

    Ulmer, Jasmine B.

    2015-01-01

    This paper seeks to examine how embodied methodological approaches might inform dance education practice and research. Through a series of examples, this paper explores how choreographic writing might function as an embodied writing methodology. Here, choreographic writing is envisioned as a form of visual word choreography in which words move,…

  6. METHODOLOGICAL PROBLEMS OF E-LEARNING DIDACTICS

    Directory of Open Access Journals (Sweden)

    Sergey F. Sergeev

    2015-01-01

    Full Text Available The article is devoted to the discussion of the methodological problems of e-learning and of the didactic issues in using advanced networking and Internet technologies to create training systems and simulators based on the methodological principles of non-classical and post-non-classical psychology and pedagogy.

  7. Improving Learning Outcome Using Six Sigma Methodology

    Science.gov (United States)

    Tetteh, Godson A.

    2015-01-01

    Purpose: The purpose of this research paper is to apply the Six Sigma methodology to identify the attributes of a lecturer that will help improve a student's prior knowledge of a discipline from an initial "x" per cent to a higher "y" per cent. Design/methodology/approach: The data collection method…

  8. Grounded Theory Methodology: Positivism, Hermeneutics, and Pragmatism

    Science.gov (United States)

    Age, Lars-Johan

    2011-01-01

    Glaserian grounded theory methodology, which has been widely adopted as a scientific methodology in recent decades, has been variously characterised as "hermeneutic" and "positivist." This commentary therefore takes a different approach to characterising grounded theory by undertaking a comprehensive analysis of: (a) the philosophical paradigms of…

  9. A methodology for developing distributed programs

    NARCIS (Netherlands)

    Ramesh, S.; Mehndiratta, S.L.

    1987-01-01

    A methodology, different from the existing ones, for constructing distributed programs is presented. It is based on the well-known idea of developing distributed programs via synchronous and centralized programs. The distinguishing features of the methodology are: 1) specification include process

  10. Methodology for performing surveys for fixed contamination

    International Nuclear Information System (INIS)

    Durham, J.S.; Gardner, D.L.

    1994-10-01

    This report describes a methodology for performing instrument surveys for fixed contamination that can be used to support the release of material from radiological areas, including release to controlled areas and release from radiological control. The methodology, which is based on a fast scan survey and a series of statistical, fixed measurements, meets the requirements of the U.S. Department of Energy Radiological Control Manual (RadCon Manual) (DOE 1994) and DOE Order 5400.5 (DOE 1990) for surveys for fixed contamination and requires less time than a conventional scan survey. The confidence interval associated with the new methodology conforms to the draft national standard for surveys. The methodology that is presented applies only to surveys for fixed contamination. Surveys for removable contamination are not discussed, and the new methodology does not affect surveys for removable contamination
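
    One common form of the "series of statistical, fixed measurements" described above is an upper confidence bound on the mean count rate compared against a release limit. The sketch below illustrates that form only; the counts, the limit and the t-based bound are assumptions for illustration, not the report's specific procedure.

        import math
        import statistics

        def upper_confidence_bound(measurements, t_value=1.833):
            """One-sided 95% upper bound on the mean (t quantile for 9 deg. of freedom)."""
            mean = statistics.mean(measurements)
            sem = statistics.stdev(measurements) / math.sqrt(len(measurements))
            return mean + t_value * sem

        # Ten hypothetical fixed measurements (net counts per minute) at grid points.
        fixed_counts = [120, 95, 132, 101, 88, 110, 125, 97, 104, 115]
        release_limit = 150.0  # assumed instrument-specific release limit, cpm

        ucb = upper_confidence_bound(fixed_counts)
        verdict = "releasable" if ucb < release_limit else "requires further survey"
        print(f"95% upper bound = {ucb:.1f} cpm -> {verdict}")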

  11. Imaginative methodologies in the social sciences

    DEFF Research Database (Denmark)

    Imaginative Methodologies develops, expands and challenges conventional social scientific methodology and language by way of literary, poetic and other alternative sources of inspiration. Sociologists, social workers, anthropologists, criminologists and psychologists all try to rethink, provoke and reignite social scientific methodology. Imaginative Methodologies challenges the mainstream social science methodological orthodoxy closely guarding the boundaries between the social sciences and the arts and humanities, pointing out that authors and artists are often engaged in projects parallel to those of the social sciences and vice versa, and that artistic and cultural productions today do not constitute a specialist field, but are integral to our social reality. The book will be of interest to scholars and students in the social sciences and across the arts and humanities working with questions…

  12. USGS Methodology for Assessing Continuous Petroleum Resources

    Science.gov (United States)

    Charpentier, Ronald R.; Cook, Troy A.

    2011-01-01

    The U.S. Geological Survey (USGS) has developed a new quantitative methodology for assessing resources in continuous (unconventional) petroleum deposits. Continuous petroleum resources include shale gas, coalbed gas, and other oil and gas deposits in low-permeability ("tight") reservoirs. The methodology is based on an approach combining geologic understanding with well productivities. The methodology is probabilistic, with both input and output variables as probability distributions, and uses Monte Carlo simulation to calculate the estimates. The new methodology is an improvement of previous USGS methodologies in that it better accommodates the uncertainties in undrilled or minimally drilled deposits that must be assessed using analogs. The publication is a collection of PowerPoint slides with accompanying comments.
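
    The probabilistic structure described above (geologic inputs and well productivities as distributions, propagated by Monte Carlo) can be caricatured in a few lines. In the sketch below, every distribution and parameter is an invented stand-in, not a USGS assessment input; the point is only the shape of the calculation.

        import math
        import random

        def simulate_play(trials=10_000, seed=1):
            """Monte Carlo total-recovery estimate for a hypothetical continuous gas play."""
            rng = random.Random(seed)
            totals = []
            for _ in range(trials):
                n_untested = rng.randint(400, 900)   # assumed untested cells in the play
                success = rng.uniform(0.3, 0.6)      # assumed fraction of productive cells
                n_wells = int(n_untested * success)
                # Assumed lognormal per-well recovery (BCF), median 0.5.
                totals.append(sum(rng.lognormvariate(math.log(0.5), 1.0)
                                  for _ in range(n_wells)))
            totals.sort()

            def pct(p):
                return totals[int(trials * p / 100)]

            return pct(5), pct(50), pct(95)

        low, median, high = simulate_play()
        print(f"Total gas (BCF): F95 ~ {low:.0f}, F50 ~ {median:.0f}, F5 ~ {high:.0f}")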

  13. Methodologies for certification of transuranic waste packages

    International Nuclear Information System (INIS)

    Christensen, R.N.; Kok, K.D.

    1980-10-01

    The objective of this study was to postulate methodologies for certification that a waste package is acceptable for disposal in a licensed geologic repository. Within the context of this report, certification means the overall process which verifies that a waste package meets the criteria or specifications established for acceptance for disposal in a repository. The overall methodology for certification will include (1) certifying authorities, (2) tests and procedures, and (3) documentation and quality assurance programs. Each criterion will require a methodology that is specific to that criterion. In some cases, different waste forms will require a different methodology. The purpose of predicting certification methodologies is to provide additional information as to what changes, if any, are needed for the TRU waste in storage

  14. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330 MWt SMART (System-integrated Modular Advanced ReacTor), which has been developed by the Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, a further validated final safety analysis methodology will be developed. The current licensing safety analysis methodologies of the Westinghouse and KSNPP PWRs operating or under development in Korea, as well as the Russian licensing safety analysis methodology for integral reactors, have been reviewed and compared in order to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against the licensing practices of the PWRs operating in Korea and of the KNGR (Korean Next Generation Reactor) under construction. A detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary-to-secondary pipe break and the small break loss of coolant accident. The SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design evolves. The validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. It is thus recommended that the nuclear regulatory authority establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  15. Mechatronics methodology: 15 years of experience

    Directory of Open Access Journals (Sweden)

    Efren Gorrostieta

    2015-09-01

    Full Text Available This article presents a methodology for teaching students to develop mechatronic projects. It has been taught in higher education at different universities in Mexico, in courses such as Robotics, Control Systems, Mechatronic Systems and Artificial Intelligence. The intention of this methodology is not only to achieve the integration of different subjects but also to accomplish synergy between them, so that the final result may be the best possible in quality, time and robustness. Since its introduction into the educational area, this methodology was evaluated and modified for approximately five years, during which its substantial characteristics were adopted. For the next ten years, only minor alterations were carried out. Fifteen years of experience have proven that the methodology is useful not only for training but also for real projects. In this article, we first explain the methodology and its main characteristics, as well as a brief history of its teaching in different educational programs. Then, we present two cases where the methodology was successfully applied. The first project consisted of the design, construction and evaluation of a mobile robotic manipulator intended for use as an explosive ordnance disposal device. In the second case, we document the results of a project assignment for robotics tasks carried out by students who had previously been taught the methodology.

  16. Research Methodology in Global Strategy Research

    DEFF Research Database (Denmark)

    Cuervo-Cazurra, Alvaro; Mudambi, Ram; Pedersen, Torben

    2017-01-01

    We review advances in research methodology used in global strategy research and provide suggestions on how researchers can improve their analyses and arguments. Methodological advances in the extraction of information, such as computer-aided text analysis, and in the analysis of datasets, such as differences-in-differences and propensity score matching, have helped deal with challenges (e.g., endogeneity and causality) that bedeviled earlier studies and resulted in conflicting findings. These methodological advances need to be considered as tools that complement theoretical arguments and well-explained logics and mechanisms so that researchers can provide better and more relevant recommendations to managers designing the global strategies of their organizations.
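
    As a pointer to one technique named above, the sketch below computes the classic two-group, two-period differences-in-differences estimate on made-up numbers; real global strategy applications add controls, fixed effects and clustered standard errors.

        from statistics import mean

        # Hypothetical firm performance before/after a policy, treated vs. control group.
        treated_before = [2.1, 2.4, 1.9, 2.2]
        treated_after  = [3.0, 3.3, 2.8, 3.1]
        control_before = [2.0, 2.2, 1.8, 2.1]
        control_after  = [2.3, 2.5, 2.1, 2.4]

        def diff_in_diff(tb, ta, cb, ca):
            """DiD = (treated change) - (control change); nets out common trends."""
            return (mean(ta) - mean(tb)) - (mean(ca) - mean(cb))

        effect = diff_in_diff(treated_before, treated_after, control_before, control_after)
        print(f"DiD estimate of the treatment effect: {effect:.2f}")  # 0.60 here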

  17. Air pollution monitoring - a methodological approach

    International Nuclear Information System (INIS)

    Trajkovska Trpevska, Magdalena

    2002-01-01

    Methodology for monitoring the emission of pollutants into the air is a complex concept that in general embraces the following phases: sampling, laboratory treatment, and interpretation of results. In the Company for Technological and Laboratory Investigation and Environmental Protection - Mining Institute Skopje, the control of pollutant emissions into the air is performed according to a methodology based, in general, on the recommendations of standard VDI 2.066, prescribed by the Ministry of Ecology in Germany, because adequate legislation does not exist in our country. In this article the basic methodology for controlling air pollutant emissions is presented. (Original)

  18. Nuclear power plant simulation facility evaluation methodology

    International Nuclear Information System (INIS)

    Haas, P.M.; Carter, R.J.; Laughery, K.R. Jr.

    1985-01-01

    A methodology for the evaluation of nuclear power plant simulation facilities with regard to their acceptability for use in the US Nuclear Regulatory Commission (NRC) operator licensing exam is described. The evaluation is based primarily on simulator fidelity, but incorporates some aspects of direct operator/trainee performance measurement. The panel presentation and paper discuss data requirements, data collection, data analysis and criteria for conclusions regarding the fidelity evaluation, and summarize the proposed use of direct performance measurement. While field testing and refinement of the methodology are recommended, this initial effort provides a firm basis for NRC to fully develop the necessary methodology.

  19. Photovoltaic module energy rating methodology development

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, B.; Myers, D.; Emery, K.; Mrig, L. [National Renewable Energy Lab., Golden, CO (United States); Whitaker, C.; Newmiller, J. [Endecon Engineering, San Ramon, CA (United States)

    1996-05-01

    A consensus-based methodology to calculate the energy output of a PV module will be described in this paper. The methodology develops a simple measure of PV module performance that provides for a realistic estimate of how a module will perform in specific applications. The approach makes use of the weather data profiles that describe conditions throughout the United States and emphasizes performance differences between various module types. An industry-representative Technical Review Committee has been assembled to provide feedback and guidance on the strawman and final approach used in developing the methodology.
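
    In its simplest form, an energy rating of the kind described above sums a module power model over an hourly weather profile. The sketch below uses a linear-in-temperature efficiency model and a tiny synthetic profile; the area, efficiency, temperature coefficient and weather records are assumptions, not the committee's methodology.

        # Hourly (irradiance W/m^2, cell temperature deg C) records, one synthetic day.
        weather_profile = [
            (0, 15), (150, 18), (400, 25), (700, 35),
            (900, 45), (850, 48), (600, 40), (200, 25), (0, 18),
        ]

        MODULE_AREA_M2 = 1.6   # assumed module area
        EFF_STC = 0.17         # assumed efficiency at standard test conditions (25 C)
        TEMP_COEFF = -0.004    # assumed relative efficiency loss per deg C above 25 C

        def module_power_w(irradiance, cell_temp):
            """Simple linear-in-temperature power model, a common first approximation."""
            eff = EFF_STC * (1.0 + TEMP_COEFF * (cell_temp - 25.0))
            return irradiance * MODULE_AREA_M2 * eff

        # Energy for the profile: each record spans one hour, so power sums to Wh.
        energy_wh = sum(module_power_w(g, t) for g, t in weather_profile)
        print(f"Daily energy for this profile: {energy_wh:.0f} Wh")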

  20. Covariance Evaluation Methodology for Neutron Cross Sections

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M.; Arcilla, R.; Mattoon, C.M.; Mughabghab, S.F.; Oblozinsky, P.; Pigni, M.; Pritychenko, B.; Sonzogni, A.A.

    2008-09-01

    We present the NNDC-BNL methodology for estimating neutron cross section covariances in the thermal, resolved resonance, unresolved resonance and fast neutron regions. The three key elements of the methodology are the Atlas of Neutron Resonances, the nuclear reaction code EMPIRE, and a Bayesian code implementing the Kalman filter concept. The covariance data processing, visualization and distribution capabilities are integral components of the NNDC methodology. We illustrate its application with examples, including a relatively detailed evaluation of covariances for two individual nuclei and the massive production of simple covariance estimates for 307 materials. Certain peculiarities regarding the evaluation of covariances for resolved resonances, and the consistency between resonance parameter uncertainties and thermal cross section uncertainties, are also discussed.
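
    To illustrate the "Kalman filter concept" in its smallest form, the sketch below performs one scalar Bayesian update of a model-predicted cross section and its variance against an experimental point. Actual evaluations update full parameter vectors and covariance matrices through reaction-model sensitivities; the numbers here are hypothetical.

        def kalman_update(prior_xs, prior_var, measured_xs, measured_var):
            """One scalar Bayesian (Kalman) update of a cross section estimate.

            The gain weights the measurement by the relative prior/experimental
            variances; the posterior variance always shrinks.
            """
            gain = prior_var / (prior_var + measured_var)
            post_xs = prior_xs + gain * (measured_xs - prior_xs)
            post_var = (1.0 - gain) * prior_var
            return post_xs, post_var

        # Hypothetical: model predicts 2.50 b +/- 10%; experiment gives 2.30 b +/- 4%.
        xs, var = kalman_update(2.50, 0.25 ** 2, 2.30, 0.092 ** 2)
        print(f"Posterior: {xs:.3f} b, relative uncertainty {100 * var ** 0.5 / xs:.1f}%")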