WorldWideScience

Sample records for ground rules methodologies

  1. GROUNDED THEORY METHODOLOGY and GROUNDED THEORY RESEARCH in TURKEY

    OpenAIRE

    ARIK, Ferhat; ARIK, Işıl Avşar

    2016-01-01

    This research discusses the historical development of the Grounded Theory Methodology, which is one of the qualitative research methods, its transformation over time and how it is used as a methodology in Turkey. The Grounded Theory, which was founded by Strauss and Glaser, is a qualitative methodology based on inductive logic to discover theories, in contrast with the deductive understanding in sociology, which is based on testing an existing theory. It is possible to examine the Grounded Theory...

  2. Essential methodological considerations when using grounded theory.

    Science.gov (United States)

    Achora, Susan; Matua, Gerald Amandu

    2016-07-01

    To suggest important methodological considerations when using grounded theory. A research method widely used in nursing research is grounded theory, at the centre of which is theory construction. However, researchers still struggle with some of its methodological issues. Although grounded theory is widely used to study and explain issues in nursing practice, many researchers are still failing to adhere to its rigorous standards. Researchers should articulate the focus of their investigations - the substantive area of interest as well as the focal population. This should be followed by a succinct explanation of the strategies used to collect and analyse data, supported by clear coding processes. Finally, the resolution of the core issues, including the core category and related categories, should be explained to advance readers' understanding. Researchers should endeavour to understand the tenets of grounded theory. This enables 'neophytes' in particular to make methodological decisions that will improve their studies' rigour and fit with grounded theory. This paper complements the current dialogue on improving the understanding of grounded theory methodology in nursing research. The paper also suggests important procedural decisions researchers need to make to preserve their studies' scientific merit and fit with grounded theory.

  3. Grounded Theory Methodology: Positivism, Hermeneutics, and Pragmatism

    Science.gov (United States)

    Age, Lars-Johan

    2011-01-01

    Glaserian grounded theory methodology, which has been widely adopted as a scientific methodology in recent decades, has been variously characterised as "hermeneutic" and "positivist." This commentary therefore takes a different approach to characterising grounded theory by undertaking a comprehensive analysis of: (a) the philosophical paradigms of…

  4. Independent Orbiter Assessment (IOA): FMEA/CIL instructions and ground rules

    Science.gov (United States)

    Traves, S. T.

    1986-01-01

    The McDonnell Douglas Astronautics Company was selected to conduct an independent assessment of the Orbiter Failure Mode and Effects Analysis/Critical Items List (FMEA/CIL). Part of this effort involved an examination of the FMEA/CIL preparation instructions and ground rules. Assessment objectives were to identify omissions and ambiguities in the ground rules that may impede the identification of shuttle orbiter safety and mission critical items, and to ensure that the ground rules allow these items to receive proper management visibility for risk assessment. These objectives were pursued without influence from external considerations such as effects on budget, schedule, and documentation growth. To ensure an independent assessment, personnel were employed who had a strong reliability background but no previous space shuttle FMEA/CIL experience. The following observations were made: (1) not all essential items are in the CIL for management visibility; (2) ground rules omit FMEA/CIL coverage of items that perform critical functions; (3) essential items excluded from the CIL do not receive design justification; and (4) FMEAs/CILs are not updated in a timely manner. In addition to the above issues, a number of other issues were identified that correct FMEA/CIL preparation instruction omissions and clarify ambiguities. The assessment was successful in that many of the issues identified have significant safety implications.

  5. The Methodological Dynamism of Grounded Theory

    Directory of Open Access Journals (Sweden)

    Nicholas Ralph

    2015-11-01

    Variations in grounded theory (GT) interpretation are the subject of ongoing debate. Divergences of opinion, genres, approaches, methodologies, and methods exist, resulting in disagreement on what GT methodology is and how it comes to be. From the postpositivism of Glaser and Strauss, to the symbolic interactionist roots of Strauss and Corbin, through to the constructivism of Charmaz, the field of GT methodology is distinctive in the sense that those using it offer new ontological, epistemological, and methodological perspectives at specific moments in time. We explore the unusual dynamism attached to GT’s underpinnings. Our view is that through a process of symbolic interactionism, in which generations of researchers interact with their context, moments are formed and philosophical perspectives are interpreted in a manner congruent with GT’s essential methods. We call this methodological dynamism, a process characterized by contextual awareness and moment formation, contemporaneous translation, generational methodology, and methodological consumerism.

  6. Grounded theory methodology - has it become a movement?

    OpenAIRE

    Berterö, Carina

    2012-01-01

    There is an ongoing debate regarding the nature of grounded theory, and an examination of many studies claiming to follow grounded theory indicates a wide range of approaches. In 1967 Glaser and Strauss's "The Discovery of Grounded Theory: Strategies for Qualitative Research" was published and represented a breakthrough in qualitative research; it offered methodological consensus and systematic strategies for qualitative research practice. The defining characteristics of grounded theory inc...

  7. A Proven Methodology for Developing Secure Software and Applying It to Ground Systems

    Science.gov (United States)

    Bailey, Brandon

    2016-01-01

    Part Two expands upon Part One in an attempt to translate the methodology for ground system personnel. The goal is to build upon the methodology presented in Part One by showing examples and details on how to implement the methodology. Section 1: Ground Systems Overview; Section 2: Secure Software Development; Section 3: Defense in Depth for Ground Systems; Section 4: What Now?

  8. Are There Two Methods of Grounded Theory? Demystifying the Methodological Debate

    Directory of Open Access Journals (Sweden)

    Cheri Ann Hernandez, RN, Ph.D., CDE

    2008-06-01

    Grounded theory is an inductive research method for the generation of substantive or formal theory, using qualitative or quantitative data generated from research interviews, observation, or written sources, or some combination thereof (Glaser & Strauss, 1967). In recent years there has been much controversy over the etiology of its discovery, as well as the exact way in which grounded theory research is to be operationalized. Unfortunately, this situation has resulted in much confusion, particularly among novice researchers who wish to utilize this research method. In this article, the historical, methodological and philosophical roots of grounded theory are delineated in a beginning effort to demystify this methodological debate. Grounded theory variants such as feminist grounded theory (Wuest, 1995) or constructivist grounded theory (Charmaz, 1990) are beyond the scope of this discussion.

  9. Where Are the Grounds for Grounded Theory? A Troubled Empirical Methodology Meets Wittgenstein

    Science.gov (United States)

    James, Fiona

    2018-01-01

    This article provides a critical exposition of the epistemological underpinnings of a recent redevelopment of Grounded Theory (GT) methodology, "Constructivist" GT. Although proffered as freed from the "objectivist" tenets of the original version, critical examination exposes the essentialism threaded through its integral…

  10. Probabilistic Rule Generator: A new methodology of variable-valued logic synthesis

    International Nuclear Information System (INIS)

    Lee, W.D.; Ray, S.R.

    1986-01-01

    A new methodology to synthesize variable-valued logic formulas from training data events is presented. Probabilistic Rule Generator (PRG) employs not only information-theoretic entropy as a heuristic to capture a path expression but also multiple-valued logic to expand a captured complex. PRG is efficient for capturing major clusters in the event space, and is more general than previous methodologies in providing probabilistic features.
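
    The abstract names information-theoretic entropy as the heuristic that guides rule capture. As a rough illustration of that idea only, here is one greedy, entropy-guided step of rule growth in Python; this is not the published PRG algorithm, and the event encoding and names are invented for the example.

        # One greedy, entropy-guided selector choice -- an illustration of the
        # general idea only, NOT the published PRG algorithm.
        from collections import Counter
        from math import log2

        def entropy(labels):
            """Shannon entropy of a list of class labels."""
            n = len(labels)
            return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

        def best_selector(events, labels, attributes):
            """Pick the attribute-value test minimizing weighted entropy of the split."""
            best, best_h = None, float("inf")
            for attr in attributes:
                for value in {e[attr] for e in events}:
                    covered = [lab for e, lab in zip(events, labels) if e[attr] == value]
                    rest = [lab for e, lab in zip(events, labels) if e[attr] != value]
                    h = (len(covered) * entropy(covered)
                         + len(rest) * entropy(rest)) / len(events)
                    if h < best_h:
                        best, best_h = (attr, value), h
            return best

        events = [{"color": "red", "size": "s"}, {"color": "red", "size": "l"},
                  {"color": "blue", "size": "s"}]
        print(best_selector(events, ["pos", "pos", "neg"], ["color", "size"]))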

  11. Contracting Selection for the Development of the Range Rule Risk Methodology

    National Research Council Canada - National Science Library

    1997-01-01

    ...-Effectiveness Risk Tool and contractor selection for the development of the Range Rule Risk Methodology. The audit objective was to determine whether the Government appropriately used the Ordnance and Explosives Cost-Effectiveness Risk Tool...

  12. Grounded theory: a methodological spiral from positivism to postmodernism.

    Science.gov (United States)

    Mills, Jane; Chapman, Ysanne; Bonner, Ann; Francis, Karen

    2007-04-01

    Our aim in this paper is to explain a methodological/methods package devised to incorporate situational and social world mapping with frame analysis, based on a grounded theory study of Australian rural nurses' experiences of mentoring. Situational analysis, as conceived by Adele Clarke, shifts the research methodology of grounded theory from being located within a postpositivist paradigm to a postmodern paradigm. Clarke uses three types of maps during this process: situational, social world and positional, in combination with discourse analysis. During our grounded theory study, the process of concurrent interview data generation and analysis incorporated situational and social world mapping techniques. An outcome of this was our increased awareness of how outside actors influenced participants in their constructions of mentoring. In our attempts to use Clarke's methodological package, however, it became apparent that our constructivist beliefs about human agency could not be reconciled with the postmodern project of discourse analysis. We then turned to the literature on symbolic interactionism and adopted frame analysis as a method to examine the literature on rural nursing and mentoring as a secondary form of data. While we found situational and social world mapping very useful, we were less successful in using positional maps. In retrospect, we would argue that collective action framing provides an alternative to analysing such positions in the literature. This is particularly so for researchers who locate themselves within a constructivist paradigm, and who are therefore unwilling to reject the notion of human agency and the ability of individuals to shape their world in some way. Our example of using this package of situational and social worlds mapping with frame analysis is intended to assist other researchers to locate participants more transparently in the social worlds that they negotiate in their everyday practice.

  13. From Darwin to constructivism: the evolution of grounded theory.

    Science.gov (United States)

    Hall, Helen; Griffiths, Debra; McKenna, Lisa

    2013-01-01

    To explore the evolution of grounded theory and equip the reader with a greater understanding of the diverse conceptual positioning that is evident in the methodology. Grounded theory was developed during the modernist phase of research to develop theories that are derived from data and explain human interaction. Its philosophical foundations derive from symbolic interactionism and were influenced by a range of scholars including Charles Darwin and George Mead. Rather than a rigid set of rules and procedures, grounded theory is a way of conceptualising data. Researchers demonstrate a range of perspectives and there is significant variation in the way the methodology is interpreted and executed. Some grounded theorists continue to align closely with the original post-positivist view, while others take a more constructivist approach. Although the diverse interpretations accommodate flexibility, they may also result in confusion. The grounded theory approach enables researchers to align to their own particular world view and use methods that are flexible and practical. With an appreciation of the diverse philosophical approaches to grounded theory, researchers are enabled to use and appraise the methodology more effectively.

  14. Methodological approaches based on business rules

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2008-01-01

    Business rules and business processes are essential artifacts in defining the requirements of a software system. Business processes capture business behavior, while rules connect processes and thus control processes and business behavior. Traditionally, rules are scattered inside application code. This approach makes it very difficult to change rules and shortens the life cycle of the software system. Because rules change more quickly than the application itself, it is desirable to externalize the rules and move them outside the application. This paper analyzes and evaluates three well-known business rules approaches. It also outlines some critical factors that have to be taken into account in the decision to introduce business rules facilities in a software system. Based on the concept of explicit manipulation of business rules in a software system, the need for a general approach based on business rules is discussed.
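
    To make the externalization argument concrete, the sketch below keeps rules as data evaluated by a generic engine, so a rule change means editing the rule store rather than redeploying the application. The JSON rule format, field names, and discount logic are invented for the illustration.

        # Minimal sketch of externalized business rules: rules live as data
        # (e.g., loaded from JSON or a rules repository), not in application code.
        # The rule format here is hypothetical.
        import json

        RULES_JSON = """
        [
          {"name": "bulk_discount", "if": {"field": "quantity", "op": ">=", "value": 100},
           "then": {"discount": 0.10}},
          {"name": "vip_discount", "if": {"field": "customer_tier", "op": "==", "value": "vip"},
           "then": {"discount": 0.15}}
        ]
        """

        OPS = {">=": lambda a, b: a >= b, "==": lambda a, b: a == b}

        def apply_rules(order, rules):
            """Return the best discount granted by any rule whose condition matches."""
            discount = 0.0
            for rule in rules:
                cond = rule["if"]
                if OPS[cond["op"]](order[cond["field"]], cond["value"]):
                    discount = max(discount, rule["then"]["discount"])
            return discount

        rules = json.loads(RULES_JSON)
        print(apply_rules({"quantity": 120, "customer_tier": "vip"}, rules))  # 0.15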

  15. Methodological Grounds of Managing Innovation Development of Restaurants

    OpenAIRE

    Naidiuk V. S.

    2013-01-01

    The goal of the article lies in identification and further development of methodological grounds of managing the innovation development of restaurants. Based on the data of the critical analysis of existing scientific views on interpretation of the essence of the "managing innovation development of an enterprise" notion, the article conducts clarification of this definition. In the result of the study the article builds up a cause-effect diagram of solution of the problem of ensuring efficien...

  16. Scalability of a Methodology for Generating Technical Trading Rules with GAPs Based on Risk-Return Adjustment and Incremental Training

    Science.gov (United States)

    de La Cal, E. A.; Fernández, E. M.; Quiroga, R.; Villar, J. R.; Sedano, J.

    In previous works a methodology was defined, based on the design of a genetic algorithm GAP and an incremental training technique adapted to the learning of series of stock market values. The GAP technique consists of a fusion of GP and GA. The GAP algorithm implements the automatic search for crisp trading rules, taking as training objectives both the optimization of the obtained return and the minimization of the assumed risk. Applying the proposed methodology, rules have been obtained for a period of eight years of the S&P500 index. The achieved adjustment of the return-risk relation has generated rules with returns in the testing period far superior to those obtained by applying the usual methodologies, and even clearly superior to Buy&Hold. This work shows that the proposed methodology is valid for different assets in a different market from that of the previous work.
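
    The two training objectives (maximize return, minimize assumed risk) can be folded into a single fitness value for a candidate rule. Below is a hedged sketch of such a risk-return-adjusted fitness, with per-trade volatility standing in for risk; all names and numbers are invented, and this is not the authors' GAP implementation.

        # Sketch of a risk-return-adjusted fitness for a candidate trading rule.
        # `trade_returns` are per-trade fractional returns produced by the rule.
        import statistics

        def fitness(trade_returns, risk_aversion=1.0):
            """Reward mean return, penalize volatility (the risk proxy here)."""
            if len(trade_returns) < 2:
                return float("-inf")  # not enough evidence to score the rule
            mean_r = statistics.mean(trade_returns)
            risk = statistics.stdev(trade_returns)
            return mean_r - risk_aversion * risk

        rule_a = [0.02, 0.01, -0.005, 0.015]  # steady rule
        rule_b = [0.20, -0.18, 0.22, -0.15]   # high return, high risk
        print(fitness(rule_a), fitness(rule_b))  # the steady rule scores higher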

  17. Methodological approaches based on business rules

    OpenAIRE

    Anca Ioana ANDREESCU; Adina UTA

    2008-01-01

    Business rules and business processes are essential artifacts in defining the requirements of a software system. Business processes capture business behavior, while rules connect processes and thus control processes and business behavior. Traditionally, rules are scattered inside application code. This approach makes it very difficult to change rules and shorten the life cycle of the software system. Because rules change more quickly than the application itself, it is desirable to externalize...

  18. Surface Signature Characterization at SPE through Ground-Proximal Methods: Methodology Change and Technical Justification

    Energy Technology Data Exchange (ETDEWEB)

    Schultz-Fellenz, Emily S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-09-09

    A portion of LANL’s FY15 SPE objectives includes initial ground-based or ground-proximal investigations at the SPE Phase 2 site. The area of interest is the U2ez location in Yucca Flat. This collection serves as a baseline for discrimination of surface features and acquisition of topographic signatures prior to any development or pre-shot activities associated with SPE Phase 2. Our team originally intended to perform our field investigations using previously vetted ground-based (GB) LIDAR methodologies. However, the extended proposed time frame of the GB LIDAR data collection, and associated data processing time and delivery date, were unacceptable. After technical consultation and careful literature research, LANL identified an alternative methodology to achieve our technical objectives and fully support critical model parameterization. Very-low-altitude unmanned aerial systems (UAS) photogrammetry appeared to satisfy our objectives in lieu of GB LIDAR. The SPE Phase 2 baseline collection was used as a test of this UAS photogrammetric methodology.

  19. [The grounded theory as a methodological alternative for nursing research].

    Science.gov (United States)

    dos Santos, Sérgio Ribeiro; da Nóbrega, Maria Miriam

    2002-01-01

    This study presents a method of interpretative and systematic research with application to the development of studies in nursing called "the grounded theory", whose theoretical support is symbolic interactionism. The purpose of the paper is to describe the grounded theory as an alternative methodology for the construction of knowledge in nursing. The study highlights four topics: the basic principle, the basic concepts, the trajectory of the method and the process of analysis of the data. We conclude that the systematization of data and its interpretation, based on social actors' experience, constitute a strong basis for the generation of theories through this research tool.

  20. Bioclim deliverable D8a: development of the rule-based down-scaling methodology for BIOCLIM Work-package 3

    International Nuclear Information System (INIS)

    2003-01-01

    The BIOCLIM project on modelling sequential Biosphere systems under Climate change for radioactive waste disposal is part of the EURATOM fifth European framework programme. The project was launched in October 2000 for a three-year period. The project aims at providing a scientific basis and practical methodology for assessing the possible long term impacts on the safety of radioactive waste repositories in deep formations due to climate and environmental change. Five work packages (WP) have been identified to fulfill the project objectives. One of the tasks of BIOCLIM WP3 was to develop a rule-based approach for down-scaling from the MoBidiC model of intermediate complexity in order to provide consistent estimates of monthly temperature and precipitation for the specific regions of interest to BIOCLIM (Central Spain, Central England and Northeast France, together with Germany and the Czech Republic). A statistical down-scaling methodology has been developed by Philippe Marbaix of CEA/LSCE for use with the second climate model of intermediate complexity used in BIOCLIM - CLIMBER-GREMLINS. The rule-based methodology assigns climate states or classes to a point on the time continuum of a region according to a combination of simple threshold values which can be determined from the coarse-scale climate model. Once climate states or classes have been defined, monthly temperature and precipitation climatologies are constructed using analogue stations identified from a database of present-day climate observations. The most appropriate climate classification for BIOCLIM purposes is the Koeppen/Trewartha scheme. This scheme has the advantage of being empirical, but only requires monthly averages of temperature and precipitation as input variables. Section 2 of this deliverable (D8a) outlines how each of the eight methodological steps has been undertaken for each of the three main BIOCLIM study regions (Central England, Northeast France and Central Spain) using MoBidiC.
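
    As an illustration of what a threshold-rule climate classification looks like, the sketch below assigns a class from monthly temperature and precipitation means. The thresholds are simplified stand-ins, not the actual Koeppen/Trewartha boundaries used in BIOCLIM.

        # Illustrative threshold-rule climate classification from monthly means.
        # Thresholds are simplified stand-ins, NOT the Koeppen/Trewartha limits.
        def classify(monthly_temp_c, monthly_precip_mm):
            t_min, t_max = min(monthly_temp_c), max(monthly_temp_c)
            p_total = sum(monthly_precip_mm)
            if t_max < 10:
                return "polar"
            if p_total < 250:
                return "dry"
            if t_min > 18:
                return "tropical"
            if t_min > -3:
                return "temperate"
            return "continental"

        # Example: a mid-latitude station
        temps = [3, 4, 7, 10, 14, 18, 20, 19, 16, 11, 6, 4]
        precip = [50, 45, 48, 52, 60, 55, 58, 62, 57, 66, 70, 61]
        print(classify(temps, precip))  # temperate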

  1. Sensor-based activity recognition using extended belief rule-based inference methodology.

    Science.gov (United States)

    Calzada, A; Liu, J; Nugent, C D; Wang, H; Martinez, L

    2014-01-01

    The recently developed extended belief rule-based inference methodology (RIMER+) recognizes the need to model different types of information and uncertainty that usually coexist in real environments. A home setting with sensors located in different rooms and on different appliances can be considered as a particularly relevant example of such an environment, which brings a range of challenges for sensor-based activity recognition. Although RIMER+ has been designed as a generic decision model that could be applied in a wide range of situations, this paper discusses how this methodology can be adapted to recognize human activities using binary sensors within smart environments. The evaluation of RIMER+ against other state-of-the-art classifiers in terms of accuracy, efficiency and applicability was found to be significantly relevant, especially in situations of input-data incompleteness, and it demonstrates the potential of this methodology and underpins the basis to develop further research on the topic.
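
    For flavour, here is a toy weighted-rule aggregation over binary sensor readings. It only gestures at the belief-rule idea (rule weights plus belief degrees over activities); the sensors, weights, and combination scheme are invented and far simpler than the published RIMER+ inference.

        # Toy weighted-rule activation for binary sensors -- a simplification
        # gesturing at belief-rule-based inference, NOT the RIMER+ method.
        RULES = [
            # (required active sensors, rule weight, belief over activities)
            ({"kettle", "kitchen_motion"}, 1.0, {"make_tea": 0.9, "cook": 0.1}),
            ({"stove", "kitchen_motion"}, 0.8, {"cook": 0.95, "make_tea": 0.05}),
        ]

        def infer(active_sensors):
            """Aggregate the beliefs of all rules whose antecedents fire."""
            scores = {}
            for antecedent, weight, belief in RULES:
                if antecedent <= active_sensors:  # all required sensors are on
                    for activity, degree in belief.items():
                        scores[activity] = scores.get(activity, 0.0) + weight * degree
            total = sum(scores.values())
            return {a: s / total for a, s in scores.items()} if total else {}

        print(infer({"kettle", "kitchen_motion", "hall_motion"}))  # make_tea favoured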

  2. Park Planning for Ageing Adults Using Grounded Theory Methodology

    Directory of Open Access Journals (Sweden)

    Bernie Dahl

    2004-06-01

    The importance of understanding park planning issues and implementing planning strategies for ageing adults was the driving force for this study. Literature reviews have identified a variety of scholarly work from fields such as gerontology, psychology, sociology and economics, all of which provide valuable information regarding the special needs of ageing adults. Very few researchers, however, have investigated the leisure behaviours of older adults in outdoor recreation (Croskeys, Tinsley and Tinsley, 2002), and the use of grounded theory methodology has essentially been unexplored in this area. Ageing adults are projected to live more than 20 percent of their life in retirement (MRP, 1998, cited in Croskeys, Tinsley and Tinsley, 2002), allowing for an increased amount of discretionary time. This offers opportunities for ageing adults to participate in outdoor recreational activities and will undoubtedly increase their leisure time. However, with limited research in recreational needs and inclusion for older adults, it is difficult for park planners and administrators to meet the growing needs of this population. Therefore, this research is necessary in order to determine whether ageing adults are being accounted for in park and outdoor recreational planning. The objective of this study was to use grounded theory research methodology to identify and examine ageing adult needs in relation to outdoor leisure activities in a regional park setting. Ten Midwestern regional park visitors (aged 65-75 years old) and four park employees were interviewed. Our research attempts to fill in the gaps between the perceptions of ageing park users and those of park planners, using a methodology that relies primarily on direct contact with park visitors.

  3. Developing Foucault's Discourse Analytic Methodology

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2006-01-01

    A methodological position for a FOUCAULTian discourse analysis is presented. A sequence of analytical steps is introduced and an illustrating example is offered. It is emphasized that discourse analysis has to discover the system-level of discursive rules and the deeper structure of the discursive formation; otherwise the analysis will be unfinished. Michel FOUCAULT's work is theoretically grounded in French structuralism and (so-called) post-structuralism. In this paper, post-structuralism is conceived not as a means of overcoming structuralism, but as a way of critically continuing the structural perspective. In this way, discursive structures can be related to discursive practices and the concept of structure can be disclosed (e.g., to inter-discourse or DERRIDA's concept of structurality). In this way, the structural methodology is continued and radicalized, but not given up. In this paper, FOUCAULT's theory is combined with the works of Michel PÊCHEUX and (especially for the sociology of knowledge and the sociology of culture) Pierre BOURDIEU. The practice of discourse analysis is theoretically grounded. This practice can be conceived as a reflexive coupling of deconstruction and reconstruction in the material to be analyzed. This methodology can therefore be characterized as a reconstructive qualitative methodology. At the end of the article, forms of discourse analysis are criticized that do not intend to recover the system level of discursive rules and do not intend to discover the deeper structure of the discursive formation (i.e. episteme, socio-episteme). These forms are merely commentaries on discourses (not analyses of them); they remain phenomenological and are therefore pre-structuralist. URN: urn:nbn:de:0114-fqs060168

  4. Interrelationships Between Receiver/Relative Operating Characteristics Display, Binomial, Logit, and Bayes' Rule Probability of Detection Methodologies

    Science.gov (United States)

    Generazio, Edward R.

    2014-01-01

    Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways, and the very existence of such statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further increased by the wide range of statistical methodologies used for determining POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' Rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
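
    To ground the terminology: with hit/miss data, the simple binomial approach estimates POD within one flaw-size group directly, while logistic regression fits a smooth POD(a) curve over flaw size. A minimal sketch of both views, with invented counts and illustrative (not fitted) logistic parameters:

        # Two common views of hit/miss POD data, sketched with invented numbers.
        import math

        def binomial_pod(hits, trials):
            """Point estimate of POD for one flaw-size group."""
            return hits / trials

        def logistic_pod(size_mm, beta0=-4.0, beta1=2.5):
            """POD(a) = 1 / (1 + exp(-(b0 + b1*a))); parameters illustrative."""
            return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * size_mm)))

        print(binomial_pod(hits=27, trials=30))     # 0.9
        print(round(logistic_pod(size_mm=2.0), 3))  # ~0.731 under these parameters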

  5. Safety analysis methodologies for radioactive waste repositories in shallow ground

    International Nuclear Information System (INIS)

    1984-01-01

    The report is part of the IAEA Safety Series and is addressed to authorities and specialists responsible for or involved in planning, performing and/or reviewing safety assessments of shallow ground radioactive waste repositories. It discusses approaches that are applicable for safety analysis of a shallow ground repository. The methodologies, analysis techniques and models described are pertinent to the task of predicting the long-term performance of a shallow ground disposal system. They may be used during the processes of selection, confirmation and licensing of new sites and disposal systems or to evaluate the long-term consequences in the post-sealing phase of existing operating or inactive sites. The analysis may point out the need for remedial action, or provide information to be used in deciding on the duration of surveillance. Safety analyses, both general in nature and specific to a certain repository, site or design concept, are discussed, with emphasis on deterministic and probabilistic studies.

  6. Methodological proposal of grounding in commercial and industrial installations

    International Nuclear Information System (INIS)

    Rodriguez Araya, Michael Eduardo

    2013-01-01

    A methodology is developed for the design of commercial and industrial grounding systems. International standards and technical documents related to grounding in electrical design for commercial and industrial installations are studied, and the design techniques of earthing systems are investigated. The topics covered to develop the design proposal were: the analysis of resistivity, soil types, and the calculation of step, touch, and mesh voltages. A field visit was arranged in the vicinity of the Escuela de Ingenieria Electrica at the Universidad de Costa Rica to carry out the resistivity measurements needed for the design of a hypothetical grounding mesh for a future installation. A tellurometer (model GP-1) from Amprobe Instrument was used to obtain the ground resistivity data; the equipment uses four electrodes and implements the Wenner method for the calculations. An earthing design is then produced for a company in the industrial or commercial sector of Costa Rica. Earthing designs protect the equipment found at the site from conditions such as atmospheric overvoltages, transients, sags, interruptions, or any other event that may affect power quality. The resistivity of a ground depends largely on its moisture content. A correct earthing system should cover the greater part of the total area of the building and meet the mesh voltage required for an optimal design. The design of any earthing system depends on the unique characteristics of the industry's location.
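
    The Wenner four-electrode measurement mentioned above has a standard apparent-resistivity relation, quoted here for reference, where a is the equal electrode spacing and R the measured resistance:

        % Wenner method: apparent soil resistivity from measured resistance R
        % at equal electrode spacing a (standard relation).
        \rho = 2\pi a R,
        \qquad \rho~[\Omega\,\mathrm{m}], \quad a~[\mathrm{m}], \quad R~[\Omega]

    For example, a spacing of a = 2 m and a reading of R = 15 Ω give ρ = 2π·2·15 ≈ 188 Ω·m.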

  7. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology I have drawn on the current needs of telecommunications companies, which are characterized mainly by high dependence on information systems. Besides that, this industry is characterized by high flexibility, strong competition and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting the business requirements and following the business strategy.

  8. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    Science.gov (United States)

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

    To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data but PO gives a distinctive insight, revealing what people are really doing, instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study was carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion of using the grounded theory method and PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and it required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights. It is not commonly discussed in nursing research and therefore this study can provide insight, which cannot be seen or revealed by using other data collection methods. Therefore, this paper can produce a useful tool for those who intend to use PO and grounded theory in their nursing research.

  9. Back- and fore-grounding ontology: exploring the linkages between critical realism, pragmatism, and methodologies in health & rehabilitation sciences.

    Science.gov (United States)

    DeForge, Ryan; Shaw, Jay

    2012-03-01

    As two doctoral candidates in a health and rehabilitation sciences program, we describe in this paper our respective paradigmatic locations along a quite nonlinear ontological-epistemological-axiological-methodological chain. In a turn-taking fashion, we unpack the tenets of critical realism and pragmatism, and then trace the linkages from these paradigmatic locations through to the methodological choices that address a community-based research problem. Beyond serving as an answer to calls for academics in training to demonstrate philosophical-theoretical-methodological integrity and coherence in their scholarship, this paper represents critical realism and its fore-grounding of a deeply stratified ontology in reflexive relation to pragmatism and its back-grounding of ontology. We conclude by considering the merits and challenges of conducting research from within singular versus proliferate paradigmatic perspectives. © 2011 Blackwell Publishing Ltd.

  10. Analysis Methodology for Optimal Selection of Ground Station Site in Space Missions

    Science.gov (United States)

    Nieves-Chinchilla, J.; Farjas, M.; Martínez, R.

    2013-12-01

    Optimization of ground station sites is especially important in complex missions that include several small satellites (clusters or constellations) such as the QB50 project, where one ground station would be able to track several spatial vehicles, even simultaneously. In this regard the design of the communication system has to carefully take into account the ground station site and relevant signal phenomena, depending on the frequency band. To propose the optimal location of the ground station, these aspects become even more relevant to establish a trusted communication link due to the ground segment site in urban areas and/or selection of low orbits for the space segment. In addition, updated cartography with high resolution data of the location and its surroundings help to develop recommendations in the design of its location for spatial vehicles tracking and hence to improve effectiveness. The objectives of this analysis methodology are: completion of cartographic information, modelling the obstacles that hinder communication between the ground and space segment and representation in the generated 3D scene of the degree of impairment in the signal/noise of the phenomena that interferes with communication. The integration of new technologies of geographic data capture, such as 3D Laser Scan, determine that increased optimization of the antenna elevation mask, in its AOS and LOS azimuths along the horizon visible, maximizes visibility time with spatial vehicles. Furthermore, from the three-dimensional cloud of points captured, specific information is selected and, using 3D modeling techniques, the 3D scene of the antenna location site and surroundings is generated. The resulting 3D model evidences nearby obstacles related to the cartographic conditions such as mountain formations and buildings, and any additional obstacles that interfere with the operational quality of the antenna (other antennas and electronic devices that emit or receive in the same bandwidth).
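
    One computational piece of such an analysis is testing pass directions against an azimuth-dependent elevation mask derived from the surveyed obstacles. A small sketch of that check follows; the mask values and interpolation scheme are invented for the example.

        # Test whether a look direction clears an azimuth-dependent elevation
        # mask built from surveyed obstacles; mask values are illustrative.
        import bisect

        # (azimuth_deg, minimum clear elevation_deg) sampled around the horizon
        MASK = [(0, 5), (90, 12), (180, 25), (270, 8), (360, 5)]

        def mask_elevation(az_deg):
            """Linearly interpolate the obstacle mask at an azimuth."""
            azs = [a for a, _ in MASK]
            i = bisect.bisect_right(azs, az_deg % 360)
            (a0, e0), (a1, e1) = MASK[i - 1], MASK[i]
            return e0 + (e1 - e0) * ((az_deg % 360) - a0) / (a1 - a0)

        def visible(az_deg, el_deg):
            """True if the satellite elevation clears the local mask."""
            return el_deg > mask_elevation(az_deg)

        print(visible(120, 20), visible(200, 15))  # True False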

  11. Top Level Space Cost Methodology (TLSCM)

    Science.gov (United States)

    1997-12-02

    Table-of-contents fragments recovered from the record: Ground Rules and Assumptions; Typical Life Cycle Cost Distribution; Methodologies (cost/budget threshold; analogy, which is based on real-time Air Force and space programs, Ref. (25:2-8, 2-9)). ACEIT: Automated Cost Estimating Integrated Tools (ACEIT), Tecolote Research, Inc. The ACEIT cost program can be used to obtain a print-out of an expanded WBS.

  12. Fuzzy-Rule-Based Object Identification Methodology for NAVI System

    Directory of Open Access Journals (Sweden)

    Yaacob Sazali

    2005-01-01

    We present an object identification methodology applied in a navigation assistance for visually impaired (NAVI) system. The NAVI has a single board processing system (SBPS), a digital video camera mounted headgear, and a pair of stereo earphones. The captured image from the camera is processed by the SBPS to generate a specially structured stereo sound suitable for vision impaired people in understanding the presence of objects/obstacles in front of them. The image processing stage is designed to identify the objects in the captured image. Edge detection and edge-linking procedures are applied in the processing of image. A concept of object preference is included in the image processing scheme and this concept is realized using a fuzzy-rule base. The blind users are trained with the stereo sound produced by NAVI for achieving a collision-free autonomous navigation.

  13. Fuzzy-Rule-Based Object Identification Methodology for NAVI System

    Science.gov (United States)

    Nagarajan, R.; Sainarayanan, G.; Yaacob, Sazali; Porle, Rosalyn R.

    2005-12-01

    We present an object identification methodology applied in a navigation assistance for visually impaired (NAVI) system. The NAVI has a single board processing system (SBPS), a digital video camera mounted headgear, and a pair of stereo earphones. The captured image from the camera is processed by the SBPS to generate a specially structured stereo sound suitable for vision impaired people in understanding the presence of objects/obstacles in front of them. The image processing stage is designed to identify the objects in the captured image. Edge detection and edge-linking procedures are applied in the processing of image. A concept of object preference is included in the image processing scheme and this concept is realized using a fuzzy-rule base. The blind users are trained with the stereo sound produced by NAVI for achieving a collision-free autonomous navigation.
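
    The "object preference" concept lends itself to a small fuzzy-rule illustration: score a detected object by how large and how central it appears. The membership functions and min-combination below are invented for the example and are not the NAVI rule base.

        # Toy fuzzy-rule preference for detected objects -- illustrative only.
        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def preference(area_frac, center_offset):
            """Fuzzy rule: big AND central objects matter most to the walker."""
            big = tri(area_frac, 0.05, 0.5, 1.0)
            central = tri(center_offset, -0.6, 0.0, 0.6)
            return min(big, central)  # Mamdani-style AND

        # An object covering 30% of the frame, slightly left of centre
        print(round(preference(0.30, -0.1), 3))  # 0.556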

  14. A Personal Journey with Grounded Theory Methodology. Kathy Charmaz in Conversation With Reiner Keller

    Directory of Open Access Journals (Sweden)

    Kathy Charmaz

    2016-01-01

    Kathy CHARMAZ is one of the most important thinkers in grounded theory methodology today. Her trailblazing work on constructivist grounded theory continues to inspire research across many disciplines and around the world. In this interview, she reflects on the aura surrounding qualitative inquiry that existed in California in the late 1960s to early 1970s and the lessons she learned from her first forays into empirical research. She comments on the trajectory that grounded theory research has followed since then and gives an account of her own perspective on constructivist grounded theory. In doing so, she underlines the importance of the Chicago School and symbolic interactionist tradition for grounded theory research work today and shows where the latter is positioned in the current field of qualitative fieldwork. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1601165

  15. Selection of Grounded Theory as an Appropriate Research Methodology for a Dissertation: One Student’s Perspective

    Directory of Open Access Journals (Sweden)

    James W. Jones, Ed.D.

    2009-06-01

    Doctoral students wanting to use grounded theory as a methodological approach for their dissertation often face multiple challenges gaining acceptance of their approach by their committee. This paper presents the case that the author used to overcome these challenges through the process of eliminating other methodologies, leaving grounded theory as the preferred method for the desired research issue. Through examining the approach used successfully by the author, other doctoral students will be able to frame similar arguments justifying the use of grounded theory in their dissertations and see the use of the method continue to spread into new fields and applications. This paper examines the case built for selecting grounded theory as a defensible dissertation approach. The basic research issue that I wanted to investigate was how practitioners in an applied field sought information in their work; in other words, how they researched. I further narrowed the investigation down to a more specific field, but the paper presented here is left in broader form so that other students can see the approach in more general terms.

  16. A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base

    Science.gov (United States)

    Kautzmann, Frank N., III

    1988-01-01

    Expert systems that support knowledge representation by qualitative modeling techniques experience problems when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule model resolution, and multiple uses of knowledge representation are included. A series of prototypes are being developed to demonstrate the feasibility of automating the process of systems engineering, design and configuration, and diagnosis and fault management. The study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems. Moreover, it involves models of description and explanation for each level. This multiple-model feature requires the development of control methods between rule systems and heuristics on a meta-level for each expert system involved in an integrated and larger class of expert systems. The broadest possible category of interacting expert systems is described along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.

  17. Non-plant referenced simulator methodology to meet new 10 CFR 55.45 rule

    International Nuclear Information System (INIS)

    Ibarra, J.G.

    1988-01-01

    The new 10 CFR 55.45 rule on operating tests necessitates that simulators be upgraded to meet the new requirements. This paper presents the human factors work done on an NRC-approved guidance document, sponsored by four utilities, to develop a non-plant-referenced simulator facility. Human factors developed the simulator process flow and criteria, and integrated all the development work into the simulation facility plan. The human factors work provided the mechanism to solidify ideas and the foundation for the simulator development methodology.

  18. Rules of performance in the nursing home: A grounded theory of nurse-CNA communication.

    Science.gov (United States)

    Madden, Connie; Clayton, Margaret; Canary, Heather E; Towsley, Gail; Cloyes, Kristin; Lund, Dale

    This study offers an initial theoretical understanding of nurse-CNA communication processes from the perspectives of nurses and CNAs who are providing direct care to residents in nursing homes. A grounded theory approach provided an understanding of nurse-CNA communication process within the complexities of the nursing home setting. Four themes (maintaining information flow, following procedure, fostering collegiality, and showing respect) describe the "rules of performance" that intertwine in nuanced relationships to guide nurse-CNA communication processes. Understanding how these rules of performance guide nurse-CNA communication processes, and how they are positively and negatively influenced, suggests that nurse-CNA communication during direct care of nursing home residents could be improved through policy and education that is specifically designed to be relevant and applicable to direct care providers in the nursing home environment. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Ruling the Commons. Introducing a new methodology for the analysis of historical commons

    Directory of Open Access Journals (Sweden)

    Tine de Moor

    2016-10-01

    Despite significant progress in recent years, the evolution of commons over the long run remains an under-explored area within commons studies. In recent years an international team of historians has worked under the umbrella of the Common Rules Project in order to design and test a new methodology aimed at advancing our knowledge on the dynamics of institutions for collective action – in particular commons. This project aims to contribute to the current debate on commons on three different fronts. Theoretically, it explicitly draws our attention to issues of change and adaptation in the commons – contrasting with more static analyses. Empirically, it highlights the value of historical records as a rich source of information for longitudinal analysis of the functioning of commons. Methodologically, it develops a systematic way of analyzing and comparing commons’ regulations across regions and time, setting a number of variables that have been defined on the basis of the “most common denominators” in commons regulation across countries and time periods. In this paper we introduce the project, describe our sources and methodology, and present the preliminary results of our analysis.

  20. Design methodology for the physical protection upgrade rule requirements for fixed sites. Technical report

    International Nuclear Information System (INIS)

    Evans, L.J. Jr.; Allen, T.

    1980-06-01

    This Design Methodology document aids the licensee in understanding how the fixed site requirements of the Physical Protection Upgrade Rule affect the design of physical protection systems for fuel processing plants, fuel manufacturing plants, or other fixed site special nuclear material operations involving possession or use of formula quantities of strategic special nuclear material. The document consists of three major elements: Logic Trees, Safeguards Jobs and Component Matrices, and Effectiveness Test Questionnaires. The work is based upon a previous study conducted by Sandia Laboratories for the Nuclear Regulatory Commission

  1. Initial building investigations at Aberdeen Proving Ground, Maryland: Objectives and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Brubaker, K.L.; Dougherty, J.M.; McGinnis, L.D.

    1994-12-01

    As part of an environmental-contamination source-definition program at Aberdeen Proving Ground, detailed internal and external inspections of 23 potentially contaminated buildings are being conducted to describe and characterize the state of each building as it currently exists and to identify areas potentially contaminated with toxic or other hazardous substances. In addition, a detailed geophysical investigation is being conducted in the vicinity of each target building to locate and identify subsurface structures, associated with former building operations, that are potential sources of contamination. This report describes the objectives of the initial building inspections, including the geophysical investigations, and discusses the methodology that has been developed to achieve these objectives.

  2. Navigating the grounded theory terrain. Part 1.

    Science.gov (United States)

    Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John

    2011-01-01

    The decision to use grounded theory is not an easy one and this article aims to illustrate and explore the methodological complexity and decision-making process. It explores the decision making of one researcher in the first two years of a grounded theory PhD study looking at the psychosocial training needs of nurses and healthcare assistants working with people with dementia in residential care. It aims to map out three different approaches to grounded theory: classic, Straussian and constructivist. In nursing research, grounded theory is often referred to but it is not always well understood. This confusion is due in part to the history of grounded theory methodology, which is one of development and divergent approaches. Common elements across grounded theory approaches are briefly outlined, along with the key differences of the divergent approaches. Methodological literature pertaining to the three chosen grounded theory approaches is considered and presented to illustrate the options and support the choice made. The process of deciding on classical grounded theory as the version best suited to this research is presented. The methodological and personal factors that directed the decision are outlined. The relative strengths of Straussian and constructivist grounded theories are reviewed. All three grounded theory approaches considered offer the researcher a structured, rigorous methodology, but researchers need to understand their choices and make those choices based on a range of methodological and personal factors. In the second article, the final methodological decision will be outlined and its research application described.

  3. Methodology for applying monitored natural attenuation to petroleum hydrocarbon-contaminated ground-water systems with examples from South Carolina

    Science.gov (United States)

    Chapelle, Frank H.; Robertson, John F.; Landmeyer, James E.; Bradley, Paul M.

    2000-01-01

    Natural attenuation processes such as dispersion, advection, and biodegradation serve to decrease concentrations of dissolved contaminants as they are transported in all ground-water systems. However, the efficiency of these natural attenuation processes, and the degree to which they help attain remediation goals, varies considerably from site to site. This report provides a methodology for quantifying various natural attenuation mechanisms. This methodology incorporates information on (1) concentrations of contaminants in space and/or time; (2) ambient reduction/oxidation (redox) conditions; (3) rates and directions of ground-water flow; (4) rates of contaminant biodegradation; and (5) demographic considerations, such as the presence of nearby receptor exposure points or property boundaries. This document outlines the hydrologic, geochemical, and biologic data needed to assess the efficiency of natural attenuation, provides a screening tool for making preliminary assessments, and provides examples of how to determine when natural attenuation can be a useful component of site remediation at leaking underground storage tank sites.
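
    Screening assessments of this kind often combine items (3) and (4) into a first-order approximation of the steady-state concentration along the flow path. A common textbook form (not a formula quoted from this report), for seepage velocity v and first-order biodegradation rate constant λ:

        % Steady-state first-order screening approximation: concentration at
        % distance x down-gradient of the source (textbook form).
        C(x) = C_0 \, e^{-\lambda x / v}

    For example, C_0 = 10 mg/L, λ = 0.5 per year, and v = 30 m/yr give C(100 m) = 10·e^(-0.5·100/30) ≈ 1.9 mg/L.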

  4. Transitions in midwestern ground water law

    International Nuclear Information System (INIS)

    Bowman, J.A.; Clark, G.R.

    1989-01-01

    The evolution of ground-water law in eight states in the Midwest (Illinois, Indiana, Iowa, Michigan, Minnesota, Missouri, Ohio, and Wisconsin) is examined, and a review of transitions in ground-water doctrines is presented. Two underlying themes in changing ground-water management are communicated. First, ground-water law is evolving from private property rules of capture based on the absolute ownership doctrines to rules requiring conservation and sharing of ground water as a public resource. Second, in both courts and state legislatures, a proactive role of ground-water management is emerging, again, with an emphasis on sharing. Both of these trends are apparent in the Midwest. In the last decade midwestern states have (1) seen significant shifts in court decisions on ground-water use with greater recognition of the reciprocal or mutually dependent nature of ground-water rights, and (2) seen increased legislative development of comprehensive ground-water management statutes that emphasize the reciprocal liabilities of ground-water use. These trends are examined and ground-water management programs discussed for eight states in the Midwest

  5. Methodological Grounds of Managing Innovation Development of Restaurants

    Directory of Open Access Journals (Sweden)

    Naidiuk V. S.

    2013-12-01

    The goal of the article lies in the identification and further development of methodological grounds for managing the innovation development of restaurants. Based on a critical analysis of existing scientific views on the essence of the “managing innovation development of an enterprise” notion, the article clarifies this definition. As a result of the study, the article builds a cause-effect diagram for solving the problem of ensuring efficient management of the innovation development of a restaurant, and develops a conceptual scheme for devising and realising an innovation development strategy in a restaurant. It experimentally confirms the hypothesis that there is a very strong relationship between resistance to innovation changes and the share of qualified personnel capable of permanent development (learning and generation of new ideas) in restaurants, and builds a model of the dependency between them. Prospects for further studies in this direction include the development of methodical approaches to identifying the level of innovation potential and assessing the efficiency of managing the innovation development of different (by type, class, size, etc.) restaurants. The obtained data could also be used to develop new, or improve existing, tools of strategic management of innovation development at the micro-level.

  6. 77 FR 52977 - Regulatory Capital Rules: Advanced Approaches Risk-Based Capital Rule; Market Risk Capital Rule

    Science.gov (United States)

    2012-08-30

    Table-of-contents fragments recovered from the record: risk-weighted assets for residential mortgages, securitization exposures, and counterparty credit risk; Risk-Weighted Assets--Proposed Modifications to the Advanced Approaches Rules: A. Counterparty Credit Risk ... Margin Period of Risk; 3. Changes to the Internal Models Methodology (IMM); 4. Credit Valuation Adjustments...

  7. Making sense of grounded theory in medical education.

    Science.gov (United States)

    Kennedy, Tara J T; Lingard, Lorelei A

    2006-02-01

    Grounded theory is a research methodology designed to develop, through collection and analysis of data that is primarily (but not exclusively) qualitative, a well-integrated set of concepts that provide a theoretical explanation of a social phenomenon. This paper aims to provide an introduction to key features of grounded theory methodology within the context of medical education research. In this paper we include a discussion of the origins of grounded theory, a description of key methodological processes, a comment on pitfalls encountered commonly in the application of grounded theory research, and a summary of the strengths of grounded theory methodology with illustrations from the medical education domain. The significant strengths of grounded theory that have resulted in its enduring prominence in qualitative research include its clearly articulated analytical process and its emphasis on the generation of pragmatic theory that is grounded in the data of experience. When applied properly and thoughtfully, grounded theory can address research questions of significant relevance to the domain of medical education.

  8. [Introduction to grounded theory].

    Science.gov (United States)

    Wang, Shou-Yu; Windsor, Carol; Yates, Patsy

    2012-02-01

    Grounded theory, first developed by Glaser and Strauss in the 1960s, was introduced into nursing education as a distinct research methodology in the 1970s. The theory is grounded in a critique of the dominant contemporary approach to social inquiry, which imposed "enduring" theoretical propositions onto study data. Rather than starting from a set theoretical framework, grounded theory relies on researchers distinguishing meaningful constructs from generated data and then identifying an appropriate theory. Grounded theory is thus particularly useful in investigating complex issues and behaviours not previously addressed and concepts and relationships in particular populations or places that are still undeveloped or weakly connected. Grounded theory data analysis processes include open, axial and selective coding levels. The purpose of this article was to explore the grounded theory research process and provide an initial understanding of this methodology.

  9. Evolving temporal association rules with genetic algorithms

    OpenAIRE

    Matthews, Stephen G.; Gongora, Mario A.; Hopgood, Adrian A.

    2010-01-01

    A novel framework for mining temporal association rules by discovering itemsets with a genetic algorithm is introduced. Metaheuristics have been applied to association rule mining; we show the efficacy of extending this to another variant: temporal association rule mining. Our framework is an enhancement to existing temporal association rule mining methods as it employs a genetic algorithm to simultaneously search the rule space and temporal space. A methodology for validating the ability of...
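
    As an illustration of the idea (not the authors' implementation, which the abstract only sketches), a candidate temporal itemset can be encoded as a chromosome pairing an itemset with a time window, with fitness given by the itemset's support inside that window; the log, items and GA parameters below are invented for the example:

        # Minimal GA sketch: evolve (itemset, time window) chromosomes whose
        # fitness is temporal support in a synthetic transaction log.
        import random

        ITEMS = ["bread", "milk", "beer", "chips", "salsa"]
        LOG = [(random.uniform(0, 100), set(random.sample(ITEMS, 2)))
               for _ in range(200)]  # (timestamp, transaction)

        def fitness(chrom):
            items, (t0, t1) = chrom
            window = [tx for ts, tx in LOG if t0 <= ts <= t1]
            if not window:
                return 0.0
            return sum(items <= tx for tx in window) / len(window)

        def random_chrom():
            t0 = random.uniform(0, 90)
            return (set(random.sample(ITEMS, 2)), (t0, t0 + random.uniform(5, 10)))

        def mutate(chrom):
            items, (t0, t1) = chrom
            if random.random() < 0.5:          # swap one item for a new one
                items = set(items)
                items.pop()
                items.add(random.choice([i for i in ITEMS if i not in items]))
            else:                              # shift the temporal window
                d = random.uniform(-5, 5)
                t0, t1 = t0 + d, t1 + d
            return (items, (t0, t1))

        population = [random_chrom() for _ in range(30)]
        for _ in range(50):                    # truncation selection + mutation
            population.sort(key=fitness, reverse=True)
            population = population[:10] + [mutate(random.choice(population[:10]))
                                            for _ in range(20)]

        best = max(population, key=fitness)
        print("best itemset:", best[0], "window:", best[1], "support:", fitness(best))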

  10. New approach to nonleptonic weak interactions. I. Derivation of asymptotic selection rules for the two-particle weak ground-state-hadron matrix elements

    International Nuclear Information System (INIS)

    Tanuma, T.; Oneda, S.; Terasaki, K.

    1984-01-01

    A new approach to nonleptonic weak interactions is presented. It is argued that the presence and violation of the |ΔI| = 1/2 rule as well as those of the quark-line selection rules can be explained in a unified way, along with other fundamental physical quantities [such as the value of g_A(0) and the smallness of the isoscalar nucleon magnetic moments], in terms of a single dynamical asymptotic ansatz imposed at the level of observable hadrons. The ansatz prescribes a way in which asymptotic flavor SU(N) symmetry is secured levelwise for a certain class of chiral algebras in the standard QCD model. It yields severe asymptotic constraints upon the two-particle hadronic matrix elements of nonleptonic weak Hamiltonians as well as QCD currents and their charges. It produces for weak matrix elements the asymptotic |ΔI| = 1/2 rule and its charm counterpart for the ground-state hadrons, while for strong matrix elements quark-line-like approximate selection rules. However, for the less important weak two-particle vertices involving higher excited states, the |ΔI| = 1/2 rule and its charm counterpart are in general violated, providing us with an explicit source of the violation of these selection rules in physical processes.

  11. Identification and sensitivity analysis of a correlated ground rule system (design arc)

    Science.gov (United States)

    Eastman, Eric; Chidambarrao, Dureseti; Rausch, Werner; Topaloglu, Rasit O.; Shao, Dongbing; Ramachandran, Ravikumar; Angyal, Matthew

    2017-04-01

    We demonstrate a tool which can function as an interface between VLSI designers and process-technology engineers throughout the Design-Technology Co-optimization (DTCO) process. This tool uses a Monte Carlo algorithm on the output of lithography simulations to model the frequency of fail mechanisms on wafer. Fail mechanisms are defined according to process integration flow: by Boolean operations and measurements between original and derived shapes. Another feature of this design rule optimization methodology is the use of a Markov-Chain-based algorithm to perform a sensitivity analysis, the output of which may be used by process engineers to target key process-induced variabilities for improvement. This tool is used to analyze multiple Middle-Of-Line fail mechanisms in a 10nm inverter design and identify key process assumptions that will most strongly affect the yield of the structures. This tool and the underlying algorithm are also shown to be scalable to arbitrarily complex geometries in three dimensions, a characteristic that is becoming more important with the introduction of novel patterning technologies and more complex 3-D on-wafer structures.
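
    The paper's fail models come from lithography simulation; as a toy stand-in, the Monte Carlo and one-at-a-time sensitivity steps can be sketched with assumed Gaussian variation sources and an invented minimum-spacing fail criterion:

        # Sketch only: sample line spacing under assumed variation sources,
        # count "short" fails, then bump each sigma by 10% to rank sensitivity.
        import random

        ASSUMPTIONS = {"cd_sigma": 1.5, "overlay_sigma": 2.0}  # nm, hypothetical
        NOMINAL_SPACE, MIN_SPACE, TRIALS = 20.0, 14.0, 100_000

        def fail_rate(cd_sigma, overlay_sigma):
            fails = 0
            for _ in range(TRIALS):
                space = (NOMINAL_SPACE
                         + random.gauss(0, cd_sigma)        # linewidth variation
                         + random.gauss(0, overlay_sigma))  # overlay error
                fails += space < MIN_SPACE
            return fails / TRIALS

        base = fail_rate(**ASSUMPTIONS)
        print(f"baseline fail rate: {base:.2e}")
        for name in ASSUMPTIONS:  # one-at-a-time sensitivity
            bumped = dict(ASSUMPTIONS, **{name: ASSUMPTIONS[name] * 1.1})
            print(name, "+10% ->", f"{fail_rate(**bumped):.2e}")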

  12. Implementing XML Schema Naming and Design Rules

    Energy Technology Data Exchange (ETDEWEB)

    Lubell, Joshua [National Institute of Standards and Technology (NIST); Kulvatunyou, Boonserm [ORNL; Morris, Katherine [National Institute of Standards and Technology (NIST); Harvey, Betty [Electronic Commerce Connection, Inc.

    2006-08-01

    We are building a methodology and tool kit for encoding XML schema Naming and Design Rules (NDRs) in a computer-interpretable fashion, enabling automated rule enforcement and improving schema quality. Through our experience implementing rules from various NDR specifications, we discuss some issues and offer practical guidance to organizations grappling with NDR development.
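
    As a small illustration of what computer-interpretable NDR enforcement can look like (the rule and schema below are invented, not taken from any particular NDR specification), one naming rule is encoded as a predicate and run over a schema's element declarations:

        # Sketch: enforce "element names must be UpperCamelCase" over an XSD.
        import re
        import xml.etree.ElementTree as ET

        XSD = """<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
          <xs:element name="PurchaseOrder"/>
          <xs:element name="shipTo"/>
        </xs:schema>"""

        UPPER_CAMEL = re.compile(r"^[A-Z][A-Za-z0-9]*$")
        NS = {"xs": "http://www.w3.org/2001/XMLSchema"}

        root = ET.fromstring(XSD)
        for elem in root.findall(".//xs:element", NS):
            name = elem.get("name", "")
            if not UPPER_CAMEL.match(name):
                print(f"NDR violation: element '{name}' is not UpperCamelCase")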

  13. Grounded theory.

    Science.gov (United States)

    Harris, Tina

    2015-04-29

    Grounded theory is a popular research approach in health care and the social sciences. This article provides a description of grounded theory methodology and its key components, using examples from published studies to demonstrate practical application. It aims to demystify grounded theory for novice nurse researchers, by explaining what it is, when to use it, why they would want to use it and how to use it. It should enable nurse researchers to decide if grounded theory is an appropriate approach for their research, and to determine the quality of any grounded theory research they read.

  14. A Rule-Based Local Search Algorithm for General Shift Design Problems in Airport Ground Handling

    DEFF Research Database (Denmark)

    Clausen, Tommy

    We consider a generalized version of the shift design problem where shifts are created to cover a multiskilled demand and fit the parameters of the workforce. We present a collection of constraints and objectives for the generalized shift design problem. A local search solution framework with multiple neighborhoods and a loosely coupled rule engine based on simulated annealing is presented. Computational experiments on real-life data from various airport ground handling organizations show the performance and flexibility of the proposed algorithm.
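
    A compressed sketch of the simulated-annealing core (the demand curve, move set and cooling schedule are invented placeholders; the paper's rule engine, multiple neighborhoods and multiskill constraints are omitted):

        # Sketch: shifts are (start, length) pairs; the objective penalizes
        # under- and over-coverage of an hourly demand curve; worse neighbors
        # are accepted with a temperature-dependent probability.
        import math
        import random

        DEMAND = [2, 2, 3, 5, 6, 6, 4, 3, 2, 2, 1, 1]  # staff needed per hour

        def cost(shifts):
            cover = [0] * len(DEMAND)
            for start, length in shifts:
                for h in range(start, min(start + length, len(DEMAND))):
                    cover[h] += 1
            return sum(abs(c - d) for c, d in zip(cover, DEMAND))

        def neighbor(shifts):
            new = list(shifts)
            i = random.randrange(len(new))
            start, length = new[i]
            if random.random() < 0.5:
                start = max(0, min(len(DEMAND) - 1, start + random.choice([-1, 1])))
            else:
                length = max(2, min(8, length + random.choice([-1, 1])))
            new[i] = (start, length)
            return new

        current = [(random.randrange(8), random.randint(4, 8)) for _ in range(5)]
        temp = 10.0
        while temp > 0.01:
            cand = neighbor(current)
            delta = cost(cand) - cost(current)
            if delta <= 0 or random.random() < math.exp(-delta / temp):
                current = cand
            temp *= 0.995

        print("final cost:", cost(current), "shifts:", current)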

  15. Atypical Rulings of the Indonesian Constitutional Court

    Directory of Open Access Journals (Sweden)

    Bisariyadi

    2016-08-01

    Full Text Available In deciding judicial review cases, the Court may issue rulings that are not in accordance with what is stipulated in the Constitutional Court Law (Law Number 8 Year 2011). Atypical rulings mean that the court may reconstruct a provision, delay the enactment of legislation/rulings, or give instructions to lawmakers. In addition, the court has also introduced the "conditionally (un)constitutional" concept. This essay attempts to identify and classify these atypical rulings, including conditionally (un)constitutional rulings, by examining the constitutional court judicial review rulings from 2003 to 2015. This study will provide groundwork for further research on atypical rulings by the Indonesian Constitutional Court.

  16. Revised Rules for Concrete Bridges

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Jensen, F. M.; Middleton, C.

    This paper is based on research performed for the Highway Agency, London, UK under the project DPU/9/44 "Revision of Bridge Assessment Rules Based on Whole Life Performance: Concrete Bridges". It contains details of a methodology which can be used to generate Whole Life (WL) reliability profiles. These WL reliability profiles may be used to establish revised rules for Concrete Bridges.

  17. Spatio-Temporal Rule Mining

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach

    2005-01-01

    Recent advances in communication and information technology, such as the increasing accuracy of GPS technology and the miniaturization of wireless communication devices, pave the road for Location-Based Services (LBS). To achieve high quality for such services, spatio-temporal data mining techniques are needed. In this paper, we describe experiences with spatio-temporal rule mining in a Danish data mining company. First, a number of real-world spatio-temporal data sets are described, leading to a taxonomy of spatio-temporal data. Second, the paper describes a general methodology that transforms the spatio-temporal rule mining task to the traditional market basket analysis task and applies it to the described data sets, enabling traditional association rule mining methods to discover spatio-temporal rules for LBS. Finally, unique issues in spatio-temporal rule mining are identified and discussed.
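
    The key transformation can be illustrated in a few lines: pivot spatio-temporal records into per-entity "baskets" of (area, period) items, after which an ordinary frequent-itemset counter applies. The log and support threshold below are invented:

        # Sketch: spatio-temporal log -> market baskets -> frequent pairs.
        from collections import Counter
        from itertools import combinations

        # Hypothetical LBS log: (user, area, period-of-day).
        LOG = [("u1", "harbour", "am"), ("u1", "centre", "pm"),
               ("u2", "harbour", "am"), ("u2", "centre", "pm"),
               ("u3", "harbour", "am"), ("u3", "suburb", "pm")]

        baskets = {}
        for user, area, period in LOG:
            baskets.setdefault(user, set()).add((area, period))

        pair_counts = Counter()
        for items in baskets.values():
            for pair in combinations(sorted(items), 2):
                pair_counts[pair] += 1

        MIN_SUPPORT = 2  # absolute support threshold
        for pair, count in pair_counts.items():
            if count >= MIN_SUPPORT:
                print(f"{pair[0]} => {pair[1]} (support {count})")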

  18. Navigating the grounded theory terrain. Part 2.

    Science.gov (United States)

    Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John

    2011-01-01

    In this paper, the choice of classic grounded theory will be discussed and justified in the context of the first author's PhD research. The methodological discussion takes place within the context of PhD research entitled: Development of a stakeholder-led framework for a structured education programme that will prepare nurses and healthcare assistants to deliver a psychosocial intervention for people with dementia. There is a lack of research and limited understanding of the effect of psychosocial interventions on people with dementia. The first author thought classic grounded theory a suitable research methodology to investigate as it is held to be ideal for areas of research where there is little understanding of the social processes at work. The literature relating to the practical application of classic grounded theory is illustrated using examples relating to four key grounded theory components: theory development (using constant comparison and memoing); methodological rigour; emergence of a core category; and inclusion of self and engagement with participants. Following discussion of the choice and application of classic grounded theory, this paper explores the need for researchers to visit and understand the various grounded theory options. This paper argues that researchers new to grounded theory must be familiar with and understand the various options. The researchers will then be able to apply the methodologies they choose consistently and critically. Doing so will allow them to develop theory rigorously and they will ultimately be able to better defend their final methodological destinations.

  19. Medicare program; replacement of reasonable charge methodology by fee schedules for parenteral and enteral nutrients, equipment, and supplies. Final rule.

    Science.gov (United States)

    2001-08-28

    This final rule implements fee schedules for payment of parenteral and enteral nutrition (PEN) items and services furnished under the prosthetic device benefit, defined in section 1861(s)(8) of the Social Security Act. The authority for establishing these fee schedules is provided by the Balanced Budget Act of 1997, which amended the Social Security Act at section 1842(s). Section 1842(s) of the Social Security Act specifies that statewide or other area wide fee schedules may be implemented for the following items and services still subject to the reasonable charge payment methodology: medical supplies; home dialysis supplies and equipment; therapeutic shoes; parenteral and enteral nutrients, equipment, and supplies; electromyogram devices; salivation devices; blood products; and transfusion medicine. This final rule describes changes made to the proposed fee schedule payment methodology for these items and services and provides that the fee schedules for PEN items and services are effective for all covered items and services furnished on or after January 1, 2002. Fee schedules will not be implemented for electromyogram devices and salivation devices at this time since these items are not covered by Medicare. In addition, fee schedules will not be implemented for medical supplies, home dialysis supplies and equipment, therapeutic shoes, blood products, and transfusion medicine at this time since the data required to establish these fee schedules are inadequate.

  20. Rule-based decision making model

    International Nuclear Information System (INIS)

    Sirola, Miki

    1998-01-01

    A rule-based decision making model is designed in the G2 environment. A theoretical and methodological frame for the model is composed and motivated. The rule-based decision making model is based on object-oriented modelling, knowledge engineering and decision theory. The idea of the safety objective tree is utilized. Advanced rule-based methodologies are applied. A general decision making model, the 'decision element', is constructed. The strategy planning of the decision element is based on, e.g., value theory and utility theory. A hypothetical process model is built to give input data for the decision element. The basic principle of the object model in decision making is division into tasks. Probability models are used in characterizing component availabilities. Bayes' theorem is used to recalculate the probability figures when new information is obtained. The model includes simple learning features to save the solution path. A decision analytic interpretation is given to the decision making process. (author)
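
    The Bayes step mentioned above is standard; a minimal sketch with assumed likelihoods (the G2 rule environment itself is proprietary and is not reproduced here):

        # Sketch: recalculate a component's availability when new evidence
        # (a diagnostic test result) arrives, via Bayes' theorem.
        def bayes_update(p_avail, p_obs_given_avail, p_obs_given_unavail):
            """Posterior probability that the component is available."""
            num = p_obs_given_avail * p_avail
            den = num + p_obs_given_unavail * (1.0 - p_avail)
            return num / den

        p = 0.90  # prior availability
        # Observation: a diagnostic test passed; likelihoods are assumptions.
        p = bayes_update(p, p_obs_given_avail=0.95, p_obs_given_unavail=0.20)
        print(f"posterior availability: {p:.3f}")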

  1. App Studies : Platform Rules and Methodological Challenges

    NARCIS (Netherlands)

    Gerlitz, C.; Helmond, A.; van der Vlist, F.; Weltevrede, E.; De Groote, S.; Majmundar, P.

    2016-01-01

    The panel engages with conceptual and methodological challenges within a specific area of ‘internet rules’, namely the space of mobile apps. Whereas the web was set out to function as a ‘generative’ and open technology facilitating the production of unanticipated services and applications, the

  2. 3He electron scattering sum rules

    International Nuclear Information System (INIS)

    Kim, Y.E.; Tornow, V.

    1982-01-01

    Electron scattering sum rules for 3He are derived with a realistic ground-state wave function. The theoretical results are compared with the experimentally measured integrated cross sections. (author)

  3. La Teoría Fundamentada como Metodología de Investigación Cualitativa en Enfermería Grounded theory as a qualitative research methodology in nursing

    Directory of Open Access Journals (Sweden)

    Cristina G. Vivar

    2010-12-01

    Full Text Available Grounded theory is a qualitative research design, recognized at an international level, that has been used to develop theories about relevant health phenomena. However, in the Spanish nursing context, grounded theory has received very little attention. This article therefore focuses on this qualitative methodology, illustrates its contribution to nursing research in Spain and its usefulness for nursing, and briefly presents the main distinguishing methodological characteristics of grounded theory.

  4. Methodology for evaluating the grounding system in electrical substations; Metodologia para la evaluacion del sistema de puesta a tierra en subestaciones electricas

    Energy Technology Data Exchange (ETDEWEB)

    Torrelles Rivas, L.F [Universidad Nacional Experimental Politecnica: Antonio Jose de Sucre (UNEXPO), Guayana, Bolivar (Venezuela)]. E-mail: torrellesluis@gmail.com; Alvarez, P. [Petroleos de Venezuela S.A (PDVSA), Maturin, Monagas (Venezuela)]. E-mail: alvarezph@pdvsa.com

    2013-03-15

    The present work proposes a methodology for evaluating grounding systems in electrical substations of medium and high voltage, in order to diagnose the state of the elements of the grounding system and the corresponding electrical variables. The assessment methodology developed includes a visual inspection phase of the elements of the substation. Then, by performing measurements and data analysis, the electrical continuity between the components of the substation and the ground mesh is verified, along with the soil resistivity and the resistance of the mesh. The methodology also includes the calculation of the step and touch voltages of the substation, based on the criteria of international IEEE standards. We study the case of the 115 kV Pirital Substation belonging to the PDVSA Oriente Transmission Network.
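
    For the step- and touch-voltage step, IEEE Std 80 gives tolerable-voltage formulas of the following form (50 kg body criterion; the soil resistivity, surface-layer derating factor Cs and fault-clearing time below are assumed example values, not data from the paper):

        # Sketch of the IEEE Std 80 tolerable step/touch voltage check.
        import math

        def tolerable_voltages(rho_s, cs, t_s):
            """Tolerable step/touch voltages (V), 50 kg body, fault time t_s (s)."""
            e_step = (1000 + 6.0 * cs * rho_s) * 0.116 / math.sqrt(t_s)
            e_touch = (1000 + 1.5 * cs * rho_s) * 0.116 / math.sqrt(t_s)
            return e_step, e_touch

        step, touch = tolerable_voltages(rho_s=3000.0, cs=0.8, t_s=0.5)
        print(f"tolerable step: {step:.0f} V, touch: {touch:.0f} V")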

  5. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop the methodologies and corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating ground response spectrum and probabilistic seismic hazard. Using the PSHA computer program, the Cumulative Probability Functions (CPDF) and Probability Functions (PDF) of the annual exceedance have been investigated for the analysis of the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared with those results from the different input parameter spaces.
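
    The annual-exceedance computation at the heart of PSHA can be sketched as follows; the Gutenberg-Richter parameters and the toy attenuation relation are invented for illustration and are not the study's calibrated models:

        # Sketch: annual rate of exceeding a PGA level, combining a G-R
        # recurrence with a toy attenuation relation and lognormal scatter.
        import math

        def p_exceed(a_target, magnitude, distance_km, sigma_ln=0.6):
            """P(PGA > a_target | M, R) with a toy attenuation relation."""
            ln_median = -3.5 + 0.8 * magnitude - 1.1 * math.log(distance_km)
            z = (math.log(a_target) - ln_median) / sigma_ln
            return 0.5 * math.erfc(z / math.sqrt(2))  # lognormal exceedance

        def annual_rate(a_target, mags=(5.0, 5.5, 6.0, 6.5, 7.0), dist=30.0):
            total = 0.0
            for m in mags:
                rate_m = 10 ** (3.0 - 1.0 * m)  # G-R rate of events near M = m
                total += rate_m * p_exceed(a_target, m, dist)
            return total

        for a in (0.1, 0.3, 0.5):  # g
            print(f"annual exceedance rate at {a:.1f} g: {annual_rate(a):.2e}")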

  6. The ATLAS SCT grounding and shielding concept and implementation

    CERN Document Server

    Bates, RL; Bernabeu, J; Bizzell, J; Bohm, J; Brenner, R; Bruckman de Renstrom, P A; Catinaccio, A; Cindro, V; Ciocio, A; Civera, J V; Chouridou, S; Dervan, P; Dick, B; Dolezal, Z; Eklund, L; Feld, L; Ferrere, D; Gadomski, S; Gonzalez, F; Gornicki, E; Greenhall, A; Grillo, A A; Grosse-Knetter, J; Gruwe, M; Haywood, S; Hessey, N P; Ikegami, Y; Jones, T J; Kaplon, J; Kodys, P; Kohriki, T; Kondo, T; Koperny, S; Lacasta, C; Lozano Bahilo, J; Malecki, P; Martinez-McKinney, F; McMahon, S J; McPherson, A; Mikulec, B; Mikus, M; Moorhead, G F; Morrissey, M C; Nagai, K; Nichols, A; O'Shea, V; Pater, J R; Peeters, S J M; Pernegger, H; Perrin, E; Phillips, P W; Pieron, J P; Roe, S; Sanchez, J; Spencer, E; Stastny, J; Tarrant, J; Terada, S; Tyndel, M; Unno, Y; Wallny, R; Weber, M; Weidberg, A R; Wells, P S; Werneke, P; Wilmut, I

    2012-01-01

    This paper describes the design and implementation of the grounding and shielding system for the ATLAS SemiConductor Tracker (SCT). The mitigation of electromagnetic interference and noise pickup through power lines is the critical design goal as they have the potential to jeopardize the electrical performance. We accomplish this by adhering to the ATLAS grounding rules, by avoiding ground loops and isolating the different subdetectors. Noise sources are identified and design rules to protect the SCT against them are described. A rigorous implementation of the design was crucial to achieve the required performance. This paper highlights the location, connection and assembly of the different components that affect the grounding and shielding system: cables, filters, cooling pipes, shielding enclosure, power supplies and others. Special care is taken with the electrical properties of materials and joints. The monitoring of the grounding system during the installation period is also discussed. Finally, after con...

  7. Rules of thumb and simplified methods

    International Nuclear Information System (INIS)

    Lahti, G.P.

    1985-01-01

    The author points out the value of a thorough grounding in fundamental physics combined with experience of applied practice when using simplified methods and rules of thumb in shield engineering. Present-day quality assurance procedures and good engineering practices require careful documentation of all calculations. The aforementioned knowledge of rules of thumb and back-of-the-envelope calculations can assure both the preparer and the reviewer that the results in the quality assurance documentation are the physically correct ones.

  8. Rules of Thumb from the Literature on Research and Evaluation.

    Science.gov (United States)

    Lai, Morris K.

    Practical advice on frequently asked questions dealing with research and evaluation methodology is presented as rules of thumb, with citations to the author's sources. A statement in the literature is considered a rule of thumb if it meets one of the following criteria: (1) it is specifically called a rule of thumb; (2) it contains numbers in…

  9. Management Research and Grounded Theory: A review of grounded theorybuilding approach in organisational and management research.

    Directory of Open Access Journals (Sweden)

    Graham J.J. Kenealy, Ph.D.

    2008-06-01

    Full Text Available Grounded theory is a systematic methodology for the collection and analysis of data which was discovered by Glaser and Strauss in the 1960s. The discovery of this method was first presented to the academic community in their book 'The Discovery of Grounded Theory' (1967), which still remains a primary point of reference for those undertaking qualitative research and grounded theory in particular. This powerful research method has become very popular in some research domains; whilst increasing in popularity, it is still less prevalent in the field of organisational and management research, particularly in its original form. This self-reflexive paper sets out to explore possible reasons for this imbalance, which takes the discussion onto the areas of methodological adaptation and training. It also enters the debate about access to research subjects and provides a succinct argument supporting the notion that grounded theory should simply be viewed as a method that develops empirically grounded conceptual theory.

  10. How to do a grounded theory study: a worked example of a study of dental practices.

    Science.gov (United States)

    Sbaraini, Alexandra; Carter, Stacy M; Evans, R Wendell; Blinkhorn, Anthony

    2011-09-09

    Qualitative methodologies are increasingly popular in medical research. Grounded theory is the methodology most-often cited by authors of qualitative studies in medicine, but it has been suggested that many 'grounded theory' studies are not concordant with the methodology. In this paper we provide a worked example of a grounded theory project. Our aim is to provide a model for practice, to connect medical researchers with a useful methodology, and to increase the quality of 'grounded theory' research published in the medical literature. We documented a worked example of using grounded theory methodology in practice. We describe our sampling, data collection, data analysis and interpretation. We explain how these steps were consistent with grounded theory methodology, and show how they related to one another. Grounded theory methodology assisted us to develop a detailed model of the process of adapting preventive protocols into dental practice, and to analyse variation in this process in different dental practices. By employing grounded theory methodology rigorously, medical researchers can better design and justify their methods, and produce high-quality findings that will be more useful to patients, professionals and the research community.

  11. How to do a grounded theory study: a worked example of a study of dental practices

    Directory of Open Access Journals (Sweden)

    Evans R

    2011-09-01

    Full Text Available Background: Qualitative methodologies are increasingly popular in medical research. Grounded theory is the methodology most-often cited by authors of qualitative studies in medicine, but it has been suggested that many 'grounded theory' studies are not concordant with the methodology. In this paper we provide a worked example of a grounded theory project. Our aim is to provide a model for practice, to connect medical researchers with a useful methodology, and to increase the quality of 'grounded theory' research published in the medical literature. Methods: We documented a worked example of using grounded theory methodology in practice. Results: We describe our sampling, data collection, data analysis and interpretation. We explain how these steps were consistent with grounded theory methodology, and show how they related to one another. Grounded theory methodology assisted us to develop a detailed model of the process of adapting preventive protocols into dental practice, and to analyse variation in this process in different dental practices. Conclusions: By employing grounded theory methodology rigorously, medical researchers can better design and justify their methods, and produce high-quality findings that will be more useful to patients, professionals and the research community.

  12. Sign rules for anisotropic quantum spin systems

    International Nuclear Information System (INIS)

    Bishop, R. F.; Farnell, D. J. J.; Parkinson, J. B.

    2000-01-01

    We present exact "sign rules" for various spin-s anisotropic spin-lattice models. It is shown that, after a simple transformation which utilizes these sign rules, the ground-state wave function of the transformed Hamiltonian is positive definite. Using these results, exact statements for various expectation values of off-diagonal operators are presented, and transitions in the behavior of these expectation values are observed at particular values of the anisotropy. Furthermore, the importance of such sign rules in variational calculations and quantum Monte Carlo calculations is emphasized. This is illustrated by a simple variational treatment of a one-dimensional anisotropic spin model.
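
    The classic example of such a transformation is Marshall's sign rule for the spin-1/2 Heisenberg antiferromagnet on a bipartite lattice (shown here for orientation only; the paper treats more general anisotropic models): rotating the spins on one sublattice makes every off-diagonal matrix element non-positive, so the ground state can be chosen with non-negative amplitudes,

        % Marshall sign rule illustration (bipartite lattice, sublattices A, B)
        \[
          U = \prod_{i \in B} e^{i\pi S_i^z}, \qquad
          \tilde{H} = U^{\dagger} H U
            = J \sum_{\langle ij \rangle}
              \Bigl[ S_i^z S_j^z
                - \tfrac{1}{2}\bigl(S_i^+ S_j^- + S_i^- S_j^+\bigr) \Bigr],
          \qquad
          \langle s \mid \Psi_0 \rangle \ge 0 \ \ \forall\, s .
        \]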

  13. The 5S methodology as a tool for improving the organisation

    OpenAIRE

    J. Michalska; D. Szewieczek

    2007-01-01

    Purpose: The aim of this paper is to show the 5S methodology and the way of implementing it in a company. Design/methodology/approach: Within the frames of the authors' own research, the 5S rules were analysed and implemented in a production process. Findings: On the basis of this research it can be stated that introducing the 5S rules brings great changes in the company, for example: process improvement by costs' reduction, increasing of effectivene...

  14. 40 CFR 141.403 - Treatment technique requirements for ground water systems.

    Science.gov (United States)

    2010-07-01

    ... ground water systems. 141.403 Section 141.403 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) NATIONAL PRIMARY DRINKING WATER REGULATIONS Ground Water Rule § 141... customer as follows: (i) Chemical disinfection—(A) Ground water systems serving greater than 3,300 people...

  15. Checking Flight Rules with TraceContract: Application of a Scala DSL for Trace Analysis

    Science.gov (United States)

    Barringer, Howard; Havelund, Klaus; Morris, Robert A.

    2011-01-01

    Typically during the design and development of a NASA space mission, rules and constraints are identified to help reduce reasons for failure during operations. These flight rules are usually captured in a set of indexed tables, containing rule descriptions, rationales for the rules, and other information. Flight rules can be part of manual operations procedures carried out by humans. However, they can also be automated, and either implemented as on-board monitors, or as ground based monitors that are part of a ground data system. In the case of automated flight rules, one considerable expense to be addressed for any mission is the extensive process by which system engineers express flight rules in prose, software developers translate these requirements into code, and then both experts verify that the resulting application is correct. This paper explores the potential benefits of using an internal Scala DSL for general trace analysis, named TRACECONTRACT, to write executable specifications of flight rules. TRACECONTRACT can generally be applied to analysis of for example log files or for monitoring executing systems online.
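
    TRACECONTRACT itself is a Scala DSL; purely to illustrate what a flight rule looks like as a predicate over a trace, here is a hedged Python analogue of one invented rule ("a command must succeed before it may be dispatched again"):

        # Sketch: a flight rule as a state machine over a (event, command) trace.
        def check_rule(trace):
            pending = set()  # commands dispatched but not yet confirmed
            for i, (event, cmd) in enumerate(trace):
                if event == "dispatch":
                    if cmd in pending:
                        return f"violation at {i}: {cmd} re-dispatched before success"
                    pending.add(cmd)
                elif event == "success":
                    pending.discard(cmd)
            return "trace OK" if not pending else f"unconfirmed commands: {pending}"

        trace = [("dispatch", "TURN_ANTENNA"), ("success", "TURN_ANTENNA"),
                 ("dispatch", "TURN_ANTENNA"), ("dispatch", "TURN_ANTENNA")]
        print(check_rule(trace))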

  16. Diverse Ways to Fore-Ground Methodological Insights about Qualitative Research

    Science.gov (United States)

    Koro-Ljungberg, Mirka; Mazzei, Lisa A.; Ceglowski, Deborah

    2013-01-01

    Texts and articles that put epistemological theories and methodologies to work in the context of qualitative research can stimulate scholarship in various ways such as through methodological innovations, transferability of theories and methods, interdisciplinarity, and transformative reflections across traditions and frameworks. Such…

  17. Empirical and pragmatic adequacy of grounded theory: Advancing nurse empowerment theory for nurses' practice.

    Science.gov (United States)

    Udod, Sonia A; Racine, Louise

    2017-12-01

    To draw on the findings of a grounded theory study aimed at exploring how power is exercised in nurse-manager relationships in the hospital setting, this paper examines the empirical and pragmatic adequacy of grounded theory as a methodology to advance the concept of empowerment in the area of nursing leadership and management. The evidence on staff nurse empowerment has highlighted the magnitude of individual and organisational outcomes, but has not fully explicated the micro-level processes underlying how power is exercised, shared or created within the nurse-manager relationship. Although grounded theory is a widely adopted nursing research methodology, it remains less used in nursing leadership because of the dominance of quantitative approaches to research. Grounded theory methodology provides the empirical and pragmatic relevance to inform nursing practice and policy. Grounded theory is a relevant qualitative approach to use in leadership research as it provides a fine and detailed analysis of the process underlying complexity and bureaucracy. Discursive paper. A critical examination of the empirical and pragmatic relevance of grounded theory by Corbin and Strauss as a method for analysing and solving problems in nurses' practice is provided. This paper provides evidence to support the empirical and pragmatic adequacy of grounded theory methodology. Although the application of the ontological, epistemological and methodological assumptions of grounded theory is challenging, this methodology is useful to address real-life problems in nursing practice by developing theoretical explanations of nurse empowerment, or lack thereof, in the workplace. Grounded theory represents a relevant methodology to inform nursing leadership research. Grounded theory is anchored in the reality of practice. The strength of grounded theory is to provide results that can be readily applied to clinical practice and policy as they arise from problems that affect practice and that

  18. The Grounded Theory Bookshelf

    Directory of Open Access Journals (Sweden)

    Vivian B. Martin, Ph.D.

    2005-03-01

    Full Text Available Bookshelf will provide critical reviews and perspectives on books on theory and methodology of interest to grounded theory. This issue includes a review of Heaton's Reworking Qualitative Data, of special interest for some of its references to grounded theory as a secondary analysis tool; and Goulding's Grounded Theory: A practical guide for management, business, and market researchers, a book that attempts to explicate the method and presents a grounded theory study that falls a little short of the mark of a fully elaborated theory. Reworking Qualitative Data, Janet Heaton (Sage, 2004). Paperback, 176 pages, $29.95. Hardcover also available.

  19. RuleML-Based Learning Object Interoperability on the Semantic Web

    Science.gov (United States)

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  20. Parity of Θ+(1540) from QCD sum rules

    International Nuclear Information System (INIS)

    Lee, Su Houng; Kim, Hungchong; Kwon, Youngshin

    2005-01-01

    The QCD sum rule for the pentaquark Θ+, first analyzed by Sugiyama, Doi and Oka, is reanalyzed with a phenomenological side that explicitly includes the contribution from the two-particle reducible kaon-nucleon intermediate state. The magnitude for the overlap of the Θ+ interpolating current with the kaon-nucleon state is obtained by using the soft-kaon theorem and a separate sum rule for the ground state nucleon with the pentaquark nucleon interpolating current. It is found that the K-N intermediate state constitutes only 10% of the sum rule, so that the original claim that the parity of Θ+ is negative remains valid.

  1. Derivation of asymptotic |ΔI| = 1/2 rule

    International Nuclear Information System (INIS)

    Terasaki, K.; Oneda, S.

    1982-01-01

    It is argued that the origin of the observed approximate |ΔI| = 1/2 rule is the presence of an asymptotic |ΔI| = 1/2 rule which exists among certain two-body hadronic weak matrix elements, involving especially the ground-state hadrons.

  2. A systematic review of grounded theory studies in physiotherapy.

    Science.gov (United States)

    Ali, Nancy; May, Stephen; Grafton, Kate

    2018-05-23

    This systematic review aimed at appraising the methodological rigor of grounded theory research published in the field of physiotherapy to assess how the methodology is understood and applied. A secondary aim was to provide research implications drawn from the findings to guide future grounded theory methodology (GTM) research. A systematic search was conducted in MEDLINE, CINAHL, SPORT Discus, Science Direct, PubMed, Scopus, and Web of Science to identify studies in the field of physiotherapy that reported using GTM and/or methods in the study title and/or abstract. The descriptive characteristics and methodological quality of eligible studies were examined using grounded theory methodology assessment guidelines. The review included 68 studies conducted between 1998 and 2017. The findings showed that GTM is increasingly used by physiotherapy researchers. Thirty-six studies (53%) demonstrated a good understanding and appropriate application of GTM. Thirty-two studies (47%) presented descriptive findings and were considered to be of poor methodological quality. There are several key tenets of GTM that are integral to the iterative process of qualitative theorizing and need to be applied throughout all research practices including sampling, data collection, and analysis.

  3. European methodology of analysis vertical restraints under rule of reason in context of cooperative relation specific investments

    Directory of Open Access Journals (Sweden)

    Agamirova Maria, Е.

    2015-06-01

    Full Text Available The problem of underinvestment in specific assets is a key issue in new institutional economics, especially in the case of cooperative relation-specific investments. It can be solved through vertical restraints, which, as an alternative to vertical integration, transfer control to the partner who makes the relation-specific investments. The type of relation-specific investments called "cooperative" investments (or cross-investments) was nearly absent from economic analysis until the very end of the twentieth century, despite the fact that such investments are widespread. This led to the absence of relation-specific investments from official regulation documents. At the same time, different types of relation-specific investments can be characterized by different degrees of riskiness and need special regulation of vertical agreements. In this paper the author attempts to analyze the European methodology of assessing vertical restraints under the rule of reason, focusing on the type of relation-specific investments. This makes it possible to improve the analysis of vertical restraints in Russian antitrust.

  4. Ibn Hibban and The Mudallisin’s Narrations in his Book al-Sahih: Rule, Motive and Methodology

    Directory of Open Access Journals (Sweden)

    Muhammad Rozaimi Ramle

    2016-06-01

    Full Text Available Tadlis, i.e. concealing narrators, is an illah, i.e. a concealed flaw that prevents a Hadith from being classified as authentic in the science of Hadith. A Hadith which has tadlis in its sanad (chain of narration) will be classified as dhaif (weak). Ibn Hibban was one of the muhaddithin who compiled authentic Hadiths in his book al-Taqasim wa al-Anwa'. In this book, he put conditions on accepting the authenticity of Hadiths narrated by a mudallis. The study shows that Ibn Hibban not only stipulated rules for the narrations of the mudallisin, but also had his own motive in placing them in his book and his own methodology for dealing with them. Hence, this research is intended to explain this matter. Analytical and critical methods are utilized for the purpose of this study. The study also focuses on the definition of tadlis according to Ibn Hibban and its comparison with that of other scholars of Hadith.

  5. Detection of Changes in Ground-Level Ozone Concentrations via Entropy

    Directory of Open Access Journals (Sweden)

    Yuehua Wu

    2015-04-01

    Full Text Available Ground-level ozone concentration is a key indicator of air quality. There may exist sudden changes in ozone concentration data over a long time horizon, which may be caused by the implementation of government regulations and policies, such as establishing exhaust emission limits for on-road vehicles. To monitor and assess the efficacy of these policies, we propose a methodology for detecting changes in ground-level ozone concentrations, which consists of three major steps: data transformation, simultaneous autoregressive modelling and change-point detection on the estimated entropy. To show the effectiveness of the proposed methodology, the methodology is applied to detect changes in ground-level ozone concentration data collected in the Toronto region of Canada between June and September for the years from 1988 to 2009. The proposed methodology is also applicable to other climate data.
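
    The change-point step can be sketched on synthetic data: estimate an entropy series over blocks of the ozone record and flag the largest shift (the real method uses simultaneous autoregressive modelling and a formal change-point test rather than this max-shift heuristic):

        # Sketch: blockwise Shannon entropy of a synthetic series, then flag
        # the block boundary with the largest entropy shift.
        import math
        import random

        def shannon_entropy(values, bins=8):
            lo, hi = min(values), max(values)
            counts = [0] * bins
            for v in values:
                idx = min(int((v - lo) / (hi - lo + 1e-12) * bins), bins - 1)
                counts[idx] += 1
            probs = [c / len(values) for c in counts if c]
            return -sum(p * math.log(p) for p in probs)

        random.seed(1)
        # Ten "years" of low-variance data followed by ten of high variance.
        years = [[random.gauss(30, 5) for _ in range(120)] for _ in range(10)] \
              + [[random.gauss(30, 12) for _ in range(120)] for _ in range(10)]
        entropies = [shannon_entropy(y) for y in years]
        shifts = [abs(entropies[i + 1] - entropies[i])
                  for i in range(len(entropies) - 1)]
        print("candidate change point after block", shifts.index(max(shifts)))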

  6. Trade rules and exchange rate misalignments: in search for a WTO solution

    Directory of Open Access Journals (Sweden)

    Vera Thorstensen

    2014-09-01

    Full Text Available The debate on the link between trade rules and rules on exchange rates is raising the attention of experts on international trade law and economics. The main purpose of this paper is to analyze the impacts of exchange rate misalignments on tariffs as applied by the WTO - World Trade Organization. It is divided into five sections: the first one explains the methodology used to determine exchange rate misalignments and also presents its results for Brazil, U.S. and China; the second summarizes the methodology applied to calculate the impacts of exchange rate misalignments on the level of tariff protection through an exercise of "misalignment tariffication"; the third examines the effects of exchange rate variations on tariffs and their consequences for the multilateral trading system; the fourth one creates a methodology to estimate exchange rates against a currency of the World and a proposal to deal with persistent and significant misalignments related to trade rules. The conclusions are presented in the last section.
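
    The arithmetic intuition behind "misalignment tariffication" (illustrative only; the paper's own formulas are not reproduced here) is that a misalignment m rescales the border price, so it compounds with an applied tariff t:

        # Sketch: effective tariff once an exchange-rate misalignment m
        # compounds with an applied tariff t (both as fractions).
        def effective_tariff(applied_tariff, misalignment):
            return (1 + applied_tariff) * (1 + misalignment) - 1

        # A 10% applied tariff eroded by a 20% undervaluation of the
        # exporter's currency (hypothetical numbers):
        print(f"effective tariff: {effective_tariff(0.10, -0.20):.1%}")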

  7. High Throughput Determination of Plant Height, Ground Cover, and Above-Ground Biomass in Wheat with LiDAR.

    Science.gov (United States)

    Jimenez-Berni, Jose A; Deery, David M; Rozas-Larraondo, Pablo; Condon, Anthony Tony G; Rebetzke, Greg J; James, Richard A; Bovill, William D; Furbank, Robert T; Sirault, Xavier R R

    2018-01-01

    Crop improvement efforts are targeting increased above-ground biomass and radiation-use efficiency as drivers for greater yield. Early ground cover and canopy height contribute to biomass production, but manual measurements of these traits, and in particular above-ground biomass, are slow and labor-intensive, more so when made at multiple developmental stages. These constraints limit the ability to capture these data in a temporal fashion, hampering insights that could be gained from multi-dimensional data. Here we demonstrate the capacity of Light Detection and Ranging (LiDAR), mounted on a lightweight, mobile, ground-based platform, for rapid multi-temporal and non-destructive estimation of canopy height, ground cover and above-ground biomass. Field validation of LiDAR measurements is presented. For canopy height, strong relationships with LiDAR (r² of 0.99 and root mean square error of 0.017 m) were obtained. Ground cover was estimated from LiDAR using two methodologies: red reflectance image and canopy height. In contrast to NDVI, LiDAR was not affected by saturation at high ground cover, and the comparison of both LiDAR methodologies showed strong association (r² = 0.92 and slope = 1.02) at ground cover above 0.8. For above-ground biomass, a dedicated field experiment was performed with destructive biomass sampled eight times across different developmental stages. Two methodologies are presented for the estimation of biomass from LiDAR: 3D voxel index (3DVI) and 3D profile index (3DPI). The parameters involved in the calculation of 3DVI and 3DPI were optimized for each sample event from tillering to maturity, as well as generalized for any developmental stage. Individual sample point predictions were strong, while predictions across all eight sample events provided the strongest association with biomass (r² = 0.93 and r² = 0.92) for 3DPI and 3DVI, respectively. Given these results, we believe that application of this system will provide new
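
    The exact 3DVI and 3DPI definitions are given in the paper; the sketch below shows only the generic voxelization step they build on, using an invented point cloud, with the fraction of occupied voxels serving as a density index that could be regressed against destructive biomass samples:

        # Sketch: voxelize a point cloud and compute an occupancy index.
        import random

        random.seed(0)
        # Hypothetical LiDAR returns (x, y, z) over a 1 m x 1 m quadrat,
        # canopy height up to 0.9 m.
        points = [(random.random(), random.random(), random.uniform(0, 0.9))
                  for _ in range(5000)]

        VOXEL = 0.05  # 5 cm voxel edge
        occupied = {(int(x / VOXEL), int(y / VOXEL), int(z / VOXEL))
                    for x, y, z in points}
        total = 20 * 20 * 18  # 1.0 m x 1.0 m x 0.9 m grid at 5 cm resolution
        print(f"voxel occupancy index: {len(occupied) / total:.3f}")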

  8. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  9. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  10. Ship Collision and Grounding Analysis

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    2010-01-01

    It is the purpose of the paper to present a review of prediction and analysis tools for collision and grounding analyses and to outline a probabilistic procedure whereby these tools can be used by the maritime industry to develop performance based rules to reduce the risk associated with human, e...

  11. A discussion of differences in preparation, performance and postreflections in participant observations within two grounded theory approaches.

    Science.gov (United States)

    Berthelsen, Connie Bøttcher; Lindhardt, Tove; Frederiksen, Kirsten

    2017-06-01

    This paper presents a discussion of the differences in using participant observation as a data collection method by comparing the classic grounded theory methodology of Barney Glaser with the constructivist grounded theory methodology by Kathy Charmaz. Participant observations allow nursing researchers to experience activities and interactions directly in situ. However, using participant observations as a data collection method can be done in many ways, depending on the chosen grounded theory methodology, and may produce different results. This discussion shows that the differences between using participant observations in classic and constructivist grounded theory can be considerable and that grounded theory researchers should adhere to the method descriptions of performing participant observations according to the selected grounded theory methodology to enhance the quality of research. © 2016 Nordic College of Caring Science.

  12. Linking Symbolic Interactionism and Grounded Theory Methods in a Research Design

    Directory of Open Access Journals (Sweden)

    Jennifer Chamberlain-Salaun

    2013-09-01

    Full Text Available This article focuses on Corbin and Strauss’ evolved version of grounded theory. In the third edition of their seminal text, Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory, the authors present 16 assumptions that underpin their conception of grounded theory methodology. The assumptions stem from a symbolic interactionism perspective of social life, including the themes of meaning, action and interaction, self and perspectives. As research design incorporates both methodology and methods, the authors aim to expose the linkages between the 16 assumptions and essential grounded theory methods, highlighting the application of the latter in light of the former. Analyzing the links between symbolic interactionism and essential grounded theory methods provides novice researchers and researchers new to grounded theory with a foundation from which to design an evolved grounded theory research study.

  13. Self-Tuning Threshold Method for Real-Time Gait Phase Detection Based on Ground Contact Forces Using FSRs

    Directory of Open Access Journals (Sweden)

    Jing Tang

    2018-02-01

    Full Text Available This paper presents a novel methodology for detecting the gait phase of human walking on level ground. The previous threshold method (TM) sets a threshold to divide the ground contact forces (GCFs) into on-ground and off-ground states. However, previous methods for gait phase detection demonstrate no adaptability to different people and different walking speeds. Therefore, this paper presents a self-tuning triple threshold algorithm (STTTA) that calculates adjustable thresholds to adapt to human walking. Two force sensitive resistors (FSRs) were placed on the ball and heel to measure GCFs. Three thresholds (i.e., high-threshold, middle-threshold and low-threshold) were used to search out the maximum and minimum GCFs for the self-adjustments of thresholds. The high-threshold was the main threshold used to divide the GCFs into on-ground and off-ground statuses. Then, the gait phases were obtained through the gait phase detection algorithm (GPDA), which provides the rules that determine calculations for STTTA. Finally, the STTTA reliability is determined by comparing the results between STTTA and the Mariani method referenced as the timing analysis module (TAM) and the Lopez–Meyer methods. Experimental results show that the proposed method can be used to detect gait phases in real time and obtain high reliability when compared with the previous methods in the literature. In addition, the proposed method exhibits strong adaptability to different wearers walking at different walking speeds.
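
    A minimal sketch of the self-tuning idea (the threshold fraction and synthetic GCF traces are assumptions, not the paper's tuned values): thresholds are re-derived from the running extremes of each FSR channel, and the heel/ball on-ground pattern is mapped to a gait phase:

        # Sketch: adaptive on-ground thresholds per FSR, then a phase lookup.
        def on_ground(gcf, lo, hi, frac=0.3):
            """Adaptive high-threshold test for one FSR channel."""
            return gcf > lo + frac * (hi - lo)

        def gait_phase(heel_on, ball_on):
            return {(True, True): "mid-stance",
                    (True, False): "heel-strike",
                    (False, True): "push-off",
                    (False, False): "swing"}[(heel_on, ball_on)]

        heel = [5, 40, 80, 85, 60, 10, 4, 3, 6, 45]   # synthetic GCFs
        ball = [3, 5, 20, 70, 90, 85, 30, 4, 2, 6]
        h_lo, h_hi = min(heel), max(heel)              # self-tuning bounds
        b_lo, b_hi = min(ball), max(ball)
        for h, b in zip(heel, ball):
            print(gait_phase(on_ground(h, h_lo, h_hi), on_ground(b, b_lo, b_hi)))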

  14. A usage-centered evaluation methodology for unmanned ground vehicles

    NARCIS (Netherlands)

    Diggelen, J. van; Looije, R.; Mioch, T.; Neerincx, M.A.; Smets, N.J.J.M.

    2012-01-01

    This paper presents a usage-centered evaluation method to assess the capabilities of a particular Unmanned Ground Vehicle (UGV) for establishing the operational goals. The method includes a test battery consisting of basic tasks (e.g., slalom, funnel driving, object detection). Tests can be of

  15. New French basic safety rule on seismic input ground motions

    International Nuclear Information System (INIS)

    Forner, Sophie; Boulaigue, Yves

    2002-01-01

    French regulatory practice requires that the main safety functions of a land-based major nuclear facility, in particular in accordance with its specific characteristics, safe shutdown, cooling and containment of radioactive substances, be assured during and/or after earthquake events that can plausibly occur at the site where the installation is located. This rule specifies an acceptable method for determining the seismic motion to be taken into account when designing a facility to address the seismic risk. In regions where deformation factors are low, such as in metropolitan France, the intervals between strong earthquakes are long and it can be difficult to associate some earthquakes with known faults. In addition, despite substantial progress in recent years, it is difficult, given the French seismotectonic situation, to identify potentially seismogenic faults and determine the characteristics of the earthquakes that are liable to occur. Therefore, the approach proposed in this Basic Safety Rule is intended to avoid this difficulty by allowing for all direct and indirect influences that can play a role in the occurrence of earthquakes, as well as all seismic knowledge. Furthermore, as concerns calculation of seismic motion, the low number of records of strong motion in metropolitan France makes it necessary to use data from other regions of the world

  16. Resisting Coherence: Trans Men's Experiences and the Use of Grounded Theory Methods

    Science.gov (United States)

    Catalano, D. Chase J.

    2017-01-01

    In this methodological reflective manuscript, I explore my decision to use a grounded theoretical approach to my dissertation study on trans* men in higher education. Specifically, I question whether grounded theory as a methodology is capable of capturing the complexity and capaciousness of trans*-masculine experiences. Through the lenses of…

  17. Review and Application of Ship Collision and Grounding Analysis Procedures

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    2010-01-01

    It is the purpose of the paper to present a review of prediction and analysis tools for collision and grounding analyses and to outline a probabilistic procedure for which these tools can be used by the maritime industry to develop performance based rules to reduce the risk associated with human, environmental and economic costs of collision and grounding events. The main goal of collision and grounding research should be to identify the most economic risk control options associated with prevention and mitigation of collision and grounding events.

  18. Evaluating data worth for ground-water management under uncertainty

    Science.gov (United States)

    Wagner, B.J.

    1999-01-01

    A decision framework is presented for assessing the value of ground-water sampling within the context of ground-water management under uncertainty. The framework couples two optimization models - a chance-constrained ground-water management model and an integer-programming sampling network design model - to identify optimal pumping and sampling strategies. The methodology consists of four steps: (1) The optimal ground-water management strategy for the present level of model uncertainty is determined using the chance-constrained management model; (2) for a specified data collection budget, the monitoring network design model identifies, prior to data collection, the sampling strategy that will minimize model uncertainty; (3) the optimal ground-water management strategy is recalculated on the basis of the projected model uncertainty after sampling; and (4) the worth of the monitoring strategy is assessed by comparing the value of the sample information - i.e., the projected reduction in management costs - with the cost of data collection. Steps 2-4 are repeated for a series of data collection budgets, producing a suite of management/monitoring alternatives, from which the best alternative can be selected. A hypothetical example demonstrates the methodology's ability to identify the ground-water sampling strategy with greatest net economic benefit for ground-water management.
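
    The four-step loop can be shown schematically; the two stub functions below stand in for the chance-constrained management model and the integer-programming network design model, which the paper implements in full:

        # Schematic of the four-step data-worth loop over sampling budgets.
        def management_cost(uncertainty):
            return 100.0 + 400.0 * uncertainty          # stub model, steps 1/3

        def projected_uncertainty(uncertainty, budget):
            return uncertainty / (1.0 + 0.02 * budget)  # stub design model, step 2

        def data_worth(uncertainty, budget):
            before = management_cost(uncertainty)                            # step 1
            after = management_cost(projected_uncertainty(uncertainty, budget))  # steps 2-3
            return (before - after) - budget                                 # step 4

        for budget in (0, 25, 50, 100, 200):
            print(f"budget {budget:>3}: net benefit {data_worth(0.5, budget):7.1f}")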

  19. Application of American and French rules for the next belgian PWR

    International Nuclear Information System (INIS)

    Roch, M.; Cavaco, A.

    1987-01-01

    The licensing practice in Belgium is evolving from previous compliance with the USNRC rules (as applied to the last four Belgian PWRs) to a more sophisticated approach applied to the next Belgian PWR (N8), which incorporates mixed compliance with the USNRC or with French rules, depending on the equipment, structure or system considered. In this paper, we present the approach concerning the licensing rules applicable to N8. The following aspects are covered: rules applicable to the NSSS; rules applicable to the BOP (design codes for systems and structures); rules applicable to the equipment (construction codes for mechanical and electrical components); and the impact on the layout of the plant. Some examples of the application of this methodology are given. (author)

  20. [A commentary on the Ruling of the Tribunal Constitucional 212/1996 of 19 December 1996 (I)].

    Science.gov (United States)

    González Morán, L

    1998-01-01

    This article is a commentary on Spain's Constitutional Court's ruling of 19 December 1996 (STC 212/1996), on the challenge (596/89) on grounds of alleged unconstitutionality made against Law 42/1988, 28 December, which regulates the donation of human embryos and foetuses or the cells, tissues and organs therefrom. The article is structured as follows: it opens with a summary of Law 42/1988, since this is felt necessary to understand the subsequent challenge made on grounds of alleged unconstitutionality. We then provide specific details of the challenge and the resulting ruling, before concluding with some critical remarks on the aforementioned Law and ruling.

  1. Linking the Intercultural and Grounded Theory: Methodological Issues in Migration Research

    Directory of Open Access Journals (Sweden)

    Vera Sheridan

    2009-01-01

    Connecting intercultural research with Grounded Theory was advocated early in the history of intercultural theorising and includes the development of researchers' intercultural competencies. Such competency comes to the fore where intercultural theory places an equal emphasis on home and host cultures in migration research. In this context we have found a Grounded Theory approach particularly suitable for disentangling the complex interlinkings within migration experiences and their individual outcomes. Grounded Theory allows for the exploration of various theories in different fields and the emergence of new or deeper interpretations of intercultural experiences, including where research has not engaged deeply with, or has avoided, intercultural contexts. The use of software based on Grounded Theory provides a resource for systematically exploring the inter-related nature of data. In addition, engaging in intercultural research raises questions about our practice as social science researchers: adherence to ethics guidelines, for instance, can conflict with the relations we build with members of communities whose cultural values, for instance around friendship or trust, affect the norms of both our own and institutional expectations. This leads to reflection on our relationship with research participants in terms of our own intercultural experiences and position. URN: urn:nbn:de:0114-fqs0901363

  2. 618-11 Burial Ground USRADS radiological surveys

    International Nuclear Information System (INIS)

    Wendling, M.A.

    1994-01-01

    This report summarizes and documents the results of the radiological surveys conducted from February 4 through February 10, 1993 over the 618-11 Burial Ground, Hanford Site, Richland, Washington. In addition, this report explains the survey methodology using the Ultrasonic Ranging and Data System (USRADS). The 618-11 Burial Ground radiological survey field task consisted of two activities: characterization of the specific background conditions and the radiological survey of the area. The radiological survey of the 618-11 Burial Ground, along with the background study, were conducted by the Site Investigative Surveys Environmental Restoration Health Physics Organization of the Westinghouse Hanford Company. The survey methodology was based on utilization of the Ultrasonic Ranging and Data System (USRADS) for automated recording of the gross gamma radiation levels at or near six (6) inches and at three (3) feet from the surface soil.

  3. Constructing New Theory for Identifying Students with Emotional Disturbance: A Constructivist Approach to Grounded Theory

    OpenAIRE

    Dori Barnett

    2012-01-01

    A grounded theory study that examined how practitioners in a county alternative and correctional education setting identify youth with emotional and behavioral difficulties for special education services provides an exemplar for a constructivist approach to grounded theory methodology. Discussion focuses on how a constructivist orientation to grounded theory methodology informed research decisions, shaped the development of the emergent grounded theory, and prompted a way of thinking about data collection and analysis.

  4. A new methodology for the study of FAC phenomenon based on a fuzzy rule system

    International Nuclear Information System (INIS)

    Ferreira Guimaraes, Antonio Cesar

    2003-01-01

    This work consists of the representation of the corrosion problem FAC ('Flow-Accelerated Corrosion') in components, structures and passive systems of an aging nuclear power plant through a fuzzy rule system, in substitution for conventional modeling and experimental analyses. Using data characteristic of the nature of the problem to be analyzed, a reduced number of rules can be established to represent the actual problem. The results can be visualized in a very satisfactory way, providing the engineer with the knowledge needed to work in the rule solution space and draw the necessary inferences.
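
    To illustrate the general idea of representing such a problem with a small rule base, here is a minimal fuzzy-rule sketch in Python. The membership functions, rule set and wear-rate consequents are invented for illustration and are not the author's FAC model.

```python
# A minimal fuzzy-rule sketch (not the author's rule base) relating
# flow velocity and temperature to an FAC wear rate. All values are
# illustrative placeholders.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fac_wear_rate(velocity, temperature):
    # Rule activation strengths (min as fuzzy AND)
    r1 = min(tri(velocity, 2, 4, 6), tri(temperature, 120, 150, 180))   # moderate wear
    r2 = min(tri(velocity, 4, 8, 12), tri(temperature, 150, 180, 210))  # high wear
    r3 = tri(velocity, 0, 1, 3)                                         # low wear
    # Defuzzify: weighted average of crisp consequents (mm/year, invented)
    num = r1 * 0.10 + r2 * 0.30 + r3 * 0.02
    den = r1 + r2 + r3
    return num / den if den else 0.0

print(fac_wear_rate(5.0, 170.0))   # estimated wear rate for one condition
```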

  5. Transition sum rules in the shell model

    Science.gov (United States)

    Lu, Yi; Johnson, Calvin W.

    2018-03-01

    Sum rules are an important characterization of electromagnetic and weak transitions in atomic nuclei. We focus on the non-energy-weighted sum rule (NEWSR), or total strength, and the energy-weighted sum rule (EWSR); the ratio of the EWSR to the NEWSR is the centroid or average energy of transition strengths from a nuclear initial state to all allowed final states. These sum rules can be expressed as expectation values of operators, which in the case of the EWSR is a double commutator. While most prior applications of the double commutator have been to special cases, we derive general formulas for matrix elements of both operators in a shell model framework (occupation space), given the input matrix elements for the nuclear Hamiltonian and for the transition operator. With these new formulas, we easily evaluate centroids of transition strength functions, with no need to calculate daughter states. We apply this simple tool to a number of nuclides and demonstrate the sum rules follow smooth secular behavior as a function of initial energy, as well as compare the electric dipole (E1) sum rule against the famous Thomas-Reiche-Kuhn version. We also find surprising systematic behaviors for ground-state electric quadrupole (E2) centroids in the sd shell.
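
    For reference, the quantities named above can be written compactly as follows (a sketch in the abstract's spirit; the double-commutator form of the EWSR assumes a Hermitian transition operator):

```latex
% NEWSR (total strength), EWSR, and their ratio, the centroid energy.
\begin{align}
  S_0(i) &= \sum_f \bigl|\langle f | \hat{O} | i \rangle\bigr|^2
          = \langle i | \hat{O}^{\dagger}\hat{O} | i \rangle , \\
  S_1(i) &= \sum_f (E_f - E_i)\,\bigl|\langle f | \hat{O} | i \rangle\bigr|^2
          = \tfrac{1}{2}\,\langle i | [\hat{O}^{\dagger},[\hat{H},\hat{O}]] | i \rangle , \\
  \bar{E}(i) &= \frac{S_1(i)}{S_0(i)} \quad \text{(centroid of the transition strength)} .
\end{align}
```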

  6. Jahn-Teller effect versus Hund's rule coupling in C60N-

    Science.gov (United States)

    Wehrli, S.; Sigrist, M.

    2007-09-01

    We propose variational states for the ground state and the low-energy collective rotator excitations in negatively charged C60N- ions (N=1,…,5). The approach includes the linear electron-phonon coupling and the Coulomb interaction on the same level. The electron-phonon coupling is treated within the effective mode approximation, which yields the linear t1u⊗Hg Jahn-Teller problem, whereas the Coulomb interaction gives rise to Hund's rule coupling for N=2,3,4. The Hamiltonian has accidental SO(3) symmetry, which allows an elegant formulation in terms of angular momenta. Trial states are constructed from coherent states and using projection operators onto angular momentum subspaces, which results in good variational states for the complete parameter range. The evaluation of the corresponding energies is to a large extent analytical. We use the approach for a detailed analysis of the competition between the Jahn-Teller effect and Hund's rule coupling, which determines the spin state for N=2,3,4. We calculate the low-spin-high-spin gap for N=2,3,4 as a function of the Hund's rule coupling constant J. We find that the experimentally measured gaps suggest a coupling constant in the range J = 60-80 meV. Using a finite value for J, we recalculate the ground state energies of the C60N- ions and find that the Jahn-Teller energy gain is partly counterbalanced by the Hund's rule coupling. In particular, the ground state energies for N=2,3,4 are almost equal.

  7. Validation and Comparison of One-Dimensional Ground Motion Methodologies

    International Nuclear Information System (INIS)

    B. Darragh; W. Silva; N. Gregor

    2006-01-01

    Both point- and finite-source stochastic one-dimensional ground motion models, coupled to vertically propagating equivalent-linear shear-wave site response models, are validated using an extensive set of strong motion data as part of the Yucca Mountain Project. The validation and comparison exercises are presented entirely in terms of 5% damped pseudo absolute response spectra. The study consists of a quantitative analysis involving the modeling of nineteen well-recorded earthquakes, M 5.6 to 7.4, at over 600 sites. The sites range in distance from about 1 to about 200 km in the western US (460 km for the central-eastern US). In general, this validation demonstrates that the stochastic point- and finite-source models produce accurate predictions of strong ground motions over the range of 0 to 100 km and for magnitudes M 5.0 to 7.4. The stochastic finite-source model appears to be broadband, producing near-zero bias from about 0.3 Hz (the low-frequency limit of the analyses) to the high-frequency limit of the data (100 and 25 Hz for response and Fourier amplitude spectra, respectively)

  8. 77 FR 11591 - Certain Ground Fault Circuit Interrupters and Products Containing Same, Investigations...

    Science.gov (United States)

    2012-02-27

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-739] Certain Ground Fault Circuit Interrupters and Products Containing Same, Investigations: Terminations, Modifications and Rulings AGENCY: U.S... importation, and the sale within the United States after importation of certain ground fault circuit...

  9. Integrated layout based Monte-Carlo simulation for design arc optimization

    Science.gov (United States)

    Shao, Dongbing; Clevenger, Larry; Zhuang, Lei; Liebmann, Lars; Wong, Robert; Culp, James

    2016-03-01

    Design rules are created by considering a wafer fail mechanism together with the relevant design levels under various design cases, and the values are set to cover the worst-case scenario. Because of this simplification and generalization, a design rule hinders, rather than helps, dense device scaling. As an example, SRAM designs always need extensive ground rule waivers. Furthermore, dense design also often involves a "design arc", a collection of design rules whose sum equals the critical pitch defined by the technology. In a design arc, a single rule change can lead to a chain reaction of other rule violations. In this talk we present a methodology using Layout Based Monte-Carlo Simulation (LBMCS) with integrated multiple ground rule checks. We apply this methodology to the SRAM word line contact, and the result is a layout that has balanced wafer fail risks based on Process Assumptions (PAs). This work was performed at the IBM Microelectronics Div., Semiconductor Research and Development Center, Hopewell Junction, NY 12533
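
    A conceptual sketch of the layout-based Monte-Carlo idea is given below. The rule names, nominal values, process-assumption sigmas and pass/fail criterion are hypothetical; a production flow would evaluate real layout geometry against the full rule deck.

```python
# Conceptual Monte-Carlo ground-rule check over a toy "design arc":
# sample each dimension around its drawn value and count trials where
# any rule is violated or the arc no longer fits the critical pitch.

import random

# Design-arc rules (nm) whose sum must fit within the critical pitch.
RULES = {"cont_width": 20.0, "cont_to_gate": 12.0, "gate_width": 16.0}
SIGMA = {"cont_width": 1.5, "cont_to_gate": 1.2, "gate_width": 1.0}
CRITICAL_PITCH = 50.0

def fail_probability(trials=100_000):
    fails = 0
    for _ in range(trials):
        # Sample each dimension around its drawn value (process assumptions).
        sampled = {k: random.gauss(v, SIGMA[k]) for k, v in RULES.items()}
        # Integrated check: a dimension collapses, or the arc overflows the pitch.
        if any(v <= 0 for v in sampled.values()) or sum(sampled.values()) > CRITICAL_PITCH:
            fails += 1
    return fails / trials

print(f"estimated wafer-fail risk: {fail_probability():.4f}")
```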

  10. Eliciting Perceptual Ground Truth for Image Segmentation

    OpenAIRE

    Hodge, Victoria Jane; Eakins, John; Austin, Jim

    2006-01-01

    In this paper, we investigate human visual perception and establish a body of ground truth data elicited from human visual studies. We aim to build on the formative work of Ren, Eakins and Briggs who produced an initial ground truth database. Human subjects were asked to draw and rank their perceptions of the parts of a series of figurative images. These rankings were then used to score the perceptions, identify the preferred human breakdowns and thus allow us to induce perceptual rules for h...

  11. Study on seismic reliability for foundation grounds and surrounding slopes of nuclear power plants. Proposal of evaluation methodology and integration of seismic reliability evaluation system

    International Nuclear Information System (INIS)

    Ohtori, Yasuki; Kanatani, Mamoru

    2006-01-01

    This paper proposes an evaluation methodology for the annual probability of failure of soil structures subjected to earthquakes and integrates an analysis system for the seismic reliability of soil structures. The method is based on margin analysis, which evaluates the ground motion level at which a structure is damaged. First, a ground motion index that is strongly correlated with the damage or response of the specific structure is selected. The ultimate strength in terms of the selected ground motion index is then evaluated. Next, the variation of soil properties is taken into account in the evaluation of the seismic stability of structures. The variation of the safety factor (SF) is evaluated and then converted into a variation of the specific ground motion index. Finally, the fragility curve is developed and the annual probability of failure is evaluated in combination with the seismic hazard curve. The system facilitates the assessment of seismic reliability: a generator of random numbers, a dynamic analysis program and a stability analysis program are incorporated into one package. Once a structural model, the distribution of the soil properties, the input ground motions and so forth are defined, a list of safety factors for each sliding line is obtained. Monte Carlo Simulation (MCS), Latin Hypercube Sampling (LHS), the point estimation method (PEM) and the first-order second-moment method (FOSM) implemented in this system are also introduced. As numerical examples, a ground foundation and a surrounding slope are assessed using the proposed method and the integrated system. (author)
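
    The final step of the methodology, combining a fragility curve with a seismic hazard curve to obtain an annual probability of failure, can be sketched as follows. The lognormal fragility and power-law hazard parameters are illustrative assumptions, not values from the study.

```python
# Minimal sketch: annual probability of failure as the convolution of a
# lognormal fragility curve with a seismic hazard curve.

import math

def fragility(pga, median=0.6, beta=0.4):
    """P(failure | PGA), as a lognormal CDF (illustrative parameters)."""
    return 0.5 * (1.0 + math.erf(math.log(pga / median) / (beta * math.sqrt(2.0))))

def hazard(pga, k0=1e-4, k=2.0):
    """Annual exceedance frequency of PGA (simple power-law hazard)."""
    return k0 * pga ** (-k)

def annual_failure_probability(a_min=0.05, a_max=3.0, n=300):
    # P_f = sum over intervals of P(fail | a) * annual occurrence of a.
    da = (a_max - a_min) / n
    total = 0.0
    for i in range(n):
        a0, a1 = a_min + i * da, a_min + (i + 1) * da
        mid = 0.5 * (a0 + a1)
        total += fragility(mid) * (hazard(a0) - hazard(a1))
    return total

print(f"annual P_f ~ {annual_failure_probability():.2e}")
```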

  12. Measurement of ground motion in various sites

    International Nuclear Information System (INIS)

    Bialowons, W.; Amirikas, R.; Bertolini, A.; Kruecker, D.

    2007-04-01

    Ground vibrations may affect low-emittance beam transport in linear colliders, Free Electron Lasers (FEL) and synchrotron radiation facilities. This paper is an overview of a study program to measure ground vibrations at various sites, which can be used for site characterization in relation to accelerator design. Commercial broadband seismometers have been used to measure ground vibrations, and the resultant database is available to the scientific community. The methodology employed is to use the same equipment and data analysis tools for ease of comparison. This database of ground vibrations, taken at 19 sites around the world, is the first of its kind. (orig.)

  13. The use of grounded theory in studies of nurses and midwives' coping processes: a systematic literature search.

    Science.gov (United States)

    Cheer, Karen; MacLaren, David; Tsey, Komla

    2015-01-01

    Researchers are increasingly using grounded theory methodologies to study the professional experience of nurses and midwives. To review common grounded theory characteristics and research design quality as described in grounded theory studies of coping strategies used by nurses and midwives. A systematic database search for 2005-2015 identified and assessed grounded theory characteristics from 16 studies. Study quality was assessed using a modified Critical Appraisal Skills Programme tool. Grounded theory was considered a methodology or a set of methods, able to be used within different nursing and midwifery contexts. Specific research requirements determined the common grounded theory characteristics used in different studies. Most researchers did not clarify their epistemological and theoretical perspectives. To improve research design and trustworthiness of grounded theory studies in nursing and midwifery, researchers need to state their theoretical stance and clearly articulate their use of grounded theory methodology and characteristics in research reporting.

  14. Engaging Non-State Security Providers: Whither the Rule of Law?

    Directory of Open Access Journals (Sweden)

    Timothy Donais

    2017-07-01

    The primacy of the rule of law has long been seen as one of the essential principles of security sector reform (SSR) programming, and part of the larger gospel of SSR is that the accountability of security providers is best guaranteed by embedding security governance within a rule of law framework. Acknowledging the reality of non-state security provision, however, presents a challenge to thinking about SSR as merely the extension of the rule of law into the security realm, in large part because whatever legitimacy non-state security providers possess tends to be grounded in 'extralegal' foundations. This paper – more conceptual than empirical in its approach – considers the implications of hybrid forms of security governance for thinking about the relationship between SSR and rule of law promotion, and argues that the rule of law still provides a useful source of strategic direction for SSR programming.

  15. Scientific or rule-of-thumb techniques of ground-water management--Which will prevail?

    Science.gov (United States)

    McGuinness, Charles Lee

    1969-01-01

    Emphasis in ground-water development, once directed largely to quantitatively minor (but sociologically vital) service of human and stock needs, is shifting: aquifers are treated as possible regulating reservoirs managed conjunctively with surface water. Too, emphasis on reducing stream pollution is stimulating interest in aquifers as possible waste-storage media. Such management of aquifers requires vast amounts of data plus a much better understanding of aquifer-system behavior than now exists. Implicit in this deficiency of knowledge is a need for much new research, lest aquifers be managed according to ineffective rule-of-thumb standards, or even abandoned as unmanageable. The geohydrologist's task is to define both internal and boundary characteristics of aquifer systems. Stratigraphy is a primary determinant of these characteristics, but stratigraphically minor features may make aquifers transcend stratigraphic boundaries. For example, a structurally insignificant fracture may carry more water than a major fault; a minor stratigraphic discontinuity may be a major hydrologic boundary. Hence, there is a need for ways of defining aquifer boundaries and quantifying aquifer and confining-bed characteristics that are very different from ordinary stratigraphic techniques. Among critical needs are techniques for measuring crossbed permeability; for extrapolating and interpolating point data on direction and magnitude of permeability in defining aquifer geometry; and for accurately measuring geochemical properties of water and aquifer material, and interpreting those measurements in terms of source of water, rate of movement, and waste-sorbing capacities of aquifers and of confining beds--in general, techniques adequate for predicting aquifer response to imposed forces whether static, hydraulic, thermal, or chemical. Only when such predictions can be made routinely can aquifer characteristics be inserted into a master model that incorporates both the hydrologic and

  16. Evolving rule-based systems in two medical domains using genetic programming.

    Science.gov (United States)

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan; Axer, Hubertus; Bjerregaard, Beth; von Keyserlingk, Diedrich Graf

    2004-11-01

    To demonstrate and compare the application of different genetic programming (GP) based intelligent methodologies for the construction of rule-based systems in two medical domains: the diagnosis of aphasia's subtypes and the classification of pap-smear examinations. Past data represented (a) successful diagnoses of aphasia's subtypes, obtained from collaborating medical experts through a free interview per patient, and (b) smears (images of cells) correctly classified by cyto-technologists, previously stained using the Papanicolaou method. Initially a hybrid approach is proposed, which combines standard genetic programming and heuristic hierarchical crisp rule-base construction. Then, genetic programming for the production of crisp rule-based systems is attempted. Finally, another hybrid intelligent model is composed of a grammar-driven genetic programming system for the generation of fuzzy rule-based systems. Results demonstrate the effectiveness of the proposed systems, which are also compared, in terms of efficiency, accuracy and comprehensibility, with an inductive machine learning approach and with a standard genetic programming symbolic expression approach. The proposed GP-based intelligent methodologies are able to produce accurate and comprehensible results for medical experts, performing competitively with other intelligent approaches. The aim of the authors was the production of accurate but also sensible decision rules that could potentially help medical doctors to extract conclusions, even at the expense of a higher classification score.

  17. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  18. Performance Based Criteria for Ship Collision and Grounding

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    2009-01-01

    The paper outlines a probabilistic procedure whereby the maritime industry can develop performance based rules to reduce the risk associated with human, environmental and economic costs of collision and grounding events and identify the most economic risk control options associated with prevention...

  19. Liquidity Traps and the Price (In)Determinacy of Monetary Rules

    OpenAIRE

    Eagle, David M

    2012-01-01

    This paper proposes a new methodology for assessing price indeterminacy to supplant the discredited nonexplosive criterion. Using this methodology, we find that nominal GDP targeting and price-level targeting do determine prices when the central bank follows a sufficiently strong feedback rule for setting the nominal interest rate. However, inflation targeting leads to price indeterminacy, a result consistent with the principles of calculus. This price indeterminacy of inflation targeting ...

  20. Examining the Nexus between Grounded Theory and Symbolic Interactionism

    OpenAIRE

    P. Jane Milliken RN, PhD; Rita Schreiber RN, DNS

    2012-01-01

    Grounded theory is inherently symbolic interactionist; however, not all grounded theory researchers appreciate its importance or benefit from its influence. Elsewhere, we have written about the intrinsic relationship between grounded theory and symbolic interactionism, highlighting the silent, fundamental contribution of symbolic interactionism to the methodology. At the same time, there are significant insights to be had by bringing a conscious awareness of the philosophy of symbolic interactionism to grounded theory research.

  1. 76 FR 21036 - Application of the Prevailing Wage Methodology in the H-2B Program

    Science.gov (United States)

    2011-04-14

    ... Department to "promulgate new rules concerning the calculation of the prevailing wage rate in the H-2B... wage methodology set forth in this Rule applies only to wages paid for work performed on or after...: Notice. SUMMARY: On January 19, 2011, the Department of Labor (Department) published a final rule, Wage...

  2. The automated ground network system

    Science.gov (United States)

    Smith, Miles T.; Militch, Peter N.

    1993-01-01

    The primary goal of the Automated Ground Network System (AGNS) project is to reduce Ground Network (GN) station life-cycle costs. To accomplish this goal, the AGNS project will employ an object-oriented approach to develop a new infrastructure that will permit continuous application of new technologies and methodologies to the Ground Network's class of problems. The AGNS project is a Total Quality (TQ) project. Through use of an open collaborative development environment, developers and users will have equal input into the end-to-end design and development process. This will permit direct user input and feedback and will enable rapid prototyping for requirements clarification. This paper describes the AGNS objectives, operations concept, and proposed design.

  3. Rigour and grounded theory.

    Science.gov (United States)

    Cooney, Adeline

    2011-01-01

    This paper explores ways to enhance and demonstrate rigour in a grounded theory study. Grounded theory is sometimes criticised for a lack of rigour. Beck (1993) identified credibility, auditability and fittingness as the main standards of rigour for qualitative research methods. These criteria were evaluated for applicability to a Straussian grounded theory study and expanded or refocused where necessary. The author uses a Straussian grounded theory study (Cooney, In press) to examine how the revised criteria can be applied when conducting a grounded theory study. Strauss and Corbin's (1998b) criteria for judging the adequacy of a grounded theory were examined in the context of the wider literature examining rigour in qualitative research studies in general and grounded theory studies in particular. A literature search for 'rigour' and 'grounded theory' was carried out to support this analysis. Criteria are suggested for enhancing and demonstrating the rigour of a Straussian grounded theory study. These include: cross-checking emerging concepts against participants' meanings, asking experts if the theory 'fits' their experiences, and recording detailed memos outlining all analytical and sampling decisions. IMPLICATIONS FOR RESEARCH PRACTICE: The criteria identified have been expressed as questions to enable novice researchers to audit the extent to which they are demonstrating rigour when writing up their studies. However, it should not be forgotten that rigour is built into the grounded theory method through the inductive-deductive cycle of theory generation. Care in applying the grounded theory methodology correctly is the single most important factor in ensuring rigour.

  4. 48 CFR 6101.21 - Hearing procedures [Rule 21].

    Science.gov (United States)

    2010-10-01

    ...) Nature and conduct of hearings. (1) Except when necessary to maintain the confidentiality of protected... subpoena pursuant to 6101.16(h) (Rule 16(h)). (h) Issues not raised by pleadings. If evidence is objected to at a hearing on the ground that it is not within the issues raised by the pleadings, it may...

  5. Multimodal hybrid reasoning methodology for personalized wellbeing services.

    Science.gov (United States)

    Ali, Rahman; Afzal, Muhammad; Hussain, Maqbool; Ali, Maqbool; Siddiqi, Muhammad Hameed; Lee, Sungyoung; Ho Kang, Byeong

    2016-02-01

    A wellness system provides wellbeing recommendations to support experts in promoting a healthier lifestyle and inducing individuals to adopt healthy habits. Adopting physical activity effectively promotes a healthier lifestyle. A physical activity recommendation system assists users to adopt daily routines to form a best practice of life by involving themselves in healthy physical activities. Traditional physical activity recommendation systems focus on general recommendations applicable to a community of users rather than specific individuals. These recommendations are general in nature and fit the community at a certain level, but they are not relevant to every individual's specific requirements and personal interests. To cover this aspect, we propose a multimodal hybrid reasoning methodology (HRM) that generates personalized physical activity recommendations according to the user's specific needs and personal interests. The methodology integrates the rule-based reasoning (RBR), case-based reasoning (CBR), and preference-based reasoning (PBR) approaches in a linear combination that enables personalization of recommendations. RBR uses explicit knowledge rules from physical activity guidelines, CBR uses implicit knowledge from experts' past experiences, and PBR uses users' personal interests and preferences. To validate the methodology, a weight management scenario is considered and experimented with. The RBR part of the methodology generates goal, weight status, and plan recommendations, the CBR part suggests the top three relevant physical activities for executing the recommended plan, and the PBR part filters out irrelevant recommendations from the suggested ones using the user's personal preferences and interests. To evaluate the methodology, a baseline-RBR system is developed, which is improved first using ranged rules and ultimately using a hybrid-CBR. A comparison of the results of these systems shows that the hybrid-CBR outperforms the baseline systems.
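
    A schematic sketch of the RBR -> CBR -> PBR pipeline for the weight-management scenario is shown below. The guideline rule, the case base and the preference filter are invented placeholders meant only to show how the three reasoners chain together.

```python
# Toy hybrid reasoning pipeline: rule-based plan, case-based activity
# retrieval, preference-based filtering. All rules and data are invented.

def rbr_plan(user):
    """Rule-based reasoning: explicit guideline rules give goal and plan."""
    bmi = user["weight_kg"] / user["height_m"] ** 2
    if bmi >= 25:
        return {"goal": "lose_weight", "minutes_per_week": 150}
    return {"goal": "maintain_weight", "minutes_per_week": 75}

def cbr_activities(plan, case_base):
    """Case-based reasoning: top-3 activities from similar past cases."""
    matches = [c for c in case_base if c["goal"] == plan["goal"]]
    matches.sort(key=lambda c: c["success_rate"], reverse=True)
    return [c["activity"] for c in matches[:3]]

def pbr_filter(activities, preferences):
    """Preference-based reasoning: drop activities the user dislikes."""
    return [a for a in activities if a not in preferences["dislikes"]]

case_base = [
    {"goal": "lose_weight", "activity": "brisk walking", "success_rate": 0.8},
    {"goal": "lose_weight", "activity": "swimming", "success_rate": 0.7},
    {"goal": "lose_weight", "activity": "jogging", "success_rate": 0.6},
    {"goal": "maintain_weight", "activity": "yoga", "success_rate": 0.5},
]
user = {"weight_kg": 90, "height_m": 1.75}
plan = rbr_plan(user)
print(pbr_filter(cbr_activities(plan, case_base), {"dislikes": {"jogging"}}))
```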

  6. 76 FR 45667 - Wage Methodology for the Temporary Non-Agricultural Employment H-2B Program; Amendment of...

    Science.gov (United States)

    2011-08-01

    ... private wage surveys in very limited circumstances. Lastly, the Wage Rule required the new wage... methodology set forth in this Rule applies only to wages paid for work performed on or after January 1, 2012..., 2012 effective date of the Wage Rule and ordered us to announce a new effective date for the rule...

  7. Ground rubber: Sorption media for ground water containing benzene and O-xylene

    International Nuclear Information System (INIS)

    Kershaw, D.S.; Pamukcu, S.

    1997-01-01

    The purpose of the current study is to examine the ability of ground rubber to sorb benzene and O-xylene from water contaminated with aromatic hydrocarbons. The study consisted of running both batch and packed bed column tests to determine the sorption capacity, the required sorption equilibration time, and the flow-through utilization efficiency of ground rubber under various contact times when exposed to water contaminated with various amounts of benzene or O-xylene. Initial batch test results indicate that ground rubber can attain equilibrium sorption capacities up to 1.3 or 8.2 mg of benzene or O-xylene, respectively, per gram of tire rubber at solution equilibrium concentrations of 10 mg/L. Packed bed column tests indicate that ground tire rubber has on average a 40% utilization rate when a hydraulic residence time of 15 min is used. Possible future uses of ground rubber as a sorption medium could include, but are not limited to, its use as an aggregate in slurry cutoff walls that are in contact with petroleum products. Ground rubber could also be used as a sorption medium in pump-and-treat methodologies or in in-situ permeable reactive barriers

  8. Extracting Cross-Ontology Weighted Association Rules from Gene Ontology Annotations.

    Science.gov (United States)

    Agapito, Giuseppe; Milano, Marianna; Guzzi, Pietro Hiram; Cannataro, Mario

    2016-01-01

    Gene Ontology (GO) is a structured repository of concepts (GO terms) that are associated with one or more gene products through a process referred to as annotation. The analysis of annotated data is an important opportunity for bioinformatics. Among the different approaches to this analysis is the use of association rules (AR), which can discover previously unknown, biologically relevant associations between terms of GO. In a previous work, we introduced GO-WAR (Gene Ontology-based Weighted Association Rules), a methodology for extracting weighted association rules from ontology-based annotated datasets. We here adapt the GO-WAR algorithm to mine cross-ontology association rules, i.e., rules that involve GO terms present in the three sub-ontologies of GO. We conduct a deep performance evaluation of GO-WAR by mining publicly available GO annotated datasets, showing how GO-WAR outperforms current state-of-the-art approaches.
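
    In the spirit of this approach, the following sketch computes a weighted support and a confidence for a cross-ontology rule. The annotations, term identifiers and weights are invented; GO-WAR's actual algorithm and weighting scheme are described in the cited work.

```python
# Toy weighted association rule over invented GO-style annotations.
# Each gene product is annotated with terms from different sub-ontologies.

annotations = [
    {"GO:BP:0006915", "GO:MF:0004197"},
    {"GO:BP:0006915", "GO:MF:0004197", "GO:CC:0005737"},
    {"GO:BP:0006915", "GO:CC:0005634"},
    {"GO:MF:0004197"},
]
weight = {"GO:BP:0006915": 0.9, "GO:MF:0004197": 0.8}  # e.g., information content

def weighted_support(itemset):
    """Classical support scaled by the mean term weight of the itemset."""
    count = sum(1 for a in annotations if itemset <= a)
    avg_w = sum(weight.get(t, 0.5) for t in itemset) / len(itemset)
    return (count / len(annotations)) * avg_w

def confidence(antecedent, consequent):
    both = sum(1 for a in annotations if antecedent | consequent <= a)
    ante = sum(1 for a in annotations if antecedent <= a)
    return both / ante if ante else 0.0

# Cross-ontology rule: BP term -> MF term
rule = ({"GO:BP:0006915"}, {"GO:MF:0004197"})
print(weighted_support(rule[0] | rule[1]), confidence(*rule))
```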

  9. A methodology for extracting knowledge rules from artificial neural networks applied to forecast demand for electric power; Uma metodologia para extracao de regras de conhecimento a partir de redes neurais artificiais aplicadas para previsao de demanda por energia eletrica

    Energy Technology Data Exchange (ETDEWEB)

    Steinmetz, Tarcisio; Souza, Glauber; Ferreira, Sandro; Santos, Jose V. Canto dos; Valiati, Joao [Universidade do Vale do Rio dos Sinos (PIPCA/UNISINOS), Sao Leopoldo, RS (Brazil). Programa de Pos-Graduacao em Computacao Aplicada], Emails: trsteinmetz@unisinos.br, gsouza@unisinos.br, sferreira, jvcanto@unisinos.br, jfvaliati@unisinos.br

    2009-07-01

    We present a methodology for the extraction of rules from Artificial Neural Networks (ANN) trained to forecast the electric load demand. The rules are able to express the knowledge about the behavior of load demand acquired by the ANN during the training process. The rules are presented to the user in an easy-to-read format, such as IF premise THEN consequence, where the premise relates to the input data submitted to the ANN (mapped as fuzzy sets) and the consequence appears as a linear equation describing the output to be presented by the ANN should the premise part hold true. Experimentation demonstrates the method's capacity for acquiring and presenting high-quality rules from neural networks trained to forecast electric load demand for several amounts of time in the future. (author)
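
    The advertised rule format (fuzzy premise, linear consequent) can be illustrated with a toy forecaster. The membership functions and coefficients below are invented stand-ins for what would be distilled from a trained load-forecasting network.

```python
# Toy rules of the form "IF x is SET THEN y = a*x + b", combined by
# weighted averaging of the activated consequents.

def low(x, lo=0.0, hi=20.0):          # fuzzy set: yesterday's demand was LOW
    return max(0.0, min(1.0, (hi - x) / (hi - lo)))

def high(x, lo=10.0, hi=30.0):        # fuzzy set: yesterday's demand was HIGH
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

# IF x is LOW  THEN y = 0.9*x + 1.0   (invented coefficients)
# IF x is HIGH THEN y = 1.1*x + 2.5
def forecast(x):
    w_low, w_high = low(x), high(x)
    y_low, y_high = 0.9 * x + 1.0, 1.1 * x + 2.5
    return (w_low * y_low + w_high * y_high) / (w_low + w_high)

print(forecast(18.0))   # demand forecast for one input
```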

  10. Building Grounded Theory in Entrepreneurship Research

    DEFF Research Database (Denmark)

    Mäkelä, Markus; Turcan, Romeo V.

    2007-01-01

    In this chapter we describe the process of building theory from data (Glaser and Strauss 1967; Strauss and Corbin 1998). We discuss current grounded theory in relation to research in entrepreneurship and point out directions and potential improvements for further research in this field. The chapter has two goals. First, we wish to provide an explicit paradigmatic positioning of the grounded theory methodology, discussing the most relevant views of ontology and epistemology that can be used as alternative starting points for conducting grounded theory research. While the chapter introduces our approach to grounded theory, we acknowledge the existence of other approaches and try to locate our approach in relation to them. As an important part of this discussion, we take a stand on how to usefully define ‘grounded theory’ and ‘case study research’. Second, we seek to firmly link our...

  11. A discussion of differences in preparation, performance and postreflections in participant observations within two grounded theory approaches

    DEFF Research Database (Denmark)

    Bøttcher Berthelsen, Connie; Damsgaard, Tove Lindhardt; Frederiksen, Kirsten

    2017-01-01

    This paper presents a discussion of the differences in using participant observation as a data collection method, comparing the classic grounded theory methodology of Barney Glaser with the constructivist grounded theory methodology of Kathy Charmaz. Participant observations allow nursing researchers to experience activities and interactions directly in situ. However, using participant observations as a data collection method can be done in many ways, depending on the chosen grounded theory methodology, and may produce different results. This discussion shows how the differences between...

  12. Constructing New Theory for Identifying Students with Emotional Disturbance: A Constructivist Approach to Grounded Theory

    Directory of Open Access Journals (Sweden)

    Dori Barnett

    2012-06-01

    A grounded theory study that examined how practitioners in a county alternative and correctional education setting identify youth with emotional and behavioral difficulties for special education services provides an exemplar for a constructivist approach to grounded theory methodology. Discussion focuses on how a constructivist orientation to grounded theory methodology informed research decisions, shaped the development of the emergent grounded theory, and prompted a way of thinking about data collection and analysis. Implications for future research directions and policy and practice in the field of special and alternative education are discussed.

  13. Medicare Program; Inpatient Rehabilitation Facility Prospective Payment System for Federal Fiscal Year 2018. Final rule.

    Science.gov (United States)

    2017-08-03

    This final rule updates the prospective payment rates for inpatient rehabilitation facilities (IRFs) for federal fiscal year (FY) 2018 as required by the statute. As required by section 1886(j)(5) of the Social Security Act (the Act), this rule includes the classification and weighting factors for the IRF prospective payment system's (IRF PPS) case-mix groups and a description of the methodologies and data used in computing the prospective payment rates for FY 2018. This final rule also revises the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) diagnosis codes that are used to determine presumptive compliance under the "60 percent rule," removes the 25 percent payment penalty for inpatient rehabilitation facility patient assessment instrument (IRF-PAI) late transmissions, removes the voluntary swallowing status item (Item 27) from the IRF-PAI, summarizes comments regarding the criteria used to classify facilities for payment under the IRF PPS, provides for a subregulatory process for certain annual updates to the presumptive methodology diagnosis code lists, adopts the use of height/weight items on the IRF-PAI to determine patient body mass index (BMI) greater than 50 for cases of single-joint replacement under the presumptive methodology, and revises and updates measures and reporting requirements under the IRF quality reporting program (QRP).

  14. Stochastic correlative firing for figure-ground segregation.

    Science.gov (United States)

    Chen, Zhe

    2005-03-01

    Segregation of sensory inputs into separate objects is a central aspect of perception and arises in all sensory modalities. The figure-ground segregation problem requires identifying an object of interest in a complex scene, in many cases given binaural auditory or binocular visual observations. The computations required for visual and auditory figure-ground segregation share many common features and can be cast within a unified framework. Sensory perception can be viewed as a problem of optimizing information transmission. Here we suggest a stochastic correlative firing mechanism and an associative learning rule for figure-ground segregation in several classic sensory perception tasks, including the cocktail party problem in binaural hearing, binocular fusion of stereo images, and Gestalt grouping in motion perception.
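
    A toy illustration of binding through correlated firing is sketched below: two units driven by a common "figure" source fire together, and a Hebbian-style associative rule with decay strengthens their link relative to a background unit. The parameters are arbitrary and this is not the paper's model.

```python
# Toy correlative-firing demo: the weight between units driven by the
# same source grows above the weight to an uncorrelated background unit.

import random

def simulate(steps=20_000, lr=0.001, p_fig=0.4, p_noise=0.4):
    w_fig, w_bg = 0.5, 0.5
    for _ in range(steps):
        fig = random.random() < p_fig                        # common source
        a = 1.0 if (fig or random.random() < 0.05) else 0.0  # figure unit 1
        b = 1.0 if (fig or random.random() < 0.05) else 0.0  # figure unit 2
        c = 1.0 if random.random() < p_noise else 0.0        # background unit
        # Hebbian rule with decay: coincident spikes strengthen the link.
        w_fig += lr * (a * b - 0.1 * w_fig)
        w_bg += lr * (a * c - 0.1 * w_bg)
    return w_fig, w_bg

print(simulate())   # w_fig ends well above w_bg: figure units bind together
```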

  15. A Rigorous Methodology for Analyzing and Designing Plug-Ins

    DEFF Research Database (Denmark)

    Fasie, Marieta V.; Haxthausen, Anne Elisabeth; Kiniry, Joseph

    2013-01-01

    This paper addresses these problems by describing a rigorous methodology for analyzing and designing plug-ins. The methodology is grounded in the Extended Business Object Notation (EBON) and covers informal analysis and design of features, GUI, actions, and scenarios, formal architecture design, including behavioral semantics, and validation. The methodology is illustrated via a case study whose focus is an Eclipse environment for the RAISE formal method's tool suite.

  16. A Belief Rule-Based Expert System to Diagnose Influenza

    DEFF Research Database (Denmark)

    Hossain, Mohammad Shahadat; Khalid, Md. Saifuddin; Akter, Shamima

    2014-01-01

    This paper presents the development and application of an expert system to diagnose influenza under uncertainty. The recently developed generic belief rule-based inference methodology using the evidential reasoning (RIMER) approach is employed to develop this expert system, termed the Belief Rule Based Expert System (BRBES). The RIMER approach can handle different types of uncertainties, both in knowledge representation and in inference procedures. The knowledge base of this system was constructed by using records of real patient data and consultation with influenza specialists in Bangladesh. Practical case

  17. Uses of dipole oscillator strength sum rules in second order perturbation theory

    International Nuclear Information System (INIS)

    Struensee, M.C.

    1984-01-01

    Certain moments of the dipole oscillator strength distribution of atoms and molecules can be calculated from theory (using sum rules) or deduced from experiment. The present work describes the use of these moments to construct effective distributions which lead to bounds and estimates of physical properties of interest. Asymptotic analysis is then used to obtain the high energy behavior of the oscillator strength density and a previously unknown sum rule for atoms and molecules. A new type of effective distribution, which incorporates the information concerning the asymptotic behavior and the new sum rule, is suggested. This new type of distribution is used to calculate the logarithmic mean excitation energies for the ground states of atomic hydrogen, atomic helium and the negative hydrogen ion. The calculations for atomic helium and the negative hydrogen ion require the evaluation of certain ground state expectation values. These have been calculated using high accuracy wavefunctions containing the nonconventional terms shown by Fock to be necessary for a correct analytic expansion when both electrons are near the nucleus

  18. A Bayesian analysis of the nucleon QCD sum rules

    International Nuclear Information System (INIS)

    Ohtani, Keisuke; Gubler, Philipp; Oka, Makoto

    2011-01-01

    QCD sum rules of the nucleon channel are reanalyzed, using the maximum-entropy method (MEM). This new approach, based on Bayesian probability theory, does not restrict the spectral function to the usual "pole + continuum" form, allowing a more flexible investigation of the nucleon spectral function. Making use of this flexibility, we are able to investigate the spectral functions of various interpolating fields, finding that the nucleon ground state mainly couples to an operator containing a scalar diquark. Moreover, we formulate the Gaussian sum rule for the nucleon channel and find that it is more suitable for the MEM analysis to extract the nucleon pole in the region of its experimental value, while the Borel sum rule does not contain enough information to clearly separate the nucleon pole from the continuum. (orig.)

  19. Reflecting on the challenges of choosing and using a grounded theory approach.

    Science.gov (United States)

    Markey, Kathleen; Tilki, Mary; Taylor, Georgina

    2014-11-01

    To explore three different approaches to grounded theory and consider some of the possible philosophical assumptions underpinning them. Grounded theory is a comprehensive yet complex methodology that offers a procedural structure that guides the researcher. However, divergent approaches to grounded theory present dilemmas for novice researchers seeking to choose a suitable research method. This is a methodology paper. This is a reflexive paper that explores some of the challenges experienced by a PhD student when choosing and operationalising a grounded theory approach. Before embarking on a study, novice grounded theory researchers should examine their research beliefs to assist them in selecting the most suitable approach. This requires an insight into the approaches' philosophical assumptions, such as those pertaining to ontology and epistemology. Researchers need to be clear about the philosophical assumptions underpinning their studies and the effects that different approaches will have on the research results. This paper presents a personal account of the journey of a novice grounded theory researcher who chose a grounded theory approach and worked within its theoretical parameters. Novice grounded theory researchers need to understand the different philosophical assumptions that influence the various grounded theory approaches, before choosing one particular approach.

  20. Optimal short-sighted rules

    Directory of Open Access Journals (Sweden)

    Sacha eBourgeois-Gironde

    2012-09-01

    The aim of this paper is to assess the relevance of methodological transfers from behavioral ecology to experimental economics with respect to the elicitation of intertemporal preferences. More precisely, our discussion stems from an analysis of Stephens and Anderson's (2001) seminal article. In their study with blue jays, they document that foraging behavior typically implements short-sighted choice rules which are beneficial in the long run. Such long-term profitability of short-sighted behavior cannot be evidenced when using a self-control paradigm (one which contrasts, in a binary way, sooner smaller and later larger payoffs) but becomes apparent when ecological patch paradigms (replicating economic situations in which the main trade-off consists in staying on a food patch or leaving for another patch) are implemented. We transfer this methodology with a view to contrasting foraging strategies and self-control in human intertemporal choices.

  1. Review: Günter Mey & Katja Mruck (Eds. (2007. Grounded Theory Reader

    Directory of Open Access Journals (Sweden)

    Adrian Schmidtke

    2009-07-01

    The volume was published to mark the 40th anniversary of the publication of "The Discovery of Grounded Theory." The first part describes the emergence and fundamental positions of grounded theory methodology (GTM) in methodological and theoretical terms; the second part focuses on research practices. The "Grounded Theory Reader" is an excellent compilation that does not claim to be a standard textbook for newcomers to GTM. Rather, it is a reflection of the state of the art in GTM and offers insights into complex research practices. A basic understanding of GTM is recommended in order to get the most from the book. URN: urn:nbn:de:0114-fqs0903286

  2. Applications of rule-induction in the derivation of quantitative structure-activity relationships

    Science.gov (United States)

    A-Razzak, Mohammed; Glen, Robert C.

    1992-08-01

    Recently, methods have been developed in the field of Artificial Intelligence (AI), specifically in the expert systems area using rule-induction, designed to extract rules from data. We have applied these methods to the analysis of molecular series with the objective of generating rules which are predictive and reliable. The input to rule-induction consists of a number of examples with known outcomes (a training set) and the output is a tree-structured series of rules. Unlike most other analysis methods, the results of the analysis are in the form of simple statements which can be easily interpreted. These are readily applied to new data giving both a classification and a probability of correctness. Rule-induction has been applied to in-house generated and published QSAR datasets and the methodology, application and results of these analyses are discussed. The results imply that in some cases it would be advantageous to use rule-induction as a complementary technique in addition to conventional statistical and pattern-recognition methods.
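
    The flavor of tree-structured rule induction can be demonstrated with scikit-learn on an invented QSAR-style dataset; the induced tree prints as simple IF/THEN branches. This is a generic illustration, not the authors' in-house system.

```python
# Rule induction via a small decision tree on toy molecular descriptors.

from sklearn.tree import DecisionTreeClassifier, export_text

# Columns: logP, molecular weight, H-bond donors (invented values)
X = [[1.2, 210, 1], [3.4, 350, 0], [2.8, 300, 2], [0.5, 180, 3],
     [4.1, 420, 0], [3.9, 390, 1], [1.0, 250, 2], [2.2, 270, 1]]
y = [0, 1, 1, 0, 1, 1, 0, 0]   # 1 = active, 0 = inactive

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# The induced model prints as tree-structured IF/THEN rules, each giving
# a classification with an associated class probability.
print(export_text(tree, feature_names=["logP", "MW", "HBD"]))
print(tree.predict_proba([[3.0, 320, 1]]))   # classify a new compound
```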

  3. Evaluation and Selection of Best Priority Sequencing Rule in Job Shop Scheduling using Hybrid MCDM Technique

    Science.gov (United States)

    Kiran Kumar, Kalla; Nagaraju, Dega; Gayathri, S.; Narayanan, S.

    2017-05-01

    Priority sequencing rules provide guidance on the order in which jobs are to be processed at a workstation. The application of different priority rules in job shop scheduling gives different orders of scheduling, and considerable experimentation needs to be conducted before a final choice of the best priority sequencing rule can be made. Hence, a comprehensive method of selecting the right rule is essential from a managerial decision-making perspective. This paper considers seven different priority sequencing rules in job shop scheduling. For the evaluation and selection of the best priority sequencing rule, a set of eight criteria is considered. The aim of this work is to demonstrate a methodology for evaluating and selecting the best priority sequencing rule by using a hybrid multi-criteria decision making (MCDM) technique, i.e., the analytical hierarchy process (AHP) with the technique for order preference by similarity to ideal solution (TOPSIS). The criteria weights are calculated by using AHP, whereas the relative closeness values of all priority sequencing rules are computed based on TOPSIS, with the help of data acquired from the shop floor of a manufacturing firm. Finally, from the findings of this work, the priority sequencing rules are ranked from most to least preferred. The comprehensive methodology presented in this paper is essential for the management of a workstation to choose the best priority sequencing rule among the available alternatives for processing jobs with maximum benefit.
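
    A compact sketch of the hybrid AHP-TOPSIS flow is given below: AHP's principal eigenvector yields the criteria weights, and TOPSIS ranks the sequencing rules by relative closeness. The comparison matrix and decision matrix are illustrative, not the shop-floor data used in the paper.

```python
# AHP weights from a pairwise comparison matrix, then TOPSIS ranking.

import numpy as np

# AHP: pairwise comparison of 3 criteria (e.g., flow time, tardiness, WIP)
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 3.0],
              [1/5., 1/3., 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                         # criteria weights

# Decision matrix: rows = rules (FCFS, SPT, EDD), columns = criteria
D = np.array([[40.0, 12.0, 9.0],
              [28.0, 18.0, 6.0],
              [33.0, 8.0, 7.0]])
R = D / np.sqrt((D ** 2).sum(axis=0))   # vector-normalize each column
V = R * w                               # weighted normalized matrix

cost = np.array([True, True, True])     # all criteria: smaller is better
ideal = np.where(cost, V.min(axis=0), V.max(axis=0))
anti = np.where(cost, V.max(axis=0), V.min(axis=0))
d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
closeness = d_neg / (d_pos + d_neg)     # relative closeness to the ideal

for rule, c in zip(["FCFS", "SPT", "EDD"], closeness):
    print(rule, round(float(c), 3))     # rank rules by closeness
```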

  4. Examining the Nexus between Grounded Theory and Symbolic Interactionism

    Directory of Open Access Journals (Sweden)

    P. Jane Milliken RN, PhD

    2012-12-01

    Grounded theory is inherently symbolic interactionist; however, not all grounded theory researchers appreciate its importance or benefit from its influence. Elsewhere, we have written about the intrinsic relationship between grounded theory and symbolic interactionism, highlighting the silent, fundamental contribution of symbolic interactionism to the methodology. At the same time, there are significant insights to be had by bringing a conscious awareness of the philosophy of symbolic interactionism to grounded theory research. In this article we discuss the symbolic interactionist concepts of mind, self, and society, and their applicability in grounded theorizing. Our purpose is to highlight foundational concepts of symbolic interactionism and their centrality in the processes of conducting grounded theory research.

  5. Presentation of the RCC-M design and construction rule

    International Nuclear Information System (INIS)

    Quero, J.R.

    1983-03-01

    Presentation of the French rules for the design and construction of nuclear power plant components: stress resistance, material choice, fabrication and quality control, control methodology, etc., covering equipment such as pressurized components (tanks, exchangers, pipes, pumps, valves and fittings), internal and support reactor elements, and non-pressurized small devices.

  6. Symbolic interactionism in grounded theory studies: women surviving with HIV/AIDS in rural northern Thailand.

    Science.gov (United States)

    Klunklin, Areewan; Greenwood, Jennifer

    2006-01-01

    Although it is generally acknowledged that symbolic interactionism and grounded theory are connected, the precise nature of their connection remains implicit and unexplained. As a result, many grounded theory studies are undertaken without an explanatory framework. This in turn results in the description rather than the explanation of data determined. In this report, the authors make explicit and explain the nature of the connections between symbolic interactionism and grounded theory research. Specifically, they make explicit the connection between Blumer's methodological principles and processes and grounded theory methodology. In addition, the authors illustrate the explanatory power of symbolic interactionism in grounded theory using data from a study of the HIV/AIDS experiences of married and widowed Thai women.

  7. A Belief Rule-Based Expert System to Assess Bronchiolitis Suspicion from Signs and Symptoms Under Uncertainty

    DEFF Research Database (Denmark)

    Karim, Rezuan; Hossain, Mohammad Shahadat; Khalid, Md. Saifuddin

    2017-01-01

    The recently developed generic belief rule-based inference methodology using evidential reasoning (RIMER) acts as the inference engine of this BRBES, while the belief rule base serves as the knowledge representation schema. The knowledge base of the system is constructed by using real patient data and expert opinion from...
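
    A highly simplified sketch of belief-rule inference follows. Real RIMER aggregates activated rules with the evidential reasoning algorithm; here a weighted average stands in, and the rules, grades and activation degrees are invented for illustration.

```python
# Toy belief-rule inference: each rule carries a belief distribution over
# consequent grades; activated rules are combined by weighted averaging.

GRADES = ["low", "medium", "high"]   # consequent grades for suspicion

rules = [
    {"if": "wheezing AND age < 2 years", "beliefs": [0.0, 0.3, 0.7]},
    {"if": "cough without wheezing",     "beliefs": [0.6, 0.3, 0.1]},
]

def infer(activation):
    """activation[i]: degree to which rule i's antecedent matches the input."""
    total = sum(activation)
    if total == 0:
        return dict(zip(GRADES, [0.0] * len(GRADES)))
    combined = [0.0] * len(GRADES)
    for act, rule in zip(activation, rules):
        for g, b in enumerate(rule["beliefs"]):
            combined[g] += (act / total) * b
    return dict(zip(GRADES, combined))

# Signs strongly match rule 1 and weakly match rule 2.
print(infer([0.8, 0.2]))   # belief distribution over suspicion grades
```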

  8. New QCD sum rules for nucleon axial-vector coupling constants

    International Nuclear Information System (INIS)

    Lee, F.X.; Leinweber, D.B.; Jin, X.

    1997-01-01

    Two new sets of QCD sum rules for the nucleon axial-vector coupling constants are derived using the external-field technique and generalized interpolating fields. An in-depth study of the predictive ability of these sum rules is carried out using a Monte Carlo based uncertainty analysis. The results show that the standard implementation of the QCD sum rule method has only marginal predictive power for the nucleon axial-vector coupling constants, as the relative errors are large. The errors range from approximately 50% to 100%, compared to the nucleon mass obtained from the same method, which has only a 10%-25% error. The origin of the large errors is examined. Previous analyses of these coupling constants are based on sum rules that have poor operator product expansion convergence and large continuum contributions. Preferred sum rules are identified and their predictions are obtained. We also investigate the new sum rules with an alternative treatment of the problematic transitions which are not exponentially suppressed in the standard treatment. The alternative treatment provides exponential suppression of their contributions relative to the ground state. Implications for other nucleon current matrix elements are also discussed. copyright 1997 The American Physical Society

  9. Automated remedial assessment methodology software system

    International Nuclear Information System (INIS)

    Whiting, M.; Wilkins, M.; Stiles, D.

    1994-11-01

    The Automated Remedial Analysis Methodology (ARAM) software system has been developed by the Pacific Northwest Laboratory to assist the U.S. Department of Energy (DOE) in evaluating cleanup options for over 10,000 contaminated sites across the DOE complex. The automated methodology comprises modules for decision logic diagrams, technology applicability and effectiveness rules, mass balance equations, cost and labor estimating factors and equations, and contaminant stream routing. ARAM is used to select technologies for meeting cleanup targets; determine the effectiveness of the technologies in destroying, removing, or immobilizing contaminants; decide the nature and amount of secondary waste requiring further treatment; and estimate the cost and labor involved when applying technologies.

  10. Linking Symbolic Interactionism and Grounded Theory Methods in a Research Design

    OpenAIRE

    Jennifer Chamberlain-Salaun; Jane Mills; Kim Usher

    2013-01-01

    This article focuses on Corbin and Strauss' evolved version of grounded theory. In the third edition of their seminal text, Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory, the authors present 16 assumptions that underpin their conception of grounded theory methodology. The assumptions stem from a symbolic interactionist perspective of social life, including the themes of meaning...

  11. Rule-bases construction through self-learning for a table-based Sugeno-Takagi fuzzy logic control system

    Directory of Open Access Journals (Sweden)

    C. Boldisor

    2009-12-01

    A self-learning-based methodology for building the rule base of a fuzzy logic controller (FLC) is presented and verified, aiming to give fuzzy logic control systems intelligent characteristics. The methodology is a simplified version of those presented in today's literature; some aspects are intentionally ignored, since they rarely appear in control system engineering, and a SISO process is considered here. The fuzzy inference system obtained is a table-based Sugeno-Takagi type. The system's desired performance is defined by a reference model, and rules are extracted from recorded data after the correct control actions are learned. The presented algorithm is tested by constructing the rule base of a fuzzy controller for a DC drive application. The system's performance and the method's viability are analyzed.
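
    The self-learning idea, learning one crisp consequent per cell of an error-partition table from recorded data, can be reduced to a few lines. The samples and partition below are invented; a real run would record the data while tracking the reference model.

```python
# Learn a table-based zero-order Sugeno rule base from recorded
# (error, control) pairs once the correct control actions are known.

def build_rule_table(samples, bins):
    """samples: (error, control) pairs; bins: error-partition edges.
    Returns one crisp consequent per cell (mean of observed controls)."""
    table = {}
    for e, u in samples:
        for i in range(len(bins) - 1):
            if bins[i] <= e < bins[i + 1]:
                table.setdefault(i, []).append(u)
    return {i: sum(us) / len(us) for i, us in table.items()}

# Recorded data from runs that tracked the reference model (hypothetical)
samples = [(-0.9, -4.5), (-0.4, -2.1), (-0.1, -0.4),
           (0.2, 0.9), (0.5, 2.4), (0.8, 4.2)]
table = build_rule_table(samples, bins=[-1.0, -0.5, 0.0, 0.5, 1.0])
print(table)   # rule table: error cell -> control action
```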

  12. Toward methodological emancipation in applied health research.

    Science.gov (United States)

    Thorne, Sally

    2011-04-01

    In this article, I trace the historical groundings of what have become methodological conventions in the use of qualitative approaches to answer questions arising from the applied health disciplines and advocate an alternative logic more strategically grounded in the epistemological orientations of the professional health disciplines. I argue for an increasing emphasis on the modification of conventional qualitative approaches to the particular knowledge demands of the applied practice domain, challenging the merits of what may have become unwarranted attachment to theorizing. Reorienting our methodological toolkits toward the questions arising within an evidence-dominated policy agenda, I encourage my applied health disciplinary colleagues to make themselves useful to that larger project by illuminating that which quantitative research renders invisible, problematizing the assumptions on which it generates conclusions, and filling in the gaps in knowledge needed to make decisions on behalf of people and populations.

  13. Methodology for estimating human perception to tremors in high-rise buildings

    Science.gov (United States)

    Du, Wenqi; Goh, Key Seng; Pan, Tso-Chien

    2017-07-01

    Human perception to tremors during earthquakes in high-rise buildings is usually associated with psychological discomfort such as fear and anxiety. This paper presents a methodology for estimating the level of perception to tremors for occupants living in high-rise buildings subjected to ground motion excitations. Unlike other approaches based on empirical or historical data, the proposed methodology performs a regression analysis using the analytical results of two generic models of 15 and 30 stories. The recorded ground motions in Singapore are collected and modified for structural response analyses. Simple predictive models are then developed to estimate the perception level to tremors based on a proposed ground motion intensity parameter—the average response spectrum intensity in the period range between 0.1 and 2.0 s. These models can be used to predict the percentage of occupants in high-rise buildings who may perceive the tremors at a given ground motion intensity. Furthermore, the models are validated with two recent tremor events reportedly felt in Singapore. It is found that the estimated results match reasonably well with the reports in the local newspapers and from the authorities. The proposed methodology is applicable to urban regions where people living in high-rise buildings might feel tremors during earthquakes.
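
    As a reading aid, here is a minimal sketch of the ground motion intensity parameter named in the abstract (average response-spectrum intensity over the 0.1-2.0 s period band). The logistic link from that intensity to the fraction of occupants perceiving the tremor is an illustrative assumption; the abstract does not state the regression form or coefficients.

```python
import numpy as np

def average_spectrum_intensity(periods, sa):
    """Mean spectral acceleration over the 0.1-2.0 s period band."""
    band = (periods >= 0.1) & (periods <= 2.0)
    t, s = periods[band], sa[band]
    return np.trapz(s, t) / (t[-1] - t[0])

def perceived_fraction(asi, a=2.0, b=1.0):
    """Hypothetical logistic model: fraction of occupants feeling the tremor."""
    return 1.0 / (1.0 + np.exp(-(a * np.log(asi) + b)))

periods = np.linspace(0.05, 3.0, 60)
sa = 0.08 * np.exp(-periods)            # illustrative response spectrum, in g
print(perceived_fraction(average_spectrum_intensity(periods, sa)))
```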

  14. Organizing the Methodology Work at Higher School

    Directory of Open Access Journals (Sweden)

    O. A. Plaksina

    2012-01-01

    Full Text Available The paper considers the components of organizing methodology work in higher school training. The authors' analysis of existing methodology systems reveals that their advantages and disadvantages are related to the type of system-creating element in the organizational structure of the methodology system. The optimal scheme for such a system has been developed in the context of vocational school reorganization, implying the specification and expansion of the set of basic design principles of any control system. Following the suggested organizational approach provides the grounds for teachers' self-development and professional growth. The methodology of the approach allows the given structure to be used in any higher educational institution, enabling the system's transition from simple functioning to a sustainable development mode.

  15. Enhanced spatial resolution on figures versus grounds.

    Science.gov (United States)

    Hecht, Lauren N; Cosman, Joshua D; Vecera, Shaun P

    2016-07-01

    Much is known about the cues that determine figure-ground assignment, but less is known about the consequences of figure-ground assignment on later visual processing. Previous work has demonstrated that regions assigned figural status are subjectively more shape-like and salient than background regions. The increase in subjective salience of figural regions could be caused by a number of processes, one of which may be enhanced perceptual processing (e.g., an enhanced neural representation) of figures relative to grounds. We explored this hypothesis by having observers perform a perceptually demanding spatial resolution task in which targets appeared on either figure or ground regions. To rule out a purely attentional account of figural salience, observers discriminated targets on the basis of a region's color (red or green), which was equally likely to define the figure or the ground. The results of our experiments showed that targets appearing on figures were discriminated more accurately than those appearing in ground regions. In addition, targets appearing on figures were discriminated better than those presented in regions considered figurally neutral, but targets appearing within ground regions were discriminated more poorly than those appearing in figurally neutral regions. Taken together, our findings suggest that when two regions share a contour, regions assigned as figure are perceptually enhanced, whereas regions assigned as ground are perceptually suppressed.

  16. Enhanced spatial resolution on figures versus grounds

    Science.gov (United States)

    Hecht, Lauren N.; Cosman, Joshua D.; Vecera, Shaun P.

    2016-01-01

    Much is known about the cues that determine figure-ground assignment, but less is known about the consequences of figure-ground assignment on later visual processing. Previous work has demonstrated that regions assigned figural status are subjectively more shape-like and salient than background regions. The increase in subjective salience of figural regions could be caused by a number of processes, one of which may be enhanced perceptual processing (e.g., an enhanced neural representation) of figures relative to grounds. We explored this hypothesis by having observers perform a perceptually demanding spatial resolution task in which targets appeared either on figure or ground regions. To rule out a purely attentional account of figural salience, observers discriminated targets on the basis of a region’s color (red or green), which was equally likely to define the figure or the ground. The results of our experiments show that targets appearing on figures were discriminated more accurately than those appearing in ground regions. In addition, targets appearing on figures were discriminated better than those presented in regions considered figurally neutral, but targets appearing within ground regions were discriminated more poorly than those appearing in figurally neutral regions. Taken together, our findings suggest that when two regions share a contour, regions assigned as figure are perceptually enhanced, whereas regions assigned as grounds are perceptually suppressed. PMID:27048441

  17. A revised ground-motion and intensity interpolation scheme for ShakeMap

    Science.gov (United States)

    Worden, C.B.; Wald, D.J.; Allen, T.I.; Lin, K.; Garcia, D.; Cua, G.

    2010-01-01

    We describe a weighted-average approach for incorporating various types of data (observed peak ground motions and intensities and estimates from ground-motion prediction equations) into the ShakeMap ground motion and intensity mapping framework. This approach represents a fundamental revision of our existing ShakeMap methodology. In addition, the increased availability of near-real-time macroseismic intensity data, the development of new relationships between intensity and peak ground motions, and new relationships to directly predict intensity from earthquake source information have facilitated the inclusion of intensity measurements directly into ShakeMap computations. Our approach allows for the combination of (1) direct observations (ground-motion measurements or reported intensities), (2) observations converted from intensity to ground motion (or vice versa), and (3) estimated ground motions and intensities from prediction equations or numerical models. Critically, each of the aforementioned data types must include an estimate of its uncertainties, including those caused by scaling the influence of observations to surrounding grid points and those associated with estimates given an unknown fault geometry. The ShakeMap ground-motion and intensity estimates are an uncertainty-weighted combination of these various data and estimates. A natural by-product of this interpolation process is an estimate of total uncertainty at each point on the map, which can be vital for comprehensive inventory loss calculations. We perform a number of tests to validate this new methodology and find that it produces a substantial improvement in the accuracy of ground-motion predictions over empirical prediction equations alone.
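
    The core arithmetic described here is an uncertainty-weighted average. Below is a minimal sketch, assuming simple inverse-variance weighting of the three data types (direct observations, converted observations, and model estimates); the numbers are illustrative and this is not ShakeMap's actual weighting scheme.

```python
import numpy as np

def weighted_estimate(values, variances):
    """Inverse-variance weighted mean and the variance of the result."""
    w = 1.0 / np.asarray(variances, dtype=float)
    mean = np.sum(w * np.asarray(values, dtype=float)) / np.sum(w)
    return mean, 1.0 / np.sum(w)   # combined estimate and its uncertainty

# e.g. an observed PGA, an intensity-converted PGA, and a GMPE estimate (g)
mean, var = weighted_estimate([0.12, 0.10, 0.08], [0.01, 0.04, 0.09])
print(mean, var)
```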

  18. Does Bertrand's rule apply to macronutrients?

    OpenAIRE

    Raubenheimer, D; Lee, K.P; Simpson, S.J

    2005-01-01

    It has been known for over a century that the dose–response curve for many micronutrients is non-monotonic, having an initial stage of increasing benefits with increased intake, followed by increasing costs as excesses become toxic. This phenomenon, termed Bertrand's rule, is widely assumed not to apply to caloric macronutrients. To date this assumption has been safe, owing to the considerable methodological challenges involved in coaxing animals to over-ingest macronutrients in a way that en...

  19. Post-decomposition optimizations using pattern matching and rule-based clustering for multi-patterning technology

    Science.gov (United States)

    Wang, Lynn T.-N.; Madhavan, Sriram

    2018-03-01

    A pattern matching and rule-based polygon clustering methodology with DFM scoring is proposed to detect decomposition-induced manufacturability detractors and fix the layout designs prior to manufacturing. A pattern matcher scans the layout for pre-characterized patterns from a library. If a pattern is detected, rule-based clustering identifies the neighboring polygons that interact with those captured by the pattern. Then, DFM scores are computed for the possible layout fixes, and the fix with the best score is applied. The proposed methodology was applied to two 20 nm products with a chip area of 11 mm² on the metal 2 layer. All the hotspots were resolved. The number of DFM spacing violations decreased by 7-15%.
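
    A minimal sketch of the fix-selection flow outlined above: for each detected hotspot, enumerate candidate layout fixes, score them, and apply the best. The data, fix names, and scoring function are stand-ins for a real layout engine and DFM rule deck, not the paper's implementation.

```python
def resolve_hotspots(hotspots, candidate_fixes, dfm_score):
    """For each detected hotspot, keep the candidate fix that scores best."""
    return {h: max(candidate_fixes(h), key=dfm_score) for h in hotspots}

# toy usage: three hypothetical fix types with illustrative scores
hotspots = ["m2_tip2tip_A", "m2_tip2side_B"]        # from the pattern matcher
fixes = lambda h: [(h, "shift"), (h, "widen"), (h, "reroute")]
score = lambda fix: {"shift": 2.0, "widen": 3.0, "reroute": 1.0}[fix[1]]
print(resolve_hotspots(hotspots, fixes, score))
```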

  20. Toward Paradigmatic Change in TESOL Methodologies: Building Plurilingual Pedagogies from the Ground Up

    Science.gov (United States)

    Lin, Angel

    2013-01-01

    Contemporary TESOL methodologies have been characterized by compartmentalization of languages in the classroom. However, recent years have seen the beginning signs of paradigmatic change in TESOL methodologies that indicate a move toward plurilingualism. In this article, the author draws on the case of Hong Kong to illustrate how, in the past four…

  1. Spatial variation in near-ground radiation and low temperature. Interactions with forest vegetation

    Energy Technology Data Exchange (ETDEWEB)

    Blennow, K.

    1997-10-01

    Low temperature has a large impact on the survival and distribution of plants. Interactive effects with high irradiance lead to cold-induced photoinhibition, which may affect the establishment and growth of tree seedlings. In this thesis, novel approaches are applied for relating the spatial variability in low temperature and irradiance to photosynthetic performance and growth of tree seedlings, and for modelling the micro- and local-scale spatial variations in low temperature for heterogeneous terrain. The methodologies include the development and use of a digital image analysis system for hemispherical photographs, the use of Geographic Information Systems (GIS) and statistical methods, and field data acquisition of meteorological elements, plant structure, growth and photosynthetic performance. Temperature and amounts of intercepted direct radiant energy for seedlings on clear days (IDRE) were related to chlorophyll a fluorescence and the dry weight of seedlings. The combination of increased IDRE with reduced minimum temperatures resulted in persistent and strong photoinhibition as the season progressed, with likely implications for the establishment of tree seedlings at forest edges and within shelterwood. For models of the spatial distribution of low air temperature, the sky view factor was used to parameterize the radiative cooling, whilst drainage, ponding and stagnation of cold air, and thermal properties of the ground were all considered. The models hint at which scales and processes govern the development of spatial variations in low temperature for the construction of corresponding mechanistic models. The methodology is well suited for detecting areas that will be frost prone after clearing of forest and for comparing the magnitudes of impacts on low air temperature of forest management practices, such as shelterwood and soil preparation. The results can be used to formulate ground rules for use in practical forestry. 141 refs, 5 figs, 1 tab

  2. FEBEX bentonite colloid stability in ground water

    Energy Technology Data Exchange (ETDEWEB)

    Seher, H.; Schaefer, T.; Geckeis, H. [Inst. fuer Nukleare Entsorgung (INE), Forschungszentrum Karlsruhe, 76021 Karlsruhe (Germany)]. e-mail: holger.seher@ine.fzk.de; Fanghaenel, T. [Ruprecht-Karls-Univ. Heidelberg, Physikalisch-Chemisches Inst., D-69120 Heidelberg (Germany)

    2007-06-15

    Coagulation experiments are performed to identify the geochemical conditions for the stability of FEBEX bentonite colloids in granite ground water. The experiments are carried out by varying pH, ionic strength and type of electrolyte. The dynamic light scattering technique (photon correlation spectroscopy) is used to measure the size evolution of the colloids with time. Agglomeration rates are higher in MgCl{sub 2} and CaCl{sub 2} than in NaCl solution. Relative agglomeration rates follow approximately the Schulze-Hardy rule. Increasing agglomeration rates at pH>8 are observed in experiments with MgCl{sub 2} and CaCl{sub 2}, which are, however, caused by coprecipitation phenomena. Bentonite colloid stability fields derived from the colloid agglomeration experiments predict low colloid stabilization in granite ground water taken from Aespoe, Sweden, and relatively high colloid stability in Grimsel ground water (Switzerland).

  3. Grounded theory in music therapy research.

    Science.gov (United States)

    O'Callaghan, Clare

    2012-01-01

    Grounded theory is one of the most common methodologies used in constructivist (qualitative) music therapy research. Researchers use the term "grounded theory" when denoting varying research designs and theoretical outcomes. This may be challenging for novice researchers when considering whether grounded theory is appropriate for their research phenomena. This paper examines grounded theory within music therapy research. Grounded theory is briefly described, including some of its "contested" ideas. A literature search was conducted using the descriptor "music therapy and grounded theory" in the PubMed, CINAHL, PsycINFO, SCOPUS, ERIC (CSA), and Web of Science databases, and a music therapy monograph series. A descriptive analysis was performed on the uncovered studies to examine researched phenomena, grounded theory methods used, and how findings were presented. Thirty music therapy research projects were found in refereed journals and monographs from 1993 to "in press." The Strauss and Corbin approach to grounded theory dominates the field. Descriptors used to signify grounded theory components in the studies varied greatly. Researchers have used partial or complete grounded theory methods to examine clients', family members', staff, music therapy "overhearers," music therapists', and students' experiences, as well as music therapy creative products and professional views, issues, and literature. Seven grounded theories were offered. It is suggested that grounded theory researchers clarify what and who inspired their design, why partial grounded theory methods were used (when relevant), and their ontology. By elucidating the assumptions underpinning their data collection, analysis, and findings' contribution, researchers will continue to improve music therapy research using grounded theory methods.

  4. Genetic Programming for the Generation of Crisp and Fuzzy Rule Bases in Classification and Diagnosis of Medical Data

    DEFF Research Database (Denmark)

    Dounias, George; Tsakonas, Athanasios; Jantzen, Jan

    2002-01-01

    This paper demonstrates two methodologies for the construction of rule-based systems in medical decision making. The first approach consists of a method combining genetic programming and heuristic hierarchical rule-base construction. The second model is composed of a strongly-typed genetic...

  5. Seismic methodology in determining basis earthquake for nuclear installation

    International Nuclear Information System (INIS)

    Ameli Zamani, Sh.

    2008-01-01

    Design basis earthquake ground motions for nuclear installations should be determined to assure the design purpose of reactor safety: that reactors should be built and operated to pose no undue risk to public health and safety from earthquakes and other hazards. Regarding the influence of seismic hazard on a site, large numbers of earthquake ground motions can be predicted considering possible variability among the source, path, and site parameters. However, seismic safety design using all predicted ground motions is practically impossible. In the determination of design basis earthquake ground motions it is therefore important to represent the influences of the large numbers of earthquake ground motions derived from the seismic ground motion prediction methods for the surrounding seismic sources. Viewing the relations between current design basis earthquake ground motion determination and modern earthquake ground motion estimation, the development of a risk-informed design basis earthquake ground motion methodology is discussed for insight into the ongoing modernization of the Examination Guide for Seismic Design on NPP.

  6. Lindhard's polarization parameter and atomic sum rules in the local plasma approximation

    DEFF Research Database (Denmark)

    Cabrera-Trujillo, R.; Apell, P.; Oddershede, J.

    2017-01-01

    In this work, we analyze the effects of the Lindhard polarization parameter, χ, on the sum rule S_p within the local plasma approximation (LPA), as well as on the logarithmic sum rule L_p = dS_p/dp, in both cases for the system in an initial excited state. We show results for a hydrogenic atom with nuclear charge described in terms of a screened charge Z* for the ground state. Our study shows that by increasing χ, the sum rule for p<0 decreases while for p>0 it increases, and the value p=0 provides the normalization/closure relation, which remains fixed to the number of electrons for the same initial state. When p is fixed...

  7. Sum rules in extended RPA theories

    International Nuclear Information System (INIS)

    Adachi, S.; Lipparini, E.

    1988-01-01

    Different moments m_k of the excitation strength function are studied in the framework of the second RPA and of the extended RPA in which 2p2h correlations are explicitly introduced into the ground state by using first-order perturbation theory. Formal properties of the equations of motion concerning sum rules are derived and compared with those exhibited by the usual 1p1h RPA. The problem of the separation of the spurious solutions in extended RPA calculations is also discussed. (orig.)

  8. Gaussian quadrature for splines via homotopy continuation: Rules for C2 cubic splines

    KAUST Repository

    Barton, Michael

    2015-10-24

    We introduce a new concept for generating optimal quadrature rules for splines. To generate an optimal quadrature rule in a given (target) spline space, we build an associated source space with known optimal quadrature and transfer the rule from the source space to the target one, while preserving the number of quadrature points and therefore optimality. The quadrature nodes and weights, considered as a higher-dimensional point, are a zero of a particular system of polynomial equations. As the space is continuously deformed by changing the source knot vector, the quadrature rule gets updated using polynomial homotopy continuation. For example, starting with C^1 cubic splines with uniform knot sequences, we demonstrate the methodology by deriving the optimal rules for uniform C^2 cubic spline spaces, where the rule was only conjectured to date. We validate our algorithm by showing that the resulting quadrature rule is independent of the path chosen between the target and the source knot vectors as well as of the source rule chosen.
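
    For orientation, the polynomial system alluded to here can be stated compactly. In our notation (an assumption, not the paper's), with {B_i} a basis of the n-dimensional target spline space on [a, b], an optimal m-node rule with nodes τ_j and weights w_j must satisfy the exactness conditions

```latex
\int_a^b B_i(x)\,\mathrm{d}x \;=\; \sum_{j=1}^{m} w_j\, B_i(\tau_j),
\qquad i = 1,\dots,n .
```

    Homotopy continuation then tracks the solution (w, τ) of this system numerically as the source knot vector is continuously deformed into the target one.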

  9. Gaussian quadrature for splines via homotopy continuation: Rules for C2 cubic splines

    KAUST Repository

    Barton, Michael; Calo, Victor M.

    2015-01-01

    We introduce a new concept for generating optimal quadrature rules for splines. To generate an optimal quadrature rule in a given (target) spline space, we build an associated source space with known optimal quadrature and transfer the rule from the source space to the target one, while preserving the number of quadrature points and therefore optimality. The quadrature nodes and weights, considered as a higher-dimensional point, are a zero of a particular system of polynomial equations. As the space is continuously deformed by changing the source knot vector, the quadrature rule gets updated using polynomial homotopy continuation. For example, starting with C^1 cubic splines with uniform knot sequences, we demonstrate the methodology by deriving the optimal rules for uniform C^2 cubic spline spaces, where the rule was only conjectured to date. We validate our algorithm by showing that the resulting quadrature rule is independent of the path chosen between the target and the source knot vectors as well as of the source rule chosen.

  10. Evolving Rule-Based Systems in two Medical Domains using Genetic Programming

    DEFF Research Database (Denmark)

    Tsakonas, A.; Dounias, G.; Jantzen, Jan

    2004-01-01

    We demonstrate, compare and discuss the application of two genetic programming methodologies for the construction of rule-based systems in two medical domains: the diagnosis of Aphasia's subtypes and the classification of Pap-Smear Test examinations. The first approach consists of a scheme...

  11. Assessing Financial Education Methods: Principles vs. Rules-of-Thumb Approaches

    Science.gov (United States)

    Skimmyhorn, William L.; Davies, Evan R.; Mun, David; Mitchell, Brian

    2016-01-01

    Despite thousands of programs and tremendous public and private interest in improving financial decision-making, little is known about how best to teach financial education. Using an experimental approach, the authors estimated the effects of two different education methodologies (principles-based and rules-of-thumb) on the knowledge,…

  12. SUM-RULES FOR MAGNETIC DICHROISM IN RARE-EARTH 4F-PHOTOEMISSION

    NARCIS (Netherlands)

    THOLE, BT; VANDERLAAN, G

    1993-01-01

    We present new sum rules for magnetic dichroism in spin polarized photoemission from partly filled shells which give the expectation values of the orbital and spin magnetic moments and their correlations in the ground state. We apply this to the 4f photoemission of rare earths, where the

  13. 48 CFR 6101.27 - Relief from decision or order [Rule 27].

    Science.gov (United States)

    2010-10-01

    ... order [Rule 27]. (a) Grounds. The Board may relieve a party from the operation of a final decision or... discovered, even through due diligence; (2) Justifiable or excusable mistake, inadvertence, surprise, or neglect; (3) Fraud, misrepresentation, or other misconduct of an adverse party; (4) The decision has been...

  14. Clarification of the Blurred Boundaries between Grounded Theory and Ethnography: Differences and Similarities

    Science.gov (United States)

    Aldiabat, Khaldoun; Le Navenec, Carol-Lynne

    2011-01-01

    There is confusion among graduate students about how to select the qualitative methodology that best fits their research question. Often this confusion arises in regard to making a choice between a grounded theory methodology and an ethnographic methodology. This difficulty may stem from the fact that these students do not have a clear…

  15. Developing a Guideline for Reporting and Evaluating Grounded Theory Research Studies (GUREGT)

    DEFF Research Database (Denmark)

    Berthelsen, Connie Bøttcher; Grimshaw-Aagaard, Søsserr Lone Smilla; Hansen, Carrinna

    2018-01-01

    The aim of this study was to develop a guideline for reporting and evaluating grounded theory research studies. The study was conducted in three phases. Phase 1: A structured literature review in PubMed, CINAHL, Cochrane Libraries, PsycInfo and SCOPUS to identify recommendations for reporting and evaluating grounded theory. Phase 2: A selective review of the methodological grounded theory...

  16. Maximizing Health or Sufficient Capability in Economic Evaluation? A Methodological Experiment of Treatment for Drug Addiction.

    Science.gov (United States)

    Goranitis, Ilias; Coast, Joanna; Day, Ed; Copello, Alex; Freemantle, Nick; Frew, Emma

    2017-07-01

    Conventional practice within the United Kingdom and beyond is to conduct economic evaluations with "health" as evaluative space and "health maximization" as the decision-making rule. However, there is increasing recognition that this evaluative framework may not always be appropriate, and this is particularly the case within public health and social care contexts. This article presents a methodological case study designed to explore the impact of changing the evaluative space within an economic evaluation from health to capability well-being and the decision-making rule from health maximization to the maximization of sufficient capability. Capability well-being is an evaluative space grounded on Amartya Sen's capability approach and assesses well-being based on individuals' ability to do and be the things they value in life. Sufficient capability is an egalitarian approach to decision making that aims to ensure everyone in society achieves a normatively sufficient level of capability well-being. The case study is treatment for drug addiction, and the cost-effectiveness of 2 psychological interventions relative to usual care is assessed using data from a pilot trial. Analyses are undertaken from a health care and a government perspective. For the purpose of the study, quality-adjusted life years (measured using the EQ-5D-5L) and years of full capability equivalent and years of sufficient capability equivalent (both measured using the ICECAP-A [ICEpop CAPability measure for Adults]) are estimated. The study concludes that different evaluative spaces and decision-making rules have the potential to offer opposing treatment recommendations. The implications for policy makers are discussed.

  17. Approach to developing a ground-motion design basis for facilities important to safety at Yucca Mountain

    International Nuclear Information System (INIS)

    King, J.L.

    1990-01-01

    This paper discusses a methodology for developing a ground-motion design basis for prospective facilities at Yucca Mountain that are important to safety. The methodology utilizes a quasi-deterministic construct called the 10,000-year cumulative-slip earthquake that is designed to provide a conservative, robust, and reproducible estimate of ground motion that has a one-in-ten chance of occurring during the preclosure period. This estimate is intended to define a ground-motion level for which the seismic design would ensure minimal disruption to operations; engineering analyses to ensure safe performance are included.

  18. Average System Cost Methodology : Administrator's Record of Decision.

    Energy Technology Data Exchange (ETDEWEB)

    United States. Bonneville Power Administration.

    1984-06-01

    Significant features of the average system cost (ASC) methodology adopted are: retention of the jurisdictional approach, where retail rate orders of regulatory agencies provide primary data for computing the ASC for utilities participating in the residential exchange; inclusion of transmission costs; exclusion of construction work in progress; use of a utility's weighted cost of debt securities; exclusion of income taxes; simplification of separation procedures for subsidized generation and transmission accounts from other accounts; clarification of ASC methodology rules; a more generous review timetable for individual filings; phase-in of the reformed methodology; and a requirement that each exchanging utility file under the new methodology within 20 days of implementation by the Federal Energy Regulatory Commission. Of the ten major participating utilities, the revised ASC will substantially affect only three. (PSB)

  19. Using GO-WAR for mining cross-ontology weighted association rules.

    Science.gov (United States)

    Agapito, Giuseppe; Cannataro, Mario; Guzzi, Pietro Hiram; Milano, Marianna

    2015-07-01

    The Gene Ontology (GO) is a structured repository of concepts (GO terms) that are associated with one or more gene products. The process of association is referred to as annotation. The relevance and the specificity of both GO terms and annotations are evaluated by a measure defined as information content (IC). The analysis of annotated data is thus an important challenge for bioinformatics. There exist different approaches to such analysis. Among these, the use of association rules (AR) may provide useful knowledge, and it has been used in some applications, e.g. improving the quality of annotations. Nevertheless, classical association rule algorithms take into account neither the source of an annotation nor its importance, yielding candidate rules with low IC. This paper presents GO-WAR (Gene Ontology-based Weighted Association Rules), a methodology for extracting weighted association rules. GO-WAR can extract association rules with a high level of IC without loss of support and confidence from a dataset of annotated data. A case study on the use of GO-WAR on publicly available GO annotation datasets demonstrates that our method outperforms current state-of-the-art approaches. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
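
    A minimal sketch of the kind of IC-weighted interest measure such a method substitutes for plain support. The weighting choice (mean IC of the itemset, scaled by relative frequency), the GO identifiers, and the IC values are illustrative assumptions, not GO-WAR's exact definition.

```python
def weighted_support(itemset, transactions, ic):
    """Mean IC of the itemset's terms, scaled by its relative frequency."""
    weight = sum(ic[term] for term in itemset) / len(itemset)
    freq = sum(itemset <= txn for txn in transactions) / len(transactions)
    return weight * freq

ic = {"GO:0006915": 5.2, "GO:0008219": 4.8, "GO:0003674": 0.1}  # illustrative IC
txns = [{"GO:0006915", "GO:0008219"}, {"GO:0006915"}, {"GO:0003674"}]
print(weighted_support({"GO:0006915", "GO:0008219"}, txns, ic))  # 5.0 * 1/3
```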

  20. Post place and route design-technology co-optimization for scaling at single-digit nodes with constant ground rules

    Science.gov (United States)

    Mattii, Luca; Milojevic, Dragomir; Debacker, Peter; Berekovic, Mladen; Sherazi, Syed Muhammad Yasser; Chava, Bharani; Bardon, Marie Garcia; Schuddinck, Pieter; Rodopoulos, Dimitrios; Baert, Rogier; Gerousis, Vassilios; Ryckaert, Julien; Raghavan, Praveen

    2018-01-01

    Standard-cell design, technology choices, and place and route (P&R) efficiency are deeply interrelated in CMOS technology nodes below 10 nm, where lower-track-count cells and higher pin densities pose increasingly challenging problems to the router in terms of congestion and pin accessibility. To evaluate and downselect the best solutions, a holistic design-technology co-optimization approach leveraging state-of-the-art P&R tools is thus necessary. We adopt such an approach using the imec N7 technology platform, with a contacted poly pitch of 42 nm and a tightest metal pitch of 32 nm, by comparing the post-P&R area of an IP block for different standard cell configurations, technology options, and cell heights. Keeping the technology node and the set of ground rules unchanged, we demonstrate that a careful combination of these solutions can enable area gains of up to 50%, comparable with the area benefits of migrating to another node. We further demonstrate that these area benefits can be achieved at isoperformance with >20% reduced power. Since, at the end of the CMOS roadmap, conventional scaling enacted through pitch reduction is made ever more challenging by constraints imposed by lithography limits, material resistivity, manufacturability, and ultimately wafer cost, the approach shown herein offers a valid, attractive, and low-cost alternative.

  1. Learning Display Rules: The Socialization of Emotion Expression in Infancy.

    Science.gov (United States)

    Malatesta, Carol Zander; Haviland, Jeannette M.

    1982-01-01

    Develops a methodology for studying emotion socialization and examines the synchrony of mother and infant expressions to determine whether "instruction" in display rules is underway in early infancy and what the short-term effects of such instruction on infant expression might be. Sixty dyads were videotaped during play and reunion after brief…

  2. Diffraction or Reflection? Sketching the Contours of Two Methodologies in Educational Research

    Science.gov (United States)

    Bozalek, Vivienne; Zembylas, Michalinos

    2017-01-01

    Internationally, an interest is emerging in a growing body of work on what has become known as "diffractive methodologies" drawing attention to ontological aspects of research. Diffractive methodologies have largely been developed in response to a dissatisfaction with practices of "reflexivity", which are seen to be grounded in…

  3. IMPROVED ALGORITHM FOR CALCULATING COMPLEX NON-EQUIPOTENTIAL GROUNDING DEVICES OF ELECTRICAL INSTALLATIONS TAKING INTO ACCOUNT CONDUCTIVITY OF NATURAL GROUNDINGS

    Directory of Open Access Journals (Sweden)

    K. A. Starkov

    2017-08-01

    Full Text Available Purpose. A method is offered for substituting natural concentrated groundings with a set of electrodes so that they can be taken into account in the algorithm for calculating the electrical characteristics of complex grounding connections in electrical installations. An equivalent model consisting of a set of linear electrodes is chosen in accordance with two criteria: leakage resistance and potentials on the ground surface. Methodology. We have applied the induced potential method and methods for computing branched electrical circuits with distributed parameters. Results. We have obtained an algorithm for calculating complex non-equipotential grounding connections, which makes it possible to obtain refined values of the potential distribution in electric stations and substations with outdoor switchgear. Originality. For the first time, we have taken into account the conductivity of natural concentrated groundings by a set of vertical and horizontal electrodes based on equivalent electrical characteristics applied to a two-layer ground. Practical value. The use of the proposed calculation algorithm in the electric grids of JSC «Kharkivoblenergo» made it possible to determine the values of the potential distribution at short circuit in an electrical substation, taking into account the influence of the conductivity of natural concentrated groundings.

  4. Estimating cotton canopy ground cover from remotely sensed scene reflectance

    International Nuclear Information System (INIS)

    Maas, S.J.

    1998-01-01

    Many agricultural applications require spatially distributed information on growth-related crop characteristics that could be supplied through aircraft or satellite remote sensing. A study was conducted to develop and test a methodology for estimating plant canopy ground cover for cotton (Gossypium hirsutum L.) from scene reflectance. Previous studies indicated that a relatively simple relationship between ground cover and scene reflectance could be developed based on linear mixture modeling. Theoretical analysis indicated that the effects of shadows in the scene could be compensated for by averaging the results obtained using scene reflectance in the red and near-infrared wavelengths. The methodology was tested using field data collected over several years from cotton test plots in Texas and California. Results of the study appear to verify the utility of this approach. Since the methodology relies on information that can be obtained solely through remote sensing, it would be particularly useful in applications where other field information, such as plant size, row spacing, and row orientation, is unavailable
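
    The linear-mixture idea this record builds on reduces to a one-line inversion per band: scene reflectance is modeled as a cover-weighted mix of plant and soil reflectance, and averaging the red and near-infrared estimates compensates for shadow effects. A minimal sketch; the endmember reflectances below are illustrative, not the study's values.

```python
def ground_cover(r_scene, r_soil, r_plant):
    """Fractional cover f from R_scene = f*R_plant + (1 - f)*R_soil."""
    return (r_scene - r_soil) / (r_plant - r_soil)

f_red = ground_cover(0.10, 0.18, 0.05)   # red band estimate
f_nir = ground_cover(0.35, 0.25, 0.50)   # near-infrared band estimate
f = 0.5 * (f_red + f_nir)                # shadow-compensating average
print(f)
```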

  5. Ares I-X Ground Diagnostic Prototype

    Science.gov (United States)

    Schwabacher, Mark A.; Martin, Rodney Alexander; Waterman, Robert D.; Oostdyk, Rebecca Lynn; Ossenfort, John P.; Matthews, Bryan

    2010-01-01

    The automation of pre-launch diagnostics for launch vehicles offers three potential benefits: improving safety, reducing cost, and reducing launch delays. The Ares I-X Ground Diagnostic Prototype demonstrated anomaly detection, fault detection, fault isolation, and diagnostics for the Ares I-X first-stage Thrust Vector Control and for the associated ground hydraulics while the vehicle was in the Vehicle Assembly Building at Kennedy Space Center (KSC) and while it was on the launch pad. The prototype combines three existing tools. The first tool, TEAMS (Testability Engineering and Maintenance System), is a model-based tool from Qualtech Systems Inc. for fault isolation and diagnostics. The second tool, SHINE (Spacecraft Health Inference Engine), is a rule-based expert system that was developed at the NASA Jet Propulsion Laboratory. We developed SHINE rules for fault detection and mode identification, and used the outputs of SHINE as inputs to TEAMS. The third tool, IMS (Inductive Monitoring System), is an anomaly detection tool that was developed at NASA Ames Research Center. The three tools were integrated and deployed to KSC, where they were interfaced with live data. This paper describes how the prototype performed during the period of time before the launch, including accuracy and computer resource usage. The paper concludes with some of the lessons that we learned from the experience of developing and deploying the prototype.

  6. The rules of the Rue Morgue

    Energy Technology Data Exchange (ETDEWEB)

    Vanzi, M. [Univ. of Cagliari (Italy)]

    1995-12-31

    This paper, it is evident, is mostly a joke, based on the fascinating (but not original) consideration of any failure analysis as a detective story. Poe's tale is a perfect instrument (but surely not the only possible one) for playing the game. If any practical application of "The Rules of the Rue Morgue" may be expected, it is in the possibility of defining what leaves us unsatisfied when a Failure Analyst's result sounds out of tune. The reported Violations to the Dupin Postulate summarize the objections that the author would like to repeat for his own analyses, and for those cases in which he is required to review the work of others. On the constructive side, the proposed Rules, it has been repeatedly said, are common sense indications, and are surely not exhaustive on a practical ground. Skill, patience, luck and memory are also required, but, unfortunately, not always and not together available. It should be of the greatest aid for the Failure Analyst community, in any case, that each public report could point out how it obeyed a widely accepted set of failure analysis rules. Maybe -- why not? -- the Rules of the Rue Morgue. As a last consideration, to conclude the joke, the author invites his readers to open Poe's original tale at the very beginning of the story, when Monsieur Dupin is introduced. Thinking of the Failure Analyst as a member of the excellent family of the Scientists, many of us will sigh and smile.

  7. Methodology for Estimating Ingestion Dose for Emergency Response at SRS

    CERN Document Server

    Simpkins, A A

    2002-01-01

    At the Savannah River Site (SRS), emergency response models estimate dose for inhalation and ground shine pathways. A methodology has been developed to incorporate ingestion doses into the emergency response models. The methodology follows a two-phase approach. The first phase estimates site-specific derived response levels (DRLs), which can be compared with predicted ground-level concentrations to determine if intervention is needed to protect the public. This phase uses accepted methods with little deviation from recommended guidance. The second phase uses site-specific data to estimate a 'best estimate' dose to offsite individuals from ingestion of foodstuffs. While this method deviates from recommended guidance, it is technically defensible and more realistic. As guidance is updated, these methods will also need to be updated.

  8. Gauss-Galerkin quadrature rules for quadratic and cubic spline spaces and their application to isogeometric analysis

    KAUST Repository

    Barton, Michael

    2016-07-21

    We introduce Gaussian quadrature rules for spline spaces that are frequently used in Galerkin discretizations to build mass and stiffness matrices. By definition, these spaces are of even degrees. The optimal quadrature rules we recently derived (Bartoň and Calo, 2016) act on spaces of the smallest odd degrees and, therefore, are still slightly sub-optimal. In this work, we derive optimal rules directly for even-degree spaces and therefore further improve our recent result. We use optimal quadrature rules for spaces over two elements as elementary building blocks and recursively use the homotopy continuation concept described in Bartoň and Calo (2016) to derive optimal rules for arbitrary admissible numbers of elements. We demonstrate the proposed methodology on relevant examples, where we derive optimal rules for various even-degree spline spaces. We also discuss convergence of our rules to their asymptotic counterparts; these are the analogues of the midpoint rule of Hughes et al. (2010), which are exact and optimal for infinite domains.

  9. A neural network based methodology to predict site-specific spectral acceleration values

    Science.gov (United States)

    Kamatchi, P.; Rajasankar, J.; Ramana, G. V.; Nagpal, A. K.

    2010-12-01

    A general neural network based methodology that has the potential to replace the computationally-intensive site-specific seismic analysis of structures is proposed in this paper. The basic framework of the methodology consists of a feed-forward back-propagation neural network algorithm with one hidden layer to represent the seismic potential of a region and soil amplification effects. The methodology is implemented and verified with parameters corresponding to Delhi city in India. For this purpose, strong ground motions are generated at bedrock level for a chosen site in Delhi due to earthquakes considered to originate from the central seismic gap of the Himalayan belt using necessary geological as well as geotechnical data. Surface level ground motions and corresponding site-specific response spectra are obtained by using a one-dimensional equivalent linear wave propagation model. Spectral acceleration values are considered as a target parameter to verify the performance of the methodology. Numerical studies carried out to validate the proposed methodology show that the errors in predicted spectral acceleration values are within acceptable limits for design purposes. The methodology is general in the sense that it can be applied to other seismically vulnerable regions and also can be updated by including more parameters depending on the state-of-the-art in the subject.
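
    A minimal sketch of the network class named here: one hidden layer, feed-forward, trained by back-propagation. The layer sizes, the choice of inputs (e.g. magnitude, log distance, site period), and the log-Sa target are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = 0.1 * rng.normal(size=(8, 3)), np.zeros(8)  # 3 inputs -> 8 hidden
W2, b2 = 0.1 * rng.normal(size=(1, 8)), np.zeros(1)  # 8 hidden -> 1 output

def forward(x):
    h = np.tanh(W1 @ x + b1)         # hidden layer
    return W2 @ h + b2               # predicted log spectral acceleration

def train_step(x, y, lr=1e-2):
    global W1, b1, W2, b2
    h = np.tanh(W1 @ x + b1)
    err = (W2 @ h + b2) - y                  # output error (MSE gradient)
    dW2, db2 = np.outer(err, h), err
    dh = (W2.T @ err) * (1.0 - h**2)         # back-propagate through tanh
    dW1, db1 = np.outer(dh, x), dh
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

x = np.array([6.5, 1.2, 0.8])  # hypothetical magnitude, log distance, site period
y = np.array([-1.0])           # hypothetical target log Sa
for _ in range(200):
    train_step(x, y)
print(forward(x))
```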

  10. Exploring the Virtuality Continuum for Complex Rule-Set Education in the Context of Soccer Rule Comprehension

    Directory of Open Access Journals (Sweden)

    Andrés N. Vargas González

    2017-11-01

    Full Text Available We present an exploratory study to assess the benefits of using Augmented Reality (AR) in training sports rule comprehension. Soccer is the chosen context for this study due to the wide range of complexity in its rules and regulations. Observers must understand and holistically evaluate the proximity of players in the game to the ball and other visual objects, such as the goal, penalty area, and other players. Grounded in previous literature investigating the effects of Virtual Reality (VR) scenarios on transfer of training (ToT), we explore how three different interfaces influence user perception using both qualitative and quantitative measures. To better understand how effective augmented reality technology is when combined with learning systems, we compare learning outcomes in three interface conditions: AR, VR, and a traditional Desktop interface. We also compare these interfaces as measured by user experience, engagement, and immersion. Results show no significant difference between VR and AR; however, participants in these conditions outperformed the Desktop group, which needed a higher number of adaptations to acquire the same knowledge.

  11. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej

    2018-02-15

    Software-Defined Networking (SDN) and OpenFlow are actively being standardized and deployed. These deployments rely on switches that come from various vendors and differ in terms of performance and available features. Understanding these differences and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a stream of rule updates while observing both the control plane view as reported by the switch and the data plane state obtained by probing, and determines switch characteristics by comparing these views. We measure, report and explain the performance characteristics of flow table updates in six hardware OpenFlow switches. Our results describing rule update rates can help SDN designers make their controllers efficient. Further, we also highlight differences between the OpenFlow specification and its implementations that, if ignored, pose a serious threat to network security and correctness.
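
    A minimal sketch of the two-view comparison at the heart of this methodology: timestamp when the switch reports a rule installed (control plane) versus when probe traffic first follows it (data plane). The Switch and Prober classes are simulation stubs standing in for a real OpenFlow harness, not an actual API.

```python
import time

class Switch:                  # stub: a real harness would speak OpenFlow
    def send_flow_mod(self, rule): pass
    def await_barrier(self): time.sleep(0.001)       # control-plane ack

class Prober:                  # stub: injects packets until the rule matches
    def probe_until_match(self, rule): time.sleep(0.005)

def measure_update(switch, prober, rule):
    t0 = time.monotonic()
    switch.send_flow_mod(rule)
    switch.await_barrier()
    t_ctrl = time.monotonic() - t0     # when the switch *says* it is done
    prober.probe_until_match(rule)
    t_data = time.monotonic() - t0     # when the data plane *is* done
    return t_ctrl, t_data              # a large gap flags view inconsistency

print(measure_update(Switch(), Prober(), {"match": "10.0.0.1", "out_port": 2}))
```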

  12. Extraction and Preference Ordering of Multireservoir Water Supply Rules in Dry Years

    Directory of Open Access Journals (Sweden)

    Ling Kang

    2016-01-01

    Full Text Available This paper presents a new methodology combining the nondominated sorting genetic algorithm II (NSGA-II) and the approach of successive elimination of alternatives based on order and degree of efficiency (SEABODE) to identify the most preferred multireservoir water supply rules in dry years. First, the suggested operation rules consist of a two-point, time-varying hedging policy for a single reservoir and a simple proportional allocation policy for the common water demand between two parallel reservoirs. Then, NSGA-II is employed to derive a sufficient set of noninferior operation rules (design alternatives) in terms of two conflicting objectives: (1) minimizing the total deficit ratio (TDR) of all demands of the entire system over the operation horizon, and (2) minimizing the maximum deficit ratio (MDR) of water supply in a single period. Next, SEABODE, a multicriteria decision making (MCDM) procedure, is applied to further eliminate alternatives based on the concept of efficiency of order k with degree p. In SEABODE, reservoir performance indices and water shortage indices are selected as evaluation criteria for preference ordering among the design alternatives obtained by NSGA-II. The proposed methodology was tested on a regional water supply system with three reservoirs located on the Jialing River, China, where the results demonstrate its applicability and merits.
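
    For concreteness, a minimal sketch of a two-point hedging policy of the kind described above: between two trigger storages releases are rationed linearly, below the lower trigger a minimum ration applies, and above the upper trigger demand is met in full. The trigger values and rationing factor are illustrative, not the paper's calibrated parameters.

```python
def hedged_release(storage, demand, s_low, s_high, min_ratio=0.5):
    """Two-point hedging: linear rationing between the two trigger storages."""
    if storage >= s_high:
        return demand                      # full supply
    if storage <= s_low:
        return min_ratio * demand          # maximum hedging
    frac = (storage - s_low) / (s_high - s_low)
    return (min_ratio + (1.0 - min_ratio) * frac) * demand

print(hedged_release(storage=60.0, demand=10.0, s_low=40.0, s_high=80.0))  # 7.5
```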

  13. Demystifying Theoretical Sampling in Grounded Theory Research

    Directory of Open Access Journals (Sweden)

    Jenna Breckenridge BSc(Hons,Ph.D.Candidate

    2009-06-01

    Full Text Available Theoretical sampling is a central tenet of classic grounded theory and is essential to the development and refinement of a theory that is ‘grounded’ in data. While many authors appear to share concurrent definitions of theoretical sampling, the ways in which the process is actually executed remain largely elusive and inconsistent. As such, employing and describing the theoretical sampling process can present a particular challenge to novice researchers embarking upon their first grounded theory study. This article has been written in response to the challenges faced by the first author whilst writing a grounded theory proposal. It is intended to clarify theoretical sampling for new grounded theory researchers, offering some insight into the practicalities of selecting and employing a theoretical sampling strategy. It demonstrates that the credibility of a theory cannot be dissociated from the process by which it has been generated and seeks to encourage and challenge researchers to approach theoretical sampling in a way that is apposite to the core principles of the classic grounded theory methodology.

  14. Assessing Lightning and Wildfire Hazard by Land Properties and Cloud to Ground Lightning Data with Association Rule Mining in Alberta, Canada.

    Science.gov (United States)

    Cha, DongHwan; Wang, Xin; Kim, Jeong Woo

    2017-10-23

    Hotspot analysis was implemented to find regions in the province of Alberta (Canada) where high-frequency Cloud to Ground (CG) lightning strikes cluster together. Generally, hotspot regions are located in the central, central east, and south central parts of the study region. About 94% of annual lightning occurred during the warm months (June to August), and the daily lightning frequency was influenced by the diurnal heating cycle. The association rule mining technique was used to investigate frequent CG lightning patterns, which were verified by similarity measurement to check the patterns' consistency. The similarity coefficient values indicated high correlations throughout the entire study period. Most wildfires (about 93%) in Alberta occurred in forests, wetland forests, and wetland shrub areas. It was also found that lightning and wildfires co-occur in two distinct types of areas: frequent-wildfire regions with a high frequency of lightning, and frequent-wildfire regions with a low frequency of lightning. Further, the preference index (PI) revealed locations where wildfires occurred more frequently than in other class regions. The wildfire hazard area was estimated with the CG lightning hazard map and specific land use types.

  15. Methodology is more than research design and technology.

    Science.gov (United States)

    Proctor, Robert W

    2005-05-01

    The Society for Computers in Psychology has been at the forefront of disseminating information about advances in computer technology and their applications for psychologists. Although technological advances, as well as clean research designs, are key contributors to progress in psychological research, the justification of methodological rules for interpreting data and making theory choices is at least as important. Historically, methodological beliefs and practices have been justified through intuition and logic, an approach known as foundationism. However, naturalism, a modern approach in the philosophy of science inspired by the work of Thomas S. Kuhn, indicates that all aspects of scientific practice, including its methodology, should be evaluated empirically. This article examines implications of the naturalistic approach for psychological research methods in general and for the current debate that is often framed as one of qualitative versus quantitative methods.

  16. Taxation without representation: the illegal IRS rule to expand tax credits under the PPACA.

    Science.gov (United States)

    Adler, Jonathan H; Cannon, Michael F

    2013-01-01

    The Patient Protection and Affordable Care Act (PPACA) provides tax credits and subsidies for the purchase of qualifying health insurance plans on state-run insurance exchanges. Contrary to expectations, many states are refusing or otherwise failing to create such exchanges. An Internal Revenue Service (IRS) rule purports to extend these tax credits and subsidies to the purchase of health insurance in federal exchanges created in states without exchanges of their own. This rule lacks statutory authority. The text, structure, and history of the Act show that tax credits and subsidies are not available in federally run exchanges. The IRS rule is contrary to congressional intent and cannot be justified on other legal grounds. Because tax credit eligibility can trigger penalties on employers and individuals, affected parties are likely to have standing to challenge the IRS rule in court.

  17. Analysis of correlation between pediatric asthma exacerbation and exposure to pollutant mixtures with association rule mining.

    Science.gov (United States)

    Toti, Giulia; Vilalta, Ricardo; Lindner, Peggy; Lefer, Barry; Macias, Charles; Price, Daniel

    2016-11-01

    Traditional studies on effects of outdoor pollution on asthma have been criticized for questionable statistical validity and inefficacy in exploring the effects of multiple air pollutants, alone and in combination. Association rule mining (ARM), a method easily interpretable and suitable for the analysis of the effects of multiple exposures, could be of use, but the traditional interest metrics of support and confidence need to be substituted with metrics that focus on risk variations caused by different exposures. We present an ARM-based methodology that produces rules associated with relevant odds ratios and limits the number of final rules even at very low support levels (0.5%), thanks to post-pruning criteria that limit rule redundancy and control for statistical significance. The methodology has been applied to a case-crossover study to explore the effects of multiple air pollutants on risk of asthma in pediatric subjects. We identified 27 rules with interesting odds ratios among more than 10,000 rules having the required support. The only rule involving a single chemical is exposure to ozone on the day before the reported asthma attack (OR=1.14). The 26 combinatory rules highlight the limitations of air quality policies based on single pollutant thresholds and suggest that exposure to mixtures of chemicals is more harmful, with odds ratios as high as 1.54 (associated with the combination day0 SO2, day0 NO, day0 NO2, day1 PM). The proposed method can be used to analyze risk variations caused by single and multiple exposures. The method is reliable and requires fewer assumptions on the data than parametric approaches. Rules including more than one pollutant highlight interactions that deserve further investigation, while helping to limit the search field. Copyright © 2016 Elsevier B.V. All rights reserved.
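
    For the record's key metric, a minimal sketch of scoring a candidate exposure rule by its odds ratio from a 2x2 table, as the post-pruning step described above would do. The counts below are hypothetical, not the study's data.

```python
def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Cross-product odds ratio for a rule 'exposure combination -> asthma visit'."""
    return (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

# hypothetical counts for the rule {day0 SO2, day0 NO} -> asthma exacerbation
print(odds_ratio(30, 170, 20, 180))   # ~1.59; rules with OR near 1 get pruned
```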

  18. Figure-ground segregation modulates apparent motion.

    Science.gov (United States)

    Ramachandran, V S; Anstis, S

    1986-01-01

    We explored the relationship between figure-ground segmentation and apparent motion. Results suggest that: static elements in the surround can eliminate apparent motion of a cluster of dots in the centre, but only if the cluster and surround have similar "grain" or texture; outlines that define occluding surfaces are taken into account by the motion mechanism; the brain uses a hierarchy of precedence rules in attributing motion to different segments of the visual scene. Being designated as "figure" confers a high rank in this scheme of priorities.

  19. Theoretical and methodological grounds of formation of the efficient system of higher education

    Directory of Open Access Journals (Sweden)

    Raevneva Elena V.

    2013-03-01

    Full Text Available The goal of the article is to generalise the modern theoretical, methodological, methodical and instrumental provisions for building an efficient system of higher education. Analysis of the literature on building educational systems shows that the issue has been studied at both the theoretical-methodological and the instrumental level. The article considers the theoretical and methodological level of study and specifies the theories, philosophical schools, concepts, educational paradigms and scientific approaches used in the formation of an educational paradigm. It considers models of education and models and technologies of learning as instrumental provisions. As a result of the analysis, the article concludes that the humanistic paradigm, which is based on the competency-building approach and assumes the use of modern (innovative) technologies of learning, should form the foundation for reforming the system of higher education. The prospect of further studies in this direction is the formation of competences of potential specialists (graduates of higher educational establishments) with consideration of the requirements of employers and of the market in general.

  20. Dynasting Theory: Lessons in learning grounded theory

    Directory of Open Access Journals (Sweden)

    Johnben Teik-Cheok Loy, MBA, MTS, Ph.D.

    2011-06-01

    Full Text Available This article captures the key learning lessons gleaned from the author’s experience learning and developing a grounded theory for his doctoral dissertation using the classic methodology as conceived by Barney Glaser. The theory was developed through data gathered on founders and successors of Malaysian Chinese family-owned businesses. The main concern for Malaysian Chinese family businesses emerged as dynasting: the building, maintaining, and growing of the power and resources of the business within the family lineage. The core category emerged as dynasting across cultures, where founders and successors struggle to transition from traditional Chinese to hybrid cultural and modernized forms of family business from one generation to the next. The key learning lessons were categorized under five headings: (a) sorting through different versions of grounded theory, (b) educating and managing research stakeholders, (c) embracing experiential learning, (d) discovering the core category: grounded intuition, and (e) recognizing limitations and possibilities. Keywords: grounded theory, learning, dynasting, family business, Chinese

  1. The rule of rescue.

    Science.gov (United States)

    McKie, John; Richardson, Jeff

    2003-06-01

    Jonsen coined the term "Rule of Rescue"(RR) to describe the imperative people feel to rescue identifiable individuals facing avoidable death. In this paper we attempt to draw a more detailed picture of the RR, identifying its conflict with cost-effectiveness analysis, the preference it entails for identifiable over statistical lives, the shock-horror response it elicits, the preference it entails for lifesaving over non-lifesaving measures, its extension to non-life-threatening conditions, and whether it is motivated by duty or sympathy. We also consider the measurement problems it raises, and argue that quantifying the RR would probably require a two-stage procedure. In the first stage the size of the individual utility gain from a health intervention would be assessed using a technique such as the Standard Gamble or the Time Trade-Off, and in the second the social benefits arising from the RR would be quantified employing the Person Trade-Off. We also consider the normative status of the RR. We argue that it can be defended from a utilitarian point of view, on the ground that rescues increase well-being by reinforcing people's belief that they live in a community that places great value upon life. However, utilitarianism has long been criticised for failing to take sufficient account of fairness, and the case is no different here: fairness requires that we do not discriminate between individuals on morally irrelevant grounds, whereas being "identifiable" does not seem to be a morally relevant ground for discrimination.

  2. Figure and Ground in the Visual Cortex: V2 Combines Stereoscopic Cues with Gestalt Rules

    OpenAIRE

    Qiu, Fangtu T.; von der Heydt, Rüdiger

    2005-01-01

    Figure-ground organization is a process by which the visual system identifies some image regions as foreground and others as background, inferring three-dimensional (3D) layout from 2D displays. A recent study reported that edge responses of neurons in area V2 are selective for side-of-figure, suggesting that figure-ground organization is encoded in the contour signals (border-ownership coding). Here we show that area V2 combines two strategies of computation, one that exploits binocular ster...

  3. Business rules for creating process flexibility : Mapping RIF rules and BDI rules

    NARCIS (Netherlands)

    Gong, Y.; Overbeek, S.J.; Janssen, M.

    2011-01-01

    Business rules and software agents can be used for creating flexible business processes. The Rule Interchange Format (RIF) is a new W3C recommendation standard for exchanging rules among disparate systems. Yet, the impact that the introduction of RIF has on the design of flexible business processes

  4. Methodology for designing aircraft having optimal sound signatures

    NARCIS (Netherlands)

    Sahai, A.K.; Simons, D.G.

    2017-01-01

    This paper presents a methodology with which aircraft designs can be modified such that they produce optimal sound signatures on the ground. Optimal sound here means sound that is perceived as less annoying by residents living in the vicinity of airports. A novel design and

  5. Rule-Based Event Processing and Reaction Rules

    Science.gov (United States)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
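
    As a minimal illustration of the event-condition-action pattern that reaction rules embody, the sketch below wires a condition and an action to an event type. The Event class, rule format, and threshold are invented for the example and are not drawn from any particular system surveyed in the paper.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Event:
        kind: str
        payload: dict

    def on(kind, condition, action):
        """Build a reaction rule: run `action` when an event of `kind` meets `condition`."""
        def rule(event):
            if event.kind == kind and condition(event.payload):
                action(event.payload)
        return rule

    rules = [
        on("order", lambda p: p["amount"] > 10_000,
           lambda p: print(f"escalate order {p['id']} for manual review")),
    ]

    for event in [Event("order", {"id": 7, "amount": 25_000})]:
        for rule in rules:
            rule(event)
    ```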

  6. Inspection of power and ground layers in PCB images

    Science.gov (United States)

    Bunyak, Filiz; Ercal, Fikret

    1998-10-01

    In this work, we present an inspection method for power and ground (P&G) layers of printed circuit boards (PCBs), also called utility layers. Design considerations for the P&G layers are different from those of signal layers, and current PCB inspection approaches cannot be applied to these layers. P&G layers act as internal ground, neutral or power sources. They are predominantly copper, with occasional pad areas (without copper) called clearances. The defect definition is based on the spacing between the holes that will be drilled in clearances and the surrounding copper. Overlap of pads of different sizes and shapes is allowed, which results in complex, hard-to-inspect clearances. Our inspection is based on identification of the shape, size and position of the individual pads that contribute to an overlapping clearance, and then inspection of each pad based on design rules and tolerances. The main steps of our algorithm are as follows: (1) extraction and preprocessing of clearance contours; (2) decomposition of contours into segments: corner detection and matching of lines or circular arcs between two corners; (3) determination of the pads from the partial contour information obtained in step (2); and (4) design rule checking for each detected pad.
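
    The design-rule check in step (4) can be pictured with a simplified sketch: given a binary copper image and the planned drill centres, a distance transform gives each clearance pixel's distance to the nearest copper, which is then compared against a minimum-clearance tolerance. This is a stand-in, not the authors' contour-decomposition algorithm; the image convention, drill list, and tolerance value are invented.

    ```python
    import cv2
    import numpy as np

    MIN_CLEARANCE_PX = 12   # placeholder tolerance, in pixels

    def check_clearances(copper: np.ndarray, drills, min_clearance=MIN_CLEARANCE_PX):
        """copper: uint8 image with 255 = copper; drills: list of (x, y) hole centres."""
        # Distance from every non-copper pixel to the nearest copper pixel.
        dist = cv2.distanceTransform(cv2.bitwise_not(copper), cv2.DIST_L2, 5)
        defects = []
        for x, y in drills:
            if dist[y, x] < min_clearance:   # hole would be drilled too close to copper
                defects.append((x, y, float(dist[y, x])))
        return defects
    ```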

  7. A Belief Rule-Based (BRB) Decision Support System for Assessing Clinical Asthma Suspicion

    DEFF Research Database (Denmark)

    Hossain, Mohammad Shahadat; Hossain, Emran; Khalid, Md. Saifuddin

    2014-01-01

    …conditions of uncertainty. The Belief Rule-Based Inference Methodology Using the Evidential Reasoning (RIMER) approach was adopted to develop this expert system, which is named the Belief Rule-Based Expert System (BRBES). The system can handle various types of uncertainty in knowledge representation and inference procedures. The knowledge base of this system was constructed by using real patient data and expert opinion. Practical case studies were used to validate the system. The system-generated results are more effective and reliable in terms of accuracy than the results generated by a manual system.

  8. Quantifying reactor safety margins: Part 1: An overview of the code scaling, applicability, and uncertainty evaluation methodology

    International Nuclear Information System (INIS)

    Boyack, B.E.; Duffey, R.B.; Griffith, P.

    1988-01-01

    In August 1988, the Nuclear Regulatory Commission (NRC) approved the final version of a revised rule on the acceptance of emergency core cooling systems (ECCS) entitled ''Emergency Core Cooling System; Revisions to Acceptance Criteria.'' The revised rule states that an alternate ECCS performance analysis, based on best-estimate methods, may be used to provide more realistic estimates of plant safety margins, provided the licensee quantifies the uncertainty of the estimates and includes that uncertainty when comparing the calculated results with prescribed acceptance limits. To support the revised ECCS rule, the NRC and its contractors and consultants have developed and demonstrated a method called the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology. It is an auditable, traceable, and practical method for combining quantitative analyses and expert opinions to arrive at computed values of uncertainty. This paper provides an overview of the CSAU evaluation methodology and its application to a postulated cold-leg, large-break loss-of-coolant accident in a Westinghouse four-loop pressurized water reactor with 17 × 17 fuel. The code selected for this demonstration of the CSAU methodology was TRAC-PF1/MOD1, Version 14.3. 23 refs., 5 figs., 1 tab

  9. The liability rules under international GHG emissions trading

    International Nuclear Information System (INIS)

    Zhong Xiang Zhang

    2001-01-01

    Article 17 of the Kyoto Protocol authorizes emissions trading, but the rules governing emissions trading have been deferred to subsequent conferences. In designing and implementing an international greenhouse gas (GHG) emissions trading scheme, assigning liability rules has been considered to be one of the most challenging issues. In general, a seller-beware liability works well in a strong enforcement environment. In the Kyoto Protocol, however, it may not always work. By contrast, a buyer-beware liability could be an effective deterrent to non-compliance, but the costs of imposing it are expected to be very high. To strike a middle ground, we suggest a combination of preventive measures with strong but feasible end-of-period punishments to ensure compliance with the Kyoto emissions commitments. Such measures aim to maximize efficiency gains from emissions trading and at the same time, to minimize over-selling risks. (author)

  10. Mechanisms of rule acquisition and rule following in inductive reasoning.

    Science.gov (United States)

    Crescentini, Cristiano; Seyed-Allaei, Shima; De Pisapia, Nicola; Jovicich, Jorge; Amati, Daniele; Shallice, Tim

    2011-05-25

    Despite the recent interest in the neuroanatomy of inductive reasoning processes, the regional specificity within prefrontal cortex (PFC) for the different mechanisms involved in induction tasks remains to be determined. In this study, we used fMRI to investigate the contribution of PFC regions to rule acquisition (rule search and rule discovery) and rule following. Twenty-six healthy young adult participants were presented with a series of images of cards, each consisting of a set of circles numbered in sequence with one colored blue. Participants had to predict the position of the blue circle on the next card. The rules that had to be acquired pertained to the relationship among succeeding stimuli. Responses given by subjects were categorized in a series of phases either tapping rule acquisition (responses given up to and including rule discovery) or rule following (correct responses after rule acquisition). Mid-dorsolateral PFC (mid-DLPFC) was active during rule search and remained active until successful rule acquisition. By contrast, rule following was associated with activation in temporal, motor, and medial/anterior prefrontal cortex. Moreover, frontopolar cortex (FPC) was active throughout the rule acquisition and rule following phases before a rule became familiar. We attributed activation in mid-DLPFC to hypothesis generation and in FPC to integration of multiple separate inferences. The present study provides evidence that brain activation during inductive reasoning involves a complex network of frontal processes and that different subregions respond during rule acquisition and rule following phases.

  11. TESTS AND METHODOLOGIES FOR THE SURVEY OF NARROW SPACES

    Directory of Open Access Journals (Sweden)

    L. Perfetti

    2017-02-01

    Full Text Available The research illustrated in this article aimed at identifying a good standard methodology to survey very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today’s era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural or archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in the field of view if compared with rectilinear lenses. This advantage alone can be crucial to reduce the total amount of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, namely the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral-staircase located in the Duomo di Milano with a total height of 25 meters and characterized by a narrow walkable space about 70 centimetres wide.
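
    The record's GSD formulation is not reproduced in the abstract; the sketch below shows one standard way to estimate a local GSD from the radial derivative of a lens projection function, for a few common fisheye models. The projection formulas are textbook ones, and the numeric example values (focal length, pixel pitch, distance) are assumptions, not the paper's.

    ```python
    import math

    # Radial derivatives dr/dtheta of common projection models r(theta).
    DR_DTHETA = {
        "rectilinear":   lambda th, f: f / math.cos(th) ** 2,       # r = f*tan(theta)
        "equidistant":   lambda th, f: f,                           # r = f*theta
        "equisolid":     lambda th, f: f * math.cos(th / 2),        # r = 2f*sin(theta/2)
        "stereographic": lambda th, f: f / math.cos(th / 2) ** 2,   # r = 2f*tan(theta/2)
    }

    def gsd(model, theta, f, pixel_pitch, distance):
        """Object-space size of one pixel at off-axis angle theta (small-angle approx.)."""
        dtheta = pixel_pitch / DR_DTHETA[model](theta, f)  # angle subtended by one pixel
        return distance * dtheta

    # Example: 8 mm fisheye, 4.4 um pixels, wall 0.7 m away, 60 degrees off-axis.
    print(gsd("equidistant", math.radians(60), f=0.008, pixel_pitch=4.4e-6, distance=0.7))
    ```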

  12. Effects on ground motion related to spatial variability

    International Nuclear Information System (INIS)

    Vanmarcke, E.H.

    1987-01-01

    Models of the spectral content and the space-time correlation structure of strong earthquake ground motion are combined with transient random vibration analysis to yield site-specific response spectra that can account for the effect of local spatial averaging of the ground motion across a rigid foundation of prescribed size. The methodology is presented with reference to sites in eastern North America, although the basic approach is applicable to other seismic regions provided the source and attenuation parameters are regionally adjusted. Parameters in the spatial correlation model are based on data from the SMART-1 accelerograph array, and the sensitivity of response spectra reduction factors with respect to these parameters is examined. The starting point of the analysis is the Fourier amplitude spectrum of site displacement, expressed as a function of earthquake source parameters and source-to-site distance. The bedrock acceleration spectral density function at a point, derived from the displacement spectrum, is modified to account for anelastic attenuation and, where appropriate, for local soil effects and/or local spatial averaging across a foundation. Transient random vibration analysis yields approximate analytical expressions for median ground motion amplitudes and median response spectra of an earthquake defined in terms of its spectral density function and strong motion duration. The methodology is illustrated for three events characterized by their m_b magnitude and epicentral distance. The focus in this paper is on the stochastic response prediction methodology, enabling explicit accounting for strong motion duration and the effect of local spatial averaging on response spectra. The numerical examples enable a preliminary assessment of the reduction of response spectral amplitudes attributable to local spatial averaging across rigid foundations of different sizes. 36 refs

  13. Engineering uses of physics-based ground motion simulations

    Science.gov (United States)

    Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.

    2014-01-01

    This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities are related both to advancing the science and computational infrastructure needed to produce ground motion simulations, as well as to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.

  14. Phonological reduplication in sign language: rules rule

    Directory of Open Access Journals (Sweden)

    Iris eBerent

    2014-06-01

    Full Text Available Productivity—the hallmark of linguistic competence—is typically attributed to algebraic rules that support broad generalizations. Past research on spoken language has documented such generalizations in both adults and infants. But whether algebraic rules form part of the linguistic competence of signers remains unknown. To address this question, here we gauge the generalization afforded by American Sign Language (ASL). As a case study, we examine reduplication (X→XX)—a rule that, inter alia, generates ASL nouns from verbs. If signers encode this rule, then they should freely extend it to novel syllables, including ones with features that are unattested in ASL. And since reduplicated disyllables are preferred in ASL, such a rule should favor novel reduplicated signs. Novel reduplicated signs should thus be preferred to nonreduplicative controls (in rating), and consequently, such stimuli should also be harder to classify as nonsigns (in the lexical decision task). The results of four experiments support this prediction. These findings suggest that the phonological knowledge of signers includes powerful algebraic rules. The convergence between these conclusions and previous evidence for phonological rules in spoken language suggests that the architecture of the phonological mind is partly amodal.

  15. Grounding of Six Sigma s Breakthrough Cookbook: How to research a methodology?

    NARCIS (Netherlands)

    de Koning, H.; de Mast, J.

    2005-01-01

    The Six Sigma programme has developed into a standard for quality and efficiency improvement in business and industry. This fact makes scientific research into the validity and applicability of this methodology important. This article explores the possibilities of a scientific study of the

  16. In-medium QCD sum rules for ω meson, nucleon and D meson

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Ronny

    2008-07-01

    The modifications of hadronic properties caused by an ambient nuclear medium are investigated within the scope of QCD sum rules. This is exemplified for the cases of the ω meson, the nucleon and the D meson. By virtue of the sum rules, integrated spectral densities of these hadrons are linked to properties of the QCD ground state, quantified in condensates. For the cases of the ω meson and the nucleon it is discussed how the sum rules allow a restriction of the parameter range of poorly known four-quark condensates by a comparison of experimental and theoretical knowledge. The catalog of independent four-quark condensates is covered and relations among these condensates are revealed. The behavior of four-quark condensates under the chiral symmetry group and the relation to order parameters of spontaneous chiral symmetry breaking are outlined. In this respect, also the QCD condensates appearing in differences of sum rules of chiral partners are investigated. Finally, the effects of an ambient nuclear medium on the D meson are discussed and relevant condensates are identified. (orig.)

  17. 18 CFR 385.104 - Rule of construction (Rule 104).

    Science.gov (United States)

    2010-04-01

    … Definitions § 385.104 Rule of construction (Rule 104). To the extent that the text of a rule is inconsistent with its caption, the text of the rule controls. [Order 376, 49 FR 21705, May 23, 1984]

  18. Rule-based land cover classification from very high-resolution satellite image with multiresolution segmentation

    Science.gov (United States)

    Haque, Md. Enamul; Al-Ramadan, Baqer; Johnson, Brian A.

    2016-07-01

    Multiresolution segmentation and rule-based classification techniques are used to classify objects from very high-resolution satellite images of urban areas. Custom rules are developed using different spectral, geometric, and textural features with five scale parameters, which yield varying classification accuracies. Principal component analysis is used to select the most important features out of a total of 207 different features. In particular, seven different object types are considered for classification. The overall classification accuracy achieved for the rule-based method is 95.55% and 98.95% for seven and five classes, respectively. Other classifiers that do not use rules perform at 84.17% and 97.3% accuracy for seven and five classes, respectively. The results show coarser segmentation at higher scale parameters and finer segmentation at lower scale parameters. The major contribution of this research is the development of rule sets and the identification of major features for satellite image classification, where the rule sets are transferable and the parameters are tunable for different types of imagery. Additionally, the individual object-wise classification and principal component analysis help to identify the required object from an arbitrary number of objects within images, given ground truth data for the training.
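
    In its simplest form, a rule-based object classifier of this kind reduces to thresholds over per-object features. The sketch below is purely illustrative; the feature names, thresholds, and classes are invented and are not the rule sets developed in the paper.

    ```python
    # Classify one segmented image object from its feature dictionary.
    def classify_object(obj: dict) -> str:
        if obj["ndvi"] > 0.4:
            return "vegetation"
        if obj["brightness"] < 60 and obj["ndwi"] > 0.2:
            return "water"
        if obj["rectangular_fit"] > 0.8 and obj["area_m2"] < 5000:
            return "building"
        if obj["length_width_ratio"] > 4:
            return "road"
        return "bare_soil"

    print(classify_object({"ndvi": 0.1, "ndwi": 0.35, "brightness": 42,
                           "rectangular_fit": 0.3, "area_m2": 900,
                           "length_width_ratio": 1.2}))
    ```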

  19. Methodology for Calculating Latency of GPS Probe Data

    Energy Technology Data Exchange (ETDEWEB)

    Young, Stanley E [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wang, Zhongxiang [University of Maryland; Hamedi, Masoud [University of Maryland

    2017-10-01

    Crowdsourced GPS probe data, which support applications such as travel times on changeable-message signs and incident detection, have been gaining popularity in recent years as a source of real-time traffic information for driver operations and transportation systems management and operations. Efforts have been made to evaluate the quality of such data from different perspectives. Although such crowdsourced data are already in widespread use in many states, particularly the high-traffic areas on the Eastern seaboard, concerns about latency - the time between traffic being perturbed as a result of an incident and the reflection of the disturbance in the outsourced data feed - have escalated in importance. Latency is critical for the accuracy of real-time operations, emergency response, and traveler information systems. This paper offers a methodology for measuring probe data latency with respect to a selected reference source. Although Bluetooth reidentification data are used as the reference source, the methodology can be applied to any other ground truth data source of choice. The core of the methodology is an algorithm for maximum pattern matching that works with three fitness objectives. To test the methodology, sample field reference data were collected on multiple freeway segments for a 2-week period by using portable Bluetooth sensors as ground truth. Equivalent GPS probe data were obtained from a private vendor, and their latency was evaluated. Latency at different times of the day, the impact of the road segmentation scheme on latency, and the sensitivity of the latency to both speed-slowdown and recovery-from-slowdown episodes are also discussed.
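
    The paper's estimator is a maximum pattern matching algorithm with three fitness objectives; as a simplified stand-in (plainly a different, cruder technique), the sketch below estimates latency as the sample shift that best correlates the probe speed series with the reference (e.g., Bluetooth) series on a common time grid. The function and variable names are invented.

    ```python
    import numpy as np

    def estimate_latency(probe, reference, max_shift):
        """Return the shift (in samples) that best aligns probe speeds to the reference."""
        best_shift, best_score = 0, -np.inf
        for s in range(max_shift + 1):
            a = probe[s:]                        # probe delayed by s samples
            b = reference[:len(reference) - s]
            n = min(len(a), len(b))
            score = np.corrcoef(a[:n], b[:n])[0, 1]
            if score > best_score:
                best_shift, best_score = s, score
        return best_shift, best_score
    ```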

  20. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    Full Text Available The article investigates the theoretical and methodological principles of situational analysis and argues for its necessity in modern conditions. The notion of “situational analysis” is defined. We conclude that situational analysis is a continuous, systematic study whose purpose is to identify the signs of a dangerous situation, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the situation on the system now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of its diagnostic, evaluative and search functions is demonstrated. The basic methodological elements of situational analysis are grounded. Substantiating the principal methodological elements of system analysis will enable the analyst to develop adaptive methods that take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system; to diagnose such a situation and subject it to systematic, in-depth analysis; to identify risks and opportunities; and to make timely management decisions as required by a particular period.

  1. FeynRules - Feynman rules made easy

    OpenAIRE

    Christensen, Neil D.; Duhr, Claude

    2008-01-01

    In this paper we present FeynRules, a new Mathematica package that facilitates the implementation of new particle physics models. After the user implements the basic model information (e.g. particle content, parameters and Lagrangian), FeynRules derives the Feynman rules and stores them in a generic form suitable for translation to any Feynman diagram calculation program. The model can then be translated to the format specific to a particular Feynman diagram calculator via F...

  2. Using extant literature in a grounded theory study: a personal account.

    Science.gov (United States)

    Yarwood-Ross, Lee; Jack, Kirsten

    2015-03-01

    To provide a personal account of the factors in a doctoral study that led to the adoption of classic grounded theory principles relating to the use of literature. Novice researchers considering grounded theory methodology will become aware of the contentious issue of how and when extant literature should be incorporated into a study. The three main grounded theory approaches are classic, Straussian and constructivist, and the seminal texts provide conflicting beliefs surrounding the use of literature. A classic approach avoids a pre-study literature review to minimise preconceptions and emphasises the constant comparison method, while the Straussian and constructivist approaches focus more on the beneficial aspects of an initial literature review and researcher reflexivity. The debate also extends into the wider academic community, where no consensus exists. This is a methodological paper detailing the authors' engagement in the debate surrounding the role of the literature in a grounded theory study. In the authors' experience, researchers can best understand the use of literature in grounded theory through immersion in the seminal texts, engaging with wider academic literature, and examining their preconceptions of the substantive area. The authors concluded that classic grounded theory principles were appropriate in the context of their doctoral study. Novice researchers will have their own sets of circumstances when preparing their studies and should become aware of the different perspectives to make decisions that they can ultimately justify. This paper can be used by other novice researchers as an example of the decision-making process that led to delaying a pre-study literature review and identifies the resources used to write a research proposal when using a classic grounded theory approach.

  3. Employees' and Managers' Accounts of Interactive Workplace Learning: A Grounded Theory of "Complex Integrative Learning"

    Science.gov (United States)

    Armson, Genevieve; Whiteley, Alma

    2010-01-01

    Purpose: The purpose of this paper is to investigate employees' and managers' accounts of interactive learning and what might encourage or inhibit emergent learning. Design/methodology/approach: The approach taken was a constructivist/social constructivist ontology, interpretive epistemology and qualitative methodology, using grounded theory…

  4. METHODOLOGY TO EVALUATE THE POTENTIAL FOR GROUND WATER CONTAMINATION FROM GEOTHERMAL FLUID RELEASES

    Science.gov (United States)

    This report provides analytical methods and graphical techniques to predict potential ground water contamination from geothermal energy development. Overflows and leaks from ponds, pipe leaks, well blowouts, leaks from well casing, and migration from injection zones can be handle...

  5. A New Methodology for Open Pit Slope Design in Karst-Prone Ground Conditions Based on Integrated Stochastic-Limit Equilibrium Analysis

    Science.gov (United States)

    Zhang, Ke; Cao, Ping; Ma, Guowei; Fan, Wenchen; Meng, Jingjing; Li, Kaihui

    2016-07-01

    Using the Chengmenshan Copper Mine as a case study, a new methodology for open pit slope design in karst-prone ground conditions is presented based on integrated stochastic-limit equilibrium analysis. The numerical modeling and optimization design procedure contains drill core data collection, karst cave stochastic model generation, SLIDE simulation and bisection method optimization. Borehole investigations are performed, and the statistical result shows that the length of the karst cave fits a negative exponential distribution model, but the length of carbonatite does not exactly follow any standard distribution. The inverse transform method and acceptance-rejection method are used to reproduce the length of the karst cave and carbonatite, respectively. A code for karst cave stochastic model generation, named KCSMG, is developed. The stability of the rock slope with the karst cave stochastic model is analyzed by combining the KCSMG code and the SLIDE program. This approach is then applied to study the effect of the karst cave on the stability of the open pit slope, and a procedure to optimize the open pit slope angle is presented.
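
    The two samplers named in the abstract, inverse transform sampling for the negative-exponential cave lengths and acceptance-rejection sampling for the non-standard carbonatite lengths, can be sketched as follows. The mean length, bounds, and example pdf are placeholders, not values from the Chengmenshan site.

    ```python
    import math
    import random

    def sample_cave_length(mean_length=2.5):
        """Inverse transform sampling for a negative exponential distribution."""
        u = random.random()
        return -mean_length * math.log(1.0 - u)

    def sample_carbonatite_length(pdf, lo, hi, pdf_max):
        """Acceptance-rejection sampling; requires pdf(x) <= pdf_max on [lo, hi]."""
        while True:
            x = random.uniform(lo, hi)
            if random.uniform(0.0, pdf_max) <= pdf(x):
                return x

    caves = [sample_cave_length() for _ in range(5)]
    seg = sample_carbonatite_length(lambda x: math.exp(-((x - 4) ** 2) / 2), 0, 10, 1.0)
    ```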

  6. Optimal quadrature rules for odd-degree spline spaces and their application to tensor-product-based isogeometric analysis

    KAUST Repository

    Barton, Michael

    2016-03-14

    We introduce optimal quadrature rules for spline spaces that are frequently used in Galerkin discretizations to build mass and stiffness matrices. Using the homotopy continuation concept (Bartoň and Calo, 2016) that transforms optimal quadrature rules from source spaces to target spaces, we derive optimal rules for splines defined on finite domains. Starting with the classical Gaussian quadrature for polynomials, which is an optimal rule for a discontinuous odd-degree space, we derive rules for target spaces of higher continuity. We further show how the homotopy methodology handles cases where the source and target rules require different numbers of optimal quadrature points. We demonstrate it by deriving optimal rules for various odd-degree spline spaces, particularly with non-uniform knot sequences and non-uniform multiplicities. We also discuss convergence of our rules to their asymptotic counterparts, that is, the analogues of the midpoint rule of Hughes et al. (2010), that are exact and optimal for infinite domains. For spaces of low continuities, we numerically show that the derived rules quickly converge to their asymptotic counterparts as the weights and nodes of a few boundary elements differ from the asymptotic values.
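
    The classical starting point mentioned above can be verified numerically: an n-point Gauss-Legendre rule is optimal for polynomials and integrates every monomial up to degree 2n - 1 exactly. A short check using standard NumPy (not the authors' spline-space rules):

    ```python
    import numpy as np

    n = 3
    nodes, weights = np.polynomial.legendre.leggauss(n)

    for degree in range(2 * n):                            # degrees 0 .. 2n-1
        exact = (1 - (-1) ** (degree + 1)) / (degree + 1)  # integral of x^d over [-1, 1]
        quad = np.sum(weights * nodes ** degree)
        assert abs(quad - exact) < 1e-12
    print("3-point Gauss-Legendre is exact through degree", 2 * n - 1)
    ```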

  7. Optimal quadrature rules for odd-degree spline spaces and their application to tensor-product-based isogeometric analysis

    KAUST Repository

    Barton, Michael; Calo, Victor M.

    2016-01-01

    We introduce optimal quadrature rules for spline spaces that are frequently used in Galerkin discretizations to build mass and stiffness matrices. Using the homotopy continuation concept (Bartoň and Calo, 2016) that transforms optimal quadrature rules from source spaces to target spaces, we derive optimal rules for splines defined on finite domains. Starting with the classical Gaussian quadrature for polynomials, which is an optimal rule for a discontinuous odd-degree space, we derive rules for target spaces of higher continuity. We further show how the homotopy methodology handles cases where the source and target rules require different numbers of optimal quadrature points. We demonstrate it by deriving optimal rules for various odd-degree spline spaces, particularly with non-uniform knot sequences and non-uniform multiplicities. We also discuss convergence of our rules to their asymptotic counterparts, that is, the analogues of the midpoint rule of Hughes et al. (2010), that are exact and optimal for infinite domains. For spaces of low continuities, we numerically show that the derived rules quickly converge to their asymptotic counterparts as the weights and nodes of a few boundary elements differ from the asymptotic values.

  8. 75 FR 68392 - Self-Regulatory Organizations; The Options Clearing Corporation; Order Approving Proposed Rule...

    Science.gov (United States)

    2010-11-05

    ... Expand the Forms of Collateral Eligible for Incorporation in the System for Theoretical Analysis and Numerical Simulations Risk Management Methodology November 1, 2010. I. Introduction On August 25, 2010, The... Rules to expand the forms of collateral eligible for incorporation in OCC's System for Theoretical...

  9. Fisheye Photogrammetry: Tests and Methodologies for the Survey of Narrow Spaces

    Science.gov (United States)

    Perfetti, L.; Polari, C.; Fassi, F.

    2017-02-01

    The research illustrated in this article aimed at identifying a good standard methodology to survey very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today's era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural or archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in the field of view if compared with rectilinear lenses. This advantage alone can be crucial to reduce the total amount of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, namely the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral-staircase located in the Duomo di Milano with a total height of 25 meters and characterized by a narrow walkable space about 70 centimetres wide.

  10. Development and evaluation of an ultrasonic ground water seepage meter.

    Science.gov (United States)

    Paulsen, R J; Smith, C F; O'Rourke, D; Wong, T F

    2001-01-01

    Submarine ground water discharge can influence significantly the near-shore transport and flux of chemicals into the oceans. Quantification of the sources and rates of such discharge requires a ground water seepage meter that provides continuous measurements at high resolution over an extended period of time. An ultrasonic flowmeter has been adapted for such measurements in the submarine environment. Connected to a steel collection funnel, the meter houses two piezoelectric transducers mounted at opposite ends of a cylindrical flow tube. By monitoring the perturbations of fluid flow on the propagation of sound waves inside the flow tube, the ultrasonic meter can measure both forward and reverse fluid flows in real time. Laboratory and field calibrations show that the ultrasonic meter can resolve ground water discharges on the order of 0.1 μm/sec, and it is sufficiently robust for deployment in the field for several days. Data from West Neck Bay, Shelter Island, New York, elucidate the temporal and spatial heterogeneity of submarine ground water discharge and its interplay with tidal loading. A negative correlation between the discharge and tidal elevation was generally observed. A methodology was also developed whereby data for the sound velocity as a function of temperature can be used to infer the salinity and source of the submarine discharge. Independent measurements of electrical conductance were performed to validate this methodology.
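
    The velocity measurement behind such a meter can be illustrated with the standard transit-time relation (a textbook formula, not necessarily this instrument's exact calibration): with path length L between the transducers, the upstream and downstream travel times give the path-averaged velocity independently of the speed of sound. The numbers in the example are invented.

    ```python
    def path_velocity(L, t_down, t_up):
        """Path-averaged fluid velocity from downstream/upstream travel times (s)."""
        return (L / 2.0) * (1.0 / t_down - 1.0 / t_up)

    def sound_speed(L, t_down, t_up):
        """Speed of sound along the same path; its temperature dependence underlies
        the salinity inference mentioned in the abstract."""
        return (L / 2.0) * (1.0 / t_down + 1.0 / t_up)

    # Example: L = 0.1 m; a 0.6 ns up/down asymmetry corresponds to roughly 6.8 mm/s.
    v = path_velocity(0.1, 66.6667e-6, 66.6673e-6)
    ```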

  11. Application of the NEI 95-10 methodology in the Rcic system of the Unit-1, to implement the criterions of the license renovation rule; (10 Cfr-54)

    International Nuclear Information System (INIS)

    Diaz, A.; Mendoza, G.; Arganis, C.; Viais, J.; Contreras, A.; Fernandez, G.; Medina, G.

    2012-10-01

    In December 1991, the US Nuclear Regulatory Commission (US NRC) published 10 CFR 54 to establish the procedures, criteria and requirements necessary for the license renewal of a nuclear power station. In 1994 the US NRC proposed an amendment to these requirements, centered basically on the effects of aging in long-lived passive structures and components and on the inclusion of Time-Limited Aging Analyses (TLAAs). In general terms, it is established that the applicant should demonstrate to the regulatory body that the effects of aging on structures, systems and components are and will be appropriately managed, or that the TLAAs have been evaluated for the extended period of operation. NEI 95-10 is a guide document developed by the Nuclear Energy Institute (NEI) to provide an accepted approach to meeting the requirements of 10 CFR 54, offering an efficient process that allows any applicant to fulfill the requirements of the License Renewal Rule in a practical way and to support its application. This work presents the application of this guide to the Reactor Core Isolation Cooling (RCIC) system of Unit 1 of the Laguna Verde nuclear power plant, chosen as the pilot system for applying 10 CFR 54 following the methodology recommended by the industry guide for the implementation of the License Renewal Rule. (Author)

  12. Figure-ground segregation: A fully nonlocal approach.

    Science.gov (United States)

    Dimiccoli, Mariella

    2016-09-01

    We present a computational model that computes and integrates in a nonlocal fashion several configural cues for automatic figure-ground segregation. Our working hypothesis is that the figural status of each pixel is a nonlocal function of several geometric shape properties and it can be estimated without explicitly relying on object boundaries. The methodology is grounded on two elements: multi-directional linear voting and nonlinear diffusion. A first estimation of the figural status of each pixel is obtained as a result of a voting process, in which several differently oriented line-shaped neighborhoods vote to express their belief about the figural status of the pixel. A nonlinear diffusion process is then applied to enforce the coherence of figural status estimates among perceptually homogeneous regions. Computer simulations fit human perception and match the experimental evidence that several cues cooperate in defining figure-ground segregation. The results of this work suggest that figure-ground segregation involves feedback from cells with larger receptive fields in higher visual cortical areas. Copyright © 2015 Elsevier Ltd. All rights reserved.
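
    The second ingredient, nonlinear diffusion of the voted figural-status map, can be sketched with a Perona-Malik-style scheme: smoothing proceeds within homogeneous regions while a conductance term suppresses diffusion across sharp transitions. This is a generic scheme under assumed parameters and may differ from the paper's exact formulation.

    ```python
    import numpy as np

    def diffuse(F, iterations=50, dt=0.2, k=0.1):
        """Nonlinear diffusion of a figural-status map F (values in [0, 1])."""
        g = lambda d: np.exp(-(d / k) ** 2)   # conductance: ~0 across sharp transitions
        for _ in range(iterations):
            dn = np.roll(F, -1, axis=0) - F   # differences to the 4 neighbours
            ds = np.roll(F, 1, axis=0) - F    # (np.roll wraps at the borders,
            de = np.roll(F, -1, axis=1) - F   #  a simplification for this sketch)
            dw = np.roll(F, 1, axis=1) - F
            F = F + dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
        return F

    F = diffuse(np.random.rand(64, 64))
    ```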

  13. Demonstration of a performance assessment methodology for high-level radioactive waste disposal in basalt formations

    International Nuclear Information System (INIS)

    Bonano, E.J.; Davis, P.A.; Shipers, L.R.; Brinster, K.F.; Beyler, W.E.; Updegraff, C.D.; Shepherd, E.R.; Tilton, L.M.; Wahi, K.K.

    1989-06-01

    This document describes a performance assessment methodology developed for a high-level radioactive waste repository mined in deep basalt formations. This methodology is an extension of an earlier one applicable to bedded salt. The differences between the two methodologies arise primarily in the modeling of ground-water flow and radionuclide transport. Bedded salt was assumed to be a porous medium, whereas basalt formations contain fractured zones. Therefore, mathematical models and associated computer codes were developed to simulate the aforementioned phenomena in fractured media. The use of the methodology is demonstrated at a hypothetical basalt site by analyzing seven scenarios: (1) thermohydrological effects caused by heat released from the repository, (2) mechanohydrological effects caused by an advancing and receding glacier, (3) normal ground-water flow, (4) pumping of ground water from a confined aquifer, (5) rerouting of a river near the repository, (6) drilling of a borehole through the repository, and (7) formation of a new fault intersecting the repository. The normal ground-water flow was considered the base-case scenario. This scenario was used to perform uncertainty and sensitivity analyses and to demonstrate the existing capabilities for assessing compliance with the ground-water travel time criterion and the containment requirements. Most of the other scenarios were considered perturbations of the base case, and a few were studied in terms of changes with respect to initial conditions. The potential impact of these scenarios on the long-term performance of the disposal system was ascertained through comparison with the base-case scenario or the undisturbed initial conditions. 66 refs., 106 figs., 27 tabs

  14. Confirming a predicted selection rule in inelastic neutron scattering spectroscopy: the quantum translator-rotator H2 entrapped inside C60.

    Science.gov (United States)

    Xu, Minzhong; Jiménez-Ruiz, Mónica; Johnson, Mark R; Rols, Stéphane; Ye, Shufeng; Carravetta, Marina; Denning, Mark S; Lei, Xuegong; Bačić, Zlatko; Horsewill, Anthony J

    2014-09-19

    We report an inelastic neutron scattering (INS) study of a H2 molecule encapsulated inside the fullerene C60 which confirms the recently predicted selection rule, the first to be established for the INS spectroscopy of aperiodic, discrete molecular compounds. Several transitions from the ground state of para-H2 to certain excited translation-rotation states, forbidden according to the selection rule, are systematically absent from the INS spectra, thus validating the selection rule with a high degree of confidence. Its confirmation sets a precedent, as it runs counter to the widely held view that the INS spectroscopy of molecular compounds is not subject to any selection rules.

  15. Software Development and Test Methodology for a Distributed Ground System

    Science.gov (United States)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The Software processes that have evolved still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.

  16. A 3D Hybrid Integration Methodology for Terabit Transceivers

    DEFF Research Database (Denmark)

    Dong, Yunfeng; Johansen, Tom Keinicke; Zhurbenko, Vitaliy

    2015-01-01

    This paper presents a three-dimensional (3D) hybrid integration methodology for terabit transceivers. The simulation methodology for multi-conductor structures is explained. The effect of ground vias on the RF circuitry and the preferred interposer substrate material for large-bandwidth 3D hybrid integration are described. An equivalent circuit model of the via-throughs connecting the RF circuitry to the modulator is proposed and its lumped element parameters are extracted. Wire bonding transitions between the driving and RF circuitry were designed and simulated. An optimized 3D interposer design…

  17. Evaluation methodology for fixed-site physical protection systems

    International Nuclear Information System (INIS)

    Bennett, H.A.; Olascoaga, M.T.

    1980-01-01

    A system performance evaluation methodology has been developed to aid the Nuclear Regulatory Commission (NRC) in the implementation of new regulations designed to upgrade the physical protection of nuclear fuel cycle facilities. The evaluation methodology, called Safeguards Upgrade Rule Evaluation (SURE), provides a means of explicitly incorporating measures for highly important and often difficult to quantify performance factors, e.g., installation, maintenance, training and proficiency levels, compatibility of components in subsystems, etc. This is achieved by aggregating responses to component and system questionnaires through successive levels of a functional hierarchy developed for each primary performance capability specified in the regulations, 10 CFR 73.45. An overall measure of performance for each capability is the result of this aggregation process. This paper provides a description of SURE.

  18. Dibenzoheptazethrene isomers with different biradical characters: An exercise of clar's aromatic sextet rule in singlet biradicaloids

    KAUST Repository

    Sun, Zhe; Lee, Sangsu; Park, Kyuhyung; Zhu, Xiaojian; Zhang, Wenhua; Zheng, Bin; Hu, Pan; Zeng, Zebing; Das, Soumyajit; Li, Yuan; Chi, Chunyan; Li, Runwei; Huang, Kuo-Wei; Ding, Jun; Kim, Dongho; Wu, Jishan

    2013-01-01

    …that the number of aromatic sextet rings plays an important role in determining their ground states. In order to test the validity of this rule in singlet biradicaloids, the two soluble and stable dibenzoheptazethrene isomers DBHZ1 and DBHZ2 were prepared…

  19. Large scale comparative codon-pair context analysis unveils general rules that fine-tune evolution of mRNA primary structure.

    Directory of Open Access Journals (Sweden)

    Gabriela Moura

    Full Text Available BACKGROUND: Codon usage and codon-pair context are important gene primary structure features that influence mRNA decoding fidelity. In order to identify general rules that shape codon-pair context and minimize mRNA decoding error, we have carried out a large scale comparative codon-pair context analysis of 119 fully sequenced genomes. METHODOLOGIES/PRINCIPAL FINDINGS: We have developed mathematical and software tools for large scale comparative codon-pair context analysis. These methodologies unveiled general and species specific codon-pair context rules that govern evolution of mRNAs in the 3 domains of life. We show that evolution of bacterial and archeal mRNA primary structure is mainly dependent on constraints imposed by the translational machinery, while in eukaryotes DNA methylation and tri-nucleotide repeats impose strong biases on codon-pair context. CONCLUSIONS: The data highlight fundamental differences between prokaryotic and eukaryotic mRNA decoding rules, which are partially independent of codon usage.
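
    A codon-pair context statistic of this general kind can be computed as an observed-versus-expected log ratio over adjacent codon pairs. The sketch below uses that standard form, which may differ from the paper's exact metric, and the toy ORF is invented.

    ```python
    import math
    from collections import Counter

    def codon_pair_scores(orfs):
        """Log observed/expected ratio for each adjacent codon pair."""
        codons, pairs = Counter(), Counter()
        for orf in orfs:
            cs = [orf[i:i + 3] for i in range(0, len(orf) - 2, 3)]
            codons.update(cs)
            pairs.update(zip(cs, cs[1:]))
        n_codons, n_pairs = sum(codons.values()), sum(pairs.values())
        scores = {}
        for (c1, c2), observed in pairs.items():
            expected = (codons[c1] / n_codons) * (codons[c2] / n_codons) * n_pairs
            scores[c1, c2] = math.log(observed / expected)
        return scores

    # Toy example: a positive score means the pair is over-represented vs. chance.
    print(codon_pair_scores(["ATGGCTGCTGAATAA"])[("GCT", "GCT")])
    ```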

  20. New Safety rules

    CERN Multimedia

    Safety Commission

    2008-01-01

    The revision of CERN Safety rules is in progress and the following new Safety rules have been issued on 15-04-2008: Safety Procedure SP-R1 Establishing, Updating and Publishing CERN Safety rules: http://cern.ch/safety-rules/SP-R1.htm; Safety Regulation SR-S Smoking at CERN: http://cern.ch/safety-rules/SR-S.htm; Safety Regulation SR-M Mechanical Equipment: http://cern.ch/safety-rules/SR-M.htm; General Safety Instruction GSI-M1 Standard Lifting Equipment: http://cern.ch/safety-rules/GSI-M1.htm; General Safety Instruction GSI-M2 Standard Pressure Equipment: http://cern.ch/safety-rules/GSI-M2.htm; General Safety Instruction GSI-M3 Special Mechanical Equipment: http://cern.ch/safety-rules/GSI-M3.htm. These documents apply to all persons under the Director General’s authority. All Safety rules are available at the web page: http://www.cern.ch/safety-rules The Safety Commission

  1. Decision Tree Repository and Rule Set Based Mingjiang River Estuarine Wetlands Classifaction

    Science.gov (United States)

    Zhang, W.; Li, X.; Xiao, W.

    2018-05-01

    Increasing urbanization and industrialization have led to wetland losses in the estuarine area of the Mingjiang River over the past three decades, and increasing attention has been given to producing wetland inventories using remote sensing and GIS technology. Because training sites and training samples are inconsistent across organizations, traditional pixel-based image classification methods cannot achieve comparable results between them; object-oriented image classification shows great potential to solve this problem, and Landsat moderate-resolution remote sensing images are widely used to fulfill this requirement. Firstly, standardized atmospheric correction and spectrally high-fidelity texture feature enhancement were conducted before implementing the object-oriented wetland classification method in eCognition. Secondly, we performed the multi-scale segmentation procedure, taking the scale, hue, shape, compactness and smoothness of the image into account to obtain appropriate parameters; using a top-down region-merge algorithm starting at the single-pixel level, the optimal texture segmentation scale for different types of features was confirmed. The segmented objects are then used as classification units to calculate spectral information such as the mean, maximum, minimum, brightness and normalized values; spatial features such as the area, length, tightness and shape rule of the image objects; and texture features such as the mean, variance and entropy of the image objects, all serving as classification features of the training samples. Based on the reference images and the sampling points of the on-the-spot investigation, typical training samples were selected uniformly and randomly for each type of ground object, and the value ranges of the spectral, texture and spatial characteristics of each feature type in each feature layer were used to create the decision tree repository. Finally, with the help of high resolution reference images, the

  2. The Alexander-Zweig (OZI) rule revisited

    International Nuclear Information System (INIS)

    Lipkin, H.J.

    1989-03-01

    Predictions, theoretical bases, experimental tests and violations of various versions of the A-Z (OZI) rule are examined. Dynamical mechanisms responsible for violations include allowed two-step transitions via intermediate states containing ordinary hadrons, gluons, flavor-mixed hadrons like η, η' or f0(S*), and exotic hadrons like glueballs, multiquark states and hybrids. All can be produced via a strange component and decay into pions or vice versa. Each case is described by a different mechanism with a different suppression factor. OZI-forbidden production processes for φ and f' mesons are shown on general grounds to be less suppressed than forbidden decays, without assuming the presence of strange quarks in baryons. (author)

  3. Cortical Dynamics of Figure-Ground Separation in Response to 2D Pictures and 3D Scenes: How V2 Combines Border Ownership, Stereoscopic Cues, and Gestalt Grouping Rules

    Science.gov (United States)

    Grossberg, Stephen

    2016-01-01

    The FACADE model, and its laminar cortical realization and extension in the 3D LAMINART model, have explained, simulated, and predicted many perceptual and neurobiological data about how the visual cortex carries out 3D vision and figure-ground perception, and how these cortical mechanisms enable 2D pictures to generate 3D percepts of occluding and occluded objects. In particular, these models have proposed how border ownership occurs, but have not yet explicitly explained the correlation between multiple properties of border ownership neurons in cortical area V2 that were reported in a remarkable series of neurophysiological experiments by von der Heydt and his colleagues; namely, border ownership, contrast preference, binocular stereoscopic information, selectivity for side-of-figure, Gestalt rules, and strength of attentional modulation, as well as the time course during which such properties arise. This article shows how, by combining 3D LAMINART properties that were discovered in two parallel streams of research, a unified explanation of these properties emerges. This explanation proposes, moreover, how these properties contribute to the generation of consciously seen 3D surfaces. The first research stream models how processes like 3D boundary grouping and surface filling-in interact in multiple stages within and between the V1 interblob—V2 interstripe—V4 cortical stream and the V1 blob—V2 thin stripe—V4 cortical stream, respectively. Of particular importance for understanding figure-ground separation is how these cortical interactions convert computationally complementary boundary and surface mechanisms into a consistent conscious percept, including the critical use of surface contour feedback signals from surface representations in V2 thin stripes to boundary representations in V2 interstripes. Remarkably, key figure-ground properties emerge from these feedback interactions. The second research stream shows how cells that compute absolute disparity

  4. Cortical Dynamics of Figure-Ground Separation in Response to 2D Pictures and 3D Scenes: How V2 Combines Border Ownership, Stereoscopic Cues, and Gestalt Grouping Rules.

    Science.gov (United States)

    Grossberg, Stephen

    2015-01-01

    The FACADE model, and its laminar cortical realization and extension in the 3D LAMINART model, have explained, simulated, and predicted many perceptual and neurobiological data about how the visual cortex carries out 3D vision and figure-ground perception, and how these cortical mechanisms enable 2D pictures to generate 3D percepts of occluding and occluded objects. In particular, these models have proposed how border ownership occurs, but have not yet explicitly explained the correlation between multiple properties of border ownership neurons in cortical area V2 that were reported in a remarkable series of neurophysiological experiments by von der Heydt and his colleagues; namely, border ownership, contrast preference, binocular stereoscopic information, selectivity for side-of-figure, Gestalt rules, and strength of attentional modulation, as well as the time course during which such properties arise. This article shows how, by combining 3D LAMINART properties that were discovered in two parallel streams of research, a unified explanation of these properties emerges. This explanation proposes, moreover, how these properties contribute to the generation of consciously seen 3D surfaces. The first research stream models how processes like 3D boundary grouping and surface filling-in interact in multiple stages within and between the V1 interblob-V2 interstripe-V4 cortical stream and the V1 blob-V2 thin stripe-V4 cortical stream, respectively. Of particular importance for understanding figure-ground separation is how these cortical interactions convert computationally complementary boundary and surface mechanisms into a consistent conscious percept, including the critical use of surface contour feedback signals from surface representations in V2 thin stripes to boundary representations in V2 interstripes. Remarkably, key figure-ground properties emerge from these feedback interactions. The second research stream shows how cells that compute absolute disparity in

  5. Reversal of Hückel (anti)aromaticity in the lowest triplet states of hexaphyrins and spectroscopic evidence for Baird's rule

    Science.gov (United States)

    Sung, Young Mo; Yoon, Min-Chul; Lim, Jong Min; Rath, Harapriya; Naoda, Koji; Osuka, Atsuhiro; Kim, Dongho

    2015-05-01

    The reversal of (anti)aromaticity in a molecule's triplet excited state compared with its closed-shell singlet ground state is known as Baird's rule and has attracted the interest of synthetic, physical organic chemists and theorists because of the potential to modulate the fundamental properties of highly conjugated molecules. Here we show that two closely related bis-rhodium hexaphyrins (R26H and R28H) containing [26] and [28] π-electron peripheries, respectively, exhibit properties consistent with Baird's rule. In the ground state, R26H exhibits a sharp Soret-like band and distinct Q-like bands characteristic of an aromatic porphyrinoid, whereas R28H exhibits a broad absorption spectrum without Q-like bands, which is typical of an antiaromatic porphyrinoid. In contrast, the T-T absorption of R26H is broad, weak and featureless, whereas that of R28H displays an intense and sharp Soret-like band. These spectral signatures, in combination with quantum chemical calculations, are in line with qualitative expectations based on Baird's rule.

  6. Relations between emotions, display rules, social motives, and facial behaviour.

    Science.gov (United States)

    Zaalberg, Ruud; Manstead, Antony; Fischer, Agneta

    2004-02-01

    We report research on the relations between emotions, display rules, social motives, and facial behaviour. In Study 1 we used a questionnaire methodology to examine how respondents would react to a funny or a not funny joke told to them by a close friend or a stranger. We assessed display rules and motivations for smiling and/or laughing. Display rules and social motives (partly) mediated the relationship between the experimental manipulations and self-reported facial behaviour. Study 2 was a laboratory experiment in which funny or not funny jokes were told to participants by a male or female stranger. Consistent with hypotheses, hearing a funny joke evoked a stronger motivation to share positive affect, expressed in longer Duchenne smiling. Contrary to hypotheses, a not funny joke did not elicit stronger prosocial motivation expressed in longer "polite" smiling, although such a smiling pattern did occur. Rated funniness of the joke and the motivation to share positive affect mediated the relationship between the joke manipulation and facial behaviour. Path analysis was used to explore this mediating process in greater detail.

  7. A Nonlinear Programming and Artificial Neural Network Approach for Optimizing the Performance of a Job Dispatching Rule in a Wafer Fabrication Factory

    Directory of Open Access Journals (Sweden)

    Toly Chen

    2012-01-01

    Full Text Available A nonlinear programming and artificial neural network approach is presented in this study to optimize the performance of a job dispatching rule in a wafer fabrication factory. The proposed methodology fuses two existing rules and constructs a nonlinear programming model to choose the best values of parameters in the two rules by dynamically maximizing the standard deviation of the slack, which has been shown to benefit scheduling performance by several studies. In addition, a more effective approach is also applied to estimate the remaining cycle time of a job, which is empirically shown to be conducive to the scheduling performance. The efficacy of the proposed methodology was validated with a simulated case; evidence was found to support its effectiveness. We also suggested several directions in which it can be exploited in the future.
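
    A minimal sketch of the fusion idea (the job fields, candidate weights, and both component rules are invented here; the paper's actual dispatching rules, remaining-cycle-time estimator, and nonlinear program are not reproduced):

        import statistics

        def fused_score(job, alpha):
            # Rule 1: a slack-like urgency term; Rule 2: a shortest-remaining-work
            # term. alpha in [0, 1] is the fusion weight to be optimized.
            slack = job["due"] - job["now"] - job["remaining_work"]
            srw = job["remaining_work"] / max(job["remaining_ops"], 1)
            return alpha * slack + (1.0 - alpha) * srw

        def best_alpha(jobs, grid=(0.0, 0.25, 0.5, 0.75, 1.0)):
            # Stand-in for the nonlinear-programming step: choose the weight
            # that maximizes the spread (standard deviation) of the fused slack.
            return max(grid, key=lambda a: statistics.pstdev(fused_score(j, a) for j in jobs))

        jobs = [{"due": 40, "now": 10, "remaining_work": 12, "remaining_ops": 3},
                {"due": 25, "now": 10, "remaining_work": 20, "remaining_ops": 5},
                {"due": 60, "now": 10, "remaining_work": 6, "remaining_ops": 2}]
        alpha = best_alpha(jobs)
        next_job = min(jobs, key=lambda j: fused_score(j, alpha))  # dispatch the most urgent job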

  8. The application of grounded theory and symbolic interactionism.

    Science.gov (United States)

    Jeon, Yun-Hee

    2004-09-01

    This paper describes the methodological and theoretical context and underpinnings of a study that examined community psychiatric nurses' work with family caregivers of older people with depression. The study used grounded theory research methods, with its theoretical foundations drawn from symbolic interactionism. The aims of the study were to describe and conceptualize the processes involved when community nurses work and interact with family caregivers and to develop an explanatory theory of these processes. This paper begins with an explanation of the rationale for using grounded theory as the method of choice, followed by a discussion of the theoretical underpinnings of the study, including a brief summary of the nature and origins of symbolic interactionism. Key premises of symbolic interactionism regarded as central to the study are outlined and an analytical overview of the grounded theory method is provided. The paper concludes with a commentary on some of the issues and debates in the use of grounded theory in nursing research. The main purpose of this paper is to provide a methodical and critical review of symbolic interactionism and grounded theory that can help readers, particularly those who are intending to use grounded theory, better understand the processes involved in applying this method to their research.

  9. Effects of energy development on ground water quality: an overview and preliminary assessment

    International Nuclear Information System (INIS)

    Parker, W.M. III; Yin, S.C.L.; Davis, M.J.; Kutz, W.J.

    1981-07-01

    This report provides a preliminary national overview of the various effects on ground water quality likely to result from energy development. Based on estimates of present and projected energy-development activities, regions of the country are identified where ground water quality has the potential for being adversely affected. The general causes of change in ground water quality are reviewed. Specific effects on ground water quality of selected energy technologies are discussed, and some case-history material is provided. A brief overview of pertinent legislation relating to the protection and management of ground water quality is presented. Six methodologies that have some value for assessing the potential effects on ground water quality of energy development activities are reviewed. A method of identifying regions in the 48 contiguous states where there is a potential for ground water quality problems is described and then applied

  10. Modeling ground-based timber harvesting systems using computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2001-01-01

    Modeling ground-based timber harvesting systems with an object-oriented methodology was investigated. Object-oriented modeling and design promote a better understanding of requirements, cleaner designs, and better maintainability of the harvesting simulation system. The model developed simulates chainsaw felling, drive-to-tree feller-buncher, swing-to-tree single-grip...

  11. Rough set and rule-based multicriteria decision aiding

    Directory of Open Access Journals (Sweden)

    Roman Slowinski

    2012-08-01

    Full Text Available The aim of multicriteria decision aiding is to give the decision maker a recommendation concerning a set of objects evaluated from multiple points of view called criteria. Since a rational decision maker acts with respect to his/her value system, in order to recommend the most-preferred decision, one must identify the decision maker's preferences. In this paper, we focus on preference discovery from data concerning some past decisions of the decision maker. We consider the preference model in the form of a set of "if..., then..." decision rules discovered from the data by inductive learning. To structure the data prior to induction of rules, we use the Dominance-based Rough Set Approach (DRSA). DRSA is a methodology for reasoning about data, which handles ordinal evaluations of objects on considered criteria and monotonic relationships between these evaluations and the decision. We review applications of DRSA to a large variety of multicriteria decision problems.
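
    To make the "if..., then..." rule format concrete, here is a toy sketch of applying dominance-based "at least" rules (criteria names, scales, and thresholds are invented; DRSA's actual induction of rules from rough approximations is not implemented):

        # "At least" rules: if an object's evaluations dominate these thresholds,
        # it is assigned to at least the indicated decision class.
        RULES = [
            ({"quality": 3, "price_score": 2}, "acceptable"),
            ({"quality": 4, "price_score": 4}, "good"),
        ]

        def matches(obj, conditions):
            return all(obj.get(c, 0) >= t for c, t in conditions.items())

        def classify(obj):
            # Later rules assign better classes; the last matched rule wins.
            decision = "unacceptable"
            for conditions, cls in RULES:
                if matches(obj, conditions):
                    decision = cls
            return decision

        print(classify({"quality": 4, "price_score": 3}))  # -> "acceptable"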

  12. Simulation of operating rules and discretional decisions using a fuzzy rule-based system integrated into a water resources management model

    Science.gov (United States)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel

    2013-04-01

    water stored in the reservoirs) and the month of the year as inputs; and the demand deliveries as outputs. The developed simulation management model integrates the fuzzy-ruled system of the operation of the two main reservoirs of the basin with the corresponding mass balance equations, the physical or boundary conditions and the water allocation rules among the competing demands. Historical inflow time series are used as inputs to the model simulation, and the fuzzy system is trained and validated using historical information on reservoir storage levels and flows in several streams of the Mijares river. This methodology provides a more flexible approach that stays closer to the real operating policies. The model is easy to develop and to understand due to its rule-based structure, which mimics the human way of thinking. This can improve cooperation and negotiation between managers, decision-makers and stakeholders. The approach can also be applied to analyze the historical operation of the reservoir (what we have called a reservoir operation "audit").
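
    As an illustration of the rule-based structure, a toy fuzzy fragment with one storage input and a seasonal factor (the membership functions, rule consequents, and all numbers are invented, not the Mijares system's actual rules):

        def tri(x, a, b, c):
            # Triangular membership function on [a, c] with peak at b.
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def delivery_fraction(storage_pct, month):
            low = tri(storage_pct, 0.0, 0.0, 50.0)        # "storage is LOW"
            high = tri(storage_pct, 30.0, 100.0, 170.0)   # "storage is HIGH"
            season = 1.0 if 4 <= month <= 9 else 0.3      # irrigation-season factor
            # IF storage LOW THEN deliver 40%; IF storage HIGH THEN deliver 100%.
            # Weighted-average defuzzification of the two rule outputs.
            w = low + high
            base = (0.4 * low + 1.0 * high) / w if w else 0.7
            return base * season

        print(delivery_fraction(80.0, 7))  # ~1.0: July delivery with high storage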

  13. Ground test for vibration control demonstrator

    Science.gov (United States)

    Meyer, C.; Prodigue, J.; Broux, G.; Cantinaud, O.; Poussot-Vassal, C.

    2016-09-01

    With the objective of maximizing comfort in Falcon jets, Dassault Aviation is developing an innovative vibration control technology. Vibrations of the structure are measured at several locations and sent to a dedicated high-performance vibration control computer. Control laws implemented in this computer analyse the vibrations in real time and then compute orders that are sent to the existing control surfaces to counteract the vibrations. After detailing the technology principles, this paper focuses on the vibration control ground demonstration that was performed by Dassault Aviation in May 2015 on a Falcon 7X business jet. The goal of this test was to attenuate vibrations resulting from fixed forced excitation delivered by shakers. The ground test demonstrated the capability to implement an efficient closed-loop vibration control with a significant reduction in vibration level, and validated the vibration control law design methodology. This successful ground test was a prerequisite for the flight test demonstration that is now being prepared. This study has been partly supported by the JTI CleanSky SFWA-ITD.

  14. Rules and routines in organizations and the management of safety rules

    Energy Technology Data Exchange (ETDEWEB)

    Weichbrodt, J. Ch.

    2013-07-01

    This thesis is concerned with the relationship between rules and routines in organizations and how the former can be used to steer the latter. Rules are understood as formal organizational artifacts, whereas organizational routines are collective patterns of action. While research on routines has been thriving, a clear understanding of how rules can be used to influence or control organizational routines (and vice-versa) is still lacking. This question is of particular relevance to safety rules in high-risk organizations, where the way in which organizational routines unfold can ultimately be a matter of life and death. In these organizations, an important and related issue is the balancing of standardization and flexibility – which, in the case of rules, takes the form of finding the right degree of formalization. In high-risk organizations, the question is how to adequately regulate actors’ routines in order to facilitate safe behavior, while at the same time leaving enough leeway for actors to make good decisions in abnormal situations. The railroads are regarded as high-risk industries and also rely heavily on formal rules. In this thesis, the Swiss Federal Railways (SBB) were therefore selected for a field study on rules and routines. The issues outlined so far are being tackled theoretically (paper 1), empirically (paper 2), and from a practitioner’s (i.e., rule maker’s) point of view (paper 3). In paper 1, the relationship between rules and routines is theoretically conceptualized, based on a literature review. Literature on organizational control and coordination, on rules in human factors and safety, and on organizational routines is combined. Three distinct roles (rule maker, rule supervisor, and rule follower) are outlined. Six propositions are developed regarding the necessary characteristics of both routines and rules, the respective influence of the three roles on the rule-routine relationship, and regarding organizational aspects such as

  15. Rules and routines in organizations and the management of safety rules

    International Nuclear Information System (INIS)

    Weichbrodt, J. Ch.

    2013-01-01

    This thesis is concerned with the relationship between rules and routines in organizations and how the former can be used to steer the latter. Rules are understood as formal organizational artifacts, whereas organizational routines are collective patterns of action. While research on routines has been thriving, a clear understanding of how rules can be used to influence or control organizational routines (and vice-versa) is still lacking. This question is of particular relevance to safety rules in high-risk organizations, where the way in which organizational routines unfold can ultimately be a matter of life and death. In these organizations, an important and related issue is the balancing of standardization and flexibility – which, in the case of rules, takes the form of finding the right degree of formalization. In high-risk organizations, the question is how to adequately regulate actors’ routines in order to facilitate safe behavior, while at the same time leaving enough leeway for actors to make good decisions in abnormal situations. The railroads are regarded as high-risk industries and also rely heavily on formal rules. In this thesis, the Swiss Federal Railways (SBB) were therefore selected for a field study on rules and routines. The issues outlined so far are being tackled theoretically (paper 1), empirically (paper 2), and from a practitioner’s (i.e., rule maker’s) point of view (paper 3). In paper 1, the relationship between rules and routines is theoretically conceptualized, based on a literature review. Literature on organizational control and coordination, on rules in human factors and safety, and on organizational routines is combined. Three distinct roles (rule maker, rule supervisor, and rule follower) are outlined. Six propositions are developed regarding the necessary characteristics of both routines and rules, the respective influence of the three roles on the rule-routine relationship, and regarding organizational aspects such as

  16. INFORMATION USE ABOUT THE LEVEL OF AIRCRAFT FLIGHTS GROUND PROVISION TO PLAN AIR TRAFFIC

    Directory of Open Access Journals (Sweden)

    2016-01-01

    Full Text Available The article considers the task of constructing the best aircraft route on the basis of information about the level of ground-based flight support. Disadvantages of traditional radar surveillance facilities are noted. Four types of Russian Federation airspace, classified by the level of ground-based radio flight support, are considered. The relevance of route selection from the viewpoint of airspace planning is substantiated. A formula for calculating the probability of obtaining incorrect aircraft navigation data is given. Errors arising while constructing the aircraft route, linked both with faults of operational navigation and communication equipment and with the human factor, are analyzed. Formulas for the probability of wrong route selection when an aircraft track changes or is maintained are suggested. A generalized weighted index of losses, based on the various factors affecting an aircraft track change, is introduced, and the importance of these factors is considered. A rule for aircraft transition to the next route point is formulated. A conclusion is drawn about which route is the most rational when the route-selection rule is followed at every flight stage. Practical recommendations are suggested for resolving conflicts between aircraft cruising under the given rule.

  17. Case Study Research Methodology in Nursing Research.

    Science.gov (United States)

    Cope, Diane G

    2015-11-01

    Through data collection methods using a holistic approach that focuses on variables in a natural setting, qualitative research methods seek to understand participants' perceptions and interpretations. Common qualitative research methods include ethnography, phenomenology, grounded theory, and historic research. Another type of methodology that has a similar qualitative approach is case study research, which seeks to understand a phenomenon or case from multiple perspectives within a given real-world context.

  18. Probabilistic prediction of expected ground condition and construction time and costs in road tunnels

    Directory of Open Access Journals (Sweden)

    A. Mahmoodzadeh

    2016-10-01

    Full Text Available Ground condition and construction (excavation and support) time and costs are the key factors in decision-making during the planning and design phases of a tunnel project. An innovative methodology for probabilistic estimation of ground condition and construction time and costs is proposed, which is an integration of the ground prediction approach based on a Markov process, and the time and cost variance analysis based on Monte-Carlo (MC) simulation. The former provides the probabilistic description of ground classification along the tunnel alignment according to the geological information revealed from the geological profile and boreholes. The latter provides the probabilistic description of the expected construction time and costs for each operation according to survey feedback from experts. An engineering application to the Hamro tunnel is then presented to demonstrate how the ground condition and the construction time and costs are estimated in a probabilistic way. For most items, the data needed for this methodology are estimated by distributing questionnaires among tunneling experts and applying the mean values of the responses. These results help both the owners and the contractors to be aware of the risks that they carry before construction, and are useful for both tendering and bidding.
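
    A compact sketch of the two-stage idea (the transition probabilities, advance rates, and unit costs are invented; the paper's expert-elicited distributions and tunnel data are not reproduced): a Markov chain proposes ground classes along the alignment, then Monte-Carlo sampling turns per-class advance rates and unit costs into time and cost distributions.

        import random

        P = {"good": {"good": 0.70, "fair": 0.25, "poor": 0.05},
             "fair": {"good": 0.20, "fair": 0.60, "poor": 0.20},
             "poor": {"good": 0.10, "fair": 0.40, "poor": 0.50}}
        RATE = {"good": 8.0, "fair": 5.0, "poor": 2.0}   # advance rate, m/day
        COST = {"good": 3.0, "fair": 5.0, "poor": 9.0}   # unit cost, k$/m

        def one_run(n_seg=100, seg_len=10.0, state="good"):
            days = cost = 0.0
            for _ in range(n_seg):
                state = random.choices(list(P[state]), weights=list(P[state].values()))[0]
                rate = random.triangular(0.6 * RATE[state], 1.4 * RATE[state])
                unit = random.triangular(0.7 * COST[state], 1.6 * COST[state])
                days += seg_len / rate
                cost += seg_len * unit
            return days, cost

        runs = sorted(one_run() for _ in range(2000))
        print("P10/P50/P90 construction time (days):",
              round(runs[200][0]), round(runs[1000][0]), round(runs[1800][0]))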

  19. 78 FR 24257 - Self-Regulatory Organizations; The Options Clearing Corporation; Order Approving Proposed Rule...

    Science.gov (United States)

    2013-04-24

    ... positions are in-the-money or out-of-the-money. Volume, like open interest, is a measure of a Clearing... demands on OCC's services and facilities that are not captured by the current methodology. IV. Conclusion... Organizations; The Options Clearing Corporation; Order Approving Proposed Rule Change To Implement a Revised...

  20. Complex-energy approach to sum rules within nuclear density functional theory

    Science.gov (United States)

    Hinohara, Nobuo; Kortelainen, Markus; Nazarewicz, Witold; Olsen, Erik

    2015-04-01

    Background: The linear response of the nucleus to an external field contains unique information about the effective interaction, the correlations governing the behavior of the many-body system, and the properties of its excited states. To characterize the response, it is useful to use its energy-weighted moments, or sum rules. By comparing computed sum rules with experimental values, the information content of the response can be utilized in the optimization process of the nuclear Hamiltonian or the nuclear energy density functional (EDF). But the additional information comes at a price: compared to the ground state, computation of excited states is more demanding. Purpose: To establish an efficient framework to compute energy-weighted sum rules of the response that is adaptable to the optimization of the nuclear EDF and large-scale surveys of collective strength, we have developed a new technique within the complex-energy finite-amplitude method (FAM) based on the quasiparticle random-phase approximation (QRPA). Methods: To compute sum rules, we carry out contour integration of the response function in the complex-energy plane. We benchmark our results against the conventional matrix formulation of the QRPA theory, the Thouless theorem for the energy-weighted sum rule, and the dielectric theorem for the inverse-energy-weighted sum rule. Results: We derive the sum-rule expressions from the contour integration of the complex-energy FAM. We demonstrate that calculated sum-rule values agree with those obtained from the matrix formulation of the QRPA. We also discuss the applicability of both the Thouless theorem about the energy-weighted sum rule and the dielectric theorem for the inverse-energy-weighted sum rule to nuclear density functional theory in cases when the EDF is not based on a Hamiltonian. Conclusions: The proposed sum-rule technique based on the complex-energy FAM is a tool of choice when optimizing effective interactions or energy functionals. The method
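
    Schematically, the moments in question and the contour representation that the complex-energy FAM exploits can be written as follows (standard QRPA notation, stated here as background rather than quoted from the paper):

        m_k \;=\; \sum_{\nu > 0} E_\nu^{\,k}\, \bigl|\langle \nu | \hat{F} | 0 \rangle\bigr|^2
              \;=\; \frac{1}{2\pi i} \oint_{C} dE\; E^{\,k}\, S(E),

    where S(E) is the FAM strength function, analytic in the complex-energy plane except for poles at the QRPA eigenenergies, and C is a contour enclosing the positive-energy poles.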

  1. Action Learning and Constructivist Grounded Theory: Powerfully Overlapping Fields of Practice

    Science.gov (United States)

    Rand, Jane

    2013-01-01

    This paper considers the shared characteristics between action learning (AL) and the research methodology constructivist grounded theory (CGT). Mirroring Edmonstone's [2011. "Action Learning and Organisation Development: Overlapping Fields of Practice." "Action Learning: Research and Practice" 8 (2): 93-102] article, which…

  2. Adding Theoretical Grounding to Grounded Theory: Toward Multi-Grounded Theory

    OpenAIRE

    Göran Goldkuhl; Stefan Cronholm

    2010-01-01

    The purpose of this paper is to challenge some of the cornerstones of the grounded theory approach and propose an extended and alternative approach for data analysis and theory development, which the authors call multi-grounded theory (MGT). A multi-grounded theory is not only empirically grounded; it is also grounded in other ways. Three different grounding processes are acknowledged: theoretical, empirical, and internal grounding. The authors go beyond the pure inductivist approach in GT an...

  3. Exploration of SWRL Rule Bases through Visualization, Paraphrasing, and Categorization of Rules

    Science.gov (United States)

    Hassanpour, Saeed; O'Connor, Martin J.; Das, Amar K.

    Rule bases are increasingly being used as repositories of knowledge content on the Semantic Web. As the size and complexity of these rule bases increases, developers and end users need methods of rule abstraction to facilitate rule management. In this paper, we describe a rule abstraction method for Semantic Web Rule Language (SWRL) rules that is based on lexical analysis and a set of heuristics. Our method results in a tree data structure that we exploit in creating techniques to visualize, paraphrase, and categorize SWRL rules. We evaluate our approach by applying it to several biomedical ontologies that contain SWRL rules, and show how the results reveal rule patterns within the rule base. We have implemented our method as a plug-in tool for Protégé-OWL, the most widely used ontology modeling software for the Semantic Web. Our tool can allow users to rapidly explore content and patterns in SWRL rule bases, enabling their acquisition and management.

  4. Educational Policymaking and the Methodology of Positive Economics: A Theoretical Critique

    Science.gov (United States)

    Gilead, Tal

    2014-01-01

    By critically interrogating the methodological foundations of orthodox economic theory, Tal Gilead challenges the growing conviction in educational policymaking quarters that, being more scientific than other forms of educational investigation, inquiries grounded in orthodox economics should provide the basis for educational policymaking. He…

  5. Building State Capacity to Achieve Government Victory during Civil War

    Science.gov (United States)

    2011-12-01

    There are five qualitative research methods used to study and test a hypothesis: phenomenology, ethnography, case study research, grounded theory... a willful observance of the rule of law. Methodology: Quantitative and qualitative methodologies are two common approaches used by researchers to... discovering cause and effect, or other correlations between measured variables. Qualitative research uses non-numerical data, such as personal interviews

  6. "Emergence" vs. "Forcing" of Empirical Data? A Crucial Problem of "Grounded Theory" Reconsidered

    Directory of Open Access Journals (Sweden)

    Udo Kelle

    2005-05-01

    Full Text Available Since the late 1960s Barney GLASER and Anselm STRAUSS, developers of the methodology of "Grounded Theory", have made several attempts to explicate, clarify and reconceptualise some of the basic tenets of their methodological approach. Diverging concepts and understandings of Grounded Theory have arisen from these attempts, which have led to a split between its founders. Much of the explication and reworking of Grounded Theory concerns the relation between data and theory and the role of previous theoretical assumptions. The book which initially established the popularity of GLASER's and STRAUSS' methodological ideas, "The Discovery of Grounded Theory", contains two conflicting understandings of the relation between data and theory: the concept of "emergence" on the one hand and the concept of "theoretical sensitivity" on the other. Much of the later development of Grounded Theory can be seen as an attempt to reconcile these prima facie diverging concepts, with GLASER recommending that researchers draw on a variety of "coding families" while STRAUSS proposes the use of a general theory of action to build an axis for an emerging theory. This paper first summarises the most important developments within "Grounded Theory" concerning the understanding of the relation between empirical data and theoretical statements, with special emphasis on differences between GLASER's and STRAUSS' concepts and on GLASER's current critique that the concepts of "coding paradigm" and "axial coding" described by STRAUSS and Juliet CORBIN lead to the "forcing" of data. It will be argued that GLASER's critique points out some existing weaknesses of STRAUSS' concepts but vastly exaggerates the risks of the STRAUSSian approach. A main argument of this paper is that basic problems of empirically grounded theory construction can be treated much more effectively if one draws on certain results of contemporary philosophical and epistemological discussions and on widely

  7. Stop. Write! Writing Grounded Theory

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, PhD, Hon. PhD

    2012-06-01

    Full Text Available The message in this book, the dictum in this book, is to stop and write when the Grounded Theory (GT) methodology puts you in that ready position. Stop unending conceptualization, unending data coverage, and unending listening to others who would egg you on with additional data, ideas and/or requirements, or simply waiting too long. I will discuss these ideas in detail. My experience with PhD candidates is that for the few who write when ready, many do not and SHOULD. Simply put, many write up, but many more should.

  8. Australian road rules

    Science.gov (United States)

    2009-02-01

    *These are national-level rules. Australian Road Rules - 2009 Version, Part 18, Division 1, Rule 300 "Use of Mobile Phones" describes restrictions of mobile phone use while driving. The rule basically states that drivers cannot make or receive calls ...

  9. Modeling collective animal behavior with a cognitive perspective: a methodological framework.

    Directory of Open Access Journals (Sweden)

    Sebastian Weitz

    Full Text Available The last decades have seen an increasing interest in modeling collective animal behavior. Some studies try to reproduce as accurately as possible the collective dynamics and patterns observed in several animal groups with biologically plausible, individual behavioral rules. The objective is then essentially to demonstrate that the observed collective features may be the result of self-organizing processes involving quite simple individual behaviors. Other studies concentrate on the objective of establishing or enriching links between collective behavior researches and cognitive or physiological ones, which then requires that each individual rule be carefully validated. Here we discuss the methodological consequences of this additional requirement. Using the example of corpse clustering in ants, we first illustrate that it may be impossible to discriminate among alternative individual rules by considering only observational data collected at the group level. Six individual behavioral models are described: They are clearly distinct in terms of individual behaviors, they all reproduce satisfactorily the collective dynamics and distribution patterns observed in experiments, and we show theoretically that it is strictly impossible to discriminate two of these models even in the limit of an infinite amount of data whatever the accuracy level. A set of methodological steps are then listed and discussed as practical ways to partially overcome this problem. They involve complementary experimental protocols specifically designed to address the behavioral rules successively, conserving group-level data for the overall model validation. In this context, we highlight the importance of maintaining a sharp distinction between model enunciation, with explicit references to validated biological concepts, and formal translation of these concepts in terms of quantitative state variables and fittable functional dependences. Illustrative examples are provided of the

  10. The Pro-Rata Rule Versus the Causality Rule in Insurance Law

    DEFF Research Database (Denmark)

    Lando, Henrik

    When the Buyer of insurance has negligently kept silent or misrepresented a (material) fact to the Seller, one of two rules will determine the extent to which cover will consequently be reduced. The pro-rata rule lowers cover in proportion to how much the Seller would have increased the premium had he been correctly informed; the causality rule provides either zero cover if the omitted fact has caused the insurance event, or full cover if the event would have occurred regardless of the fact. This article explores which rule is more efficient. Using the framework proposed by Picard and Dixit... it subjects the risk averse Buyer of insurance to less variance. This implies that the pro-rata rule should apply when there is significant risk for a Buyer of unintentional misrepresentation, and when the incentive to intentionally misrepresent can be curtailed through frequent verification of the Buyer

  11. Supervision of Special Education Instruction in Rural Public School Districts: A Grounded Theory

    OpenAIRE

    Bays, Debora Ann

    2001-01-01

    The grounded theory presented in this study describes how the supervision of special education instruction occurs in public elementary schools in rural settings. Grounded theory methodology (Strauss & Corbin, 1998) was employed in this study. Nine elementary schools in three rural districts in the state of Virginia participated in the study. Interview data were collected from 34 participants, including special and general education teachers, principals, and directors of special education. Obs...

  12. Cortical dynamics of figure-ground separation in response to 2D pictures and 3D scenes: How V2 combines border ownership, stereoscopic cues, and Gestalt grouping rules

    Directory of Open Access Journals (Sweden)

    Stephen eGrossberg

    2016-01-01

    Full Text Available The FACADE model, and its laminar cortical realization and extension in the 3D LAMINART model, have explained, simulated, and predicted many perceptual and neurobiological data about how the visual cortex carries out 3D vision and figure-ground perception, and how these cortical mechanisms enable 2D pictures to generate 3D percepts of occluding and occluded objects. In particular, these models have proposed how border ownership occurs, but have not yet explicitly explained the correlation between multiple properties of border ownership neurons in cortical area V2 that were reported in a remarkable series of neurophysiological experiments by von der Heydt and his colleagues; namely, border ownership, contrast preference, binocular stereoscopic information, selectivity for side-of-figure, Gestalt rules, and strength of attentional modulation, as well as the time course during which such properties arise. This article shows how, by combining 3D LAMINART properties that were discovered in two parallel streams of research, a unified explanation of these properties emerges. This explanation proposes, moreover, how these properties contribute to the generation of consciously seen 3D surfaces. The first research stream models how processes like 3D boundary grouping and surface filling-in interact in multiple stages within and between the V1 interblob – V2 interstripe – V4 cortical stream and the V1 blob – V2 thin stripe – V4 cortical stream, respectively. Of particular importance for understanding figure-ground separation is how these cortical interactions convert computationally complementary boundary and surface mechanisms into a consistent conscious percept, including the critical use of surface contour feedback signals from surface representations in V2 thin stripes to boundary representations in V2 interstripes. Remarkably, key figure-ground properties emerge from these feedback interactions. The second research stream shows how cells that

  13. Characterising influences on safety culture in military aviation: a methodologically grounded approach

    OpenAIRE

    Bennett, Anthea; Hellier, Elizabeth; Weyman, Andrew

    2015-01-01

    Historically, much effort has been expended in safety culture / climate research toward identifying a generic core set of components, predominately using the self-administered questionnaire approach. However, no stable unified model has emerged, and much of this research has taken a methodologically top-down approach to depicting organisational safety culture. In light of this, the benefits of qualitative exploration as a precursor to and foundation for the development of quantitative climate...

  14. A pattern-based methodology for optimizing stitches in double-patterning technology

    Science.gov (United States)

    Wang, Lynn T.; Madhavan, Sriram; Dai, Vito; Capodieci, Luigi

    2015-03-01

    A pattern-based methodology for optimizing stitches is developed based on identifying stitch topologies and replacing them with pre-characterized fixing solutions in decomposed layouts. A topology-based library of stitches with predetermined fixing solutions is built. A pattern-based engine searches for matching topologies in the decomposed layouts. When a match is found, the engine opportunistically replaces the predetermined fixing solution: only a design rule check error-free replacement is preserved. The methodology is demonstrated on a 20nm layout design that contains over 67 million first-metal-layer stitches. Results show that a small library containing 3 stitch topologies improves the stitch area regularity by 4x.
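
    A schematic of the search-and-replace flow (the topology keys, fix names, and the drc_clean() stub are placeholders; the actual pattern-matching engine and design-rule checks of the flow described above are not reproduced):

        FIX_LIBRARY = {
            # topology key -> pre-characterized fixing solution
            "L_bend_overlap": "widen_overlap_region",
            "T_junction": "shift_stitch_to_straight_segment",
            "end_to_end": "extend_overlap_margin",
        }

        def drc_clean(layout, stitch, fix):
            # Placeholder for a design rule check of the candidate replacement.
            return True

        def optimize_stitches(layout, stitches):
            fixed = 0
            for stitch in stitches:
                fix = FIX_LIBRARY.get(stitch["topology"])
                if fix and drc_clean(layout, stitch, fix):
                    stitch["solution"] = fix  # keep only DRC-error-free replacements
                    fixed += 1
            return fixed

        n = optimize_stitches({"layer": "M1"}, [{"topology": "T_junction"}])
        print(n)  # 1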

  15. "Naturalist Inquiry" and Grounded Theory

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser

    2004-01-01

    Full Text Available The world of Qualitative Data Analysis (QDA) methodology became quite taken with LINCOLN and GUBA's book "Naturalist Inquiry" (1985). I have no issue with it with respect to its application to QDA; it helped clarify and advance so many QDA issues. However, its application to Grounded Theory (GT) has been a major block on GT, as originated, by its cooptation and corruption hence remodeling of GT by default. LINCOLN and GUBA have simply assumed GT is just another QDA method, which it is not. In "The Grounded Theory Perspective II" (GLASER 2002a), Chapter 9 on credibility, I have discussed "Naturalist Inquiry" (NI) thought regarding how LINCOLN and GUBA's notion of "trustworthy" data (or worrisome data) orientation and how their view of constant comparison can and has remodeled and eroded GT. In this paper I will consider other aspects of NI that remodel GT. URN: urn:nbn:de:0114-fqs040170

  16. Ground System Architectures Workshop GMSEC SERVICES SUITE (GSS): an Agile Development Story

    Science.gov (United States)

    Ly, Vuong

    2017-01-01

    The GMSEC (Goddard Mission Services Evolution Center) Services Suite (GSS) is a collection of tools and software services, along with a robust customizable web-based portal, that enables the user to capture, monitor, report, and analyze system-wide GMSEC data. Given our plug-and-play architecture and the need for rapid system development, we opted to follow the Scrum Agile methodology for software development. Being one of the first few projects to implement the Agile methodology at NASA GSFC, in this presentation we present our approaches, tools, successes, and challenges in implementing this methodology. The GMSEC architecture provides a scalable, extensible ground and flight system for existing and future missions. GMSEC comes with a robust Application Programming Interface (GMSEC API) and a core set of Java-based GMSEC components that facilitate the development of a GMSEC-based ground system. Over the past few years, we have seen an uptick in the number of customers who are moving from a native desktop application environment to a web-based environment, particularly for data monitoring and analysis. We also see a need to separate the business logic from the GUI display for our Java-based components and to consolidate all the GUI displays into one interface. This combination of separation and consolidation brings immediate value to a GMSEC-based ground system through increased ease of data access via a uniform interface, built-in security measures, centralized configuration management, and ease of feature extensibility.

  17. Review: Franz Breuer with Assistance of Barbara Dieris and Antje Lettau (2009. Reflexive Grounded Theory. Eine Einführung für die Forschungspraxis [Reflexive Grounded Theory: An Introduction to Research Praxis

    Directory of Open Access Journals (Sweden)

    Sandra Da Rin

    2010-03-01

    Full Text Available This textbook by Franz BREUER, produced with the assistance of Barbara DIERIS and Antje LETTAU, is of interest more for the introduction it provides to reflexive research praxis than to grounded theory methodology. This means the subjectivity of the researcher is included in the research process as a decisive source of cognition. Reflexive grounded theory methodology is characterized by three elements that also structure the textbook. In the present review, I focus on two of these in detail: the approach to the research field based on ethnography, in particular its epistemological prerequisites, and the inclusion of (self-)reflexivity. The latter points to questions that are addressed at the end of this review. URN: urn:nbn:de:0114-fqs1002140

  18. Early Site Permit Demonstration Program: Guidelines for determining design basis ground motions

    International Nuclear Information System (INIS)

    1993-01-01

    This report develops and applies a methodology for estimating strong earthquake ground motion. The motivation was to develop a much needed tool for use in developing the seismic requirements for structural designs. An earthquake's ground motion is a function of the earthquake's magnitude, and the physical properties of the earth through which the seismic waves travel from the earthquake fault to the site of interest. The emphasis of this study is on ground motion estimation in Eastern North America (east of the Rocky Mountains), with particular emphasis on the Eastern United States and southeastern Canada. Eastern North America is a stable continental region, having sparse earthquake activity with rare occurrences of large earthquakes. While large earthquakes are of interest for assessing seismic hazard, little data exists from the region to empirically quantify their effects. The focus of the report is on the attributes of ground motion in Eastern North America that are of interest for the design of facilities such as nuclear power plants. This document, Volume II, contains Appendices 2, 3, 5, 6, and 7 covering the following topics: Eastern North American Empirical Ground Motion Data; Examination of Variance of Seismographic Network Data; Soil Amplification and Vertical-to-Horizontal Ratios from Analysis of Strong Motion Data From Active Tectonic Regions; Revision and Calibration of Ou and Herrmann Method; Generalized Ray Procedure for Modeling Ground Motion Attenuation; Crustal Models for Velocity Regionalization; Depth Distribution Models; Development of Generic Site Effects Model; Validation and Comparison of One-Dimensional Site Response Methodologies; Plots of Amplification Factors; Assessment of Coupling Between Vertical ampersand Horizontal Motions in Nonlinear Site Response Analysis; and Modeling of Dynamic Soil Properties
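
    The report's specific attenuation relations are not reproduced in this abstract; as general background, empirical ground-motion models of this kind typically take a functional form such as

        \ln Y \;=\; c_0 \;+\; c_1 M \;-\; c_2 \ln R \;-\; c_3 R \;+\; \sigma\,\varepsilon,

    where Y is a ground-motion measure (e.g., peak or spectral acceleration), M the earthquake magnitude, R the source-to-site distance, and ε a standard normal variate; the coefficients are fit regionally (a generic illustration, not the report's model).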

  19. Challenges in combining different data sets during analysis when using grounded theory.

    Science.gov (United States)

    Rintala, Tuula-Maria; Paavilainen, Eija; Astedt-Kurki, Päivi

    2014-05-01

    To describe the challenges in combining two data sets during grounded theory analysis. The use of grounded theory in nursing research is common. It is a suitable method for studying human action and interaction. It is recommended that many alternative sources of data are collected to create as rich a dataset as possible. Data from interviews with people with diabetes (n=19) and their family members (n=19). Combining two data sets. When using grounded theory, there are numerous challenges in collecting and managing data, especially for the novice researcher. One challenge is to combine different data sets during the analysis. There are many methodological textbooks about grounded theory but there is little written in the literature about combining different data sets. Discussion is needed on the management of data and the challenges of grounded theory. This article provides a means for combining different data sets in the grounded theory analysis process.

  20. Ruled Laguerre minimal surfaces

    KAUST Repository

    Skopenkov, Mikhail

    2011-10-30

    A Laguerre minimal surface is an immersed surface in ℝ³ being an extremal of the functional ∫(H²/K − 1) dA. In the present paper, we prove that the only ruled Laguerre minimal surfaces are, up to isometry, the surfaces R(φ, λ) = (Aφ, Bφ, Cφ + D cos 2φ) + λ(sin φ, cos φ, 0), where A, B, C, D ∈ ℝ are fixed. To achieve invariance under Laguerre transformations, we also derive all Laguerre minimal surfaces that are enveloped by a family of cones. The methodology is based on the isotropic model of Laguerre geometry. In this model a Laguerre minimal surface enveloped by a family of cones corresponds to a graph of a biharmonic function carrying a family of isotropic circles. We classify such functions by showing that the top view of the family of circles is a pencil. © 2011 Springer-Verlag.
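
    A small numerical sketch of the classified family (the parameter values A, B, C, D and the sampling ranges are arbitrary; this simply evaluates the stated parametrization):

        import math

        def surface_point(phi, lam, A=1.0, B=0.5, C=0.2, D=1.0):
            # R(phi, lam) = (A*phi, B*phi, C*phi + D*cos(2*phi)) + lam*(sin phi, cos phi, 0)
            return (A * phi + lam * math.sin(phi),
                    B * phi + lam * math.cos(phi),
                    C * phi + D * math.cos(2.0 * phi))

        # Sample a patch; for fixed phi, lam -> R(phi, lam) traces a straight ruling.
        patch = [surface_point(0.1 * i, 0.5 * j) for i in range(63) for j in (-1, 0, 1)]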

  1. Grounded Theory: A Practical Guide for Management, Business and Market Researchers. Christina Goulding. Sage Publications. No of pages: 186. £18.99. ISBN 0761966838.

    Science.gov (United States)

    Woods, Leslie

    2003-10-01

    Much has been written about the grounded theory approach to qualitative research; however, the number of books devoted solely to this methodology remains relatively small. Therefore, any new book dedicated to the subject is always likely to attract attention - especially given the increasing popularity of grounded theory in healthcare research.

  2. Preclosure seismic design methodology for a geologic repository at Yucca Mountain. Revision 1

    International Nuclear Information System (INIS)

    1996-08-01

    This topical report is the second in a series of three reports being developed by the US Department of Energy (DOE) to document the preclosure seismic design of structures, systems, and components (SSCs) that are important to the radiological safety of the potential repository at Yucca Mountain, Nevada. The first topical report, Methodology to Assess Fault Displacement and Vibratory Ground Motion Hazards at Yucca Mountain, YMP/TR-002-NP, was submitted to the US Nuclear Regulatory Commission (NRC) staff for review and comment in 1994 and has been accepted by the staff. The DOE plans to implement this methodology in fiscal year 1997 to develop probabilistic descriptions of the vibratory ground motion hazard and the fault displacement hazard at the Yucca Mountain site. The second topical report (this report) describes the DOE methodology and acceptance criteria for the preclosure seismic design of SSCs important to safety. A third report, scheduled for fiscal year 1998, will document the results of the probabilistic seismic hazard assessment (conducted using the methodology in the first topical report) and the development of the preclosure seismic design inputs. This third report will be submitted to NRC staff for review and comment as a third topical report or as a design study report

  3. A performance assessment methodology for high-level radioactive waste disposal in unsaturated, fractured tuff

    International Nuclear Information System (INIS)

    Gallegos, D.P.

    1991-07-01

    Sandia National Laboratories has developed a methodology for performance assessment of deep geologic disposal of high-level nuclear waste. The applicability of this performance assessment methodology has been demonstrated for disposal in bedded salt and basalt; it has since been modified for assessment of repositories in unsaturated, fractured tuff. Changes to the methodology are primarily in the form of new or modified ground water flow and radionuclide transport codes. A new computer code, DCM3D, has been developed to model three-dimensional ground-water flow in unsaturated, fractured rock using a dual-continuum approach. The NEFTRAN 2 code has been developed to efficiently model radionuclide transport in time-dependent velocity fields, has the ability to use externally calculated pore velocities and saturations, and includes the effect of saturation-dependent retardation factors. In order to use these codes together in performance-assessment-type analyses, code-coupler programs were developed to translate DCM3D output into NEFTRAN 2 input. Other portions of the performance assessment methodology were evaluated as part of modifying the methodology for tuff. The scenario methodology developed under the bedded salt program has been applied to tuff. An investigation of the applicability of uncertainty and sensitivity analysis techniques to non-linear models indicates that Monte Carlo simulation remains the most robust technique for these analyses. No changes have been recommended for the dose and health effects models or the biosphere transport models. 52 refs., 1 fig

  4. The San Miguel Artist Project: A Grounded Theory of "The Emergence of Wonder"

    Directory of Open Access Journals (Sweden)

    Gordon Medlock

    2015-03-01

    Full Text Available This article employs classical grounded theory methodology to explain the creative process of artists. Two integrally connected core variables are identified: emergence and wonder. Wonder represents the experience that motivates and sustains the creation of works of art, and emergence the process by which the sense of wonder is progressively embodied in the content and form of the work. The theory describes a number of distinct phases, including the experience of wonder, immersion in artistic practice, conceiving a specific work or project, composing the work, presenting the work for an actual or potential audience, and finally moving-on. These phases involve a dynamic stream of recursive processes—sketching, refining, connecting, channeling, and assessing—that ultimately facilitate the emergence of wonder in artistic works. The theory of the emergence of wonder also appears to apply to the research processes of both grounded theory methodology and phenomenology, suggesting that these two research methodologies are more similar and have more in common with the artistic creative process than is commonly acknowledged. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs150256

  5. Delayed rule following

    OpenAIRE

    Schmitt, David R.

    2001-01-01

    Although the elements of a fully stated rule (discriminative stimulus [SD], some behavior, and a consequence) can occur nearly contemporaneously with the statement of the rule, there is often a delay between the rule statement and the SD. The effects of this delay on rule following have not been studied in behavior analysis, but they have been investigated in rule-like settings in the areas of prospective memory (remembering to do something in the future) and goal pursuit. Discriminative even...

  6. Radiation dose from solar flares at ground level

    International Nuclear Information System (INIS)

    O'Brien, K.

    1979-01-01

    Wdowczyk and Wolfendale (Nature, 268, 510, 1977) concluded that a very large solar flare producing an exposure of 10^4 rad at ground level (lethal to almost any organism) has a possible frequency of once per 10^5-10^8 yr. In the work reported here, similar results were obtained using a more elaborate model. Flares occurring from February 1956 to August 1972 were analyzed. The flare size distribution above the earth's atmosphere, and the neutron flux, dose and dose equivalent at ground level at the latitude of Deep River, Canada, were calculated. The probable frequencies of flares delivering various doses are given. Doses larger than 100 rad, which have significant somatic effects on man and other animals, may be delivered once in 10^6 years. The probability of 10^4 rad was found to be 10^-8/yr. These calculations apply only to high geomagnetic latitudes. Field reversals, during which the geomagnetic field is much weaker than current values, total about 10% of the past 4 million years. This suggests that a very large flare delivering a large dose worldwide at ground level cannot be ruled out. (author)
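
    The two dose-frequency points quoted above (about 10^-6/yr for doses above 100 rad and 10^-8/yr for doses above 10^4 rad) happen to lie on a simple power law; interpolating between them (an assumption for illustration, not the paper's fitted distribution) gives:

        def flare_frequency_per_year(dose_rad):
            # F(>D) ~ 1e-4 / D passes through (1e2 rad, 1e-6/yr) and (1e4 rad, 1e-8/yr).
            return 1e-4 / dose_rad

        print(flare_frequency_per_year(1000.0))  # ~1e-7 per year for doses above 1000 rad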

  7. The Experiential Model of the Person-Centred Record: a social constructionist grounded theory

    OpenAIRE

    Mihelcic, Joanne

    2017-01-01

    The objective of this research was to explore the co-creation of person-centred records, to support memory, identity and personhood, with the person diagnosed with early stage dementia. This thesis describes the design of a second generation grounded theory methodology and applied archival research. With its postmodern, continuum and social constructionist influences, second generation grounded theory sees a shift in how we understand the researcher's interaction with participants in a study...

  8. Methodological issues and research recommendations for prognosis after mild traumatic brain injury

    DEFF Research Database (Denmark)

    Kristman, Vicki L; Borg, Jörgen; Godbolt, Alison K

    2014-01-01

    ... Prognosis, Prevention, Management and Rehabilitation Task Force on the prognosis of MTBI. Of 299 relevant studies, 101 were accepted as scientifically admissible. The methodological quality of the research literature on MTBI prognosis has not improved since the 2002 Task Force report. There are still many methodological concerns and knowledge gaps in the literature. Here we report and make recommendations on how to avoid methodological flaws found in prognostic studies of MTBI. Additionally, we discuss issues of MTBI definition and identify topic areas in need of further research to advance the understanding of prognosis after MTBI. Priority research areas include but are not limited to the use of confirmatory designs, studies of measurement validity, focus on the elderly, attention to litigation/compensation issues, the development of validated clinical prediction rules, and the use of MTBI populations other than...

  9. Threshold Differences on Figure and Ground: Gelb and Granit (1923).

    Science.gov (United States)

    Kinateder, Max; Nelson, Rolf

    2017-01-01

    In 1923, Gelb and Granit, using a method of adjustment for a small red light, reported a lower threshold for the target when presented on a ground region than on an adjacent figural region. More recent work in perceptual organization has found precisely the opposite: a processing advantage seems to go to items presented on the figure, not the ground. Although Gelb and Granit continue to be cited for their finding, it has not previously been available as an English translation. Understanding their methodology and results is important for integrating early Gestalt theory with more recent investigations.

  10. Threshold Differences on Figure and Ground: Gelb and Granit (1923)

    Science.gov (United States)

    Kinateder, Max

    2017-01-01

    In 1923, Gelb and Granit, using a method of adjustment for a small red light, reported a lower threshold for the target when presented on a ground region than on an adjacent figural region. More recent work in perceptual organization has found precisely the opposite—a processing advantage seems to go to items presented on the figure, not the ground. Although Gelb and Granit continue to be cited for their finding, it has not previously been available as an English translation. Understanding their methodology and results is important for integrating early Gestalt theory with more recent investigations. PMID:28286640

  11. Benefits of rotational ground motions for planetary seismology

    Science.gov (United States)

    Donner, S.; Joshi, R.; Hadziioannou, C.; Nunn, C.; van Driel, M.; Schmelzbach, C.; Wassermann, J. M.; Igel, H.

    2017-12-01

    Exploring the internal structure of planetary objects is fundamental to understand the evolution of our solar system. In contrast to Earth, planetary seismology is hampered by the limited number of stations available, often just a single one. Classic seismology is based on the measurement of three components of translational ground motion. Its methods are mainly developed for a larger number of available stations. Therefore, the application of classical seismological methods to other planets is very limited. Here, we show that the additional measurement of three components of rotational ground motion could substantially improve the situation. From sparse or single station networks measuring translational and rotational ground motions it is possible to obtain additional information on structure and source. This includes direct information on local subsurface seismic velocities, separation of seismic phases, propagation direction of seismic energy, crustal scattering properties, as well as moment tensor source parameters for regional sources. The potential of this methodology will be highlighted through synthetic forward and inverse modeling experiments.
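
    One concrete example of the "direct information on local subsurface seismic velocities": for a plane, transversely polarized wave, the transverse acceleration a_t and the rotation rate about the vertical r_z satisfy a_t = -2c*r_z, so a single collocated six-component station yields the phase velocity c. A synthetic check (the signal and sampling are invented for illustration):

        import math

        c_true = 3000.0  # m/s, shear-wave phase velocity used to synthesize the data
        t = [0.01 * i for i in range(500)]
        accel = [math.sin(2.0 * math.pi * 1.0 * ti) for ti in t]   # transverse acceleration
        rot_rate = [-a / (2.0 * c_true) for a in accel]            # vertical rotation rate

        # Least-squares amplitude ratio of the two collocated traces recovers c.
        num = sum(a * r for a, r in zip(accel, rot_rate))
        den = sum(r * r for r in rot_rate)
        print(round(-0.5 * num / den))  # -> 3000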

  12. Methodology and application of combined watershed and ground-water models in Kansas

    Science.gov (United States)

    Sophocleous, M.; Perkins, S.P.

    2000-01-01

    Increased irrigation in Kansas and other regions during the last several decades has caused serious water depletion, making the development of comprehensive strategies and tools to resolve such problems increasingly important. This paper makes the case for an intermediate complexity, quasi-distributed, comprehensive, large-watershed model, which falls between the fully distributed, physically based hydrological modeling system of the type of the SHE model and the lumped, conceptual rainfall-runoff modeling system of the type of the Stanford watershed model. This is achieved by integrating the quasi-distributed watershed model SWAT with the fully-distributed ground-water model MODFLOW. The advantage of this approach is the appreciably smaller input data requirements and the use of readily available data (compared to the fully distributed, physically based models), the statistical handling of watershed heterogeneities by employing the hydrologic-response-unit concept, and the significantly increased flexibility in handling stream-aquifer interactions, distributed well withdrawals, and multiple land uses. The mechanics of integrating the component watershed and ground-water models are outlined, and three real-world management applications of the integrated model from Kansas are briefly presented. Three different aspects of the integrated model are emphasized: (1) management applications of a Decision Support System for the integrated model (Rattlesnake Creek subbasin); (2) alternative conceptual models of spatial heterogeneity related to the presence or absence of an underlying aquifer with shallow or deep water table (Lower Republican River basin); and (3) the general nature of the integrated model linkage by employing a watershed simulator other than SWAT (Wet Walnut Creek basin). These applications demonstrate the practicality and versatility of this relatively simple and conceptually clear approach, making public acceptance of the integrated watershed modeling
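
    A toy sketch of the loose coupling loop described above (the class names, methods, and numbers are stand-ins; neither SWAT's nor MODFLOW's actual interfaces are reproduced): the watershed side hands recharge per hydrologic response unit (HRU) to aquifer cells, and the aquifer returns stream leakage as baseflow.

        class ToyWatershed:
            def step(self):
                return {"hru1": 12.0, "hru2": 4.0}  # recharge per HRU, mm
            def add_baseflow(self, q):
                self.baseflow = q                   # routed back to the streams

        class ToyAquifer:
            def __init__(self):
                self.recharge = {}
            def apply_recharge(self, cell, r):
                self.recharge[cell] = self.recharge.get(cell, 0.0) + r
            def step(self):
                # Pretend a fixed fraction of recharge becomes stream leakage.
                return 0.3 * sum(self.recharge.values())

        HRU_TO_CELLS = {"hru1": [(1, 1), (1, 2)], "hru2": [(2, 1)]}

        def couple_one_stress_period(ws, aq):
            for hru, mm in ws.step().items():
                cells = HRU_TO_CELLS[hru]
                for cell in cells:
                    aq.apply_recharge(cell, mm / len(cells))
            ws.add_baseflow(aq.step())

        ws, aq = ToyWatershed(), ToyAquifer()
        couple_one_stress_period(ws, aq)
        print(ws.baseflow)  # 0.3 * (12 + 4) = 4.8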

  13. 76 FR 33387 - Self-Regulatory Organizations; Notice of Filing and Immediate Effectiveness of Proposed Rule...

    Science.gov (United States)

    2011-06-08

    ... calculates the CBOE Gold ETF Volatility Index (``GVZ''), which is based on the VIX methodology applied to options on the SPDR Gold Trust (``GLD''). The current filing would permit $0.50 strike price intervals for... exchange-traded fund (``ETF'') options. See Rule 1012, Commentary .05(a)(iv). To the extent that the CBOE...

  14. Spectral sum rules and magneto-roton as emergent graviton in fractional quantum Hall effect

    Energy Technology Data Exchange (ETDEWEB)

    Golkar, Siavash; Nguyen, Dung X.; Son, Dam T. [Enrico Fermi Institute, James Franck Institute and Department of Physics,University of Chicago, Chicago, Illinois 60637 (United States)

    2016-01-05

    We consider gapped fractional quantum Hall states on the lowest Landau level when the Coulomb energy is much smaller than the cyclotron energy. We introduce two spectral densities, ρ_T(ω) and ρ̄_T(ω), which are proportional to the probabilities of absorption of circularly polarized gravitons by the quantum Hall system. We prove three sum rules relating these spectral densities with the shift S, the q⁴ coefficient of the static structure factor S₄, and the high-frequency shear modulus of the ground state μ_∞, which is precisely defined. We confirm an inequality, first suggested by Haldane, that S₄ is bounded from below by |S−1|/8. The Laughlin wavefunction saturates this bound, which we argue to imply that systems with ground state wavefunctions close to Laughlin's absorb gravitons of predominantly one circular polarization. We consider a nonlinear model in which the sum rules are saturated by a single magneto-roton mode. In this model, the magneto-roton arises from mixing between oscillations of an internal metric and the hydrodynamic motion. Implications for experiments are briefly discussed.

  15. Methodologies for rapid evaluation of seismic demand levels in nuclear power plant structures

    International Nuclear Information System (INIS)

    Manrique, M.; Asfura, A.; Mukhim, G.

    1990-01-01

    A methodology for rapid assessment of both acceleration spectral peak and 'zero period acceleration' (ZPA) values for virtually any major structure in a nuclear power plant is presented. The methodology is based on spectral peak and ZPA amplification factors, developed from regression analyses of an analytical database. The developed amplification factors are applied to the plant's design ground spectrum to obtain amplified response parameters. A practical application of the methodology is presented. This paper also presents a methodology for calculating acceleration response spectrum curves at any number of desired damping ratios directly from a single known damping ratio spectrum. The methodology presented is particularly useful and directly applicable to older vintage nuclear power plant facilities (e.g., those affected by USI A-46). The methodology is based on principles of random vibration theory and has been implemented in a computer program (SPECGEN). SPECGEN results are compared with results obtained from time history analyses. (orig.)
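
    SPECGEN itself derives cross-damping spectra from random vibration theory; a much simpler, widely used approximation for the same task is a closed-form damping correction factor, such as the Eurocode 8 form eta = sqrt(10 / (5 + xi)) with xi in percent of critical damping. The sketch below applies it as a ratio between two damping levels and is offered only as a hedged illustration, not the SPECGEN algorithm.

      import math

      def rescale_spectrum(sa_known, xi_known=5.0, xi_target=2.0):
          """Rescale spectral ordinates from one damping ratio (in %) to another."""
          # Ratio of Eurocode 8 correction factors eta = sqrt(10 / (5 + xi))
          eta = math.sqrt((5.0 + xi_known) / (5.0 + xi_target))
          return [sa * eta for sa in sa_known]

      sa_5pct = [0.20, 0.55, 0.80, 0.60, 0.30]   # spectral accelerations (g), assumed
      print(rescale_spectrum(sa_5pct))            # 2%-damped estimate: larger ordinates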

  16. Navigating the Process of Ethical Approval: A methodological note

    Directory of Open Access Journals (Sweden)

    Eileen Carey, RNID, BSc (Hons), MSc

    2010-12-01

    Classic grounded theory (CGT) methodology is a general methodology whereby the researcher aims to develop an emergent conceptual theory from empirical data collected during the research study. Gaining ethical approval from relevant ethics committees to access such data is the starting point for a CGT study. The adoption of the Universal Declaration on Bioethics and Human Rights (UNESCO, 2005) is an indication of global consensus on the importance of research ethics. There is, however, wide variation in health research systems across countries and disciplines (Hearnshaw, 2004). Institutional Review Boards (IRBs) or Research Ethics Committees (RECs) have been established in many countries to regulate ethical research, ensuring that researchers agree to, and adhere to, specific ethical and methodological conditions before ethical approval is granted. Interestingly, both the processes and outcomes through which the methodological aspects pertinent to CGT studies are agreed between the researcher and the ethics committee remain largely ambiguous and vague. Therefore, meeting the requirements for ethical approval from ethics committees while enlisting CGT as the chosen research approach can be daunting for novice researchers embarking upon their first CGT study.

  17. Action Rules Mining

    CERN Document Server

    Dardzinska, Agnieszka

    2013-01-01

    We are surrounded by data, numerical, categorical and otherwise, which must be analyzed and processed to convert it into information that instructs, answers or aids understanding and decision making. Data analysts in many disciplines, such as business, education or medicine, are frequently asked to analyze new data sets which are often composed of numerous tables possessing different properties. They try to find completely new correlations between attributes and show new possibilities for users. Action rules mining discusses some data mining and knowledge discovery principles and then describes representative concepts, methods and algorithms connected with action rules. The author introduces the formal definition of an action rule, the notions of a simple association action rule and a representative action rule, and the cost of an association action rule, and gives a strategy for constructing simple association action rules of lowest cost. A new approach for generating action rules from datasets with numerical attributes...
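
    As a rough sketch of the objects the book formalizes (hypothetical field names, not the author's code), an action rule pairs stable-attribute conditions with cost-bearing changes to flexible attributes that move objects toward a desired decision class:

      from dataclasses import dataclass, field

      @dataclass
      class AtomicAction:
          attribute: str       # flexible attribute to change
          from_value: str
          to_value: str
          cost: float          # domain-expert cost of enforcing this change

      @dataclass
      class ActionRule:
          stable: dict                      # stable-attribute conditions, e.g. {"sex": "F"}
          actions: list = field(default_factory=list)
          decision_from: str = "low_spender"
          decision_to: str = "high_spender"

          def total_cost(self):
              # Cost of an association action rule: sum of atomic action costs
              return sum(a.cost for a in self.actions)

      rule = ActionRule(stable={"sex": "F"},
                        actions=[AtomicAction("plan", "basic", "premium", cost=2.0)])
      print(rule.total_cost())   # 2.0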

  18. Application of code scaling applicability and uncertainty methodology to the large break loss of coolant

    International Nuclear Information System (INIS)

    Young, M.Y.; Bajorek, S.M.; Nissley, M.E.

    1998-01-01

    In the late 1980s, after completion of an extensive research program, the United States Nuclear Regulatory Commission (USNRC) amended its regulations (10CFR50.46) to allow the use of realistic physical models to analyze the loss of coolant accident (LOCA) in light water reactors. Prior to this time, the evaluation of this accident was subject to a prescriptive set of rules (Appendix K of the regulations) requiring conservative models and assumptions to be applied simultaneously, leading to very pessimistic estimates of the impact of this accident on the reactor core. The rule change therefore promised to provide significant benefits to owners of power reactors, allowing them to increase output. In response to the rule change, a method called code scaling, applicability and uncertainty (CSAU) was developed to apply realistic methods while properly taking into account data uncertainty, uncertainty in physical modeling, and plant variability. The method was claimed to be structured, traceable, and practical, but was met with some criticism when first demonstrated. In 1996, the USNRC approved a methodology, based on CSAU, developed by a group led by Westinghouse. The lessons learned in this application of CSAU will be summarized. Some of the issues raised concerning the validity and completeness of the CSAU methodology will also be discussed. (orig.)

  19. Philosophical Roots of Classical Grounded Theory: Its Foundations in Symbolic Interactionism

    Science.gov (United States)

    Aldiabat, Khaldoun M.; Le Navenec, Carole-Lynne

    2011-01-01

    Although many researchers have discussed the historical relationship between the Grounded Theory methodology and Symbolic Interactionism, they have not clearly articulated the congruency of their salient concepts and assumptions. The purpose of this paper is to provide a thorough discussion of this congruency. A hypothetical example about smoking…

  20. New narrow boson resonances and SU(4) symmetry: Selection rules, SU(4) mixing, and mass formulas

    International Nuclear Information System (INIS)

    Takasugi, E.; Oneda, S.

    1975-01-01

    General SU(4) sum rules are obtained for bosons in the theoretical framework of asymptotic SU(4), chiral SU(4)⊗SU(4) charge algebra, and a simple mechanism of SU(4) and chiral SU(4)⊗SU(4) breaking. The sum rules exhibit a remarkable interplay of the masses, SU(4) mixing angles, and axial-vector matrix elements of 16-plet boson multiplets. Under a particular circumstance (i.e., in the "ideal" limit) this interplay produces selection rules which may explain the remarkable stability of the newly found narrow boson resonances. General SU(4) mass formulas and inter-SU(4)-multiplet mass relations are derived, and the SU(4) mixing parameters are completely determined. The ground state 1⁻⁻ and 0⁻⁺ 16-plets are discussed in particular, and the masses of the charmed and uncharmed new members of these multiplets are predicted.

  1. Scalable rule-based modelling of allosteric proteins and biochemical networks.

    Directory of Open Access Journals (Sweden)

    Julien F Ollivier

    2010-11-01

    Much of the complexity of biochemical networks comes from the information-processing abilities of allosteric proteins, be they receptors, ion channels, signalling molecules or transcription factors. An allosteric protein can be uniquely regulated by each combination of input molecules that it binds. This "regulatory complexity" causes a combinatorial increase in the number of parameters required to fit experimental data as the number of protein interactions increases. It therefore challenges the creation, updating, and re-use of biochemical models. Here, we propose a rule-based modelling framework that exploits the intrinsic modularity of protein structure to address regulatory complexity. Rather than treating proteins as "black boxes", we model their hierarchical structure and their internal dynamics as conformational changes. By modelling the regulation of allosteric proteins through these conformational changes, we often decrease the number of parameters required to fit data, and so reduce over-fitting and improve the predictive power of a model. Our method is thermodynamically grounded, imposes detailed balance, and also includes molecular cross-talk and the background activity of enzymes. We use our Allosteric Network Compiler to examine how allostery can facilitate macromolecular assembly and how competitive ligands can change the observed cooperativity of an allosteric protein. We also develop a parsimonious model of G protein-coupled receptors that explains functional selectivity and can predict the rank order of potency of agonists acting through a receptor. Our methodology should provide a basis for scalable, modular and executable modelling of biochemical networks in systems and synthetic biology.
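
    A minimal, classical illustration of regulation through conformational change is the two-state Monod-Wyman-Changeux (MWC) model; the paper's Allosteric Network Compiler generalizes far beyond this, so the sketch below is only a flavor of the underlying idea.

      def mwc_active_fraction(x, n=4, L=1000.0, c=0.01):
          """Fraction of protein in the active (R) conformation.

          x: ligand concentration in units of K_R; n: number of binding sites;
          L: [T]/[R] equilibrium constant at zero ligand; c = K_R / K_T.
          """
          r = (1.0 + x) ** n
          t = L * (1.0 + c * x) ** n
          return r / (r + t)

      for x in (0.0, 1.0, 10.0, 100.0):
          print(x, round(mwc_active_fraction(x), 3))  # sigmoidal (cooperative) activation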

  2. Integrity Checking and Maintenance with Active Rules in XML Databases

    DEFF Research Database (Denmark)

    Christiansen, Henning; Rekouts, Maria

    2007-01-01

    While specification languages for integrity constraints for XML data have been considered in the literature, actual technologies and methodologies for checking and maintaining integrity are still in their infancy. Triggers, or active rules, which are widely used in previous technologies for the p...... updates, the method indicates trigger conditions and correctness criteria to be met by the trigger code supplied by a developer or possibly automatic methods. We show examples developed in the Sedna XML database system which provides a running implementation of XML triggers....

  3. Common ground: An environmental ethic for Los Alamos National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Menlove, F.L.

    1991-01-01

    Three predominant philosophies have characterized American business ethical thinking over the past several decades. The first phase is the "ethics of self-interest," which argues that maximizing self-interest coincidentally maximizes the common good. The second phase is "legality ethics." Proponents argue that what is important is knowing the rules and following them scrupulously. The third phase might be called "stakeholder ethics." A central tenet is that everyone affected by a decision has a moral hold on the decision maker. This paper discusses one recent initiative of the Los Alamos National Laboratory to move beyond rules and regulations toward an environmental ethic that integrates the values of stakeholder ethics into the Laboratory's historical culture and value systems. These Common Ground Principles are described. 11 refs.

  4. Methods for prediction of strong earthquake ground motion. Final technical report, October 1, 1976--September 30, 1977

    International Nuclear Information System (INIS)

    Trifunac, M.D.

    1977-09-01

    The purpose of this report is to summarize the results of the work on characterization of strong earthquake ground motion. The objective of this effort has been to initiate the presentation of a simple yet detailed methodology for characterizing strong earthquake ground motion for use in licensing and evaluation of operating nuclear power plants. This report emphasizes the simplicity of the methodology by presenting only the end results, in a format that may be useful for the development of site-specific criteria in seismic risk analysis, for work on the development of modern standards and regulatory guides, and for re-evaluation of existing power plant sites.

  5. Developing an optimal valve closing rule curve for real-time pressure control in pipes

    Energy Technology Data Exchange (ETDEWEB)

    Bazarganlari, Mohammad Reza; Afshar, Hossein [Islamic Azad University, Tehran (Iran, Islamic Republic of); Kerachian, Reza [University of Tehran, Tehran (Iran, Islamic Republic of); Bashiazghadi, Seyyed Nasser [Iran University of Science and Technology, Tehran (Iran, Islamic Republic of)

    2013-01-15

    Sudden valve closure in pipeline systems can cause high pressures that may lead to serious damage. Using an optimal valve closing rule can play an important role in managing extreme pressures during sudden valve closure. In this paper, an optimal closing rule curve is developed using a multi-objective optimization model and Bayesian networks (BNs) for controlling water pressure during valve closure, instead of traditional step functions or single linear functions. The method of characteristics is used to simulate transient flow caused by valve closure. The Non-dominated Sorting Genetic Algorithm-II (NSGA-II) is used to develop a Pareto front among three objectives related to the maximum and minimum water pressures and the amount of water passing through the valve during the valve-closing process. Simulation and optimization processes are usually time-consuming; the results of the optimization model are therefore used to train the BN. The trained BN is capable of determining optimal real-time closing rules without running costly simulation and optimization models. To demonstrate its efficiency, the proposed methodology is applied to a reservoir-pipe-valve system and the optimal closing rule curve is calculated for the valve. The results of the linear and BN-based valve closure rules show that the latter can significantly reduce the range of variations in water hammer pressures.
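
    A back-of-envelope bound that motivates optimized closure curves is the Joukowsky relation, which gives the head rise for an instantaneous velocity change; staged or optimized closure rules exist precisely to stay well below this extreme. The values below are illustrative only, not the paper's MOC/NSGA-II/BN pipeline.

      def joukowsky_head_rise(wave_speed, delta_v, g=9.81):
          """Head rise (m) for an instantaneous velocity change delta_v (m/s)."""
          return wave_speed * delta_v / g

      a = 1200.0   # pressure-wave speed in the pipe (m/s), assumed value
      v0 = 2.0     # initial flow velocity brought to rest (m/s)
      print(joukowsky_head_rise(a, v0))  # ~244.6 m of head: why slow, staged closure matters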

  6. Methodological possibilities for using the electron and ion energy balance in thermospheric complex measurements

    International Nuclear Information System (INIS)

    Serafimov, K.B.; Serafimova, M.K.

    1991-01-01

    A combination of ground-based measurements for the determination of basic thermospheric characteristics is proposed. An expression for the energy transport between components of space plasma is also derived and discussed within the framework of the presented methodology, which can be divided into the following major sections: 1) application of ionosonde and absorption measurements, and TEC measurements using Faraday rotation or the differential Doppler effect; 2) ground-based airglow measurements; 3) airglow and plasma satellite measurements. 9 refs.

  7. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency for the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but these data are not provided by this document.

  8. Electrical design of TNS

    International Nuclear Information System (INIS)

    Heck, F.M.; Schultz, J.H.; Smeltzer, G.S.

    1977-01-01

    The electrical design of the ORNL-Westinghouse next step (TNS) fusion reactor was begun in 1976, using a set of ground rules based on the overall program objectives. These objectives were to identify the design of reasonably priced reactors which would achieve ignition and be technology forcing. The term "technology forcing" was understood to mean the desirability of a large number of ignited D-T pulses and the incorporation of superconducting toroidal field (TF) coils, if at all possible. A trade study methodology was developed to compare different machine sizes and TF coil technologies and to aid in the selection of system and subsystem design approaches. The logic which led from the program objectives to the design ground rules, and from the ground rules to the circuit selection, is described below. The circuit design approaches were generalized, and these models were incorporated into a computer program (COAST) which was used to examine the cost of overall tokamak systems as key design parameters were varied.

  9. Proof of Kochen–Specker Theorem: Conversion of Product Rule to Sum Rule

    International Nuclear Information System (INIS)

    Toh, S.P.; Zainuddin, Hishamuddin

    2009-01-01

    Valuation functions of observables in quantum mechanics are often expected to obey two constraints called the sum rule and the product rule. However, the Kochen–Specker (KS) theorem shows that for a Hilbert space of quantum mechanics of dimension d ≥ 3, these constraints individually contradict the assumption of value definiteness. The two rules are not unrelated, and Peres [Found. Phys. 26 (1996) 807] has conceived a method of converting the product rule into a sum rule for the case of two qubits. Here we apply this method to a proof provided by Mermin based on the product rule for a three-qubit system involving nine operators. We provide the conversion of this proof to one based on the sum rule involving ten operators. (general)
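
    The product-rule obstruction at the heart of such proofs is easy to check numerically. The sketch below verifies the classic four-operator GHZ variant of Mermin's argument (not the nine- and ten-operator sets treated in the paper): four commuting three-qubit observables multiply to -I, so no assignment of definite values +/-1 can obey the product rule.

      import numpy as np

      X = np.array([[0, 1], [1, 0]], dtype=complex)
      Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

      def kron3(a, b, c):
          return np.kron(np.kron(a, b), c)

      # XXX, XYY, YXY, YYX mutually commute, each has eigenvalues +/-1 ...
      ops = [kron3(X, X, X), kron3(X, Y, Y), kron3(Y, X, Y), kron3(Y, Y, X)]
      prod = np.linalg.multi_dot(ops)
      print(np.allclose(prod, -np.eye(8)))  # True: ... yet their product is -I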

  10. A SEMI-AUTOMATIC RULE SET BUILDING METHOD FOR URBAN LAND COVER CLASSIFICATION BASED ON MACHINE LEARNING AND HUMAN KNOWLEDGE

    Directory of Open Access Journals (Sweden)

    H. Y. Gu

    2017-09-01

    Classification rule sets are important for land cover classification; they comprise features and decision rules. The selection of features and decisions is usually based on an iterative trial-and-error approach, as often utilized in GEOBIA; however, this is time-consuming and has poor versatility. This study puts forward a rule-set building method for land cover classification based on human knowledge and machine learning. Machine learning is used to build rule sets effectively, overcoming the iterative trial-and-error approach; human knowledge is used to address the insufficient use of prior knowledge in existing machine learning methods and to improve the versatility of the rule sets. A two-step workflow is introduced: first, an initial rule set is built based on Random Forest and a CART decision tree; second, the initial rule set is analyzed and validated based on human knowledge, where a statistical confidence interval is used to determine its thresholds. The test site is located in Potsdam City. We utilised TOP, DSM and ground truth data. The results show that the method can determine rule sets for land cover classification semi-automatically, and that there are static features for the different land cover classes.
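
    The first step, learning an initial rule set from samples, can be sketched with standard tools. The illustration below uses scikit-learn's CART implementation on synthetic data with hypothetical feature names (not the paper's Potsdam dataset); export_text prints the learned thresholded rules, which an analyst would then adjust against prior knowledge as in step two.

      from sklearn.datasets import make_classification
      from sklearn.tree import DecisionTreeClassifier, export_text

      # Synthetic stand-in for object features extracted from TOP/DSM imagery
      X, y = make_classification(n_samples=300, n_features=4, random_state=0)
      features = ["ndvi", "dsm_height", "brightness", "texture"]  # hypothetical names

      cart = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
      print(export_text(cart, feature_names=features))  # e.g. "dsm_height <= 0.42 ..."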

  11. A Novel Methodology for Improving Plant Pest Surveillance in Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data.

    Science.gov (United States)

    Vanegas, Fernando; Bratanov, Dmitry; Powell, Kevin; Weiss, John; Gonzalez, Felipe

    2018-01-17

    Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used (the sensors, the UAV, and the flight operations), the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remote sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such methodology would provide researchers, agronomists, and UAV practitioners reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications.

  12. A Novel Methodology for Improving Plant Pest Surveillance in Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data

    Science.gov (United States)

    Vanegas, Fernando; Weiss, John; Gonzalez, Felipe

    2018-01-01

    Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a (UAV) remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used—the sensors, the UAV, and the flight operations—the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remote sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such methodology would provide researchers, agronomists, and UAV practitioners reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications. PMID:29342101

  13. Synthetic strong ground motions for engineering design utilizing empirical Green's functions

    Energy Technology Data Exchange (ETDEWEB)

    Hutchings, L.J.; Jarpe, S.P.; Kasameyer, P.W.; Foxall, W.

    1996-04-11

    We present a methodology for developing realistic synthetic strong ground motions for specific sites from specific earthquakes. We analyzed the possible ground motion resulting from an M = 7.25 earthquake that ruptures 82 km of the Hayward fault for a site 1.4 km from the fault in the eastern San Francisco Bay area. We developed a suite of 100 rupture scenarios for the Hayward fault earthquake and computed the corresponding strong ground motion time histories. We synthesized strong ground motion with physics-based solutions of earthquake rupture and applied physical bounds on rupture parameters. By having a suite of rupture scenarios of hazardous earthquakes for a fixed magnitude and identifying the hazard to the site from the statistical distribution of engineering parameters, we introduce a probabilistic component into the deterministic hazard calculation. Engineering parameters of the synthesized ground motions agree with those recorded from the 1995 Kobe, Japan and the 1992 Landers, California earthquakes at similar distances and site geologies.
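
    Because the suite of rupture scenarios yields a distribution of each engineering parameter rather than a single value, the hazard statement is read off empirical percentiles. A minimal sketch with invented numbers:

      import numpy as np

      # Stand-in for the engineering parameter computed from each of the 100
      # scenario time histories; the lognormal values here are invented.
      rng = np.random.default_rng(1)
      pga = rng.lognormal(mean=np.log(0.45), sigma=0.35, size=100)  # peak ground accel. (g)

      for p in (50, 84, 95):
          print(f"{p}th-percentile PGA: {np.percentile(pga, p):.2f} g")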

  14. The Reasons Young Children Give to Peers When Explaining Their Judgments of Moral and Conventional Rules

    Science.gov (United States)

    Mammen, Maria; Köymen, Bahar; Tomasello, Michael

    2018-01-01

    Moral justifications work, when they do, by invoking values that are shared in the common ground of the interlocutors. We asked 3- and 5-year-old peer dyads (N = 144) to identify and punish norm transgressors. In the moral condition, the transgressor violated a moral norm (e.g., by stealing); in the social rules condition, she/he violated a…

  15. Using Modern Methodologies with Maintenance Software

    Science.gov (United States)

    Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.

    2014-01-01

    Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either class B (mission critical) or class C (mission important). Scheduling tasks is difficult because mission needs must be addressed before any other tasks, and those needs often arise unexpectedly. Keeping track of what everyone is working on is also difficult because each person works on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks to be completed within a sprint based on priority; the team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. In the Scrum methodology there is a scrum master, a facilitator who tries to make sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology: it has many software applications in maintenance, team members working on disparate applications, and many users, and its work can be interrupted by mission needs, issues, and requirements. In order to use Scrum, the methodology needed adaptation to MPS. Scrum was chosen because it is adaptable. This paper is about the development of the process for using Scrum, a new development methodology, with a team that works on disparate interruptible tasks on multiple software applications.

  16. THE MANAGEMENT OF SCIENTIFIC-METHODOLOGICAL WORK IN THE INSTITUTIONS OF TECHNICAL AND PROFESSIONAL EDUCATION IN CUBA

    OpenAIRE

    Lina Margarita Ramírez Lahera; Jorge González Ramírez

    2017-01-01

    This work presents research on the management of scientific-methodological work in Cuban technical and professional (polytechnic) education institutions. It addresses the need for this management to progress and develop in order to obtain better results in the scientific preparation of teachers and in their development and self-development, thereby raising the quality of the educational teaching process, taking into account the new changes in the methodological work rules, which state th...

  17. Estimating and validating ground-based timber harvesting production through computer simulation

    Science.gov (United States)

    Jingxin Wang; Chris B. LeDoux

    2003-01-01

    Estimating the production of ground-based timber harvesting systems with an object-oriented methodology was investigated. The estimation model developed generates stands of trees; simulates felling by chain saw, drive-to-tree feller-buncher, and swing-to-tree single-grip harvester, and extraction by grapple skidder and forwarder; and analyzes costs and productivity. It also...

  18. Toward a Methodology of Death: Deleuze's "Event" as Method for Critical Ethnography

    Science.gov (United States)

    Rodriguez, Sophia

    2016-01-01

    This article examines how qualitative researchers, specifically ethnographers, might utilize complex philosophical concepts in order to disrupt the normative truth-telling practices embedded in social science research. Drawing on my own research experiences, I move toward a methodology of death (for researcher/researched alike) grounded in…

  19. A physical interpretation of the Titius-Bode rule and its connection to the closed orbits of Bertrand's theorem

    Science.gov (United States)

    Christodoulou, Dimitris M.; Kazanas, Demosthenes

    2017-12-01

    We consider the geometric Titius-Bode rule for the semimajor axes of planetary orbits. We derive an equivalent rule for the midpoints of the segments between consecutive orbits along the radial direction and we interpret it physically in terms of the work done in the gravitational field of the Sun by particles whose orbits are perturbed around each planetary orbit. On such energetic grounds, it is not surprising that some exoplanets in multiple-planet extrasolar systems obey the same relation. However, it is surprising that this simple interpretation of the Titius-Bode rule also reveals new properties of the bound closed orbits predicted by Bertrand’s theorem, which has been known since 1873.
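
    For orientation, the familiar arithmetic statement of the rule is easily tabulated against published semimajor axes; the paper itself works with the geometric form (constant ratio of consecutive semimajor axes) and the midpoints between orbits. An illustrative script:

      def titius_bode(k):
          """Classic form a = 0.4 + 0.3 * 2**k AU, with k = -inf, 0, 1, 2, ..."""
          return 0.4 if k is None else 0.4 + 0.3 * 2 ** k

      bodies = [("Mercury", None, 0.39), ("Venus", 0, 0.72), ("Earth", 1, 1.00),
                ("Mars", 2, 1.52), ("Ceres", 3, 2.77), ("Jupiter", 4, 5.20),
                ("Saturn", 5, 9.54), ("Uranus", 6, 19.19)]
      for name, k, actual in bodies:
          print(f"{name:8s} predicted {titius_bode(k):5.1f} AU, actual {actual:5.2f} AU")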

  20. Using Rule-Based Computer Programming to Unify Communication Rules Research.

    Science.gov (United States)

    Sanford, David L.; Roach, J. W.

    This paper proposes the use of a rule-based computer programming language as a standard for the expression of rules, arguing that the adoption of a standard would enable researchers to communicate about rules in a consistent and significant way. Focusing on the formal equivalence of artificial intelligence (AI) programming to different types of…

  1. Earthquake strong ground motion studies at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Wong, Ivan; Silva, W.; Darragh, R.; Stark, C.; Wright, D.; Jackson, S.; Carpenter, G.; Smith, R.; Anderson, D.; Gilbert, H.; Scott, D.

    1989-01-01

    Site-specific strong earthquake ground motions have been estimated for the Idaho National Engineering Laboratory assuming that an event similar to the 1983 M_s 7.3 Borah Peak earthquake occurs at epicentral distances of 10 to 28 km. The strong ground motion parameters have been estimated based on a methodology incorporating the Band-Limited White-Noise ground motion model coupled with Random Vibration Theory. A 16-station seismic attenuation and site response survey utilizing three-component portable digital seismographs was also performed over a five-month period in 1989. Based on the recordings of regional earthquakes, the effects of seismic attenuation in the shallow crust and along the propagation path, and of local site response, were evaluated. These data, combined with a detailed geologic profile developed for each site based principally on borehole data, were used in the estimation of the strong ground motion parameters. The preliminary peak horizontal ground accelerations for individual sites range from approximately 0.15 to 0.35 g. Based on the authors' analysis, the thick sedimentary interbeds (greater than 20 m) in the basalt section attenuate ground motions, as speculated upon in a number of previous studies.

  2. Human health risk assessment methodology for the UMTRA Ground Water Project

    International Nuclear Information System (INIS)

    1994-11-01

    This document presents the method used to evaluate human risks associated with ground water contamination at inactive uranium processing sites. The intent of these evaluations is to provide the public and remedial action decision-makers with information about the health risks that might be expected at each site in a manner that is easily understood. The method (1) develops probabilistic distributions for exposure variables where sufficient data exist, (2) simulates predicted exposure distributions using Monte Carlo techniques, and (3) develops toxicity ranges that reflect human data when available, animal data if human data are insufficient, regulatory levels, and uncertainties. Risk interpretation is based on comparison of the potential exposure distributions with the derived toxicity ranges. Graphic presentations are an essential element of the semiquantitative interpretation and are expected to increase understanding by the public and decision-makers
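
    Step (2), simulating exposure distributions with Monte Carlo techniques, can be illustrated generically with a standard intake equation (concentration times ingestion rate divided by body weight). The distribution parameters below are invented, not UMTRA site data.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000
      conc = rng.lognormal(mean=np.log(0.05), sigma=0.6, size=n)  # mg/L in ground water
      ir = rng.normal(1.4, 0.3, size=n).clip(min=0.1)             # L/day water ingestion
      bw = rng.normal(70.0, 12.0, size=n).clip(min=30.0)          # kg body weight

      dose = conc * ir / bw                                        # mg/kg-day intake
      print("median dose:", np.median(dose))
      print("95th percentile:", np.percentile(dose, 95))
      print("fraction above a 1e-3 mg/kg-day benchmark:", (dose > 1e-3).mean())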

  3. Convention on nuclear safety. Rules of procedure and financial rules

    International Nuclear Information System (INIS)

    1998-01-01

    The document presents the Rules of Procedure and Financial Rules that apply mutatis mutandis to any meeting of the Contracting Parties to the Convention on Nuclear Safety (INFCIRC/449) convened in accordance with Chapter 3 of the Convention. It includes four parts: General provisions, Preparatory process for review meetings, Review meetings, and Amendment and interpretation of rules

  4. Delayed rule following.

    Science.gov (United States)

    Schmitt, D R

    2001-01-01

    Although the elements of a fully stated rule (discriminative stimulus [S(D)], some behavior, and a consequence) can occur nearly contemporaneously with the statement of the rule, there is often a delay between the rule statement and the S(D). The effects of this delay on rule following have not been studied in behavior analysis, but they have been investigated in rule-like settings in the areas of prospective memory (remembering to do something in the future) and goal pursuit. Discriminative events for some behavior can be event based (a specific setting stimulus) or time based. The latter are more demanding with respect to intention following and show age-related deficits. Studies suggest that the specificity with which the components of a rule (termed intention) are stated has a substantial effect on intention following, with more detailed specifications increasing following. Reminders of an intention, too, are most effective when they refer specifically to both the behavior and its occasion. Covert review and written notes are two effective strategies for remembering everyday intentions, but people who use notes appear not to be able to switch quickly to covert review. By focusing on aspects of the setting and rule structure, research on prospective memory and goal pursuit expands the agenda for a more complete explanation of rule effects.

  5. Investment Strategies Optimization based on a SAX-GA Methodology

    CERN Document Server

    Canelas, António M L; Horta, Nuno C G

    2013-01-01

    This book presents a new computational finance approach combining a Symbolic Aggregate approXimation (SAX) technique with an optimization kernel based on genetic algorithms (GA). While the SAX representation is used to describe the financial time series, the evolutionary optimization kernel is used to identify the most relevant patterns and generate investment rules. The proposed approach considers several different chromosome structures in order to achieve better results on the trading platform. The methodology presented in this book has great potential in investment markets.
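
    The SAX step is standard and compact enough to sketch: z-normalize the series, reduce it by piecewise aggregate approximation (PAA), and map segment means to symbols via Gaussian breakpoints. This is the generic technique, not the book's full GA pipeline.

      import numpy as np
      from scipy.stats import norm

      def sax(series, n_segments=8, alphabet_size=4):
          x = np.asarray(series, dtype=float)
          x = (x - x.mean()) / x.std()                  # z-normalize
          # PAA: average over equal-length segments (tail trimmed for divisibility)
          paa = x[: len(x) - len(x) % n_segments].reshape(n_segments, -1).mean(axis=1)
          # Equiprobable breakpoints under the standard normal distribution
          breakpoints = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
          symbols = np.searchsorted(breakpoints, paa)
          return "".join(chr(ord("a") + s) for s in symbols)

      prices = np.cumsum(np.random.default_rng(0).normal(size=256))  # toy price path
      print(sax(prices))   # e.g. 'ddcbaabc': the pattern string a GA would mine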

  6. Hamburg rules V Hague Visby rules an English perspective

    OpenAIRE

    Tozaj Dorian; Xhelilaj Ermal

    2010-01-01

    It has often been argued that the defences provided to carriers under Art. IV(2) of the Hague-Visby Rules almost nullify the protection guaranteed to shippers in the other provisions of that convention. It is therefore argued that an all-embracing, universal, shipper-friendly convention, namely the Hamburg Rules, needs to be incorporated in all countries in order to address this issue and fully satisfy the intentions of the parties for the establishment of international rules in international trade.

  7. Methodology of theory of stage-by-stage long-term preparation of sportsmen in single combats

    Directory of Open Access Journals (Sweden)

    Arziutov G.

    2010-04-01

    Results of research on the methodology of the theory of stage-by-stage long-term preparation of sportsmen in single combats are presented. The structured character of the theory lies in the possibility of simply verifying its substantive provisions, principles and laws. Developing the methodology makes it possible to begin creating a trainer's map for the stages of long-term preparation; the laws, regularities, principles and rules must be collected in this map. The map enables trainers in reserve sport to use its content during all stages of a sportsman's preparation.

  8. Methodological Aspects of Building Science-based Sports Training System for Taekwondo Sportsmen

    Directory of Open Access Journals (Sweden)

    Ananchenko Konstantin

    2016-10-01

    The authors address topical scientific problems in the article: (1) the research base for constructing the theoretical and methodological foundations of sports training in taekwondo is analysed; (2) the organizational and methodological requirements for taekwondo training sessions are examined; (3) the necessity of linking the processes of natural development and adaptation to physical activity in young taekwondo sportsmen is grounded; and (4) the need for scientific evidence in building young fighters' training loads in microcycles, based on their individualization, is demonstrated.

  9. Figure and ground in the visual cortex: V2 combines stereoscopic cues with Gestalt rules.

    Science.gov (United States)

    Qiu, Fangtu T; von der Heydt, Rüdiger

    2005-07-07

    Figure-ground organization is a process by which the visual system identifies some image regions as foreground and others as background, inferring 3D layout from 2D displays. A recent study reported that edge responses of neurons in area V2 are selective for side-of-figure, suggesting that figure-ground organization is encoded in the contour signals (border ownership coding). Here, we show that area V2 combines two strategies of computation, one that exploits binocular stereoscopic information for the definition of local depth order, and another that exploits the global configuration of contours (Gestalt factors). These are combined in single neurons so that the "near" side of the preferred 3D edge generally coincides with the preferred side-of-figure in 2D displays. Thus, area V2 represents the borders of 2D figures as edges of surfaces, as if the figures were objects in 3D space. Even in 3D displays, Gestalt factors influence the responses and can enhance or null the stereoscopic depth information.

  10. Resolving task rule incongruence during task switching by competitor rule suppression.

    Science.gov (United States)

    Meiran, Nachshon; Hsieh, Shulan; Dimov, Eduard

    2010-07-01

    Task switching requires maintaining readiness to execute any task of a given set of tasks. However, when tasks switch, the readiness to execute the now-irrelevant task generates interference, as seen in the task rule incongruence effect. Overcoming such interference requires fine-tuned inhibition that impairs task readiness only minimally. In an experiment involving 2 object classification tasks and 2 location classification tasks, the authors show that irrelevant task rules that generate response conflicts are inhibited. This competitor rule suppression (CRS) is seen in response slowing in subsequent trials, when the competing rules become relevant. CRS is shown to operate on specific rules without affecting similar rules. CRS and backward inhibition, which is another inhibitory phenomenon, produced additive effects on reaction time, suggesting their mutual independence. Implications for current formal theories of task switching as well as for conflict monitoring theories are discussed. (c) 2010 APA, all rights reserved

  11. Drift design methodology and preliminary application for the Yucca Mountain Site Characterization Project

    International Nuclear Information System (INIS)

    Hardy, M.P.; Bauer, S.J.

    1991-12-01

    Excavation stability in an underground nuclear waste repository is required during construction, emplacement, retrieval (if required), and closure phases to ensure worker health and safety, and to prevent development of potential pathways for radionuclide migration in the post-closure period. Stable excavations are developed by appropriate excavation procedures, design of the room shape, design and installation of rock support reinforcement systems, and implementation of appropriate monitoring and maintenance programs. In addition to the loads imposed by the in situ stress field, the repository drifts will be impacted by thermal loads developed after waste emplacement and, periodically, by seismic loads from naturally occurring earthquakes and underground nuclear events. A priori evaluation of stability is required for design of the ground support system, to confirm that the thermal loads are reasonable, and to support the license application process. In this report, a design methodology for assessing drift stability is presented. This is based on site conditions, together with empirical and analytical methods. Analytical numerical methods are emphasized at this time because empirical data are unavailable for excavations in welded tuff either at elevated temperatures or under seismic loads. The analytical methodology incorporates analysis of rock masses that are systematically jointed, randomly jointed, and sparsely jointed. In situ thermal and seismic loads are considered. Methods of evaluating the analytical results and estimating ground support requirements for the full range of expected ground conditions are outlined. The results of a preliminary application of the methodology using the limited available data are presented. 26 figs., 55 tabs

  12. Electronuclear sum rules

    International Nuclear Information System (INIS)

    Arenhoevel, H.; Drechsel, D.; Weber, H.J.

    1978-01-01

    Generalized sum rules are derived by integrating the electromagnetic structure functions along lines of constant ratio of momentum and energy transfer. For non-relativistic systems these sum rules are related to the conventional photonuclear sum rules by a scaling transformation. The generalized sum rules are connected with the absorptive part of the forward scattering amplitude of virtual photons. The analytic structure of the scattering amplitudes and the possible existence of dispersion relations have been investigated in schematic relativistic and non-relativistic models. While for the non-relativistic case analyticity does not hold, the relativistic scattering amplitude is analytical for time-like (but not for space-like) photons and relations similar to the Gell-Mann-Goldberger-Thirring sum rule exist. (Auth.)

  13. Putting Foucault to work: an approach to the practical application of Foucault's methodological imperatives

    Directory of Open Access Journals (Sweden)

    DAVID A. NICHOLLS

    2009-01-01

    This paper presents an overview of the methodological approach taken in a recently completed Foucauldian discourse analysis of physiotherapy practice. In keeping with other approaches common to postmodern research, this paper resists the temptation to define a proper or 'correct' interpretation of Foucault's methodological oeuvre, preferring instead to apply a range of Foucauldian propositions to examples drawn directly from the thesis. In the paper I elucidate the blended archaeological and genealogical approach I took and unpack some of the key imperatives, principles and rules I grappled with in completing the thesis.

  14. Evaluating perceptual integration: uniting response-time- and accuracy-based methodologies.

    Science.gov (United States)

    Eidels, Ami; Townsend, James T; Hughes, Howard C; Perry, Lacey A

    2015-02-01

    This investigation brings together a response-time system identification methodology (e.g., Townsend & Wenger Psychonomic Bulletin & Review 11, 391-418, 2004a) and an accuracy methodology, intended to assess models of integration across stimulus dimensions (features, modalities, etc.) that were proposed by Shaw and colleagues (e.g., Mulligan & Shaw Perception & Psychophysics 28, 471-478, 1980). The goal was to theoretically examine these separate strategies and to apply them conjointly to the same set of participants. The empirical phases were carried out within an extension of an established experimental design called the double factorial paradigm (e.g., Townsend & Nozawa Journal of Mathematical Psychology 39, 321-359, 1995). That paradigm, based on response times, permits assessments of architecture (parallel vs. serial processing), stopping rule (exhaustive vs. minimum time), and workload capacity, all within the same blocks of trials. The paradigm introduced by Shaw and colleagues uses a statistic formally analogous to that of the double factorial paradigm, but based on accuracy rather than response times. We demonstrate that the accuracy measure cannot discriminate between parallel and serial processing. Nonetheless, the class of models supported by the accuracy data possesses a suitable interpretation within the same set of models supported by the response-time data. The supported model, consistent across individuals, is parallel and has limited capacity, with the participants employing the appropriate stopping rule for the experimental setting.

  15. Ground-Fault Characteristic Analysis of Grid-Connected Photovoltaic Stations with Neutral Grounding Resistance

    Directory of Open Access Journals (Sweden)

    Zheng Li

    2017-11-01

    In centralized grid-connected photovoltaic (PV) stations, grounding the neutral point through a resistance is a widely adopted method, but it can potentially render pre-existing protection systems invalid and threaten the safety of power grids. Therefore, studying the fault characteristics of grid-connected PV systems and their impact on power-grid protection is of great importance. Based on an analysis of the grid structure of a grid-connected PV system and of the low-voltage ride-through control characteristics of a photovoltaic power supply, this paper proposes a short-circuit calculation model and a fault-calculation method for this kind of system. With respect to changes in system parameters, particularly the resistance connected to the neutral point, and their possible impact on protective actions, this paper obtains the general behaviour of the short-circuit current characteristics through simulation, which provides a reference for devising protection configurations.

  16. TFTR grounding scheme and ground-monitor system

    International Nuclear Information System (INIS)

    Viola, M.

    1983-01-01

    The Tokamak Fusion Test Reactor (TFTR) grounding system utilizes a single-point ground. It is located directly under the machine, at the basement floor level, and is tied to the building perimeter ground. Wired to this single-point ground, via individual 500 MCM insulated cables, are: the vacuum vessel; four toroidal field coil cases/inner support structure quadrants; umbrella structure halves; the substructure ring girder; radial beams and columns; and the diagnostic systems. Prior to the first machine operation, a ground-loop removal program was initiated. It required insulation of all hangers and supports (within a 35-foot radius of the center of the machine) of the various piping, conduits, cable trays, and ventilation systems. A special ground-monitor system was designed and installed. It actively monitors each of the individual machine grounds to insure that there are no inadvertent ground loops within the machine structure or its ground and that the machine grounds are intact prior to each pulse. The TFTR grounding system has proven to be a very manageable system and one that is easy to maintain

  17. Hybrid Genetic Algorithm - Local Search Method for Ground-Water Management

    Science.gov (United States)

    Chiu, Y.; Nishikawa, T.; Martin, P.

    2008-12-01

    Ground-water management problems commonly are formulated as a mixed-integer, non-linear programming problem (MINLP). Relying only on conventional gradient-search methods to solve the management problem is computationally fast; however, the methods may become trapped in a local optimum. Global-optimization schemes can identify the global optimum, but convergence is very slow as the solution approaches the global optimum. In this study, we developed a hybrid optimization scheme, which includes a genetic algorithm and a gradient-search method, to solve the MINLP. The genetic algorithm identifies a near-optimal solution, and the gradient search uses the near optimum to identify the global optimum. Our methodology is applied to a conjunctive-use project in the Warren ground-water basin, California. Hi-Desert Water District (HDWD), the primary water manager in the basin, plans to construct a wastewater treatment plant to reduce future septic-tank effluent from reaching the ground-water system. The treated wastewater instead will recharge the ground-water basin via percolation ponds as part of a larger conjunctive-use strategy, subject to State regulations (e.g., minimum distances and travel times). HDWD wishes to identify the least-cost conjunctive-use strategies that control ground-water levels, meet regulations, and identify new production-well locations. As formulated, the MINLP objective is to minimize water-delivery costs subject to constraints including pump capacities, available recharge water, water-supply demand, water-level constraints, and potential new-well locations. The methodology was demonstrated by an enumerative search of the entire feasible solution space and by comparing the optimum solution with results from the branch-and-bound algorithm. The results also indicate that the hybrid method identifies the global optimum within an affordable computation time. Sensitivity analyses, which include testing different recharge-rate scenarios, pond
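
    The two-stage idea can be illustrated with stand-in tools: a population-based global search locates the basin of the optimum, and a gradient-based polish converges within it. The sketch below substitutes differential evolution for the genetic algorithm and a toy multimodal function for the water-delivery cost model.

      import numpy as np
      from scipy.optimize import differential_evolution, minimize

      def cost(x):   # toy multimodal stand-in for the water-delivery cost model
          return np.sum(x ** 2) + 10 * np.sum(1 - np.cos(2 * np.pi * x))

      bounds = [(-5.0, 5.0)] * 3
      # Stage 1: population-based global search (coarse, no local polish)
      coarse = differential_evolution(cost, bounds, seed=1, maxiter=50, polish=False)
      # Stage 2: gradient-based refinement from the near-optimal point
      refined = minimize(cost, coarse.x, method="L-BFGS-B", bounds=bounds)
      print(coarse.fun, "->", refined.fun)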

  18. Choosing the rules: distinct and overlapping frontoparietal representations of task rules for perceptual decisions.

    Science.gov (United States)

    Zhang, Jiaxiang; Kriegeskorte, Nikolaus; Carlin, Johan D; Rowe, James B

    2013-07-17

    Behavior is governed by rules that associate stimuli with responses and outcomes. Human and monkey studies have shown that rule-specific information is widely represented in the frontoparietal cortex. However, it is not known how establishing a rule under different contexts affects its neural representation. Here, we use event-related functional MRI (fMRI) and multivoxel pattern classification methods to investigate the human brain's mechanisms of establishing and maintaining rules for multiple perceptual decision tasks. Rules were either chosen by participants or specifically instructed to them, and the fMRI activation patterns representing rule-specific information were compared between these contexts. We show that frontoparietal regions differ in the properties of their rule representations during active maintenance before execution. First, rule-specific information maintained in the dorsolateral and medial frontal cortex depends on the context in which it was established (chosen vs specified). Second, rule representations maintained in the ventrolateral frontal and parietal cortex are independent of the context in which they were established. Furthermore, we found that the rule-specific coding maintained in anticipation of stimuli may change with execution of the rule: representations in context-independent regions remain invariant from maintenance to execution stages, whereas rule representations in context-dependent regions do not generalize to execution stage. The identification of distinct frontoparietal systems with context-independent and context-dependent task rule representations, and the distinction between anticipatory and executive rule representations, provide new insights into the functional architecture of goal-directed behavior.

  19. Environmental filtering is the main assembly rule of ground beetles in the forest and its edge but not in the adjacent grassland.

    Science.gov (United States)

    Magura, Tibor; Lövei, Gábor L

    2017-07-04

    In a fragmented landscape, transitional zones between neighboring habitats are common, and our understanding of community organizational forces across such habitats is important. Edge studies are numerous, but the majority of them utilize information on species richness and abundance. Abundance and taxonomic diversity, however, provide little information on the functioning and phylogeny of the co-existing species. Combining the evaluation of their functional and phylogenetic relationships, we aimed to assess whether ground beetle assemblages are deterministically or stochastically structured along grassland-forest gradients. Our results showed different community assembly rules on opposite sides of the forest edge. In the grassland, co-occurring species were functionally and phylogenetically not different from the random null model, indicating a random assembly process. Contrary to this, at the forest edge and the interior, co-occurring species showed functional and phylogenetic clustering, thus environmental filtering was the likely process structuring carabid assemblages. Community assembly in the grassland was considerably affected by asymmetrical species flows (spillover) across the forest edge: more forest species penetrated into the grassland than open-habitat and generalist species entered into the forest. This asymmetrical species flow underlines the importance of the filter function of forest edges. As unfavorable, human-induced changes to the structure, composition and characteristics of forest edges may alter their filter function, edges have to be specifically considered during conservation management. © 2017 Institute of Zoology, Chinese Academy of Sciences.

  20. Smooth criminal: convicted rule-breakers show reduced cognitive conflict during deliberate rule violations.

    Science.gov (United States)

    Jusyte, Aiste; Pfister, Roland; Mayer, Sarah V; Schwarz, Katharina A; Wirth, Robert; Kunde, Wilfried; Schönenberg, Michael

    2017-09-01

Classic findings on conformity and obedience document a strong and automatic drive of human agents to follow any type of rule or social norm. At the same time, most individuals tend to violate rules on occasion, and such deliberate rule violations have recently been shown to yield cognitive conflict for the rule-breaker. These findings indicate a persistent difficulty in suppressing the rule representation, even though rule violations were studied in a controlled experimental setting with neither gains nor possible sanctions for violators. In the current study, we validate these findings by showing that convicted criminals, i.e., individuals with a history of habitual and severe forms of rule violations, can free themselves from such cognitive conflict in a similarly controlled laboratory task. These findings support an emerging view that aims at understanding rule violations from the perspective of the violating agent rather than from the perspective of an outside observer.

  1. A Ten-Year Rule to guide the allocation of EU emission allowances

    International Nuclear Information System (INIS)

    Ahman, Markus; Burtraw, Dallas; Kruger, Joseph; Zetterberg, Lars

    2007-01-01

Member States in the European Union (EU) are responsible for National Allocation Plans governing the initial distribution of emission allowances in the CO2 Emission Trading System, including rules governing allocations to installations that close and to new entrants. The European Commission has provided guidelines to discourage the use of allocation methodologies that provide incentives affecting firms' compliance behavior, for example by rewarding one type of compliance investment over another. We find that the treatment of closures and new entrants by Member States is inconsistent with the general guidelines provided by the EU. We propose stronger EU guidance regarding closures and new entrants, a more precise compensation criterion on which to justify free allocations, and a Ten-Year Rule as a component of future EU policy that can guide a transition from current practice to an approach that places greater weight on efficiency.

  2. Simulation of olive grove gross primary production by the combination of ground and multi-sensor satellite data

    Science.gov (United States)

    Brilli, L.; Chiesi, M.; Maselli, F.; Moriondo, M.; Gioli, B.; Toscano, P.; Zaldei, A.; Bindi, M.

    2013-08-01

We developed and tested a methodology to estimate olive (Olea europaea L.) gross primary production (GPP) combining ground and multi-sensor satellite data. An eddy-covariance station placed in an olive grove in central Italy provided carbon and water fluxes over two years (2010-2011), which were used as a reference to evaluate the performance of a GPP estimation methodology based on a Monteith-type model (modified C-Fix) driven by meteorological and satellite (NDVI) data. A major issue was the treatment of the two main olive grove components, i.e. olive trees and inter-tree ground vegetation: this was addressed by simulating carbon fluxes separately within the two ecosystem layers and then recombining them. In this way the eddy-covariance GPP measurements were successfully reproduced, with the exception of two periods that followed tillage operations. For these periods, measured GPP could be approximated by considering synthetic NDVI values which simulated the expected response of inter-tree ground vegetation to tillage.
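
    An illustrative sketch of a Monteith-type light-use-efficiency GPP model with two layers recombined by area weighting, as outlined above. All parameter values, and the linear NDVI-to-fAPAR conversion, are hypothetical assumptions, not the calibrated C-Fix values.

    ```python
    # Two-layer Monteith-type GPP sketch (olive trees + inter-tree ground
    # vegetation). Coefficients below are illustrative, not calibrated.
    def monteith_gpp(par, fapar, eps_max, t_scalar, w_scalar):
        """GPP = eps_max * temperature/water scalars * fAPAR * PAR."""
        return eps_max * t_scalar * w_scalar * fapar * par

    def olive_grove_gpp(par, ndvi_tree, ndvi_ground, f_tree):
        # fAPAR approximated as a linear function of NDVI (common assumption;
        # the slope/intercept here are placeholders)
        fapar_tree = max(0.0, 1.24 * ndvi_tree - 0.168)
        fapar_ground = max(0.0, 1.24 * ndvi_ground - 0.168)
        gpp_tree = monteith_gpp(par, fapar_tree, eps_max=1.2, t_scalar=0.9, w_scalar=0.8)
        gpp_ground = monteith_gpp(par, fapar_ground, eps_max=1.0, t_scalar=0.9, w_scalar=0.6)
        # area-weighted recombination of the two ecosystem layers
        return f_tree * gpp_tree + (1.0 - f_tree) * gpp_ground

    print(olive_grove_gpp(par=10.0, ndvi_tree=0.7, ndvi_ground=0.4, f_tree=0.35))
    ```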

  3. The Rules of the Game—The Rules of the Player

    DEFF Research Database (Denmark)

    Thorhauge, Anne Mette

    2013-01-01

This article presents a critical view of the concept of rules in game studies on the basis of a case study of role-playing across media. Role-playing in its traditional form is a complex activity including a game system and a number of communicative conventions where one player takes the role of the game manager in order to implement the rules and provide a world for the other players. In online role-playing games, a programmed system simulates the rule system as well as part of the game manager's tasks, while the rest of the activity is up to the players to define. Some aspects may translate more or less unproblematically across media, others are transformed by the introduction of the programmed system. This reveals some important perspectives on the sort of rules that can be simulated in a programmed system and what this means to the concept of rules in game studies.

  4. UAV-Borne photogrammetry: a low cost 3D surveying methodology for cartographic update

    Directory of Open Access Journals (Sweden)

    Caroti Gabriella

    2017-01-01

Territorial management requires mapping support that is as up to date as possible. The regional-scale cartography update cycle is in the medium term (10 to 15 years); therefore, in the intervening time between updates, relevant Authorities must provide timely updates for new works or territorial changes. The required surveys can exploit several technologies: ground-based GPS, Terrestrial Laser Scanning (TLS), traditional topography, or, for wider areas, airborne photogrammetry or laser scanning. In recent years UAV-based photogrammetry has become increasingly widespread as a versatile, low-cost surveying system for small to medium areas. This surveying methodology was used to generate, in order, a dense point cloud, a high-resolution Digital Surface Model (DSM) and an orthophotograph of a newly built marina by the mouth of the Arno river in Pisa, Italy, which is not yet included in cartography. Surveying activities took place while the construction site was in operation. Issues that surfaced in the course of the survey are presented and discussed, suggesting 'good practice' rules which, if followed in the survey planning step, can lessen unwanted effects due to criticalities. Results of a quality analysis of the orthophotographs generated from UAV-borne images are also presented, and are discussed in view of a possible use of orthophotographs in updating medium- to large-scale cartography and checked against existing blueprints.
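
    A quick ground sample distance (GSD) check of the sort used when planning such a UAV survey; the standard formula is certain, but the camera parameters below are illustrative assumptions.

    ```python
    # GSD (m/pixel) = (pixel pitch / focal length) * flying height.
    def ground_sample_distance(sensor_width_mm, image_width_px, focal_mm, height_m):
        pixel_pitch_mm = sensor_width_mm / image_width_px
        return (pixel_pitch_mm / focal_mm) * height_m

    # e.g., a 13.2 mm wide sensor, 5472 px across, 8.8 mm lens, flying at 60 m
    gsd = ground_sample_distance(13.2, 5472, 8.8, 60.0)
    print(f"GSD ~ {gsd * 100:.1f} cm/pixel")
    ```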

  5. Proposal to modify Rule 6, Rule 10a, and Rule 12c of the International Code of Nomenclature of Prokaryotes.

    Science.gov (United States)

    Oren, Aharon; Garrity, George M; Schink, Bernhard

    2014-04-01

    According to the current versions of Rule 10a and Rule 12c of the International Code of Nomenclature of Prokaryotes, names of a genus or subgenus and specific epithets may be taken from any source and may even be composed in an arbitrary manner. Based on these rules, names may be composed of any word or any combination of elements derived from any language with a Latin ending. We propose modifying these rules by adding the text, currently part of Recommendation 6, according to which words from languages other than Latin or Greek should be avoided as long as equivalents exist in Latin or Greek or can be constructed by combining word elements from these two languages. We also propose modification of Rule 6 by adopting some of the current paragraphs of Recommendation 6 to become part of the Rule.

  6. Ontological realism: A methodology for coordinated evolution of scientific ontologies.

    Science.gov (United States)

    Smith, Barry; Ceusters, Werner

    2010-11-15

    Since 2002 we have been testing and refining a methodology for ontology development that is now being used by multiple groups of researchers in different life science domains. Gary Merrill, in a recent paper in this journal, describes some of the reasons why this methodology has been found attractive by researchers in the biological and biomedical sciences. At the same time he assails the methodology on philosophical grounds, focusing specifically on our recommendation that ontologies developed for scientific purposes should be constructed in such a way that their terms are seen as referring to what we call universals or types in reality. As we show, Merrill's critique is of little relevance to the success of our realist project, since it not only reveals no actual errors in our work but also criticizes views on universals that we do not in fact hold. However, it nonetheless provides us with a valuable opportunity to clarify the realist methodology, and to show how some of its principles are being applied, especially within the framework of the OBO (Open Biomedical Ontologies) Foundry initiative.

  7. Methodology for identifying boundaries of systems important to safety in CANDU nuclear power plants

    International Nuclear Information System (INIS)

    Therrien, S.; Komljenovic, D.; Therrien, P.; Ruest, C.; Prevost, P.; Vaillancourt, R.

    2007-01-01

This paper presents a methodology developed to identify the boundaries of the systems important to safety (SIS) at the Gentilly-2 Nuclear Power Plant (NPP), Hydro-Quebec. The SIS boundaries identification considers nuclear safety only. Components that are not identified as important to safety are systematically identified as related to safety. A global assessment process such as WANO/INPO AP-913, 'Equipment Reliability Process', will be needed to implement adequate changes in the management rules for those components. The paper describes the results of applying the methodology to Shutdown Systems 1 and 2 (SDS 1, 2) and to the Emergency Core Cooling System (ECCS). This validation process enabled fine-tuning of the methodology, a better estimate of the effort required to evaluate a system, and identification of the components important to safety in these systems. (author)

  8. Writing biomedical manuscripts part I: fundamentals and general rules.

    Science.gov (United States)

    Ohwovoriole, A E

    2011-01-01

It is a professional obligation for health researchers to investigate and communicate their findings to the medical community. Writing a publishable scientific manuscript can be a daunting task for the beginner and even for some established researchers. Many manuscripts fail to get off the ground and/or are rejected. The writing task can be made easier, and the quality improved, by following simple rules and leads that apply to general scientific writing. The manuscript should follow a standard structure: e.g., Abstract plus Introduction, Methods, Results, and Discussion/Conclusion (the IMRAD model). The authors must also follow well-established fundamentals of good communication in science and be systematic in approach. The manuscript must move from what is currently known to what was unknown, investigated using a hypothesis, research question or problem statement. Each section has its own style of structure and language of presentation. Writing a good manuscript begins with a good study design and attention to detail at every stage. Many manuscripts are rejected because of errors that could have been avoided had the authors followed simple guidelines and rules. One good way to avoid disappointment in manuscript writing is to follow the established general rules along with those of the journal in which the paper is to be published. An important injunction is to make the writing precise, clear, parsimonious, and comprehensible to the intended audience. The purpose of this article is to arm and encourage potential biomedical authors with tools and rules that will enable them to write contemporary manuscripts which can stand the rigorous peer-review process. The expectations of standard journals, common pitfalls, and the major elements of a manuscript are covered.

  9. Ground acceleration in a nuclear power plant

    International Nuclear Information System (INIS)

    Pena G, P.; Balcazar, M.; Vega R, E.

    2015-09-01

A methodology that adopts the recommendations of international organizations for determining the ground acceleration at a nuclear power plant is outlined. The systematic approach presented here emphasizes the types of geological, geophysical and geotechnical studies required in different areas of influence, culminating in assessments of the Design Basis Earthquake and the Operating Basis Earthquake. The methodology indicates that, in the regional area where the nuclear power plant site is located, faults in geological structures are identified and the seismic history of the region is documented. In the detailed study area, geophysical tools are used to determine subsurface propagation velocities and the spectra of the induced seismic waves. Mechanical analysis of drill cores allows estimation of the stresses that a postulated earthquake would generate. Studies show that an earthquake of the magnitude of the Fukushima event did not affect the integrity of nuclear power plants because of the rocky foundation on which they were settled. (Author)

  10. Grounding language in action and perception: from cognitive agents to humanoid robots.

    Science.gov (United States)

    Cangelosi, Angelo

    2010-06-01

In this review we concentrate on a grounded approach to the modeling of cognition through the methodologies of cognitive agents and developmental robotics. This work focuses on the modeling of the evolutionary and developmental acquisition of linguistic capabilities based on the principles of symbol grounding. We review cognitive agent and developmental robotics models of the grounding of language to demonstrate their consistency with the empirical and theoretical evidence on language grounding and embodiment, and to reveal the benefits of such an approach in the design of linguistic capabilities in cognitive robotic agents. In particular, three different models are discussed, in which the complexity of the agent's sensorimotor and cognitive system gradually increases: from a multi-agent simulation of language evolution, to a simulated robotic agent model for symbol grounding transfer, to a model of language comprehension in the humanoid robot iCub. The review also discusses the benefits of the use of a humanoid robotic platform, and specifically of the open-source iCub platform, for the study of embodied cognition. Copyright 2010 Elsevier B.V. All rights reserved.

  11. RuleMaDrone: A Web-Interface to Visualise Space Usage Rules for Drones

    OpenAIRE

    Trippaers, Aäron

    2015-01-01

RuleMaDrone, an application developed within this thesis, is presented as a solution to communicate rules and regulations to drone operators. To provide this solution, a framework for drone safety was designed, consisting of the rules and regulations, the drone properties and the environmental factors. RuleMaDrone is developed with this framework and thus provides drone operators with an application they can use to find a safe and legal fly zone. RuleMaDrone u...

  12. Refining the Relationships among Historical Figures by Implementing Inference Rules in SWRL

    Science.gov (United States)

    Fajrin Ariyani, Nurul; Saralita, Madis; Sarwosri; Sarno, Riyanarto

    2018-03-01

The biographies of historical figures are often fascinating. Everything about their character, work, inventions, and personal life is sometimes presented in their biography. The social and family relationships among historical figures are also of particular concern for political figures, heroes, kings, or persons who have ever ruled a monarchy. Some biographies can be found in Wikipedia as articles. Most of the social and family relationship content for these figures is incompletely depicted because the articles have various contributors and sources. Fortunately, the missing relatives of a person may reside in other figures' biographies on different pages. Each Wikipedia article has metadata which represents the essential information of its content. By processing the metadata obtained from DBpedia and composing inference rules (in the form of an ontology) to identify relationship content, several new inferred facts can be generated that complement the existing relationships. This work proposes a methodology for finding missing relationships among historical figures using inference rules in an ontology. As a result, our method can present new facts about relationships that are absent from the existing Wikipedia articles.
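
    A hedged sketch of encoding one family-relationship inference rule in SWRL using the owlready2 library; the ontology IRI, class, and properties are hypothetical stand-ins for the DBpedia-derived terms the article works with, and running the Pellet reasoner requires a local Java installation.

    ```python
    # SWRL rule inference sketch with owlready2 (hypothetical mini-ontology).
    from owlready2 import (get_ontology, Thing, ObjectProperty, Imp,
                           sync_reasoner_pellet)

    onto = get_ontology("http://example.org/figures.owl")

    with onto:
        class Person(Thing): pass
        class hasParent(ObjectProperty):
            domain = [Person]; range = [Person]
        class hasSibling(ObjectProperty):
            domain = [Person]; range = [Person]

        # If two persons share a parent, infer a sibling relationship.
        # (Simplified: without a distinctness atom this also relates a
        # person to themselves; shown only to illustrate the mechanism.)
        rule = Imp()
        rule.set_rule("hasParent(?x, ?p), hasParent(?y, ?p) -> hasSibling(?x, ?y)")

        a, b, p = Person("a"), Person("b"), Person("p")
        a.hasParent = [p]; b.hasParent = [p]

    # Pellet applies the SWRL rule and materializes the inferred facts.
    sync_reasoner_pellet(infer_property_values=True)
    print(a.hasSibling)  # expected to include b
    ```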

  13. Qualitative research in healthcare: an introduction to grounded theory using thematic analysis.

    Science.gov (United States)

    Chapman, A L; Hadfield, M; Chapman, C J

    2015-01-01

    In today's NHS, qualitative research is increasingly important as a method of assessing and improving quality of care. Grounded theory has developed as an analytical approach to qualitative data over the last 40 years. It is primarily an inductive process whereby theoretical insights are generated from data, in contrast to deductive research where theoretical hypotheses are tested via data collection. Grounded theory has been one of the main contributors to the acceptance of qualitative methods in a wide range of applied social sciences. The influence of grounded theory as an approach is, in part, based on its provision of an explicit framework for analysis and theory generation. Furthermore the stress upon grounding research in the reality of participants has also given it credence in healthcare research. As with all analytical approaches, grounded theory has drawbacks and limitations. It is important to have an understanding of these in order to assess the applicability of this approach to healthcare research. In this review we outline the principles of grounded theory, and focus on thematic analysis as the analytical approach used most frequently in grounded theory studies, with the aim of providing clinicians with the skills to critically review studies using this methodology.

  14. Strategy as simple rules.

    Science.gov (United States)

    Eisenhardt, K M; Sull, D N

    2001-01-01

The success of Yahoo!, eBay, Enron, and other companies that have become adept at morphing to meet the demands of changing markets can't be explained using traditional thinking about competitive strategy. These companies have succeeded by pursuing constantly evolving strategies in market spaces that were considered unattractive according to traditional measures. In this article--the third in an HBR series by Kathleen Eisenhardt and Donald Sull on strategy in the new economy--the authors ask, what are the sources of competitive advantage in high-velocity markets? The secret, they say, is strategy as simple rules. The companies know that the greatest opportunities for competitive advantage lie in market confusion, but they recognize the need for a few crucial strategic processes and a few simple rules. In traditional strategy, advantage comes from exploiting resources or stable market positions. In strategy as simple rules, advantage comes from successfully seizing fleeting opportunities. Key strategic processes, such as product innovation, partnering, or spinout creation, place the company where the flow of opportunities is greatest. Simple rules then provide the guidelines within which managers can pursue such opportunities. Simple rules, which grow out of experience, fall into five broad categories: how-to rules, boundary conditions, priority rules, timing rules, and exit rules. Companies with simple-rules strategies must follow the rules religiously and avoid the temptation to change them too frequently. A consistent strategy helps managers sort through opportunities and gain short-term advantage by exploiting the attractive ones. In stable markets, managers rely on complicated strategies built on detailed predictions of the future. But when business is complicated, strategy should be simple.

  15. 49 CFR 222.41 - How does this rule affect Pre-Rule Quiet Zones and Pre-Rule Partial Quiet Zones?

    Science.gov (United States)

    2010-10-01

A Pre-Rule Quiet Zone may be established by automatic approval and remain in effect, subject to § 222.51, if... A Pre-Rule Partial Quiet Zone may be established by automatic approval and remain in effect, subject to § 222.51, if the Pre...

  16. Transitory and steady analysis of grounding structures using the LN-FDTD method

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Rodrigo Melo e Silva de; Souza Sobrinho, Carlos Leonidas da S. [Federal University of Para (UFPA), Belem, PA (Brazil). Electrical and Computer Engineering Dept.], Emails: rodrigo@lane.ufpa.br, leonidas@ufpa.br

    2007-07-01

This work presents an overview of the LN-FDTD method (FDTD in a local, non-orthogonal coordinate system) for solving Maxwell's equations. The method has been used to simulate curved grounding structures. Results obtained with the presented methodology are compared to reference equations available in the literature. (author)
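
    For orientation, a minimal 1-D FDTD (Yee) update loop in free space, illustrating the core of the standard Cartesian FDTD scheme that LN-FDTD generalizes to local non-orthogonal coordinates. Grid sizes, the normalized update coefficient, and the source are illustrative assumptions.

    ```python
    # Minimal 1-D FDTD sketch in normalized units (Courant number 0.5).
    import numpy as np

    nz, nsteps = 200, 400
    ez = np.zeros(nz)      # electric field on integer grid points
    hy = np.zeros(nz - 1)  # magnetic field on staggered half-cells

    for n in range(nsteps):
        # update H from the spatial difference (curl) of E
        hy += 0.5 * (ez[1:] - ez[:-1])
        # update E from the spatial difference (curl) of H
        ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])
        # soft Gaussian source injected at the middle of the grid
        ez[nz // 2] += np.exp(-0.5 * ((n - 60) / 15.0) ** 2)

    print(f"peak |Ez| after {nsteps} steps: {np.abs(ez).max():.3f}")
    ```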

  17. Using the Chain Rule as the Key Link in Deriving the General Rules for Differentiation

    Science.gov (United States)

    Sprows, David

    2011-01-01

    The standard approach to the general rules for differentiation is to first derive the power, product, and quotient rules and then derive the chain rule. In this short article we give an approach to these rules which uses the chain rule as the main tool in deriving the power, product, and quotient rules in a manner which is more student-friendly…
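
    For instance, one route of the kind the article describes uses the chain rule to obtain the power and quotient rules (a sketch, not necessarily the author's exact derivation):

    ```latex
    % Power rule via the chain rule applied to e^{n\ln x} (for x>0):
    \frac{d}{dx}x^{n} = \frac{d}{dx}e^{\,n\ln x}
      = e^{\,n\ln x}\cdot\frac{n}{x} = n\,x^{\,n-1}.

    % Quotient rule via the product rule plus the chain rule on g^{-1}:
    \frac{d}{dx}\,g(x)^{-1} = -g(x)^{-2}\,g'(x)
    \quad\Longrightarrow\quad
    \left(\frac{f}{g}\right)' = f'g^{-1} - f g^{-2} g'
      = \frac{f'g - f g'}{g^{2}}.
    ```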

  18. A Design Methodology for Medical Processes

    Science.gov (United States)

    Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

Background: Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient’s needs, the uncertainty of the patient’s response, and the indeterminacy of patient’s compliance to treatment. Also, the multiple actors involved in patient’s care need clear and transparent communication to ensure care coordination. Objectives: In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. Methods: The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. Results: The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Conclusions: Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution. PMID:27081415

  19. A Design Methodology for Medical Processes.

    Science.gov (United States)

    Ferrante, Simona; Bonacina, Stefano; Pozzi, Giuseppe; Pinciroli, Francesco; Marceglia, Sara

    2016-01-01

    Healthcare processes, especially those belonging to the clinical domain, are acknowledged as complex and characterized by the dynamic nature of the diagnosis, the variability of the decisions made by experts driven by their experiences, the local constraints, the patient's needs, the uncertainty of the patient's response, and the indeterminacy of patient's compliance to treatment. Also, the multiple actors involved in patient's care need clear and transparent communication to ensure care coordination. In this paper, we propose a methodology to model healthcare processes in order to break out complexity and provide transparency. The model is grounded on a set of requirements that make the healthcare domain unique with respect to other knowledge domains. The modeling methodology is based on three main phases: the study of the environmental context, the conceptual modeling, and the logical modeling. The proposed methodology was validated by applying it to the case study of the rehabilitation process of stroke patients in the specific setting of a specialized rehabilitation center. The resulting model was used to define the specifications of a software artifact for the digital administration and collection of assessment tests that was also implemented. Despite being only an example, our case study showed the ability of process modeling to answer the actual needs in healthcare practices. Independently from the medical domain in which the modeling effort is done, the proposed methodology is useful to create high-quality models, and to detect and take into account relevant and tricky situations that can occur during process execution.

  20. Does contrast between eggshell ground and spot coloration affect egg rejection?

    Science.gov (United States)

    Dainson, Miri; Hauber, Mark E; López, Analía V; Grim, Tomáš; Hanley, Daniel

    2017-08-01

    Obligate avian brood parasitic species impose the costs of incubating foreign eggs and raising young upon their unrelated hosts. The most common host defence is the rejection of parasitic eggs from the nest. Both egg colours and spot patterns influence egg rejection decisions in many host species, yet no studies have explicitly examined the role of variation in spot coloration. We studied the American robin Turdus migratorius, a blue-green unspotted egg-laying host of the brown-headed cowbird Molothrus ater, a brood parasite that lays non-mimetic spotted eggs. We examined host responses to model eggs with variable spot coloration against a constant robin-mimetic ground colour to identify patterns of rejection associated with perceived contrast between spot and ground colours. By using avian visual modelling, we found that robins were more likely to reject eggs whose spots had greater chromatic (hue) but not achromatic (brightness) contrast. Therefore, egg rejection decision rules in the American robin may depend on the colour contrast between parasite eggshell spot and host ground coloration. Our study also suggests that egg recognition in relation to spot coloration, like ground colour recognition, is tuned to the natural variation of avian eggshell spot colours but not to unnatural spot colours.

  1. Isovector giant monopole resonances: A sum-rule approach

    International Nuclear Information System (INIS)

    Goeke, K.; Bonn Univ.; Castel, B.

    1980-01-01

Several useful sum rules associated with isovector giant monopole resonances are calculated for doubly closed shell nuclei. The calculation is based on techniques known from constrained and adiabatic time-dependent Hartree-Fock theories and assumes various Skyrme interactions. The results obtained, together with the compiled literature, form the basis for a quantitative description of the RPA strength distribution in terms of energy-weighted moments. These, together with strength distribution properties, are determined by a hierarchy of determinantal relations between moments. The isovector giant monopole resonance turns out to be a rather broad resonance centered at E = 46 A^(-1/10) MeV with an extended width of more than 16 MeV. The consequences regarding isospin impurities in the nuclear ground state are discussed. (orig.)

  2. Early Site Permit Demonstration Program: Guidelines for determining design basis ground motions. Volume 2, Appendices

    Energy Technology Data Exchange (ETDEWEB)

    1993-03-18

This report develops and applies a methodology for estimating strong earthquake ground motion. The motivation was to develop a much-needed tool for use in developing the seismic requirements for structural designs. An earthquake's ground motion is a function of the earthquake's magnitude and the physical properties of the earth through which the seismic waves travel from the earthquake fault to the site of interest. The emphasis of this study is on ground motion estimation in Eastern North America (east of the Rocky Mountains), with particular emphasis on the Eastern United States and southeastern Canada. Eastern North America is a stable continental region, having sparse earthquake activity with rare occurrences of large earthquakes. While large earthquakes are of interest for assessing seismic hazard, little data exists from the region to empirically quantify their effects. The focus of the report is on the attributes of ground motion in Eastern North America that are of interest for the design of facilities such as nuclear power plants. This document, Volume II, contains Appendices 2, 3, 5, 6, and 7 covering the following topics: Eastern North American Empirical Ground Motion Data; Examination of Variance of Seismographic Network Data; Soil Amplification and Vertical-to-Horizontal Ratios from Analysis of Strong Motion Data From Active Tectonic Regions; Revision and Calibration of Ou and Herrmann Method; Generalized Ray Procedure for Modeling Ground Motion Attenuation; Crustal Models for Velocity Regionalization; Depth Distribution Models; Development of Generic Site Effects Model; Validation and Comparison of One-Dimensional Site Response Methodologies; Plots of Amplification Factors; Assessment of Coupling Between Vertical & Horizontal Motions in Nonlinear Site Response Analysis; and Modeling of Dynamic Soil Properties.

  3. Building International Business Theory: A Grounded Theory Approach

    OpenAIRE

    Gligor, David; Esmark, Carol; Golgeci, Ismail

    2016-01-01

    The field of international business (IB) is in need of more theory development (Morck & Yeung, 2007). As such, the main focus of our manuscript was to provide guidance on how to build IB specific theory using grounded theory (GT). Moreover, we contribute to future theory development by identifying areas within IB where GT can be applied and the type of research issues that can be addressed using this methodology. Finally, we make a noteworthy contribution by discussing some of GT’s caveats an...

  4. Dynamic segmentation to estimate vine vigor from ground images

    OpenAIRE

    Sáiz Rubio, Verónica; Rovira Más, Francisco

    2012-01-01

The geographic information required to implement precision viticulture applications in real fields has led to the extensive use of remote sensing and airborne imagery. While advantageous because they cover large areas and provide diverse radiometric data, they are unreachable to most of medium-size Spanish growers who cannot afford such image sourcing. This research develops a new methodology to generate globally-referenced vigor maps in vineyards from ground images taken wit...

  5. Dynamic segmentation to estimate vine vigor from ground images

    OpenAIRE

    Sáiz-Rubio, V.; Rovira-Más, F.

    2012-01-01

    The geographic information required to implement precision viticulture applications in real fields has led to the extensive use of remote sensing and airborne imagery. While advantageous because they cover large areas and provide diverse radiometric data, they are unreachable to most of medium-size Spanish growers who cannot afford such image sourcing. This research develops a new methodology to generate globally-referenced vigor maps in vineyards from ground images taken with a camera mounte...

  6. Rules, culture, and fitness.

    Science.gov (United States)

    Baum, W M

    1995-01-01

    Behavior analysis risks intellectual isolation unless it integrates its explanations with evolutionary theory. Rule-governed behavior is an example of a topic that requires an evolutionary perspective for a full understanding. A rule may be defined as a verbal discriminative stimulus produced by the behavior of a speaker under the stimulus control of a long-term contingency between the behavior and fitness. As a discriminative stimulus, the rule strengthens listener behavior that is reinforced in the short run by socially mediated contingencies, but which also enters into the long-term contingency that enhances the listener's fitness. The long-term contingency constitutes the global context for the speaker's giving the rule. When a rule is said to be "internalized," the listener's behavior has switched from short- to long-term control. The fitness-enhancing consequences of long-term contingencies are health, resources, relationships, or reproduction. This view ties rules both to evolutionary theory and to culture. Stating a rule is a cultural practice. The practice strengthens, with short-term reinforcement, behavior that usually enhances fitness in the long run. The practice evolves because of its effect on fitness. The standard definition of a rule as a verbal statement that points to a contingency fails to distinguish between a rule and a bargain ("If you'll do X, then I'll do Y"), which signifies only a single short-term contingency that provides mutual reinforcement for speaker and listener. In contrast, the giving and following of a rule ("Dress warmly; it's cold outside") can be understood only by reference also to a contingency providing long-term enhancement of the listener's fitness or the fitness of the listener's genes. Such a perspective may change the way both behavior analysts and evolutionary biologists think about rule-governed behavior.

  7. Classic Grounded Theory to Analyse Secondary Data: Reality and Reflections

    Directory of Open Access Journals (Sweden)

    Lorraine Andrews

    2012-06-01

This paper draws on the experiences of two researchers and discusses how they conducted a secondary data analysis using classic grounded theory. The aim of the primary study was to explore first-time parents' postnatal educational needs. A subset of the data from the primary study (eight transcripts from interviews with fathers) was used for the secondary data analysis. The objectives of the secondary data analysis were to identify the challenges of using classic grounded theory with secondary data and to explore whether the re-analysis of primary data using a different methodology would yield a different outcome. Through the process of re-analysis a tentative theory emerged on 'developing competency as a father'. Challenges encountered during this re-analysis included the small dataset, the pre-framed data, and limited ability for theoretical sampling. This re-analysis proved to be a very useful learning tool for author 1 (LA), who was a novice with classic grounded theory.

  8. Dibenzoheptazethrene isomers with different biradical characters: An exercise of clar's aromatic sextet rule in singlet biradicaloids

    KAUST Repository

    Sun, Zhe

    2013-12-04

Clar's aromatic sextet rule has been widely used for the prediction of the reactivity and stability of polycyclic aromatic hydrocarbons with a closed-shell electronic configuration. Recent advances in open-shell biradicaloids have shown that the number of aromatic sextet rings plays an important role in determination of their ground states. In order to test the validity of this rule in singlet biradicaloids, the two soluble and stable dibenzoheptazethrene isomers DBHZ1 and DBHZ2 were prepared by different synthetic approaches and isolated in crystalline form. These two molecules have different numbers of aromatic sextet rings in their respective biradical resonance forms and thus are expected to exhibit varied singlet biradical character. This assumption was verified by different experimental methods, including nuclear magnetic resonance (NMR), electron spin resonance (ESR), superconducting quantum interference device (SQUID), steady-state and transient absorption spectroscopy (TA), and X-ray crystallographic analysis, assisted by unrestricted symmetry-broken density functional theory (DFT) calculations. DBHZ2, with more aromatic sextet rings in the biradical form, was demonstrated to possess greater biradical character than DBHZ1; as a result, DBHZ2 exhibited an intense one-photon absorption (OPA) in the near-infrared region (λabs,max = 804 nm) and a large two-photon absorption (TPA) cross-section (σ(2)max = 2800 GM at 1600 nm). This investigation, together with previous studies, indicates that Clar's aromatic sextet rule can be further extended to singlet biradicaloids to predict their ground states and singlet biradical characters. © 2013 American Chemical Society.

  9. Modeling Nonlinear Site Response Uncertainty in Broadband Ground Motion Simulations for the Los Angeles Basin

    Science.gov (United States)

    Assimaki, D.; Li, W.; Steidl, J. M.; Schmedes, J.

    2007-12-01

The assessment of strong motion site response is of great significance, both for mitigating seismic hazard and for performing detailed analyses of earthquake source characteristics. There currently exists, however, a large degree of uncertainty concerning the mathematical model to be employed for the computationally efficient evaluation of local site effects, and the site investigation program necessary to evaluate the nonlinear input model parameters and ensure cost-effective predictions; and while site response observations may provide critical constraints on interpretation methods, the lack of a statistically significant number of in-situ strong motion records prohibits statistical analyses from being conducted and uncertainties from being quantified based entirely on field data. In this paper, we combine downhole observations and broadband ground motion synthetics for characteristic site conditions in the Los Angeles Basin, and investigate the variability in ground motion estimation introduced by the site response assessment methodology. In particular, site-specific regional velocity and attenuation structures are initially compiled using near-surface geotechnical data collected at downhole geotechnical arrays, inverse low-strain velocity and attenuation profiles at these sites obtained by inversion of weak motion records, and the crustal velocity structure at the corresponding locations obtained from the Southern California Earthquake Center Community Velocity Model. Subsequently, broadband ground motions are simulated by means of a hybrid low/high-frequency finite source model with correlated random parameters for rupture scenarios of weak, medium and large magnitude events (M = 3.5-7.5). Observed estimates of site response at the stations of interest are first compared to the ensemble of approximate and incremental nonlinear site response models. Parametric studies are next conducted for each fixed magnitude (fault geometry) scenario by varying the source-to-site distance and

  10. Consistence of Network Filtering Rules

    Institute of Scientific and Technical Information of China (English)

    SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian

    2004-01-01

Inconsistency among firewall/VPN (Virtual Private Network) rules imposes a huge maintenance cost. With the growth of multinational companies, SOHO offices, and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables, whether stand-alone or networked, will grow in geometric progression accordingly. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for the intelligent management of rule tables. In this paper, a formalization of host and network rules for automatic rule validation, based on set theory, is proposed, and a rule validation scheme is defined. The analysis results show the superior performance of the methods and demonstrate their potential for intelligent management based on rule tables.
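
    A hedged sketch of one consistency check that a set-theoretic formalization of this kind enables: detecting "shadowed" filtering rules, i.e., a later rule whose match set is contained in an earlier rule with a different action under first-match semantics. The rule set below is hypothetical and not from the paper.

    ```python
    # Shadowing detection for (source, destination, action) filter rules.
    from ipaddress import ip_network

    rules = [  # first match wins
        ("10.0.0.0/8",  "0.0.0.0/0",      "deny"),
        ("10.1.0.0/16", "192.168.0.0/16", "permit"),  # shadowed by rule 0
    ]

    def shadowed(rules):
        found = []
        for j in range(1, len(rules)):
            sj, dj, aj = rules[j]
            for i in range(j):
                si, di, ai = rules[i]
                # rule j never fires if its match set is inside rule i's
                # match set and the actions conflict
                if (ip_network(sj).subnet_of(ip_network(si))
                        and ip_network(dj).subnet_of(ip_network(di))
                        and aj != ai):
                    found.append((j, i))
        return found

    print(shadowed(rules))  # [(1, 0)]: rule 1 can never fire
    ```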

  11. BIM – New rules of measurement ontology for construction cost estimation

    Directory of Open Access Journals (Sweden)

    F.H. Abanda

    2017-04-01

For generations, the process of cost estimation has been manual, time-consuming and error-prone. Emerging Building Information Modelling (BIM) can exploit standard measurement methods to automate the cost estimation process and reduce inaccuracies. Structuring standard measurement methods in an ontological, machine-readable format for BIM software can greatly facilitate the process of improving accuracy in cost estimation. This study explores the development of an ontology based on the New Rules of Measurement (NRM) for cost estimation during the tendering stages. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. To ensure the ontology is fit for purpose, cost estimation experts were employed to check the semantics, description logic-based reasoners were used to syntactically check the ontology, and a leading 4D BIM modelling software was used on a case study building to test/validate the proposed ontology.

  12. Methodology for the free allocation of emission allowances in the EU ETS post 2012. Sector report for the chemical industry

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2009-11-15

In 2013, the third trading period of the EU emission trading scheme (EU ETS) will start. With a few exceptions, no free allocation of emission allowances is foreseen in this third trading period for the emissions related to the production of electricity. These emission allowances will be auctioned. For other emissions, transitional free allocation of emission allowances is envisioned. This free allocation will be based on Community wide allocation rules that will, to the extent feasible, be based on ex-ante benchmarks. In 2013, the free allocation is 80% of the quantity determined via these rules, going down to 30% in 2020. An exception is made for activities that are deemed to be exposed to a significant risk of carbon leakage. These activities will receive an allocation of 100% of the quantity determined via the rules. The benchmarks should in principle be calculated for products, i.e. a specific performance per unit productive output, to ensure that they maximize greenhouse gas reductions throughout each production process of the sectors concerned. In this study for the European Commission, a blueprint for a methodology based on benchmarking is developed to determine the allocation rules in the EU ETS from 2013 onwards. In cases where benchmarking is not regarded feasible, alternative approaches are suggested. The methodology allows determining the allocation for each EU ETS installation eligible for free allocation of emission allowances. The focus of this study is on preparing a first blueprint of an allocation methodology for free allocation of emission allowances under the EU Emission Trading Scheme for the period 2013-2020 for installations in the chemical industry. The report should be read in conjunction with the report on the project approach and general issues.

  13. Methodology for the free allocation of emission allowances in the EU ETS post 2012. Sector report for the refinery industry

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2009-11-15

    In 2013, the third trading period of the EU emission trading scheme (EU ETS) will start. With a few exceptions, no free allocation of emission allowances is foreseen in this third trading period for the emissions related to the production of electricity. These emission allowances will be auctioned. For other emissions, transitional free allocation of emission allowances is envisioned. This free allocation will be based on Community wide allocation rules that will, to the extent feasible, be based on ex-ante benchmarks. In 2013, the free allocation is 80% of the quantity determined via these rules, going down to 30% in 2020. An exception is made for activities that are deemed to be exposed to a significant risk of carbon leakage. These activities will receive an allocation of 100% of the quantity determined via the rules. The benchmarks should in principle be calculated for products, i.e. a specific performance per unit productive output, to ensure that they maximize greenhouse gas reductions throughout each production process of the sectors concerned. In this study for the European Commission, a blueprint for a methodology based on benchmarking is developed to determine the allocation rules in the EU ETS from 2013 onwards. In case where benchmarking is not regarded feasible, alternative approaches are suggested. The methodology allows determining the allocation for each EU ETS installation eligible for free allocation of emission allowances. The focus of this study is on preparing a first blueprint of an allocation methodology for free allocation of emission allowances under the EU Emission Trading Scheme for the period 2013-2020 for installations in the refinery industry. The report should be read in conjunction with the report on the project approach and general issues.

  14. A performance assessment methodology for low-level radioactive waste disposal

    International Nuclear Information System (INIS)

    Deering, L.R.; Kozak, M.W.

    1990-01-01

To demonstrate compliance with the performance objectives governing protection of the general population in 10 CFR 61.41, applicants for land disposal of low-level radioactive waste are required to conduct a pathways analysis, or quantitative evaluation of radionuclide release, transport through environmental media, and dose to man. The Nuclear Regulatory Commission staff defined a strategy and initiated a project at Sandia National Laboratories to develop a methodology for independently evaluating an applicant's analysis of postclosure performance. This performance assessment methodology was developed in five stages: (1) identification of environmental pathways, (2) ranking the significance of the pathways, (3) identification and integration of models for pathway analyses, (4) identification and selection of computer codes and techniques for the methodology, and (5) implementation of the codes and documentation of the methodology. The final methodology implements analytical and simple numerical solutions for source term, ground-water flow and transport, surface water transport, air transport, food chain, and dosimetry analyses, as well as more complex numerical solutions for multidimensional or transient analyses when more detailed assessments are needed. The capability to perform both simple and complex analyses is accomplished through modular modeling, which permits substitution of various models and codes to analyze system components.
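
    As an illustration of the kind of analytical solution such a methodology can implement for 1-D ground-water solute transport, here is the classical Ogata-Banks solution to the advection-dispersion equation; the formula is standard, but the parameter values below are hypothetical and this is not necessarily the code used in the methodology.

    ```python
    # Ogata-Banks solution: relative concentration C/C0 at distance x, time t,
    # for seepage velocity v and dispersion coefficient D.
    from math import erfc, exp, sqrt

    def ogata_banks(x, t, v, D):
        a = (x - v * t) / (2.0 * sqrt(D * t))
        b = (x + v * t) / (2.0 * sqrt(D * t))
        # note: the exp term can overflow for very large v*x/D; fine here
        return 0.5 * (erfc(a) + exp(v * x / D) * erfc(b))

    # e.g., v = 1e-6 m/s, D = 1e-5 m2/s, 10 m downstream after ~1 year
    print(f"C/C0 = {ogata_banks(10.0, 3.15e7, 1e-6, 1e-5):.3f}")
    ```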

  15. Max-out-in pivot rule with Dantzig's safeguarding rule for the simplex method

    International Nuclear Information System (INIS)

    Tipawanna, Monsicha; Sinapiromsaran, Krung

    2014-01-01

The simplex method solves a linear programming problem by iteratively improving the current basic feasible solution, using a pivot rule to guide the search in the feasible region. The pivot rule selects the entering index in the simplex method. Many pivot rules have been proposed, but none shows uniformly superior performance over the others, so this remains an active research area in linear programming. In this work, we present the max-out-in pivot rule with Dantzig's safeguarding for the simplex method. This rule is based on the maximum improvement of the objective value at the current basic feasible point, similar to Dantzig's rule. We illustrate on the Klee and Minty problems that our rule outperforms Dantzig's rule in the number of iterations required to solve linear programming problems.
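
    A hedged sketch contrasting Dantzig's rule with a greatest-improvement choice of entering variable, in the spirit of (but not identical to) the max-out-in rule described above; the data and function names are hypothetical.

    ```python
    # Entering-variable selection: Dantzig vs. greatest improvement.
    import numpy as np

    def entering_dantzig(reduced_costs):
        j = int(np.argmin(reduced_costs))
        return j if reduced_costs[j] < 0 else None  # None: already optimal

    def entering_max_improvement(reduced_costs, tableau, rhs):
        """Pick the candidate whose full pivot step improves the objective most."""
        best_j, best_gain = None, 0.0
        for j in np.flatnonzero(reduced_costs < 0):
            col = tableau[:, j]
            ratios = [rhs[i] / col[i] for i in range(len(rhs)) if col[i] > 1e-12]
            if not ratios:
                continue  # unbounded direction
            gain = -reduced_costs[j] * min(ratios)  # objective improvement
            if gain > best_gain:
                best_j, best_gain = j, gain
        return best_j

    c = np.array([-3.0, -2.0, 0.0])                 # reduced costs (hypothetical)
    A = np.array([[1.0, 1.0, 0.0], [2.0, 0.5, 1.0]])
    b = np.array([4.0, 3.0])
    # Dantzig picks column 0; greatest improvement picks column 1 here.
    print(entering_dantzig(c), entering_max_improvement(c, A, b))
    ```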

  16. Preliminary PANSAT ground station software design and use of an expert system to analyze telemetry

    Science.gov (United States)

    Lawrence, Gregory W.

    1994-03-01

The Petite Amateur Navy Satellite (PANSAT) is a communications satellite designed to be used by civilian amateur radio operators. A master ground station is being built at the Naval Postgraduate School. This computer system performs satellite commands, displays telemetry, troubleshoots problems, and passes messages. The system also controls an open-loop tracking antenna. This paper concentrates on telemetry display, decoding, and interpretation through artificial intelligence (AI). The telemetry is displayed in an easily interpretable format, so that any user can understand the current health of the satellite and be cued to any problems and possible solutions. Only the master ground station has the ability to receive all telemetry and send commands to the spacecraft; civilian ham users do not have access to this information. The telemetry data is decommutated and analyzed before it is displayed to the user, so that the raw data does not have to be interpreted by ground users. The analysis uses CLIPS embedded in the code and derives its inputs from telemetry decommutation. The program is an expert system using a forward-chaining set of rules based on the expected operation and parameters of the satellite. By building the rules during the construction and design of the satellite, the telemetry can be well understood and interpreted after the satellite is launched and the designers may no longer be available to provide input to the problem.
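
    A toy forward-chaining sketch of the telemetry expert-system idea described above (the real ground station embedded CLIPS); the limits, fact names, and rules here are hypothetical, not PANSAT's.

    ```python
    # Forward chaining over telemetry: fire rules until no new facts appear.
    telemetry = {"battery_v": 10.8, "temp_c": 46.0, "tx_power_w": 1.4}
    facts = set()

    rules = [  # (condition over telemetry and current facts, fact asserted)
        (lambda t, f: t["battery_v"] < 11.0, "battery-low"),
        (lambda t, f: t["temp_c"] > 45.0, "overtemp"),
        (lambda t, f: {"battery-low", "overtemp"} <= f, "reduce-tx-duty-cycle"),
    ]

    changed = True
    while changed:
        changed = False
        for cond, fact in rules:
            if fact not in facts and cond(telemetry, facts):
                facts.add(fact)
                changed = True

    print(sorted(facts))  # ['battery-low', 'overtemp', 'reduce-tx-duty-cycle']
    ```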

  17. State of the Art in Input Ground Motions for Seismic Fragility and Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jung Han; Choi, In Kil; Kim, Min Kyu [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

The purpose of a Seismic Probabilistic Safety Analysis (SPSA) is to determine the probability distribution of core damage due to the potential effects of earthquakes. The SPSA is performed in four steps: a seismic hazard analysis, a component fragility evaluation, a plant system and accident sequence analysis, and a consequence analysis. Spectral shapes differ greatly among ground motions, and the structural response and the seismic load applied to equipment are strongly influenced by the spectral shape of the input ground motion. The input ground motion therefore needs to be determined under consistent assumptions in the risk calculation. Several techniques for the determination of input ground motions have been developed; they are reviewed and discussed in this study. These methodologies were developed to reduce the uncertainty in fragility curves and to remove conservatism in risk values.

  18. [Methodology for clinical research in Orthodontics, the assets of the beOrtho website].

    Science.gov (United States)

    Ruiz, Martial; Thibult, François

    2014-06-01

The rules of "evidence-based" methodology have strongly influenced clinical research in orthodontics. However, the implementation of clinical studies requires rigour, substantial statistical and methodological knowledge, and a reliable environment in which to compile and store the data obtained from research. We developed the project "beOrtho.com" (based on orthodontic evidence) in order to bridge the gap between our desire to drive clinical research and the necessity of methodological rigour in the exploitation of its results. The beOrtho website was created to address the issues of sample recruitment and data compilation and storage, while providing help with the methodological design of clinical studies. It allows the development and monitoring of clinical studies, as well as the creation of databases. In addition, we designed an evaluation grid for clinical studies that helps in developing systematic reviews. To illustrate our point, we tested a research protocol evaluating the value of mandibular advancement in the framework of Class II treatment. © EDP Sciences, SFODF, 2014.

  19. Ground Control for Emplacement Drifts for SR

    International Nuclear Information System (INIS)

    Y. Sun

    2000-01-01

This analysis demonstrates that a satisfactory ground control system can be designed for the Yucca Mountain site, and provides the technical basis for the design of ground support systems to be used in repository emplacement and non-emplacement drifts. The repository ground support design was based on analytical methods using acquired computer codes, and focused on the final support systems. A literature review of case histories, including the lessons learned from the design and construction of the ESF, the studies on the seismic damages of underground openings, and the use of rock mass classification systems in ground support design, was conducted (Sections 6.3.4 and 6.4). This review provided some basis for determining the inputs and methodologies used in this analysis. Stability of the supported and unsupported emplacement and non-emplacement drifts was evaluated in this analysis. The excavation effects (i.e., state of stress change due to excavation), thermal effects (i.e., due to heat output from waste packages), and seismic effects (i.e., from potential earthquake events) were evaluated, and stress-controlled modes of failure were examined for two in situ stress conditions (k0 = 0.3 and 1.0) using rock properties representing rock mass categories 1 and 5. Variation of rock mass units such as the non-lithophysal (Tptpmn) and lithophysal (Tptpll) was considered in the analysis. The focus was on the non-lithophysal unit because this unit appears to be relatively weaker and has much smaller joint spacing. Therefore, the drift stability and ground support needs were considered to be controlled by the design for this rock unit. The ground support systems for both emplacement and non-emplacement drifts were incorporated into the models to assess their performance under in situ, thermal, and seismic loading conditions. Both continuum and discontinuum modeling approaches were employed in the analyses of the rock mass behavior and in the evaluation of the

  20. Totally optimal decision rules

    KAUST Repository

    Amin, Talha

    2017-11-22

Optimality of decision rules (patterns) can be measured in many ways. One of these is referred to as length. Length signifies the number of terms in a decision rule and is optimally minimized. Another is coverage, which represents the width of a rule’s applicability and generality; as such, it is desirable to maximize coverage. A totally optimal decision rule is a decision rule that has the minimum possible length and the maximum possible coverage. This paper presents a method for determining the presence of totally optimal decision rules for “complete” decision tables (representations of total functions in which different variables can have domains of differing values). Depending on the cardinalities of the domains, we can either guarantee, for each tuple of values of the function, that totally optimal rules exist for each row of the table (as in the case of total Boolean functions, where the cardinalities are equal to 2) or, for each row, we can find a tuple of values of the function for which totally optimal rules do not exist for this row.

  1. Totally optimal decision rules

    KAUST Repository

    Amin, Talha M.; Moshkov, Mikhail

    2017-01-01

Optimality of decision rules (patterns) can be measured in many ways. One of these is referred to as length. Length signifies the number of terms in a decision rule and is optimally minimized. Another is coverage, which represents the width of a rule’s applicability and generality; as such, it is desirable to maximize coverage. A totally optimal decision rule is a decision rule that has the minimum possible length and the maximum possible coverage. This paper presents a method for determining the presence of totally optimal decision rules for “complete” decision tables (representations of total functions in which different variables can have domains of differing values). Depending on the cardinalities of the domains, we can either guarantee, for each tuple of values of the function, that totally optimal rules exist for each row of the table (as in the case of total Boolean functions, where the cardinalities are equal to 2) or, for each row, we can find a tuple of values of the function for which totally optimal rules do not exist for this row.
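
    A brute-force sketch of length and coverage for decision rules over a small decision table, in the spirit of the notions defined above; the table and the exhaustive search are hypothetical illustrations, not the paper's method.

    ```python
    # Enumerate consistent rules for one row; length = number of terms,
    # coverage = number of rows the rule matches.
    from itertools import combinations

    # rows: (attribute values, decision) -- hypothetical decision table
    table = [((0, 1, 0), "a"), ((1, 1, 0), "a"), ((1, 0, 1), "b"), ((0, 0, 1), "b")]

    def rules_for_row(row):
        values, decision = row
        out = []
        for k in range(1, len(values) + 1):            # candidate rule length
            for attrs in combinations(range(len(values)), k):
                matched = [r for r in table
                           if all(r[0][i] == values[i] for i in attrs)]
                if all(d == decision for _, d in matched):  # consistent rule
                    out.append((k, len(matched), attrs))
        return out

    # A totally optimal rule attains minimum length AND maximum coverage at once.
    cands = rules_for_row(table[0])
    min_len = min(c[0] for c in cands)
    max_cov = max(c[1] for c in cands)
    print([c for c in cands if c[0] == min_len and c[1] == max_cov])
    ```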

  2. An open repository of earthquake-triggered ground-failure inventories

    Science.gov (United States)

    Schmitt, Robert G.; Tanyas, Hakan; Nowicki Jessee, M. Anna; Zhu, Jing; Biegel, Katherine M.; Allstadt, Kate E.; Jibson, Randall W.; Thompson, Eric M.; van Westen, Cees J.; Sato, Hiroshi P.; Wald, David J.; Godt, Jonathan W.; Gorum, Tolga; Xu, Chong; Rathje, Ellen M.; Knudsen, Keith L.

    2017-12-20

    Earthquake-triggered ground failure, such as landsliding and liquefaction, can contribute significantly to losses, but our current ability to accurately include them in earthquake-hazard analyses is limited. The development of robust and widely applicable models requires access to numerous inventories of ground failures triggered by earthquakes that span a broad range of terrains, shaking characteristics, and climates. We present an openly accessible, centralized earthquake-triggered ground-failure inventory repository in the form of a ScienceBase Community to provide open access to these data with the goal of accelerating research progress. The ScienceBase Community hosts digital inventories created by both U.S. Geological Survey (USGS) and non-USGS authors. We present the original digital inventory files (when available) as well as an integrated database with uniform attributes. We also summarize the mapping methodology and level of completeness as reported by the original author(s) for each inventory. This document describes the steps taken to collect, process, and compile the inventories and the process for adding additional ground-failure inventories to the ScienceBase Community in the future.

  3. Transparency in Economic and Political Decision-Making: The Identification of Sunshine Rules for Transparent Lobbying

    Directory of Open Access Journals (Sweden)

    Laboutková Šárka

    2017-09-01

    Lobbying transparency seems to have been a challenging topic for nearly a decade. For the purposes of the article, the authors focus on a contextual analysis of rules and measures that offers both a broad as well as comprehensive view of the required transparency of lobbying activities and the environment in which decisions are made. In this regard, focusing on the sunshine principles/sunshine rules (not purely limited to laws) provides a grasp of the whole issue in a broader context. From a methodological point of view, the exploratory approach was chosen and the coding procedure is mostly dichotomous. As a result, seven key areas with 70 indicators have been identified in terms of transparency of lobbying and decision-making.

  4. Aeroelastic Ground Wind Loads Analysis Tool for Launch Vehicles

    Science.gov (United States)

    Ivanco, Thomas G.

    2016-01-01

    Launch vehicles are exposed to ground winds during rollout and on the launch pad that can induce static and dynamic loads. Of particular concern are the dynamic loads caused by vortex shedding from nearly-cylindrical structures. When the frequency of vortex shedding nears that of a lowly-damped structural mode, the dynamic loads can be more than an order of magnitude greater than mean drag loads. Accurately predicting vehicle response to vortex shedding during the design and analysis cycles is difficult and typically exceeds the practical capabilities of modern computational fluid dynamics codes. Therefore, mitigating the ground wind loads risk typically requires wind-tunnel tests of dynamically-scaled models that are time consuming and expensive to conduct. In recent years, NASA has developed a ground wind loads analysis tool for launch vehicles to fill this analytical capability gap in order to provide predictions for prelaunch static and dynamic loads. This paper includes a background of the ground wind loads problem and the current state-of-the-art. It then discusses the history and significance of the analysis tool and the methodology used to develop it. Finally, results of the analysis tool are compared to wind-tunnel and full-scale data of various geometries and Reynolds numbers.
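
    The lock-in hazard described above can be screened with a back-of-the-envelope calculation: shedding from a near-cylindrical body occurs around a Strouhal number St of roughly 0.2, so f_shed = St*U/D. The sketch below uses invented vehicle numbers (diameter, mode frequency, band width) purely to illustrate the screening idea; it is in no way the NASA analysis tool:

        # Illustrative lock-in screening for a cylindrical launch vehicle.
        # All numbers are hypothetical; St ~ 0.2 is a typical subcritical value.
        ST = 0.2             # Strouhal number (dimensionless)
        D = 3.7              # vehicle diameter, m (assumed)
        F_MODE = 0.45        # first bending-mode frequency, Hz (assumed)
        BAND = 0.15          # +/- fraction of F_MODE treated as "near resonance"

        def shedding_frequency(wind_speed_mps):
            return ST * wind_speed_mps / D

        for u in range(1, 21):  # ground wind speeds 1..20 m/s
            f = shedding_frequency(u)
            if abs(f - F_MODE) <= BAND * F_MODE:
                print(f"U = {u:2d} m/s -> f_shed = {f:.2f} Hz (near {F_MODE} Hz mode)")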

  5. Diagnostic accuracy of the STRATIFY clinical prediction rule for falls: A systematic review and meta-analysis

    LENUS (Irish Health Repository)

    Billington, Jennifer

    2012-08-07

    Background: The STRATIFY score is a clinical prediction rule (CPR) derived to assist clinicians to identify patients at risk of falling. The purpose of this systematic review and meta-analysis is to determine the overall diagnostic accuracy of the STRATIFY rule across a variety of clinical settings. Methods: A literature search was performed to identify all studies that validated the STRATIFY rule. The methodological quality of the studies was assessed using the Quality Assessment of Diagnostic Accuracy Studies tool. A STRATIFY score of ≥2 points was used to identify individuals at higher risk of falling. All included studies were combined using a bivariate random effects model to generate pooled sensitivity and specificity of STRATIFY at ≥2 points. Heterogeneity was assessed using the variance of logit transformed sensitivity and specificity. Results: Seventeen studies were included in our meta-analysis, incorporating 11,378 patients. At a score ≥2 points, the STRATIFY rule is more useful at ruling out falls in those classified as low risk, with a greater pooled sensitivity estimate (0.67, 95% CI 0.52–0.80) than specificity (0.57, 95% CI 0.45–0.69). The sensitivity analysis, which examined the performance of the rule in different settings and subgroups, also showed broadly comparable results, indicating that the STRATIFY rule performs in a similar manner across a variety of different ‘at risk’ patient groups in different clinical settings. Conclusion: This systematic review shows that the diagnostic accuracy of the STRATIFY rule is limited and it should not be used in isolation for identifying individuals at high risk of falls in clinical practice.
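
    As a simplified illustration of the pooling step, the sketch below performs fixed-effect pooling of sensitivities on the logit scale with hypothetical study counts; the authors' actual model is a bivariate random-effects model, which additionally captures between-study variance and the sensitivity-specificity correlation:

        import math

        # Hypothetical per-study (true positive, false negative) counts.
        studies = [(40, 20), (55, 30), (25, 10), (60, 35)]

        def logit(p): return math.log(p / (1 - p))
        def inv_logit(x): return 1 / (1 + math.exp(-x))

        weights, logits = [], []
        for tp, fn in studies:
            sens = tp / (tp + fn)
            var = 1 / tp + 1 / fn          # approx. variance of logit(sens)
            logits.append(logit(sens))
            weights.append(1 / var)        # inverse-variance weighting

        pooled = inv_logit(sum(w * l for w, l in zip(weights, logits)) / sum(weights))
        print(f"pooled sensitivity (fixed-effect, logit scale): {pooled:.2f}")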

  6. VaR Methodology Application for Banking Currency Portfolios

    Directory of Open Access Journals (Sweden)

    Daniel Armeanu

    2007-02-01

    VaR has become the standard measure that financial analysts use to quantify market risk. VaR measures can have many applications, such as in risk management, to evaluate the performance of risk takers and for regulatory requirements, and hence it is very important to develop methodologies that provide accurate estimates. In particular, the Basel Committee on Banking Supervision at the Bank for International Settlements requires financial institutions such as banks and investment firms to meet capital requirements based on VaR estimates. In this paper we determine VaR for a banking currency portfolio in compliance with the National Bank of Romania's rules on VaR reporting.
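
    A minimal variance-covariance VaR sketch for a two-currency position follows (exposures, volatilities and correlation are invented; the paper's portfolio and the National Bank of Romania's reporting parameters are not reproduced):

        import math

        # Hypothetical currency exposures (in EUR equivalent), daily volatilities
        # and correlation.
        exposure = {"USD": 5_000_000, "GBP": 3_000_000}
        vol = {"USD": 0.006, "GBP": 0.005}   # daily return std. dev. (assumed)
        rho = 0.4                            # USD-GBP correlation (assumed)
        Z_99 = 2.326                         # one-sided 99% normal quantile

        s_usd = exposure["USD"] * vol["USD"]
        s_gbp = exposure["GBP"] * vol["GBP"]
        portfolio_sigma = math.sqrt(s_usd**2 + s_gbp**2 + 2 * rho * s_usd * s_gbp)
        var_1d_99 = Z_99 * portfolio_sigma   # 1-day 99% value-at-risk
        print(f"1-day 99% VaR: EUR {var_1d_99:,.0f}")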

  7. Developing Probabilistic Operating Rules for Real-time Conjunctive Use of Surface and Groundwater Resources:Application of Support Vector Machines

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Bazargan-Lari

    2011-01-01

    Developing optimal operating policies for conjunctive use of surface and groundwater resources when different decision makers and stakeholders with conflicting objectives are involved is usually a challenging task. This problem becomes more complex when objectives related to surface and groundwater quality are taken into account. In this paper, a new methodology is developed for real-time conjunctive use of surface and groundwater resources. In the proposed methodology, a well-known multi-objective genetic algorithm, namely the Non-dominated Sorting Genetic Algorithm II (NSGA-II), is employed to develop a Pareto front among the objectives. The Young conflict resolution theory is also used for resolving the conflict of interests among decision makers. To develop the real-time conjunctive use operating rules, Probabilistic Support Vector Machines (PSVMs), which are capable of providing probability distribution functions of decision variables, are utilized. The proposed methodology is applied to the Tehran Aquifer in the Tehran metropolitan area, Iran. Stakeholders in the study area have some conflicting interests, including supplying water with acceptable quality, reducing pumping costs, improving groundwater quality and controlling groundwater table fluctuations. In the proposed methodology, MODFLOW and MT3D groundwater quantity and quality simulation models are linked with the NSGA-II optimization model to develop Pareto fronts among the objectives. The best solutions on the Pareto fronts are then selected using the Young conflict resolution theory. The selected solution (optimal monthly operating policies) is used to train and verify a PSVM. The results show the significance of applying an integrated conflict resolution approach and the capability of support vector machines for the real-time conjunctive use of surface and groundwater resources in the study area. It is also shown that the validation accuracy of the proposed operating rules is higher than 80
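
    The core of the NSGA-II step referenced above is non-dominated sorting; a minimal Pareto filter for two minimization objectives (illustrative only, without NSGA-II's ranking and crowding-distance machinery) looks like this:

        # Minimal Pareto filter for minimization objectives (illustrative only;
        # NSGA-II adds non-dominated sorting ranks and crowding distance).
        def dominates(a, b):
            """True if solution a is no worse in every objective and strictly
            better in at least one."""
            return (all(x <= y for x, y in zip(a, b))
                    and any(x < y for x, y in zip(a, b)))

        # Hypothetical (pumping cost, quality violation) objective pairs.
        solutions = [(3.2, 0.9), (2.8, 1.4), (3.0, 1.0), (2.8, 0.9), (4.0, 0.5)]
        pareto = [s for s in solutions
                  if not any(dominates(t, s) for t in solutions if t is not s)]
        print("non-dominated set:", pareto)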

  8. Purification of arsenic contaminated ground water using hydrated manganese dioxide

    International Nuclear Information System (INIS)

    Raje, N.; Swain, K.K.

    2002-01-01

    An analytical methodology has been developed for the separation of arsenic from ground water using inorganic material in neutral medium. The separation procedure involves the quantitative retention of arsenic on hydrated manganese dioxide, in neutral medium. The validity of the separation procedure has been checked by a standard addition method and radiotracer studies. Neutron activation analysis (NAA), a powerful measurement technique, has been used for the quantitative determination of arsenic. (author)

  9. 4. Principles of Art from Antiquity to Contemporary Pedagogy in the Context of Methodology of Art Education

    Directory of Open Access Journals (Sweden)

    Olimpiada Arbuz-Spatari

    2016-03-01

    The methodology of art education is a system of educational documents - principles, rules, methods, procedures, forms - designed to shape determinative-reflective thinking in terms of teleology, content, artistic/cultural/scientific communication, its reception and the receiving, communicating subject, and is oriented toward the educated subject - the student as creator - under the laws of education, communication and artistic principles.

  10. Safety of High Speed Ground Transportation Systems : Analytical Methodology for Safety Validation of Computer Controlled Subsystems : Volume 2. Development of a Safety Validation Methodology

    Science.gov (United States)

    1995-01-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety-critical functions in high-speed rail or magnetic levitation ...

  11. Quantifying reactor safety margins: Application of code scaling, applicability, and uncertainty evaluation methodology to a large-break, loss-of-coolant accident

    International Nuclear Information System (INIS)

    Boyack, B.; Duffey, R.; Wilson, G.; Griffith, P.; Lellouche, G.; Levy, S.; Rohatgi, U.; Wulff, W.; Zuber, N.

    1989-12-01

    The US Nuclear Regulatory Commission (NRC) has issued a revised rule for loss-of-coolant accident/emergency core cooling system (ECCS) analysis of light water reactors to allow the use of best-estimate computer codes in safety analysis as an option. A key feature of this option requires the licensee to quantify the uncertainty of the calculations and include that uncertainty when comparing the calculated results with acceptance limits provided in 10 CFR Part 50. To support the revised ECCS rule and illustrate its application, the NRC and its contractors and consultants have developed and demonstrated an uncertainty evaluation methodology called code scaling, applicability, and uncertainty (CSAU). The CSAU methodology and an example application described in this report demonstrate that uncertainties in complex phenomena can be quantified. The methodology is structured, traceable, and practical, as is needed in the regulatory arena. The methodology is systematic and comprehensive as it addresses and integrates the scenario, experiments, code, and plant to resolve questions concerned with: (a) code capability to scale-up processes from test facility to full-scale nuclear power plants; (b) code applicability to safety studies of a postulated accident scenario in a specified nuclear power plant; and (c) quantifying uncertainties of calculated results. 127 refs., 55 figs., 40 tabs

  12. Do Group Decision Rules Affect Trust? A Laboratory Experiment on Group Decision Rules and Trust

    DEFF Research Database (Denmark)

    Nielsen, Julie Hassing

    2016-01-01

    Enhanced participation has been prescribed as the way forward for improving democratic decision making while generating positive attributes like trust. Yet we do not know the extent to which rules affect the outcome of decision making. This article investigates how different group decision rules affect group trust by testing three ideal types of decision rules (i.e., a Unilateral rule, a Representative rule and a 'Non-rule') in a laboratory experiment. The article shows significant differences between the three decision rules on trust after deliberation. Interestingly, however, it finds that non-hierarchical decision-making procedures enhance trust vis-à-vis other more hierarchical decision-making procedures.

  13. ConGEMs: Condensed Gene Co-Expression Module Discovery Through Rule-Based Clustering and Its Application to Carcinogenesis

    Directory of Open Access Journals (Sweden)

    Saurav Mallik

    2017-12-01

    For transcriptomic analysis, there are numerous microarray-based genomic data, especially those generated for cancer research. The typical analysis measures the difference between a cancer sample-group and a matched control group for each transcript or gene. Association rule mining is used to discover interesting item sets through rule-based methodology. Thus, it has advantages for finding causal effect relationships between the transcripts. In this work, we introduce two new rule-based similarity measures—weighted rank-based Jaccard and Cosine measures—and then propose a novel computational framework to detect condensed gene co-expression modules (ConGEMs) through the association rule-based learning system and the weighted similarity scores. In practice, the list of evolved condensed markers, which consists of both singular and complex markers in nature, depends on the corresponding condensed gene sets in either the antecedent or the consequent of the rules of the resultant modules. In our evaluation, these markers could be supported by literature evidence, KEGG (Kyoto Encyclopedia of Genes and Genomes) pathway and Gene Ontology annotations. Specifically, we preliminarily identified differentially expressed genes using an empirical Bayes test. A recently developed algorithm—RANWAR—was then utilized to determine the association rules from these genes. Based on that, we computed the integrated similarity scores of these rule-based similarity measures between each rule-pair, and the resultant scores were used for clustering to identify the co-expressed rule-modules. We applied our method to a gene expression dataset for lung squamous cell carcinoma and a genome methylation dataset for uterine cervical carcinogenesis. Our proposed module discovery method produced better results than the traditional gene-module discovery measures. In summary, our proposed rule-based method is useful for exploring biomarker modules from transcriptomic data.
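
    The flavor of the rule-based similarity scores can be sketched as a weighted Jaccard over the gene sets of two rules, with weights decaying by rank; the paper's exact rank-weighting scheme and the Cosine variant are not reproduced here:

        # Sketch of a weighted Jaccard similarity between two rules' gene sets,
        # where each gene carries a rank-derived weight (illustrative weights).
        def weighted_jaccard(w_a, w_b):
            """w_a, w_b: dicts mapping gene -> weight for two rules."""
            genes = set(w_a) | set(w_b)
            num = sum(min(w_a.get(g, 0.0), w_b.get(g, 0.0)) for g in genes)
            den = sum(max(w_a.get(g, 0.0), w_b.get(g, 0.0)) for g in genes)
            return num / den if den else 0.0

        rule1 = {"TP53": 1.0, "EGFR": 0.5, "KRAS": 0.33}   # weights ~ 1/rank
        rule2 = {"TP53": 1.0, "KRAS": 0.5}
        print(f"similarity: {weighted_jaccard(rule1, rule2):.2f}")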

  14. ConGEMs: Condensed Gene Co-Expression Module Discovery Through Rule-Based Clustering and Its Application to Carcinogenesis.

    Science.gov (United States)

    Mallik, Saurav; Zhao, Zhongming

    2017-12-28

    For transcriptomic analysis, there are numerous microarray-based genomic data, especially those generated for cancer research. The typical analysis measures the difference between a cancer sample-group and a matched control group for each transcript or gene. Association rule mining is used to discover interesting item sets through rule-based methodology. Thus, it has advantages for finding causal effect relationships between the transcripts. In this work, we introduce two new rule-based similarity measures (weighted rank-based Jaccard and Cosine measures) and then propose a novel computational framework to detect condensed gene co-expression modules (ConGEMs) through the association rule-based learning system and the weighted similarity scores. In practice, the list of evolved condensed markers, which consists of both singular and complex markers in nature, depends on the corresponding condensed gene sets in either the antecedent or the consequent of the rules of the resultant modules. In our evaluation, these markers could be supported by literature evidence, KEGG (Kyoto Encyclopedia of Genes and Genomes) pathway and Gene Ontology annotations. Specifically, we preliminarily identified differentially expressed genes using an empirical Bayes test. A recently developed algorithm, RANWAR, was then utilized to determine the association rules from these genes. Based on that, we computed the integrated similarity scores of these rule-based similarity measures between each rule-pair, and the resultant scores were used for clustering to identify the co-expressed rule-modules. We applied our method to a gene expression dataset for lung squamous cell carcinoma and a genome methylation dataset for uterine cervical carcinogenesis. Our proposed module discovery method produced better results than the traditional gene-module discovery measures. In summary, our proposed rule-based method is useful for exploring biomarker modules from transcriptomic data.

  15. Verification of business rules programs

    CERN Document Server

    Silva, Bruno Berstel-Da

    2013-01-01

    Rules represent a simplified means of programming, congruent with our understanding of human brain constructs. With the advent of business rules management systems, it has been possible to introduce rule-based programming to nonprogrammers, allowing them to map expert intent into code in applications such as fraud detection, financial transactions, healthcare, retail, and marketing. However, a remaining concern is the quality, safety, and reliability of the resulting programs.  This book is on business rules programs, that is, rule programs as handled in business rules management systems. Its

  16. Playing by the Rules: Researching, Teaching and Learning Sexual Ethics with Young Men in the Australian National Rugby League

    Science.gov (United States)

    Albury, Kath; Carmody, Moira; Evers, Clifton; Lumby, Catharine

    2011-01-01

    In 2004, the Australian National Rugby League (NRL) commissioned the Playing By The Rules research project in response to allegations of sexual assault by members of a professional rugby league team. This article offers an overview of the theoretical and methodological approaches adopted by the team, and the subsequent workplace education…

  17. RIGHTS, RULES, AND DEMOCRACY

    Directory of Open Access Journals (Sweden)

    Richard S. Kay, University of Connecticut-School of Law, Estados Unidos

    2012-11-01

    Abstract: Democracy requires protection of certain fundamental rights, but can we expect courts to follow rules? There seems little escape from the proposition that substantive constitutional review by an unelected judiciary is a presumptive abridgement of democratic decision-making. Once we have accepted the proposition that there exist human rights that ought to be protected, this should hardly surprise us. No one thinks courts are perfect translators of the rules invoked before them on every occasion. But it is equally clear that rules sometimes do decide cases. In modern legal systems the relative roles of courts and legislators with respect to the rules of the system are a commonplace. Legislatures make rules. Courts apply them in particular disputes. When we are talking about human rights, however, that assumption must be clarified in at least one way. The defense of the practice of constitutional review in this article assumes courts can and do enforce rules. This article also makes clear the meaning of “following rules”. Preference for judicial over legislative interpretation of rights, therefore, seems to hang on the question of whether or not judges are capable of subordinating their own judgment to that incorporated in the rules by their makers. This article maintains that, in general, entrenched constitutional rules (and not just constitutional courts) can and do constrain public conduct and protect human rights. The article concludes that the value judgments will depend on our estimate of the benefits we derive from the process of representative self-government. Against those benefits we will have to measure the importance we place on being able to live our lives with the security created by a regime of human rights protected by the rule of law. Keywords: Democracy. Human Rights. Rules. Judicial Review.

  18. Canonical duality theory unified methodology for multidisciplinary study

    CERN Document Server

    Latorre, Vittorio; Ruan, Ning

    2017-01-01

    This book on canonical duality theory provides a comprehensive review of its philosophical origin, physics foundation, and mathematical statements in both finite- and infinite-dimensional spaces. A ground-breaking methodological theory, canonical duality theory can be used for modeling complex systems within a unified framework and for solving a large class of challenging problems in multidisciplinary fields in engineering, mathematics, and the sciences. This volume places a particular emphasis on canonical duality theory’s role in bridging the gap between non-convex analysis/mechanics and global optimization.  With 18 total chapters written by experts in their fields, this volume provides a nonconventional theory for unified understanding of the fundamental difficulties in large deformation mechanics, bifurcation/chaos in nonlinear science, and the NP-hard problems in global optimization. Additionally, readers will find a unified methodology and powerful algorithms for solving challenging problems in comp...

  19. US utility experience in implementing the maintenance rule. Inputs to the Spanish programme

    International Nuclear Information System (INIS)

    Hevia Ruperez, F.; Gregor, F.E.

    1996-01-01

    Following the issuance of the Maintenance Rule (10 CFR 50.65), the US utility industry developed detailed guidance for the implementation of this performance-based rule. Methods, processes and procedures are very flexible and have culminated in a variety of plant-specific applications. The experience gained over the last three years provides a valuable input towards the development of the Spanish Maintenance Rule Program. Lessons learned and insights gained from the NRC and utility meetings, workshops, plant audits and the interchange of documents, as well as results from the information-sharing utility working groups, are summarized for possible application to the specific Spanish scenario. While most of the implementation issues have been resolved, some open issues remain to be negotiated with the NRC. With the deadline for implementation compliance set for July 1996, US utilities are working diligently to settle these few differences. Once more, the results of these negotiations will provide good references for the specific application and licensing of these requirements in Spanish NPPs. The paper describes the situation in the US and the benefits, sequence and methodologies for transferring these lessons to Spanish plants. (Author)

  20. Convention on nuclear safety. Rules of procedure and financial rules

    International Nuclear Information System (INIS)

    1999-01-01

    The document is the first revision of the Rules of Procedure and Financial Rules that apply mutatis mutandis to any meetings of the Contracting Parties to the Convention on Nuclear Safety (INFCIRC/573), convened in accordance with Chapter 3 of the Convention

  1. Convention on Nuclear Safety. Rules of procedure and financial rules

    International Nuclear Information System (INIS)

    2002-01-01

    The document is the second revision of the Rules of Procedure and Financial Rules that apply mutatis mutandis to any meetings of the Contracting Parties to the Convention on Nuclear Safety (INFCIRC/573), convened in accordance with Chapter 3 of the Convention

  2. Service innovations breaking institutionalized rules of health care

    DEFF Research Database (Denmark)

    Wallin, Arto; Fuglsang, Lars

    2017-01-01

    ... institutionalized rules (i.e. regulations, normative rules, and cultural-cognitive beliefs) protecting the field by introducing digitally enabled service innovations into health care markets. Design/methodology/approach – The study is qualitative and interpretative in nature and utilizes case study as a research strategy. The paper is based on data that were collected through narrative interviews and document analysis from seven new ventures participating in a start-up accelerator program. Findings – Results indicate that service innovations that require a change in the institutional structures of the health care system are enacted through three highly iterative key processes: institutional sensemaking that creates an understanding of prevailing institutional arrangements and that constructs meaning for institutional change efforts, theorization of change through linguistic device, and modifications of institutions by building

  3. Accidental safety analysis methodology development in decommission of the nuclear facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. H.; Hwang, J. H.; Jae, M. S.; Seong, J. H.; Shin, S. H.; Cheong, S. J.; Pae, J. H.; Ang, G. R.; Lee, J. U. [Seoul National Univ., Seoul (Korea, Republic of)

    2002-03-15

    Decontamination and Decommissioning (D and D) of a nuclear reactor costs about 20% of the construction expense, and the production of nuclear wastes during decommissioning raises environmental issues. Decommissioning of nuclear reactors in Korea is just at a beginning stage, lacking clear standards and regulations for decommissioning. This work on accident safety analysis in decommissioning of nuclear facilities can provide a solid ground for such standards and regulations. For the source term analysis of the Kori-1 reactor vessel, an MCNP/ORIGEN calculation methodology was applied. The activity of each important nuclide in the vessel was estimated at a time after 2008, the year the Kori-1 plant is supposed to be decommissioned. A methodology for risk assessment in decommissioning was also developed.

  4. Biomechanical considerations of distance kicking in Australian Rules football.

    Science.gov (United States)

    Ball, Kevin

    2008-01-01

    Kicking for distance in Australian Rules football is an important skill. Here, I examine technical aspects that contribute to achieving maximal kick distance. Twenty-eight elite players kicked for distance while being videoed at 500 Hz. Two-dimensional digitized data of nine body landmarks and the football were used to calculate kinematic parameters from kicking foot toe-off to the instant before ball contact. Longer kick distances were associated with greater foot speeds and shank angular velocities at ball contact, larger last step lengths, and greater distances from the ground when ball contact occurred. Foot speed, shank angular velocity, and ball position relative to the support foot at ball contact were included in the best regression predicting distance. A continuum of technique was evident among the kickers. At one end, kickers displayed relatively larger knee angular velocities and smaller thigh angular velocities at ball contact. At the other end, kickers produced relatively larger thigh angular velocities and smaller knee angular velocities at ball contact. To increase kicking distance, increasing foot speed and shank angular velocity at ball contact, increasing the last step length, and optimizing ball position relative to the ground and support foot are recommended.

  5. Generating or developing grounded theory: methods to understand health and illness.

    Science.gov (United States)

    Woods, Phillip; Gapp, Rod; King, Michelle A

    2016-06-01

    Grounded theory is a qualitative research methodology that aims to explain social phenomena, e.g. why particular motivations or patterns of behaviour occur, at a conceptual level. Developed in the 1960s by Glaser and Strauss, the methodology has been reinterpreted by Strauss and Corbin in more recent times, resulting in different schools of thought. Differences arise from different philosophical perspectives concerning knowledge (epistemology) and the nature of reality (ontology), demanding that researchers make clear theoretical choices at the commencement of their research when choosing this methodology. Compared to other qualitative methods, it has the ability to achieve understanding of, rather than simply describe, a social phenomenon. Achieving understanding, however, requires theoretical sampling to choose interviewees that can contribute most to the research and understanding of the phenomenon, and constant comparison of interviews to evaluate the same event or process in different settings or situations. Sampling continues until conceptual saturation is reached, i.e. when no new concepts emerge from the data. Data analysis focusses on categorising data (finding the main elements of what is occurring and why), and describing those categories in terms of properties (conceptual characteristics that define the category and give meaning) and dimensions (the variations within properties which produce specificity and range). Ultimately, a core category which theoretically explains how all other categories are linked together is developed from the data. While achieving theoretical abstraction in the core category, it should be logical and capture all of the variation within the data. Theory development requires understanding of the methodology, not just working through a set of procedures. This article provides a basic overview, set in the literature surrounding grounded theory, for those wanting to increase their understanding and the quality of their research output.

  6. Modeling of earthquake ground motion in the frequency domain

    Science.gov (United States)

    Thrainsson, Hjortur

    In recent years, the utilization of time histories of earthquake ground motion has grown considerably in the design and analysis of civil structures. It is very unlikely, however, that recordings of earthquake ground motion will be available for all sites and conditions of interest. Hence, there is a need for efficient methods for the simulation and spatial interpolation of earthquake ground motion. In addition to providing estimates of the ground motion at a site using data from adjacent recording stations, spatially interpolated ground motions can also be used in design and analysis of long-span structures, such as bridges and pipelines, where differential movement is important. The objective of this research is to develop a methodology for rapid generation of horizontal earthquake ground motion at any site for a given region, based on readily available source, path and site characteristics, or (sparse) recordings. The research includes two main topics: (i) the simulation of earthquake ground motion at a given site, and (ii) the spatial interpolation of earthquake ground motion. In topic (i), models are developed to simulate acceleration time histories using the inverse discrete Fourier transform. The Fourier phase differences, defined as the difference in phase angle between adjacent frequency components, are simulated conditional on the Fourier amplitude. Uniformly processed recordings from recent California earthquakes are used to validate the simulation models, as well as to develop prediction formulas for the model parameters. The models developed in this research provide rapid simulation of earthquake ground motion over a wide range of magnitudes and distances, but they are not intended to replace more robust geophysical models. In topic (ii), a model is developed in which Fourier amplitudes and Fourier phase angles are interpolated separately. A simple dispersion relationship is included in the phase angle interpolation. The accuracy of the interpolation
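
    The simulation idea, an amplitude spectrum combined with accumulated phase differences and inverted with an inverse FFT, can be sketched as follows; the spectral shape and the phase-difference distribution below are placeholders, not the dissertation's fitted models:

        import numpy as np

        rng = np.random.default_rng(0)
        n, dt = 2048, 0.01                  # samples, time step (s)
        freqs = np.fft.rfftfreq(n, dt)

        # Placeholder Fourier amplitude spectrum (the study fits this to data).
        f0 = 2.0
        amp = freqs * np.exp(-freqs / f0)

        # Phase built by accumulating random phase differences; the study models
        # these differences conditional on the Fourier amplitude.
        phase_diff = rng.normal(loc=-2 * np.pi * 5.0 * dt, scale=0.5,
                                size=len(freqs))
        phase = np.cumsum(phase_diff)

        spectrum = amp * np.exp(1j * phase)
        accel = np.fft.irfft(spectrum, n)   # simulated acceleration history
        print("peak |a| (arbitrary units):", np.abs(accel).max())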

  7. Methodology for the inference of gene function from phenotype data.

    Science.gov (United States)

    Ascensao, Joao A; Dolan, Mary E; Hill, David P; Blake, Judith A

    2014-12-12

    Biomedical ontologies are increasingly instrumental in the advancement of biological research primarily through their use to efficiently consolidate large amounts of data into structured, accessible sets. However, ontology development and usage can be hampered by the segregation of knowledge by domain that occurs due to independent development and use of the ontologies. The ability to infer data associated with one ontology to data associated with another ontology would prove useful in expanding information content and scope. We here focus on relating two ontologies: the Gene Ontology (GO), which encodes canonical gene function, and the Mammalian Phenotype Ontology (MP), which describes non-canonical phenotypes, using statistical methods to suggest GO functional annotations from existing MP phenotype annotations. This work is in contrast to previous studies that have focused on inferring gene function from phenotype primarily through lexical or semantic similarity measures. We have designed and tested a set of algorithms that represents a novel methodology to define rules for predicting gene function by examining the emergent structure and relationships between the gene functions and phenotypes rather than inspecting the terms semantically. The algorithms inspect relationships among multiple phenotype terms to deduce if there are cases where they all arise from a single gene function. We apply this methodology to data about genes in the laboratory mouse that are formally represented in the Mouse Genome Informatics (MGI) resource. From the data, 7444 rule instances were generated from five generalized rules, resulting in 4818 unique GO functional predictions for 1796 genes. We show that our method is capable of inferring high-quality functional annotations from curated phenotype data. As well as creating inferred annotations, our method has the potential to allow for the elucidation of unforeseen, biologically significant associations between gene function and

  8. Ground ice and hydrothermal ground motions on aufeis plots of river valleys

    Directory of Open Access Journals (Sweden)

    V. R. Alekseev

    2015-01-01

    of river valleys are the most «hot» points of the permafrost zone. A comprehensive study of them requires organization of several reference aufeis test areas located in different natural-climatic and geocryological zones. In addition to the natural-historical and methodological aspects, the future research program should include consideration of problems related to interaction between engineering structures and aufeis events and aufeis ice-ground complexes. 

  9. Demonstration of a performance assessment methodology for nuclear waste isolation in basalt formations

    International Nuclear Information System (INIS)

    Bonano, E.J.; Davis, P.A.

    1988-01-01

    This paper summarizes the results of the demonstration of a performance assessment methodology developed by Sandia National Laboratories, Albuquerque for the US Nuclear Regulatory Commission for use in the analysis of high-level radioactive waste disposal in deep basalts. Seven scenarios that could affect the performance of a repository in basalts were analyzed. One of these scenarios, normal ground-water flow, was called the base-case scenario. This was used to demonstrate the modeling capabilities in the methodology necessary to assess compliance with the ground-water travel time criterion. The scenario analysis consisted of both scenario screening and consequence modeling. Preliminary analyses of scenarios considering heat released from the waste and the alteration of the hydraulic properties of the rock mass due to loads created by a glacier suggested that these effects would not be significant. The analysis of other scenarios indicated that those changing the flow field in the vicinity of the repository would have an impact on radionuclide discharges, while changes far from the repository may not be significant. The analysis of the base-case scenario was used to show the importance of matrix diffusion as a radionuclide retardation mechanism in fractured media. The demonstration of the methodology also included an overall sensitivity analysis to identify important parameters and/or processes. 15 refs., 13 figs., 2 tabs

  10. Selecting Tanker Steaming Speeds under Uncertainty: A Rule-Based Bayesian Reasoning Approach

    Directory of Open Access Journals (Sweden)

    N.S.F. Abdul Rahman

    2015-06-01

    In the tanker industry, there are many uncertain conditions that tanker companies have to deal with, for example the global financial crisis and economic recession, increases in bunker fuel prices, and global climate change. Such conditions have forced tanker companies to change tanker speed from full speed to slow speed, extra slow speed and super slow speed. Given such conditions, the objective of this paper is to present a methodology for determining the vessel speeds of tankers that minimize the cost of the vessels under such conditions. The four levels of vessel speed in the tanker industry are investigated, incorporating a number of uncertain conditions. This is done by developing a scientific model using a rule-based Bayesian reasoning method. The proposed model has produced 96 rules that can be used as guidance in the decision-making process. Such results help tanker companies to determine the appropriate vessel speed to be used in a dynamic operational environment.
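
    The rule-based Bayesian step can be illustrated with a toy posterior update over the four speed levels (priors and likelihoods are invented; the paper's 96 rules are not reproduced):

        # Toy rule-based Bayesian selection of a tanker speed level.
        speeds = ["full", "slow", "extra_slow", "super_slow"]
        prior = {s: 0.25 for s in speeds}   # uniform prior (assumed)

        # P(evidence | speed) for the observed conditions, here high bunker
        # prices and a weak freight market (hypothetical numbers).
        likelihood = {"full": 0.05, "slow": 0.20,
                      "extra_slow": 0.45, "super_slow": 0.30}

        unnorm = {s: prior[s] * likelihood[s] for s in speeds}
        z = sum(unnorm.values())
        posterior = {s: p / z for s, p in unnorm.items()}
        best = max(posterior, key=posterior.get)
        print(posterior, "->", best)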

  11. Westinghouse loading pattern search methodology for complex core designs

    International Nuclear Information System (INIS)

    Chao, Y.A.; Alsop, B.H.; Johansen, B.J.; Morita, T.

    1991-01-01

    Pressurized water reactor core designs have become more complex and must meet a plethora of design constraints. Trends have been toward longer cycles with increased discharge burnup, increased burnable absorber (BA) number, mixed BA types, reduced radial leakage, axially blanketed fuel, and multiple-batch feed fuel regions. Obtaining economical reload core loading patterns (LPs) that meet design criteria is a difficult task to do manually. Automated LP search tools are needed. An LP search tool cannot possibly perform an exhaustive search because of the sheer size of the combinatorial problem. On the other hand, evolving complexity of the design features and constraints often invalidates expert rules based on past design experiences. Westinghouse has developed a sophisticated loading pattern search methodology. This methodology is embodied in the LPOP code, which Westinghouse nuclear designers use extensively. The LPOP code generates a variety of LPs meeting design constraints and performs a two-cycle economic evaluation of the generated LPs. The designer selects the most appropriate patterns for fine tuning and evaluation by the design codes. This paper describes the major features of the LPOP methodology that are relevant to fulfilling the aforementioned requirements. Data and examples are also provided to demonstrate the performance of LPOP in meeting the complex design needs

  12. WellnessRules: A Web 3.0 Case Study in RuleML-Based Prolog-N3 Profile Interoperation

    Science.gov (United States)

    Boley, Harold; Osmun, Taylor Michael; Craig, Benjamin Larry

    An interoperation study, WellnessRules, is described, where rules about wellness opportunities are created by participants in rule languages such as Prolog and N3, and translated within a wellness community using RuleML/XML. The wellness rules are centered around participants, as profiles, encoding knowledge about their activities conditional on the season, the time-of-day, the weather, etc. This distributed knowledge base extends FOAF profiles with a vocabulary and rules about wellness group networking. The communication between participants is organized through Rule Responder, permitting wellness-profile translation and distributed querying across engines. WellnessRules interoperates between rules and queries in the relational (Datalog) paradigm of the pure-Prolog subset of POSL and in the frame (F-logic) paradigm of N3. An evaluation of Rule Responder instantiated for WellnessRules revealed acceptable Web response times.

  13. Multiple cues add up in defining a figure on a ground.

    Science.gov (United States)

    Devinck, Frédéric; Spillmann, Lothar

    2013-01-25

    We studied the contribution of multiple cues to figure-ground segregation. Convexity, symmetry, and top-down polarity (henceforth called wide base) were used as cues. Single-cue displays as well as ambiguous stimulus patterns containing two or three cues were presented. Error rate (defined by responses to uncued stimuli) and reaction time were used to quantify the figural strength of a given cue. In the first experiment, observers were asked to report which of two regions, left or right, appeared as foreground figure. Error rate did not benefit from adding additional cues if convexity was present, suggesting that responses were based on convexity as the predominant figural determinant. However, reaction time became shorter with additional cues even if convexity was present. For example, when symmetry and wide base were added, figure-ground segregation was facilitated. In a second experiment, stimulus patterns were exposed for 150 ms to rule out eye movements. Results were similar to those found in the first experiment. Both experiments suggest that under the conditions of our experiment figure-ground segregation is perceived more readily when several cues cooperate in defining the figure. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. A methodological approach for designing a usable ontology-based GUI in healthcare.

    Science.gov (United States)

    Lasierra, N; Kushniruk, A; Alesanco, A; Borycki, E; García, J

    2013-01-01

    This paper presents a methodological approach to the design and evaluation of an interface for an ontology-based system used for designing care plans for monitoring patients at home. In order to define the care plans, physicians need a tool for creating instances of the ontology and configuring some rules. Our purpose is to develop an interface to allow clinicians to interact with the ontology. Although ontology-driven applications do not necessarily present the ontology in the user interface, it is our hypothesis that showing selected parts of the ontology in a "usable" way could enhance clinicians' understanding and make the definition of the care plans easier. Based on prototyping and iterative testing, this methodology combines visualization techniques and usability methods. Preliminary results obtained after a formative evaluation indicate the effectiveness of the suggested combination.

  15. The role of traffic rules.

    NARCIS (Netherlands)

    Noordzij, P.C.

    1988-01-01

    Experienced road users seem to have their own set of traffic rules (including rules about when to violate the official rules). The number of violations is enormous, causing great concern for the authorities. The situation could be improved by separating a set of rules with the aim of deterring road

  16. Determination of the ground state of an Au-supported FePc film based on the interpretation of Fe K - and L -edge x-ray magnetic circular dichroism measurements

    Science.gov (United States)

    Natoli, Calogero R.; Krüger, Peter; Bartolomé, Juan; Bartolomé, Fernando

    2018-04-01

    We determine the magnetic ground state of the FePc molecule on Au-supported thin films based on the observed values of orbital anisotropy and spectroscopic x-ray magnetic circular dichroism (XMCD) measurements at the Fe K and L edges. Starting from ab initio molecular orbital multiplet calculations for the isolated molecule, we diagonalize the spin-orbit interaction in the subspace spanned by the three lowest spin triplet states of 3A2g and 3Eg symmetry in the presence of a saturating magnetic field at a polar angle θ with respect to the normal to the plane of the film, plus an external perturbation representing the effect of the molecules in the stack on the FePc molecule under consideration. We find that the orbital moment of the ground state strongly depends on the magnetic field direction in agreement with the sum rule analysis of the L2,3-edge XMCD data. We calculate integrals over the XMCD spectra at the Fe K and L2,3 edges as used in the sum rules and explicitly show that they agree with the expectation values of the orbital moment and effective spin moment of the ground state. On the basis of this analysis, we can rule out alternative candidates proposed in the literature.
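
    For orientation, the L2,3 sum rules invoked in this analysis are commonly quoted in the two-polarization form used for 3d transition metals (textbook form, not transcribed from this paper):

        m_{\mathrm{orb}} = -\frac{4\int_{L_3+L_2} (\mu^{+}-\mu^{-})\,dE}{3\int_{L_3+L_2} (\mu^{+}+\mu^{-})\,dE}\, n_h

        m_{\mathrm{spin}}^{\mathrm{eff}} = -\frac{6\int_{L_3} (\mu^{+}-\mu^{-})\,dE - 4\int_{L_3+L_2} (\mu^{+}-\mu^{-})\,dE}{\int_{L_3+L_2} (\mu^{+}+\mu^{-})\,dE}\, n_h

    where n_h is the number of 3d holes, mu+ and mu- are the absorption spectra for opposite photon helicities, and the effective spin moment contains the magnetic dipole contribution 7<T_z> alongside the spin moment.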

  17. Life fraction rules

    International Nuclear Information System (INIS)

    Maile, K.

    1989-01-01

    Evaluations for lifetime estimation of high-temperature-loaded HTR components under creep-fatigue load were performed. The evaluations were carried out on the basis of experimental data from strain-controlled fatigue tests, with and without hold times, performed on the material NiCr 22 Co 12 Mo (Inconel 617). Life prediction was made by means of the linear damage accumulation rule. Due to the high temperatures, no realistic estimates of creep damage can be obtained with this rule. Therefore the rule was modified. The modifications consist of a different analysis of the relaxation curve, including a different calculation of the creep damage estimate, and of an extended rule taking into consideration the interaction between creep and fatigue. To improve the transparency of results and to reduce data-set-dependent scatter, a round robin with a given data set was carried out. The round robin showed that for a given test temperature of T = 950 °C realistic estimates of damage can be obtained with each modification. Furthermore, a reduction of the resulting scatterbands in the interaction diagram can be observed, i.e. the practicability of the rule has been increased. (orig.)
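
    The linear damage accumulation rule being modified here is conventionally written as a sum of cycle fractions (fatigue) and time fractions (creep); the generic textbook form, not the paper's modifications, is

        D = \sum_i \frac{n_i}{N_{f,i}} + \sum_j \frac{t_j}{t_{r,j}} \le D_{\mathrm{allow}}

    with n_i the applied cycles at loading condition i, N_{f,i} the cycles to failure, t_j the hold time at condition j and t_{r,j} the creep-rupture time; the ideal rule takes D_allow = 1, while creep-fatigue interaction diagrams of the kind evaluated in the round robin lower the allowable sum.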

  18. Monitoring of arched shed ground layer

    International Nuclear Information System (INIS)

    Listjak, M.; Slaninka, A.; Rau, L.; Pajersky, P.

    2015-01-01

    The Arched Shed was part of the controlled area of the NPP A1 site in Jaslovske Bohunice (Slovakia). It had been used for temporary storage of loose radioactive waste (RAW), which was characterized within the BIDSF project C13, 'Characterisation of Loose Radioactive Waste'. The stored RAW was treated and sorted within the project 'Realization of the 2nd Stage of the Decommissioning Project of NPP A1'. The area of the Arched Shed is approximately 270 m² (45 m x 6 m). The ground layer of the Arched Shed consists mostly of soil with solid elements (stones and gravel). The aim of monitoring was to remove the contaminated soil up to 1 m below ground level. The requirement for detailed monitoring of the Arched Shed ground layer resulted from the conclusions of the BIDSF project C13, which proved that the massic activity of 137Cs in the soil was up to a few thousand Bq·kg⁻¹ in the underground layer. The dominant easy-to-measure radionuclide in the soil is 137Cs, which was used as the key radionuclide for the in-situ soil monitoring methodology. The following methods were applied during characterization: dose rate survey, sampling from a defined ground layer followed by laboratory gamma spectrometry analysis by the accredited testing laboratory of radiation dosimetry VUJE (S-219), and in-situ scintillation gamma spectrometry with a 1.5''x1.5'' LaBr detector. The massic activity of the remaining (not excavated) soil complies with the criteria for free release into the environment (Government Regulation of the Slovak Republic 345/2006 Coll.). The area was filled up with non-contaminated soil to the ground level of the surroundings. Afterwards the area was covered with geotextile and concrete panels, and nowadays it is ready for further usage within the NPP A1 decommissioning project as a place for treatment, conditioning and disposal of contaminated soil and concrete. (authors)

  19. Influence of mass-asymmetry and ground state spin on fission fragment angular distributions

    International Nuclear Information System (INIS)

    Thomas, R.G.; Biswas, D.C.; Saxena, A.; Pant, L.M.; Nayak, B.K.; Vind, R.P.; Sahu, P.K.; Sinha, Shrabani; Choudhury, R.K.

    2001-01-01

    The strong influence of the target and/or projectile ground state spin on the anomalously large anisotropies of fission fragments produced in the heavy-ion induced fission of actinide targets was reported earlier. Interestingly, all the systems studied had a mass asymmetry greater than the Businaro-Gallone critical asymmetry, and hence the presence of pre-equilibrium fission was unambiguously ruled out. The observed anisotropies were successfully explained using the ECD-K-States model. It is of interest to know the influence of the target/projectile ground state spin on systems having an entrance channel mass asymmetry less than the critical value, where pre-equilibrium fission cannot be ignored. With this motivation, we performed measurements of fission fragment angular distributions for the 16O + 235U (spin = 7/2) system

  20. A proposed heuristic methodology for searching reloading pattern

    International Nuclear Information System (INIS)

    Choi, K. Y.; Yoon, Y. K.

    1993-01-01

    A new heuristic method for loading pattern search has been developed to overcome shortcomings of the algorithmic approach. To reduce the size of the vast solution space, general shuffling rules, a regionwise shuffling method, and a pattern grouping method were introduced. Entropy theory was applied to classify possible loading patterns into groups based on the similarity between them. The pattern search program was implemented in the PROLOG language. A two-group nodal code, MEDIUM-2D, was used for analysis of the power distribution in the core. The above-mentioned methodology has been tested and shown to be effective in reducing the solution space down to a few hundred pattern groups. Burnable poison rods were then arranged in each pattern group in accordance with burnable poison distribution rules, which led to further reduction of the solution space to several scores of acceptable pattern groups. The methods of maximizing cycle length (MCL) and minimizing the power-peaking factor (MPF) were applied to search for specific useful loading patterns from the acceptable pattern groups. Thus, several specific loading patterns that have a low power-peaking factor and a long cycle length were successfully found among the selected pattern groups. (Author)

  1. Analysis of Rules for Islamic Inheritance Law in Indonesia Using Hybrid Rule Based Learning

    Science.gov (United States)

    Khosyi'ah, S.; Irfan, M.; Maylawati, D. S.; Mukhlas, O. S.

    2018-01-01

    Along with the development of human civilization in Indonesia, changes and reforms of Islamic inheritance law to conform to local conditions and culture cannot be denied. The distribution of inheritance in Indonesia can be done automatically by storing the rules of Islamic inheritance law in an expert system. In this study, we analyze the knowledge of experts in Islamic inheritance in Indonesia and represent it in the form of rules using rule-based Forward Chaining (FC) and Davis-Putnam-Logemann-Loveland (DPLL) algorithms. By hybridizing the FC and DPLL algorithms, the rules of Islamic inheritance law in Indonesia are clearly defined and measured. The rules were conceptually validated by experts in Islamic law and informatics. The results revealed that generally all rules were ready for use in an expert system.
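
    A minimal forward-chaining pass over inheritance facts might look as follows. The two shares encoded (wife 1/8 when the deceased leaves children, 1/4 otherwise) are standard Quranic fractions, but the paper's full rule base, its DPLL validation, and all other heirs are not reproduced:

        from fractions import Fraction

        # Tiny forward-chaining pass; a rule fires when its premises hold.
        facts = {"deceased_has_children": True, "wife_survives": True}
        shares = {}

        RULES = [
            # (premises, conclusion as (heir, share))
            ({"wife_survives": True, "deceased_has_children": True},
             ("wife", Fraction(1, 8))),
            ({"wife_survives": True, "deceased_has_children": False},
             ("wife", Fraction(1, 4))),
        ]

        changed = True
        while changed:                      # iterate to a fixed point
            changed = False
            for premises, (heir, share) in RULES:
                if heir not in shares and all(facts.get(k) == v
                                              for k, v in premises.items()):
                    shares[heir] = share
                    changed = True

        print(shares)   # {'wife': Fraction(1, 8)}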

  2. METHODOLOGY OF PROFESSIONAL PEDAGOGICAL EDUCATION: THEORY AND PRACTICE (theoretical and methodological foundations of vocational teacher education

    Directory of Open Access Journals (Sweden)

    Evgeny M. Dorozhkin

    2014-01-01

    The study is aimed at justifying a new approach to the problem of vocational education development through the prism of the interdependence of research methodology and practice. This conceptual setup allows determining the main directions for modernizing teacher training in vocational schools. The authors note that the current socio-economic situation in Russia has actualized the problem of personnel training. Politicians, economists and scientists' speeches are all about the shortage of skilled personnel, and they see the main reason for this catastrophic situation in the present system of primary and secondary vocational education. In particular, they are concerned about the current practice of training the pedagogical personnel of vocational education, who are expected to restore that system. Russia has great positive experience in solving this problem: the scientific-methodological centre for vocational teacher education is the Russian State Vocational Pedagogical University, under the scientific direction of Academician of the Russian Academy of Education G. M. Romantsev. Reflection on the scientific-theoretical bases of this education led the authors to the analysis and design (formation) of existing and new professional-pedagogical methodology. Methods. The fundamental position of A. M. Novikov on the generality of the methodology of research (scientific) and practical activity has become the theoretical platform of the present study. The conceptual field, conceptual statements and professional model are presented as a whole system (or integrating factor). The theoretical framework has determined the logic of the study and its results. Differentiating scientific and educational methodology in terms of the subject of cognitive activity has allowed identifying the main scientific and practical disciplines of vocational teacher education. The creative concept as the subject ground is instrumental analysis of

  3. A Better Budget Rule

    Science.gov (United States)

    Dothan, Michael; Thompson, Fred

    2009-01-01

    Debt limits, interest coverage ratios, one-off balanced budget requirements, pay-as-you-go rules, and tax and expenditure limits are among the most important fiscal rules for constraining intertemporal transfers. There is considerable evidence that the least costly and most effective of such rules are those that focus directly on the rate of…

  4. Challenges for Rule Systems on the Web

    Science.gov (United States)

    Hu, Yuh-Jong; Yeh, Ching-Long; Laun, Wolfgang

    The RuleML Challenge started in 2007 with the objective of inspiring the issues of implementation for management, integration, interoperation and interchange of rules in an open distributed environment, such as the Web. Rules are usually classified into three types: deductive rules, normative rules, and reactive rules. The reactive rules are further classified into ECA rules and production rules. The study of combining rules and ontologies traces back to earlier active rule systems for relational and object-oriented (OO) databases. Recently, this issue has become one of the most important research problems in the Semantic Web. Once we consider a computer-executable policy as a declarative set of rules and ontologies that guides the behavior of entities within a system, we have a flexible way to implement real-world policies without rewriting the computer code, as we did before. Fortunately, we have de facto rule markup languages, such as RuleML or RIF, to achieve the portability and interchange of rules between different rule systems; otherwise, executing real-life rule-based applications on the Web would be almost impossible. Several commercial or open source rule engines are available for rule-based applications. However, we still need a standard rule language and benchmark, not only to compare the rule systems but also to measure progress in the field. Finally, a number of real-life rule-based use cases will be investigated to demonstrate the applicability of current rule systems on the Web.
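
    Of the three rule types listed above, reactive ECA (event-condition-action) rules are perhaps the easiest to sketch; the dispatcher below is a toy illustration with invented events and rule names, not any particular engine's API:

        # Minimal event-condition-action (ECA) dispatcher (illustrative only).
        eca_rules = [
            {"event": "order_placed",
             "condition": lambda ctx: ctx["amount"] > 1000,
             "action": lambda ctx: print("flag for review:", ctx["order_id"])},
            {"event": "order_placed",
             "condition": lambda ctx: ctx["amount"] <= 1000,
             "action": lambda ctx: print("auto-approve:", ctx["order_id"])},
        ]

        def dispatch(event, ctx):
            # Fire every rule whose event matches and whose condition holds.
            for rule in eca_rules:
                if rule["event"] == event and rule["condition"](ctx):
                    rule["action"](ctx)

        dispatch("order_placed", {"order_id": "A-17", "amount": 2500})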

  5. Performance-based methodology for assessing seismic vulnerability and capacity of buildings

    Science.gov (United States)

    Shibin, Lin; Lili, Xie; Maosheng, Gong; Ming, Li

    2010-06-01

    This paper presents a performance-based methodology for the assessment of seismic vulnerability and capacity of buildings. The vulnerability assessment methodology is based on the HAZUS methodology and the improved capacity-demand-diagram method. The spectral displacement (Sd) of performance points on a capacity curve is used to estimate the damage level of a building. The relationship between Sd and peak ground acceleration (PGA) is established, and then a new vulnerability function is expressed in terms of PGA. Furthermore, the expected value of the seismic capacity index (SCev) is provided to estimate the seismic capacity of buildings based on the probability distribution of damage levels and the corresponding seismic capacity index. The results indicate that the proposed vulnerability methodology is able to assess seismic damage of a large building stock directly and quickly following an earthquake. The SCev provides an effective index to measure the seismic capacity of buildings and illustrate the relationship between the seismic capacity of buildings and seismic action. The estimated result is compared with damage surveys of the cities of Dujiangyan and Jiangyou in the M8.0 Wenchuan earthquake, revealing that the methodology is acceptable for seismic risk assessment and decision making. The primary reasons for discrepancies between the estimated results and the damage surveys are discussed.
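
    The expected seismic capacity index described above is an expectation over damage states. A minimal Python sketch of that computation, with hypothetical damage-state probabilities and capacity-index values (the paper's actual values are not reproduced here):

        # Hedged sketch: SCev as a probability-weighted mean of per-damage-state
        # capacity indices. All numbers below are placeholders, not the paper's data.

        damage_states = ["none", "slight", "moderate", "extensive", "complete"]
        p = [0.40, 0.30, 0.15, 0.10, 0.05]    # hypothetical P(damage state | PGA)
        sci = [1.00, 0.85, 0.60, 0.35, 0.10]  # hypothetical capacity index per state

        assert abs(sum(p) - 1.0) < 1e-9       # probabilities must sum to one
        sc_ev = sum(pi * si for pi, si in zip(p, sci))  # expected capacity index
        print(f"SCev = {sc_ev:.3f}")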

  6. Communicating rules in recreation areas

    Science.gov (United States)

    Terence L. Ross; George H. Moeller

    1974-01-01

    Five hundred fifty-eight campers were surveyed on the Allegheny National Forest to determine their knowledge of rules governing recreation behavior. Most of them were uninformed about the rules. Results of the study suggest that previous camping experience, age, camping style, and residence significantly affect knowledge of rules. Campers who received rule brochures or...

  7. Knowledge discovery about quality of life changes of spinal cord injury patients: clustering based on rules by states.

    Science.gov (United States)

    Gibert, Karina; García-Rudolph, Alejandro; Curcoll, Lluïsa; Soler, Dolors; Pla, Laura; Tormos, José María

    2009-01-01

    In this paper, an integral Knowledge Discovery Methodology, named Clustering based on rules by States, which incorporates artificial intelligence (AI) and statistical methods as well as interpretation-oriented tools, is used for extracting knowledge patterns about the evolution over time of the Quality of Life (QoL) of patients with Spinal Cord Injury. The methodology incorporates the interaction with experts as a crucial element with the clustering methodology to guarantee usefulness of the results. Four typical patterns are discovered by taking into account prior expert knowledge. Several hypotheses are elaborated about the reasons for psychological distress or decreases in QoL of patients over time. The knowledge discovery from data (KDD) approach turns out, once again, to be a suitable formal framework for handling multidimensional complexity of the health domains.

  8. Communication grounding facility

    International Nuclear Information System (INIS)

    Lee, Gye Seong

    1998-06-01

    This book covers communication grounding facilities in twelve chapters. It includes: general grounding, its purposes and materials, including thermal insulating materials; construction of grounding; super-strength grounding methods; grounding facilities, grounding methods and building insulation; switched grounding with No. 1A and LCR; grounding facilities for transmission lines; wireless facility grounding; grounding facilities in wireless base stations; grounding of power facilities; grounding of low-tension interior power wiring; communication facilities for railroads; installation of arresters in apartments and houses; and earth conductivity and grounding-resistance measurement.

  9. Familiar shapes attract attention in figure-ground displays.

    Science.gov (United States)

    Nelson, Rolf A; Palmer, Stephen E

    2007-04-01

    We report five experiments that explore the effect of figure-ground factors on attention. We hypothesized that figural cues, such as familiar shape, would draw attention to the figural side in an attentional cuing task using bipartite figure-ground displays. The first two experiments used faces in profile as the familiar shape and found a perceptual advantage for targets presented on the meaningful side of the central contour in detection speed (Experiment 1) and discrimination accuracy (Experiment 2). The third experiment demonstrated the figural advantage in response time (RT) with nine other familiar shapes (including a sea horse, a guitar, a fir tree, etc.), but only when targets appeared in close proximity to the contour. A fourth experiment obtained a figural advantage in a discrimination task with the larger set of familiar shapes. The final experiment ruled out eye movements as a possible confounding factor by replicating the RT advantage for targets on the figural side of face displays when all trials containing eye movements were eliminated. The results are discussed in terms of ecological influences on attention, and are cast within the framework of Yantis and Jonides's hypothesis that attention is exogenously drawn to the onset of new perceptual objects. We argue that the figural side constitutes an "object" whereas the ground side does not, and that figural cues such as shape familiarity are effective in determining which areas represent objects.

  10. On the prediction of building damage from ground motion

    Energy Technology Data Exchange (ETDEWEB)

    Blume, John A [John A. Blume and Associates Research Division, San Francisco, CA (United States)]

    1970-05-15

    In the planning of a nuclear event it is essential to consider the effects of the expected ground motion on all exposed buildings and other structures. There are various steps and procedures in this process which generally increase in scope and refinement as the preparations advance. Initial, rough estimates, based upon rules-of-thumb and preliminary predictions of ground motion and structural response, may be adequate to show general feasibility of the project. Subsequent work is done in both the field and analysis phases, to estimate the total structure exposure, to isolate special hazards, and to make damage cost estimates. Finally, specific analyses are made of special buildings or structures to identify safety problems and to make recommendations for safety measures during the proposed event. Because the ground motion and the structural response both involve many random variables and therefore some uncertainties in prediction, the probabilistic aspects must be considered, both on a broad statistical basis and for specific safety considerations. Decisions must be made as to the acceptability or non-acceptability of the risks and any indicated procedures before and during the event to reduce or to eliminate the risks. The paper discusses various techniques involved in these operations including the Spectral Matrix Method of damage prediction, the Threshold Evaluation Scale for specific building analysis, and the inelastic and probabilistic aspects of the problem. (author)

  11. On the prediction of building damage from ground motion

    International Nuclear Information System (INIS)

    Blume, John A.

    1970-01-01

    In the planning of a nuclear event it is essential to consider the effects of the expected ground motion on all exposed buildings and other structures. There are various steps and procedures in this process which generally increase in scope and refinement as the preparations advance. Initial, rough estimates, based upon rules-of-thumb and preliminary predictions of ground motion and structural response, may be adequate to show general feasibility of the project. Subsequent work is done in both the field and analysis phases, to estimate the total structure exposure, to isolate special hazards, and to make damage cost estimates. Finally, specific analyses are made of special buildings or structures to identify safety problems and to make recommendations for safety measures during the proposed event. Because the ground motion and the structural response both involve many random variables and therefore some uncertainties in prediction, the probabilistic aspects must be considered, both on a broad statistical basis and for specific safety considerations. Decisions must be made as to the acceptability or non-acceptability of the risks and any indicated procedures before and during the event to reduce or to eliminate the risks. The paper discusses various techniques involved in these operations including the Spectral Matrix Method of damage prediction, the Threshold Evaluation Scale for specific building analysis, and the inelastic and probabilistic aspects of the problem. (author)

  12. The influence of the new Basel regulation rules on the Slovak banking sector

    OpenAIRE

    Emília Zimková; Jana Tašková

    2012-01-01

    The aim of the paper is to quantify the impact of the new Basel regulation rules, known as Basel III, on the Slovak banking sector. We apply the methodology provided by the Bank for International Settlements in its monitoring workbook; as a database, a set of central bank statements and reports provided upon request has been used. Based on our calculations we identified three main impacts of Basel III on the Slovak banking sector: i) the volume and quality of the capital meet re...

  13. Using grounded theory methodology to conceptualize the mother-infant communication dynamic: potential application to compliance with infant feeding recommendations.

    Science.gov (United States)

    Waller, Jennifer; Bower, Katherine M; Spence, Marsha; Kavanagh, Katherine F

    2015-10-01

    Excessive, rapid weight gain in early infancy has been linked to risk of later overweight and obesity. Inappropriate infant feeding practices associated with this rapid weight gain are currently of great interest. Understanding the origin of these practices may increase the effectiveness of interventions. Low-income populations in the Southeastern United States are at increased risk for development of inappropriate infant feeding practices, secondary to the relatively low rates of breastfeeding reported from this region. The objective was to use grounded theory methodology (GTM) to explore interactions between mothers and infants that may influence development of feeding practices, and to do so among low-income, primiparous, Southeastern United States mothers. Analysis of 15 in-depth phone interviews resulted in development of a theoretical model in which Mother-Infant Communication Dynamic emerged as the central concept. The central concept suggests a communication pattern developed over the first year of life, based on a positive feedback loop, which is harmonious and results in the maternal perception of mother and infant now speaking the same language. Importantly, though harmonious, this dynamic may result from inaccurate maternal interpretation of infant cues and behaviours, subsequently leading to inappropriate infant feeding practices. Future research should test this theoretical model using direct observation of mother-infant communication, to increase the understanding of maternal interpretation of infant cues. Subsequently, interventions targeting accurate maternal interpretation of and response to infant cues, and impact on rate of infant weight gain could be tested. If effective, health care providers could potentially use these concepts to attenuate excess rapid infant weight gain. © 2013 John Wiley & Sons Ltd.

  14. PICOREF: carbon sequestration in geological reservoirs in France. Map of the unknown 'ground motion'. Final report

    International Nuclear Information System (INIS)

    Rohmer, J.; Lembezat, C.

    2006-01-01

    In the framework of the PICOREF project, 'CO2 sequestration in geological reservoirs in France', two main objectives were set: the characterization of suitable French sites and the drafting of a document for requesting storage authorization, including a methodology to survey and study the storage site. This report aims to define the unknown ground motion whose impact could present a risk for the surface installations. The project is presented, as well as the geological context and the proposed methodology. (A.L.B.)

  15. Methodology to evaluate the site standard seismic motion to a nuclear facility

    International Nuclear Information System (INIS)

    Soares, W.A.

    1983-01-01

    For the seismic design of nuclear facilities, the input motion is normally defined by the predicted maximum ground horizontal acceleration and the free field ground response spectrum. This spectrum is computed on the basis of records of strong motion earthquakes. The pair maximum acceleration-response spectrum is called the site standard seismic motion. An overall view of the subjects involved in the determination of the site standard seismic motion to a nuclear facility is presented. The main topics discussed are: basic principles of seismic instrumentation; dynamic and spectral concepts; design earthquakes definitions; fundamentals of seismology; empirical curves developed from prior seismic data; available methodologies and recommended procedures to evaluate the site standard seismic motion. (Author) [pt

  16. Correlation of horizontal and vertical components of strong ground motion for response-history analysis of safety-related nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Yin-Nan, E-mail: ynhuang@ntu.edu.tw [Dept. of Civil Engineering, National Taiwan University, No. 1, Sec. 4, Roosevelt Rd., Taipei 10617, Taiwan (China); Yen, Wen-Yi, E-mail: b01501059@ntu.edu.tw [Dept. of Civil Engineering, National Taiwan University, No. 1, Sec. 4, Roosevelt Rd., Taipei 10617, Taiwan (China); Whittaker, Andrew S., E-mail: awhittak@buffalo.edu [Dept. of Civil, Structural and Environmental Engineering, MCEER, State University of New York at Buffalo, Buffalo, NY 14260 (United States)

    2016-12-15

    Highlights: • The correlation of components of ground motion is studied using 1689 sets of records. • The data support an upper bound of 0.3 on the correlation coefficient. • The data support the related requirement in the upcoming edition of ASCE Standard 4. - Abstract: Design standards for safety-related nuclear facilities such as ASCE Standard 4-98 and ASCE Standard 43-05 require the correlation coefficient for two orthogonal components of ground motions for response-history analysis to be less than 0.3. The technical basis of this requirement was developed by Hadjian three decades ago using 50 pairs of recorded ground motions that were available at that time. In this study, correlation coefficients for (1) two horizontal components, and (2) the vertical component and one horizontal component, of a set of ground motions are computed using records from a ground-motion database compiled recently for large-magnitude shallow crustal earthquakes. The impact of the orientation of the orthogonal horizontal components on the correlation coefficient of ground motions is discussed. The rules in the forthcoming edition of ASCE Standard 4 for the correlation of components in a set of ground motions are shown to be reasonable.
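
    In practice, the 0.3 criterion above is checked by computing the Pearson correlation coefficient between two acceleration time histories. A minimal numpy sketch with synthetic stand-in records (the 1689-set database itself is not used here):

        import numpy as np

        rng = np.random.default_rng(0)
        # Synthetic stand-ins for two orthogonal acceleration components.
        a_h1 = rng.standard_normal(4000)
        a_h2 = rng.standard_normal(4000)

        # Pearson correlation coefficient of the two components.
        rho = np.corrcoef(a_h1, a_h2)[0, 1]
        print(f"rho = {rho:.3f}; satisfies |rho| < 0.3: {abs(rho) < 0.3}")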

  17. Organisational Rules in Schools: Teachers' Opinions about Functions of Rules, Rule-Following and Breaking Behaviours in Relation to Their Locus of Control

    Science.gov (United States)

    Demirkasimoglu, Nihan; Aydin, Inayet; Erdogan, Cetin; Akin, Ugur

    2012-01-01

    The main aim of this research is to examine teachers' opinions about functions of school rules, reasons for rule-breaking and results of rule-breaking in relation to their locus of control, gender, age, seniority and branch. 350 public elementary school teachers in Ankara are included in the correlational survey model study. According to the…

  18. The Frustrations of Reader Generalizability and Grounded Theory: Alternative Considerations for Transferability

    Directory of Open Access Journals (Sweden)

    Thomas Misco

    2007-01-01

    Full Text Available In this paper I convey a recurring problem and possible solution that arose during my doctoral research on the topic of cross-cultural Holocaust curriculum development for Latvian schools. Specifically, as I devised the methodology for my research, I experienced a number of frustrations concerning the issue of transferability and the limitations of both reader generalizability and grounded theory. Ultimately, I found a more appropriate goal for the external applicability of this and other highly contextual research studies in the form of "grounded understandings," which are tentative apprehensions of the importance or significance of phenomena and conceptualizations that hold meaning and explanatory power, but are only embryonic in their potential to generate theory.

  19. Binary effectivity rules

    DEFF Research Database (Denmark)

    Keiding, Hans; Peleg, Bezalel

    2006-01-01

    is binary if it is rationalized by an acyclic binary relation. The foregoing result motivates our definition of a binary effectivity rule as the effectivity rule of some binary SCR. A binary SCR is regular if it satisfies unanimity, monotonicity, and independence of infeasible alternatives. A binary...

  20. Technical rules in law

    Energy Technology Data Exchange (ETDEWEB)

    Debelius, J

    1978-08-01

    An important source of knowledge for technical experts is the state of the art reflected by catalogues of technical rules. Technical rules may also achieve importance in law due to a legal transformation standard. Here, rigid and flexible reference are controversial with regard to their admissibility from the point of view of constitutional law. In case of a divergence from the generally accepted technical rules, it is assumed - refutably - that the necessary care had not been taken. Technical rules are one out of several sources of information; they have no normative effect. This may result in a duty of anyone applying them to review the state of technology himself.

  1. Technical rules in law

    International Nuclear Information System (INIS)

    Debelius, J.

    1978-01-01

    An important source of knowledge for technical experts is the state of the art reflected by catalogues of technical rules. Technical rules may also achieve importance in law due to a legal transformation standard. Here, rigid and flexible reference are controversial with regard to their admissibility from the point of view of constitutional law. In case of a divergence from the generally accepted technical rules, it is assumed - refutably - that the necessary care had not been taken. Technical rules are one out of several sources of information; they have no normative effect. This may result in a duty of anyone applying them to review the state of technology himself. (orig.) [de

  2. Ground Pollution Science

    International Nuclear Information System (INIS)

    Oh, Jong Min; Bae, Jae Geun

    1997-08-01

    This book deals with ground pollution science and soil science: the classification and fundamentals of soils; ground pollution and humans; ground pollution and organic matter; ground pollution and the urban environment; global environmental problems and ground pollution; soil pollution and the development of the geological features of the ground; ground pollution and waste landfill; and case studies of ground pollution measurement.

  3. A. S. Hornby and 50 Years of the Hornby Trust

    Science.gov (United States)

    Smith, Richard; Bowers, Roger

    2012-01-01

    A. S. Hornby can justly be considered the "father" of UK-based ELT. He was the founder and first Editor of English Language Teaching (now known as ELT Journal); he established the ground rules for situational language teaching, the dominant ELT methodology in the United Kingdom up until the 1970s; he was the chief originator of the…

  4. On the appropriateness of public participation in Integrated Water Resources Management: some grounded insights from the Levant

    NARCIS (Netherlands)

    Ker Rault, P.A.; Jeffrey, P.

    2008-01-01

    Although public participation in the service of Integrated Water Resources Management has attracted much attention as a practice, little is known about stakeholders' understandings of and expectations towards the process. Using a grounded approach we develop an interpretive methodological framework

  5. Geophysical assessment of near-field ground motion and the implications for the design of nuclear installations

    International Nuclear Information System (INIS)

    Bernreuter, D.L.

    1977-01-01

    This paper gives an in-depth discussion of the various methodologies currently available to predict the near-field ground motion from an earthquake. The limitations of the various methods are discussed in some detail in light of recently available data. It is shown that, at least for California earthquakes, a wide variation in peak ground motion can occur for an earthquake of a given magnitude. The change in the spectral content of the ground motion is given as a function of earthquake magnitude and peak ground acceleration. It is shown that the large g values associated with small earthquakes are relatively unimportant in design provided the structures have a modest amount of ductility. Data recently obtained from the Friuli earthquake are also examined. Although not all the geophysical data are currently available, the provisional conclusion is reached that the relation between the strong ground motion from this earthquake and its source parameters is the same as for the western United States

  6. Light-cone distribution amplitudes of the ground state bottom baryons in HQET

    Energy Technology Data Exchange (ETDEWEB)

    Ali, A.; Wang, W. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Hambrock, C. [Technische Univ. Dortmund (Germany); Parkhomenko, A.Ya. [P.G. Demidov Yaroslavl State Univ., Yaroslavl (Russian Federation)]

    2012-12-15

    We provide the definition of the complete set of light-cone distribution amplitudes (LCDAs) for the ground state heavy bottom baryons with the spin-parities J^P = 1/2^+ and J^P = 3/2^+ in the heavy quark limit. We present the renormalization effects on the twist-2 light-cone distribution amplitudes and use the QCD sum rules to compute the moments of the twist-2, twist-3, and twist-4 LCDAs. Simple models for the heavy baryon distribution amplitudes are analyzed with account of their scale dependence.

  7. Methodology for the evaluation of tolerability of defects in WWER-1000/V-320 reactor pressure vessels

    International Nuclear Information System (INIS)

    Brumovsky, M.; Horacek, L.; Ruscak, M.

    1996-05-01

    The methodology provides guidelines for the assessment of tolerability of defects found during in-service inspection of the base material and overlay of WWER-1000/V-320 type reactor pressure vessels. With regard to the method of calculating the tolerability of defects and rules for the preparation and implementation of repairs, this methodology can also find use in the assessment of tolerability of defects in selected facilities of WWER-1000/V-320 type nuclear power plants provided that adequate input data concerning the materials, manufacturing technology, and operating load regime are available and that the facilities are made of ferrite/bainite type steels. This methodology should serve as a binding document underlying the development of a technical approach to provisions for a further operation of facilities in which intolerable defects have been found by nondestructive testing. (author)

  8. Following the Rules.

    Science.gov (United States)

    Katz, Anne

    2016-05-01

    I am getting better at following the rules as I grow older, although I still bristle at many of them. I was a typical rebellious teenager; no one understood me, David Bowie was my idol, and, one day, my generation was going to change the world. Now I really want people to understand me: David Bowie remains one of my favorite singers and, yes, my generation has changed the world, and not necessarily for the better. Growing up means that you have to make the rules, not just follow those set by others, and, at times, having rules makes a lot of sense.

  9. Heuristic simulation of nuclear systems on a supercomputer using the HAL-1987 general-purpose production-rule analysis system

    International Nuclear Information System (INIS)

    Ragheb, M.; Gvillo, D.; Makowitz, H.

    1987-01-01

    HAL-1987 is a general-purpose tool for the construction of production-rule analysis systems. It uses the rule-based paradigm from the part of artificial intelligence concerned with knowledge engineering. It uses backward-chaining and forward-chaining in an antecedent-consequent logic, and is programmed in Portable Standard Lisp (PSL). The inference engine is flexible and accommodates general additions and modifications to the knowledge base. The system is used in coupled symbolic-procedural programming adaptive methodologies for stochastic simulations. In Monte Carlo simulations of particle transport, the system considers the pre-processing of the input data to the simulation and adaptively controls the variance reduction process as the simulation progresses. This is accomplished through the use of a knowledge base of rules which encompass the user's expertise in the variance reduction process. It is also applied to the construction of model-based systems for monitoring, fault-diagnosis and crisis-alert in engineering devices, particularly in the field of nuclear reactor safety analysis
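
    To make the adaptive variance-reduction idea concrete, here is a hedged Python sketch of a single production rule that increases a particle-splitting parameter when the running variance estimate stalls; the thresholds, names and numbers are invented for illustration and do not come from HAL-1987.

        # Illustrative sketch only: one rule adapting a variance-reduction
        # parameter between Monte Carlo batches. Thresholds are hypothetical.

        def adapt_splitting(history, splitting, stall_tol=0.01, max_splitting=16):
            """If relative variance improved by less than stall_tol between the
            last two checkpoints, double the particle splitting (capped)."""
            if len(history) >= 2:
                improvement = history[-2] - history[-1]
                if improvement < stall_tol * history[-2]:
                    return min(splitting * 2, max_splitting)
            return splitting

        variance_checkpoints = [0.0800, 0.0795]  # relative variance per batch
        splitting = adapt_splitting(variance_checkpoints, splitting=2)
        print(splitting)  # -> 4: improvement (0.0005) is below 1% of 0.08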

  10. Firm heterogeneity, Rules of Origin and Rules of Cumulation

    OpenAIRE

    Bombarda , Pamela; Gamberoni , Elisa

    2013-01-01

    We analyse the impact of relaxing rules of origin (ROOs) in a simple setting with heterogeneous firms that buy intermediate inputs from domestic and foreign sources. In particular, we consider the impact of switching from bilateral to diagonal cumulation when using preferences (instead of paying the MFN tariff) involving the respect of rules of origin. We find that relaxing the restrictiveness of the ROOs leads the least productive exporters to stop exporting. The empirical part confirms thes...

  11. Grounded theory as a methodological approach: reports of a field experience

    Directory of Open Access Journals (Sweden)

    Marcelo de Rezende Pinto

    2012-09-01

    Full Text Available Since it is already possible to find in Brazil some articles addressing the history, typologies and main characteristics of grounded theory, this paper aims to contribute to a broader discussion of this methodological approach as a style of doing research. Specifically, the paper attempts to describe a field experience and, above all, to recount the saga of a researcher faced with the challenge of putting grounded theory into practice. To this end, the paper is divided into three distinct parts. In the first part, we present grounded theory broadly, introducing some of its fundamental principles. The second part describes the fieldwork that was carried out - along the lines of grounded theory - with the objective of investigating how Brazilian consumers from the lower-income classes live their experiences of consuming electronic products. The third and final part presents some reflections on the practical demands of "operationalizing" research committed to the "spirit" of grounded theory, as well as the doubts, dilemmas, difficulties and anxieties experienced throughout the whole research process, told by the one who went through them. ----- The Grounded Theory as Methodological Approach: reports of a field experience ----- ABSTRACT ----- As there are few articles that address issues relating to the history, types and main characteristics of grounded theory in Brazil, this paper aims to further the discussion of this methodological approach as a way of doing research. More specifically, the paper describes a field experience, and in particular the history of a researcher involved with the challenge of putting grounded theory into practice. The work is divided into three distinct parts. First we present grounded theory broadly, introducing some of its fundamental

  12. The Product and Quotient Rules Revisited

    Science.gov (United States)

    Eggleton, Roger; Kustov, Vladimir

    2011-01-01

    Mathematical elegance is illustrated by strikingly parallel versions of the product and quotient rules of basic calculus, with some applications. Corresponding rules for second derivatives are given: the product rule is familiar, but the quotient rule is less so.
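
    For reference, the standard first-derivative rules whose parallel structure the abstract highlights, together with the familiar product rule for second derivatives, can be written as follows (standard calculus identities, not the paper's own notation):

        \[ (fg)' = f'g + fg', \qquad \left(\frac{f}{g}\right)' = \frac{f'g - fg'}{g^2}, \]
        \[ (fg)'' = f''g + 2f'g' + fg''. \]

    The parallelism is visible in the numerators f'g + fg' and f'g - fg', which differ only in sign.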

  13. Methodology of safety evaluation about land disposal of low level radioactive wastes

    International Nuclear Information System (INIS)

    Suzuki, Atsuyuki

    1986-01-01

    With the progress of the construction project for low level radioactive waste storage facilities in Aomori Prefecture, full scale land disposal of low level radioactive wastes is approaching in Japan as well. In this report, the scientific methodology for demonstrating the safety of the land disposal of low level radioactive wastes is discussed. The land disposal of general wastes by shallow burying already has a sufficient record of experience. In the case of low level radioactive wastes, land disposal by shallow burying is also considered. Low level radioactive wastes can be regarded as one form of industrial wastes, as the scientific and theoretical bases of their safety have much in common. Attention is paid most to the contamination of ground water. Low level radioactive wastes are solid wastes; accordingly, the degree of contamination should be less. The space in which ground water exists, the phenomena of ground water movement, the phenomena of ground water dispersion and Fick's law, the adsorption effect of strata, and the evaluation of the source term are explained. These are the methods to analyze the degree of contamination from the safety evaluation viewpoint. (Kako, I.)

  14. Sensitivity of grounding line dynamics to basal conditions

    Science.gov (United States)

    Gagliardini, O.; Brondex, J.; Chauveau, G.; Gillet-chaulet, F.; Durand, G.

    2017-12-01

    In the context of a warming climate, the dynamical contribution of Antarctica to future sea level rise is still affected by high uncertainties. Among the processes contributing to these uncertainties is the link between basal hydrology, friction and grounding line dynamics. Recent works have shown how sensitive the response of grounding line retreat is to the choice of the form of the friction law. Indeed, starting from the same initial state, grounding line retreat rates can range over almost two orders of magnitude depending on the friction law formulation. Here, we use a phenomenological law that depends on the water pressure and allows a continuous transition from a Weertman-type friction at low water pressure to a Coulomb-type friction at high water pressure. This friction law depends on two main parameters that control the Weertman and Coulomb regimes. The range of values for these two parameters is only weakly constrained physically, and it can be shown that, for a given basal shear stress, different pairs of parameters can lead to the same sliding velocity. In addition, we show that close to the grounding line, where basal water pressure is high, determining these two parameters may lead to an ill-posed inverse problem with no solution. The aim of this presentation is to discuss a methodology to guide the choice of the two friction parameters and explore the sensitivity of the grounding line dynamics to this initial choice. We present results obtained both on a synthetic configuration used by the Marine Ice Sheet Model Intercomparison exercise and for the Amundsen Sea sector using the experiments proposed by InitMIP-Antarctica, the first exercise in a series of ISMIP6 ice-sheet model intercomparison activities.

  15. Assessment of potential strong ground motions in the city of Rome

    Directory of Open Access Journals (Sweden)

    L. Malagnini

    1994-06-01

    Full Text Available A methodology is used which combines stochastic generation of random series with a finite-difference technique to estimate the expected horizontal ground motion for the city of Rome as induced by a large earthquake in the Central Apennines. In this approach, source properties and long-path propagation are modelled through observed spectra of ground motion in the region, while the effects of the near-surface geology in the city are simulated by means of a finite-difference technique applied to 2-D models including elastic and anelastic properties of geologic materials and topographic variations. The parameters commonly used for earthquake engineering purposes are estimated from the simulated time histories of horizontal ground motion. We focus our attention on peak ground acceleration and velocity, and on the integrals of the squared acceleration and velocity (proportional to the Arias intensity and seismic energy flux, respectively). Response spectra are analyzed as well. Parameter variations along 2-D profiles visualize the effects of the small-scale geological heterogeneities and topographic irregularities on ground motion in the case of a strong earthquake. Interestingly, the largest amplification of peak ground acceleration and Arias intensity does not necessarily occur at the same sites where peak ground velocity and flux of seismic energy reach their highest values, depending on the frequency band of amplification. A magnitude 7 earthquake at a distance of 100 km results in peak ground accelerations ranging from 30 to 70 gals, while peak ground velocities are estimated to vary from 5 to 7 cm/s; moreover, simulated time histories of horizontal ground motion yield amplitudes of 5% damped pseudovelocity response spectra as large as 15-20 cm/s for frequencies from 1 to 3 Hz. In this frequency band, the mean value is 7 cm/s for firm sites and ranges from 10 to 13 cm/s for soil sites. All these results are in good agreement with predictions
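
    The engineering parameters named above are straightforward to compute from a simulated time history. A minimal numpy sketch with a synthetic signal, using the standard definition of the Arias intensity, I_A = (pi/2g) * integral of a(t)^2 dt, which matches the "integral of the squared acceleration" up to the constant factor:

        import numpy as np

        g = 9.81                     # m/s^2
        dt = 0.01                    # s, sampling interval
        t = np.arange(0.0, 20.0, dt)
        # Synthetic stand-in for a simulated horizontal acceleration history (m/s^2).
        a = 0.5 * np.exp(-0.2 * t) * np.sin(2 * np.pi * 1.5 * t)

        pga = np.max(np.abs(a))                            # peak ground acceleration
        arias = (np.pi / (2 * g)) * np.trapz(a**2, dx=dt)  # Arias intensity (m/s)
        print(f"PGA = {pga:.3f} m/s^2, Arias intensity = {arias:.5f} m/s")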

  16. 75 FR 51934 - Telemarketing Sales Rule

    Science.gov (United States)

    2010-08-24

    ... FEDERAL TRADE COMMISSION 16 CFR Part 310 Telemarketing Sales Rule AGENCY: Federal Trade Commission. ACTION: Final rule; correction. SUMMARY: The Federal Trade Commission (``Commission'') published a final rule on August 10, 2010, adopting amendments to the Telemarketing Sales Rule that address the...

  17. METHODOLOGY OF PROFESSIONAL PEDAGOGICAL EDUCATION: THEORY AND PRACTICE (THEORETICAL AND METHODOLOGICAL FOUNDATIONS OF VOCATIONAL TEACHER EDUCATION

    Directory of Open Access Journals (Sweden)

    E. M. Dorozhkin

    2014-01-01

    Full Text Available The study is aimed at justifying a new approach to the problem of vocational education development through the prism of the interdependence of research methodology and practice. This conceptual setup allows determining the main directions for the modernization of teacher training for vocational schools. The authors note that the current socio-economic situation in our country has actualized the problem of personnel training. Politicians', economists' and scientists' speeches all point to a shortage of skilled personnel. They see the main reason for this catastrophic situation in the present system of primary and secondary vocational education. In particular, they are concerned about the current practice of training the pedagogical personnel of vocational education who are to restore the system of vocational education. Our country, Russia, has a great deal of positive experience in solving this problem. The scientific-methodological centre for vocational teacher education is the Russian State Vocational Pedagogical University, under the scientific direction of Academician of the Russian Academy of Education G. M. Romantsev. Reflection on the scientific-theoretical bases of this education led the authors to the analysis and designing (formation) of existing and new professional and pedagogical methodology. Methods. The fundamental position of A. M. Novikov on the generality of the methodology of research (scientific) and practical activity has become the theoretical platform of the present study. The conceptual field, conceptual statements and professional model are presented as a whole system (or integrating factor). The theoretical framework has determined the logic of the study and its results. Differentiating scientific and educational methodology in terms of the subject of cognitive activity has allowed identifying the main scientific and practical disciplines of vocational teacher education. The creative concept as the subject ground is instrumental

  18. Autonomous Rule Creation for Intrusion Detection

    Energy Technology Data Exchange (ETDEWEB)

    Todd Vollmer; Jim Alves-Foss; Milos Manic

    2011-04-01

    Many computational intelligence techniques for anomaly based network intrusion detection can be found in literature. Translating a newly discovered intrusion recognition criteria into a distributable rule can be a human intensive effort. This paper explores a multi-modal genetic algorithm solution for autonomous rule creation. This algorithm focuses on the process of creating rules once an intrusion has been identified, rather than the evolution of rules to provide a solution for intrusion detection. The algorithm was demonstrated on anomalous ICMP network packets (input) and Snort rules (output of the algorithm). Output rules were sorted according to a fitness value and any duplicates were removed. The experimental results on ten test cases demonstrated a 100 percent rule alert rate. Out of 33,804 test packets 3 produced false positives. Each test case produced a minimum of three rule variations that could be used as candidates for a production system.
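
    The post-processing step described above (removing duplicates and sorting candidate rules by fitness) is simple to express. A hedged Python sketch with made-up Snort-style rule strings and fitness values (the actual generated rules and fitness function are not reproduced here):

        # Illustrative post-processing of GA-generated candidate rules:
        # keep the highest-fitness copy of each rule, then sort descending.

        candidates = [
            ('alert icmp any any -> any any (msg:"ICMP anomaly"; sid:1000001;)', 0.92),
            ('alert icmp any any -> any any (msg:"ICMP anomaly"; sid:1000001;)', 0.87),
            ('alert icmp any any -> 10.0.0.0/8 any (msg:"ICMP probe"; sid:1000002;)', 0.75),
        ]

        best = {}
        for rule, fitness in candidates:
            if rule not in best or fitness > best[rule]:
                best[rule] = fitness

        for rule, fitness in sorted(best.items(), key=lambda kv: kv[1], reverse=True):
            print(f"{fitness:.2f}  {rule}")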

  19. Potential use of ground-based sensor technologies for weed detection.

    Science.gov (United States)

    Peteinatos, Gerassimos G; Weis, Martin; Andújar, Dionisio; Rueda Ayala, Victor; Gerhards, Roland

    2014-02-01

    Site-specific weed management is the part of precision agriculture (PA) that tries to effectively control weed infestations with the least economic and environmental burden. This can be achieved with the aid of ground-based or near-range sensors in combination with decision rules and precise application technologies. Near-range sensor technologies, developed for mounting on a vehicle, have been emerging for PA applications during the last three decades. These technologies focus on identifying plants and measuring their physiological status with the aid of their spectral and morphological characteristics. Cameras, spectrometers, fluorometers and distance sensors are the most prominent sensors for PA applications. The objective of this article is to describe ground-based sensors that have the potential to be used for weed detection and measurement of weed infestation level. An overview of current sensor systems is presented, describing their concepts, results that have been achieved, commercial systems already in use, and problems that persist. A perspective for the development of these sensors is given. © 2013 Society of Chemical Industry.
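
    As an example of the kind of decision rule such spectral sensors can feed, the following sketch thresholds the NDVI, the standard normalized difference vegetation index (NIR - Red)/(NIR + Red), to flag grid cells for treatment; the reflectance values and the threshold are hypothetical.

        # Hedged sketch: NDVI-thresholded treatment decision per grid cell.
        # Reflectances and the threshold are illustrative values only.

        def ndvi(nir, red):
            return (nir - red) / (nir + red)

        cells = [               # (cell id, NIR reflectance, red reflectance)
            ("A1", 0.45, 0.10),
            ("A2", 0.20, 0.18),
        ]
        NDVI_THRESHOLD = 0.4    # hypothetical decision threshold

        for cell_id, nir, red in cells:
            treat = ndvi(nir, red) > NDVI_THRESHOLD
            print(cell_id, f"NDVI={ndvi(nir, red):.2f}", "treat" if treat else "skip")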

  20. Endogeneously arising network allocation rules

    NARCIS (Netherlands)

    Slikker, M.

    2006-01-01

    In this paper we study endogenously arising network allocation rules. We focus on three allocation rules: the Myerson value, the position value and the component-wise egalitarian solution. For any of these three rules we provide a characterization based on component efficiency and some balanced

  1. Performance based regulation - The maintenance rule

    Energy Technology Data Exchange (ETDEWEB)

    Correia, Richard P. [NRR/DOTS/TQMP, U.S. Nuclear Regulatory Commission, Office of Nuclear Reactor Regulation, M/S OWFN 10A19, Washington, D.C. 20555 (United States)]

    1997-07-01

    The U.S. Nuclear Regulatory Commission has begun a transition from 'process-oriented' to 'results-oriented' regulations. The maintenance rule is a results-oriented rule that mandates consideration of risk and plant performance. The Maintenance Rule allows licensees to devise the most effective and efficient means of achieving the results described in the rule including the use of Probabilistic Risk (or Safety) Assessments. The NRC staff conducted a series of site visits to evaluate implementation of the Rule. Conclusions from the site visits indicated that the results-oriented Maintenance Rule can be successfully implemented and enforced. (author)

  2. Performance based regulation - The maintenance rule

    International Nuclear Information System (INIS)

    Correia, Richard P.

    1997-01-01

    The U.S. Nuclear Regulatory Commission has begun a transition from 'process-oriented' to 'results-oriented' regulations. The maintenance rule is a results-oriented rule that mandates consideration of risk and plant performance. The Maintenance Rule allows licensees to devise the most effective and efficient means of achieving the results described in the rule including the use of Probabilistic Risk (or Safety) Assessments. The NRC staff conducted a series of site visits to evaluate implementation of the Rule. Conclusions from the site visits indicated that the results-oriented Maintenance Rule can be successfully implemented and enforced. (author)

  3. Persistent Rule-Following in the Face of Reversed Reinforcement Contingencies: The Differential Impact of Direct Versus Derived Rules.

    Science.gov (United States)

    Harte, Colin; Barnes-Holmes, Yvonne; Barnes-Holmes, Dermot; McEnteggart, Ciara

    2017-11-01

    Rule-governed behavior and its role in generating insensitivity to direct contingencies of reinforcement have been implicated in human psychological suffering. In addition, the human capacity to engage in derived relational responding has also been used to explain specific human maladaptive behaviors, such as irrational fears. To date, however, very little research has attempted to integrate research on contingency insensitivity and derived relations. The current work sought to fill this gap. Across two experiments, participants received either a direct rule (Direct Rule Condition) or a rule that involved a novel derived relational response (Derived Rule Condition). Provision of a direct rule resulted in more persistent rule-following in the face of competing contingencies, but only when the opportunity to follow the reinforced rule beforehand was relatively protracted. Furthermore, only in the Direct Rule Condition were there significant correlations between rule-compliance and stress. A post hoc interpretation of the findings is provided.

  4. "Chaos Rules" Revisited

    Science.gov (United States)

    Murphy, David

    2011-01-01

    About 20 years ago, while lost in the midst of his PhD research, the author mused over proposed titles for his thesis. He was pretty pleased with himself when he came up with "Chaos Rules" (the implied double meaning was deliberate), or more completely, "Chaos Rules: An Exploration of the Work of Instructional Designers in Distance Education." He…

  5. Idioms-based Business Rule Extraction

    NARCIS (Netherlands)

    R Smit (Rob)

    2011-01-01

    This thesis studies the extraction of embedded business rules, using the idioms of the used framework to identify them. Embedded business rules exist as source code in the software system and knowledge about them may get lost. Extraction of those business rules could make them accessible

  6. Parochial Dissonance: A Grounded Theory of Wisconsin's New North Response to the Employability Skills Gap

    Science.gov (United States)

    Baneck, Timothy M.

    2012-01-01

    The purpose of this study was to generate a theory that explained the beliefs and behaviors of participants from business, not-for-profit business, education, and government sectors when resolving the employability skills gap. Classical grounded theory was the inductive methodology applied to this study. The New North, an 18 county region located…

  7. The impact of category structure and training methodology on learning and generalizing within-category representations.

    Science.gov (United States)

    Ell, Shawn W; Smith, David B; Peralta, Gabriela; Hélie, Sébastien

    2017-08-01

    When interacting with categories, representations focused on within-category relationships are often learned, but the conditions promoting within-category representations and their generalizability are unclear. We report the results of three experiments investigating the impact of category structure and training methodology on the learning and generalization of within-category representations (i.e., correlational structure). Participants were trained on either rule-based or information-integration structures using classification (Is the stimulus a member of Category A or Category B?), concept (e.g., Is the stimulus a member of Category A, Yes or No?), or inference (infer the missing component of the stimulus from a given category) and then tested on either an inference task (Experiments 1 and 2) or a classification task (Experiment 3). For the information-integration structure, within-category representations were consistently learned, could be generalized to novel stimuli, and could be generalized to support inference at test. For the rule-based structure, extended inference training resulted in generalization to novel stimuli (Experiment 2) and inference training resulted in generalization to classification (Experiment 3). These data help to clarify the conditions under which within-category representations can be learned. Moreover, these results make an important contribution in highlighting the impact of category structure and training methodology on the generalization of categorical knowledge.

  8. Flaw evaluation methodology for class 2, 3 components in light water reactors

    International Nuclear Information System (INIS)

    Miura, Naoki; Kashima, Koichi; Miyazaki, Katsumasa; Hasegawa, Kunio; Oritani, Naohiko

    2006-01-01

    It is quite important to validate the structural integrity of operating plant components as aged LWR plants are gradually increasing in Japan. The rules on fitness-for-service for nuclear power plants constituted by the JSME provide flaw evaluation methodology. They are mainly focused on Class 1 components, while flaw evaluation criteria for Class 2, 3 components are not consolidated. Such criteria are also required from the viewpoints of in-service inspection requests, reduction of operating costs and systematization of a consistent code/standard. In this study, the basic concept of flaw evaluation for Class 2, 3 piping was considered, and it is concluded that the same evaluation procedure as for Class 1 piping in the current rules is applicable. Some technical issues on practical flaw evaluation for Class 2, 3 piping were listed, and a countermeasure for each issue was devised. In particular, both allowable flaw sizes in acceptance standards and critical flaw sizes in acceptance criteria have to be determined in consideration of degraded fracture toughness. (author)

  9. Current plans to characterize the design basis ground motion at the Yucca Mountain, Nevada Site

    International Nuclear Information System (INIS)

    Simecka, W.B.; Grant, T.A.; Voegele, M.D.; Cline, K.M.

    1992-01-01

    A site at Yucca Mountain Nevada is currently being studied to assess its suitability as a potential host site for the nation's first commercial high level waste repository. The DOE has proposed a new methodology for determining design-basis ground motions that uses both deterministic and probabilistic methods. The role of the deterministic approach is primary. It provides the level of detail needed by design engineers in the characterization of ground motions. The probabilistic approach provides a logical structured procedure for integrating the range of possible earthquakes that contribute to the ground motion hazard at the site. In addition, probabilistic methods will be used as needed to provide input for the assessment of long-term repository performance. This paper discusses the local tectonic environment, potential seismic sources and their associated displacements and ground motions. It also discusses the approach to assessing the design basis earthquake for the surface and underground facilities, as well as selected examples of the use of this type of information in design activities

  10. 18 CFR 39.10 - Changes to an Electric Reliability Organization Rule or Regional Entity Rule.

    Science.gov (United States)

    2010-04-01

    ... RULES CONCERNING CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.10 Changes to an Electric... Reliability Organization Rule or Regional Entity Rule. 39.10 Section 39.10 Conservation of Power and Water...

  11. Incremental Learning of Context Free Grammars by Parsing-Based Rule Generation and Rule Set Search

    Science.gov (United States)

    Nakamura, Katsuhiko; Hoshina, Akemi

    This paper discusses recent improvements and extensions in the Synapse system for inductive inference of context free grammars (CFGs) from sample strings. Synapse uses incremental learning, rule generation based on bottom-up parsing, and search over rule sets. The form of production rules in the previous system is extended from Revised Chomsky Normal Form A→βγ to Extended Chomsky Normal Form, which also includes A→B, where each of β and γ is either a terminal or nonterminal symbol. From the result of bottom-up parsing, a rule generation mechanism synthesizes the minimum production rules required for parsing positive samples. Instead of the inductive CYK algorithm in the previous version of Synapse, the improved version uses a novel rule generation method, called "bridging," which bridges the missing part of the derivation tree for the positive string. The improved version also employs a novel search strategy, called serial search, in addition to minimum rule set search. The synthesis of grammars by the serial search is faster than the minimum set search in most cases. On the other hand, the size of the generated CFGs is generally larger than that of the minimum set search, and the system can find no appropriate grammar for some CFLs by the serial search. The paper shows experimental results of incremental learning of several fundamental CFGs and compares the methods of rule generation and search strategies.
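
    To make the grammar form concrete, here is a hedged Python sketch of bottom-up CYK-style recognition for a grammar written with the kinds of rules described above (A→a, A→BC, and unit rules A→B); the toy grammar, for the language a^n b^n, is invented for illustration and is not Synapse's rule set.

        # Hedged sketch: CYK-style recognition with unit-rule closure.

        def unit_closure(symbols, unit_rules):
            """Add heads of unit rules A->B whenever B is already derivable."""
            changed = True
            while changed:
                changed = False
                for head, body in unit_rules:
                    if body in symbols and head not in symbols:
                        symbols.add(head)
                        changed = True
            return symbols

        def recognize(tokens, terminal_rules, binary_rules, unit_rules, start="S"):
            n = len(tokens)
            table = [[set() for _ in range(n + 1)] for _ in range(n)]
            for i, tok in enumerate(tokens):          # length-1 spans: A -> a
                cell = {head for head, term in terminal_rules if term == tok}
                table[i][1] = unit_closure(cell, unit_rules)
            for span in range(2, n + 1):              # longer spans: A -> B C
                for i in range(n - span + 1):
                    cell = set()
                    for k in range(1, span):
                        for head, (b, c) in binary_rules:
                            if b in table[i][k] and c in table[i + k][span - k]:
                                cell.add(head)
                    table[i][span] = unit_closure(cell, unit_rules)
            return start in table[0][n]

        # Toy grammar for a^n b^n (n >= 1), using one unit rule S -> P:
        terminal_rules = [("A", "a"), ("B", "b")]
        binary_rules = [("P", ("A", "B")), ("P", ("A", "X")), ("X", ("P", "B"))]
        unit_rules = [("S", "P")]
        print(recognize(list("aabb"), terminal_rules, binary_rules, unit_rules))  # True
        print(recognize(list("aab"), terminal_rules, binary_rules, unit_rules))   # False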

  12. Entrepreneurial Orientation of Community College Workforce Divisions and the Impact of Organizational Structure: A Grounded Theory Study

    Science.gov (United States)

    Schiefen, Kathleen M.

    2010-01-01

    This research focused on how organizational structure of community colleges influenced the entrepreneurial orientation of deans, directors, vice presidents, and vice chancellors of workforce units. Using grounded theory methodology, the researcher identified three emergent theories applicable to both integrated and separate workforce units. These…

  13. Rules Extraction with an Immune Algorithm

    Directory of Open Access Journals (Sweden)

    Deqin Yan

    2007-12-01

    Full Text Available In this paper, a method for extracting rules from information systems with immune algorithms is proposed. The design of the immune algorithm is based on a sharing mechanism for extracting rules. The principle of sharing and competing for resources in the sharing mechanism is consistent with the relationship of sharing and rivalry among rules. In order to extract rules efficiently, a new concept of flexible confidence and rule measurement is introduced. Experiments demonstrate that the proposed method is effective.

  14. Speciation below ground: Tempo and mode of diversification in a radiation of endogean ground beetles.

    Science.gov (United States)

    Andújar, Carmelo; Pérez-González, Sergio; Arribas, Paula; Zaballos, Juan P; Vogler, Alfried P; Ribera, Ignacio

    2017-11-01

    Dispersal is a critical factor determining the spatial scale of speciation, which is constrained by the ecological characteristics and distribution of a species' habitat and the intrinsic traits of species. Endogean taxa are strongly affected by the unique qualities of the below-ground environment and its effect on dispersal, and contrasting reports indicate either high dispersal capabilities favoured by small body size and mediated by passive mechanisms, or low dispersal due to restricted movement and confinement inside the soil. We studied a species-rich endogean ground beetle lineage, Typhlocharina, including three genera and more than 60 species, as a model for the evolutionary biology of dispersal and speciation in the deep soil. A time-calibrated molecular phylogeny generated from >400 individuals was used to delimit candidate species, to study the accumulation of lineages through space and time by species-area-age relationships and to determine the geographical structure of the diversification using the relationship between phylogenetic and geographic distances across the phylogeny. Our results indicated a small spatial scale of speciation in Typhlocharina and low dispersal capacity combined with sporadic long distance, presumably passive dispersal events that fuelled the speciation process. Analysis of lineage growth within Typhlocharina revealed a richness plateau correlated with the range of distribution of lineages, suggesting a long-term species richness equilibrium mediated by density dependence through limits of habitat availability. The interplay of area- and age-dependent processes ruling the lineage diversification in Typhlocharina may serve as a general model for the evolution of high species diversity in endogean mesofauna. © 2017 John Wiley & Sons Ltd.

  15. Proof Rules for Recursive Procedures

    NARCIS (Netherlands)

    Hesselink, Wim H.

    1993-01-01

    Four proof rules for recursive procedures in a Pascal-like language are presented. The main rule deals with total correctness and is based on results of Gries and Martin. The rule is easier to apply than Martin's. It is introduced as an extension of a specification format for Pascal-procedures, with
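
    For orientation, one standard total-correctness rule for a parameterless recursive procedure (in the Gries style the abstract refers to, though not necessarily the paper's exact formulation) uses an integer variant t bounded below: assuming the specification for recursive calls at smaller variant values, one proves it for the body. With z a fresh logical variable and P implying t >= 0:

        \[
        \frac{\{\,P \wedge t < z\,\}\ \mathbf{call}\ p\ \{\,Q\,\}
              \;\vdash\;
              \{\,P \wedge t = z\,\}\ \mathit{body}(p)\ \{\,Q\,\}}
             {\{\,P\,\}\ \mathbf{call}\ p\ \{\,Q\,\}}
        \]

    The premise is a derivability claim: under the inductive hypothesis for calls with strictly smaller variant, the body meets the specification; the conclusion discharges the hypothesis.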

  16. Rules Versus Discretion in Monetary Policy

    OpenAIRE

    Stanley Fischer

    1988-01-01

    This paper examines the case for rules rather than discretion in the conduct of monetary policy, from both historical and analytic perspectives. The paper starts with the rules of the game under the gold standard. These rules were ill-defined and not adhered to; active discretionary policy was pursued to defend the gold standard -- but the gold standard came closer to a regime of rules than the current system. The arguments for rules in general developed by Milton Friedman are described mo ap...

  17. Rule Making and Rule Breaking: Game Development and the Governance of Emergent Behaviour

    Directory of Open Access Journals (Sweden)

    Jennifer R. Whitson

    2010-07-01

    Full Text Available Discussions of ‘control’ in games often center on players and their myriad attempts to push back upon the systems that seek to constrain them. The fact that players resist the constraints imposed upon them is not surprising, nor is it surprising that counterplay and control are such rich topics for game studies academics. In this article, I argue that players are invited by games to bend the rules. It is in the very nature of play to find the movement between the rules, and for many players the ‘fun’ in play is the inherent challenge of attempting to master, defeat, or remake games’ formal structures. These rationalities of play preclude blind obedience to the rules and have distinct implications for how games are governed. While there have been numerous studies of players who bend or break the rules (Consalvo, 2007; Foo and Koivisto, 2004; Dibbell, 1998; Kolko and Reid, 1998; Williams, 2006; Mnookin, 1997 and players who alter and re-make the rules in their role of co-producers (Sotamaa, 2009; Kücklich, 2005; Humphreys, 2005; Taylor, 2006b, there is little research on game development companies and their attempts to harness these rationalities of play and uphold the rules beyond the reflexive writings of game designers themselves (Curtis, 1992; Morningstar and Farmer, 1991; Koster, 2002.

  18. Gaming the system. Dodging the rules, ruling the dodgers.

    Science.gov (United States)

    Morreim, E H

    1991-03-01

    Although traditional obligations of fidelity require physicians to deliver quality care to their patients, including to utilize costly technologies, physicians are steadily losing their accustomed control over the necessary resources. The "economic agents" who own the medical and monetary resources of care now impose a wide array of rules and restrictions in order to contain their costs of operation. However, physicians can still control resources indirectly through "gaming the system," employing tactics such as "fudging" that exploit resource rules' ambiguity and flexibility to bypass the rules while ostensibly honoring them. Physicians may be especially inclined to game the system where resource rules seriously underserve patients' needs, where economic agents seem to be "gaming the patient," with needless obstacles to care, or where others, such as hospitals or even physicians themselves, may be denied needed reimbursements. Though tempting, gaming is morally and medically hazardous. It can harm patients and society, offend honesty, and violate basic principles of contractual and distributive justice. It is also, in fact, usually unnecessary in securing needed resources for patients. More fundamentally, we must reconsider what physicians owe their patients. They owe what is theirs to give: their competence, care and loyalty. In light of medicine's changing economics, two new duties emerge: economic advising, whereby physicians explicitly discuss the economic as well as medical aspects of each treatment option; and economic advocacy, whereby physicians intercede actively on their patients' behalf with the economic agents who control the resources.

  19. The res judicata rule in jurisdictional decisions of the international Court of justice

    Directory of Open Access Journals (Sweden)

    Kreća Milenko

    2014-01-01

    Full Text Available The author discusses the effects of the res judicata rule as regards jurisdictional decisions of the International Court of Justice. He finds that there exists a special position of a judgment on preliminary objection in respect to both aspects of the res judicata rule - its binding force and finality. A perception of distinct relativity of a jurisdictional decision of the Court, expressing its interlocutory character, pervades, in his opinion, the body of law regulating the Court's activity. Preliminary objections as such do not exhaust objections to the jurisdiction of the Court, as evidenced by non-preliminary objections to the jurisdiction of the Court giving rise to the application of the principle compétence de la compétence understood in the narrow sense. With regard to the binding force of a judgment on preliminary objections, it does not create legal obligations stricto sensu. The author finds that the relative character of jurisdictional decisions of the Court as compared with a judgment on the merits is justified on a number of grounds.

  20. Ground cross-modal impedance as a tool for analyzing ground/plate interaction and ground wave propagation.

    Science.gov (United States)

    Grau, L; Laulagnet, B

    2015-05-01

    An analytical approach is investigated to model ground-plate interaction based on modal decomposition and the two-dimensional Fourier transform. A finite rectangular plate subjected to flexural vibration is coupled with the ground and modeled under the Kirchhoff hypothesis. A Navier equation represents the stratified ground, assumed infinite in the x- and y-directions and free at the top surface. To obtain an analytical solution, modal decomposition is applied to the structure and a Fourier transform is applied to the ground. The result is a new tool for analyzing ground-plate interaction: the ground cross-modal impedance. It quantifies the added stiffness, added mass, and added damping contributed by the ground to the structure. The similarity with the parallel acoustic problem is highlighted. A comparison between theory and experiment shows good agreement. Finally, specific cases are investigated, notably the influence of layer depth on plate vibration.
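
    As background for the Kirchhoff hypothesis mentioned above, the flexural vibration of a thin plate is governed by the classical Kirchhoff plate equation; in the sketch below the ground reaction enters as a generic coupling term, which is the quantity the cross-modal impedance characterizes mode by mode. The notation is standard, but the coupling term is an illustrative placeholder, not the paper's exact formulation.

    ```latex
    % Kirchhoff thin-plate equation with a schematic ground-reaction term:
    \[
    D\,\nabla^4 w(x,y,t) + \rho h\,\frac{\partial^2 w}{\partial t^2}
      = f(x,y,t) - \sigma_g(x,y,t),
    \qquad
    D = \frac{E h^3}{12(1-\nu^2)},
    \]
    % w: transverse deflection; D: flexural rigidity; \rho h: mass per unit
    % area; f: external forcing; \sigma_g: reaction stress from the ground.
    ```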

  1. Participants' views of telephone interviews within a grounded theory study.

    Science.gov (United States)

    Ward, Kim; Gott, Merryn; Hoare, Karen

    2015-12-01

    To offer a unique contribution to the evolving debate around the use of the telephone during semistructured interview by drawing on interviewees' reflections on telephone interview during a grounded theory study. The accepted norm for qualitative interviews is to conduct them face-to-face. It is typical to consider collecting qualitative data via telephone only when face-to-face interview is not possible. During a grounded theory study, exploring users' experiences with overnight mask ventilation for sleep apnoea, the authors selected the telephone to conduct interviews. This article reports participants' views on semistructured interview by telephone. An inductive thematic analysis was conducted on data pertaining to the use of the telephone interview in a grounded theory study. The data were collected during 4 months of 2011 and 6 months in 2014. The article presents an inductive thematic analysis of sixteen participants' opinions about telephone interviewing and discusses these in relation to existing literature reporting the use of telephone interviews in grounded theory studies. Overall, participants reported a positive experience of telephone interviewing. From participants' reports we identified four themes: being 'phone savvy; concentrating on voice instead of your face; easy rapport; and not being judged or feeling inhibited. By drawing on these data, we argue that the telephone as a data collection tool in grounded theory research and other qualitative methodologies need not be relegated to second best status. Rather, researchers can consider telephone interview a valuable first choice option. © 2015 John Wiley & Sons Ltd.

  2. SPECIAL RULES OF MITIGATION OF PUNISHMENT IN CASE OF THE CONCLUSION OF THE PRE-TRIAL COOPERATION AGREEMENT, AT THE SPECIAL PROCEDURE FOR THE TRIAL AND AT THE SHORTENED ORDER OF INQUIRY

    Directory of Open Access Journals (Sweden)

    Tatiana Nepomnyashchaya

    2017-01-01

    Full Text Available The subject. The article analyzes the rules for the appointment of punishment in the case of a pre-trial cooperation agreement, with a special procedure for the trial and with a shortened procedure of conducting an inquiry, regulated by art. 62 of the RF Criminal Code “Turning out a Sentence when Mitigating Circumstances Exist”. The authors answer two questions: (1) Does the legal nature of these institutions correspond to the legal nature of mitigating circumstances? (2) Is it advisable to consolidate different legal regulations in one article of the law? Methodology. The authors use research methods such as analysis and synthesis, as well as the formal legal and comparative legal methods. Results. The rules for the appointment of punishment in the conclusion of a pre-trial cooperation agreement, stipulated by pt. 2, 4 of art. 62 of the RF Criminal Code, regulate not the order of accounting for mitigating circumstances, but the legal consequences associated with the promotion of a person who concluded and executed a pre-trial cooperation agreement, which does not correspond to the legal nature of pt. 1, 3 of art. 62 of the RF Criminal Code. The legal nature of the rules for the appointment of punishment established in pt. 5 of art. 62 of the RF Criminal Code also does not correspond to the legal nature of the rules for the imposition of punishment in the presence of mitigating circumstances, because mitigation of punishment occurs on criminal procedural grounds, which are not mitigating circumstances. Conclusions. In the authors' opinion, fastening in art. 62 of the RF Criminal Code three independent rules for the imposition of punishment, namely, the rules for the imposition of punishment in the presence of mitigating circumstances (pt. 1, 3 of art. 62 of the Criminal Code), at the conclusion of a pre-trial cooperation agreement (pt. 2, 4 of art. 62 of the Criminal Code), with a special order of the trial and a shortened procedure for conducting an inquiry

  3. Application of realistic (best- estimate) methodologies for large break loss of coolant (LOCA) safety analysis: licensing of Westinghouse ASTRUM evaluation model in Spain

    International Nuclear Information System (INIS)

    Lage, Carlos; Frepoli, Cesare

    2010-01-01

    When the LOCA Final Acceptance Criteria for Light Water Reactors was issued in Appendix K of 10CFR50, both the USNRC and the industry recognized that the rule was highly conservative. At that time, however, the degree of conservatism in the analysis could not be quantified. As a result, the USNRC began a research program to identify the degree of conservatism in those models permitted in the Appendix K rule and to develop improved thermal-hydraulic computer codes so that realistic accident analysis calculations could be performed. The overall results of this research program quantified the conservatism in the Appendix K rule and confirmed that some relaxation of the rule can be made without a loss in safety to the public. Also, from a risk-informed perspective, it is recognized that conservatism is not always a complete defense for lack of sophistication in models. In 1988, as a result of the improved understanding of LOCA phenomena, the USNRC staff amended the requirements of 10 CFR 50.46 and Appendix K, 'ECCS Evaluation Models', so that a realistic evaluation model may be used to analyze the performance of the ECCS during a hypothetical LOCA. Under the amended rules, best-estimate plus uncertainty (BEPU) thermal-hydraulic analysis may be used in place of the overly prescriptive set of models mandated by the Appendix K rule. Further guidance for the use of best-estimate codes was provided in Regulatory Guide 1.157. To demonstrate use of the revised ECCS rule, the USNRC and its consultants developed a method called the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology as an approach for defining and qualifying a best-estimate thermal-hydraulic code and quantifying the uncertainties in a LOCA analysis. More recently, the CSAU principles have been generalized in the Evaluation Model Development and Assessment Process (EMDAP) of Regulatory Guide 1.203. ASTRUM is the Westinghouse Best Estimate Large Break LOCA evaluation model applicable to two-, three
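
    Best-estimate plus uncertainty evaluations of this kind commonly bound the output of interest with non-parametric order statistics (Wilks' formula) instead of propagating full distributions analytically. The sketch below computes the minimum number of code runs for a one-sided 95/95 tolerance statement; it illustrates the general BEPU sampling idea, not the licensed ASTRUM procedure itself.

    ```python
    def wilks_min_runs(beta=0.95, gamma=0.95):
        """Smallest n such that the maximum of n runs bounds the
        beta-quantile of the output with confidence gamma
        (first-order, one-sided Wilks criterion: 1 - beta**n >= gamma)."""
        n = 1
        while 1.0 - beta ** n < gamma:
            n += 1
        return n

    print(wilks_min_runs())  # -> 59 runs for a one-sided 95/95 statement
    ```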

  4. The Neuroscience of Storing and Molding Tool Action Concepts: how plastic is grounded cognition?

    Directory of Open Access Journals (Sweden)

    J.C. Mizelle

    2010-11-01

    Full Text Available Choosing how to use tools to accomplish a task is a natural and seemingly trivial aspect of our lives, yet engages complex neural mechanisms. Recently, work in healthy populations has led to the idea that tool knowledge is grounded to allow for appropriate recall based on some level of personal history. This grounding has presumed neural loci for tool use, centered on parieto-temporo-frontal areas to fuse perception and action representations into one dynamic system. A challenge for this idea is related to one of its great benefits. For such a system to exist, it must be very plastic, to allow for the introduction of novel tools or concepts of tool use and modification of existing ones. Thus, learning new tool usage (familiar tools in new situations and new tools in familiar situations) must involve mapping into this grounded network while maintaining existing rules for tool usage. This plasticity may present a challenging breadth of encoding that needs to be optimally stored and accessed. The aim of this work is to explore the challenges of plasticity related to changing or incorporating representations of tool action within the theory of grounded cognition and propose a modular model of tool-object goal related accomplishment. While considering the neuroscience evidence for this approach, we will focus on the requisite plasticity for this system. Further, we will highlight challenges for flexibility and organization of already grounded tool actions and provide thoughts on future research to better evaluate mechanisms of encoding in the theory of grounded cognition.

  5. Game-theoretic modeling of curtailment rules and network investments with distributed generation

    International Nuclear Information System (INIS)

    Andoni, Merlinda; Robu, Valentin; Früh, Wolf-Gerrit; Flynn, David

    2017-01-01

    Highlights: •Comparative study on curtailment rules and their effects on RES profitability. •Proposal of novel fair curtailment rule which minimises generators’ disruption. •Modeling of private network upgrade as leader-follower (Stackelberg) game. •New model incorporating stochastic generation and variable demand. •New methodology for setting transmission charges in private network upgrade. -- Abstract: Renewable energy has achieved high penetration rates in many areas, leading to curtailment, especially if existing network infrastructure is insufficient and energy generated cannot be exported. In this context, Distribution Network Operators (DNOs) face a significant knowledge gap about how to implement curtailment rules that achieve desired operational objectives, but at the same time minimise disruption and economic losses for renewable generators. In this work, we study the properties of several curtailment rules widely used in UK renewable energy projects, and their effect on the viability of renewable generation investment. Moreover, we propose a new curtailment rule which guarantees fair allocation of curtailment amongst all generators with minimal disruption. Another key knowledge gap faced by DNOs is how to incentivise private network upgrades, especially in settings where several generators can use the same line against the payment of a transmission fee. In this work, we provide a solution to this problem by using tools from algorithmic game theory. Specifically, this setting can be modelled as a Stackelberg game between the private transmission line investor and local renewable generators, who are required to pay a transmission fee to access the line. We provide a method for computing the equilibrium of this game, using a model that captures the stochastic nature of renewable energy generation and demand. Finally, we use the practical setting of a grid reinforcement project from the UK and a large dataset of wind speed measurements and demand
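
    The leader-follower structure described above can be made concrete with a toy numerical model: the line investor (leader) posts a per-MWh transmission fee, each generator (follower) connects only if its margin covers the fee, and the leader grid-searches the revenue-maximizing fee. All parameter values below are hypothetical, and the paper's stochastic generation and demand model is omitted.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    margins = rng.uniform(5.0, 25.0, size=8)   # hypothetical margin per generator (GBP/MWh)
    exports = rng.uniform(10.0, 50.0, size=8)  # hypothetical energy exported (GWh/yr)

    def follower_response(fee):
        """Each generator connects iff its margin covers the posted fee."""
        return margins > fee

    def leader_revenue(fee):
        users = follower_response(fee)
        return fee * exports[users].sum()

    fees = np.linspace(0.0, 25.0, 2501)        # leader's strategy grid
    best = max(fees, key=leader_revenue)
    print(f"equilibrium fee ~ {best:.2f} GBP/MWh, "
          f"{follower_response(best).sum()} generators connect")
    ```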

  6. Intelligent wear mode identification system for marine diesel engines based on multi-level belief rule base methodology

    Science.gov (United States)

    Yan, Xinping; Xu, Xiaojian; Sheng, Chenxing; Yuan, Chengqing; Li, Zhixiong

    2018-01-01

    Wear faults are among the chief causes of main-engine damage, significantly influencing the secure and economical operation of ships. It is difficult for engineers to utilize multi-source information to identify wear modes, so an intelligent wear mode identification model needs to be developed to assist engineers in diagnosing wear faults in diesel engines. For this purpose, a multi-level belief rule base (BBRB) system is proposed in this paper. The BBRB system consists of two-level belief rule bases, and the 2D and 3D characteristics of wear particles are used as antecedent attributes on each level. Quantitative and qualitative wear information with uncertainties can be processed simultaneously by the BBRB system. In order to enhance the efficiency of the BBRB, the silhouette value is adopted to determine referential points and the fuzzy c-means clustering algorithm is used to transform input wear information into belief degrees. In addition, the initial parameters of the BBRB system are constructed on the basis of expert-domain knowledge and then optimized by the genetic algorithm to ensure the robustness of the system. To verify the validity of the BBRB system, experimental data acquired from real-world diesel engines are analyzed. Five-fold cross-validation is conducted on the experimental data and the BBRB is compared with the other four models in the cross-validation. In addition, a verification dataset containing different wear particles is used to highlight the effectiveness of the BBRB system in wear mode identification. The verification results demonstrate that the proposed BBRB is effective and efficient for wear mode identification with better performance and stability than competing systems.
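
    A standard step in belief-rule-base systems is transforming a numeric input into belief degrees over ordered referential points, by linear interpolation between the two nearest points. The sketch below shows that transformation with hypothetical referential values; in the paper the points are chosen using silhouette values and the inputs are converted with fuzzy c-means clustering, which this sketch does not reproduce.

    ```python
    def belief_degrees(x, ref_points):
        """Distribute input x over ordered referential points: if
        A[i] <= x <= A[i+1], A[i] receives (A[i+1]-x)/(A[i+1]-A[i])
        and A[i+1] the remainder (standard BRB input transformation)."""
        degs = [0.0] * len(ref_points)
        if x <= ref_points[0]:
            degs[0] = 1.0
        elif x >= ref_points[-1]:
            degs[-1] = 1.0
        else:
            for i in range(len(ref_points) - 1):
                lo, hi = ref_points[i], ref_points[i + 1]
                if lo <= x <= hi:
                    degs[i] = (hi - x) / (hi - lo)
                    degs[i + 1] = 1.0 - degs[i]
                    break
        return degs

    # Hypothetical wear-particle feature value and referential points:
    print(belief_degrees(3.2, [1.0, 2.5, 4.0, 6.0]))  # ~ [0.0, 0.533, 0.467, 0.0]
    ```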

  7. Competition-strength-dependent ground suppression in figure-ground perception.

    Science.gov (United States)

    Salvagio, Elizabeth; Cacciamani, Laura; Peterson, Mary A

    2012-07-01

    Figure-ground segregation is modeled as inhibitory competition between objects that might be perceived on opposite sides of borders. The winner is the figure; the loser is suppressed, and its location is perceived as shapeless ground. Evidence of ground suppression would support inhibitory competition models and would contribute to explaining why grounds are shapeless near borders shared with figures, yet such evidence is scarce. We manipulated whether competition from potential objects on the ground side of figures was high (i.e., portions of familiar objects were potentially present there) or low (novel objects were potentially present). We predicted that greater competition would produce more ground suppression. The results of two experiments in which suppression was assessed via judgments of the orientation of target bars confirmed this prediction; a third experiment showed that ground suppression is short-lived. Our findings support inhibitory competition models of figure assignment, in particular, and models of visual perception entailing feedback, in general.

  8. 18 CFR 385.2201 - Rules governing off-the-record communications (Rule 2201).

    Science.gov (United States)

    2010-04-01

    ... 603, a neutral (other than an arbitrator) under Rule 604 in an alternative dispute resolution... any person outside the Commission, any off-the-record communication. (c) Definitions. For purposes of... in which an intervenor disputes any material issue, any proceeding initiated pursuant to rule 206 by...

  9. A Spoonful of (Added) Sugar Helps the Constitution Go Down: Curing the Compelled Speech Commercial Speech Doctrine with FDA’s Added Sugars Rule.

    Science.gov (United States)

    Smith, Colleen

    On May 27, 2016, the Food and Drug Administration (FDA) announced that it was adopting a new rule that requires food manufacturers to list—on the already mandated Nutrition Facts label—how many grams of sugar have been added to a food product. Many opponents have criticized this “added sugars” rule on First Amendment grounds, arguing that the rule violates the commercial speech rights of food manufacturers. Whether the rule would survive constitutional scrutiny or not is an open question because the compelled commercial speech doctrine is anything but clear. Courts are split over whether Zauderer’s rational basis test, Central Hudson’s intermediate scrutiny, or some combination of the two should apply to a mandated disclosure like FDA’s added sugars rule. This Paper explains that the added sugars rule is unique in the history of mandated nutrition labeling in that the rule is motivated largely by public health concerns and backed by reports that assert that consumers should limit their intake of added sugars. In contrast, correcting and preventing consumer deception has been a major driving force behind the remainder of FDA’s mandated nutrition labeling. Because of this distinct rationale, the added sugars rule does not fit neatly into any currently existing compelled commercial speech test. This Paper uses the added sugars rule to highlight the deficiencies in the existing tests. Finally, this Paper proposes a new compelled commercial speech test that would adequately balance the interests of all of the affected parties: the government, the public, and food manufacturers.

  10. Regional analysis of ground and above-ground climate

    Science.gov (United States)

    1981-12-01

    The regional suitability of underground construction as a climate control technique is discussed with reference to (1) a bioclimatic analysis of long term weather data for 29 locations in the United States to determine appropriate above ground climate control techniques, (2) a data base of synthesized ground temperatures for the coterminous United States, and (3) monthly dew point ground temperature comparisons for identifying the relative likelihood of condensation from one region to another. It is concluded that the suitability of Earth tempering as a practice and of specific Earth sheltered design stereotypes varies geographically; while the subsurface almost always provides a thermal advantage on its own terms when compared to above ground climatic data, it can, nonetheless, compromise the effectiveness of other, regionally more important climate control techniques. Reviews of above and below ground climate mapping schemes related to human comfort and architectural design, and detailed description of a theoretical model of ground temperature, heat flow, and heat storage in the ground are included. Strategies of passive climate control are presented in a discussion of the building bioclimatic analysis procedure which has been applied in a computer analysis of 30 years of weather data for each of 29 locations in the United States.

  11. Regional analysis of ground and above-ground climate

    Energy Technology Data Exchange (ETDEWEB)

    1981-12-01

    The regional suitability of underground construction as a climate control technique is discussed with reference to (1) a bioclimatic analysis of long-term weather data for 29 locations in the United States to determine appropriate above ground climate control techniques, (2) a data base of synthesized ground temperatures for the coterminous United States, and (3) monthly dew point ground temperature comparisons for identifying the relative likelihood of condensation from one region to another. It is concluded that the suitability of earth tempering as a practice and of specific earth-sheltered design stereotypes varies geographically; while the subsurface almost always provides a thermal advantage on its own terms when compared to above ground climatic data, it can, nonetheless, compromise the effectiveness of other, regionally more important climate control techniques. Also contained in the report are reviews of above and below ground climate mapping schemes related to human comfort and architectural design, and detailed description of a theoretical model of ground temperature, heat flow, and heat storage in the ground. Strategies of passive climate control are presented in a discussion of the building bioclimatic analysis procedure which has been applied in a computer analysis of 30 years of weather data for each of 29 locations in the United States.

  12. THE EXISTENCE OF THE RULE BY LAW CONCEPT (A STATE BASED ON LAW) WITHIN THE THEORY OF THE RULE OF LAW STATE

    Directory of Open Access Journals (Sweden)

    Made Hendra Wijaya

    2013-11-01

    Full Text Available This research, titled "The Existence of the Rule by Law Concept (a State Based on Law) within the Theory of the Rule of Law State", addresses two problems: first, what are the advantages of the Rule by Law concept within the theory of the Rule of Law; and second, what are its disadvantages. The research uses the normative method, i.e. legal research that examines written laws in their various aspects (theory, history, philosophy, comparison, structure and composition, scope and content, consistency, general overview and chapter by chapter, formality, the binding force of a law, and the legal language used) without examining the aspects of application or implementation, combined with the approaches of historical analysis and legal-conceptual analysis. The research finds that the advantage of the Rule by Law concept lies in the certainty it provides; it can also serve as a form of social control for the community, keeping all citizens in good order in their reciprocal relationships. Its disadvantage is that if the law that legalises state action is not supported by democracy, human rights and the principles of justice, denial of human rights, widespread poverty and racial segregation will follow; and if the law is merely used by the authorities as a means to legalise all forms of action that violate human rights, it can take on the totalitarian nature of the ruler.

  13. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    Science.gov (United States)

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
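
    The control-chart analysis described can be illustrated with an individuals (XmR) chart, whose limits sit at the mean plus or minus 2.66 times the average moving range (the standard constant for two-point moving ranges). The data below are hypothetical; this is a sketch of the chart arithmetic, not a reproduction of the study's analysis.

    ```python
    import numpy as np

    def xmr_limits(values):
        """Individuals (XmR) chart centre line and control limits."""
        x = np.asarray(values, dtype=float)
        avg_mr = np.abs(np.diff(x)).mean()      # average two-point moving range
        centre = x.mean()
        return centre - 2.66 * avg_mr, centre, centre + 2.66 * avg_mr

    # Hypothetical weekly mean ED length-of-stay values (minutes), in time order:
    los = [142, 150, 138, 147, 151, 129, 118, 121, 115, 119]
    lcl, centre, ucl = xmr_limits(los)
    outside = [week for week, v in enumerate(los) if not lcl <= v <= ucl]
    print(f"centre={centre:.1f} min, limits=({lcl:.1f}, {ucl:.1f}), "
          f"points outside limits at weeks {outside}")
    # Shift detection would additionally apply run rules (e.g. several
    # consecutive points on one side of the centre line), which is what
    # preserves the time sequence of the data in a QI analysis.
    ```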

  14. Do Fiscal Rules Matter?

    DEFF Research Database (Denmark)

    Grembi, Veronica; Nannicini, Tommaso; Troiano, Ugo

    2016-01-01

    Fiscal rules are laws aimed at reducing the incentive to accumulate debt, and many countries adopt them to discipline local governments. Yet, their effectiveness is disputed because of commitment and enforcement problems. We study their impact applying a quasi-experimental design in Italy. In 1999, the central government imposed fiscal rules on municipal governments, and in 2001 relaxed them below 5,000 inhabitants. We exploit the before/after and discontinuous policy variation, and show that relaxing fiscal rules increases deficits and lowers taxes. The effect is larger if the mayor can be reelected...

  15. Assimilating to Hierarchical Culture: A Grounded Theory Study on Communication among Clinical Nurses.

    Science.gov (United States)

    Kim, MinYoung; Oh, Seieun

    2016-01-01

    The purpose of this study was to generate a substantive model that accounts for the explanatory social processes of communication in which nurses were engaged in clinical settings in Korea. Grounded theory methodology was used in this study. A total of 15 clinical nurses participated in the in-depth interviews. "Assimilating to the hierarchical culture" emerged as the basic social process of communication in which the participants engaged in their work environments. To adapt to the cultures of their assigned wards, the nurses learned to be silent and engaged in their assimilation into the established hierarchy. The process of assimilation consisted of three phases based on the major goals that nurses worked to achieve: getting to know about unspoken rules, persevering within the culture, and acting as senior nurse. Seven strategies and actions utilized to achieve the major tasks emerged as subcategories, including receiving strong disapproval, learning by observing, going silent, finding out what is acceptable, minimizing distress, taking advantages as senior nurse, and taking responsibilities as senior nurse. The findings identified how the pattern of communication in nursing organizations affected the way in which nurses were assimilated into organizational culture, from individual nurses' perspectives. In order to improve the rigid working atmosphere and culture in nursing organizations and increase members' satisfaction with work and quality of life, managers and staff nurses need training that focuses on effective communication and encouraging peer opinion-sharing within horizontal relationships. Moreover, organization-level support should be provided to create an environment that encourages free expression.

  16. Assimilating to Hierarchical Culture: A Grounded Theory Study on Communication among Clinical Nurses

    Science.gov (United States)

    2016-01-01

    The purpose of this study was to generate a substantive model that accounts for the explanatory social processes of communication in which nurses were engaged in clinical settings in Korea. Grounded theory methodology was used in this study. A total of 15 clinical nurses participated in the in-depth interviews. “Assimilating to the hierarchical culture” emerged as the basic social process of communication in which the participants engaged in their work environments. To adapt to the cultures of their assigned wards, the nurses learned to be silent and engaged in their assimilation into the established hierarchy. The process of assimilation consisted of three phases based on the major goals that nurses worked to achieve: getting to know about unspoken rules, persevering within the culture, and acting as senior nurse. Seven strategies and actions utilized to achieve the major tasks emerged as subcategories, including receiving strong disapproval, learning by observing, going silent, finding out what is acceptable, minimizing distress, taking advantages as senior nurse, and taking responsibilities as senior nurse. The findings identified how the pattern of communication in nursing organizations affected the way in which nurses were assimilated into organizational culture, from individual nurses’ perspectives. In order to improve the rigid working atmosphere and culture in nursing organizations and increase members’ satisfaction with work and quality of life, managers and staff nurses need training that focuses on effective communication and encouraging peer opinion-sharing within horizontal relationships. Moreover, organization-level support should be provided to create an environment that encourages free expression. PMID:27253389

  17. Exploring the use of grounded theory as a methodological approach to examine the 'black box' of network leadership in the national quality forum.

    Science.gov (United States)

    Hoflund, A Bryce

    2013-01-01

    This paper describes how grounded theory was used to investigate the "black box" of network leadership in the creation of the National Quality Forum. Scholars are beginning to recognize the importance of network organizations and are in the embryonic stages of collecting and analyzing data about network leadership processes. Grounded theory, with its focus on deriving theory from empirical data, offers researchers a distinctive way of studying little-known phenomena and is therefore well suited to exploring network leadership processes. Specifically, this paper provides an overview of grounded theory, a discussion of the appropriateness of grounded theory to investigating network phenomena, a description of how the research was conducted, and a discussion of the limitations and lessons learned from using this approach.

  18. Theoretical Grounds of Identification of the Essence of the Enterprise Development Efficiency Category

    Directory of Open Access Journals (Sweden)

    Adzhavenko Maryna M.

    2014-02-01

    Full Text Available Modern economic conditions pose a new problem for researchers: the capability of an enterprise to survive in an unfavourable external environment. This problem is systemic and complex, and its solution lies in the plane of management of capital, personnel, development, efficiency, etc. The article notes that efficiency is a cornerstone of modern economic science, which justifies studying the gnoseological essence of the efficiency category. The main goal of the article is to study the scientific and theoretical grounds of the formation of enterprise development efficiency under modern conditions of changing internal and external environments; further goals are to identify the essence of the development efficiency category and to deepen the theoretical foundation for assessing the efficiency of enterprise development in modern economic science. The article conducts an ontological analysis of the essence and goals of the enterprise development efficiency notion, studies the evolution of scientific approaches, and systematises the theoretical provisions of this category and their assessment in economic science. As a result, the article identifies a new vector of theoretical grounds and a dominant logic for forming the methodology of assessing enterprise efficiency under conditions of the innovation-driven development of the state; namely, it underlines the principles of systemacy, complexity and self-organisation, and the significance of human capital as an important factor in increasing efficiency and development. Developing the methodological grounds for assessing the efficiency of enterprise innovation development is a promising direction for further research.

  19. Study of the factors associated with substance use in adolescence using Association Rules.

    Science.gov (United States)

    García, Elena Gervilla; Blasco, Berta Cajal; López, Rafael Jiménez; Pol, Alfonso Palmer

    2010-01-01

    The aim of this study is to analyse the factors related to the use of addictive substances in adolescence using association rules, descriptive tools included in Data Mining. To this end, we analysed a database on the consumption of addictive substances in adolescence, using the arules package for the free software environment R (version 2.10.0). The sample was made up of 9,300 students between the ages of 14 and 18 (47.1% boys and 52.9% girls) with an average age of 15.6 (SE=1.2). The adolescents answered an anonymous questionnaire on personal, family and environmental risk factors related to substance use. The best rules obtained with regard to substance use relate the consumption of alcohol to perceived parenting style and peer consumption (confidence = 0.8528), the use of tobacco (smoking), cannabis and cocaine to perceived parental action and illegal behaviour (confidence = 0.8032, 0.8718 and 1.0000, respectively), and the use of ecstasy to peer consumption (confidence = 1.0000). In general, the association rules show in a simple manner the relationship between certain patterns of perceived parental action, behaviours that deviate from social behavioural norms, peer consumption and the use of different legal and illegal drugs of abuse in adolescence. The implications of the results obtained are described, together with the usefulness of this new methodology of analysis.
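
    The confidence values quoted above follow the standard definition confidence(A → B) = support(A and B) / support(A). The sketch below computes it on a toy boolean table; the variable names and data are hypothetical, and the study itself used R's arules rather than Python.

    ```python
    import pandas as pd

    # Hypothetical survey table: 1 = condition present for a respondent.
    df = pd.DataFrame({
        "peer_use":             [1, 1, 1, 0, 1, 0, 1, 1],
        "permissive_parenting": [1, 1, 0, 0, 1, 1, 1, 0],
        "alcohol_use":          [1, 1, 0, 0, 1, 0, 0, 1],
    })

    def confidence(df, antecedent, consequent):
        """confidence(A -> B) = support(A and B) / support(A)."""
        a = df[antecedent].all(axis=1)
        ab = a & df[consequent].all(axis=1)
        return ab.sum() / a.sum()

    print(confidence(df, ["peer_use", "permissive_parenting"], ["alcohol_use"]))
    # -> 0.75 on this toy table
    ```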

  20. Bayesian inference for identifying interaction rules in moving animal groups.

    Directory of Open Access Journals (Sweden)

    Richard P Mann

    Full Text Available The emergence of similar collective patterns from different self-propelled particle models of animal groups points to a restricted set of "universal" classes for these patterns. While universality is interesting, it is often the fine details of animal interactions that are of biological importance. Universality thus presents a challenge to inferring such interactions from macroscopic group dynamics since these can be consistent with many underlying interaction models. We present a Bayesian framework for learning animal interaction rules from fine scale recordings of animal movements in swarms. We apply these techniques to the inverse problem of inferring interaction rules from simulation models, showing that parameters can often be inferred from a small number of observations. Our methodology allows us to quantify our confidence in parameter fitting. For example, we show that attraction and alignment terms can be reliably estimated when animals are milling in a torus shape, while interaction radius cannot be reliably measured in such a situation. We assess the importance of rate of data collection and show how to test different models, such as topological and metric neighbourhood models. Taken together our results both inform the design of experiments on animal interactions and suggest how these data should be best analysed.
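
    The inverse problem described, recovering interaction parameters from movement data, can be sketched with a grid posterior over a single alignment-strength parameter in a toy heading-update model. Everything below (the update rule, the known noise level, and the flat prior) is an illustrative assumption rather than the paper's actual framework.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy model: new_heading = (1-a)*own + a*neighbour_mean + Gaussian noise.
    true_a, noise_sd, n_obs = 0.6, 0.2, 200
    own = rng.uniform(-np.pi, np.pi, n_obs)      # individual's prior heading
    nbr = rng.uniform(-np.pi, np.pi, n_obs)      # mean heading of neighbours
    obs = (1 - true_a) * own + true_a * nbr + rng.normal(0, noise_sd, n_obs)

    def log_likelihood(a):
        resid = obs - ((1 - a) * own + a * nbr)
        return -0.5 * np.sum((resid / noise_sd) ** 2)

    # Grid posterior under a flat prior on [0, 1] (angle wrapping ignored).
    grid = np.linspace(0.0, 1.0, 501)
    logp = np.array([log_likelihood(a) for a in grid])
    post = np.exp(logp - logp.max())
    post /= post.sum()

    mean = (grid * post).sum()
    sd = np.sqrt(((grid - mean) ** 2 * post).sum())
    print(f"posterior mean alignment strength = {mean:.3f} +/- {sd:.3f}")
    ```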