WorldWideScience

Sample records for ground rules methodologies

  1. Methodological issues in grounded theory.

    Science.gov (United States)

    Cutcliffe, J R

    2000-06-01

    Examination of the qualitative methodological literature shows that there appear to be conflicting opinions and unresolved issues regarding the nature and process of grounded theory. Researchers proposing to utilize this method would therefore be wise to consider these conflicting opinions. This paper therefore identifies and attempts to address four key issues, namely, sampling, creativity and reflexivity, the use of literature, and precision within grounded theory. The following recommendations are made. When utilizing a grounded method researchers need to consider their research question, clarify what level of theory is likely to be induced from their study, and then decide when they intend to access and introduce the second body of literature. They should acknowledge that in the early stages of data collection, some purposeful sampling appears to occur. In their search for conceptually dense theory, grounded theory researchers may wish to free themselves from the constraints that limit their use of creativity and tacit knowledge. Furthermore, the interests of researchers might be served by attention to issues of precision including, avoiding method slurring, ensuring theoretical coding occurs, and using predominantly one method of grounded theory while explaining and describing any deviation away from this chosen method. Such mindfulness and the resulting methodological rigour is likely to increase the overall quality of the inquiry and enhance the credibility of the findings.

  2. Grounded theory methodology--narrativity revisited.

    Science.gov (United States)

    Ruppel, Paul Sebastian; Mey, Günter

    2015-06-01

    This article aims to illuminate the role of narrativity in Grounded Theory Methodology and to explore an approach within Grounded Theory Methodology that is sensitized towards aspects of narrativity. The suggested approach takes into account narrativity as an aspect of the underlying data. It reflects how narrativity could be conceptually integrated and systematically used for shaping the way in which coding, category development and the presentation of results in a Grounded Theory Methodology study proceed.

  3. Grounded Theory Methodology: Positivism, Hermeneutics, and Pragmatism

    Science.gov (United States)

    Age, Lars-Johan

    2011-01-01

    Glaserian grounded theory methodology, which has been widely adopted as a scientific methodology in recent decades, has been variously characterised as "hermeneutic" and "positivist." This commentary therefore takes a different approach to characterising grounded theory by undertaking a comprehensive analysis of: (a) the philosophical paradigms of…

  4. Grounded theory: methodology and philosophical perspective.

    Science.gov (United States)

    Ghezeljeh, Tahereh Najafi; Emami, Azita

    2009-01-01

    Constructivist grounded theory reshapes the interactive relationship between researcher and participants and provides the reader with a sense of the analytical views through which the researcher examines the data. This paper presents an overview of grounded theory and constructivist grounded theory, exploring the ontological, epistemological and methodological aspects using examples from nursing research.

  5. Grounded Theory as a General Research Methodology

    Directory of Open Access Journals (Sweden)

    Judith A. Holton, Ph.D.

    2008-06-01

    Since its inception over forty years ago, grounded theory has achieved canonical status in the research world (Locke, 2001, p. 1). Qualitative researchers, in particular, have embraced grounded theory, although often without sufficient scholarship in the methodology (Partington, 2000, p. 93; 2002, p. 136). This embrace renders many researchers unable to perceive grounded theory as a general methodology and an alternative to the dominant qualitative and quantitative research paradigms. The result is methodological confusion and an often unconscious remodelling of the original methodology (Glaser, 2003). Given the various interpretations and approaches that have been popularised under the rubric of grounded theory, this paper addresses the important distinction between grounded theory as a general methodology and its popularisation as a qualitative research method. The paper begins with a brief overview of grounded theory's origins and its philosophical foundations, then addresses the basic distinction between abstract conceptualisation as employed in classic grounded theory and the conceptual description approach adopted by many qualitative researchers. It continues with a brief overview of the criteria for judging the quality of classic grounded theory and concludes by detailing its methodological principles.

  6. Grounded theory as feminist research methodology.

    Science.gov (United States)

    Keddy, B; Sims, S L; Stern, P N

    1996-03-01

    Feminist research is evolving, and with it new methods of doing science. In this feminist post-positivist era, grounded theory, while less inclusive and descriptive than ethnography, allows for complex analysis of complex questions. While Glaser & Strauss (the originators of this methodology) have written about grounded theory in an esoteric way, others have written extensively about this method in a much clearer and less rigid fashion. In this paper we discuss how grounded theory could be used in a creative and constantly evolving manner for feminist research.

  7. Grounded theory in nursing research: Part 1--Methodology.

    Science.gov (United States)

    McCann, Terence V; Clark, Eileen

    2003-01-01

    The epistemological underpinnings of grounded theory make it valuable in the study of nursing, which is premised on an interpersonal process between nurses and clients. Further, it is a useful style of research when there is little prior information about a topic. In this article (Part 1), Terence McCann and Eileen Clark outline the key features of this methodology. In the follow-up article (Part 2, McCann and Clark 2003a), a critique is provided of grounded theory and the two main approaches to this methodology. In the final article in the series (Part 3, McCann and Clark 2003b), the authors illustrate how grounded theory can be applied to nursing research with examples from McCann's Australian study (McCann and Baker 2001) of how community mental health nurses promote wellness with clients who are experiencing an early episode of psychotic illness.

  8. Fuzzy-Rule-Based Object Identification Methodology for NAVI System

    Directory of Open Access Journals (Sweden)

    Rosalyn R. Porle

    2005-08-01

    We present an object identification methodology applied in a navigation assistance for visually impaired (NAVI) system. The NAVI comprises a single board processing system (SBPS), a headgear-mounted digital video camera, and a pair of stereo earphones. The image captured by the camera is processed by the SBPS to generate a specially structured stereo sound that helps visually impaired people understand the presence of objects or obstacles in front of them. The image processing stage is designed to identify the objects in the captured image. Edge detection and edge-linking procedures are applied in processing the image. A concept of object preference is included in the image processing scheme, and this concept is realized using a fuzzy rule base. Blind users are trained with the stereo sound produced by NAVI to achieve collision-free autonomous navigation.
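    The object-preference idea described above can be illustrated with a toy fuzzy rule base. The features (normalized object size and centrality in the image), membership functions, and rule weights below are hypothetical stand-ins, not the actual NAVI rule base:

```python
# Minimal fuzzy-rule sketch: score an object's "preference" (importance to
# the user) from two hypothetical features, normalized size and centrality.
# Membership functions, rules, and weights are illustrative only.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def memberships(x):
    return {
        "low":  tri(x, -0.5, 0.0, 0.5),
        "med":  tri(x, 0.0, 0.5, 1.0),
        "high": tri(x, 0.5, 1.0, 1.5),
    }

# Rule base: (size term, centrality term) -> preference score in [0, 1].
RULES = {
    ("high", "high"): 1.0, ("high", "med"): 0.8, ("med", "high"): 0.8,
    ("med", "med"): 0.5,  ("high", "low"): 0.4, ("low", "high"): 0.4,
    ("med", "low"): 0.2,  ("low", "med"): 0.2,  ("low", "low"): 0.0,
}

def object_preference(size, centrality):
    """Mamdani-style inference with weighted-average defuzzification."""
    ms, mc = memberships(size), memberships(centrality)
    num = den = 0.0
    for (s_term, c_term), score in RULES.items():
        w = min(ms[s_term], mc[c_term])   # rule firing strength (AND = min)
        num += w * score
        den += w
    return num / den if den else 0.0

print(object_preference(0.9, 0.9))  # large, central object: high preference
print(object_preference(0.1, 0.1))  # small, peripheral object: low preference
```

A real system would derive the rule set from user trials; the min/weighted-average scheme shown is one common Mamdani-style choice.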

  9. Cosmological Physics Ground Rules and How to Evaluate Cosmologies

    Science.gov (United States)

    Dilworth, D. J.

    2009-12-01

    This paper is a simple reminder for cosmology enthusiasts of the bright line separating the laws of physics from science fiction. It provides some tools: rules, guidelines, and a definition of space useful for examining cosmology claims and concepts. It explains the stringent thresholds an idea must meet before it can accurately be called a scientific theory or hypothesis, and who bears the burden of proof for a theory. These simple tools provide solid ground so you may more easily examine cosmology claims and decide for yourself on which side of the science/science fiction line a specific claim belongs.

  10. Evolving Grounded Theory Methodology: towards a discursive approach.

    Science.gov (United States)

    McCreaddie, May; Payne, Sheila

    2010-06-01

    Grounded Theory Methodology (GTM) is a widely cited research approach based upon symbolic interactionism, with a focus on interaction, action and processes. Relatively recently, Discursive Psychology, a language-based interaction research approach also rooted in symbolic interactionism, emerged; at present it is cited principally in the social sciences literature. Given Discursive Psychology's symbolic interaction foundations, what relevance does this approach have for evolving GTM? A number of methodological challenges were posed by a study looking at humour in Clinical Nurse Specialist-patient interactions. This paper uses the phenomenon of spontaneous humour in healthcare interactions to illustrate the potential for a new form of GTM drawing on discursive approaches: Discursive GTM. First, the challenges presented by the study of spontaneous humour in Clinical Nurse Specialist-patient interactions are presented. Second, the research approach adopted to meet these challenges, Discursive GTM (DGTM), is explicated and the results of the study are outlined. Third, the different GTM approaches and Discursive Psychology are compared and contrasted in relation to the DGTM approach adopted. Finally, the challenges and tensions of using DGTM, as well as the opportunities afforded by the use of naturally occurring data, are reviewed. The authors contend that a DGTM approach may be appropriate in analyzing certain phenomena. In particular, we highlight the potential contribution of naturally occurring data as an adjunct to researcher-elicited data. Thus, when exploring particular phenomena, a DGTM approach may address the potentially under-developed symbolic interaction tenet of language.

  11. Integrating Software in the Teaching of Grounded Theory Methodology

    Directory of Open Access Journals (Sweden)

    Agnes Mühlmeyer-Mentzel

    2011-09-01

    The implementation of our hands-on seminar is based on an understanding of grounded theory methodology (GTM) as a craft that can, to a great extent, be taught. A successful learning process requires knowing, understanding and practising the procedural steps of this craft. It is also important to open up spaces for the development of reflexive and analytical competences. Orienting the teaching-learning process toward a research project deepens the understanding of GTM and at the same time provides scope for practice and reflection. It is important for us to retain the student-centred nature of the teaching-learning process to enable active and praxis-oriented student engagement instead of focusing on the transmission of factual knowledge. The structural fit that exists between GTM and ATLAS.ti allows students to experience the software as a support in the analysis of their own data. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1103171

  12. A Decision Making Methodology in Support of the Business Rules Lifecycle

    Science.gov (United States)

    Wild, Christopher; Rosca, Daniela

    1998-01-01

    The business rules that underlie an enterprise emerge as a new category of system requirements that represent decisions about how to run the business, and which are characterized by their business-orientation and their propensity for change. In this report, we introduce a decision making methodology which addresses several aspects of the business rules lifecycle: acquisition, deployment and evolution. We describe a meta-model for representing business rules in terms of an enterprise model, and also a decision support submodel for reasoning about and deriving the rules. The possibility for lifecycle automated assistance is demonstrated in terms of the automatic extraction of business rules from the decision structure. A system based on the metamodel has been implemented, including the extraction algorithm. This is the final report for Daniela Rosca's PhD fellowship. It describes the work we have done over the past year, current research and the list of publications associated with her thesis topic.
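    As a rough illustration of extracting rules from a decision structure, the sketch below walks a tree of resolved decisions and emits one IF-THEN business rule per node. The data model and rule format are hypothetical, not the meta-model of the report:

```python
# Illustrative sketch (all names hypothetical): business rules captured as
# the outcome of recorded decisions. Each decision node stores the issue,
# the alternatives considered, and the chosen alternative; "extraction"
# walks the structure and emits an IF-THEN rule per resolved decision.

from dataclasses import dataclass, field

@dataclass
class Decision:
    issue: str                      # condition the decision addresses
    alternatives: list
    chosen: str                     # the resolved alternative
    rationale: str = ""
    children: list = field(default_factory=list)

def extract_rules(node):
    """Depth-first extraction of IF-THEN rules from a decision structure."""
    rules = [f"IF {node.issue} THEN {node.chosen}"]
    for child in node.children:
        rules.extend(extract_rules(child))
    return rules

root = Decision(
    issue="order total exceeds credit limit",
    alternatives=["reject order", "require manager approval"],
    chosen="require manager approval",
    rationale="preserve customer relationship",
    children=[Decision(
        issue="manager approval not given within 24h",
        alternatives=["reject order", "escalate"],
        chosen="reject order",
    )],
)

for rule in extract_rules(root):
    print(rule)
```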

  13. Ruling the commons. Introducing a new methodology for the analysis of historical commons

    NARCIS (Netherlands)

    De Moor, M.; Lana Berasaín, José Miguel; Laborda Peman, M.; van Weeren, R.; Winchester, Angus

    2016-01-01

    Despite significant progress in recent years, the evolution of commons over the long run remains an under-explored area. During the last years an international team of historians has worked under the umbrella of the Common Rules Project in order to design and test a new methodology aimed at advancing our knowledge of the dynamics of institutions for collective action, in particular commons.

  14. Ground tilt monitoring at Phlegraean Fields (Italy): a methodological approach

    Directory of Open Access Journals (Sweden)

    C. Del Gaudio

    2003-06-01

    Among the geodetic methods used for monitoring ground deformation in volcanic areas, tiltmetry is the most rapid technique, and it is therefore used by almost all volcanological observatories in the world. The deformation of a volcanic edifice results not only from endogenous causes (i.e. dyke injection or magma rising) but also from non-tectonic environmental factors. Such disturbances cannot be removed completely, but they can be reduced. This article outlines the main sources of error affecting the signals recorded by the Phlegraean tilt network, such as the dependence of the tilt response on temperature and the thermoelastic effect on ground deformation. The analytical procedure used to evaluate such errors and to reduce them is explained. An application to data acquired from the tilt network during two distinct phases of ground uplift and subsidence at the Phlegraean Fields is reported.
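    One of the error-reduction steps mentioned, removing the temperature dependence of the tilt response, can be sketched as a simple least-squares correction. The data values and the purely linear thermal model below are illustrative assumptions; real corrections are instrument-specific:

```python
# Sketch: remove the linear dependence of a tilt signal on temperature by
# ordinary least squares (stdlib only). Synthetic data; in this example the
# signal is purely thermal, so the corrected residual should vanish.

def linear_fit(x, y):
    """Slope and intercept of y = a*x + b by least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Synthetic record: tilt driven entirely by a temperature response.
temperature = [10.0, 12.0, 15.0, 18.0, 20.0, 17.0, 13.0]
tilt = [0.5 * t + 1.0 for t in temperature]

a, b = linear_fit(temperature, tilt)
corrected = [y - (a * t + b) for t, y in zip(temperature, tilt)]
print(round(a, 3), round(b, 3))        # recovered thermal response
print(max(abs(c) for c in corrected))  # residual after correction
```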

  15. A usage-centered evaluation methodology for unmanned ground vehicles

    NARCIS (Netherlands)

    Diggelen, J. van; Looije, R.; Mioch, T.; Neerincx, M.A.; Smets, N.J.J.M.

    2012-01-01

    This paper presents a usage-centered evaluation method to assess the capabilities of a particular Unmanned Ground Vehicle (UGV) for establishing the operational goals. The method includes a test battery consisting of basic tasks (e.g., slalom, funnel driving, object detection). Tests can be of diffe…

  17. Business analysis methodology in telecommunication industry - the research based on the grounded theory

    National Research Council Canada - National Science Library

    Hana Nenickova

    2013-01-01

      The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems...

  18. Independent Orbiter Assessment (IOA): FMEA/CIL instructions and ground rules

    Science.gov (United States)

    Traves, S. T.

    1986-01-01

    The McDonnell Douglas Astronautics Company was selected to conduct an independent assessment of the Orbiter Failure Mode and Effects Analysis/Critical Items List (FMEA/CIL). Part of this effort involved an examination of the FMEA/CIL preparation instructions and ground rules. Assessment objectives were to identify omissions and ambiguities in the ground rules that may impede the identification of shuttle orbiter safety and mission critical items, and to ensure that ground rules allow these items to receive proper management visibility for risk assessment. Assessment objectives were followed during the performance of the assessment without being influenced by external considerations such as effects on budget, schedule, and documentation growth. Assessment personnel were employed who had a strong reliability background but no previous space shuttle FMEA/CIL experience to ensure an independent assessment would be achieved. The following observations were made: (1) not all essential items are in the CIL for management visibility; (2) ground rules omit FMEA/CIL coverage of items that perform critical functions; (3) essential items excluded from the CIL do not receive design justification; and (4) FMEAs/CILs are not updated in a timely manner. In addition to the above issues, a number of other issues were identified that correct FMEA/CIL preparation instruction omissions and clarify ambiguities. The assessment was successful in that many of the issues have significant safety implications.

  19. Are There Two Methods of Grounded Theory? Demystifying the Methodological Debate

    Directory of Open Access Journals (Sweden)

    Cheri Ann Hernandez, RN, Ph.D., CDE

    2008-06-01

    Grounded theory is an inductive research method for the generation of substantive or formal theory, using qualitative or quantitative data generated from research interviews, observation, or written sources, or some combination thereof (Glaser & Strauss, 1967). In recent years there has been much controversy over the etiology of its discovery, as well as the exact way in which grounded theory research is to be operationalized. Unfortunately, this situation has resulted in much confusion, particularly among novice researchers who wish to utilize this research method. In this article, the historical, methodological and philosophical roots of grounded theory are delineated in a beginning effort to demystify this methodological debate. Grounded theory variants such as feminist grounded theory (Wuest, 1995) or constructivist grounded theory (Charmaz, 1990) are beyond the scope of this discussion.

  20. Adopting a Grounded Theory Approach to Cultural-Historical Research: Conflicting Methodologies or Complementary Methods?

    Directory of Open Access Journals (Sweden)

    Jayson Seaman PhD

    2008-03-01

    Grounded theory has long been regarded as a valuable way to conduct social and educational research. However, recent constructivist and postmodern insights are challenging long-standing assumptions, most notably by suggesting that grounded theory can be flexibly integrated with existing theories. This move hinges on repositioning grounded theory from a methodology with positivist underpinnings to an approach that can be used within different theoretical frameworks. In this article the author reviews this recent transformation of grounded theory, engages in the project of repositioning it as an approach by using cultural historical activity theory as a test case, and outlines several practical methods implied by the joint use of grounded theory as an approach and activity theory as a methodology. One implication is the adoption of a dialectic, as opposed to a constructivist or objectivist, stance toward grounded theory inquiry, a stance that helps move past the problem of emergence versus forcing.

  1. CIOs' views of HIPAA Security Rule implementation--an application of Q-methodology.

    Science.gov (United States)

    Ao, Mei; Walker, Rosemary

    2005-01-01

    The purpose of this study is to uncover the attitudes held by chief information officers (CIOs) regarding the implementation of HIPAA's Security Rule. In March and April of 2004, five Chicago-area CIOs were surveyed and asked to rank 26 opinion statements that presented possible implementation barriers to the Security Rule. Q-methodology, a powerful tool for the study of subjectivity, was employed to identify and categorize the viewpoints of CIOs toward the barriers. Two factors (opinion types) representing two different views of the implementation barriers, socially motivated CIOs and resources-motivated CIOs, were extracted. The study sheds light on the attitudes and perceptions of CIOs as they begin rule implementation. Current CIOs can use this information to begin to examine what the prevailing attitude may be at their institution and, therefore, how to begin building a successful implementation strategy.
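    The first computational step of a Q-methodology analysis, correlating participants' Q-sorts before extracting factors, can be sketched as follows. The sorts and participant labels are synthetic, not the study's data:

```python
# Sketch of the first step in Q-methodology: computing Pearson correlations
# between participants' Q-sorts (rank orderings of opinion statements).
# A real study would then factor-analyze this correlation matrix to find
# opinion types. Data below are synthetic.

from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Three synthetic Q-sorts over six statements (rank positions 1..6).
sorts = {
    "CIO_A": [3, 1, 2, 6, 5, 4],
    "CIO_B": [3, 2, 1, 6, 4, 5],   # similar view to CIO_A
    "CIO_C": [4, 6, 5, 1, 2, 3],   # opposite view to CIO_A
}

for p in ("CIO_B", "CIO_C"):
    print(p, round(pearson(sorts["CIO_A"], sorts[p]), 2))
```

Highly correlated sorts load on the same factor (opinion type); strongly negative correlations indicate opposing viewpoints.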

  2. Software Development and Test Methodology for a Distributed Ground System

    Science.gov (United States)

    Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)

    2002-01-01

    The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The software processes that have evolved still emphasize requirements capture, software configuration management, design documentation, and ensuring that the products developed remain accountable to the initial requirements. This paper gives an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we mention the COTS tools that have been integrated into the processes and how they have provided value to the project.

  3. Rule-based Expert Systems for Selecting Information Systems Development Methodologies

    Directory of Open Access Journals (Sweden)

    Abdel Nasser H. Zaied

    2013-08-01

    Information systems (IS) are increasingly regarded as crucial to an organization's success. Information Systems Development Methodologies (ISDMs) are used by organizations to structure the information system development process. ISDMs are essential for structuring project participants' thinking and actions, and therefore play an important role in achieving successful projects. There are many different ISDMs, and no methodology can claim to be applicable to every organization. The problem facing decision makers is how to select an appropriate development methodology that may increase the probability of system success. This paper takes this issue into account when studying ISDMs and provides a rule-based expert system as a tool for selecting appropriate ISDMs. The proposed expert system consists of three main phases that automate the process of selecting ISDMs. Three approaches were used to test the proposed expert system: face validation through six professors and six IS professionals, predictive validation through twenty-four experts, and blind validation through nine employees working in the IT field. The results show that the proposed system ran without errors, offered a friendly user interface, and produced suggestions matching user expectations in 95.8% of cases. It can help project managers, systems engineers, systems developers, consultants, and planners in the process of selecting a suitable ISDM. Finally, the results show that the proposed rule-based expert system can facilitate the selection process, especially for new users and non-specialists in the information systems field.
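    A rule-based ISDM selector can be sketched as a set of condition-action rules matched against project facts. The rules, fact names, and methodology labels below are illustrative, not those of the cited expert system:

```python
# Hedged sketch of a rule-based selector for an information systems
# development methodology (ISDM). All rules and names are illustrative.

def select_isdm(facts):
    """Return candidate ISDMs whose rule conditions all hold for `facts`."""
    rules = [
        ({"requirements_stable": True,  "safety_critical": True},  "Waterfall/V-model"),
        ({"requirements_stable": False, "small_team": True},       "Agile (Scrum/XP)"),
        ({"requirements_stable": False, "user_involvement": True}, "Prototyping"),
        ({"large_project": True,        "risk_driven": True},      "Spiral"),
    ]
    return [isdm for cond, isdm in rules
            if all(facts.get(k) == v for k, v in cond.items())]

project = {"requirements_stable": False, "small_team": True,
           "user_involvement": True}
print(select_isdm(project))  # -> ['Agile (Scrum/XP)', 'Prototyping']
```

A full expert system would add certainty factors and an explanation facility; the forward-matching step shown is the core inference.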

  4. Construction of a Conceptualization of Personal Knowledge within a Knowledge Management Perspective Using Grounded Theory Methodology

    Science.gov (United States)

    Straw, Eric M.

    2013-01-01

    The current research used grounded theory methodology (GTM) to construct a conceptualization of personal knowledge within a knowledge management (KM) perspective. The need for the current research was based on the use of just two categories of knowledge, explicit and tacit, within KM literature to explain diverse characteristics of personal…

  6. Scalability of a Methodology for Generating Technical Trading Rules with GAPs Based on Risk-Return Adjustment and Incremental Training

    Science.gov (United States)

    de La Cal, E. A.; Fernández, E. M.; Quiroga, R.; Villar, J. R.; Sedano, J.

    In previous works a methodology was defined based on the design of a genetic algorithm GAP and an incremental training technique adapted to learning series of stock market values. The GAP technique consists of a fusion of GP and GA. The GAP algorithm implements an automatic search for crisp trading rules, taking as training objectives both the optimization of the return obtained and the minimization of the assumed risk. Applying the proposed methodology, rules have been obtained for an eight-year period of the S&P500 index. The achieved adjustment of the return-risk relation has generated rules whose returns in the testing period are far superior to those obtained with standard methodologies, and even clearly superior to Buy&Hold. This work shows that the proposed methodology is valid for different assets in a market different from that of previous work.
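    The two training objectives, return and risk, can be illustrated by evaluating a simple crisp trading rule. The moving-average rule, its parameters, and the price series below are illustrative stand-ins for the GAP-evolved rules:

```python
# Sketch of the two objectives named above: the return of a crisp trading
# rule and a risk proxy (maximum drawdown of its equity curve). The rule
# and data are toy examples, not the paper's evolved rules.

def sma(prices, n, t):
    """Simple moving average of the n prices ending at index t."""
    return sum(prices[t - n + 1 : t + 1]) / n

def evaluate_rule(prices, fast=2, slow=4):
    """Long when fast SMA > slow SMA; report total return and max drawdown."""
    equity, peak, max_dd = 1.0, 1.0, 0.0
    for t in range(slow, len(prices)):
        if sma(prices, fast, t - 1) > sma(prices, slow, t - 1):  # in market
            equity *= prices[t] / prices[t - 1]
        peak = max(peak, equity)
        max_dd = max(max_dd, (peak - equity) / peak)
    return equity - 1.0, max_dd

prices = [100, 101, 103, 102, 104, 107, 106, 108, 105, 109]
ret, risk = evaluate_rule(prices)
print(round(ret, 4), round(risk, 4))
```

A GA/GP search would evolve the rule structure and parameters, scoring each candidate by a fitness that combines these two quantities.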

  7. Ruling the Commons. Introducing a new methodology for the analysis of historical commons

    Directory of Open Access Journals (Sweden)

    Tine de Moor

    2016-10-01

    Despite significant progress in recent years, the evolution of commons over the long run remains an under-explored area within commons studies. During the last years an international team of historians has worked under the umbrella of the Common Rules Project in order to design and test a new methodology aimed at advancing our knowledge of the dynamics of institutions for collective action, in particular commons. This project aims to contribute to the current debate on commons on three different fronts. Theoretically, it explicitly draws our attention to issues of change and adaptation in the commons, contrasting with more static analyses. Empirically, it highlights the value of historical records as a rich source of information for longitudinal analysis of the functioning of commons. Methodologically, it develops a systematic way of analyzing and comparing commons’ regulations across regions and time, setting a number of variables that have been defined on the basis of the “most common denominators” in commons regulation across countries and time periods. In this paper we introduce the project, describe our sources and methodology, and present the preliminary results of our analysis.

  8. X-ray-absorption sum rules in jj-coupled operators and ground-state moments of actinide ions

    NARCIS (Netherlands)

    van der Laan, G; Thole, BT

    1996-01-01

    Sum rules for magnetic x-ray dichroism, relating the signals of the spin-orbit split core level absorption edges to the ground-state spin and orbital operators, are expressed in jj-coupled operators. These sum rules can be used in the region of intermediate coupling by taking into account the cross…
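    For orientation, the orbital-moment sum rule to which such jj-coupled formulations relate can be written schematically as follows; this is the standard XMCD form, not the paper's jj-coupled expression, and the proportionality constant depends on the edge and on convention:

```latex
% Schematic orbital sum rule: the dichroic signal integrated over both
% spin-orbit-split edges (j_+ and j_-), normalized by the total absorption,
% is proportional to the ground-state <L_z> per hole count n_h.
\langle L_z \rangle \;\propto\; n_h \,
\frac{\int_{j_+ + j_-} \left(\mu^{+} - \mu^{-}\right)\, \mathrm{d}\omega}
     {\int_{j_+ + j_-} \left(\mu^{+} + \mu^{-} + \mu^{0}\right)\, \mathrm{d}\omega}
```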

  9. A Rule-Based Local Search Algorithm for General Shift Design Problems in Airport Ground Handling

    DEFF Research Database (Denmark)

    Clausen, Tommy

    We consider a generalized version of the shift design problem where shifts are created to cover a multiskilled demand and fit the parameters of the workforce. We present a collection of constraints and objectives for the generalized shift design problem. A local search solution framework with multiple neighborhoods and a loosely coupled rule engine based on simulated annealing is presented. Computational experiments on real-life data from various airport ground handling organizations show the performance and flexibility of the proposed algorithm.
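    A minimal version of such a local search can be sketched with a single neighborhood (adding or removing one employee from a candidate shift) and simulated annealing acceptance. The demand profile, candidate shifts, and cost function are toy assumptions, far simpler than the multi-neighborhood rule engine described above:

```python
# Toy sketch of local search / simulated annealing for shift design:
# choose staff counts per candidate shift so hourly coverage matches demand.

import math, random

DEMAND = [2, 4, 6, 6, 5, 3]                # staff needed per hour
SHIFTS = [(0, 3), (1, 4), (3, 3), (2, 4)]  # (start hour, length)

def cost(counts):
    """Sum of absolute over/under-coverage across hours."""
    cover = [0] * len(DEMAND)
    for (start, length), n in zip(SHIFTS, counts):
        for h in range(start, start + length):
            cover[h] += n
    return sum(abs(c - d) for c, d in zip(cover, DEMAND))

def anneal(steps=5000, temp=5.0, cooling=0.999, seed=1):
    random.seed(seed)
    counts = [0] * len(SHIFTS)
    best = list(counts)
    for _ in range(steps):
        cand = list(counts)
        i = random.randrange(len(SHIFTS))
        cand[i] = max(0, cand[i] + random.choice((-1, 1)))  # neighborhood move
        delta = cost(cand) - cost(counts)
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            counts = cand
            if cost(counts) < cost(best):
                best = list(counts)
        temp *= cooling
    return best, cost(best)

solution, c = anneal()
print(solution, c)
```

With the demand above a perfect cover is impossible (two hours always share the same coverage but differ in demand), so the best achievable cost is 1.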

  10. Electrical and seismic mixing rules for detecting changes in ground ice content in permafrost studies

    Science.gov (United States)

    Hauck, Christian; Hilbich, Christin

    2017-04-01

    Geophysical methods are now widely used in permafrost research to detect and monitor frozen ground and potentially quantify the ground ice content of the soil. Often a combination of different methods is used to reduce the ambiguities inherent in the indirect nature of geophysical surveys. Geophysical mixing rules and petrophysical relationships originally developed by the exploration industry may help to quantitatively relate geophysical variables, such as electrical resistivity or seismic P-wave velocity, to the physical properties of the subsurface. Two of these mixing rules were combined by Hauck et al. (2011) in a so-called 4-phase model to quantify the ground ice, air and water contents and their changes with time in permafrost environments (e.g. Pellet et al. 2016). However, these mixing rules are often either empirically derived (making use of a large number of borehole samples) or based on a simplified mixing model, i.e. an equal weighting of each phase component (ice, water, soil/rock, air) according to the actual fractional content of each phase. There is thus no obvious 'best choice' among the available geophysical models. Stimulated by recent theoretical work by Glover (2010), who analysed the relationships between the empirical and theory-derived mixing models, this contribution analyses the applicability of various mixing models to electrical and seismic data sets in the context of detecting and monitoring permafrost degradation. Input data stem from various geophysical surveys around the world, and ground truth data for validation are available from corresponding permafrost boreholes in the PERMOS and GTN-P databases. Glover, P. W. (2010): A generalized Archie's law for n phases. Geophysics, 75(6), E247-E265. Hauck, C., Böttcher, M. and Maurer, H. (2011): A new model for estimating subsurface ice content based on combined electrical and seismic data sets. The Cryosphere, 5, 453-468. Pellet C., Hilbich C
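    The generalized Archie's law of Glover (2010) cited above has the form σ = Σᵢ σᵢ φᵢ^mᵢ over the n phases. A sketch for an illustrative four-phase permafrost soil follows; the conductivities, fractions, and exponents are assumptions, not calibrated values:

```python
# Sketch of Glover's (2010) generalized Archie's law for n phases:
# bulk conductivity sigma = sum_i sigma_i * phi_i**m_i, where phi_i are
# volumetric fractions and m_i are phase exponents. The four-phase values
# below (rock matrix, water, ice, air) are illustrative only.

phases = {
    #        conductivity S/m   fraction   exponent
    "rock":  (1e-5,             0.60,      1.0),
    "water": (5e-2,             0.10,      2.0),
    "ice":   (1e-8,             0.25,      2.0),
    "air":   (1e-15,            0.05,      2.0),
}

def bulk_conductivity(phases):
    # Volumetric fractions of all phases must sum to one.
    assert abs(sum(phi for _, phi, _ in phases.values()) - 1.0) < 1e-9
    return sum(sigma * phi ** m for sigma, phi, m in phases.values())

sigma = bulk_conductivity(phases)
print(f"bulk resistivity ~ {1 / sigma:.0f} ohm-m")
```

The example makes the key property visible: the conductive water phase dominates the bulk response even at a small fraction, which is why resistivity is sensitive to unfrozen water content in permafrost.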

  11. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    The objective of this article is to present the use of grounded theory in qualitative research as a basis for building a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology I have drawn on the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Beyond that, this industry is characterized by high flexibility, strong competition, and a compressed corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting business requirements and following the business strategy.

  12. Surface Signature Characterization at SPE through Ground-Proximal Methods: Methodology Change and Technical Justification

    Energy Technology Data Exchange (ETDEWEB)

    Schultz-Fellenz, Emily S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-09-09

    A portion of LANL’s FY15 SPE objectives includes initial ground-based or ground-proximal investigations at the SPE Phase 2 site. The area of interest is the U2ez location in Yucca Flat. This collection serves as a baseline for discrimination of surface features and acquisition of topographic signatures prior to any development or pre-shot activities associated with SPE Phase 2. Our team originally intended to perform our field investigations using previously vetted ground-based (GB) LIDAR methodologies. However, the extended proposed time frame of the GB LIDAR data collection, and associated data processing time and delivery date, were unacceptable. After technical consultation and careful literature research, LANL identified an alternative methodology to achieve our technical objectives and fully support critical model parameterization. Very-low-altitude unmanned aerial systems (UAS) photogrammetry appeared to satisfy our objectives in lieu of GB LIDAR. The SPE Phase 2 baseline collection was used as a test of this UAS photogrammetric methodology.

  13. Qualitative data analysis using the NVivo program and the application of the methodology of grounded theory procedures

    Directory of Open Access Journals (Sweden)

    Niedbalski Jakub

    2012-02-01

    Full Text Available The main aim of the article is to identify the capabilities and constraints of using CAQDAS (Computer-Assisted Qualitative Data Analysis Software) programs in qualitative data analysis. Our considerations are based on personal experience gained while conducting research projects using the methodology of grounded theory (GT) and the NVivo 8 program. In the present article we focused on the relations between the methodological principles of grounded theory and the technical possibilities of NVivo 8. The paper presents our opinion about the most important options available in NVivo 8 and their application in studies based on the methodology of grounded theory.

  14. Initial building investigations at Aberdeen Proving Ground, Maryland: Objectives and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Brubaker, K.L.; Dougherty, J.M.; McGinnis, L.D.

    1994-12-01

    As part of an environmental-contamination source-definition program at Aberdeen Proving Ground, detailed internal and external inspections of 23 potentially contaminated buildings are being conducted to describe and characterize the state of each building as it currently exists and to identify areas potentially contaminated with toxic or other hazardous substances. In addition, a detailed geophysical investigation is being conducted in the vicinity of each target building to locate and identify subsurface structures, associated with former building operations, that are potential sources of contamination. This report describes the objectives of the initial building inspections, including the geophysical investigations, and discusses the methodology that has been developed to achieve these objectives.

  15. Ground validation of DPR precipitation rate over Italy using H-SAF validation methodology

    Science.gov (United States)

    Puca, Silvia; Petracca, Marco; Sebastianelli, Stefano; Vulpiani, Gianfranco

    2017-04-01

    The H-SAF project (Satellite Application Facility on Support to Operational Hydrology and Water Management, funded by EUMETSAT) is aimed at retrieving key hydrological parameters such as precipitation, soil moisture and snow cover. Within the H-SAF consortium, the Product Precipitation Validation Group (PPVG) evaluates the accuracy of instantaneous and accumulated precipitation products with respect to ground radar and rain gauge data, adopting the same methodology (using a Unique Common Code) throughout Europe. The adopted validation methodology can be summarized in the following steps: (1) ground data (radar and rain gauge) quality control; (2) spatial interpolation of rain gauge measurements; (3) up-scaling of radar data to the satellite native grid; (4) temporal comparison of satellite and ground-based precipitation products; and (5) production and evaluation of continuous and multi-categorical statistical scores for long time series and case studies. The statistical scores are evaluated on the satellite product's native grid. With the advent of the GPM era starting in March 2014, new global precipitation products have become available. The validation methodology developed in H-SAF is easily applicable to different precipitation products. In this work, we have validated instantaneous precipitation data estimated from the DPR (Dual-frequency Precipitation Radar) instrument onboard the GPM-CO (Global Precipitation Measurement Core Observatory) satellite. In particular, we have analyzed the near-surface and estimated precipitation fields collected in the 2A-Level product for 3 different scans (NS, MS and HS). The Italian radar mosaic managed by the National Department of Civil Protection, available operationally every 10 minutes, is used as ground reference data. The results obtained highlight the capability of the DPR to properly identify precipitation areas, with higher accuracy in estimating stratiform precipitation (especially for the HS).
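Step (5) of the validation chain above computes multi-categorical scores from a satellite-versus-ground contingency table. A sketch of three standard scores, not the H-SAF Unique Common Code itself, with made-up hit/miss/false-alarm counts:

```python
def categorical_scores(hits, misses, false_alarms):
    """Standard contingency-table scores for precipitation detection.
    Counts are per grid box and time step; values below are invented."""
    pod = hits / (hits + misses)                 # probability of detection
    far = false_alarms / (hits + false_alarms)   # false alarm ratio
    csi = hits / (hits + misses + false_alarms)  # critical success index
    return pod, far, csi

pod, far, csi = categorical_scores(80, 20, 10)
print(pod, far, csi)  # → 0.8 0.111... 0.727...
```

The same counts can be re-binned per precipitation class (e.g., stratiform vs. convective) to see where a product such as the DPR HS scan performs best.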

  16. Advancing Nursing Research in the Visual Era: Reenvisioning the Photovoice Process Across Phenomenological, Grounded Theory, and Critical Theory Methodologies.

    Science.gov (United States)

    Evans-Agnew, Robin A; Boutain, Doris M; Rosemberg, Marie-Anne S

    Photovoice is a powerful research method that employs participant photography for advancing voice, knowledge, and transformative change among groups historically or currently marginalized. Paradoxically, this research method risks exploitation of participant voice because of weak methodology to method congruence. The purposes of this retrospective article are to revisit current interdisciplinary research using photovoice and to suggest how to advance photovoice by improving methodology-method congruence. Novel templates are provided for improving the photovoice process across phenomenological, grounded theory, and critical theory methodologies.

  17. A Coding Scheme Development Methodology Using Grounded Theory For Qualitative Analysis Of Pair Programming

    Directory of Open Access Journals (Sweden)

    Stephan Salinger

    2008-01-01

    Full Text Available A number of quantitative studies of pair programming (the practice of two programmers working together using just one computer) have partially conflicting results. Qualitative studies are needed to explain what is really going on. We support such studies by taking a grounded theory (GT) approach for deriving a coding scheme for the objective conceptual description of specific pair programming sessions independent of a particular research goal. The present article explains why our initial attempts at using GT failed and describes how to avoid these difficulties by a predetermined perspective on the data, concept naming rules, an analysis results metamodel, and pair coding. These practices may be helpful in all GT situations, particularly those involving very rich data such as video data. We illustrate the operation and usefulness of these practices by real examples derived from our coding work and present a few preliminary hypotheses regarding pair programming that have surfaced.

  18. Interrelationships Between Receiver/Relative Operating Characteristics Display, Binomial, Logit, and Bayes' Rule Probability of Detection Methodologies

    Science.gov (United States)

    Generazio, Edward R.

    2014-01-01

    Unknown risks are introduced into failure critical systems when probability of detection (POD) capabilities are accepted without a complete understanding of the statistical method applied and the interpretation of the statistical results. The presence of this risk in the nondestructive evaluation (NDE) community is revealed in common statements about POD. These statements are often interpreted in a variety of ways and therefore, the very existence of the statements identifies the need for a more comprehensive understanding of POD methodologies. Statistical methodologies have data requirements to be met, procedures to be followed, and requirements for validation or demonstration of adequacy of the POD estimates. Risks are further enhanced due to the wide range of statistical methodologies used for determining the POD capability. Receiver/Relative Operating Characteristics (ROC) Display, simple binomial, logistic regression, and Bayes' rule POD methodologies are widely used in determining POD capability. This work focuses on Hit-Miss data to reveal the framework of the interrelationships between Receiver/Relative Operating Characteristics Display, simple binomial, logistic regression, and Bayes' Rule methodologies as they are applied to POD. Knowledge of these interrelationships leads to an intuitive and global understanding of the statistical data, procedural and validation requirements for establishing credible POD estimates.
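For Hit-Miss data, the logistic regression approach named above models POD as a function of (log) flaw size. A sketch of that model form with assumed coefficients rather than values fitted to real inspection data:

```python
import math

def pod_logistic(a, b0, b1):
    """Log-logistic POD model commonly fitted to hit/miss data:
    POD(a) = 1 / (1 + exp(-(b0 + b1*ln a))).
    b0, b1 here are hypothetical, not estimated coefficients."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(a))))

b0, b1 = 4.0, 2.0            # assumed regression coefficients
a50 = math.exp(-b0 / b1)     # flaw size at 50% detection probability
print(a50, pod_logistic(a50, b0, b1))
```

In practice the coefficients come from maximum-likelihood fitting of the hit/miss record, and the validation requirements discussed above (data quantity, model adequacy) determine whether quantities like a50 or a90/95 are credible.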

  19. Identification and sensitivity analysis of a correlated ground rule system (design arc)

    Science.gov (United States)

    Eastman, Eric; Chidambarrao, Dureseti; Rausch, Werner; Topaloglu, Rasit O.; Shao, Dongbing; Ramachandran, Ravikumar; Angyal, Matthew

    2017-04-01

    We demonstrate a tool that can function as an interface between VLSI designers and process-technology engineers throughout the Design-Technology Co-optimization (DTCO) process. This tool uses a Monte Carlo algorithm on the output of lithography simulations to model the frequency of fail mechanisms on wafer. Fail mechanisms are defined according to the process integration flow, by Boolean operations and measurements between original and derived shapes. Another feature of this design rule optimization methodology is the use of a Markov-chain-based algorithm to perform a sensitivity analysis, the output of which may be used by process engineers to target key process-induced variabilities for improvement. This tool is used to analyze multiple Middle-Of-Line fail mechanisms in a 10nm inverter design and to identify the key process assumptions that most strongly affect the yield of the structures. The tool and the underlying algorithm are also shown to be scalable to arbitrarily complex geometries in three dimensions, a characteristic that is becoming more important with the introduction of novel patterning technologies and more complex 3-D on-wafer structures.
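The Monte Carlo step described above can be illustrated by sampling a single process-induced variation and counting rule violations. All numbers here are hypothetical placeholders, not actual 10nm ground-rule values or the paper's simulation outputs:

```python
import random

def mc_fail_rate(n_trials=100_000, nominal_space=20.0, sigma=3.0,
                 fail_below=10.0, seed=1):
    """Monte Carlo estimate of one fail-mechanism frequency: draw a
    process-varied spacing between two derived shapes and count a fail
    (e.g., a short) whenever it falls below a threshold."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n_trials)
                if rng.gauss(nominal_space, sigma) < fail_below)
    return fails / n_trials

print(mc_fail_rate())  # rare-event rate, roughly Phi(-10/3) here
```

A real DTCO flow would draw the spacing distribution from lithography simulation contours rather than a single Gaussian, and repeat this per fail mechanism.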

  20. Back- and fore-grounding ontology: exploring the linkages between critical realism, pragmatism, and methodologies in health & rehabilitation sciences.

    Science.gov (United States)

    DeForge, Ryan; Shaw, Jay

    2012-03-01

    Back- and fore-grounding ontology: exploring the linkages between critical realism, pragmatism, and methodologies in health & rehabilitation sciences As two doctoral candidates in a health and rehabilitation sciences program, we describe in this paper our respective paradigmatic locations along a quite nonlinear ontological-epistemological-axiological-methodological chain. In a turn-taking fashion, we unpack the tenets of critical realism and pragmatism, and then trace the linkages from these paradigmatic locations through to the methodological choices that address a community-based research problem. Beyond serving as an answer to calls for academics in training to demonstrate philosophical-theoretical-methodological integrity and coherence in their scholarship, this paper represents critical realism and its fore-grounding of a deeply stratified ontology in reflexive relation to pragmatism and its back-grounding of ontology. We conclude by considering the merits and challenges of conducting research from within singular versus proliferate paradigmatic perspectives.

  1. [Nursing care systematization according to the nurses' view: a methodological approach based on grounded theory].

    Science.gov (United States)

    de Medeiros, Ana Lúcia; dos Santos, Sérgio Ribeiro; de Cabral, Rômulo Wanderley Lima

    2012-09-01

    This study was aimed at understanding, from the nurses' perspective, the experience of going through the Systematization of Nursing Care (SNC) in an obstetric service unit. We used grounded theory as the theoretical and methodological framework. The subjects of this study consisted of thirteen nurses from a public hospital in the city of João Pessoa, in the state of Paraíba. The data analysis resulted in the following phenomenon: "perceiving SNC as a working method that organizes, directs and improves the quality of care by bringing visibility and providing security for the nursing staff." The nurses expressed the extent of their knowledge about the SNC as experienced in obstetrics and considered the nursing process a decision-making process that guides the reasoning of nurses in the planning of nursing care in obstetrics. It was concluded that nurses perceive the SNC as an instrument of theoretical-practical articulation leading to personalized assistance.

  2. Exploring student nurse anesthetist stressors and coping using grounded theory methodology.

    Science.gov (United States)

    Phillips, Joy Kieffer

    2010-12-01

    The purpose of this qualitative study was to examine, from their perspective, the challenges that recent graduates of nurse anesthesia programs coped with during their anesthesia curriculum. The initial research questions for this study were: From the graduates' perspective, what were the stressors that they encountered during their nurse anesthesia program? And how did they successfully negotiate those stressors in order to graduate from their program? This phenomenon was studied using grounded theory methodology. The data were collected by individual, semistructured, in-depth interviews with 12 recent nurse anesthesia program graduates, from 5 different nurse anesthesia programs, who had been out of school for less than 2 years. This exploration into student nurse anesthetist stress and coping articulates 3 phases of development as these students progressed through their program: transitioning in (first 9 months of the program), finding their way (9 to 18 months into the program), and transitioning out (18 to 28 months into the program). Coping mechanisms employed by the participants were problem focused, emotion focused, and a combination of the two. Recommendations for action and future research are discussed.

  3. Medicare program; replacement of reasonable charge methodology by fee schedules for parenteral and enteral nutrients, equipment, and supplies. Final rule.

    Science.gov (United States)

    2001-08-28

    This final rule implements fee schedules for payment of parenteral and enteral nutrition (PEN) items and services furnished under the prosthetic device benefit, defined in section 1861(s)(8) of the Social Security Act. The authority for establishing these fee schedules is provided by the Balanced Budget Act of 1997, which amended the Social Security Act at section 1842(s). Section 1842(s) of the Social Security Act specifies that statewide or other area wide fee schedules may be implemented for the following items and services still subject to the reasonable charge payment methodology: medical supplies; home dialysis supplies and equipment; therapeutic shoes; parenteral and enteral nutrients, equipment, and supplies; electromyogram devices; salivation devices; blood products; and transfusion medicine. This final rule describes changes made to the proposed fee schedule payment methodology for these items and services and provides that the fee schedules for PEN items and services are effective for all covered items and services furnished on or after January 1, 2002. Fee schedules will not be implemented for electromyogram devices and salivation devices at this time since these items are not covered by Medicare. In addition, fee schedules will not be implemented for medical supplies, home dialysis supplies and equipment, therapeutic shoes, blood products, and transfusion medicine at this time since the data required to establish these fee schedules are inadequate.

  4. Aiding eco-labelling process and its implementation: Environmental Impact Assessment Methodology to define Product Category Rules for canned anchovies.

    Science.gov (United States)

    Laso, Jara; Margallo, María; Fullana, Pere; Bala, Alba; Gazulla, Cristina; Irabien, Ángel; Aldaco, Rubén

    2017-01-01

    To be able to fulfil high market expectations for a number of practical applications, Environmental Product Declarations (EPDs) have to meet and comply with specific and strict methodological prerequisites. These expectations include the possibility to add up Life Cycle Assessment (LCA)-based information in the supply chain and to compare different EPDs. To achieve this goal, common and harmonized calculation rules have to be established, the so-called Product Category Rules (PCRs), which set the overall LCA calculation rules used to create EPDs. This document provides PCRs for the assessment of the environmental performance of canned anchovies in the Cantabria Region based on an Environmental Sustainability Assessment (ESA) method. This method uses two main variables: the natural resources sustainability (NRS) and the environmental burdens sustainability (EBS). To reduce the complexity of ESA and facilitate the decision-making process, all variables are normalized and weighted to obtain two global dimensionless indexes: resource consumption (X1) and environmental burdens (X2).
    •This paper sets the PCRs adapted to Cantabrian canned anchovies.
    •The ESA method facilitates product comparison and the decision-making process.
    •This paper establishes all the steps that an EPD should include within the PCRs of Cantabrian canned anchovies.
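The normalization-and-weighting step described above can be sketched as follows; the variables, reference values, and weights are invented for illustration and are not the paper's actual ESA coefficients:

```python
def dimensionless_index(values, refs, weights):
    """Normalize each ESA variable by a reference value and combine the
    normalized terms with weights into one dimensionless index
    (e.g., X1 for resource consumption). All inputs are hypothetical."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(w * v / r for v, r, w in zip(values, refs, weights))

# Hypothetical resource-consumption variables: energy, materials, water.
x1 = dimensionless_index(values=[120.0, 30.0, 500.0],
                         refs=[200.0, 60.0, 1000.0],
                         weights=[0.5, 0.25, 0.25])
print(x1)  # → 0.55
```

Because X1 and X2 are dimensionless, two EPDs built under the same PCRs can be compared directly on these two numbers.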

  5. A new methodology for monitoring wood fluxes in rivers using a ground camera: Potential and limits

    Science.gov (United States)

    Benacchio, Véronique; Piégay, Hervé; Buffin-Bélanger, Thomas; Vaudor, Lise

    2017-02-01

    Ground imagery, which produces large amounts of valuable data at high frequencies, is increasingly used by fluvial geomorphologists to survey and understand processes. While such technology provides immense quantities of information, it can be challenging to analyze and requires automation and the associated development of new methodologies. This paper presents a new approach to automating image analysis to monitor wood delivery from the upstream Rhône River (France). The Génissiat dam is used as an observation window; all pieces of wood coming from the catchment are trapped here, so a wood raft accumulates over time. In 2011, we installed an Axis 211W camera to acquire oblique images of the reservoir every 10 min with the goal of automatically detecting the wood raft area, in order to transform it into wood weight (t) and flux (t/d). The methodology we developed is based on random forest classification to detect the wood raft surface over time, which provided a good classification rate of 97.2%. Based on 14 mechanical wood extractions conducted during the survey period, each including the weight of the wood removed, we established a relationship between wood weight and the wood raft surface area observed just before extraction (R2 = 0.93). We found that using such techniques to continuously monitor wood flux is difficult because the raft undergoes very significant changes in density through time, with very high interday and intraday variability. Misclassifications caused by changes in weather conditions can be mitigated, as can errors from variation in pixel resolution (owing to camera position or window size), but a set of effects on raft density and mobility must still be explored (e.g., dam operation effects, wind on the reservoir surface). At this stage, only the peak flow contribution to wood delivery can be well calculated, but determining an accurate, continuous series of wood flux is not possible. Several recommendations are
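The weight-area relationship reported above (R2 = 0.93) is a simple regression between classified raft area and extracted wood weight. A sketch of an ordinary least-squares fit, using made-up area/weight pairs rather than the study's 14 extraction records:

```python
def fit_line(xs, ys):
    """Ordinary least squares y = a*x + b, as one might relate classified
    raft surface area to extracted wood weight. Data below is invented."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    a = sxy / sxx
    return a, my - a * mx

areas   = [100.0, 250.0, 400.0, 800.0]   # hypothetical raft areas (m^2)
weights = [12.0, 31.0, 47.0, 98.0]       # hypothetical extracted weight (t)
a, b = fit_line(areas, weights)
print(a, b)
```

Once fitted, each classified image yields an instantaneous weight estimate, and differencing successive estimates gives the (density-sensitive) flux series the abstract discusses.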

  6. A study to derive a clinical decision rule for triage of emergency department patients with chest pain: design and methodology

    Directory of Open Access Journals (Sweden)

    Jaffe Allan

    2008-02-01

    Full Text Available Abstract Background Chest pain is the second most common chief complaint in North American emergency departments. Data from the U.S. suggest that 2.1% of patients with acute myocardial infarction and 2.3% of patients with unstable angina are misdiagnosed, with slightly higher rates reported in a recent Canadian study (4.6% and 6.4%, respectively). Information obtained from the history, 12-lead ECG, and a single set of cardiac enzymes is unable to identify patients who are safe for early discharge with sufficient sensitivity. The 2007 ACC/AHA guidelines for UA/NSTEMI do not identify patients at low risk for adverse cardiac events who can be safely discharged without provocative testing. As a result, large numbers of low-risk patients are triaged to chest pain observation units and undergo provocative testing, at significant cost to the healthcare system. Clinical decision rules use clinical findings (history, physical exam, test results) to suggest a diagnostic or therapeutic course of action. Currently no methodologically robust clinical decision rule identifies patients safe for early discharge. Methods/design The goal of this study is to derive a clinical decision rule which will allow emergency physicians to accurately identify patients with chest pain who are safe for early discharge. The study will utilize a prospective cohort design. Standardized clinical variables will be collected on all patients at least 25 years of age complaining of chest pain prior to provocative testing. Variables strongly associated with the composite outcome (acute myocardial infarction, revascularization, or death) will be further analyzed with multivariable analysis to derive the clinical rule. Specific aims are to: (i) apply standardized clinical assessments to patients with chest pain, incorporating results of early cardiac testing; (ii) determine the inter-observer reliability of the clinical information; (iii) determine the statistical association between the clinical

  7. Preliminary methodology investigation of mask pattern fidelity for 250-nm design rules

    Science.gov (United States)

    Coleman, Thomas P.; Sauer, Charles A.; Naber, Robert J.; Hamaker, Henry Chris

    1995-07-01

    Techniques have been developed that can quickly and accurately measure corner rounding and contact fill as key indicators of pattern fidelity. Using these techniques, we have examined writing variables for their effect on the lithographic quality of 1.0 micrometer contacts. A small contact is perhaps the most demanding figure to achieve, so the results shown can be considered the worst case for 4X reticle manufacturing at 250 nm design rules. A MEBES 4500 was used as the writing tool, with PBS resist on quartz masks. Standard printing methods, single-phase printing (SPP) and multiphase printing (2X MPP), were examined. Results indicate that excellent corner rounding can be achieved with small address sizes, regardless of the writing strategy or the dose used. As expected, larger spot sizes increase the amount of corner rounding, regardless of the address. As the pattern address is increased, judicious choices of spot size reduce potential pattern fidelity loss when imaging small contacts and other fine features. Multiphase printing is a technique that offers advantages to the user. Its use of offset scan voting (OSV) is a significant factor in reducing placement errors. MPP (2X) has the additional advantage of providing higher dosages. This provides flexibility in resist choices and in the selection of a process window. With 2X MPP, the user has a wide range of addresses and spot sizes that will give excellent results. The dynamic range of operating conditions possible with 2X MPP when writing 1.0 micrometer contacts is a reduced subset of those available using SPP, due to the 2X writing grid (output address). Implementation of 2X MPP has been limited on previous MEBES models due to the increased write times of multipass writing. The MEBES 4500 data path supports 2X MPP with write times that approximate SPP. The practical operating envelope of both writing strategies is detailed in this paper. Overall, the MEBES 4500 has a large dynamic operating range.

  8. Even free radicals should follow some rules: a guide to free radical research terminology and methodology.

    Science.gov (United States)

    Forman, Henry Jay; Augusto, Ohara; Brigelius-Flohe, Regina; Dennery, Phyllis A; Kalyanaraman, Balaraman; Ischiropoulos, Harry; Mann, Giovanni E; Radi, Rafael; Roberts, L Jackson; Vina, Jose; Davies, Kelvin J A

    2015-01-01

    Free radicals and oxidants are now implicated in physiological responses and in several diseases. Given the wide range of expertise of free radical researchers, application of the greater understanding of chemistry has not been uniformly applied to biological studies. We suggest that some widely used methodologies and terminologies hamper progress and need to be addressed. We make the case for abandonment and judicious use of several methods and terms and suggest practical and viable alternatives. These changes are suggested in four areas: use of fluorescent dyes to identify and quantify reactive species, methods for measurement of lipid peroxidation in complex biological systems, claims of antioxidants as radical scavengers, and use of the terms for reactive species.

  9. Selection of Grounded Theory as an Appropriate Research Methodology for a Dissertation: One Student’s Perspective

    Directory of Open Access Journals (Sweden)

    James W. Jones, Ed.D.

    2009-06-01

    Full Text Available Doctoral students wanting to use grounded theory as a methodological approach for their dissertation often face multiple challenges gaining acceptance of their approach by their committee. This paper presents the case that the author used to overcome these challenges through the process of eliminating other methodologies, leaving grounded theory as the preferred method for the desired research issue. Through examining the approach used successfully by the author, other doctoral students will be able to frame similar arguments justifying the use of grounded theory in their dissertations and seeing the use of the method continue to spread into new fields and applications. This paper examines the case built for selecting grounded theory as a defensible dissertation approach. The basic research issue that I wanted to investigate was how practitioners in an applied field sought information in their work; in other words, how they researched. I further narrowed the investigation down to a more specific field, but the paper presented here is left in broader form so that other students can see the approach in more general terms.

  10. Toward Paradigmatic Change in TESOL Methodologies: Building Plurilingual Pedagogies from the Ground Up

    Science.gov (United States)

    Lin, Angel

    2013-01-01

    Contemporary TESOL methodologies have been characterized by compartmentalization of languages in the classroom. However, recent years have seen the beginning signs of paradigmatic change in TESOL methodologies that indicate a move toward plurilingualism. In this article, the author draws on the case of Hong Kong to illustrate how, in the past four…

  11. Figure and Ground in the Visual Cortex: V2 Combines Stereoscopic Cues with Gestalt Rules

    Science.gov (United States)

    Qiu, Fangtu T.; von der Heydt, Rüdiger

    2006-01-01

    Figure-ground organization is a process by which the visual system identifies some image regions as foreground and others as background, inferring three-dimensional (3D) layout from 2D displays. A recent study reported that edge responses of neurons in area V2 are selective for side-of-figure, suggesting that figure-ground organization is encoded in the contour signals (border-ownership coding). Here we show that area V2 combines two strategies of computation, one that exploits binocular stereoscopic information for the definition of local depth order, and another that exploits the global configuration of contours (gestalt factors). These are combined in single neurons so that the ‘near’ side of the preferred 3D edge generally coincides with the preferred side-of-figure in 2D displays. Thus, area V2 represents the borders of 2D figures as edges of surfaces, as if the figures were objects in 3D space. Even in 3D displays gestalt factors influence the responses and can enhance or null the stereoscopic depth information. PMID:15996555

  12. Linking the Intercultural and Grounded Theory: Methodological Issues in Migration Research

    Directory of Open Access Journals (Sweden)

    Vera Sheridan

    2009-01-01

    Full Text Available Connecting intercultural research with Grounded Theory was advocated in the early history of intercultural theorising and includes the development of researchers' intercultural competencies. Such competency comes to the fore where intercultural theory places an equal emphasis on home and host cultures in migration research. In this context we have found a Grounded Theory approach particularly suitable for disentangling complex interlinkings within migration experiences and their individual outcomes. Grounded Theory allows for the exploration of various theories in different fields and the emergence of new or deeper interpretations of intercultural experiences, including where research has not engaged deeply with or avoided intercultural contexts. The use of software, based on Grounded Theory, provides the resource for systematically exploring the inter-related nature of data. In addition, engaging in intercultural research, in particular, raises questions around our practice as social science researchers: adherence to ethics guidelines, for instance, can be in some conflict with the relations we build with members of communities whose cultural values, for instance around friendship or trust, impact on the norms of both our own and institutional expectations. This leads to reflection on the relationship with research participants in terms of our own intercultural experiences and position. URN: urn:nbn:de:0114-fqs0901363

  13. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies, and corresponding computer codes compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and probabilistic seismic hazard. Using the PSHA computer program, the Cumulative Probability Distribution Functions (CPDF) and Probability Distribution Functions (PDF) of the annual exceedance have been investigated for the analysis of the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared with the results from different input parameter spaces.
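A standard step in a PSHA of this kind is converting an annual exceedance rate into a probability of exceedance over an exposure time, assuming Poisson occurrence. A minimal sketch (the rate used below is a generic textbook value, not a result from this study):

```python
import math

def exceedance_prob(annual_rate, years):
    """Poisson conversion of an annual exceedance rate into the probability
    of at least one exceedance during the exposure time:
    P = 1 - exp(-rate * years)."""
    return 1.0 - math.exp(-annual_rate * years)

# The common "10% in 50 years" design level corresponds to an annual
# rate of roughly 1/475 (a 475-year return period):
print(exceedance_prob(1 / 475, 50))  # ≈ 0.10
```

Evaluating this at each ground-motion level (0.1 g to 0.99 g above) turns the annual-exceedance CPDF into the familiar hazard curve for a chosen exposure time.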

  14. Grounding of Six Sigma's Breakthrough Cookbook: How to research a methodology?

    NARCIS (Netherlands)

    de Koning, H.; de Mast, J.

    2005-01-01

    The Six Sigma programme has developed into a standard for quality and efficiency improvement in business and industry. This fact makes scientific research into the validity and applicability of this methodology important. This article explores the possibilities of a scientific study of the methodology.

  16. A Test of a Strong Ground Motion Prediction Methodology for the 7 September 1999, Mw=6.0 Athens Earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Hutchings, L; Ioannidou, E; Voulgaris, N; Kalogeras, I; Savy, J; Foxall, W; Stavrakakis, G

    2004-08-06

    We test a methodology to predict the range of ground-motion hazard for a fixed-magnitude earthquake along a specific fault or within a specific source volume, and we demonstrate how to incorporate this into probabilistic seismic hazard analyses (PSHA). We modeled ground motion with empirical Green's functions. In testing our methodology against the 7 September 1999, Mw=6.0 Athens earthquake, we: (1) developed constraints on rupture parameters based on prior knowledge of earthquake rupture processes and sources in the region; (2) generated impulsive point shear source empirical Green's functions by deconvolving out the source contribution of M < 4.0 aftershocks; (3) used aftershocks that occurred throughout the area and not necessarily along the fault to be modeled; (4) ran a sufficient number of scenario earthquakes to span the full variability of ground motion possible; (5) found that the distribution of synthesized ground motions spans what actually occurred and is realistically narrow; (6) determined that one of our source models generates records that match observed time histories well; (7) found that certain combinations of rupture parameters produced "extreme" ground motions at some stations; (8) identified that the "best fitting" rupture models occurred in the vicinity of 38.05° N, 23.60° E, with the center of rupture near 12 km depth and near-unilateral rupture toward the areas of high damage, consistent with independent investigations; and (9) synthesized strong motion records in high-damage areas for which records from the earthquake were not recorded. We then developed a demonstration PSHA for a source region near Athens utilizing synthesized ground motion rather than traditional attenuation. We synthesized 500 earthquakes distributed throughout the source zone likely to have Mw=6.0 earthquakes near Athens. We assumed an average return period of 1000 years for this

  17. Methodology for the Construction of a Rule-Based Knowledge Base Enabling the Selection of Appropriate Bronze Heat Treatment Parameters Using Rough Sets

    Directory of Open Access Journals (Sweden)

    Górny Z.

    2015-04-01

    Full Text Available Decisions regarding appropriate methods for the heat treatment of bronzes affect the final properties obtained in these materials. This study gives an example of the construction of a knowledge base with application of the rough set theory. Using relevant inference mechanisms, knowledge stored in the rule-based database allows the selection of appropriate heat treatment parameters to achieve the required properties of bronze. The paper presents the methodology and the results of exploratory research. It also discloses the methodology used in the creation of a knowledge base.

  18. Ground-Water Capture Zone Delineation of Hypothetical Systems: Methodology Comparison and Real-World Applications

    Science.gov (United States)

    Ahern, J. A.; Lilly, M. R.; Hinzman, L. D.

    2003-12-01

    A capture zone is the aquifer volume through which ground water flows to a pumping well over a given time of travel. Determining a well's capture zone aids water-supply management by creating an awareness of the water source. This helps ensure sustainable pumping operations and outlines areas where protection from contamination is critical. We are delineating the capture zones of hypothetical conceptual models that resemble the Fairbanks, Alaska floodplain in both aquifer parameters and boundary conditions. We begin with a very simple hydrogeologic system and gradually add complexity such as heterogeneity, anisotropy, multiple wells, and zones of permafrost. Commonly used delineation methods are applied to each case. These include calculated fixed-radius, analytical, and numerical models. The calculated fixed-radius method uses a mathematical equation with several simplifying assumptions. Analytical techniques employ a series of equations that likewise assume simple conditions, although to a lesser degree than the fixed-radius method. Our chosen numerical model is MODFLOW-2000, which offers a particle-tracking package (MODPATH) for delineating recharge areas. The delineations are overlaid for each conceptual model in order to compare the capture zones produced by the different methods. Contrasts between capture zones increase with the complexity of the hydrogeology. Simpler methods are restricted by their underlying assumptions. When methods can no longer account for complexities in the conceptual model, the resulting delineations remain similar to those of simpler models. Meanwhile, the zones generated by more sophisticated methods are able to change with changes to the conceptual model. Hence, the simpler methods now lack accuracy and credibility. We have found that these simpler techniques tend to overestimate the capture zone. Water-supply managers must consider such inaccuracies when evaluating the costs of each method. In addition to comparing delineation
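For the simplest of the methods above, the calculated fixed-radius capture zone reduces to a single volumetric-balance equation; a minimal sketch, with assumed (not Fairbanks-specific) parameter values:

```python
import math

def fixed_radius(Q, t, n, b):
    """Calculated fixed-radius capture zone (volumetric balance): the radius
    r such that pi * r**2 * b * n equals the volume pumped over the travel
    time. Q: pumping rate (m3/day), t: time of travel (days), n: effective
    porosity (-), b: saturated aquifer thickness (m)."""
    return math.sqrt(Q * t / (math.pi * n * b))

# Illustrative values only, not data from the study.
r = fixed_radius(Q=500.0, t=365.0, n=0.25, b=20.0)  # ~108 m
```

The simplifying assumption is that all water within the cylinder of radius `r` reaches the well, ignoring regional gradients, heterogeneity, and boundary effects — exactly the restrictions the abstract notes.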

  19. Assessment of hygienic conditions of ground pepper (Piper nigrum L.) on the market in Sao Paulo City, by means of two methodologies for detecting the light filth

    Science.gov (United States)

    Pepper should be collected, processed, and packed under optimum conditions to avoid the presence of foreign matter. The hygienic conditions of ground pepper marketed in São Paulo city were assessed by determining the presence of foreign matter using two extraction methodologies. This study...

  20. Spectral Analyses and Radiation Exposures from Several Ground-Level Enhancement (GLE) Solar Proton Events: A Comparison of Methodologies

    Science.gov (United States)

    Atwell, William; Tylka, Allan; Dietrich, William; Badavi, Francis; Rojdev, Kristina

    2011-01-01

    Several methods for analyzing the particle spectra from extremely large solar proton events, called Ground-Level Enhancements (GLEs), have been developed and utilized by the scientific community to describe the solar proton energy spectra and have been further applied to ascertain the radiation exposures to humans and radiation-sensitive systems, namely electronics. In this paper 12 GLEs dating back to 1956 are discussed, and the three methods for describing the solar proton energy spectra are reviewed. The three spectral fitting methodologies are EXP [an exponential in proton rigidity (R)], WEIB [Weibull fit: an exponential in proton energy], and the Band function (BAND) [a double power law in proton rigidity]. The EXP and WEIB methods use low-energy (MeV) GLE solar proton data and make extrapolations out to approximately 1 GeV. On the other hand, the BAND method utilizes low- and medium-energy satellite solar proton data combined with high-energy solar proton data deduced from high-latitude neutron monitoring stations. Thus, the BAND method completely describes the entire proton energy spectrum based on actual solar proton observations out to 10 GeV. Using the differential spectra produced from each of the 12 selected GLEs for each of the three methods, radiation exposures are presented and discussed in detail. These radiation exposures are then compared with the current 30-day and annual crew exposure limits and the radiation effects to electronics.
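The EXP and WEIB spectral forms named above can be sketched as simple integral-fluence functions; the parameter values below are placeholders, not fitted GLE results:

```python
import math

def exp_rigidity(R, J0, R0):
    """EXP form: integral fluence falling exponentially in proton rigidity
    R (GV), with normalization J0 and e-folding rigidity R0 (assumed units)."""
    return J0 * math.exp(-R / R0)

def weibull_energy(E, J0, k, a):
    """WEIB form: integral fluence as an exponential (Weibull) in proton
    energy E (MeV), with shape a and scale parameter k."""
    return J0 * math.exp(-k * E**a)

# Illustrative evaluation: fluence above 30 MeV for assumed parameters.
f30 = weibull_energy(30.0, 1e9, 0.05, 0.8)
```

Both forms are monotonically decreasing, which is why extrapolating them from MeV data out to ~1 GeV (as the abstract notes) is the main source of uncertainty relative to the BAND fit anchored by neutron-monitor data.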

  1. Using grounded theory methodology to conceptualize the mother-infant communication dynamic: potential application to compliance with infant feeding recommendations.

    Science.gov (United States)

    Waller, Jennifer; Bower, Katherine M; Spence, Marsha; Kavanagh, Katherine F

    2015-10-01

    Excessive, rapid weight gain in early infancy has been linked to risk of later overweight and obesity. Inappropriate infant feeding practices associated with this rapid weight gain are currently of great interest. Understanding the origin of these practices may increase the effectiveness of interventions. Low-income populations in the Southeastern United States are at increased risk for development of inappropriate infant feeding practices, secondary to the relatively low rates of breastfeeding reported from this region. The objective was to use grounded theory methodology (GTM) to explore interactions between mothers and infants that may influence development of feeding practices, and to do so among low-income, primiparous, Southeastern United States mothers. Analysis of 15 in-depth phone interviews resulted in development of a theoretical model in which Mother-Infant Communication Dynamic emerged as the central concept. The central concept suggests a communication pattern developed over the first year of life, based on a positive feedback loop, which is harmonious and results in the maternal perception of mother and infant now speaking the same language. Importantly, though harmonious, this dynamic may result from inaccurate maternal interpretation of infant cues and behaviours, subsequently leading to inappropriate infant feeding practices. Future research should test this theoretical model using direct observation of mother-infant communication, to increase the understanding of maternal interpretation of infant cues. Subsequently, interventions targeting accurate maternal interpretation of and response to infant cues, and impact on rate of infant weight gain could be tested. If effective, health care providers could potentially use these concepts to attenuate excess rapid infant weight gain. © 2013 John Wiley & Sons Ltd.

  2. Cross-Laboratory Comparative Study of the Impact of Experimental and Regression Methodologies on Salmonella Thermal Inactivation Parameters in Ground Beef.

    Science.gov (United States)

    Hildebrandt, Ian M; Marks, Bradley P; Juneja, Vijay K; Osoria, Marangeli; Hall, Nicole O; Ryser, Elliot T

    2016-07-01

    Isothermal inactivation studies are commonly used to quantify thermal inactivation kinetics of bacteria. Meta-analyses and comparisons utilizing results from multiple sources have revealed large variations in reported thermal resistance parameters for Salmonella, even when measured in similar food materials. Different laboratory or regression methodologies likely are the source of methodology-specific artifacts influencing the estimated parameters; however, such effects have not been quantified. The objective of this study was to evaluate the effects of laboratory and regression methodologies on thermal inactivation data generation, interpretation, modeling, and inherent error, based on data generated in two independent laboratories. The overall experimental design consisted of a cross-laboratory comparison using two independent laboratories (Michigan State University and U.S. Department of Agriculture, Agricultural Research Service, Eastern Regional Research Center [ERRC] laboratories), both conducting isothermal Salmonella inactivation studies (55, 60, 62°C) in ground beef, and each using two methodologies reported in prior studies. Two primary models (log-linear and Weibull) with one secondary model (Bigelow) were fitted to the resultant data using three regression methodologies (two two-step regressions and a one-step regression). Results indicated that laboratory methodology impacted the estimated D60°C- and z-values (α = 0.05), with the ERRC methodology yielding parameter estimates ∼25% larger than the Michigan State University methodology, regardless of the laboratory. Regression methodology also impacted the model and parameter error estimates. Two-step regressions yielded root mean square error values on average 40% larger than the one-step regressions. The Akaike Information Criterion indicated the Weibull as the more correct model in most cases; however, caution should be used to confirm model robustness in application to real-world data. Overall, the
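The log-linear primary model and Bigelow secondary model mentioned above can be sketched with a two-step regression; the data are synthetic (assumed D = 5 min at 55°C and z = 6°C), not the study's measurements:

```python
import numpy as np

def d_value(times, log_counts):
    """Primary model (log-linear): the slope of log10(N) vs time is -1/D."""
    slope, _ = np.polyfit(times, log_counts, 1)
    return -1.0 / slope

def z_value(temps, d_values):
    """Secondary model (Bigelow): the slope of log10(D) vs temperature is -1/z."""
    slope, _ = np.polyfit(temps, np.log10(d_values), 1)
    return -1.0 / slope

times = np.linspace(0.0, 10.0, 6)              # sampling times, min
temps = np.array([55.0, 60.0, 62.0])           # treatment temperatures, deg C
true_D = 5.0 * 10 ** (-(temps - 55.0) / 6.0)   # assumed: D=5 min at 55C, z=6C
ds = [d_value(times, 8.0 - times / D) for D in true_D]  # noiseless survivor curves
z = z_value(temps, ds)                         # recovers the assumed z of 6C
```

A one-step regression would instead fit all temperatures simultaneously in a single nonlinear model, which is the route the abstract reports giving ~40% smaller RMSE.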

  3. Prediction of ground water quality index to assess suitability for drinking purposes using fuzzy rule-based approach

    Science.gov (United States)

    Gorai, A. K.; Hasni, S. A.; Iqbal, Jawed

    2016-11-01

    Groundwater is the most important natural resource for drinking water for many people around the world, especially in rural areas where a supply of treated water is not available. Drinking water resources cannot be optimally used and sustained unless the quality of the water is properly assessed. To this end, an attempt has been made to develop a suitable methodology for the assessment of drinking water quality on the basis of 11 physico-chemical parameters. The present study aims to select the fuzzy aggregation approach for estimating the water quality index of a sample to check its suitability for drinking purposes. Based on experts' opinions and the authors' judgement, 11 water quality (pollutant) variables (Alkalinity, Dissolved Solids (DS), Hardness, pH, Ca, Mg, Fe, Fluoride, As, Sulphate, Nitrates) are selected for the quality assessment. The output of the proposed methodology is compared with the output obtained from the widely used deterministic method (weighted arithmetic mean aggregation) to check the suitability of the developed methodology.
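The deterministic baseline the authors compare against, weighted arithmetic mean aggregation, can be sketched as follows; the subindices and weights are illustrative assumptions, not the paper's values:

```python
def weighted_arithmetic_wqi(subindices, weights):
    """Weighted arithmetic mean aggregation: the water quality index is the
    weight-normalized mean of per-parameter quality subindices (0-100 scale
    assumed here). Subindex and weight values are illustrative."""
    assert len(subindices) == len(weights)
    total_w = sum(weights)
    return sum(q * w for q, w in zip(subindices, weights)) / total_w

# Hypothetical subindices for three of the 11 parameters, with assumed weights.
wqi = weighted_arithmetic_wqi([80.0, 65.0, 90.0], [0.5, 0.3, 0.2])  # 77.5
```

A fuzzy aggregation replaces this crisp weighted mean with membership functions per parameter and fuzzy inference over them, which is the alternative the study develops.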

  4. An assessment methodology for thermal energy storage evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D.R.; Dirks, J.A.; Drost, M.K.; Spanner, G.E.; Williams, T.A.

    1987-11-01

    This report documents an assessment methodology for evaluating the cost, performance, and overall economic feasibility of thermal energy storage (TES) concepts. The methodology was developed by Thermal Energy Storage Evaluation Program personnel at Pacific Northwest Laboratory (PNL) for use by PNL and other TES concept evaluators. The methodology is generically applicable to all TES concepts; however, specific analyses may require additional or more detailed definition of the ground rules, assumptions, and analytical approach. The overall objective of the assessment methodology is to assist in preparing equitable and proper evaluations of TES concepts that will allow developers and end-users to make valid decisions about research and development (R and D) and implementation. The methodology meets this objective by establishing standard approaches, ground rules, assumptions, and definitions that are analytically correct and can be consistently applied by concept evaluators. 15 refs., 4 figs., 13 tabs.

  5. Methodology for evaluating the grounding system in electrical substations; Metodologia para la evaluacion del sistema de puesta a tierra en subestaciones electricas

    Energy Technology Data Exchange (ETDEWEB)

    Torrelles Rivas, L.F [Universidad Nacional Experimental Politecnica: Antonio Jose de Sucre (UNEXPO), Guayana, Bolivar (Venezuela)]. E-mail: torrellesluis@gmail.com; Alvarez, P. [Petroleos de Venezuela S.A (PDVSA), Maturin, Monagas (Venezuela)]. E-mail: alvarezph@pdvsa.com

    2013-03-15

    The present work proposes a methodology for evaluating grounding systems in medium- and high-voltage electrical substations, in order to diagnose the state of the elements of the grounding system and the corresponding electrical variables. The assessment methodology developed includes a visual inspection phase of the elements of the substation. Then, by performing measurements and data analysis, the electrical continuity between the components of the substation and the ground mesh, the soil resistivity, and the resistance of the mesh are verified. The methodology also includes calculation of the step and touch voltages of the substation, based on the criteria of international IEEE standards. We study the case of the 115 kV Pirital Substation belonging to the PDVSA Oriente Transmission Network.
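The step and touch voltage criteria referenced above follow IEEE Std 80; a minimal sketch of the tolerable-voltage equations for a 50 kg body, with assumed surface-layer parameters rather than Pirital Substation data:

```python
import math

def tolerable_touch_50kg(rho_s, Cs, ts):
    """IEEE Std 80 tolerable touch voltage for a 50 kg body (V).
    rho_s: surface layer resistivity (ohm-m), Cs: surface layer derating
    factor, ts: fault duration (s). Values used below are assumptions."""
    return (1000.0 + 1.5 * Cs * rho_s) * 0.116 / math.sqrt(ts)

def tolerable_step_50kg(rho_s, Cs, ts):
    """IEEE Std 80 tolerable step voltage for a 50 kg body (V)."""
    return (1000.0 + 6.0 * Cs * rho_s) * 0.116 / math.sqrt(ts)

# Illustrative: 3000 ohm-m crushed-rock layer, Cs = 0.8, 0.5 s fault.
E_touch = tolerable_touch_50kg(rho_s=3000.0, Cs=0.8, ts=0.5)
E_step = tolerable_step_50kg(rho_s=3000.0, Cs=0.8, ts=0.5)
```

The computed mesh (touch) and step voltages of the grid are then compared against these tolerable limits to pass or fail the design.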

  6. Validation of a ground motion synthesis and prediction methodology for the 1988, M=6.0, Saguenay Earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Hutchings, L.; Jarpe, S.; Kasameyer, P.; Foxall, W.

    1998-01-01

    We model the 1988, M=6.0, Saguenay earthquake. We utilize an approach that has been developed to predict strong ground motion. This approach involves developing a set of rupture scenarios based upon bounds on rupture parameters. Rupture parameters include rupture geometry, hypocenter, rupture roughness, rupture velocity, healing velocity (rise times), slip distribution, asperity size and location, and slip vector. "Scenario" here refers to specific values of these parameters for a hypothesized earthquake. Synthetic strong ground motions are then generated for each rupture scenario. A sufficient number of scenarios are run to span the variability in strong ground motion due to the source uncertainties. By having a suite of rupture scenarios of hazardous earthquakes for a fixed magnitude, and by identifying the hazard to the site from the one-standard-deviation value of engineering parameters, we have introduced a probabilistic component to the deterministic hazard calculation. For this study we developed bounds on rupture scenarios from previous research on this earthquake. The time history closest to the observed ground motion was selected as a model for the Saguenay earthquake.
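The probabilistic component described above, characterizing hazard by the one-standard-deviation value of an engineering parameter across a scenario suite, can be sketched as follows; the lognormal PGA sample is an assumption for illustration, not a Saguenay result:

```python
import numpy as np

# Pretend each scenario run yields one peak ground acceleration (PGA, g).
# The lognormal parameters here are placeholders, not fitted values.
rng = np.random.default_rng(42)
pga_scenarios = rng.lognormal(mean=np.log(0.15), sigma=0.4, size=500)

# Ground-motion variability is conventionally treated in log space, so the
# one-standard-deviation hazard level is exp(mean + std) of ln(PGA).
ln_pga = np.log(pga_scenarios)
median_pga = np.exp(ln_pga.mean())
one_sigma_pga = np.exp(ln_pga.mean() + ln_pga.std())
```

Reporting the one-sigma level rather than a single deterministic value is what folds the source-parameter uncertainty into the hazard estimate.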

7. DOE's topical report on a methodology to assess vibratory ground motion and fault displacement hazards at the Yucca Mountain site

    Energy Technology Data Exchange (ETDEWEB)

    Fenster, D.F. [TRW Environmental Safety Systems Inc., Vienna, VA (United States); Quittmeyer, R.C. [TRW Environmental Safety Systems, Inc., Las Vegas, NV (United States)

    1994-12-31

    The DOE's Office of Civilian Radioactive Waste Management is in the process of characterizing the Yucca Mountain site, Nye County, Nevada, to evaluate its suitability for development of a geologic repository for spent fuel and high-level radioactive waste. If the site is found suitable, many of these data and analyses can be used in an application to the Nuclear Regulatory Commission for a license. The topical report and methodology described in this paper represent a revision of the deterministic and probabilistic approaches to assessing seismic hazards described in DOE's Site Characterization Report. The proposed probabilistic methodology incorporates experience gained while siting and licensing nuclear power plants and other critical facilities during the past decade. In contrast to the traditional deterministic approach, this methodology incorporates all available geologic, geophysical, and seismological data; frequency of occurrence; and variability and uncertainty in both conceptual models and parameters. It also integrates the hazard from all potential sources. Probabilistic approaches have been used primarily to assess hazards from vibratory ground motion, but this approach also applies to assessing fault displacement hazards. The proposed methodology will provide input for design of surface and subsurface facilities (pre- and postclosure periods) and for performance assessment.

  8. Quality Control Methodologies for Advanced EMI Sensor Data Acquisition and Anomaly Classification - Former Southwestern Proving Ground, Arkansas

    Science.gov (United States)

    2015-07-01

    Demonstration report: Quality Control Methodologies for Advanced EMI Sensor Data Acquisition and Anomaly Classification – Former Southwestern Proving Ground, Arkansas. A total of 11.23 acres of dynamic surveys were conducted using the MetalMapper advanced electromagnetic induction (EMI) sensor.

  9. A methodology for extracting knowledge rules from artificial neural networks applied to forecast demand for electric power; Uma metodologia para extracao de regras de conhecimento a partir de redes neurais artificiais aplicadas para previsao de demanda por energia eletrica

    Energy Technology Data Exchange (ETDEWEB)

    Steinmetz, Tarcisio; Souza, Glauber; Ferreira, Sandro; Santos, Jose V. Canto dos; Valiati, Joao [Universidade do Vale do Rio dos Sinos (PIPCA/UNISINOS), Sao Leopoldo, RS (Brazil). Programa de Pos-Graduacao em Computacao Aplicada], Emails: trsteinmetz@unisinos.br, gsouza@unisinos.br, sferreira, jvcanto@unisinos.br, jfvaliati@unisinos.br

    2009-07-01

    We present a methodology for the extraction of rules from Artificial Neural Networks (ANN) trained to forecast electric load demand. The rules are able to express the knowledge about the behavior of load demand acquired by the ANN during the training process. The rules are presented to the user in an easy-to-read format, IF premise THEN consequence, where the premise relates to the input data submitted to the ANN (mapped as fuzzy sets) and the consequence appears as a linear equation describing the output the ANN should present if the premise holds true. Experimentation demonstrates the method's capacity for acquiring and presenting high-quality rules from neural networks trained to forecast electric load demand for several amounts of time in the future. (author)
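A rule of the IF premise THEN consequence form described above can be sketched in Takagi-Sugeno style; the membership functions, coefficients, and variable names are hypothetical, not extracted from the authors' networks:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function with support [a, c] and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def rule_fire(temp_c, prev_load_mw):
    """One hypothetical extracted rule:
    IF temp is MODERATE AND prev_load is MEDIUM
    THEN load = 0.9*prev_load + 12*temp - 50 (linear consequence).
    Returns (firing strength, consequence)."""
    premise = min(triangular(temp_c, 15.0, 25.0, 35.0),
                  triangular(prev_load_mw, 800.0, 1000.0, 1200.0))
    consequence = 0.9 * prev_load_mw + 12.0 * temp_c - 50.0
    return premise, consequence

strength, forecast = rule_fire(temp_c=28.0, prev_load_mw=950.0)
```

In a full rule base, the forecasts of all fired rules would be combined weighted by their firing strengths.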

  10. Methodological Grounds of Formation of the Scorecard of Diagnostics of the Innovation Component of Technological Processes of Industrial Enterprises

    Directory of Open Access Journals (Sweden)

    Zhezhuha Volodymyr Yo.

    2014-01-01

    Full Text Available The goal of the article is to develop scientific provisions on the methodological specifics of forming a scorecard for diagnosing the innovation component of technological processes of industrial enterprises. The article shows that the views of theoreticians and practitioners on this problem are fragmentary, and justifies the need to form these indicators from a system perspective, that is, accounting for all interconnections and interdependencies between them and covering the main factors that identify the diagnosed innovation component. Based on the results of the study, the article specifies the system properties of the scorecard, reveals the possibility of applying two alternative approaches to its formation, and establishes their advantages and shortcomings. It gives a number of requirements that have to be met when selecting, developing and forming the scorecard, and reflects key issues that should form the basis of the methodological approach to its formation. It presents a typology of diagnostic indicators that gives subjects of diagnostics the possibility to select relevant indicators depending on the set criteria and restrictions. Prospects for further studies in this direction lie in developing a specific scorecard for each innovation component of technological processes of industrial enterprises, with consideration of the methodological specifics described in the article.

  11. Games and Diabetes: A Review Investigating Theoretical Frameworks, Evaluation Methodologies, and Opportunities for Design Grounded in Learning Theories.

    Science.gov (United States)

    Lazem, Shaimaa; Webster, Mary; Holmes, Wayne; Wolf, Motje

    2015-09-02

    Here we review 18 articles that describe the design and evaluation of 1 or more games for diabetes from technical, methodological, and theoretical perspectives. We undertook searches covering the period 2010 to May 2015 in the ACM, IEEE, Journal of Medical Internet Research, Studies in Health Technology and Informatics, and Google Scholar online databases using the keywords "children," "computer games," "diabetes," "games," "type 1," and "type 2" in various Boolean combinations. The review sets out to establish, for future research, an understanding of the current landscape of digital games designed for children with diabetes. We briefly explored the use and impact of well-established learning theories in such games. The most frequently mentioned theoretical frameworks were social cognitive theory and social constructivism. Due to the limitations of the reported evaluation methodologies, little evidence was found to support the strong promise of games for diabetes. Furthermore, we could not establish a relation between design features and the game outcomes. We argue that an in-depth discussion about the extent to which learning theories could and should be manifested in the design decisions is required. © 2015 Diabetes Technology Society.

  12. A cost effective and operational methodology for wall to wall Above Ground Biomass (AGB) and carbon stocks estimation and mapping: Nepal REDD+

    Science.gov (United States)

    Gilani, H., Sr.; Ganguly, S.; Zhang, G.; Koju, U. A.; Murthy, M. S. R.; Nemani, R. R.; Manandhar, U.; Thapa, G. J.

    2015-12-01

    Nepal is a landlocked country with 39% forest cover of the total land area (147,181 km2). Under the Forest Carbon Partnership Facility (FCPF), implemented by the World Bank (WB), Nepal was chosen as one of four countries best suited for a results-based payment system under the Reducing Emissions from Deforestation and Forest Degradation (REDD and REDD+) scheme. At the national level, Landsat-based analysis shows that from 1990 to 2000 the forest area declined by 2%, i.e. by 1,467 km2, whereas from 2000 to 2010 it declined by only 0.12%, i.e. 176 km2. A cost-effective monitoring and evaluation system for REDD+ requires a balanced approach of remote sensing and ground measurements. This paper provides, for Nepal, a cost-effective and operational 30 m Above Ground Biomass (AGB) estimation and mapping methodology using freely available satellite data integrated with field inventory. Leaf Area Index (LAI) was generated using the methodology proposed by Ganguly et al. (2012) from cloud-free Landsat-8 OLI images. To generate a tree canopy height map, a density scatter graph between the maximum height estimated by the Geoscience Laser Altimeter System (GLAS) on the Ice, Cloud, and Land Elevation Satellite (ICESat) and the Landsat LAI nearest to the center coordinates of the GLAS shots shows a moderate but significant exponential correlation (Hmax = 31.211*LAI^0.4593, R2 = 0.33, RMSE = 13.25 m). In the field, 1,124 well-distributed circular plots (750 m2 and 500 m2; a 0.001% sample of the forest cover) were measured and used to estimate AGB (ton/ha) using the equations proposed by Sharma et al. (1990) for all tree species of Nepal. A satisfactory linear relationship (AGB = 8.7018*Hmax - 101.24, R2 = 0.67, RMSE = 7.2 ton/ha) was achieved between maximum canopy height (Hmax) and AGB (ton/ha). This cost-effective and operational methodology is replicable over 5-10 years with minimum ground samples through integration of satellite images. The developed AGB map was used to produce optimum fuel wood scenarios using population and road
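The two fitted relations quoted above can be chained into a per-pixel AGB estimate; the LAI input is an arbitrary illustration:

```python
def hmax_from_lai(lai):
    """Canopy height (m) from Landsat LAI, per the abstract's fitted
    power law: Hmax = 31.211 * LAI**0.4593 (R2 = 0.33)."""
    return 31.211 * lai ** 0.4593

def agb_from_hmax(hmax):
    """Above Ground Biomass (ton/ha) from maximum canopy height, per the
    abstract's linear model: AGB = 8.7018 * Hmax - 101.24 (R2 = 0.67)."""
    return 8.7018 * hmax - 101.24

# Illustrative pixel with an assumed LAI of 3.0.
agb = agb_from_hmax(hmax_from_lai(3.0))
```

Applied pixel-by-pixel to the 30 m LAI map, this chain yields the wall-to-wall AGB map without requiring exhaustive field sampling.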

  13. Grounded theory.

    Science.gov (United States)

    Harris, Tina

    2015-04-29

    Grounded theory is a popular research approach in health care and the social sciences. This article provides a description of grounded theory methodology and its key components, using examples from published studies to demonstrate practical application. It aims to demystify grounded theory for novice nurse researchers, by explaining what it is, when to use it, why they would want to use it and how to use it. It should enable nurse researchers to decide if grounded theory is an appropriate approach for their research, and to determine the quality of any grounded theory research they read.

  14. Grounded cognition.

    Science.gov (United States)

    Barsalou, Lawrence W

    2008-01-01

    Grounded cognition rejects traditional views that cognition is computation on amodal symbols in a modular system, independent of the brain's modal systems for perception, action, and introspection. Instead, grounded cognition proposes that modal simulations, bodily states, and situated action underlie cognition. Accumulating behavioral and neural evidence supporting this view is reviewed from research on perception, memory, knowledge, language, thought, social cognition, and development. Theories of grounded cognition are also reviewed, as are origins of the area and common misperceptions of it. Theoretical, empirical, and methodological issues are raised whose future treatment is likely to affect the growth and impact of grounded cognition.

  15. A New Methodology for Open Pit Slope Design in Karst-Prone Ground Conditions Based on Integrated Stochastic-Limit Equilibrium Analysis

    Science.gov (United States)

    Zhang, Ke; Cao, Ping; Ma, Guowei; Fan, Wenchen; Meng, Jingjing; Li, Kaihui

    2016-07-01

    Using the Chengmenshan Copper Mine as a case study, a new methodology for open pit slope design in karst-prone ground conditions is presented based on integrated stochastic-limit equilibrium analysis. The numerical modeling and optimization design procedure comprises collection of drill core data, karst cave stochastic model generation, SLIDE simulation, and bisection method optimization. Borehole investigations are performed, and the statistical result shows that the length of the karst cave fits a negative exponential distribution model, but the length of carbonatite does not exactly follow any standard distribution. The inverse transform method and acceptance-rejection method are used to reproduce the length of the karst cave and carbonatite, respectively. A code for karst cave stochastic model generation, named KCSMG, is developed. The stability of the rock slope with the karst cave stochastic model is analyzed by combining the KCSMG code and the SLIDE program. This approach is then applied to study the effect of the karst cave on the stability of the open pit slope, and a procedure to optimize the open pit slope angle is presented.
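Sampling cave lengths from the fitted negative exponential model via the inverse transform method, as described above, can be sketched as follows; the mean length is an assumed value, not the mine's fitted parameter:

```python
import math
import random

def sample_cave_length(mean_length, u=None):
    """Inverse transform sampling for an exponential cave-length model:
    with CDF F(L) = 1 - exp(-L/mean), solving F(L) = u gives
    L = -mean * ln(1 - u) for uniform u in [0, 1)."""
    if u is None:
        u = random.random()
    return -mean_length * math.log(1.0 - u)

# Illustrative: draw a synthetic cave population with an assumed 2.5 m mean.
random.seed(1)
lengths = [sample_cave_length(2.5) for _ in range(10000)]
avg = sum(lengths) / len(lengths)  # converges toward the assumed mean
```

The carbonatite lengths, which fit no standard distribution, would instead be drawn with acceptance-rejection against an empirical density, as the abstract notes.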

  16. Methodology for determining whether an increase in a state's child poverty rate is the result of the TANF program--Administration for Children and Families, HHS. Proposed rule.

    Science.gov (United States)

    1998-09-23

    The Administration for Children and Families is proposing a methodology to determine the child poverty rate in each State. If a State experiences an increase in its child poverty rate of 5 percent or more as a result of its Temporary Assistance for Needy Families (TANF) program, the State must submit and implement a corrective action plan. This requirement is a part of the new welfare reform block grant program enacted in 1996.

  17. Top Level Space Cost Methodology (TLSCM)

    Science.gov (United States)

    2007-11-02

[Extraction fragment of the report's table of contents and body text.] Contents include: Software; ACEIT; C. Ground Rules and Assumptions; D. Typical Life Cycle Cost Distribution; E. Methodologies (1. Cost/Budget Threshold; 2. Analogy), which is based on real-time Air Force and space programs (Ref. 25:2-8, 2-9). ACEIT: Automated Cost Estimating Integrated Tools (ACEIT), Tecolote Research, Inc. The ACEIT cost program can be used to obtain a print-out of an expanded WBS; the report advises finding someone with ACEIT experience.

  18. Medicare Program; Medicare Shared Savings Program; Accountable Care Organizations--Revised Benchmark Rebasing Methodology, Facilitating Transition to Performance-Based Risk, and Administrative Finality of Financial Calculations. Final rule.

    Science.gov (United States)

    2016-06-10

    Under the Medicare Shared Savings Program (Shared Savings Program), providers of services and suppliers that participate in an Accountable Care Organization (ACO) continue to receive traditional Medicare fee-for-service (FFS) payments under Parts A and B, but the ACO may be eligible to receive a shared savings payment if it meets specified quality and savings requirements. This final rule addresses changes to the Shared Savings Program, including: Modifications to the program's benchmarking methodology, when resetting (rebasing) the ACO's benchmark for a second or subsequent agreement period, to encourage ACOs' continued investment in care coordination and quality improvement; an alternative participation option to encourage ACOs to enter performance-based risk arrangements earlier in their participation under the program; and policies for reopening of payment determinations to make corrections after financial calculations have been performed and ACO shared savings and shared losses for a performance year have been determined.

  19. Exploring the use of grounded theory as a methodological approach to examine the 'black box' of network leadership in the national quality forum.

    Science.gov (United States)

    Hoflund, A Bryce

    2013-01-01

    This paper describes how grounded theory was used to investigate the "black box" of network leadership in the creation of the National Quality Forum. Scholars are beginning to recognize the importance of network organizations and are in the embryonic stages of collecting and analyzing data about network leadership processes. Grounded theory, with its focus on deriving theory from empirical data, offers researchers a distinctive way of studying little-known phenomena and is therefore well suited to exploring network leadership processes. Specifically, this paper provides an overview of grounded theory, a discussion of the appropriateness of grounded theory to investigating network phenomena, a description of how the research was conducted, and a discussion of the limitations and lessons learned from using this approach.

  20. Legal Methodology Study in the Context of Ruling China by Law: Report on Legal Methodology Study of China (2014)

    Institute of Scientific and Technical Information of China (English)

    孙光宁; 焦宝乾

    2015-01-01

In 2014, research in legal methodology focused on its practical function, aiming to contribute directly to legal scholarship and to the construction of the rule of law. In basic theory, legal dogmatics and the social-scientific study of law confronted each other, though not severely; the underlying consensus is that different legal theories can learn from one another, each supplying its own analytical perspective as a reference for legal practice. Within the theoretical system itself, studies of specific legal methods such as legal interpretation and legal argumentation emphasized historical lessons and case analysis, again in the service of current practice. In the practice of building the rule of law, legal methodology matters to both macro-level ideas of the rule of law and micro-level judicial institutions, the guiding-case system being a typical example. Attending to the practice of the rule of law and supplying theoretical reference for its construction will remain the overall orientation of legal methodology study.

  1. Ground-penetrating radar for sedimentology: methodological advances and examples from the Usumacinta-Grijalva delta plain, Tabasco, México

    NARCIS (Netherlands)

    Van Dam, Remke; Nooren, Kees; Dogan, Mine; Hoek, Wim

    2014-01-01

Ground-penetrating radar (GPR) is widely used as a tool for imaging sedimentary structures and reconstructing depositional history in a range of settings. Most GPR systems use a pair of dipole antennas to transmit and receive electromagnetic energy, typically in the frequency range of 0.025-1 GHz.

  2. Revised Rules for Concrete Bridges

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Jensen, F. M.; Middleton, C.;

This paper is based on research performed for the Highway Agency, London, UK under the project DPU/9/44 "Revision of Bridge Assessment Rules Based on Whole Life Performance: Concrete Bridges". It contains details of a methodology which can be used to generate Whole Life (WL) reliability profiles. These WL reliability profiles may be used to establish revised rules for concrete bridges.

  3. From Darwin to constructivism: the evolution of grounded theory.

    Science.gov (United States)

    Hall, Helen; Griffiths, Debra; McKenna, Lisa

    2013-01-01

    To explore the evolution of grounded theory and equip the reader with a greater understanding of the diverse conceptual positioning that is evident in the methodology. Grounded theory was developed during the modernist phase of research to develop theories that are derived from data and explain human interaction. Its philosophical foundations derive from symbolic interactionism and were influenced by a range of scholars including Charles Darwin and George Mead. Rather than a rigid set of rules and procedures, grounded theory is a way of conceptualising data. Researchers demonstrate a range of perspectives and there is significant variation in the way the methodology is interpreted and executed. Some grounded theorists continue to align closely with the original post-positivist view, while others take a more constructivist approach. Although the diverse interpretations accommodate flexibility, they may also result in confusion. The grounded theory approach enables researchers to align to their own particular world view and use methods that are flexible and practical. With an appreciation of the diverse philosophical approaches to grounded theory, researchers are enabled to use and appraise the methodology more effectively.

  4. Cortical dynamics of figure-ground separation in response to 2D pictures and 3D scenes:How V2 combines border ownership, stereoscopic cues, and Gestalt grouping rules

    Directory of Open Access Journals (Sweden)

Stephen Grossberg

    2016-01-01

The FACADE model, and its laminar cortical realization and extension in the 3D LAMINART model, have explained, simulated, and predicted many perceptual and neurobiological data about how the visual cortex carries out 3D vision and figure-ground perception, and how these cortical mechanisms enable 2D pictures to generate 3D percepts of occluding and occluded objects. In particular, these models have proposed how border ownership occurs, but have not yet explicitly explained the correlation between multiple properties of border ownership neurons in cortical area V2 that were reported in a remarkable series of neurophysiological experiments by von der Heydt and his colleagues; namely, border ownership, contrast preference, binocular stereoscopic information, selectivity for side-of-figure, Gestalt rules, and strength of attentional modulation, as well as the time course during which such properties arise. This article shows how, by combining 3D LAMINART properties that were discovered in two parallel streams of research, a unified explanation of these properties emerges. This explanation proposes, moreover, how these properties contribute to the generation of consciously seen 3D surfaces. The first research stream models how processes like 3D boundary grouping and surface filling-in interact in multiple stages within and between the V1 interblob – V2 interstripe – V4 cortical stream and the V1 blob – V2 thin stripe – V4 cortical stream, respectively. Of particular importance for understanding figure-ground separation is how these cortical interactions convert computationally complementary boundary and surface mechanisms into a consistent conscious percept, including the critical use of surface contour feedback signals from surface representations in V2 thin stripes to boundary representations in V2 interstripes. Remarkably, key figure-ground properties emerge from these feedback interactions. The second research stream shows how cells that

  5. [Methodology of the approach to express-estimation of radiation risk for public health under the influence of radionuclides present in the ground waters].

    Science.gov (United States)

    Korenkov, I P; Lashchenova, T N; Klochkova, N V

    2013-01-01

A methodological approach for express estimation of individual lifetime cancer risk due to the use of groundwater by the population for drinking is proposed. The risk is calculated using only the specific activity of 226Ra in the groundwater. Formulas are given for calculating the individual lifetime cancer risk for the oral and inhalation routes of exposure when groundwater is used by the population for drinking purposes.
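The oral-route part of such an express estimate can be sketched as below. All coefficient values are illustrative stand-ins (an ICRP-style ingestion dose coefficient for 226Ra and a nominal risk factor per sievert), not the values used in the cited paper, and the function name is hypothetical.

```python
def lifetime_cancer_risk(activity_bq_per_l,
                         intake_l_per_day=2.0,
                         dose_coeff_sv_per_bq=2.8e-7,
                         risk_per_sv=5.5e-2,
                         years=70):
    """Express lifetime cancer risk from 226Ra activity in drinking water
    (oral route only): activity -> annual intake -> lifetime dose -> risk.

    Defaults are illustrative: 2 L/day water intake, an ICRP-72-style
    ingestion dose coefficient for 226Ra, a nominal detriment of
    5.5e-2 per Sv, and a 70-year exposure period.
    """
    annual_intake_bq = activity_bq_per_l * intake_l_per_day * 365.0
    lifetime_dose_sv = annual_intake_bq * dose_coeff_sv_per_bq * years
    return lifetime_dose_sv * risk_per_sv
```

For a screening ("express") assessment, the result would be compared against a reference risk level such as 1e-4 to decide whether fuller analysis is needed.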

  6. Cortical Dynamics of Figure-Ground Separation in Response to 2D Pictures and 3D Scenes: How V2 Combines Border Ownership, Stereoscopic Cues, and Gestalt Grouping Rules.

    Science.gov (United States)

    Grossberg, Stephen

    2015-01-01

    The FACADE model, and its laminar cortical realization and extension in the 3D LAMINART model, have explained, simulated, and predicted many perceptual and neurobiological data about how the visual cortex carries out 3D vision and figure-ground perception, and how these cortical mechanisms enable 2D pictures to generate 3D percepts of occluding and occluded objects. In particular, these models have proposed how border ownership occurs, but have not yet explicitly explained the correlation between multiple properties of border ownership neurons in cortical area V2 that were reported in a remarkable series of neurophysiological experiments by von der Heydt and his colleagues; namely, border ownership, contrast preference, binocular stereoscopic information, selectivity for side-of-figure, Gestalt rules, and strength of attentional modulation, as well as the time course during which such properties arise. This article shows how, by combining 3D LAMINART properties that were discovered in two parallel streams of research, a unified explanation of these properties emerges. This explanation proposes, moreover, how these properties contribute to the generation of consciously seen 3D surfaces. The first research stream models how processes like 3D boundary grouping and surface filling-in interact in multiple stages within and between the V1 interblob-V2 interstripe-V4 cortical stream and the V1 blob-V2 thin stripe-V4 cortical stream, respectively. Of particular importance for understanding figure-ground separation is how these cortical interactions convert computationally complementary boundary and surface mechanisms into a consistent conscious percept, including the critical use of surface contour feedback signals from surface representations in V2 thin stripes to boundary representations in V2 interstripes. Remarkably, key figure-ground properties emerge from these feedback interactions. The second research stream shows how cells that compute absolute disparity in

  7. Implementing XML Schema Naming and Design Rules

    Energy Technology Data Exchange (ETDEWEB)

Lubell, Joshua [National Institute of Standards and Technology (NIST)]; Kulvatunyou, Boonserm [ORNL]; Morris, Katherine [National Institute of Standards and Technology (NIST)]; Harvey, Betty [Electronic Commerce Connection, Inc.]

    2006-08-01

    We are building a methodology and tool kit for encoding XML schema Naming and Design Rules (NDRs) in a computer-interpretable fashion, enabling automated rule enforcement and improving schema quality. Through our experience implementing rules from various NDR specifications, we discuss some issues and offer practical guidance to organizations grappling with NDR development.

  8. Cross-validation Methodology between Ground and GPM Satellite-based Radar Rainfall Product over Dallas-Fort Worth (DFW) Metroplex

    Science.gov (United States)

    Chen, H.; Chandrasekar, V.; Biswas, S.

    2015-12-01

    Over the past two decades, a large number of rainfall products have been developed based on satellite, radar, and/or rain gauge observations. However, to produce optimal rainfall estimation for a given region is still challenging due to the space time variability of rainfall at many scales and the spatial and temporal sampling difference of different rainfall instruments. In order to produce high-resolution rainfall products for urban flash flood applications and improve the weather sensing capability in urban environment, the center for Collaborative Adaptive Sensing of the Atmosphere (CASA), in collaboration with National Weather Service (NWS) and North Central Texas Council of Governments (NCTCOG), has developed an urban radar remote sensing network in DFW Metroplex. DFW is the largest inland metropolitan area in the U.S., that experiences a wide range of natural weather hazards such as flash flood and hailstorms. The DFW urban remote sensing network, centered by the deployment of eight dual-polarization X-band radars and a NWS WSR-88DP radar, is expected to provide impacts-based warning and forecasts for benefit of the public safety and economy. High-resolution quantitative precipitation estimation (QPE) is one of the major goals of the development of this urban test bed. In addition to ground radar-based rainfall estimation, satellite-based rainfall products for this area are also of interest for this study. Typical example is the rainfall rate product produced by the Dual-frequency Precipitation Radar (DPR) onboard Global Precipitation Measurement (GPM) Core Observatory satellite. Therefore, cross-comparison between ground and space-based rainfall estimation is critical to building an optimal regional rainfall system, which can take advantages of the sampling differences of different sensors. This paper presents the real-time high-resolution QPE system developed for DFW urban radar network, which is based upon the combination of S-band WSR-88DP and X

  9. Effects of Different Methods on the Comparison between Land Surface and Ground Phenology—A Methodological Case Study from South-Western Germany

    Directory of Open Access Journals (Sweden)

    Gourav Misra

    2016-09-01

Several methods exist for extracting plant phenological information from time series of satellite data. However, there have been only a few successful attempts to temporally match satellite observations (Land Surface Phenology, or LSP) with ground-based phenological observations (Ground Phenology, or GP). The classical pixel-to-point matching problem, along with the temporal and spatial resolution of remote sensing data, is among the many issues encountered. In this study, time series of the MODIS sensor's Normalized Difference Vegetation Index (NDVI) were smoothed using two filtering techniques for comparison. Several start of season (SOS) methods established in the literature, namely thresholds of amplitude, derivatives, and delayed moving average, were tested for determination of LSP-SOS for broadleaf forests at a site in southwestern Germany using 2001-2013 time series of NDVI data. Comparing the different LSP-SOS estimates with a species-rich GP dataset revealed that different LSP-SOS extraction methods agree better with specific phases of GP, and that the choice of data processing or smoothing strongly affects the LSP-SOS extracted. LSP methods mirroring late SOS dates, i.e., 75% amplitude and 1st derivative, showed a better match in means and trends, and high, significant correlations of up to 0.7 with leaf unfolding and greening of late understory and broadleaf tree species. GP-SOS of early understory leaf unfolding were partly significantly correlated with earlier-detecting LSP-SOS, i.e., 20% amplitude and 3rd derivative. Early understory SOS were, however, more difficult to detect from NDVI due to the lack of high-resolution land cover information.
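An amplitude-threshold SOS extraction of the kind compared above can be sketched as follows. The function name and the linear-interpolation detail are illustrative assumptions, not the authors' implementation; the `fraction` parameter corresponds to the 20% or 75% amplitude thresholds mentioned in the abstract.

```python
def sos_amplitude_threshold(doy, ndvi, fraction=0.2):
    """Start of season as the first day-of-year at which the (smoothed)
    NDVI rises above `fraction` of its seasonal amplitude.

    Linear interpolation between the bracketing observations gives a
    sub-step estimate of the crossing date. Returns None if the series
    never crosses the threshold.
    """
    lo, hi = min(ndvi), max(ndvi)
    thresh = lo + fraction * (hi - lo)
    for i in range(1, len(ndvi)):
        if ndvi[i - 1] < thresh <= ndvi[i]:
            # interpolate the fractional position of the crossing
            t = (thresh - ndvi[i - 1]) / (ndvi[i] - ndvi[i - 1])
            return doy[i - 1] + t * (doy[i] - doy[i - 1])
    return None
```

With `fraction=0.75` the same routine yields the late-SOS variant that the study found to correlate best with broadleaf tree green-up.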

  10. [Introduction to grounded theory].

    Science.gov (United States)

    Wang, Shou-Yu; Windsor, Carol; Yates, Patsy

    2012-02-01

    Grounded theory, first developed by Glaser and Strauss in the 1960s, was introduced into nursing education as a distinct research methodology in the 1970s. The theory is grounded in a critique of the dominant contemporary approach to social inquiry, which imposed "enduring" theoretical propositions onto study data. Rather than starting from a set theoretical framework, grounded theory relies on researchers distinguishing meaningful constructs from generated data and then identifying an appropriate theory. Grounded theory is thus particularly useful in investigating complex issues and behaviours not previously addressed and concepts and relationships in particular populations or places that are still undeveloped or weakly connected. Grounded theory data analysis processes include open, axial and selective coding levels. The purpose of this article was to explore the grounded theory research process and provide an initial understanding of this methodology.

  11. Characteristics of ground state electronic structures of ionized atoms and rules of their orbital competitions

    Institute of Scientific and Technical Information of China (English)

    金锐; 高翔; 曾德灵; 顾春; 岳现房; 李家明

    2016-01-01

Ionized atoms are ubiquitous in plasmas, and their properties are an important foundation for frontier research fields such as astrophysics and controlled nuclear fusion. Based on the independent-electron approximation, we systematically study the ground state electronic structures of all neutral and ionized atoms of the extended periodic-table elements (2 ≤ Z ≤ 119). Furthermore, with the Dirac-Slater method we can obtain the localized self-consistent potential, and thereby study the orbital competition rules for different atoms. Using three designed atomic orbital competition graphs, all of our calculated ground configurations for over 7000 ionized atoms are conveniently expressed. We systematically summarize the rules of orbital competition for elements in different periods, and elucidate the mechanism of orbital competition (i.e., orbital collapse) with the help of the self-consistent atomic potential of ionized atoms. We also compare the orbital competition rules for different periods of transition elements, rare-earth and transuranium elements with the variation of the self-consistent field across periods. On this basis, we summarize the relationship between orbital competitions and bulk properties of some elements, such as superconductivity, optical properties, mechanical strength, and chemical activity. We find that there exist some "abnormal" orbital competitions for some lowly ionized and neutral atoms, which may lead to unique bulk properties of the element. With the ground state electronic structures of ionized atoms, we can construct the basis of accurate quasi-complete configuration interaction (CI) calculations, and further accurately calculate physical quantities such as energy levels, transition rates, and collision cross sections, thereby meeting the requirements of scientific research such as the analysis of high-power free-electron laser experiments and the accurate measurement of nuclear masses.

  12. The Grounded Theory Bookshelf

    Directory of Open Access Journals (Sweden)

    Vivian B. Martin, Ph.D.

    2005-03-01

Bookshelf will provide critical reviews and perspectives on books on theory and methodology of interest to grounded theory. This issue includes a review of Heaton's Reworking Qualitative Data, of special interest for some of its references to grounded theory as a secondary analysis tool; and Goulding's Grounded Theory: A practical guide for management, business, and market researchers, a book that attempts to explicate the method and presents a grounded theory study that falls a little short of the mark of a fully elaborated theory. Reworking Qualitative Data, Janet Heaton (Sage, 2004). Paperback, 176 pages, $29.95. Hardcover also available.

  13. Belief rule base inference methodology for two-sided matching decision with multi-attribute

    Institute of Scientific and Technical Information of China (English)

    方志坚; 杨隆浩; 傅仰耿; 陈建华

    2016-01-01

This paper presents a tentative study of a new two-sided matching approach, proposed to solve two-sided matching problems with uncertain information and multiple attributes. The multi-attribute matching decision making (MAMDM) problem is a key topic in two-sided matching research and has attracted considerable scholarly attention in recent years. A belief rule-base inference methodology using the evidential reasoning approach (RIMER) is introduced to solve the MAMDM problem. The authors first explain why belief degrees are used: current research on MAMDM is largely restricted to matching problems whose evaluation information consists of linguistic or interval values, and the use of belief degrees as evaluation values has been little studied. Because belief degrees can express many kinds of uncertain and incomplete information, adopting them as evaluation values may open a new direction in MAMDM research. Through analysis of simulation data and the application of RIMER, belief-degree evaluation information is converted into confidence information at different levels, from which a 0-1 programming model is built to obtain the final matching scheme. The paper also points out that an output error may occur when a belief rule base (BRB) input exceeds its threshold value; to address this, the input can be truncated into the admissible range, or, where truncation is unsuitable, a linear mapping can be applied to reduce the influence on the results. A case study shows that the proposed approach is feasible and effective for solving multi-attribute matching decision problems.
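The final 0-1 programming step can be illustrated with a brute-force sketch. All names are hypothetical, and the scores are assumed given: in the paper they would come from belief-degree inference via RIMER, which is not reproduced here.

```python
from itertools import permutations

def best_one_to_one_matching(score):
    """Exhaustive solution of a 0-1 assignment model: choose x[i][j] in
    {0, 1} so that each agent on either side is matched at most once and
    the total matching degree is maximal.

    `score[i][j]` is the aggregated two-sided satisfaction of pairing
    agent i (side A) with agent j (side B). Assumes len(score) <=
    len(score[0]), i.e. side A is the smaller side.
    """
    n_a, n_b = len(score), len(score[0])
    best_val, best_pairs = float("-inf"), None
    for perm in permutations(range(n_b), n_a):
        val = sum(score[i][j] for i, j in enumerate(perm))
        if val > best_val:
            best_val, best_pairs = val, list(enumerate(perm))
    return best_val, best_pairs
```

Enumeration is exponential in the number of agents; a real solver would hand the same 0-1 model to an integer-programming package, but the objective and constraints are identical.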

  14. Spatio-Temporal Rule Mining

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach

    2005-01-01

Recent advances in communication and information technology, such as the increasing accuracy of GPS technology and the miniaturization of wireless communication devices, pave the road for Location-Based Services (LBS). To achieve high quality for such services, spatio-temporal data mining techniques are needed. In this paper, we describe experiences with spatio-temporal rule mining in a Danish data mining company. First, a number of real-world spatio-temporal data sets are described, leading to a taxonomy of spatio-temporal data. Second, the paper describes a general methodology that transforms the spatio-temporal rule mining task to the traditional market basket analysis task and applies it to the described data sets, enabling traditional association rule mining methods to discover spatio-temporal rules for LBS. Finally, unique issues in spatio-temporal rule mining are identified and discussed.
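The transformation from spatio-temporal records to market baskets can be sketched as follows. The discretization scheme (grid cells and time slots) and all names are illustrative assumptions, not the company's actual pipeline.

```python
from collections import Counter
from itertools import combinations

def spatio_temporal_transactions(events, cell=0.01, slot=3600):
    """Map raw (user, lon, lat, timestamp) records to 'market baskets':
    one transaction per user, whose items are the discretized
    (grid-cell, time-slot) regions the user visited. Any standard
    association-rule miner can then run on the result."""
    baskets = {}
    for user, lon, lat, ts in events:
        item = (round(lon / cell), round(lat / cell), int(ts // slot))
        baskets.setdefault(user, set()).add(item)
    return list(baskets.values())

def frequent_pairs(baskets, min_support=2):
    """Tiny frequent-itemset step: count item pairs that co-occur in a
    basket, keeping those reaching the minimum support count."""
    counts = Counter()
    for basket in baskets:
        for pair in combinations(sorted(basket), 2):
            counts[pair] += 1
    return {p: c for p, c in counts.items() if c >= min_support}
```

Frequent pairs such as (cell A at hour h, cell B at hour h) are exactly the raw material from which spatio-temporal association rules for LBS would be derived.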

  15. Grounded Theory Approach in Social Research

    Directory of Open Access Journals (Sweden)

    Dr Venkat Pulla

    2014-11-01

This paper discusses grounded theory, one of the newer methodologies to become popular with social researchers since its evolution in the late 1960s. The paper discusses the principles and processes of grounded theory and then explores the nature of codes, the coding process, and the concept of saturation. It then goes on to discuss the pros and cons of, and the arguments for and against, the use of grounded theory methodology in social research, and explores the applicability of this methodology in producing a sound theoretical basis for practice. Selected narratives from the author's recent studies are used to explain the processes of grounded theory methodology.

  16. Collaboration rules.

    Science.gov (United States)

    Evans, Philip; Wolf, Bob

    2005-01-01

    Corporate leaders seeking to boost growth, learning, and innovation may find the answer in a surprising place: the Linux open-source software community. Linux is developed by an essentially volunteer, self-organizing community of thousands of programmers. Most leaders would sell their grandmothers for workforces that collaborate as efficiently, frictionlessly, and creatively as the self-styled Linux hackers. But Linux is software, and software is hardly a model for mainstream business. The authors have, nonetheless, found surprising parallels between the anarchistic, caffeinated, hirsute world of Linux hackers and the disciplined, tea-sipping, clean-cut world of Toyota engineering. Specifically, Toyota and Linux operate by rules that blend the self-organizing advantages of markets with the low transaction costs of hierarchies. In place of markets' cash and contracts and hierarchies' authority are rules about how individuals and groups work together (with rigorous discipline); how they communicate (widely and with granularity); and how leaders guide them toward a common goal (through example). Those rules, augmented by simple communication technologies and a lack of legal barriers to sharing information, create rich common knowledge, the ability to organize teams modularly, extraordinary motivation, and high levels of trust, which radically lowers transaction costs. Low transaction costs, in turn, make it profitable for organizations to perform more and smaller transactions--and so increase the pace and flexibility typical of high-performance organizations. Once the system achieves critical mass, it feeds on itself. The larger the system, the more broadly shared the knowledge, language, and work style. The greater individuals' reputational capital, the louder the applause and the stronger the motivation. The success of Linux is evidence of the power of that virtuous circle. Toyota's success is evidence that it is also powerful in conventional companies.

  17. Rule, Britannia

    DEFF Research Database (Denmark)

    Christensen, Jørgen Riber

    2011-01-01

Thomas Arne's The Masque of Alfred (1740), with a libretto by James Thomson and David Mallet, was written and performed in the historical context of George II's reign, where a kind of constitutional monarchy based on the Bill of Rights from 1689 was granting civil rights to the early bourgeoisie … of the Proms, and this article considers it as a global real-time media event. "Rule, Britannia!" is placed in the contexts of political history, cultural history and experience economy.

  18. Methodology of International Law1

    OpenAIRE

    Dominicé, Christian

    2014-01-01

    I. DEFINITION Methodology seeks to define the means of acquiring scientific knowledge. There is no generally accepted definition of the methodology of international law. In this article it will be taken to comprise both its wider meaning of the methods used in the acquisition of a scientific knowledge of the international legal system and its narrower and more specialized meaning, the methods used to determine the existence of norms or rules of international law. The correlation of these two ...

  19. 14 CFR 141.81 - Ground training.

    Science.gov (United States)

    2010-01-01

Title 14 Aeronautics and Space; Pilot Schools and Other Certificated Agencies; Operating Rules; § 141.81 Ground training. (a) Except as provided in paragraph (b) of this section, each instructor who is assigned to a ground training course…

  20. Nonzero Solubility Rule

    Institute of Scientific and Technical Information of China (English)

    尉志武; 周蕊; 刘芸

    2002-01-01

    A solubility-related rule, nonzero solubility rule, is introduced in this paper. It is complementary to the existing rules such as the "like dissolves like" rule and can be understood on the basis of classical chemical thermodynamics.

  1. Tourism Methodologies

    DEFF Research Database (Denmark)

This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different … in interview and field work situations, and how do we engage with the performative aspects of tourism as a field of study? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore. This is an issue dealt…

  2. Ruled Laguerre minimal surfaces

    KAUST Repository

    Skopenkov, Mikhail

    2011-10-30

A Laguerre minimal surface is an immersed surface in ℝ³ that is an extremal of the functional ∫(H²/K − 1) dA. In the present paper, we prove that the only ruled Laguerre minimal surfaces are, up to isometry, the surfaces r(φ, λ) = (Aφ, Bφ, Cφ + D cos 2φ) + λ(sin φ, cos φ, 0), where A, B, C, D ∈ ℝ are fixed. To achieve invariance under Laguerre transformations, we also derive all Laguerre minimal surfaces that are enveloped by a family of cones. The methodology is based on the isotropic model of Laguerre geometry. In this model a Laguerre minimal surface enveloped by a family of cones corresponds to the graph of a biharmonic function carrying a family of isotropic circles. We classify such functions by showing that the top view of the family of circles is a pencil. © 2011 Springer-Verlag.

  3. Getting grounded: using Glaserian grounded theory to conduct nursing research.

    Science.gov (United States)

    Hernandez, Cheri Ann

    2010-03-01

Glaserian grounded theory is a powerful research methodology for understanding client behaviour in a particular area. It is therefore especially relevant for nurse researchers. Nurse researchers use grounded theory more frequently than other qualitative analysis research methods because of its ability to provide insight into clients' experiences and to make a positive impact. However, there is much confusion about the use of grounded theory. The author delineates key components of grounded theory methodology, areas of concern, and the resulting implications for nursing knowledge development. Knowledge gained from Glaserian grounded theory research can be used to institute measures for enhancing client-nurse relationships, improving quality of care, and ultimately improving client quality of life. In addition, it can serve to expand disciplinary knowledge in nursing because the resulting substantive theory is a middle-range theory that can be subjected to later quantitative testing.

  4. Derivation of the inverse Schulze-Hardy rule.

    Science.gov (United States)

    Trefalt, Gregor

    2016-03-01

    The inverse Schulze-Hardy rule was recently proposed based on experimental observations. This rule describes an interesting situation of the aggregation of charged colloidal particles in the presence of the multivalent coions. Specifically, it can be shown that the critical coagulation concentration is inversely proportional to the coion valence. Here the derivation of the inverse Schulze-Hardy rule based on purely theoretical grounds is presented. This derivation complements the classical Schulze-Hardy rule, which describes the multivalent counterion systems.
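The contrast between the two rules reduces to a simple scaling, sketched here with hypothetical function names; the classical z⁻⁶ form is the standard high-charge limiting case for counterions, while the inverse rule gives z⁻¹ for coions as described above.

```python
def ccc_inverse_schulze_hardy(ccc_monovalent, z):
    """Inverse Schulze-Hardy rule for multivalent *coions*:
    the critical coagulation concentration is inversely
    proportional to the coion valence, CCC(z) = CCC(1) / z."""
    return ccc_monovalent / z

def ccc_classical_schulze_hardy(ccc_monovalent, z):
    """Classical Schulze-Hardy rule for multivalent *counterions*
    (highly charged surfaces): CCC(z) = CCC(1) / z**6."""
    return ccc_monovalent / z ** 6
```

The sketch makes the practical difference vivid: going from monovalent to trivalent ions lowers the CCC by a factor of 3 in the coion case but by a factor of 729 in the classical counterion case.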

  5. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  6. The rule of rescue.

    Science.gov (United States)

    McKie, John; Richardson, Jeff

    2003-06-01

    Jonsen coined the term "Rule of Rescue" (RR) to describe the imperative people feel to rescue identifiable individuals facing avoidable death. In this paper we attempt to draw a more detailed picture of the RR, identifying its conflict with cost-effectiveness analysis, the preference it entails for identifiable over statistical lives, the shock-horror response it elicits, the preference it entails for lifesaving over non-lifesaving measures, its extension to non-life-threatening conditions, and whether it is motivated by duty or sympathy. We also consider the measurement problems it raises, and argue that quantifying the RR would probably require a two-stage procedure. In the first stage the size of the individual utility gain from a health intervention would be assessed using a technique such as the Standard Gamble or the Time Trade-Off, and in the second the social benefits arising from the RR would be quantified employing the Person Trade-Off. We also consider the normative status of the RR. We argue that it can be defended from a utilitarian point of view, on the ground that rescues increase well-being by reinforcing people's belief that they live in a community that places great value upon life. However, utilitarianism has long been criticised for failing to take sufficient account of fairness, and the case is no different here: fairness requires that we do not discriminate between individuals on morally irrelevant grounds, whereas being "identifiable" does not seem to be a morally relevant ground for discrimination.

  7. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document provides a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objective of the project has been to provide a methodology, an implementing framework and a reporting system that countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  8. Application of the NEI 95-10 methodology in the RCIC system of Unit 1, to implement the criteria of the License Renewal Rule (10 CFR 54); Aplicacion de la metodologia del NEI 95-10 en el sistema RCIC de la U-1, para implementar los criterios de la regla de renovacion de licencia; (10 CFR54)

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, A.; Mendoza, G.; Arganis, C.; Viais, J.; Contreras, A. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Fernandez, G.; Medina, G., E-mail: angeles.diaz@inin.gob.mx [Comision Federal de Electricidad, Central Nucleoelectrica Laguna Verde, Carretera Cardel-Nautla Km 42.5, Alto Lucero, Veracruz (Mexico)

    2012-10-15

    In December of 1991, the US Nuclear Regulatory Commission (US NRC) published 10 CFR 54 to establish the procedures, criteria and requirements necessary for the license renewal of a nuclear power station. In 1994 the US NRC proposed an amendment to these requirements, centered essentially on the effects of aging in long-lived passive structures and components and on the inclusion of Time Limited Aging Analyses (TLAAs). In general terms, it is established that the applicant must demonstrate to the regulatory body that the effects of aging in structures, systems and components are and will be appropriately managed, or that the TLAAs have been evaluated for the extended period of operation. NEI 95-10 is a guide document developed by the Nuclear Energy Institute (NEI) to provide an accepted approach to meeting the requirements of 10 CFR 54, offering an efficient process that allows any applicant to meet the requirements of the License Renewal Rule in a practical way and to complete its application. This work presents the application of this guide to the Reactor Core Isolation Cooling (RCIC) system of Unit 1 of the Laguna Verde nuclear power plant, selected as the pilot system for applying 10 CFR 54 following the methodology recommended by the industry guide for the implementation of the License Renewal Rule. (Author)

  9. Multifractal methodology

    CERN Document Server

    Salat, Hadrien; Arcaute, Elsa

    2016-01-01

    Various methods have been developed independently to study the multifractality of measures in many different contexts. Although they all convey the same intuitive idea of giving a "dimension" to sets where a quantity scales similarly within a space, they are not necessarily equivalent on a more rigorous level. This review article aims at unifying the multifractal methodology by presenting the multifractal theoretical framework and principal practical methods, namely the moment method, the histogram method, multifractal detrended fluctuation analysis (MDFA) and modulus maxima wavelet transform (MMWT), with a comparative and interpretative eye.
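As a minimal illustration of the moment method mentioned above, the sketch below estimates the mass exponent τ(q) of a binomial multiplicative measure from the scaling of the partition function Z(q, ε) = Σᵢ μᵢ(ε)^q across dyadic box sizes. The cascade parameters and depth are invented for illustration; for this particular measure the estimate can be checked against the exact value τ(q) = −log₂(p^q + (1 − p)^q).

```python
import math

def binomial_cascade(depth, p=0.7):
    """Binomial multiplicative measure on 2**depth dyadic intervals."""
    masses = [1.0]
    for _ in range(depth):
        masses = [m * w for m in masses for w in (p, 1.0 - p)]
    return masses

def tau_estimate(masses, q):
    """Estimate tau(q) from the scaling Z(q, eps) ~ eps**tau(q),
    using the least-squares slope of log Z against log eps over
    successive dyadic coarse-grainings."""
    depth = int(math.log2(len(masses)))
    xs, ys = [], []
    level = masses
    for k in range(depth, 0, -1):
        eps = 2.0 ** (-k)
        z = sum(m ** q for m in level if m > 0)
        xs.append(math.log(eps))
        ys.append(math.log(z))
        # coarse-grain: merge neighbouring boxes pairwise
        level = [level[i] + level[i + 1] for i in range(0, len(level), 2)]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    return (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
            / sum((x - xbar) ** 2 for x in xs))

p = 0.7
masses = binomial_cascade(12, p)
for q in (0.0, 2.0):
    exact = -math.log2(p ** q + (1 - p) ** q)
    print(q, round(tau_estimate(masses, q), 3), round(exact, 3))
```

For this exactly self-similar measure the regression points are collinear, so the estimated and exact exponents agree to floating-point precision; for empirical data the fit quality itself becomes diagnostic.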

  10. Ground Wars

    DEFF Research Database (Denmark)

    Nielsen, Rasmus Kleis

    Political campaigns today are won or lost in the so-called ground war--the strategic deployment of teams of staffers, volunteers, and paid part-timers who work the phones and canvass block by block, house by house, voter by voter. Ground Wars provides an in-depth ethnographic portrait of two...... infrastructures that utilize large databases with detailed individual-level information for targeting voters, and armies of dedicated volunteers and paid part-timers. Nielsen challenges the notion that political communication in America must be tightly scripted, controlled, and conducted by a select coterie...... of professionals. Yet he also quashes the romantic idea that canvassing is a purer form of grassroots politics. In today's political ground wars, Nielsen demonstrates, even the most ordinary-seeming volunteer knocking at your door is backed up by high-tech targeting technologies and party expertise. Ground Wars...

  11. New rules revamp national security prosecutions

    Science.gov (United States)

    Gwynne, Peter

    2016-06-01

    Following the collapse of recent prosecutions of Chinese-American scientists on national security grounds, the US Department of Justice (DOJ) has issued new rules on such lawsuits that will require top officials in Washington to review and supervise all cases that implicitly involve spying - rather than leaving decisions to local prosecutors.

  12. Research Methodology

    CERN Document Server

    Rajasekar, S; Philomination, P

    2006-01-01

    In this manuscript various components of research are listed and briefly discussed. The topics considered in this write-up cover a part of the research methodology paper of Master of Philosophy (M.Phil.) course and Doctor of Philosophy (Ph.D.) course. The manuscript is intended for students and research scholars of science subjects such as mathematics, physics, chemistry, statistics, biology and computer science. Various stages of research are discussed in detail. Special care has been taken to motivate the young researchers to take up challenging problems. Ten assignment works are given. For the benefit of young researchers a short interview with three eminent scientists is included at the end of the manuscript.

  13. Methodological advances

    Directory of Open Access Journals (Sweden)

    Lebreton, J.-D.

    2004-06-01

    Full Text Available The study of population dynamics has long depended on methodological progress. Among many striking examples, continuous time models for populations structured in age (Sharpe & Lotka, 1911) were made possible by progress in the mathematics of integral equations. Therefore the relationship between population ecology and mathematical and statistical modelling in the broad sense raises a challenge in interdisciplinary research. After the impetus given in particular by Seber (1982), the regular biennial EURING conferences became a major vehicle to achieve this goal. It is thus not surprising that EURING 2003 included a session entitled “Methodological advances”. Even if at risk of heterogeneity in the topics covered and of overlap with other sessions, such a session was a logical way of ensuring that recent and exciting new developments were made available for discussion, further development by biometricians and use by population biologists. The topics covered included several to which full sessions were devoted at EURING 2000 (Anderson, 2001) such as: individual covariates, Bayesian methods, and multi-state models. Some other topics (heterogeneity models, exploited populations and integrated modelling) had been addressed by contributed talks or posters. Their presence among “methodological advances”, as well as in other sessions of EURING 2003, was intended as a response to their rapid development and potential relevance to biological questions. We briefly review all talks here, including those not published in the proceedings. In the plenary talk, Pradel et al. (in prep.) developed GOF tests for multi-state models. Until recently, the only goodness-of-fit procedures for multistate models were ad hoc, and non optimal, involving use of standard tests for single state models (Lebreton & Pradel, 2002). Pradel et al. (2003) proposed a general approach based in particular on mixtures of multinomial distributions. Pradel et al. (in prep.) showed

  14. Optimal short-sighted rules

    Directory of Open Access Journals (Sweden)

    Sacha eBourgeois-Gironde

    2012-09-01

    Full Text Available The aim of this paper is to assess the relevance of methodological transfers from behavioral ecology to experimental economics with respect to the elicitation of intertemporal preferences. More precisely, our discussion will stem from the analysis of Stephens and Anderson's (2001) seminal article. In their study with blue jays they document that foraging behavior typically implements short-sighted choice rules which are beneficial in the long run. Such long-term profitability of short-sighted behavior cannot be evidenced when using a self-control paradigm (one which contrasts in a binary way sooner smaller and later larger payoffs) but becomes apparent when ecological patch paradigms (replicating economic situations in which the main trade-off consists in staying on a food patch or leaving for another patch) are implemented. We transfer this methodology to contrast foraging strategies and self-control in human intertemporal choices.

  15. Conceptual Significance of the Development of Regional Rule of Law: A Preliminary Analysis Using the Methodology of Philosophy of Law

    Institute of Scientific and Technical Information of China (English)

    公丕祥

    2014-01-01

    The advancement of regional rule of law is high on the agenda for the modernization of the Chinese legal system. Basically, the notion of region refers to the following two cases: the one across different countries and the one within a country. The region in the second sense is defined as a unit based on the internal administrative hierarchy of a sovereign state or as a cluster of such units connected by close geo-relations. The region discussed in this paper is of the second type. As an integral part of the development of rule of law of a state, regional rule of law is the specific realization of rule of law in a specific area within its borders. The methodology of regional rule of law development research is an organic hierarchical system. Since philosophy of law makes up a key part of this system, we focus on this perspective in our paper. The concept of “unity in diversity” advanced by Karl Marx can help us obtain this kind of understanding: the development of regional rule of law is a “natural historical process”, which is characterized by its internal unity and diversity as well as by a dimension of “unity in diversities”. Methodological individuation is a principle formed in the philosophical development from Hegel to the 19th century Germanic secular historicism and given full play by Max Weber. Adopting the method of individuation on the basis of a critical adaptation will help us discover the secrets of the development of regional rule of law. It should be noted that when tackling the development of regional rule of law from the perspective of philosophy of law, we have to carefully deal with the relation between the collective and the individual, reveal and abstract the essential relation between individual actions, and try to discover the cause and effect of the individual actions.

  16. Bonnet Ruled Surfaces

    Institute of Scientific and Technical Information of China (English)

    Filiz KANBAY

    2005-01-01

    We consider the Bonnet ruled surfaces which admit only one non-trivial isometry that preserves the principal curvatures. We determine the Bonnet ruled surfaces whose generators and orthogonal trajectories form a special net called an A-net.

  17. Cosmological diagrammatic rules

    CERN Document Server

    Giddings, Steven B

    2010-01-01

    A simple set of diagrammatic rules is formulated for perturbative evaluation of ``in-in" correlators, as is needed in cosmology and other nonequilibrium problems. These rules are both intuitive, and efficient for calculational purposes.

  18. Cosmological diagrammatic rules

    Energy Technology Data Exchange (ETDEWEB)

    Giddings, Steven B. [Department of Physics, University of California, Santa Barbara, CA 93106 (United States); Sloth, Martin S., E-mail: giddings@physics.ucsb.edu, E-mail: sloth@cern.ch [CERN, Physics Department, Theory Unit, CH-1211 Geneva 23 (Switzerland)

    2010-07-01

    A simple set of diagrammatic rules is formulated for perturbative evaluation of "in-in" correlators, as is needed in cosmology and other nonequilibrium problems. These rules are both intuitive, and efficient for calculational purposes.

  19. Phonological reduplication in sign language: rules rule

    Directory of Open Access Journals (Sweden)

    Iris eBerent

    2014-06-01

    Full Text Available Productivity, the hallmark of linguistic competence, is typically attributed to algebraic rules that support broad generalizations. Past research on spoken language has documented such generalizations in both adults and infants. But whether algebraic rules form part of the linguistic competence of signers remains unknown. To address this question, here we gauge the generalization afforded by American Sign Language (ASL). As a case study, we examine reduplication (X→XX), a rule that, inter alia, generates ASL nouns from verbs. If signers encode this rule, then they should freely extend it to novel syllables, including ones with features that are unattested in ASL. And since reduplicated disyllables are preferred in ASL, such a rule should favor novel reduplicated signs. Novel reduplicated signs should thus be preferred to nonreduplicative controls (in rating), and consequently, such stimuli should also be harder to classify as nonsigns (in the lexical decision task). The results of four experiments support this prediction. These findings suggest that the phonological knowledge of signers includes powerful algebraic rules. The convergence between these conclusions and previous evidence for phonological rules in spoken language suggests that the architecture of the phonological mind is partly amodal.

  20. Parton model sum rules

    CERN Document Server

    Hinchliffe, Ian; Hinchliffe, Ian; Kwiatkowski, Axel

    1996-01-01

    This review article discusses the experimental and theoretical status of various Parton Model sum rules. The basis of the sum rules in perturbative QCD is discussed. Their use in extracting the value of the strong coupling constant is evaluated and the failure of the naive version of some of these rules is assessed.

  1. Modifying Intramural Rules.

    Science.gov (United States)

    Rokosz, Francis M.

    1981-01-01

    Standard sports rules can be altered to improve the game for intramural participants. These changes may improve players' attitudes, simplify rules for officials, and add safety features to a game. Specific rule modifications are given for volleyball, football, softball, floor hockey, basketball, and soccer. (JN)

  2. Portable design rules for bulk CMOS

    Science.gov (United States)

    Griswold, T. W.

    1982-01-01

    It is pointed out that for the past several years, one school of IC designers has used a simplified set of nMOS geometric design rules (GDR) which is 'portable', in that it can be used by many different nMOS manufacturers. The present investigation is concerned with a preliminary set of design rules for bulk CMOS which has been verified for simple test structures. The GDR are defined in terms of Caltech Intermediate Form (CIF), which is a geometry-description language that defines simple geometrical objects in layers. The layers are abstractions of physical mask layers. The design rules do not presume the existence of any particular design methodology. Attention is given to p-well and n-well CMOS processes, bulk CMOS and CMOS-SOS, CMOS geometric rules, and a description of the advantages of CMOS technology.
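Geometric design rules of this kind lend themselves to simple mechanical checking. The sketch below is a toy design-rule checker over axis-aligned rectangles; the λ-based rule values and the single "metal" layer are invented for illustration and do not come from any real process rule deck.

```python
# Toy lambda-based design-rule check (DRC): verify minimum feature
# width and minimum same-layer spacing for axis-aligned rectangles.
# Rule values below are illustrative only.
LAMBDA = 1.0
RULES = {"metal": {"min_width": 3 * LAMBDA, "min_space": 3 * LAMBDA}}

def width(rect):
    x0, y0, x1, y1 = rect
    return min(x1 - x0, y1 - y0)

def spacing(a, b):
    """Edge-to-edge distance between two rectangles (0 if touching/overlapping)."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    dx = max(bx0 - ax1, ax0 - bx1, 0.0)
    dy = max(by0 - ay1, ay0 - by1, 0.0)
    return (dx * dx + dy * dy) ** 0.5

def check(layer, rects):
    rules = RULES[layer]
    errors = []
    for i, r in enumerate(rects):
        if width(r) < rules["min_width"]:
            errors.append(("width", i))
        for j in range(i + 1, len(rects)):
            s = spacing(r, rects[j])
            if 0 < s < rules["min_space"]:
                errors.append(("space", i, j))
    return errors

rects = [(0, 0, 10, 3), (0, 5, 10, 8), (20, 0, 22, 3)]
print(check("metal", rects))  # flags the 2-unit gap and the 2-unit-wide rectangle
```

Real checkers of course handle polygons, layer interactions (e.g. well-to-diffusion clearances) and connectivity, but the rule-table-driven structure is the same.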

  3. Binary effectivity rules

    DEFF Research Database (Denmark)

    Keiding, Hans; Peleg, Bezalel

    2006-01-01

    is binary if it is rationalized by an acyclic binary relation. The foregoing result motivates our definition of a binary effectivity rule as the effectivity rule of some binary SCR. A binary SCR is regular if it satisfies unanimity, monotonicity, and independence of infeasible alternatives. A binary...... effectivity rule is regular if it is the effectivity rule of some regular binary SCR. We characterize completely the family of regular binary effectivity rules. Quite surprisingly, intrinsically defined von Neumann-Morgenstern solutions play an important role in this characterization...

  4. New Safety rules

    CERN Multimedia

    Safety Commission

    2008-01-01

    The revision of CERN Safety rules is in progress and the following new Safety rules have been issued on 15-04-2008:

    - Safety Procedure SP-R1 Establishing, Updating and Publishing CERN Safety rules: http://cern.ch/safety-rules/SP-R1.htm
    - Safety Regulation SR-S Smoking at CERN: http://cern.ch/safety-rules/SR-S.htm
    - Safety Regulation SR-M Mechanical Equipment: http://cern.ch/safety-rules/SR-M.htm
    - General Safety Instruction GSI-M1 Standard Lifting Equipment: http://cern.ch/safety-rules/GSI-M1.htm
    - General Safety Instruction GSI-M2 Standard Pressure Equipment: http://cern.ch/safety-rules/GSI-M2.htm
    - General Safety Instruction GSI-M3 Special Mechanical Equipment: http://cern.ch/safety-rules/GSI-M3.htm

    These documents apply to all persons under the Director General’s authority. All Safety rules are available at the web page: http://www.cern.ch/safety-rules The Safety Commission

  5. Action Rules Mining

    CERN Document Server

    Dardzinska, Agnieszka

    2013-01-01

    We are surrounded by data, numerical, categorical and otherwise, which must be analyzed and processed to convert it into information that instructs, answers or aids understanding and decision making. Data analysts in many disciplines such as business, education or medicine are frequently asked to analyze new data sets which are often composed of numerous tables possessing different properties. They try to find completely new correlations between attributes and show new possibilities for users.   Action rules mining discusses data mining and knowledge discovery principles and then describes representative concepts, methods and algorithms connected with action rules. The author introduces the formal definition of an action rule, the notions of a simple association action rule and a representative action rule, and the cost of an association action rule, and gives a strategy for constructing simple association action rules of lowest cost. A new approach for generating action rules from datasets with numerical attributes...

  6. Scalar Glueballs: A Gaussian Sum-Rules Analysis

    CERN Document Server

    Harnett, D

    2002-01-01

    Although marginally more complicated than the traditional Laplace sum-rules, Gaussian sum-rules have the advantage of being able to probe excited and ground hadronic states with similar sensitivity. Gaussian sum-rule analysis techniques are applied to the problematic scalar glueball channel to determine masses, widths, and relative resonance strengths of low-lying scalar glueball states contributing to the hadronic spectral function. An important feature of our analysis is the inclusion of instanton contributions to the scalar gluonic correlation function. Compared with the next-to-leading Gaussian sum-rule, the analysis of the lowest weighted sum-rule (which contains a large scale independent contribution from the low energy theorem) is shown to be unreliable because of instability under QCD uncertainties. However, the presence of instanton effects leads to approximately consistent mass scales in the lowest weighted and next-lowest weighted sum-rules. The analysis of the next-to-leading sum-rule demonstra...

  7. The Development of Constructivist Grounded Theory

    Directory of Open Access Journals (Sweden)

    Jane Mills

    2006-03-01

    Full Text Available Constructivist grounded theory is a popular method for research studies primarily in the disciplines of psychology, education, and nursing. In this article, the authors aim to locate the roots of constructivist grounded theory and then trace its development. They examine key grounded theory texts to discern their ontological and epistemological orientation. They find Strauss and Corbin's texts on grounded theory to possess a discernable thread of constructivism in their approach to inquiry. They also discuss Charmaz's landmark work on constructivist grounded theory relative to her positioning of the researcher in relation to the participants, analysis of the data, and rendering of participants' experiences into grounded theory. Grounded theory can be seen as a methodological spiral that begins with Glaser and Strauss' original text and continues today. The variety of epistemological positions that grounded theorists adopt are located at various points on this spiral and are reflective of their underlying ontologies.

  8. A Methodology for Generating Placement Rules that Utilizes Logistic Regression

    Science.gov (United States)

    Wurtz, Keith

    2008-01-01

    The purpose of this article is to provide the necessary tools for institutional researchers to conduct a logistic regression analysis and interpret the results. Aspects of the logistic regression procedure that are necessary to evaluate models are presented and discussed with an emphasis on cutoff values and choosing the appropriate number of…
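The core computation behind such a placement methodology can be sketched in a few lines. Below, a plain gradient-descent logistic regression (pure Python, no external packages) is fit to invented score/outcome data, and a cutoff score is derived as the point where the predicted probability of success reaches 0.5. All data, names, and the 0.5 threshold are hypothetical, not the article's actual procedure.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit(xs, ys, lr=0.1, epochs=5000):
    """Gradient-descent fit of P(y=1) = sigmoid(b0 + b1*x),
    standardizing x internally for stable convergence."""
    n = len(xs)
    mu = sum(xs) / n
    sd = (sum((x - mu) ** 2 for x in xs) / n) ** 0.5
    zs = [(x - mu) / sd for x in xs]
    a0 = a1 = 0.0
    for _ in range(epochs):
        g0 = g1 = 0.0
        for z, y in zip(zs, ys):
            err = sigmoid(a0 + a1 * z) - y
            g0 += err
            g1 += err * z
        a0 -= lr * g0 / n
        a1 -= lr * g1 / n
    return a0 - a1 * mu / sd, a1 / sd   # coefficients on the raw-score scale

# invented placement data: test score vs. later success (1) / failure (0)
scores  = [35, 40, 45, 50, 55, 60, 65, 70, 75, 80]
success = [ 0,  0,  0,  0,  1,  0,  1,  1,  1,  1]

b0, b1 = fit(scores, success)
cutoff = -b0 / b1   # score at which the predicted probability is 0.5
print(round(cutoff, 1))
```

In practice one would use a vetted implementation (e.g. a statistics package) and evaluate candidate cutoffs against classification accuracy or institutional policy rather than fixing 0.5 a priori.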

  9. Measuring interesting rules in Characteristic rule

    CERN Document Server

    Warnars, Spits

    2010-01-01

    In the sixth strategy step of attribute-oriented induction, which controls thresholds on generalized relations, candidate attributes can be selected for further generalization and identical tuples merged until the number of tuples is no greater than the threshold value, as implemented in the basic attribute-oriented induction algorithm. At this step, however, the number of tuples in the final generalized relation may still exceed the threshold. To obtain a final generalization with only a small number of tuples, one that can easily be transformed into a simple logical formula, the seventh strategy step, rule transformation, simplifies the result by unioning or grouping identical attributes. Our approach to measuring interesting rules is the opposite of the heuristic measurement approach of Fudger and Hamilton, in which the more complex the concept hierarchies, the more interesting the results are likely to be; in our approach, the simple...
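The threshold-control behaviour described in the abstract can be sketched compactly. In the toy example below, the concept hierarchy, attribute order, and threshold are all invented; tuples are generalized attribute by attribute and identical tuples merged (keeping a count) until the relation fits under the threshold:

```python
# Toy sketch of attribute-oriented induction's threshold-control step.
HIERARCHY = {
    "birthplace": {"Vancouver": "Canada", "Toronto": "Canada",
                   "Seattle": "USA", "Boston": "USA"},
    "major": {"physics": "science", "chemistry": "science",
              "history": "arts", "music": "arts"},
}

def merge(tuples):
    """Merge identical tuples, accumulating their counts."""
    counts = {}
    for t, c in tuples:
        counts[t] = counts.get(t, 0) + c
    return list(counts.items())

def generalize(tuples, threshold):
    tuples = merge(tuples)
    attrs = ["birthplace", "major"]
    for i, attr in enumerate(attrs):
        if len(tuples) <= threshold:
            break
        # lift attribute i one level up the concept hierarchy
        lifted = [(tuple(HIERARCHY[attr].get(v, v) if j == i else v
                         for j, v in enumerate(t)), c)
                  for t, c in tuples]
        tuples = merge(lifted)
    return tuples

data = [(("Vancouver", "physics"), 1), (("Toronto", "chemistry"), 1),
        (("Seattle", "history"), 1), (("Boston", "music"), 1),
        (("Toronto", "physics"), 1)]
print(generalize(data, threshold=3))
```

With a threshold of 3, the five input tuples generalize first on birthplace (still four tuples) and then on major, leaving two counted tuples ready for rule transformation.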

  10. Symmetrization Selection Rules, 2

    CERN Document Server

    Page, P R

    1996-01-01

    We introduce strong interaction selection rules for the two-body decay and production of hybrid and conventional mesons coupling to two S-wave hybrid or conventional mesons. The rules arise from symmetrization in states in the limit of non-relativistically moving quarks. The conditions under which hybrid coupling to S-wave states is suppressed are determined by the rules, and the nature of their breaking is indicated.

  11. Rule Generation Based On Dominance Matrices and Functions

    Institute of Scientific and Technical Information of China (English)

    安利平; 陈增强; 袁著祉; 仝凌云

    2004-01-01

    Rough set theory has proved to be a useful tool for rule induction. But the theory based on the indiscernibility relation or similarity relation cannot induce rules from decision tables with criteria. Greco et al. have proposed a new rough set approach based on the dominance relation to handle such problems. In this paper, the concept of the dominance matrix is put forward and the dominance function is constructed to compute minimal decision rules that are more general and applicable than the ones induced by the classical rough set theory. In addition, a methodology of simplification is presented to eliminate redundancy in the rule set.
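The abstract does not spell out the construction, but the weak dominance relation underlying such matrices can be sketched for gain-type criteria: object a dominates object b when a is at least as good on every criterion. The objects and criterion values below are invented.

```python
# Illustrative dominance matrix: D[a][b] is True when object a is at
# least as good as object b on every (gain-type) criterion.
objects = {          # criterion values: (math, literature)
    "s1": (8, 7),
    "s2": (6, 5),
    "s3": (9, 6),
}

names = sorted(objects)

def dominates(a, b):
    return all(x >= y for x, y in zip(objects[a], objects[b]))

D = {a: {b: dominates(a, b) for b in names} for a in names}

for a in names:
    print(a, "dominates", [b for b in names if D[a][b]])
```

Here s1 and s3 are incomparable (each beats the other on one criterion), which is exactly the situation indiscernibility-based rough sets cannot express and the dominance-based approach is built to handle.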

  12. Linguistic Valued Association Rules

    Institute of Scientific and Technical Information of China (English)

    LU Jian-jiang; QIAN Zuo-ping

    2002-01-01

    Association rule discovery and prediction with data mining methods are two topics in the field of information processing. In this paper, the records in a database are divided into many linguistic values expressed as normal fuzzy numbers by the fuzzy c-means algorithm, and a series of linguistic valued association rules are generated. Then the records in the database are mapped onto the linguistic values according to the largest membership principle, and the support and confidence definitions of linguistic valued association rules are also provided. Finally, the discovery and prediction methods for linguistic valued association rules are discussed through a weather example.
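A minimal sketch of the mapping-and-scoring pipeline: numeric records are assigned the linguistic value of the nearest cluster centre (a crude stand-in for fuzzy c-means followed by assignment to the value of largest membership), and a candidate linguistic rule is scored by support and confidence. The centres, labels, and weather-style data are invented.

```python
# Hypothetical linguistic-value mapping and rule scoring.
temp_centres = {"cool": 15.0, "warm": 25.0, "hot": 33.0}
hum_centres  = {"low": 30.0, "high": 70.0}

def label(value, centres):
    """Assign the linguistic value whose centre is nearest."""
    return min(centres, key=lambda k: abs(value - centres[k]))

records = [(34, 28), (31, 35), (16, 72), (26, 68), (35, 40), (14, 75)]
linguistic = [(label(t, temp_centres), label(h, hum_centres))
              for t, h in records]

# candidate rule: temperature = hot  ->  humidity = low
antecedent = [r for r in linguistic if r[0] == "hot"]
both = [r for r in antecedent if r[1] == "low"]
support = len(both) / len(records)
confidence = len(both) / len(antecedent)
print(linguistic)
print(support, confidence)
```

A full implementation would compute graded memberships from the fuzzy numbers rather than a hard nearest-centre assignment, but support and confidence are defined over the mapped records in the same way.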

  13. Methodological problems in Rorschach research

    Directory of Open Access Journals (Sweden)

    Đurić-Jočić Dragana

    2007-01-01

    Full Text Available The Comprehensive System of Rorschach interpretation is considered a nomothetic system that makes it possible to use the projective method in research projects. However, research use of the Rorschach method requires, besides appropriate knowledge of administration procedures and interpretation rules, knowledge of specific methodological issues. Rorschach indicators are not independent, as they form part of a specific network, so in some research it is necessary to control basic variables in order not to obtain artifacts. This applies especially to research where groups are compared, as well as to normative studies where Rorschach indicators are compared cross-culturally.

  14. 'Grounded' Politics

    DEFF Research Database (Denmark)

    Schmidt, Garbi

    2012-01-01

    play within one particular neighbourhood: Nørrebro in the Danish capital, Copenhagen. The article introduces the concept of grounded politics to analyse how groups of Muslim immigrants in Nørrebro use the space, relationships and history of the neighbourhood for identity political statements....... The article further describes how national political debates over the Muslim presence in Denmark affect identity political manifestations within Nørrebro. By using Duncan Bell’s concept of mythscape (Bell, 2003), the article shows how some political actors idealize Nørrebro’s past to contest the present...

  15. On the ground state of metallic hydrogen

    Science.gov (United States)

    Chakravarty, S.; Ashcroft, N. W.

    1978-01-01

    A proposed liquid ground state of metallic hydrogen at zero temperature is explored and a variational upper bound to the ground state energy is calculated. The possibility that the metallic hydrogen is a liquid around the metastable point (rs = 1.64) cannot be ruled out. This conclusion crucially hinges on the contribution to the energy arising from the third order in the electron-proton interaction which is shown here to be more significant in the liquid phase than in crystals.

  16. Stable canonical rules

    NARCIS (Netherlands)

    Iemhoff, R.; Bezhanishvili, N.; Bezhanishvili, Guram

    2016-01-01

    We introduce stable canonical rules and prove that each normal modal multi-conclusion consequence relation is axiomatizable by stable canonical rules. We apply these results to construct finite refutation patterns for modal formulas, and prove that each normal modal logic is axiomatizable by stable

  17. Branes and wrapping rules

    CERN Document Server

    Bergshoeff, Eric A

    2011-01-01

    We show that the branes of ten-dimensional IIA/IIB string theory must satisfy, upon toroidal compactification, specific wrapping rules in order to reproduce the number of supersymmetric branes that follows from a supergravity analysis. The realization of these wrapping rules suggests that IIA/IIB string theory contains a whole class of generalized Kaluza-Klein monopoles.

  18. Branes and Wrapping Rules

    NARCIS (Netherlands)

    Bergshoeff, E.; Riccioni, F.

    2012-01-01

    We show that the branes of ten-dimensional IIA/IIB string theory must satisfy, upon toroidal compactification, specific wrapping rules in order to reproduce the number of supersymmetric branes that follows from a supergravity analysis. The realization of these wrapping rules suggests that IIA/IIB stri

  19. Scoped Dynamic Rewrite Rules

    NARCIS (Netherlands)

    Visser, Eelco

    2002-01-01

    The applicability of term rewriting to program transformation is limited by the lack of control over rule application and by the context-free nature of rewrite rules. The first problem is addressed by languages supporting user-definable rewriting strategies. This paper addresses the second problem b

  20. The Validity of Divergent Grounded Theory Method

    Directory of Open Access Journals (Sweden)

    Martin Nils Amsteus PhD

    2014-02-01

    Full Text Available The purpose of this article is to assess whether divergence of grounded theory method may be considered valid. A review of literature provides a basis for understanding and evaluating grounded theory. The principles and nature of grounded theory are synthesized along with theoretical and practical implications. It is deduced that for a theory to be truly grounded in empirical data, the method resulting in the theory should be the equivalent of pure induction. Therefore, detailed, specified, stepwise a priori procedures may be seen as unbidden or arbitrary. It is concluded that divergent grounded theory can be considered valid. The author argues that securing methodological transparency through the description of the actual principles and procedures employed, as well as tailoring them to the particular circumstances, is more important than adhering to predetermined stepwise procedures. A theoretical foundation is provided from which diverse theoretical developments and methodological procedures may be developed, judged, and refined based on their own merits.

  1. Advances in QCD sum rule calculations

    CERN Document Server

    Melikhov, Dmitri

    2016-01-01

    We review the recent progress in the applications of QCD sum rules to hadron properties with the emphasis on the following selected problems: (i) development of new algorithms for the extraction of ground-state parameters from two-point correlators; (ii) form factors at large momentum transfers from three-point vacuum correlation functions; (iii) properties of exotic tetraquark hadrons from correlation functions of four-quark currents.

  2. Advances in QCD sum-rule calculations

    Energy Technology Data Exchange (ETDEWEB)

    Melikhov, Dmitri [Institute for High Energy Physics, Austrian Academy of Sciences, Nikolsdorfergasse 18, A-1050 Vienna, Austria; D. V. Skobeltsyn Institute of Nuclear Physics, M. V. Lomonosov Moscow State University, Moscow (Russian Federation)]

    2016-01-22

We review the recent progress in the applications of QCD sum rules to hadron properties with the emphasis on the following selected problems: (i) development of new algorithms for the extraction of ground-state parameters from two-point correlators; (ii) form factors at large momentum transfers from three-point vacuum correlation functions; (iii) properties of exotic tetraquark hadrons from correlation functions of four-quark currents.

  3. Grounded theory, feminist theory, critical theory: toward theoretical triangulation.

    Science.gov (United States)

    Kushner, Kaysi Eastlick; Morrow, Raymond

    2003-01-01

    Nursing and social science scholars have examined the compatibility between feminist and grounded theory traditions in scientific knowledge generation, concluding that they are complementary, yet not without certain tensions. This line of inquiry is extended to propose a critical feminist grounded theory methodology. The construction of symbolic interactionist, feminist, and critical feminist variants of grounded theory methodology is examined in terms of the presuppositions of each tradition and their interplay as a process of theoretical triangulation.

  4. Measurement of ground motion in various sites

    Energy Technology Data Exchange (ETDEWEB)

    Bialowons, W.; Amirikas, R.; Bertolini, A.; Kruecker, D.

    2007-04-15

Ground vibrations may affect low emittance beam transport in linear colliders, Free Electron Lasers (FEL) and synchrotron radiation facilities. This paper is an overview of a study program to measure ground vibrations in various sites which can be used for site characterization in relation to accelerator design. Commercial broadband seismometers have been used to measure ground vibrations, and the resultant database is available to the scientific community. The methodology employed is to use the same equipment and data analysis tools for ease of comparison. This database of ground vibrations, taken at 19 sites around the world, is the first of its kind. (orig.)

  5. Ground rules of the pluripotency gene regulatory network.

    KAUST Repository

    Li, Mo

    2017-01-03

    Pluripotency is a state that exists transiently in the early embryo and, remarkably, can be recapitulated in vitro by deriving embryonic stem cells or by reprogramming somatic cells to become induced pluripotent stem cells. The state of pluripotency, which is stabilized by an interconnected network of pluripotency-associated genes, integrates external signals and exerts control over the decision between self-renewal and differentiation at the transcriptional, post-transcriptional and epigenetic levels. Recent evidence of alternative pluripotency states indicates the regulatory flexibility of this network. Insights into the underlying principles of the pluripotency network may provide unprecedented opportunities for studying development and for regenerative medicine.

  6. Rules on Paper, Rules in Practice

    OpenAIRE

    Al-Dahdah, Edouard; Corduneanu-Huci, Cristina; Raballand, Gael; Sergenti, Ernest; Ababsa, Myriam

    2016-01-01

    The primary focus of this book is on a specific outcome of the rule of law: the practical enforcement of laws and policies, and the determinants of this enforcement, or lack thereof. Are there significant and persistent differences in implementation across countries? Why are some laws and policies more systematically enforced than others? Are “good” laws likely to be enacted, and if not, what stands in the way? We answer these questions using a theoretical framework and detailed empirical...

  7. Do Fiscal Rules Matter?

    DEFF Research Database (Denmark)

    Grembi, Veronica; Nannicini, Tommaso; Troiano, Ugo

    2016-01-01

Fiscal rules are laws aimed at reducing the incentive to accumulate debt, and many countries adopt them to discipline local governments. Yet, their effectiveness is disputed because of commitment and enforcement problems. We study their impact applying a quasi-experimental design in Italy. In 1999, the central government imposed fiscal rules on municipal governments, and in 2001 relaxed them for municipalities below 5,000 inhabitants. We exploit the before/after and discontinuous policy variation, and show that relaxing fiscal rules increases deficits and lowers taxes. The effect is larger if the mayor can be reelected...

  8. Grounding & human health - a review

    Science.gov (United States)

    Jamieson, I. A.; Jamieson, S. S.; ApSimon, H. M.; Bell, J. N. B.

    2011-06-01

    Whilst grounding is often undertaken in industry as a matter of good practice in situations where the risk of excess charge exists, little thought is usually given to the biological effects that such measures may have, or possible benefits that may arise from the more widespread application of electrostatic and other 'electromagnetic hygiene' measures in hospitals and the general built environment. Research, which is still in its infancy, indicates that grounding the human body using suitable methodologies, particularly in low electromagnetic field environments, can significantly enhance biological functioning. It is proposed that there are often a number of electrostatic and 'electromagnetic hygiene' factors that need to be addressed before the beneficial effects of grounding the human body can be fully realised in many everyday environments.

  9. Dodgson's Rule Approximations and Absurdity

    CERN Document Server

    McCabe-Dansted, John C

    2010-01-01

    With the Dodgson rule, cloning the electorate can change the winner, which Young (1977) considers an "absurdity". Removing this absurdity results in a new rule (Fishburn, 1977) for which we can compute the winner in polynomial time (Rothe et al., 2003), unlike the traditional Dodgson rule. We call this rule DC and introduce two new related rules (DR and D&). Dodgson did not explicitly propose the "Dodgson rule" (Tideman, 1987); we argue that DC and DR are better realizations of the principle behind the Dodgson rule than the traditional Dodgson rule. These rules, especially D&, are also effective approximations to the traditional Dodgson's rule. We show that, unlike the rules we have considered previously, the DC, DR and D& scores differ from the Dodgson score by no more than a fixed amount given a fixed number of alternatives, and thus these new rules converge to Dodgson under any reasonable assumption on voter behaviour, including the Impartial Anonymous Culture assumption.

  10. General Quantization Rule

    CERN Document Server

    Maiz, F

    2012-01-01

A general quantization rule for bound states of the Schrödinger equation is presented. As in the fundamental theory of integration, our idea is mainly based on dividing the potential into many pieces, solving the Schrödinger equation, and deriving the general quantization rule. For both exactly and non-exactly solvable systems, the energy levels of all the bound states can be easily calculated from the general quantization rule. Using this new general quantization rule, we re-calculate the energy levels for one-dimensional systems with an infinite square well, with the harmonic oscillator potential, with the Morse potential, with the symmetric and asymmetric Rosen-Morse potentials, with the first Pöschl-Teller potential, with the Coulomb potential, with the V-shape potential, and with the ax^4 potential, and for three-dimensional systems with the harmonic oscillator potential, with the ordinary Coulomb potential, and for the hydrogen atom.
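The abstract's claim for the infinite square well can be cross-checked numerically. The sketch below is not the paper's quantization rule; it simply discretizes the Schrödinger operator for the well (in units ħ = m = L = 1) with a finite-difference Hamiltonian and compares the lowest eigenvalues with the analytic spectrum E_n = (nπ)²/2. The function name and grid size are illustrative choices.

```python
import numpy as np

def square_well_levels(n_levels=3, n_grid=500):
    """Lowest bound-state energies of the 1-D infinite square well
    (hbar = m = 1, width L = 1) from a finite-difference Hamiltonian.

    The analytic spectrum is E_n = (n*pi)**2 / 2, so this serves as a
    numerical cross-check for any quantization rule on this potential.
    """
    h = 1.0 / (n_grid + 1)
    # -(1/2) d^2/dx^2 discretized with Dirichlet boundary conditions:
    # diagonal +1/h^2, off-diagonals -1/(2 h^2).
    main = np.full(n_grid, 1.0 / h**2)
    off = np.full(n_grid - 1, -0.5 / h**2)
    hamiltonian = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigvalsh(hamiltonian)[:n_levels]

# Compare with E_n = (n*pi)**2 / 2 = 4.9348, 19.739, 44.413, ...
levels = square_well_levels()
```

With 500 grid points the lowest levels agree with the analytic values to better than one part in ten thousand.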

  11. Drug Plan Coverage Rules

    Science.gov (United States)


  12. Staff rules and regulations

    CERN Multimedia

    HR Department

    2007-01-01

    The 11th edition of the Staff Rules and Regulations, dated 1 January 2007, adopted by the Council and the Finance Committee in December 2006, is currently being distributed to departmental secretariats. The Staff Rules and Regulations, together with a summary of the main modifications made, will be available, as from next week, on the Human Resources Department's intranet site: http://cern.ch/hr-web/internal/admin_services/rules/default.asp The main changes made to the Staff Rules and Regulations stem from the five-yearly review of employment conditions of members of the personnel. The changes notably relate to: the categories of members of the personnel (e.g. removal of the local staff category); the careers structure and the merit recognition system; the non-residence, installation and re-installation allowances; the definition of family, family allowances and family-related leave; recognition of partnerships; education fees. The administrative circulars, some of which are being revised following the ...

  13. TANF Rules Data Base

    Data.gov (United States)

    U.S. Department of Health & Human Services — Single source providing information on Temporary Assistance for Needy Families (TANF) program rules among States and across years (currently 1996-2010), including...

  14. Generalized Multidimensional Association Rules

    Institute of Scientific and Technical Information of China (English)

    周傲英; 周水庚; 金文; 田增平

    2000-01-01

The problem of association rule mining has gained considerable prominence in the data mining community for its use as an important tool of knowledge discovery from large-scale databases. And there has been a spurt of research activities around this problem. Traditional association rule mining is limited to intra-transaction. Only recently the concept of N-dimensional inter-transaction association rule (NDITAR) was proposed by H.J. Lu. This paper modifies and extends Lu's definition of NDITAR based on the analysis of its limitations, and the generalized multidimensional association rule (GMDAR) is subsequently introduced, which is more general, flexible and reasonable than NDITAR.
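For readers unfamiliar with the baseline that NDITAR and GMDAR generalize, a minimal sketch of the classical intra-transaction association-rule metrics (support and confidence) is given below. The function names and the toy baskets are illustrative, not from the paper.

```python
def support(transactions, itemset):
    """Fraction of transactions that contain every item of the itemset."""
    itemset = frozenset(itemset)
    return sum(itemset <= set(t) for t in transactions) / len(transactions)

def confidence(transactions, antecedent, consequent):
    """Confidence of the rule antecedent -> consequent:
    support(antecedent union consequent) / support(antecedent)."""
    return (support(transactions, set(antecedent) | set(consequent))
            / support(transactions, antecedent))

baskets = [{"bread", "milk"}, {"bread", "butter"},
           {"bread", "milk", "butter"}, {"milk"}]
# support({bread, milk}) = 2/4; confidence(bread -> milk) = 2/3
```

Inter-transaction rules such as NDITAR extend these counts across a sliding window of several transactions rather than a single basket.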

  15. Revised Total Coliform Rule

    Science.gov (United States)

    The Revised Total Coliform Rule (RTCR) aims to increase public health protection through the reduction of potential pathways for fecal contamination in the distribution system of a public water system (PWS).

  16. (FIELD) SYMMETRIZATION SELECTION RULES

    Energy Technology Data Exchange (ETDEWEB)

    P. PAGE

    2000-08-01

QCD and QED exhibit an infinite set of three-point Green's functions that contain only OZI rule violating contributions, and (for QCD) are subleading in the large N_c expansion. We prove that the QCD amplitude for a neutral hybrid {1,3,5...}^(+-) exotic current to create ηπ⁰ only comes from OZI rule violating contributions under certain conditions, and is subleading in N_c.

  17. Symmetrization Selection Rules, 1

    CERN Document Server

    Page, P R

    1996-01-01

We introduce a category of strong and electromagnetic interaction selection rules for the two-body connected decay and production of exotic J^{PC} = 0^{+-}, 1^{-+}, 2^{+-}, 3^{-+}, ... hybrid and four-quark mesons. The rules arise from symmetrization in states in addition to Bose symmetry and CP invariance. Examples include various decays to η′η, ηπ, η′π and four-quark interpretations of a 1^{-+} signal.

  18. News and Trading Rules

    Science.gov (United States)

    2003-01-01

indexes or small groups of forex series. Although I use a shorter time period – five years for the work on technical analysis and machine learning, only...I start with practitioner-developed technical analysis constructs, systematically examining their ability to generate trading rules profitable on...a large universe of stocks. Then, I use these technical analysis constructs as the underlying representation for a simple trading rule learner, with

  19. Data breaches. Final rule.

    Science.gov (United States)

    2008-04-11

    This document adopts, without change, the interim final rule that was published in the Federal Register on June 22, 2007, addressing data breaches of sensitive personal information that is processed or maintained by the Department of Veterans Affairs (VA). This final rule implements certain provisions of the Veterans Benefits, Health Care, and Information Technology Act of 2006. The regulations prescribe the mechanisms for taking action in response to a data breach of sensitive personal information.

  20. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

Full Text Available The article deals with the investigation of the theoretical and methodological principles of situational analysis. The necessity of situational analysis in modern conditions is proved, and the notion of "situational analysis" is defined. We have concluded that situational analysis is a continuous, systematic study whose purpose is to identify the signs of a dangerous situation, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the system's exposure to the situation now and in the future, and to develop the managerial actions needed to bring the system back to norm. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of its diagnostic, evaluative and search functions is demonstrated. The basic methodological elements of situational analysis are grounded. Substantiating these principal methodological elements will enable the analyst to develop adaptive methods that account for the peculiar features of a unique object (a situation that has emerged in a complex system), to diagnose such a situation and subject it to systematic, in-depth analysis, to identify risks and opportunities, and to make timely management decisions as a particular period requires.

  1. Mechanisms of rule acquisition and rule following in inductive reasoning.

    Science.gov (United States)

    Crescentini, Cristiano; Seyed-Allaei, Shima; De Pisapia, Nicola; Jovicich, Jorge; Amati, Daniele; Shallice, Tim

    2011-05-25

    Despite the recent interest in the neuroanatomy of inductive reasoning processes, the regional specificity within prefrontal cortex (PFC) for the different mechanisms involved in induction tasks remains to be determined. In this study, we used fMRI to investigate the contribution of PFC regions to rule acquisition (rule search and rule discovery) and rule following. Twenty-six healthy young adult participants were presented with a series of images of cards, each consisting of a set of circles numbered in sequence with one colored blue. Participants had to predict the position of the blue circle on the next card. The rules that had to be acquired pertained to the relationship among succeeding stimuli. Responses given by subjects were categorized in a series of phases either tapping rule acquisition (responses given up to and including rule discovery) or rule following (correct responses after rule acquisition). Mid-dorsolateral PFC (mid-DLPFC) was active during rule search and remained active until successful rule acquisition. By contrast, rule following was associated with activation in temporal, motor, and medial/anterior prefrontal cortex. Moreover, frontopolar cortex (FPC) was active throughout the rule acquisition and rule following phases before a rule became familiar. We attributed activation in mid-DLPFC to hypothesis generation and in FPC to integration of multiple separate inferences. The present study provides evidence that brain activation during inductive reasoning involves a complex network of frontal processes and that different subregions respond during rule acquisition and rule following phases.

  2. Footy Rules in China

    Institute of Scientific and Technical Information of China (English)

    AMY; BAINBRIDGE

    2008-01-01

One glance across the sports ground, and you could be mistaken for thinking you were watching any old football team training in Australia. The difference is, the ground is at Nankai University in central Tianjin, all the players are Chinese, and the result from this training session is more important than a regular, relaxed kick and laugh.

  3. Polarimetry from the Ground Up

    CERN Document Server

    Keller, C U

    2008-01-01

    Ground-based solar polarimetry has made great progress over the last decade. Nevertheless, polarimetry is still an afterthought in most telescope and instrument designs, and most polarimeters are designed based on experience and rules of thumb rather than using more formal systems engineering approaches as is common in standard optical design efforts. Here we present the first steps in creating a set of systems engineering approaches to the design of polarimeters that makes sure that the final telescope-instrument-polarimeter system is more than the sum of its parts.

  4. Management Research and Grounded Theory: A review of the grounded theory-building approach in organisational and management research.

    Directory of Open Access Journals (Sweden)

    Graham J.J. Kenealy, Ph.D.

    2008-06-01

Full Text Available Grounded theory is a systematic methodology for the collection and analysis of data which was discovered by Glaser and Strauss in the 1960s. The discovery of this method was first presented to the academic community in their book ‘The Discovery of Grounded Theory’ (1967), which still remains a primary point of reference for those undertaking qualitative research and grounded theory in particular. This powerful research method has become very popular in some research domains; whilst increasing in popularity, it is still less prevalent in the field of organisational and management research, particularly in its original form. This self-reflexive paper sets out to explore the possibilities for this imbalance, which takes the discussion onto the areas of methodological adaptation and training. It also enters the debate about access to research subjects and provides a succinct argument supporting the notion that grounded theory should simply be viewed as a method that develops empirically grounded conceptual theory.

  5. Novice Rules for Projectile Motion.

    Science.gov (United States)

    Maloney, David P.

    1988-01-01

    Investigates several aspects of undergraduate students' rules for projectile motion including general patterns; rules for questions about time, distance, solids and liquids; and changes in rules when asked to ignore air resistance. Reports approach differences by sex and high school physics experience, and that novice rules are situation…

  6. The role of traffic rules.

    NARCIS (Netherlands)

    Noordzij, P.C.

    1988-01-01

    Experienced road users seem to have their own set of traffic rules (including rules about when to violate the official rules). The number of violations is enormous, causing great concern for the authorities. The situation could be improved by separating a set of rules with the aim of deterring road

  7. Nature and Function of Rules.

    Science.gov (United States)

    Fields, Barry A.

    1997-01-01

    Surveyed Year 1 and 2 teachers in Australia about their classroom rules. Found that teachers have about six rules for their classes relating to pupil-pupil relations, completing academic tasks, movement around the classroom, property, safety, and other. Most rules concerned pupil-pupil interactions, and all rules can be seen as a way of…

  8. The rule of law

    Directory of Open Access Journals (Sweden)

    Besnik Murati

    2015-07-01

Full Text Available The state as an international entity and its impact on the individual's rights has been, and still continues to be, a crucial factor in the relationship between private and public persons. States vary in terms of their political system; however, democratic states are based on the separation of powers and human rights within the state. Rule of law is the product of many actors in a state, including laws, individuals, society, the political system, the separation of powers, human rights, the establishment of civil society, the relationship between law and the individual, as well as individual-state relations. The purpose and focus of this study is the importance of a functioning state based on law, the characteristics of the rule of law, the separation of powers and the basic concepts of the rule of law.

  9. Cosmic Sum Rules

    DEFF Research Database (Denmark)

    T. Frandsen, Mads; Masina, Isabella; Sannino, Francesco

    2011-01-01

We introduce new sum rules allowing to determine universal properties of the unknown component of the cosmic rays and show how it can be used to predict the positron fraction at energies not yet explored by current experiments and to constrain specific models.

  10. Generalized Deterministic Traffic Rules

    CERN Document Server

    Fuks, H; Fuks, Henryk; Boccara, Nino

    1997-01-01

We study a family of deterministic models for highway traffic flow which generalize cellular automaton rule 184. This family is parametrized by the speed limit $m$ and another parameter $k$ that represents a "degree of aggressiveness" in driving, strictly related to the distance between two consecutive cars. We compare two driving strategies with identical maximum throughput: "conservative" driving with a high speed limit and "aggressive" driving with a low speed limit. These two strategies are evaluated in terms of accident probability. We also discuss fundamental diagrams of generalized traffic rules and examine limitations of the maximum achievable throughput. Possible modifications of the model are considered.
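The abstract builds on elementary cellular automaton rule 184, the standard minimal traffic model. As a hedged illustration (plain rule 184 only; the paper's parameters m and k are not modelled here), one update step on a ring road can be sketched as:

```python
def rule184_step(cells):
    """One synchronous update of cellular automaton rule 184 on a ring.

    Traffic reading: a car (1) advances one cell to the right iff the
    cell ahead is empty; the number of cars is conserved.
    """
    n = len(cells)
    # Neighborhoods mapped to 1 by rule 184 (binary 10111000):
    # 111, 101, 100, 011.
    alive = {(1, 1, 1), (1, 0, 1), (1, 0, 0), (0, 1, 1)}
    return [
        1 if (cells[i - 1], cells[i], cells[(i + 1) % n]) in alive else 0
        for i in range(n)
    ]

# A short road with four cars; free-flowing cars move right each step.
road = [1, 1, 0, 1, 0, 0, 0, 1]
road = rule184_step(road)  # -> [1, 0, 1, 0, 1, 0, 0, 1]
```

Equivalently: an occupied cell stays occupied only if the cell ahead is occupied, and an empty cell becomes occupied only if the cell behind it holds a car.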

  11. THE AGILE METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Charul Deewan

    2012-09-01

Full Text Available The technologies are numerous and software is the one which is most widely used. Some companies have their own customized methodology for developing their software, but the majority speaks about two kinds of methodologies: Traditional and Agile methodologies. In this paper, we will discuss some of the aspects of what Agile methodology is, how it can be used to get the best result from a project, and how we get it to work in an organization.

  12. Language Policy and Methodology

    Science.gov (United States)

    Liddicoat, Antony J.

    2004-01-01

    The implementation of a language policy is crucially associated with questions of methodology. This paper explores approaches to language policy, approaches to methodology and the impact that these have on language teaching practice. Language policies can influence decisions about teaching methodologies either directly, by making explicit…

  13. The rules of the Rue Morgue

    Energy Technology Data Exchange (ETDEWEB)

    Vanzi, M. [Univ. of Cagliari (Italy)

    1995-12-31

This paper, it is evident, is mostly a joke, based on the fascinating (but not original) consideration of any failure analysis as a detective story. Poe's tale is a perfect instrument (but surely not the only possible one) for playing the game. If any practical application of "The Rules of the Rue Morgue" may be expected, it is in the possibility of defining what leaves us unsatisfied when a Failure Analyst's result sounds out of tune. The reported Violations to the Dupin Postulate summarize the objections that the author would like to repeat for his own analyses, and for those cases in which he is required to review the work of others. On the constructive side, the proposed Rules, it has been repeatedly said, are common-sense indications, and are surely not exhaustive on a practical ground. Skill, patience, luck and memory are also required, but, unfortunately, not always and not together available. It would be of the greatest aid for the Failure Analyst community, in any case, if each public report could point out how it obeyed a widely accepted set of failure analysis rules. Maybe -- why not? -- the Rules of the Rue Morgue. As a last consideration, to conclude the joke, the author invites his readers to open the original Poe's tale at the very beginning of the story, when Monsieur Dupin is introduced. Thinking of the Failure Analyst as a member of the excellent family of the Scientists, many of us will sigh and smile.

  14. A Rigorous Methodology for Analyzing and Designing Plug-Ins

    DEFF Research Database (Denmark)

    Fasie, Marieta V.; Haxthausen, Anne Elisabeth; Kiniry, Joseph

    2013-01-01

This paper addresses these problems by describing a rigorous methodology for analyzing and designing plug-ins. The methodology is grounded in the Extended Business Object Notation (EBON) and covers informal analysis and design of features, GUI, actions, and scenarios, formal architecture design, including ... behavioral semantics, and validation. The methodology is illustrated via a case study whose focus is an Eclipse environment for the RAISE formal method's tool suite.

  15. Adler sum rule

    CERN Document Server

    Adler, Stephen L

    2009-01-01

    The Adler sum rule states that the integral over energy of a difference of neutrino-nucleon and antineutrino-nucleon structure functions is a constant, independent of the four-momentum transfer squared. This constancy is a consequence of the local commutation relations of the time components of the hadronic weak current, which follow from the underlying quark structure of the standard model.

  16. TEDXCERN BREAKS THE RULES

    CERN Multimedia

    CERN, Bulletin

    2015-01-01

    On Friday, 9 October, TEDxCERN brought together 14 ‘rule-breakers’ to explore ideas that push beyond the boundaries of academia. They addressed a full house of 600 audience members, as well as thousands watching the event online.

  17. 13 Rules That Expire

    Science.gov (United States)

    Karp, Karen S.; Bush, Sarah B.; Dougherty, Barbara J.

    2014-01-01

    Overgeneralizing commonly accepted strategies, using imprecise vocabulary, and relying on tips and tricks that do not promote conceptual mathematical understanding can lead to misunderstanding later in students' math careers. In this article, the authors present thirteen pervasive mathematics rules that "expire." With the…

  18. Staff rules and regulations

    CERN Document Server

    HR Department

    2007-01-01

    The 11th edition of the Staff Rules and Regulations, dated 1 January 2007, adopted by the Council and the Finance Committee in December 2006, is currently being distributed to departmental secretariats. The Staff Rules and Regulations, together with a summary of the main modifications made, will be available, as from next week, on the Human Resources Department's intranet site: http://cern.ch/hr-web/internal/admin_services/rules/default.asp The main changes made to the Staff Rules and Regulations stem from the five-yearly review of employment conditions of members of the personnel. The changes notably relate to: the categories of members of the personnel (e.g. removal of the local staff category); the careers structure and the merit recognition system; the non-residence, installation and re-installation allowances; the definition of family, family allowances and family-related leave; recognition of partnerships; education fees. The administrative circulars, some of which are being revised following the m...

  19. Post Rule of Law

    DEFF Research Database (Denmark)

    Carlson, Kerstin Bree

    2016-01-01

    addresses the practice of hybridity in ICP, drawing examples from the construction and evolution of hybrid procedure at the International Criminal Tribunal for the Former Yugoslavia (ICTY), to argue that the hybridity practiced by international criminal tribunals renders them ‘post rule of law’ institutions...

  20. Comment concerning Leonardo's rule

    CERN Document Server

    Sotolongo-Costa, O; Oseguera-Manzanilla, T; Díaz-Guerrero, D S

    2016-01-01

In this comment we propose a novel explanation for Leonardo's rule concerning tree branching. According to his notebooks, Leonardo observed that, in the branches of a tree, the squared radius of the principal branch is equal to the sum of the squared radii of the daughter branches.
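The rule as stated can be checked numerically: with exponent 2 (Leonardo's classical value) it says that total cross-sectional area is conserved at each branching point. The sketch below is a minimal check; the helper name, exponent parameter and tolerance are my own choices, not from the paper.

```python
import math

def leonardo_holds(parent_radius, daughter_radii, exponent=2.0):
    """Check Leonardo's branching rule: r_parent**a == sum(r_i**a).

    With exponent a = 2 this states that the total cross-sectional
    area is conserved across the branching point.
    """
    return math.isclose(parent_radius ** exponent,
                        sum(r ** exponent for r in daughter_radii),
                        rel_tol=1e-9)

# A trunk of radius 5 splitting into branches of radii 3 and 4
# satisfies the rule exactly, since 3**2 + 4**2 == 5**2.
leonardo_holds(5.0, [3.0, 4.0])
```

Empirical studies often fit exponents slightly different from 2, which is why the exponent is left as a parameter here.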

  1. Willpower and Personal Rules.

    Science.gov (United States)

    Benabou, Roland; Tirole, Jean

    2004-01-01

    We develop a theory of internal commitments or "personal rules" based on self-reputation over one's willpower, which transforms lapses into precedents that undermine future self-restraint. The foundation for this mechanism is the imperfect recall of past motives and feelings, leading people to draw inferences from their past actions. The degree of…

  2. Building Grounded Theory in Entrepreneurship Research

    DEFF Research Database (Denmark)

    Mäkelä, Markus; Turcan, Romeo V.

    2007-01-01

In this chapter we describe the process of building theory from data (Glaser and Strauss 1967; Strauss and Corbin 1998). We discuss current grounded theory in relation to research in entrepreneurship and point out directions and potential improvements for further research in this field. The chapter has two goals. First, we wish to provide an explicit paradigmatic positioning of the grounded theory methodology, discussing the most relevant views of ontology and epistemology that can be used as alternative starting points for conducting grounded theory research. While the chapter introduces our approach to grounded theory, we acknowledge the existence of other approaches and try to locate our approach in relation to them. As an important part of this discussion, we take a stand on how to usefully define ‘grounded theory’ and ‘case study research’. Second, we seek to firmly link our...

  3. TESTS AND METHODOLOGIES FOR THE SURVEY OF NARROW SPACES

    Directory of Open Access Journals (Sweden)

    L. Perfetti

    2017-02-01

Full Text Available The research illustrated in this article aimed at identifying a good standard methodology for surveying very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today's era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural and archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are required. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability, also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in field of view compared with rectilinear lenses. This advantage alone can be crucial in reducing the total number of photos and, as a consequence, in obtaining manageable data, simplifying the survey phase and significantly reducing the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex spiral service staircase located in the Duomo di Milano, with a total height of 25 metres, characterized by a narrow walkable space about 70 centimetres wide.
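The paper's general GSD formulation for arbitrary optical projections is not reproduced in the abstract. As a point of reference only, the standard rectilinear (pinhole) estimate can be sketched as below; the function name, parameter names and unit choices are my own assumptions, not the authors' formulation.

```python
def rectilinear_gsd(distance_m, pixel_pitch_um, focal_length_mm):
    """Ground Sampling Distance for a rectilinear (pinhole) camera.

    GSD = object distance * pixel pitch / focal length, i.e. the
    footprint of one sensor pixel on a surface at the given distance.
    Returns millimetres per pixel.
    """
    pixel_pitch_mm = pixel_pitch_um / 1000.0
    distance_mm = distance_m * 1000.0
    return distance_mm * pixel_pitch_mm / focal_length_mm

# A 24 mm lens with 4 um pixels imaging a wall 0.7 m away:
# GSD = 700 * 0.004 / 24, roughly 0.12 mm per pixel.
rectilinear_gsd(0.7, 4.0, 24.0)
```

For fisheye projections the pixel footprint varies strongly across the frame, which is precisely why the paper derives a projection-specific formulation rather than relying on this constant-GSD approximation.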

  4. 78 FR 803 - Revisions to Electric Reliability Organization Definition of Bulk Electric System and Rules of...

    Science.gov (United States)

    2013-01-04

    ...; Philip D. Moeller, John R. Norris, and Cheryl A. LaFleur. FINAL RULE (Issued December 20, 2012) Table of...\\ Holland Comments at 2. 35. Some commenters oppose approval on various grounds. For example, NARUC is...

  5. Constructing New Theory for Identifying Students with Emotional Disturbance: A Constructivist Approach to Grounded Theory

    OpenAIRE

    2012-01-01

    A grounded theory study that examined how practitioners in a county alternative and correctional education setting identify youth with emotional and behavioral difficulties for special education services provides an exemplar for a constructivist approach to grounded theory methodology. Discussion focuses on how a constructivist orientation to grounded theory methodology informed research decisions, shaped the development of the emergent grounded theory, and prompted a way of thinking about da...

  6. Bayesian Learning and the Psychology of Rule Induction

    Science.gov (United States)

    Endress, Ansgar D.

    2013-01-01

    In recent years, Bayesian learning models have been applied to an increasing variety of domains. While such models have been criticized on theoretical grounds, the underlying assumptions and predictions are rarely made concrete and tested experimentally. Here, I use Frank and Tenenbaum's (2011) Bayesian model of rule-learning as a case study to…

  7. Allocating SMART Reliability and Maintainability Goals to NASA Ground Systems

    Science.gov (United States)

    Gillespie, Amanda; Monaghan, Mark

    2013-01-01

    This paper will describe the methodology used to allocate Reliability and Maintainability (R&M) goals to Ground Systems Development and Operations (GSDO) subsystems currently being designed or upgraded.

  8. Simple rules guide dragonfly migration

    Science.gov (United States)

    Wikelski, Martin; Moskowitz, David; Adelman, James S; Cochran, Jim; Wilcove, David S; May, Michael L

    2006-01-01

    Every year billions of butterflies, dragonflies, moths and other insects migrate across continents, and considerable progress has been made in understanding population-level migratory phenomena. However, little is known about destinations and strategies of individual insects. We attached miniaturized radio transmitters (ca 300 mg) to the thoraxes of 14 individual dragonflies (common green darners, Anax junius) and followed them during their autumn migration for up to 12 days, using receiver-equipped Cessna airplanes and ground teams. Green darners exhibited distinct stopover and migration days. On average, they migrated every 2.9±0.3 days, and their average net advance was 58±11 km in 6.1±0.9 days (11.9±2.8 km d−1) in a generally southward direction (186±52°). They migrated exclusively during the daytime, when wind speeds were less than 25 km h−1, regardless of wind direction, but only after two nights of successively lower temperatures (decrease of 2.1±0.6 °C in minimum temperature). The migratory patterns and apparent decision rules of green darners are strikingly similar to those proposed for songbirds, and may represent a general migration strategy for long-distance migration of organisms with high self-propelled flight speeds. PMID:17148394

  9. Verification of business rules programs

    CERN Document Server

    Silva, Bruno Berstel-Da

    2013-01-01

    Rules represent a simplified means of programming, congruent with our understanding of human brain constructs. With the advent of business rules management systems, it has been possible to introduce rule-based programming to nonprogrammers, allowing them to map expert intent into code in applications such as fraud detection, financial transactions, healthcare, retail, and marketing. However, a remaining concern is the quality, safety, and reliability of the resulting programs.  This book is on business rules programs, that is, rule programs as handled in business rules management systems. Its

  10. "Naturalist Inquiry" and Grounded Theory

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser

    2004-01-01

    Full Text Available The world of Qualitative Data Analysis (QDA) methodology became quite taken with LINCOLN and GUBA's book "Naturalist Inquiry" (1985). I have no issue with it with respect to its application to QDA; it helped clarify and advance so many QDA issues. However, its application to Grounded Theory (GT) has been a major block on GT, as originated, by its cooptation and corruption hence remodeling of GT by default. LINCOLN and GUBA have simply assumed GT is just another QDA method, which it is not. In "The Grounded Theory Perspective II" (GLASER 2002a), Chapter 9 on credibility, I have discussed "Naturalist Inquiry" (NI) thought regarding LINCOLN and GUBA's notion of "trustworthy" data (or worrisome data) orientation and how their view of constant comparison can and has remodeled and eroded GT. In this paper I will consider other aspects of NI that remodel GT. URN: urn:nbn:de:0114-fqs040170

  11. Stop. Write! Writing Grounded Theory

    Directory of Open Access Journals (Sweden)

    Barney G. Glaser, PhD, Hon. PhD

    2012-06-01

    Full Text Available The message in this book, the dictum in this book, is to stop and write when the Grounded Theory (GT) methodology puts you in that ready position. Stop unending conceptualization, unending data coverage, and unending listening to others who would egg you on with additional data, ideas and/or requirements or simply wait too long. I will discuss these ideas in detail. My experience with PhD candidates is that for the few who write when ready, many do not and SHOULD. Simply put, many write-up, but many more should.

  12. The ATLAS SCT grounding and shielding concept and implementation

    CERN Document Server

    Bates, RL; Bernabeu, J; Bizzell, J; Bohm, J; Brenner, R; Bruckman de Renstrom, P A; Catinaccio, A; Cindro, V; Ciocio, A; Civera, J V; Chouridou, S; Dervan, P; Dick, B; Dolezal, Z; Eklund, L; Feld, L; Ferrere, D; Gadomski, S; Gonzalez, F; Gornicki, E; Greenhall, A; Grillo, A A; Grosse-Knetter, J; Gruwe, M; Haywood, S; Hessey, N P; Ikegami, Y; Jones, T J; Kaplon, J; Kodys, P; Kohriki, T; Kondo, T; Koperny, S; Lacasta, C; Lozano Bahilo, J; Malecki, P; Martinez-McKinney, F; McMahon, S J; McPherson, A; Mikulec, B; Mikus, M; Moorhead, G F; Morrissey, M C; Nagai, K; Nichols, A; O'Shea, V; Pater, J R; Peeters, S J M; Pernegger, H; Perrin, E; Phillips, P W; Pieron, J P; Roe, S; Sanchez, J; Spencer, E; Stastny, J; Tarrant, J; Terada, S; Tyndel, M; Unno, Y; Wallny, R; Weber, M; Weidberg, A R; Wells, P S; Werneke, P; Wilmut, I

    2012-01-01

    This paper describes the design and implementation of the grounding and shielding system for the ATLAS SemiConductor Tracker (SCT). The mitigation of electromagnetic interference and noise pickup through power lines is the critical design goal as they have the potential to jeopardize the electrical performance. We accomplish this by adhering to the ATLAS grounding rules, by avoiding ground loops and isolating the different subdetectors. Noise sources are identified and design rules to protect the SCT against them are described. A rigorous implementation of the design was crucial to achieve the required performance. This paper highlights the location, connection and assembly of the different components that affect the grounding and shielding system: cables, filters, cooling pipes, shielding enclosure, power supplies and others. Special care is taken with the electrical properties of materials and joints. The monitoring of the grounding system during the installation period is also discussed. Finally, after con...

  13. Testing the reduction rule

    CERN Document Server

    Hegerfeldt, G C

    2011-01-01

    The reduction rule, also known as the projection postulate, specifies the state after an ideal measurement. There are two forms, the original rule of von Neumann and a nowadays mostly used modification thereof due to Lüders, but sometimes also attributed to von Neumann. However, which form applies depends on the details of the measuring apparatus. Here we therefore consider the following problem: Given an ensemble of systems in an unknown pure or mixed state, an observable $\hat A$ and an apparatus which performs a measurement of $\hat A$ on the ensemble, but whose detailed working is unknown ('black box'), how can one test whether the apparatus performs a Lüders or von Neumann measurement?
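
For reference, the two forms can be stated compactly in standard textbook notation (the paper's own notation may differ). For a measurement of $\hat A$ yielding eigenvalue $a$ with eigenprojector $P_a = \sum_i |a,i\rangle\langle a,i|$:

```latex
% Lüders: project with the full (possibly degenerate) eigenprojector
\rho \;\longrightarrow\; \frac{P_a\,\rho\,P_a}{\operatorname{Tr}(P_a\rho)}
% von Neumann: refine P_a into one-dimensional projectors first
\rho \;\longrightarrow\; \frac{\sum_i |a,i\rangle\langle a,i|\,\rho\,|a,i\rangle\langle a,i|}{\operatorname{Tr}(P_a\rho)}
```

The two prescriptions coincide when $a$ is non-degenerate, which is why a black-box test of the kind described can only discriminate between them for degenerate observables.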

  14. Chaos Rules Revisited

    Directory of Open Access Journals (Sweden)

    David Murphy

    2011-11-01

    Full Text Available About 20 years ago, while lost in the midst of my PhD research, I mused over proposed titles for my thesis. I was pretty pleased with myself when I came up with Chaos Rules (the implied double meaning was deliberate), or more completely, Chaos Rules: An Exploration of the Work of Instructional Designers in Distance Education. I used the then-emerging theories of chaos and complexity to underpin my analysis. So it was with more than a little excitement that I read the call for contributions to this special issue of IRRODL. What follows is a walk-through of my thesis with an emphasis on the contribution of chaos and complexity theory.

  15. New Games, New Rules

    DEFF Research Database (Denmark)

    Constantiou, Ioanna; Kallinikos, Jannis

    2015-01-01

    , the usefulness of big data rests on their steady updatability, a condition that reduces the time span within which this data is useful or relevant. Jointly, these attributes challenge established rules of strategy making as these are manifested in the canons of procuring structured information of lasting value...... the wider social and institutional context of longstanding data practices and the significance they carry for management and organizations....

  16. Real Rules of Inference

    Science.gov (United States)

    1986-01-01

    Robert Nozick’s acceptance rule [Noz81]: Acc(h) iff Prob(K | h) >= 1 - e and Prob(K | -h) <= e and Prob(h) > Prob(K | -h), for small e; and I.J. Good and...Solving," IJCAI 1983. [Noz81] Nozick, R. Philosophical Explanations, Harvard, 1981. [Nut84] Nute, D. "Logical Relations," Philosophical Studies 46, 1984.
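
Read as a predicate, the acceptance rule is straightforward to state. The sketch below is an illustrative rendering only (the threshold `eps` stands for the "small e", and the probabilities are supplied by the caller):

```python
def accepts(p_K_given_h: float, p_K_given_not_h: float, p_h: float, eps: float = 0.05) -> bool:
    """Illustrative reading of Nozick's acceptance rule [Noz81]:
    Acc(h) iff P(K|h) >= 1 - eps and P(K|-h) <= eps and P(h) > P(K|-h),
    i.e. the evidence K strongly tracks h and barely tracks its negation."""
    return (p_K_given_h >= 1 - eps
            and p_K_given_not_h <= eps
            and p_h > p_K_given_not_h)

print(accepts(0.99, 0.01, 0.5))   # strong evidential link -> True
print(accepts(0.80, 0.30, 0.5))   # weak link -> False
```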

  17. Modal extension rule

    Institute of Scientific and Technical Information of China (English)

    WU Xia; SUN Jigui; LIN Hai; FENG Shasha

    2005-01-01

    Modal logics are good candidates for a formal theory of agents. The efficiency of reasoning methods in modal logics is very important, because it determines whether or not a reasoning method can be widely used in agent-based systems. In this paper, we modify the extension rule theorem-proving method we presented before, and then apply it to P-logic, which is translated from modal logic by functional transformation. Finally, we give the proof of its soundness and completeness.

  18. Ground motion estimation and nonlinear seismic analysis

    Energy Technology Data Exchange (ETDEWEB)

    McCallen, D.B.; Hutchings, L.J.

    1995-08-14

    Site specific predictions of the dynamic response of structures to extreme earthquake ground motions are a critical component of seismic design for important structures. With the rapid development of computationally based methodologies and powerful computers over the past few years, engineers and scientists now have the capability to perform numerical simulations of many of the physical processes associated with the generation of earthquake ground motions and dynamic structural response. This paper describes application of a physics based, deterministic, computational approach for estimation of earthquake ground motions which relies on site measurements of frequently occurring small (i.e. M < 3 ) earthquakes. Case studies are presented which illustrate application of this methodology for two different sites, and nonlinear analyses of a typical six story steel frame office building are performed to illustrate the potential sensitivity of nonlinear response to site conditions and proximity to the causative fault.

  19. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  20. Scenario development methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Eng, T. [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Hudson, J. [Rock Engineering Consultants, Welwyn Garden City, Herts (United Kingdom); Stephansson, O. [Royal Inst. of Tech., Stockholm (Sweden). Div. of Engineering Geology; Skagius, K.; Wiborgh, M. [Kemakta, Stockholm (Sweden)

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are (a) Event tree analysis, (b) Influence diagrams and (c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs.

  1. LANGUAGE POLICY AND METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Antony J. Liddicoat

    2004-06-01

    Full Text Available The implementation of a language policy is crucially associated with questions of methodology. This paper explores approaches to language policy, approaches to methodology and the impact that these have on language teaching practice. Language policies can influence decisions about teaching methodologies either directly, by making explicit recommendations about the methods to be used in classroom practice, or indirectly, through the conceptualisation of language learning which underlies the policy. It can be argued that all language policies have the potential to influence teaching methodologies indirectly and that those policies which have explicit recommendations about methodology are actually functioning on two levels. This allows for the possibility of conflict between the direct and indirect dimensions of the policy which results from an inconsistency between the explicitly recommended methodology and the underlying conceptualisation of language teaching and learning which informs the policy.

  2. Consistence of Network Filtering Rules

    Institute of Scientific and Technical Information of China (English)

    SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian

    2004-01-01

    The inconsistency of firewall/VPN (Virtual Private Network) rules incurs a huge maintenance cost. With the development of multinational companies, SOHO offices, and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables, whether stand-alone or networked, will accordingly grow in geometric series. Checking the consistency of a rule table manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for the intelligent management of rule tables. In this paper, a formalization of host rules and network rules for automatic rule validation based on set theory is proposed, and a rule validation scheme is defined. The analysis results show the superior performance of the method and demonstrate its potential for intelligent management based on rule tables.
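
A toy rendering of the set-theoretic idea (not the paper's formalism): model each filtering rule's match condition as sets of sources and destinations, and flag two rules as inconsistent when their match sets overlap but their actions disagree.

```python
from itertools import combinations

def conflicts(rules):
    """Return pairs of rule names whose match sets overlap but whose
    actions disagree. A rule is (name, src_set, dst_set, action)."""
    bad = []
    for (n1, s1, d1, a1), (n2, s2, d2, a2) in combinations(rules, 2):
        # Non-empty intersection on both dimensions means some traffic
        # matches both rules; differing actions is then an inconsistency.
        if (s1 & s2) and (d1 & d2) and a1 != a2:
            bad.append((n1, n2))
    return bad

rules = [
    ("r1", frozenset({"10.0.0.1", "10.0.0.2"}), frozenset({"web"}), "allow"),
    ("r2", frozenset({"10.0.0.2"}),             frozenset({"web"}), "deny"),
]
print(conflicts(rules))  # [('r1', 'r2')] -- overlapping match sets, opposite actions
```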

  3. REx: An Efficient Rule Generator

    CERN Document Server

    Kamruzzaman, S M

    2010-01-01

    This paper describes REx, an efficient algorithm for generating symbolic rules from an artificial neural network (ANN). Classification rules are sought in many areas, from automatic knowledge acquisition to data mining and ANN rule extraction, because classification rules possess some attractive features: they are explicit, understandable and verifiable by domain experts, and can be modified, extended and passed on as modular knowledge. REx exploits the first-order information in the data and finds the shortest sufficient conditions for a rule of a class that can differentiate it from patterns of other classes. It can generate concise and perfect rules in the sense that the error rate of the rules is not worse than the inconsistency rate found in the original data. An important feature of the rule extraction algorithm REx is its recursive nature. The extracted rules are concise, comprehensible, order insensitive and do not involve any weight values. Extensive experimental studies on several benchmark classification problems, s...
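
The notion of "shortest sufficient conditions" can be illustrated with a brute-force sketch over symbolic patterns. This is only the rule-search criterion on raw data; REx itself extracts rules from a trained ANN and works recursively, and all names below are hypothetical.

```python
from itertools import combinations

def shortest_sufficient_rule(patterns, labels, target):
    """Find the smallest set of attribute=value conditions, taken from some
    target-class pattern, that no pattern of another class satisfies.
    Searches condition sets in order of increasing size, so the first hit
    is a shortest sufficient condition for the target class."""
    attrs = range(len(patterns[0]))
    for k in range(1, len(patterns[0]) + 1):
        for pat, lab in zip(patterns, labels):
            if lab != target:
                continue
            for combo in combinations(attrs, k):
                cond = {a: pat[a] for a in combo}
                # Sufficient: no pattern of a different class matches cond.
                if not any(all(q[a] == v for a, v in cond.items())
                           for q, l in zip(patterns, labels) if l != target):
                    return cond
    return None

X = [(1, 0, 1), (1, 1, 0), (0, 0, 1)]
y = ["pos", "pos", "neg"]
print(shortest_sufficient_rule(X, y, "pos"))  # {0: 1}: attribute 0 == 1 suffices
```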

  4. Ground water and energy

    Energy Technology Data Exchange (ETDEWEB)

    1980-11-01

    This national workshop on ground water and energy was conceived by the US Department of Energy's Office of Environmental Assessments. Generally, OEA needed to know what data are available on ground water, what information is still needed, and how DOE can best utilize what has already been learned. The workshop focussed on three areas: (1) ground water supply; (2) conflicts and barriers to ground water use; and (3) alternatives or solutions to the various issues relating to ground water. (ACR)

  5. RuleML-Based Learning Object Interoperability on the Semantic Web

    Science.gov (United States)

    Biletskiy, Yevgen; Boley, Harold; Ranganathan, Girish R.

    2008-01-01

    Purpose: The present paper aims to describe an approach for building the Semantic Web rules for interoperation between heterogeneous learning objects, namely course outlines from different universities, and one of the rule uses: identifying (in)compatibilities between course descriptions. Design/methodology/approach: As proof of concept, a rule…

  6. Open verification methodology cookbook

    CERN Document Server

    Glasser, Mark

    2009-01-01

    Functional verification is an art as much as a science. It requires not only creativity and cunning, but also a clear methodology to approach the problem. The Open Verification Methodology (OVM) is a leading-edge methodology for verifying designs at multiple levels of abstraction. It brings together ideas from electrical, systems, and software engineering to provide a complete methodology for verifying large scale System-on-Chip (SoC) designs. OVM defines an approach for developing testbench architectures so they are modular, configurable, and reusable. This book is designed to help both novic

  7. Structure Function Sum rules for Systems with Large Scattering Lengths

    CERN Document Server

    Goldberger, Walter D

    2010-01-01

    We use a dispersion relation in conjunction with the operator product expansion (OPE) to derive model-independent sum rules for the dynamic structure functions of systems with large scattering lengths. We present an explicit sum rule for the structure functions that control the density and spin response of the many-body ground state. Our methods are general, and apply to either fermions or bosons which interact through two-body contact interactions with large scattering lengths. By employing a Borel transform of the OPE, the relevant integrals are weighted towards infrared frequencies, thus allowing for greater overlap with low-energy data. Similar sum rules can be derived for other response functions. The sum rules can be used to extract the contact parameter introduced by Tan, including universality-violating corrections at finite scattering lengths.
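
Schematically, and in standard sum-rule notation rather than the paper's specific expressions, the Borel transform turns a dispersion integral into an exponentially weighted one, which is what suppresses the high-frequency region:

```latex
f(Q^2) \;=\; \int_0^\infty ds\, \frac{\rho(s)}{s+Q^2}
\quad\xrightarrow{\;\mathcal{B}_{M^2}\;}\quad
\frac{1}{M^2}\int_0^\infty ds\, \rho(s)\, e^{-s/M^2}
```

Lowering the Borel mass $M^2$ weights the spectral density $\rho(s)$ ever more strongly towards small $s$, i.e. towards the infrared frequencies where low-energy data exist.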

  8. Modifications of Team Sports Rules.

    Science.gov (United States)

    Rokosz, Francis M.

    In general, there are two reasons for modifying the rules in sport activities: (1) to meet a specific objective or (2) to solve a perceived problem. The sense of the original game is usually not altered significantly because the number of rule changes is kept to a minimum. Changes in rules may be made for administrative or financial reasons, or to…

  9. Making sense of grounded theory in medical education.

    Science.gov (United States)

    Kennedy, Tara J T; Lingard, Lorelei A

    2006-02-01

    Grounded theory is a research methodology designed to develop, through collection and analysis of data that is primarily (but not exclusively) qualitative, a well-integrated set of concepts that provide a theoretical explanation of a social phenomenon. This paper aims to provide an introduction to key features of grounded theory methodology within the context of medical education research. In this paper we include a discussion of the origins of grounded theory, a description of key methodological processes, a comment on pitfalls encountered commonly in the application of grounded theory research, and a summary of the strengths of grounded theory methodology with illustrations from the medical education domain. The significant strengths of grounded theory that have resulted in its enduring prominence in qualitative research include its clearly articulated analytical process and its emphasis on the generation of pragmatic theory that is grounded in the data of experience. When applied properly and thoughtfully, grounded theory can address research questions of significant relevance to the domain of medical education.

  10. Establishing the common ground in European psychotraumatology

    OpenAIRE

    Şar, Vedat

    2015-01-01

    INVITED EDITORIAL Establishing the common ground in European psychotraumatology The chief ethical rule is the following: thou shalt not have antifragility at the expense of the fragility of others. (Taleb, 2012) Europe is nicely complex; that is, rich and full of diversity. Lessons learned from the painful past are immense (Betancourt, 2015) together with a healthy anxiety about the future. One may perceive Europe as the most prosperous, peaceful, and safest par...

  11. Discretion versus rules in fiscal and monetary policies

    Directory of Open Access Journals (Sweden)

    T.G. Savchenko

    2015-09-01

    Full Text Available The article studies the dilemma of the application of discretionary measures and rules (automatic mechanisms) in economic policy. On the basis of international experience, the author concludes that fiscal and monetary rules are actively used in the practice of state regulation of economic processes in foreign countries. Ukrainian fiscal and monetary policies are based on the application of discretionary measures, which reduces predictability and is not conducive to building trust on the part of economic agents. The paper presents a methodological approach to the analysis of the contradictions in economic policy; such contradictions underlie determinants in economic policy development. On the basis of this approach, the author proves the necessity of explicit monetary rules for the National Bank of Ukraine and the need to improve the effectiveness of the fiscal (budgetary) rules in Ukraine.
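
As a concrete example of an "explicit monetary rule" of the kind the article advocates, the classic Taylor (1993) rule sets the policy rate from inflation and the output gap (the 0.5 coefficients are Taylor's original illustrative values; this example is not drawn from the article itself):

```python
def taylor_rule(r_star, pi, pi_target, output_gap, a_pi=0.5, a_y=0.5):
    """Taylor (1993) rule for the nominal policy rate (all in percent):
    i = r* + pi + a_pi * (pi - pi*) + a_y * output_gap,
    where r* is the equilibrium real rate and pi* the inflation target."""
    return r_star + pi + a_pi * (pi - pi_target) + a_y * output_gap

# 2% real rate, 3% inflation vs a 2% target, +1% output gap -> 6% policy rate
print(taylor_rule(2.0, 3.0, 2.0, 1.0))
```

The appeal of such a rule for credibility is precisely that economic agents can reproduce the calculation themselves, unlike a discretionary decision.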

  12. Case Study Research Methodology in Nursing Research.

    Science.gov (United States)

    Cope, Diane G

    2015-11-01

    Through data collection methods using a holistic approach that focuses on variables in a natural setting, qualitative research methods seek to understand participants' perceptions and interpretations. Common qualitative research methods include ethnography, phenomenology, grounded theory, and historic research. Another type of methodology that has a similar qualitative approach is case study research, which seeks to understand a phenomenon or case from multiple perspectives within a given real-world context.

  13. Grounded theory and leadership research: A critical realist perspective

    OpenAIRE

    2011-01-01

    The methodology of grounded theory has great potential to contribute to our understanding of leadership within particular substantive contexts. However, our notions of good science might constrain these contributions. We argue that for grounded theorists a tension might exist between a desire to create a contextualised theory of leadership and a desire for scientifically justified issues of validity and generalizable theory. We also explore how the outcome of grounded theory research can crea...

  14. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have been mostly adopting a software methodology, such as a waterfall or Rational Unified Process, as the framework for their development. These methodologies could work on structural, procedural, or object-oriented applications, but fail to capture…

  15. Creativity in phenomenological methodology

    DEFF Research Database (Denmark)

    Dreyer, Pia; Martinsen, Bente; Norlyk, Annelise

    2014-01-01

    on the methodologies of van Manen, Dahlberg, Lindseth & Norberg, the aim of this paper is to argue that the increased focus on creativity and arts in research methodology is valuable to gain a deeper insight into lived experiences. We illustrate this point through examples from empirical nursing studies, and discuss...

  16. The Methodology of Magpies

    Science.gov (United States)

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  17. Menopause and Methodological Doubt

    Science.gov (United States)

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  19. VEM: Virtual Enterprise Methodology

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a virtual enterprise methodology (VEM) that outlines activities to consider when setting up and managing virtual enterprises (VEs). As a methodology the VEM helps companies to ask the right questions when preparing for and setting up an enterprise network, which works...

  2. Rapid Dialogue Prototyping Methodology

    NARCIS (Netherlands)

    Bui Huu Trung, B.H.T.; Sojka, P.; Rajman, M.; Kopecek, I.; Melichar, M.; Pala, K.

    2004-01-01

    This paper is about the automated production of dialogue models. The goal is to propose and validate a methodology that allows the production of finalized dialogue models (i.e. dialogue models specific for given applications) in a few hours. The solution we propose for such a methodology, called the

  3. Description logic rules

    CERN Document Server

    Krötzsch, M

    2010-01-01

    Ontological modelling today is applied in many areas of science and technology, including the Semantic Web. The W3C standard OWL defines one of the most important ontology languages based on the semantics of description logics. An alternative is to use rule languages in knowledge modelling, as proposed in the W3C's RIF standard. So far, it has often been unclear how to combine both technologies without sacrificing essential computational properties. This book explains this problem and presents new solutions that have recently been proposed. Extensive introductory chapters provide the necessary

  4. Service dogs. Final rule.

    Science.gov (United States)

    2012-09-05

    The Department of Veterans Affairs (VA) amends its regulations concerning veterans in need of service dogs. Under this final rule, VA will provide to veterans with visual, hearing, or mobility impairments benefits to support the use of a service dog as part of the management of such impairments. The benefits include assistance with veterinary care, travel benefits associated with obtaining and training a dog, and the provision, maintenance, and replacement of hardware required for the dog to perform the tasks necessary to assist such veterans.

  5. Web Personalization of Indian e-Commerce Websites using Classification Methodologies

    Directory of Open Access Journals (Sweden)

    Agarwal Devendera

    2010-11-01

    Full Text Available The paper highlights classification methodologies using the Bayesian rule for Indian e-commerce websites. It deals with generating clusters of users having fraudulent intentions. Secondly, it also focuses on Bayesian ontology requirements for efficient possibilistic outcomes.
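
A minimal naive Bayes sketch of "classification using the Bayesian rule" for flagging a fraud-prone cluster. The binary features and labels are toy data; the paper's actual features and ontology are not reproduced here.

```python
from collections import Counter, defaultdict

def train_nb(samples, labels):
    """Count class priors and per-feature value counts for naive Bayes."""
    prior = Counter(labels)
    cond = defaultdict(Counter)  # (class, feature_index) -> value counts
    for x, y in zip(samples, labels):
        for i, v in enumerate(x):
            cond[(y, i)][v] += 1
    return prior, cond, len(labels)

def predict(model, x):
    """Pick the class maximizing P(class) * prod_i P(x_i | class),
    with add-one (Laplace) smoothing for binary features."""
    prior, cond, n = model
    def score(c):
        s = prior[c] / n
        for i, v in enumerate(x):
            s *= (cond[(c, i)][v] + 1) / (prior[c] + 2)
        return s
    return max(prior, key=score)

X = [(1, 1), (1, 0), (0, 0), (0, 1)]
y = ["fraud", "fraud", "ok", "ok"]
model = train_nb(X, y)
print(predict(model, (1, 1)))  # -> 'fraud'
```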

  6. [Thoughts regarding researchers utilizing Grounded Theory].

    Science.gov (United States)

    Leite, Joséte Luzia; da Silva, Laura Johanson; de Oliveira, Rosane Mara Pontes; Stipp, Marluci Andrade Conceição

    2012-06-01

    This descriptive-reflexive study was performed with the objective to present the characteristics of researchers who use the Grounded Theory method, and outline the development of aptitudes for the researcher to become a Grounded Theoretician. The theoretical discussion was based on the frameworks of this methodology and supported by the literature. The article presents the main demands of qualitative studies using Grounded Theory, and important behaviors, attitudes and characteristics developed by the researchers. It is concluded that learning about Grounded Theory involves more than operationalizing a group of procedures and techniques. It also involves facing challenges to change one's attitude as a researcher and develop new ways of thinking and researching, gathering knowledge based on data to form a theory.

  7. Lean methodology: an evidence-based practice approach for healthcare improvement.

    Science.gov (United States)

    Johnson, Pauline M; Patterson, Claire J; OʼConnell, Mary P

    2013-12-10

    Lean methodology, an evidence-based practice approach adopted from Toyota, is grounded on the pillars of respect for people and continuous improvement. This article describes the use of Lean methodology to improve healthcare outcomes for patients with community-acquired pneumonia. Nurse practitioners and other clinicians should be knowledgeable about this methodology and become leaders in Lean transformation.

  8. Evolution of Decision Rules Used for IT Portfolio Management: An Inductive Approach

    Science.gov (United States)

    Karhade, Prasanna P.; Shaw, Michael J.; Subramanyam, Ramanath

    IT portfolio management and the related planning decisions for IT-dependent initiatives are critical to organizational performance. Building on the logic of appropriateness theoretical framework, we define an important characteristic of decision rules used during IT portfolio planning; rule appropriateness with regards to the risk-taking criterion. We propose that rule appropriateness will be an important factor explaining the evolution of rules over time. Using an inductive learning methodology, we analyze a unique dataset of actual IT portfolio planning decisions spanning two consecutive years within one organization. We present systematic comparative analysis of the evolution of rules used in planning over two years to validate our research proposition. We find that rules that were inappropriate in the first year are being redefined to design appropriate rules for use in the second year. Our work provides empirical evidence demonstrating organizational learning and improvements in IT portfolio planning capabilities.

  9. How to do a grounded theory study: a worked example of a study of dental practices

    Directory of Open Access Journals (Sweden)

    Evans R

    2011-09-01

    Full Text Available Abstract Background Qualitative methodologies are increasingly popular in medical research. Grounded theory is the methodology most-often cited by authors of qualitative studies in medicine, but it has been suggested that many 'grounded theory' studies are not concordant with the methodology. In this paper we provide a worked example of a grounded theory project. Our aim is to provide a model for practice, to connect medical researchers with a useful methodology, and to increase the quality of 'grounded theory' research published in the medical literature. Methods We documented a worked example of using grounded theory methodology in practice. Results We describe our sampling, data collection, data analysis and interpretation. We explain how these steps were consistent with grounded theory methodology, and show how they related to one another. Grounded theory methodology assisted us to develop a detailed model of the process of adapting preventive protocols into dental practice, and to analyse variation in this process in different dental practices. Conclusions By employing grounded theory methodology rigorously, medical researchers can better design and justify their methods, and produce high-quality findings that will be more useful to patients, professionals and the research community.

  10. The Product and Quotient Rules Revisited

    Science.gov (United States)

    Eggleton, Roger; Kustov, Vladimir

    2011-01-01

    Mathematical elegance is illustrated by strikingly parallel versions of the product and quotient rules of basic calculus, with some applications. Corresponding rules for second derivatives are given: the product rule is familiar, but the quotient rule is less so.
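
    For reference, the parallel first- and second-derivative forms of the two rules (standard calculus results, not quoted from the paper) are:

    ```latex
    (fg)'  = f'g + fg', \qquad
    \left(\frac{f}{g}\right)' = \frac{f'g - fg'}{g^{2}},
    \\[4pt]
    (fg)'' = f''g + 2f'g' + fg'', \qquad
    \left(\frac{f}{g}\right)'' = \frac{f''g - fg''}{g^{2}}
      - \frac{2g'}{g}\left(\frac{f}{g}\right)'.
    ```

    The second-derivative quotient form follows by differentiating the first-derivative form and regrouping, which is why it is "less familiar" than its product-rule counterpart.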

  11. Analysis of the tensor-tensor type scalar tetraquark states with QCD sum rules

    CERN Document Server

    Wang, Zhi-Gang

    2016-01-01

    In this article, we study the ground states and the first radial excited states of the tensor-tensor type scalar hidden-charm tetraquark states with the QCD sum rules. We separate the ground state contributions from the first radial excited state contributions unambiguously, and obtain the QCD sum rules for the ground states and the first radial excited states, respectively. Then we search for the Borel parameters and continuum threshold parameters according to four criteria and obtain the masses of the tensor-tensor type scalar hidden-charm tetraquark states, which can be confronted with experimental data in the future.

  12. ONTOLOGY BASED DATA MINING METHODOLOGY FOR DISCRIMINATION PREVENTION

    Directory of Open Access Journals (Sweden)

    Nandana Nagabhushana

    2014-09-01

    Full Text Available Data mining is being increasingly used to automate decision-making processes, which involve extraction and discovery of information hidden in large volumes of collected data. Nonetheless, negative perceptions such as privacy invasion and potential discrimination act as hindrances to the use of data mining methodologies in software systems employing automated decision making. Loan granting, employment, insurance premium calculation, admissions in educational institutions, etc., can make use of data mining to effectively prevent human biases pertaining to certain attributes like gender, nationality, or race in critical decision making. The proposed methodology prevents discriminatory rules that would otherwise ensue from the presence of certain sensitive discriminatory attributes in the data itself. Two novel aspects of the proposal are, first, a rule-mining technique based on ontologies, and second, the generalization and transformation of mined rules deemed discriminatory into non-discriminatory ones.

  13. Doing Formal Grounded Theory: A review

    Directory of Open Access Journals (Sweden)

    Tom Andrews PhD

    2007-06-01

    Full Text Available This is the latest in a family of Grounded Theory books by Glaser that continue to build on previous work and make the methodology much more explicit. Its purpose is quite simply to provide Grounded Theory researchers with a set of procedures that can be followed to generate a Formal Grounded Theory (FGT. Despite several chapters in previous books that deal with generating formal grounded theory, it has been given scant attention by researchers, and this book aims to reverse this. It brings together and synthesises these previous writings in one book and seeks to specify much more clearly what is meant by a formal grounded theory. As with other more recent books by Glaser, this one is based on data, in that the procedures outlined come from previously generated formal grounded theories. However, Glaser cautions that this is based on limited data, since not many FGTs exist yet; as more are generated, the method will become more explicit. The book has been eagerly anticipated by grounded theorists and it does not disappoint.

  14. A Belief Rule-Based Expert System to Diagnose Influenza

    DEFF Research Database (Denmark)

    Hossain, Mohammad Shahadat; Khalid, Md. Saifuddin; Akter, Shamima

    2014-01-01

    This paper presents the design, development, and application of an expert system to diagnose influenza under uncertainty. The recently developed generic belief rule-based inference methodology using the evidential reasoning (RIMER) approach is employed to develop this expert system, termed the Belief Rule Based Expert System (BRBES). The RIMER approach can handle different types of uncertainties, both in knowledge representation and in inference procedures. The knowledge base of this system was constructed using records of real patient data along with consultation with the influenza specialists of Bangladesh. Practical case studies were used to validate the BRBES. The results generated by the system are more accurate and reliable than those of the manual system.

  15. Empirically derived neighbourhood rules for urban land-use modelling

    DEFF Research Database (Denmark)

    Hansen, Henning Sten

    2012-01-01

    Interaction between neighbouring land uses is an important component in urban cellular automata. Nevertheless, this component is often calibrated through trial-and-error estimation. The aim of this project has been to develop an empirically derived landscape metric supporting cellular-automata-based land-use modelling. Through access to very detailed urban land-use data it has been possible to derive neighbourhood rules empirically, and to test their sensitivity to the land-use classification applied, the regional variability of the rules, and their time variance. The developed methodology can be implemented...
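
    One common way to derive such neighbourhood rules empirically is an enrichment statistic over a land-use grid: how over-represented class j is among the neighbours of class i, relative to j's global share. The grid, class labels, and 4-neighbour choice below are illustrative assumptions; the paper's actual metric may differ:

    ```python
    from collections import Counter

    def neighbourhood_enrichment(grid):
        """Enrichment factor F[i][j]: share of class j among the 4-neighbours
        of cells of class i, divided by j's global share.  Values > 1 suggest
        attraction between the two land uses, values < 1 repulsion."""
        rows, cols = len(grid), len(grid[0])
        global_counts = Counter(c for row in grid for c in row)
        total = rows * cols
        pair_counts = {}  # class i -> Counter of classes seen in its neighbourhood
        for r in range(rows):
            for c in range(cols):
                nbrs = pair_counts.setdefault(grid[r][c], Counter())
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        nbrs[grid[rr][cc]] += 1
        return {i: {j: (nbrs[j] / sum(nbrs.values())) / (global_counts[j] / total)
                    for j in global_counts}
                for i, nbrs in pair_counts.items()}

    # Toy map: residential (R) clusters next to commercial (C), away from industry (I).
    grid = [
        list("RRCC"),
        list("RRCC"),
        list("IIII"),
    ]
    F = neighbourhood_enrichment(grid)
    print(F["R"]["R"], F["R"]["I"])  # R clusters with itself, avoids I
    ```

    Distance-banded versions of the same statistic are what typically feed the attraction/repulsion curves used in cellular-automata transition rules.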

  16. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    ABSTRACT Design Methodology shall be seen as our understanding of how to design; it is an early (emerging in the late 60ies) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly forced by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers.

  17. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  18. Unanimity rule on networks

    Science.gov (United States)

    Lambiotte, Renaud; Thurner, Stefan; Hanel, Rudolf

    2007-10-01

    We present a model for innovation, evolution, and opinion dynamics whose spreading is dictated by a unanimity rule. The underlying structure is a directed network; the state of a node is either activated or inactivated. An inactivated node will change only if all of its incoming links come from nodes that are activated, while an activated node will remain activated forever. It is shown that a transition takes place depending on the initial condition of the problem. In particular, a critical number of initially activated nodes is necessary for the whole system to get activated in the long-time limit. The influence of the degree distribution of the nodes is naturally taken into account. For simple network topologies we solve the model analytically; the cases of random and small-world networks are studied in detail. Applications to food-chain dynamics and viral marketing are discussed.
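
    The dynamics described above are simple to simulate. The sketch below assumes that nodes with no incoming links never self-activate, which is one reading of the rule; the example graph is hypothetical:

    ```python
    def unanimity_dynamics(in_neighbors, seeds):
        """Iterate the unanimity rule to its fixed point.

        in_neighbors: dict mapping each node to the set of nodes with links
        pointing *to* it.  seeds: initially activated nodes.  Activation is
        irreversible; an inactivated node activates only when *all* of its
        incoming links come from activated nodes.  Returns the final set.
        """
        active = set(seeds)
        changed = True
        while changed:
            changed = False
            for node, preds in in_neighbors.items():
                # Assumption: nodes without incoming links stay inactive
                # unless seeded (hence the `preds` check).
                if node not in active and preds and preds <= active:
                    active.add(node)
                    changed = True
        return active

    # A small directed chain with one branch: 0 -> 1 -> 2, and 3 -> 2.
    graph = {0: set(), 1: {0}, 2: {1, 3}, 3: set()}
    print(unanimity_dynamics(graph, {0}))     # node 2 never activates: 3 stays off
    print(unanimity_dynamics(graph, {0, 3}))  # full activation
    ```

    The two runs illustrate the critical-mass effect from the abstract: seeding {0} alone stalls at {0, 1}, while seeding {0, 3} activates the whole system.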

  19. Maximizing Health or Sufficient Capability in Economic Evaluation? A Methodological Experiment of Treatment for Drug Addiction.

    Science.gov (United States)

    Goranitis, Ilias; Coast, Joanna; Day, Ed; Copello, Alex; Freemantle, Nick; Frew, Emma

    2016-11-17

    Conventional practice within the United Kingdom and beyond is to conduct economic evaluations with "health" as evaluative space and "health maximization" as the decision-making rule. However, there is increasing recognition that this evaluative framework may not always be appropriate, and this is particularly the case within public health and social care contexts. This article presents a methodological case study designed to explore the impact of changing the evaluative space within an economic evaluation from health to capability well-being and the decision-making rule from health maximization to the maximization of sufficient capability. Capability well-being is an evaluative space grounded on Amartya Sen's capability approach and assesses well-being based on individuals' ability to do and be the things they value in life. Sufficient capability is an egalitarian approach to decision making that aims to ensure everyone in society achieves a normatively sufficient level of capability well-being. The case study is treatment for drug addiction, and the cost-effectiveness of 2 psychological interventions relative to usual care is assessed using data from a pilot trial. Analyses are undertaken from a health care and a government perspective. For the purpose of the study, quality-adjusted life years (measured using the EQ-5D-5L) and years of full capability equivalent and years of sufficient capability equivalent (both measured using the ICECAP-A [ICEpop CAPability measure for Adults]) are estimated. The study concludes that different evaluative spaces and decision-making rules have the potential to offer opposing treatment recommendations. The implications for policy makers are discussed.
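
    The contrast between the two decision rules can be made concrete with a toy calculation. The 0.8 sufficiency threshold, the cohort numbers, and the simple capping formula below are hypothetical illustrations of the idea, not the article's actual ICECAP-A-based method:

    ```python
    SUFFICIENCY = 0.8  # hypothetical sufficiency threshold on a 0-1 capability scale

    def total_health(patients):
        """Health-maximization objective: sum of QALY gains."""
        return sum(p["qaly_gain"] for p in patients)

    def sufficient_capability(patients):
        """Sufficient-capability objective: credit capability gains only up to
        the sufficiency threshold, so improvements for those below the
        threshold count and improvements above it do not."""
        total = 0.0
        for p in patients:
            before = min(p["capability_before"], SUFFICIENCY)
            after = min(p["capability_before"] + p["capability_gain"], SUFFICIENCY)
            total += (after - before) / SUFFICIENCY
        return total

    # Hypothetical cohorts: treatment A gives a slightly larger QALY gain to the
    # already well-off; treatment B lifts people from far below sufficiency.
    treatment_a = [{"qaly_gain": 0.30, "capability_before": 0.75, "capability_gain": 0.20}]
    treatment_b = [{"qaly_gain": 0.25, "capability_before": 0.40, "capability_gain": 0.30}]

    print(total_health(treatment_a) > total_health(treatment_b))                     # A wins on QALYs
    print(sufficient_capability(treatment_a) < sufficient_capability(treatment_b))   # B wins on sufficiency
    ```

    The flipped rankings mirror the study's conclusion that different evaluative spaces and decision rules can recommend opposing treatments.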

  20. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much-needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations are based on best currently available evidence. Our concerns fall into two main categories: the rigor of development, including the methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest.

  1. The biosphere rules.

    Science.gov (United States)

    Unruh, Gregory C

    2008-02-01

    Sustainability, defined by natural scientists as the capacity of healthy ecosystems to function indefinitely, has become a clarion call for business. Leading companies have taken high-profile steps toward achieving it: Wal-Mart, for example, with its efforts to reduce packaging waste, and Nike, which has removed toxic chemicals from its shoes. But, says Unruh, the director of Thunderbird's Lincoln Center for Ethics in Global Management, sustainability is more than an endless journey of incremental steps. It is a destination, for which the biosphere of planet Earth--refined through billions of years of trial and error--is a perfect model. Unruh distills some lessons from the biosphere into three rules: Use a parsimonious palette. Managers can rethink their sourcing strategies and dramatically simplify the number and types of materials their companies use in production, making recycling cost-effective. After the furniture manufacturer Herman Miller discovered that its leading desk chair had 200 components made from more than 800 chemical compounds, it designed an award-winning successor whose far more limited materials palette is 96% recyclable. Cycle up, virtuously. Manufacturers should design recovery value into their products at the outset. Shaw Industries, for example, recycles the nylon fiber from its worn-out carpet into brand-new carpet tile. Exploit the power of platforms. Platform design in industry tends to occur at the component level--but the materials in those components constitute a more fundamental platform. Patagonia, by recycling Capilene brand performance underwear, has achieved energy costs 76% below those for virgin sourcing. Biosphere rules can teach companies how to build ecologically friendly products that both reduce manufacturing costs and prove highly attractive to consumers. And managers need not wait for a green technological revolution to implement them.

  2. Health as expanding consciousness: a nursing perspective for grounded theory research.

    Science.gov (United States)

    Brown, Janet Witucki

    2011-07-01

    Margaret Newman's theory of health as expanding consciousness provides an excellent nursing perspective for nursing grounded theory research studies. Application of this nursing theory to grounded theory research provides a unitary-transformative paradigm perspective to the sociological underpinnings of grounded theory methodology. The fit between this particular nursing theory and grounded theory methodology is apparent when purpose, timing, process, and health outcomes of the two are compared. In this column, the theory of health as expanding consciousness is described and the theory's research as praxis methodology is compared to grounded theory methodology. This is followed by a description of how the theory of health as expanding consciousness can be utilized as a perspective for nursing grounded theory research.

  3. First-order formative rules

    OpenAIRE

    Fuhs, Carsten; Kop, C.

    2014-01-01

    This paper discusses the method of formative rules for first-order term rewriting, which was previously defined for a higher-order setting. Dual to the well-known usable rules, formative rules allow dropping some of the term constraints that need to be solved during a termination proof. Compared to the higher-order definition, the first-order setting allows for significant improvements of the technique.

  4. Remarks on kernel Bayes' rule

    OpenAIRE

    Johno, Hisashi; Nakamoto, Kazunori; Saigo, Tatsuhiko

    2015-01-01

    Kernel Bayes' rule has been proposed as a nonparametric kernel-based method to realize Bayesian inference in reproducing kernel Hilbert spaces. However, we demonstrate both theoretically and experimentally that the prediction result by kernel Bayes' rule is in some cases unnatural. We consider that this phenomenon is in part due to the fact that the assumptions in kernel Bayes' rule do not hold in general.

  5. Rule-Based Semantic Sensing

    CERN Document Server

    Woznowski, Przemyslaw

    2011-01-01

    Rule-Based Systems have been in use for decades to solve a variety of problems but not in the sensor informatics domain. Rules aid the aggregation of low-level sensor readings to form a more complete picture of the real world and help to address 10 identified challenges for sensor network middleware. This paper presents the reader with an overview of a system architecture and a pilot application to demonstrate the usefulness of a system integrating rules with sensor middleware.

  6. Admissibility of logical inference rules

    CERN Document Server

    Rybakov, VV

    1997-01-01

    The aim of this book is to present the fundamental theoretical results concerning inference rules in deductive formal systems. Primary attention is focused on: admissible or permissible inference rules the derivability of the admissible inference rules the structural completeness of logics the bases for admissible and valid inference rules. There is particular emphasis on propositional non-standard logics (primary, superintuitionistic and modal logics) but general logical consequence relations and classical first-order theories are also considered. The book is basically self-contained and

  7. Fluctuations in classical sum rules.

    Science.gov (United States)

    Elton, John R; Lakshminarayan, Arul; Tomsovic, Steven

    2010-10-01

    Classical sum rules arise in a wide variety of physical contexts. Asymptotic expressions have been derived for many of these sum rules in the limit of long orbital period (or large action). Although sum-rule convergence may well be exponentially rapid for chaotic systems in a global phase-space sense with time, individual contributions to the sums may fluctuate with a width which diverges in time. Our interest is in the global convergence of sum rules as well as their local fluctuations. It turns out that a simple version of a lazy baker map gives an ideal system in which classical sum rules, their corrections, and their fluctuations can be worked out analytically. This is worked out in detail for the Hannay-Ozorio sum rule. In this particular case the rate of convergence of the sum rule is found to be governed by the Pollicott-Ruelle resonances, and both local and global boundaries for which the sum rule may converge are given. In addition, the width of the fluctuations is considered and worked out analytically, and it is shown to have an interesting dependence on the location of the region over which the sum rule is applied. It is also found that as the region of application is decreased in size the fluctuations grow. This suggests a way of controlling the length scale of the fluctuations by considering a time dependent phase-space volume, which for the lazy baker map decreases exponentially rapidly with time.

  8. Taking-On: A Grounded Theory of Addressing Barriers in Task Completion

    Science.gov (United States)

    Austinson, Julie Ann

    2011-01-01

    This study of taking-on was conducted using classical grounded theory methodology (Glaser, 1978, 1992, 1998, 2001, 2005; Glaser & Strauss, 1967). Classical grounded theory is inductive, empirical, and naturalistic; it does not utilize manipulation or constrained time frames. Classical grounded theory is a systemic research method used to generate…

  9. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
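
    The core combination step, weighting outdoor dose by where people are and how much protection each building type provides, can be sketched as follows. The protection factors and population fractions are illustrative assumptions, not values from the report:

    ```python
    def mean_protected_dose(outdoor_dose, population_mix):
        """Population-weighted dose given shelter protection factors (PF).

        Dose indoors = outdoor dose / PF.  population_mix maps a location type
        to (fraction of population, protection factor); fractions must sum to 1.
        All numbers used with this function here are illustrative.
        """
        assert abs(sum(f for f, _ in population_mix.values()) - 1.0) < 1e-9
        return sum(frac * outdoor_dose / pf for frac, pf in population_mix.values())

    # Hypothetical daytime posture: most people inside homes or offices.
    daytime = {
        "outdoors":        (0.10, 1.0),
        "wood-frame home": (0.40, 3.0),
        "office building": (0.50, 10.0),
    }
    print(mean_protected_dose(100.0, daytime))  # well below the outdoor dose of 100
    ```

    Separate mixes for night versus day, or warned versus unwarned postures, would capture the time-of-day and population-posture factors the methodology considers.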

  10. Clarification of the Blurred Boundaries between Grounded Theory and Ethnography: Differences and Similarities

    Science.gov (United States)

    Aldiabat, Khaldoun; Le Navenec, Carol-Lynne

    2011-01-01

    There is confusion among graduate students about how to select the qualitative methodology that best fits their research question. Often this confusion arises in regard to making a choice between a grounded theory methodology and an ethnographic methodology. This difficulty may stem from the fact that these students do not have a clear…

  11. Computational Grounded Cognition: A New Alliance between Grounded Cognition and Computational Modeling

    Directory of Open Access Journals (Sweden)

    Giovanni ePezzulo

    2013-01-01

    Full Text Available Grounded theories assume that there is no central module for cognition. According to this view, all cognitive phenomena, including those considered the province of amodal cognition such as reasoning, numeric and language processing, are ultimately grounded in (and emerge from) a variety of bodily, affective, perceptual and motor processes. The development and expression of cognition is constrained by the embodiment of cognitive agents and various contextual factors (physical and social) in which they are immersed. The grounded framework has received numerous empirical confirmations. Still, there are very few explicit computational models that implement grounding in sensory, motor and affective processes as intrinsic to cognition, and demonstrate that grounded theories can mechanistically implement higher cognitive abilities. We propose a new alliance between grounded cognition and computational modeling towards a novel multidisciplinary enterprise: Computational Grounded Cognition. We clarify the defining features of this novel approach and emphasize the importance of using the methodology of Cognitive Robotics, which permits simultaneous consideration of multiple aspects of grounding, embodiment, and situatedness, showing how they constrain the development and expression of cognition.

  12. Tutorial on Modeling VAT Rules Using OWL-DL

    DEFF Research Database (Denmark)

    Nielsen, Morten Ib; Simonsen, Jakob Grue; Larsen, Ken Friis

    This paper reports on work in progress. We present a methodology for constructing an OWL-DL model of a subset of Danish VAT rules. It is our intention that domain experts without training in formal modeling or computer science should be able to create and maintain the model using our methodology. In an ERP setting such a model could reduce the Total Cost of Ownership (TCO) and increase the quality of the system. We have selected OWL-DL because we believe that description logic is suited for modeling VAT rules, due to the decidability of important inference problems that are key to the way we plan to use the model, and because OWL-DL is relatively intuitive to use.

  13. Airport Ground Staff Scheduling

    DEFF Research Database (Denmark)

    Clausen, Tommy

    travels safely and efficiently through the airport. When an aircraft lands, a significant number of tasks must be performed by different groups of ground crew, such as fueling, baggage handling and cleaning, as well as check-in and security services. These tasks must be completed before the aircraft is able to depart. These tasks are collectively known as ground handling, and are the major source of activity within airports. The business environments of modern airports are becoming increasingly competitive, as both airports themselves and their ground handling operations are changing to private ownership. As airports are in competition to attract airline routes, efficient and reliable ground handling operations are imperative for the viability and continued growth of both airports and airlines. The increasing liberalization of the ground handling market prompts ground handling operators...

  14. A Functional HAZOP Methodology

    DEFF Research Database (Denmark)

    Liin, Netta; Lind, Morten; Jensen, Niels

    2010-01-01

    A HAZOP methodology is presented where a functional plant model assists in a goal-oriented decomposition of the plant purpose into the means of achieving the purpose. This approach leads to nodes with simple functions, from which the selection of process and deviation variables follows directly. The functional HAZOP methodology lends itself directly to implementation in a computer-aided reasoning tool to perform root cause and consequence analysis. Such a tool can facilitate finding causes and/or consequences far away from the site of the deviation. A functional HAZOP assistant is proposed and investigated in a HAZOP study of an industrial scale Indirect Vapor Recompression Distillation pilot Plant (IVaRDiP) at DTU-Chemical and Biochemical Engineering. The study shows that the functional HAZOP methodology provides a very efficient paradigm for facilitating HAZOP studies and for enabling reasoning...

  15. Generalisation of the Ostwald's rule

    Science.gov (United States)

    Anisimov, M. P.; Petrova-Bogdanova, O. O.

    2013-05-01

    More than a century ago, Ostwald (1897) [1] formulated a rule which, by his assumption, was a general principle for any process in nature. Named by Ostwald the rule of stages, it gives the formation sequence of different phases. Schmelzer et al. [2] generalized this rule of stages for nucleation. In the present research the dynamics of phase transformations is considered using a recent idea (Anisimov et al. [3]) for the semiempirical design of nucleation rate surfaces. The current generalization of the Ostwald rule for the formation sequences of phases is supported by common results of nucleation studies. It is natural to think that any new phase formation (not prohibited by thermodynamic constraints) can be realized for each channel of phase transformation, even if the probability of some kinds of new phases is low. The relative efficiency of each channel can change dramatically when the parameters of the process are varied.

  16. Ground Vehicle Robotics

    Science.gov (United States)

    2013-08-20

    Ground Vehicle Robotics. Jim Parker, Associate Director, Ground Vehicle Robotics. UNCLASSIFIED: Distribution Statement A, approved for public release. Briefing charts dated 20 August 2013, covering 9 May 2013 to 15 August 2013. Topics include risk-tolerant technology development, user evaluation, contested environments, operational data, and applied robotics for installation and base operations.

  17. Changing methodologies in TESOL

    CERN Document Server

    Spiro, Jane

    2013-01-01

    Covering core topics from vocabulary and grammar to teaching writing, speaking and listening, this textbook shows you how to link research to practice in TESOL methodology. It emphasises how current understandings have impacted on the language classroom worldwide and investigates the meaning of 'methods' and 'methodology' and the importance of these for the teacher, as well as the underlying assumptions and beliefs teachers bring to bear in their practice. By introducing you to language teaching approaches, you will explore the way these are influenced by developments in our understanding of language.

  18. Methodology for research I.

    Science.gov (United States)

    Garg, Rakesh

    2016-09-01

    The conduct of research requires a systematic approach involving diligent planning and its execution as planned. It comprises various essential predefined components such as aims, population, conduct/technique, outcome and statistical considerations. These need to be objective, reliable and in a repeatable format. Hence, the understanding of the basic aspects of methodology is essential for any researcher. This is a narrative review and focuses on various aspects of the methodology for conduct of a clinical research. The relevant keywords were used for literature search from various databases and from bibliographies of the articles.

  19. Intel design for manufacturing and evolution of design rules

    Science.gov (United States)

    Webb, Clair

    2008-03-01

    The difficult issues in continuing Moore's law in the absence of improvement in lithography resolution are well known [1, 2, 3]. Design rules have to change and DFM methodology has to continue to improve to enable Moore's law scaling. This paper discusses our approach to DFM through co-optimization across design and process. The poly layer is used to show how rules have changed to meet patterning requirements and how co-optimization has been used to define the poly design rules. With the introduction and ramp of several products on our 45nm technology, we have shown our ability to meet the goals of Moore's law scaling at high yields in volume manufacturing on a two year cycle.

  20. Review and Application of Ship Collision and Grounding Analysis Procedures

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    2010-01-01

    It is the purpose of the paper to present a review of prediction and analysis tools for collision and grounding analyses and to outline a probabilistic procedure for which these tools can be used by the maritime industry to develop performance based rules to reduce the risk associated with human...

  1. Chinese and Western Interpretations of China's "Peaceful Development" Discourse: A Rule-Oriented Constructivist Perspective

    Directory of Open Access Journals (Sweden)

    Jing Jing

    2014-12-01

    Full Text Available This paper offers a rule-oriented constructivist perspective on understanding the distinct Chinese and western interpretations of China's "peaceful development" discourse framework. It takes the correlations of discourse, rules and rule initiated by Nicolas Onuf as an analytical tool to identify the discrepancies between and within the Chinese and western patterns of discourse, rules and rule on this issue. Critical analyses of Chinese and western discourse are provided as a source for understanding the lack of trust between China and the West on China's "peaceful development". This methodology, which synthesizes the rule-oriented constructivist perspective and concrete discourse analysis, is an innovative attempt to complement the conventional positivist perspective on this issue.

  2. Reduced rule base self-tuning fuzzy PI controller for TCSC

    Energy Technology Data Exchange (ETDEWEB)

    Hameed, Salman; Das, Biswarup; Pant, Vinay [Department of Electrical Engineering, Indian Institute of Technology, Roorkee, Roorkee - 247 667, Uttarakhand (India)

    2010-11-15

    In this paper, a reduced rule base self-tuning fuzzy PI controller (STFPIC) for thyristor controlled series capacitor (TCSC) is proposed. Essentially, a STFPIC consists of two fuzzy logic controllers (FLC). In this work, for each FLC, 49 rules have been used and as a result, the overall complexity of the STFPIC increases substantially. To reduce this complexity, application of singular value decomposition (SVD) based rule reduction technique is also proposed in this paper. By applying this methodology, the number of rules in each FLC has been reduced from 49 to 9. Therefore, the proposed rule base reduction technique reduces the total number of rules in the STFPIC by almost 80% (from 49 x 2 = 98 to 9 x 2 = 18), thereby reducing the complexity of the STFPIC significantly. The feasibility of the proposed algorithm has been tested on 2-area 4-machine power system and 10-machine 39-bus system through detailed digital simulation using MATLAB/SIMULINK. (author)
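
    The 49-to-9 rule reduction described above can be sketched with a rank-truncated SVD; the consequent matrix, its values, and the chosen rank below are invented for illustration and do not reproduce the paper's controller design:

    ```python
    import numpy as np

    # Hypothetical 7x7 consequent matrix of a 49-rule fuzzy PI controller:
    # rows/columns index the 7 membership functions of the two inputs
    # (error and change of error); entries are illustrative values only.
    rng = np.random.default_rng(0)
    theta = rng.standard_normal((7, 7))

    # SVD-based reduction: keep the r largest singular values. Using the
    # leading r left/right singular vectors as combined membership functions
    # leaves an r x r rule base (r = 3 gives 9 rules, as in the paper).
    U, s, Vt = np.linalg.svd(theta)
    r = 3
    theta_reduced = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

    # By the Eckart-Young theorem, the spectral-norm approximation error
    # equals the first discarded singular value, so a fast-decaying
    # spectrum justifies the reduction.
    err = np.linalg.norm(theta - theta_reduced, 2)
    print(f"rank-{r} approximation error: {err:.4f} (sigma_{r+1} = {s[r]:.4f})")
    ```
    
    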

  3. Seismic hazard methodology for the Central and Eastern United States. Volume 1: methodology. Final report

    Energy Technology Data Exchange (ETDEWEB)

    McGuire, R.K.; Veneziano, D.; Toro, G.; O' Hara, T.; Drake, L.; Patwardhan, A.; Kulkarni, R.; Kenney, R.; Winkler, R.; Coppersmith, K.

    1986-07-01

    A methodology to estimate the hazard of earthquake ground motion at a site has been developed. The methodology consists of systematic procedures to characterize earthquake sources, the seismicity parameters of those sources, and functions for the attenuation of seismic energy, incorporating multiple input interpretations by earth scientists. Uncertainties reflecting permissible alternative interpretations are quantified by use of probability logic trees and are propagated through the hazard results. The methodology is flexible and permits, for example, interpretations of seismic sources that are consistent with earth-science practice in the need to depict complexity and to accommodate alternative hypotheses. This flexibility is achieved by means of a tectonic framework interpretation from which alternative seismic sources are derived. To estimate rates of earthquake recurrence, maximum use is made of the historical earthquake database in establishing a uniform measure of earthquake size, in identifying independent events, and in determining the completeness of the earthquake record in time, space, and magnitude. Procedures developed as part of the methodology permit relaxation of the usual assumption of homogeneous seismicity within a source and provide unbiased estimates of recurrence parameters. The methodology incorporates the Poisson-exponential earthquake recurrence model and an extensive assessment of its applicability is provided. Finally, the methodology includes procedures to aggregate hazard results from a number of separate input interpretations to obtain a best-estimate value of hazard, together with its uncertainty, at a site.
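
    The final aggregation step, combining hazard results across interpretations weighted by logic-tree branch probabilities, can be sketched as follows; the hazard curves and weights are invented for illustration:

    ```python
    import numpy as np

    # Toy hazard curves: annual exceedance probability vs. peak ground
    # acceleration, one curve per input interpretation (values invented).
    pga = np.array([0.1, 0.2, 0.4, 0.8])          # g
    curves = np.array([
        [1e-2, 3e-3, 6e-4, 8e-5],                 # interpretation A
        [2e-2, 5e-3, 9e-4, 1e-4],                 # interpretation B
        [8e-3, 2e-3, 4e-4, 5e-5],                 # interpretation C
    ])
    weights = np.array([0.5, 0.3, 0.2])           # logic-tree branch weights

    # Best-estimate hazard is the weight-averaged curve; the spread of the
    # branch curves quantifies the propagated interpretation uncertainty.
    assert np.isclose(weights.sum(), 1.0)
    mean_hazard = weights @ curves
    for x, h in zip(pga, mean_hazard):
        print(f"PGA {x:.1f} g: {h:.2e}/yr")
    ```
    
    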

  4. Pollutant Ground Concentrations of Nonneutrally Buoyant Particles.

    Science.gov (United States)

    Mandel, Alon; Stern, Eli; Ullmann, Amos; Brauner, Neima

    2017-03-23

    A methodology is suggested for the estimation of the mass density and the cumulative ground deposition of a nonvolatile, nonneutrally buoyant, air pollutant (liquid or solid) released from a polluted column (following an explosion caused during routine operation in, e.g., the chemical industry or due to any kind of hostile act) and deposited on the ground via gravitational settling. In many cases, the deposited mass due to gravitational settling constitutes a significant fraction of the original inventory released from the source. Implementation of the methodology in preliminary risk assessments can serve as an efficient tool for emergency planning for both immediate and long-term measures such as evacuation and decontamination. The methodology considers, inter alia, an estimation of the critical particle diameter, particle size, and mass distributions along the polluted column. This methodology was developed to apply in rural regions since proper application of relevant meteorological input data can be accomplished mainly for such areas. © 2017 Society for Risk Analysis.
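
    The paper's critical-diameter estimation is not reproduced here, but gravitational settling of small particles is commonly approximated with Stokes' law; a minimal sketch with invented parameter values:

    ```python
    # Stokes terminal settling velocity for a small spherical particle:
    # v = (rho_p - rho_f) * g * d**2 / (18 * mu), valid at low Reynolds number.
    def stokes_velocity(d, rho_p, rho_f=1.2, mu=1.8e-5, g=9.81):
        """Settling velocity (m/s) of a sphere of diameter d (m) in air."""
        return (rho_p - rho_f) * g * d**2 / (18.0 * mu)

    # Invented example: a 50-micron droplet of density 1000 kg/m3 in air.
    v = stokes_velocity(50e-6, 1000.0)
    print(f"settling velocity: {v * 100:.2f} cm/s")
    ```
    
    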

  5. The methodological cat

    Directory of Open Access Journals (Sweden)

    Marin Dinu

    2014-03-01

    Full Text Available Economics understands action as having the connotation of here and now, the proof being that it excessively uses, for explicative purposes, two limitations of sense: space is seen as the place with a private destination (through the cognitive dissonance of methodological individualism, and time is seen as the short term (through the dystopia of rational markets.

  6. Video: Modalities and Methodologies

    Science.gov (United States)

    Hadfield, Mark; Haw, Kaye

    2012-01-01

    In this article, we set out to explore what we describe as the use of video in various modalities. For us, modality is a synthesizing construct that draws together and differentiates between the notion of "video" both as a method and as a methodology. It encompasses the use of the term video as both product and process, and as a data collection…

  7. Methodological Advances in Dea

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    textabstractWe survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools

  8. 9 CFR 11.3 - Scar rule.

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Scar rule. 11.3 Section 11.3 Animals... WELFARE HORSE PROTECTION REGULATIONS § 11.3 Scar rule. The scar rule applies to all horses born on or after October 1, 1975. Horses subject to this rule that do not meet the following scar rule...

  9. Pesticides in Ground Water

    DEFF Research Database (Denmark)

    Bjerg, Poul Løgstrup

    1996-01-01

    Review of: Jack E. Barbash & Elizabeth A. Resek (1996). Pesticides in Ground Water: Distribution Trends and Governing Factors. Ann Arbor Press, Inc., Chelsea, Michigan. 588 pp.

  11. A Survey on Speech Enhancement Methodologies

    Directory of Open Access Journals (Sweden)

    Ravi Kumar. K

    2016-12-01

    Full Text Available Speech enhancement is a technique which processes the noisy speech signal. The aim of speech enhancement is to improve the perceived quality of speech and/or to improve its intelligibility. Due to its vast applications in mobile telephony, VOIP, hearing aids, Skype and speaker recognition, the challenges in speech enhancement have grown over the years. It is especially challenging to suppress background noise that affects human communication in noisy environments like airports, road works, traffic, and cars. The objective of this survey paper is to outline the single-channel speech enhancement methodologies used for enhancing a speech signal corrupted with additive background noise, and also to discuss the challenges and opportunities of single-channel speech enhancement. This paper mainly focuses on transform domain techniques and supervised (NMF, HMM speech enhancement techniques. This paper provides a framework for developments in speech enhancement methodologies.

  12. Genetic Programming for the Generation of Crisp and Fuzzy Rule Bases in Classification and Diagnosis of Medical Data

    DEFF Research Database (Denmark)

    Dounias, George; Tsakonas, Athanasios; Jantzen, Jan

    2002-01-01

    This paper demonstrates two methodologies for the construction of rule-based systems in medical decision making. The first approach consists of a method combining genetic programming and heuristic hierarchical rule-base construction. The second model is composed by a strongly-typed genetic progra...

  13. Using Modern Methodologies with Maintenance Software

    Science.gov (United States)

    Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.

    2014-01-01

    Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either class B (mission critical) or class C (mission important). The scheduling of tasks is difficult because mission needs must be addressed prior to performing any other tasks, and those needs often spring up unexpectedly. Keeping track of the tasks that everyone is working on is also difficult because each person is working on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks that are to be completed within a sprint based on priority. The team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. In the Scrum methodology there is a scrum master, a facilitator who tries to make sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology. MPS has many software applications in maintenance, team members who are working on disparate applications, many users, and is interruptible based on mission needs, issues and requirements. In order to use Scrum, the methodology needed adaptation to MPS. Scrum was chosen because it is adaptable. This paper is about the development of the process for using Scrum, a new development methodology, with a team that works on disparate interruptible tasks on multiple software applications.

  15. Pattern Discovery Using Association Rules

    Directory of Open Access Journals (Sweden)

    Ms Kiruthika M,

    2011-12-01

    Full Text Available The explosive growth of Internet has given rise to many websites which maintain large amount of user information. To utilize this information, identifying usage pattern of users is very important. Web usage mining is one of the processes of finding out this usage pattern and has many practical applications. Our paper discusses how association rules can be used to discover patterns in web usage mining. Our discussion starts with preprocessing of the given weblog, followed by clustering them and finding association rules. These rules provide knowledge that helps to improve website design, in advertising, web personalization etc.
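
    The support/confidence rule-mining step can be illustrated with a minimal, self-contained sketch; the session data, page names, and thresholds below are invented, whereas a real web usage mining pipeline would work on a preprocessed and clustered server log:

    ```python
    from itertools import combinations
    from collections import Counter

    # Toy weblog sessions after preprocessing: the set of pages visited
    # in each user session (page names invented for illustration).
    sessions = [
        {"home", "products", "cart"},
        {"home", "products"},
        {"home", "blog"},
        {"products", "cart"},
        {"home", "products", "cart"},
    ]

    min_support, min_conf = 0.4, 0.7
    n = len(sessions)

    # Count single pages and page pairs across sessions.
    item_cnt = Counter(p for s in sessions for p in s)
    pair_cnt = Counter(frozenset(c) for s in sessions
                       for c in combinations(sorted(s), 2))

    # Emit rules A -> B whose support and confidence clear the thresholds.
    for pair, cnt in pair_cnt.items():
        if cnt / n < min_support:
            continue
        a, b = tuple(pair)
        for x, y in ((a, b), (b, a)):
            conf = cnt / item_cnt[x]
            if conf >= min_conf:
                print(f"{x} -> {y}  support={cnt / n:.2f} confidence={conf:.2f}")
    ```

    A rule such as "cart -> products" with high confidence is the kind of knowledge the paper describes using to improve site design or personalization.
    
    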

  16. Communication, concepts and grounding.

    Science.gov (United States)

    van der Velde, Frank

    2015-02-01

    This article discusses the relation between communication and conceptual grounding. In the brain, neurons, circuits and brain areas are involved in the representation of a concept, grounding it in perception and action. In terms of grounding we can distinguish between communication within the brain and communication between humans or between humans and machines. In the first form of communication, a concept is activated by sensory input. Due to grounding, the information provided by this communication is not just determined by the sensory input but also by the outgoing connection structure of the conceptual representation, which is based on previous experiences and actions. The second form of communication, that between humans or between humans and machines, is influenced by the first form. In particular, a more successful interpersonal communication might require forms of situated cognition and interaction in which the entire representations of grounded concepts are involved.

  17. Stochastic ground motion simulation

    Science.gov (United States)

    Rezaeian, Sanaz; Xiaodan, Sun; Beer, Michael; Kougioumtzoglou, Ioannis A.; Patelli, Edoardo; Siu-Kui Au, Ivan

    2014-01-01

    Strong earthquake ground motion records are fundamental in engineering applications. Ground motion time series are used in response-history dynamic analysis of structural or geotechnical systems. In such analysis, the validity of predicted responses depends on the validity of the input excitations. Ground motion records are also used to develop ground motion prediction equations (GMPEs) for intensity measures such as spectral accelerations that are used in response-spectrum dynamic analysis. Despite the thousands of available strong ground motion records, there remains a shortage of records for large-magnitude earthquakes at short distances or in specific regions, as well as records that sample specific combinations of source, path, and site characteristics.

  18. Ground energy coupling

    Science.gov (United States)

    Metz, P. D.

    The feasibility of ground coupling for various heat pump systems was investigated. Analytical heat flow models were developed to approximate design ground coupling devices for use in solar heat pump space conditioning systems. A digital computer program called GROCS (GRound Coupled Systems) was written to model 3-dimensional underground heat flow in order to simulate the behavior of ground coupling experiments and to provide performance predictions which have been compared to experimental results. GROCS also has been integrated with TRNSYS. Soil thermal property and ground coupling device experiments are described. Buried tanks, serpentine earth coils in various configurations, lengths and depths, and sealed vertical wells are being investigated. An earth coil used to heat a house without use of resistance heating is described.

  19. Methodological foundation on education content of fencing

    Directory of Open Access Journals (Sweden)

    Yermakov S.S.

    2010-05-01

    Full Text Available A concept of education content is presented. Approaches are defined for forming the content of the educational discipline "Fencing" within the physical education program for students of higher pedagogical educational establishments. The means of fencing are grounded. Methodological approaches are presented which link the content blocks of fencing to the components forming the physical culture of future teachers. A model of the educational discipline "Fencing" is created. The influence of the author's program on athletic-sporting activity, physical perfection, acquisition of pedagogical knowledge, and the pedagogical thinking abilities of students is shown.

  20. Evaluating data worth for ground-water management under uncertainty

    Science.gov (United States)

    Wagner, B.J.

    1999-01-01

    A decision framework is presented for assessing the value of ground-water sampling within the context of ground-water management under uncertainty. The framework couples two optimization models, a chance-constrained ground-water management model and an integer-programming sampling network design model, to identify optimal pumping and sampling strategies. The methodology consists of four steps: (1) The optimal ground-water management strategy for the present level of model uncertainty is determined using the chance-constrained management model; (2) for a specified data collection budget, the monitoring network design model identifies, prior to data collection, the sampling strategy that will minimize model uncertainty; (3) the optimal ground-water management strategy is recalculated on the basis of the projected model uncertainty after sampling; and (4) the worth of the monitoring strategy is assessed by comparing the value of the sample information, i.e., the projected reduction in management costs, with the cost of data collection. Steps 2-4 are repeated for a series of data collection budgets, producing a suite of management/monitoring alternatives, from which the best alternative can be selected. A hypothetical example demonstrates the methodology's ability to identify the ground-water sampling strategy with greatest net economic benefit for ground-water management.
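
    At its core, the worth-of-data comparison reduces to weighing the value of sample information against the cost of collection across candidate budgets; a toy sketch with invented cost figures (not the paper's models) illustrates the selection step:

    ```python
    # Toy numbers for the data-worth loop (all values invented, in $1000s):
    # for each candidate sampling budget, compare the projected reduction in
    # management cost against the cost of collecting the data.
    budgets = [0, 10, 20, 40]                           # monitoring cost
    mgmt_cost = {0: 500, 10: 470, 20: 455, 40: 450}     # optimal pumping cost after sampling

    baseline = mgmt_cost[0]

    # Net benefit = value of information (cost reduction) minus sampling cost;
    # the best alternative maximizes net benefit.
    best = max(budgets, key=lambda b: (baseline - mgmt_cost[b]) - b)
    for b in budgets:
        voi = baseline - mgmt_cost[b]
        print(f"budget {b:>2}: value of information {voi:>2}, net benefit {voi - b:>3}")
    print("best alternative: budget", best)
    ```
    
    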

  1. Spatial variation in near-ground radiation and low temperature. Interactions with forest vegetation

    Energy Technology Data Exchange (ETDEWEB)

    Blennow, K.

    1997-10-01

    Low temperature has a large impact on the survival and distribution of plants. Interactive effects with high irradiance lead to cold-induced photoinhibition, which may impact on the establishment and growth of tree seedlings. In this thesis, novel approaches are applied for relating the spatial variability in low temperature and irradiance to photosynthetic performance and growth of tree seedlings, and for modelling the micro- and local-scale spatial variations in low temperature for heterogeneous terrain. The methodologies include the development and use of a digital image analysis system for hemispherical photographs, the use of Geographic Information Systems (GIS) and statistical methods, and field data acquisition of meteorological elements, plant structure, growth and photosynthetic performance. Temperature and amounts of intercepted direct radiant energy for seedlings on clear days (IDRE) were related to chlorophyll a fluorescence and the dry weight of seedlings. The combination of increased IDRE with reduced minimum temperatures resulted in persistent and strong photoinhibition as the season progressed, with likely implications for the establishment of tree seedlings at forest edges and within shelterwood. For models of the spatial distribution of low air temperature, the sky view factor was used to parameterize the radiative cooling, whilst drainage, ponding and stagnation of cold air, and thermal properties of the ground were all considered. The models hint at which scales and processes govern the development of spatial variations in low temperature, for the construction of corresponding mechanistic models. The methodology is well suited for detecting areas that will be frost prone after clearing of forest and for comparing the magnitudes of impacts on low air temperature of forest management practices, such as shelterwood and soil preparation. The results can be used to formulate ground rules for use in practical forestry. 141 refs, 5 figs, 1 tab

  2. Derivation of a tuberculosis screening rule for sub-Saharan African prisons.

    Science.gov (United States)

    Harris, J B; Siyambango, M; Levitan, E B; Maggard, K R; Hatwiinda, S; Foster, E M; Chamot, E; Kaunda, K; Chileshe, C; Krüüner, A; Henostroza, G; Reid, S E

    2014-07-01

    Lusaka Central Prison, Zambia. To derive screening rules for tuberculosis (TB) using data collected during a prison-wide TB and human immunodeficiency virus (HIV) screening program. We derived rules with two methodologies: logistic regression and classification and regression trees (C&RT). We evaluated the performance of the derived rules as well as existing World Health Organization (WHO) screening recommendations in our cohort of inmates, as measured by sensitivity, specificity, and positive and negative predictive values. The C&RT-derived rule recommended diagnostic testing of all inmates who were underweight (defined as body mass index [BMI] <18.5 kg/m2) or HIV-infected; the C&RT-derived rule had 60% sensitivity and 71% specificity. The logistic regression-derived rule recommended diagnostic testing of inmates who were underweight, HIV-infected or had chest pain; the logistic regression-derived rule had 74% sensitivity and 57% specificity. Two of the WHO recommendations had sensitivities that were similar to our logistic regression rule but had poorer specificities, resulting in a greater testing burden. Low BMI and HIV infection were the most robust predictors of TB in our inmates; chest pain was additionally retained in one model. BMI and HIV should be further evaluated as the basis for TB screening rules for inmates, with modification as needed to improve the performance of the rules.
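
    Performance measures like those reported can be computed directly from a confusion matrix; the sketch below evaluates a BMI/HIV-style rule on an invented toy cohort (not the study's data, and not its reported operating characteristics):

    ```python
    # Toy cohort: (bmi, hiv_positive, has_tb) per inmate (records invented).
    inmates = [
        (17.0, False, True), (21.5, True, True), (23.0, False, False),
        (16.8, True, True), (24.1, False, False), (19.0, True, False),
        (22.0, False, True), (17.9, False, False),
    ]

    # C&RT-style rule: test everyone who is underweight (BMI < 18.5) or HIV+.
    def flag(bmi, hiv):
        return bmi < 18.5 or hiv

    # Tally the confusion matrix of the rule against true TB status.
    tp = sum(flag(b, h) and tb for b, h, tb in inmates)
    fn = sum(not flag(b, h) and tb for b, h, tb in inmates)
    fp = sum(flag(b, h) and not tb for b, h, tb in inmates)
    tn = sum(not flag(b, h) and not tb for b, h, tb in inmates)
    print(f"sensitivity={tp / (tp + fn):.2f} specificity={tn / (tn + fp):.2f}")
    ```
    
    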

  3. 40 CFR 141.402 - Ground water source microbial monitoring and analytical methods.

    Science.gov (United States)

    2010-07-01

    ... Rule § 141.402 Ground water source microbial monitoring and analytical methods. (a) Triggered source water monitoring—(1) General requirements. A ground water system must conduct triggered source water... State, systems must submit for State approval a triggered source water monitoring plan that identifies...

  4. 40 CFR 141.404 - Treatment technique violations for ground water systems.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Treatment technique violations for ground water systems. 141.404 Section 141.404 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS (CONTINUED) NATIONAL PRIMARY DRINKING WATER REGULATIONS Ground Water Rule § 141.404 Treatment technique violations for...

  5. 14 CFR 91.1101 - Pilots: Initial, transition, and upgrade ground training.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 2 2010-01-01 2010-01-01 false Pilots: Initial, transition, and upgrade... RULES Fractional Ownership Operations Program Management § 91.1101 Pilots: Initial, transition, and upgrade ground training. Initial, transition, and upgrade ground training for pilots must include...

  6. Simplification Rules for Birdtrack Operators

    CERN Document Server

    Alckock-Zeilinger, Judith

    2016-01-01

    This paper derives a set of easy-to-use tools designed to simplify calculations with birdtrack operators comprised of symmetrizers and antisymmetrizers. In particular, we present cancellation rules allowing one to shorten the birdtrack expressions of operators, and propagation rules identifying the circumstances under which it is possible to propagate symmetrizers past antisymmetrizers and vice versa. We exhibit the power of these simplification rules by means of a short example in which we apply the tools derived in this paper on a typical operator that can be encountered in the representation theory of SU(N) over the product space $V^{\otimes m}$. These rules form the basis for the construction of compact Hermitian Young projection operators and their transition operators addressed in companion papers.

  7. General Anti-Avoidance Rule in Latvian Tax Law

    Directory of Open Access Journals (Sweden)

    Milana Belevica

    2016-03-01

    Full Text Available The tax law systems of the EU Member States differ strongly: one may be based on specific anti-avoidance provisions governed by the general principle of prohibition of abuse stated in court jurisprudence, while another rests on a written judicial rule which prohibits abuse, a general anti-avoidance rule. General anti-avoidance rules are needed because of conflicts of laws within the borders of one state as well as conflicts between different states' jurisprudence. There is no legal definition of tax avoidance in EU law; nevertheless, the notion of tax avoidance is firmly connected to the concept of abuse of law, a general principle of EU law which has developed rapidly in the recent tax case law of the Court of Justice of the European Union (CJEU). The UK practice is undoubtedly a positive example of methodologically precise legal ruling in the sphere of complicated abstract issues of abuse in tax law. This paper aims to describe the concept of the general anti-avoidance rule, comparing theoretical cognitions, the regulation in Latvia and the UK, and the tax case law of the Court of Justice of the European Union.

  8. Methodology for EMC Evaluation.

    Science.gov (United States)

    1980-12-31

    phase filter will produce three times as much current as a 3-phase filter having the same line-to-ground capacitance. Conceivably, this current could be more important than that from 3-phase filters. For the time being, we assume that single-phase filters are less than 1/3 as numerous as 3-phase filters, and, therefore, will contribute less structure current than 3-phase filters. 2.3 Radiated Emissions. Radiated fields from power lines and cables

  9. A Novel Rule Induction Algorithm

    Institute of Scientific and Technical Information of China (English)

    ZHENG Jianguo; LIU Fang; WANG Lei; JIAO Licheng

    2001-01-01

    Knowledge discovery in databases is concerned with extracting useful information from databases, and the immune algorithm is a biologically inspired, globally searching algorithm. A specific immune algorithm is designed for discovering a few interesting, high-level prediction rules from databases, rather than discovering classification knowledge as is usual in the literature. Simulations show that this novel algorithm is able to improve the stability of the population, increase overall performance, and give the extracted rules higher precision.

  10. Probability Theory without Bayes' Rule

    OpenAIRE

    Rodriques, Samuel G.

    2014-01-01

    Within the Kolmogorov theory of probability, Bayes' rule allows one to perform statistical inference by relating conditional probabilities to unconditional probabilities. As we show here, however, there is a continuous set of alternative inference rules that yield the same results, and that may have computational or practical advantages for certain problems. We formulate generalized axioms for probability theory, according to which the reverse conditional probability distribution P(B|A) is no...

  11. Methodology, Meditation, and Mindfulness

    Directory of Open Access Journals (Sweden)

    Balveer Singh Sikh

    2016-04-01

    Full Text Available Understanding the nondualistic nature of mindfulness is a complex and challenging task, particularly when most clinical psychology draws from Western methodologies and methods. In this article, we argue that the integration of philosophical hermeneutics with Eastern philosophy and practices may provide a methodology and methods to research mindfulness practice. Mindfulness hermeneutics brings together the nondualistically aligned Western philosophies of Heidegger and Gadamer and selected Eastern philosophies and practices in an effort to bridge the gap between these differing worldviews. Based on the following: (1) fusion of horizons, (2) being in a hermeneutic circle, (3) understanding as intrinsic to awareness, and (4) the ongoing practice of meditation, a mindfulness hermeneutic approach was used to illuminate deeper understandings of mindfulness practice in ways that are congruent with its underpinning philosophies.

  12. METHODOLOGICAL BASES OF OUTSOURCING

    Directory of Open Access Journals (Sweden)

    Lanskaya D. V.

    2014-09-01

    Full Text Available Outsourcing is investigated, within the institutional theory, as a way for a public corporation to gain stable and unique competitive advantages by attracting carriers of the unique intellectual and social capital of specialized companies. Key researchers and events in the history of outsourcing are identified, and the existing approaches to defining the concept of outsourcing, as well as the advantages and risks of applying outsourcing, are considered. It is established that the differences between outsourcing, subcontracting and cooperation lie not in the nature of the functional relations but in the depth of the economic terms and phenomena considered. The methodology of outsourcing is treated as part of the methodology of cooperation of enterprise innovative structures in the emerging knowledge economy sector.

  13. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    As part of learning at the Nordic Workshop of Evidence-based Medicine, we have read with interest the practice guidelines for central venous access, published in your Journal in 2012.1 We appraised the quality of this guideline using the checklist developed by The Evidence-Based Medicine Working Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest.

  14. Soft Systems Methodology

    Science.gov (United States)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach, covering SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques and core tenets, described through a wide range of settings.

  15. Tobacco documents research methodology.

    Science.gov (United States)

    Anderson, Stacey J; McCandless, Phyra M; Klausner, Kim; Taketa, Rachel; Yerger, Valerie B

    2011-05-01

    Tobacco documents research has developed into a thriving academic enterprise since its inception in 1995. The technology supporting tobacco documents archiving, searching and retrieval has improved greatly since that time, and consequently tobacco documents researchers have considerably more access to resources than was the case when researchers had to travel to physical archives and/or electronically search poorly and incompletely indexed documents. The authors of the papers presented in this supplement all followed the same basic research methodology. Rather than leave the reader of the supplement to read the same discussion of methods in each individual paper, presented here is an overview of the methods all authors followed. In the individual articles that follow in this supplement, the authors present the additional methodological information specific to their topics. This brief discussion also highlights technological capabilities in the Legacy Tobacco Documents Library and updates methods for organising internal tobacco documents data and findings.

  16. Electronic ground state of Ni$_2^+$

    CERN Document Server

    Zamudio-Bayer, V; Bülow, C; Leistner, G; Terasaki, A; Issendorff, B v; Lau, J T

    2016-01-01

    The $^{4}\Phi_{9/2}$ ground state of the Ni$_2^+$ diatomic molecular cation is determined experimentally from temperature and magnetic-field-dependent x-ray magnetic circular dichroism spectroscopy in a cryogenic ion trap, where an electronic and rotational temperature of $7.4 \pm 0.2$ K was achieved by buffer gas cooling of the molecular ion. The contribution of the magnetic dipole term to the x-ray magnetic circular dichroism spin sum rule amounts to $7\,T_z = 0.17 \pm 0.06$ $\mu_B$ per atom, approximately 11\% of the spin magnetic moment. We find that, in general, homonuclear diatomic molecular cations of $3d$ transition metals seem to adopt maximum spin magnetic moments in their electronic ground states.

  17. Strangeness in the baryon ground states

    CERN Document Server

    Semke, A

    2012-01-01

    We compute the strangeness content of the baryon ground states based on an analysis of recent lattice simulations of the BMW, PACS, LHPC and HSC groups for the pion-mass dependence of the baryon masses. Our results rely on the relativistic chiral Lagrangian and large-$N_c$ sum rule estimates of the counter terms relevant for the baryon masses at N$^3$LO. A partial summation is implied by the use of physical baryon and meson masses in the one-loop contributions to the baryon self energies. A simultaneous description of the lattice results of the BMW, LHPC, PACS and HSC groups is achieved. We predict the pion- and strangeness sigma terms and the pion-mass dependence of the octet and decuplet ground states at different strange quark masses.

  18. Land evaluation methodology

    OpenAIRE

    Lustig, Thomas

    1998-01-01

    This paper reviews non-computerised and computerised land evaluation methods and methodologies, and highlights the difficulty of incorporating biophysical and socioeconomic factors from different levels. It therefore proposes an alternative land evaluation approach, which is tested and elaborated in an agricultural community in the north of Chile. The approach relies on holistic thinking and attempts to evaluate the potential for improving assumed unsustainable goat manage...

  19. Pipeline ADC Design Methodology

    OpenAIRE

    Zhao, Hui

    2012-01-01

    Demand for high-performance analog-to-digital converter (ADC) integrated circuits (ICs) with optimal combined specifications of resolution, sampling rate and power consumption becomes dominant due to emerging applications in wireless communications, broad band transceivers, digital-intermediate frequency (IF) receivers and countless of digital devices. This research is dedicated to develop a pipeline ADC design methodology with minimum power dissipation, while keeping relatively high speed an...

  20. Albert Einstein's Methodology

    OpenAIRE

    Weinstein, Galina

    2012-01-01

    This paper discusses Einstein's methodology. 1. Einstein characterized his work as a theory of principle and reasoned that beyond kinematics, the 1905 heuristic relativity principle could offer new connections between non-kinematical concepts. 2. Einstein's creativity and inventiveness and process of thinking; invention or discovery. 3. Einstein considered his best friend Michele Besso as a sounding board and his class-mate from the Polytechnic Marcel Grossman, as his active partner. Yet, Ein...

  2. Using the Chain Rule as the Key Link in Deriving the General Rules for Differentiation

    Science.gov (United States)

    Sprows, David

    2011-01-01

    The standard approach to the general rules for differentiation is to first derive the power, product, and quotient rules and then derive the chain rule. In this short article we give an approach to these rules which uses the chain rule as the main tool in deriving the power, product, and quotient rules in a manner which is more student-friendly…
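
One common way to carry out such a derivation (an illustration of the general idea; the article's own presentation may differ) is to obtain the product rule from the chain rule applied to the square function:

```latex
% From (u+v)^2 = u^2 + 2uv + v^2 we can isolate the product:
%   uv = \tfrac{1}{2}\left[(u+v)^2 - u^2 - v^2\right].
% The chain rule gives (w^2)' = 2\,w\,w' for any differentiable w, so
%   (uv)' = \tfrac{1}{2}\left[2(u+v)(u'+v') - 2\,u\,u' - 2\,v\,v'\right]
%         = u'v + uv'.
```

Expanding $(u+v)(u'+v') = uu' + uv' + vu' + vv'$ and cancelling the $uu'$ and $vv'$ terms leaves exactly $u'v + uv'$, with no appeal to the limit definition beyond the chain rule itself.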

  3. Demystifying Theoretical Sampling in Grounded Theory Research

    Directory of Open Access Journals (Sweden)

    Jenna Breckenridge BSc(Hons), Ph.D. Candidate

    2009-06-01

    Full Text Available Theoretical sampling is a central tenet of classic grounded theory and is essential to the development and refinement of a theory that is ‘grounded’ in data. While many authors appear to share concurrent definitions of theoretical sampling, the ways in which the process is actually executed remain largely elusive and inconsistent. As such, employing and describing the theoretical sampling process can present a particular challenge to novice researchers embarking upon their first grounded theory study. This article has been written in response to the challenges faced by the first author whilst writing a grounded theory proposal. It is intended to clarify theoretical sampling for new grounded theory researchers, offering some insight into the practicalities of selecting and employing a theoretical sampling strategy. It demonstrates that the credibility of a theory cannot be dissociated from the process by which it has been generated and seeks to encourage and challenge researchers to approach theoretical sampling in a way that is apposite to the core principles of the classic grounded theory methodology.

  4. [A commentary on the Ruling of the Tribunal Constitucional 212/1996 of 19 December 1996 (I)].

    Science.gov (United States)

    González Morán, L

    1998-01-01

    This article is a commentary on Spain's Constitutional Court's ruling of 19 December 1996 (STC 212/1996), on the challenge (596/89) on grounds of alleged unconstitutionality made against Law 42/1988, 28 December, which regulates the donation of human embryos and foetuses or the cells, tissues and organs therefrom. The article is structured as follows: it opens with a summary of Law 42/1988, since this is felt necessary to understand the subsequent challenge made on grounds of alleged unconstitutionality. We then provide specific details of the challenge and the resulting ruling, before concluding with some critical remarks on the aforementioned Law and ruling.

  5. Military Strategists are from Mars, Rule of Law Theorists are from Venus: Why Imposition of the Rule of Law Requires a Goldwater-Nichols Modeled Interagency Reform

    Science.gov (United States)

    2008-04-01


  6. Ground State Spin Logic

    CERN Document Server

    Whitfield, J D; Biamonte, J D

    2012-01-01

    Designing and optimizing cost functions and energy landscapes is a problem encountered in many fields of science and engineering. These landscapes and cost functions can be embedded and annealed in experimentally controllable spin Hamiltonians. Using an approach based on group theory and symmetries, we examine the embedding of Boolean logic gates into the ground state subspace of such spin systems. We describe parameterized families of diagonal Hamiltonians and symmetry operations which preserve the ground state subspace encoding the truth tables of Boolean formulas. The ground state embeddings of adder circuits are used to illustrate how gates are combined and simplified using symmetry. Our work is relevant for experimental demonstrations of ground state embeddings found in both classical optimization as well as adiabatic quantum optimization.
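
As a minimal illustration of the embedding idea (a textbook QUBO penalty, not necessarily one of the parameterized families studied in the paper), the zero-energy ground states of a small diagonal Hamiltonian can encode the truth table of an AND gate:

```python
from itertools import product

# Diagonal penalty whose zero-energy configurations are exactly the
# assignments satisfying z = AND(x, y); all other configurations are
# penalised with strictly positive energy.
def energy(x, y, z):
    return x * y - 2 * x * z - 2 * y * z + 3 * z

ground = [(x, y, z) for x, y, z in product((0, 1), repeat=3)
          if energy(x, y, z) == 0]
print(ground)  # each ground state satisfies z == x * y
```

Enumerating all eight configurations confirms that the ground-state subspace is precisely the four rows of the AND truth table, the property that symmetry operations in the paper are required to preserve.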

  7. Ground Vehicle Robotics Presentation

    Science.gov (United States)

    2012-08-14

    Mr. Jim Parker, Associate Director, Ground Vehicle Robotics. Briefing covering 01-07-2012 to 01-08-2012. Distribution Statement A: approved for public release. Abstract: provide transition-ready, cost-effective, and innovative robotics and control system solutions for manned, optionally-manned, and unmanned ground vehicles.

  8. 75 FR 35856 - Self-Regulatory Organizations; Notice of Filing of Proposed Rule Change by New York Stock...

    Science.gov (United States)

    2010-06-23

    ... party.\\9\\ NYSE Rule 124 outlines the complex pricing formula used to determine the price of odd-lot..., 2009) (SR-NYSE-2009-27) (modification to pricing and execution methodology to execute odd- lot portion... the pricing methodology for the odd-lot portion of a PRL order and the systems capable of accepting...

  9. Constructing New Theory for Identifying Students with Emotional Disturbance: A Constructivist Approach to Grounded Theory

    Directory of Open Access Journals (Sweden)

    Dori Barnett

    2012-06-01

    Full Text Available A grounded theory study that examined how practitioners in a county alternative and correctional education setting identify youth with emotional and behavioral difficulties for special education services provides an exemplar for a constructivist approach to grounded theory methodology. Discussion focuses on how a constructivist orientation to grounded theory methodology informed research decisions, shaped the development of the emergent grounded theory, and prompted a way of thinking about data collection and analysis. Implications for future research directions and policy and practice in the field of special and alternative education are discussed.

  10. Developmental and Evolutionary Lexicon Acquisition in Cognitive Agents/Robots with Grounding Principle: A Short Review.

    Science.gov (United States)

    Rasheed, Nadia; Amin, Shamsudin H M

    2016-01-01

    Grounded language acquisition is an important issue, particularly to facilitate human-robot interactions in an intelligent and effective way. Evolutionary and developmental language acquisition are two innovative and important methodologies for the grounding of language in cognitive agents or robots, the aim of which is to address current limitations in robot design. This paper concentrates on these two main modelling methods with the grounding principle for the acquisition of linguistic ability in cognitive agents or robots. This review not only presents a survey of the methodologies and relevant computational cognitive agent or robotic models, but also highlights the advantages and progress of these approaches for the language grounding issue.

  11. The IASLC Lung Cancer Staging Project: Background Data and Proposals for the Application of TNM Staging Rules to Lung Cancer Presenting as Multiple Nodules with Ground Glass or Lepidic Features or a Pneumonic Type of Involvement in the Forthcoming Eighth Edition of the TNM Classification.

    Science.gov (United States)

    Detterbeck, Frank C; Marom, Edith M; Arenberg, Douglas A; Franklin, Wilbur A; Nicholson, Andrew G; Travis, William D; Girard, Nicolas; Mazzone, Peter J; Donington, Jessica S; Tanoue, Lynn T; Rusch, Valerie W; Asamura, Hisao; Rami-Porta, Ramón

    2016-05-01

    Application of tumor, node, and metastasis (TNM) classification is difficult in patients with lung cancer presenting as multiple ground glass nodules or with diffuse pneumonic-type involvement. Clarification of how to do this is needed for the forthcoming eighth edition of TNM classification. A subcommittee of the International Association for the Study of Lung Cancer Staging and Prognostic Factors Committee conducted a systematic literature review to build an evidence base regarding such tumors. An iterative process that included an extended workgroup was used to develop proposals for TNM classification. Patients with multiple tumors with a prominent ground glass component on imaging or lepidic component on microscopy are being seen with increasing frequency. These tumors are associated with good survival after resection and a decreased propensity for nodal and extrathoracic metastases. Diffuse pneumonic-type involvement in the lung is associated with a worse prognosis, but also with a decreased propensity for nodal and distant metastases. For multifocal ground glass/lepidic tumors, we propose that the T category be determined by the highest T lesion, with either the number of tumors or m in parentheses to denote the multifocal nature, and that a single N and M category be used for all the lesions collectively-for example, T1a(3)N0M0 or T1b(m)N0M0. For diffuse pneumonic-type lung cancer we propose that the T category be designated by size (or T3) if in one lobe, as T4 if involving an ipsilateral different lobe, or as M1a if contralateral and that a single N and M category be used for all pulmonary areas of involvement. Copyright © 2016 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.
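
The proposed notation can be sketched as a small helper; the function and parameter names below are hypothetical, used only to illustrate the rule stated in the abstract:

```python
# Encode the proposed multifocal designation: the highest-T lesion sets
# the T category, with the lesion count (or "m" when the count is not
# enumerated) in parentheses, and a single N and M category applied to
# all lesions collectively.
def multifocal_designation(highest_t, count, n, m, count_known=True):
    suffix = str(count) if count_known else "m"
    return f"{highest_t}({suffix}){n}{m}"

print(multifocal_designation("T1a", 3, "N0", "M0"))         # T1a(3)N0M0
print(multifocal_designation("T1b", 0, "N0", "M0", False))  # T1b(m)N0M0
```

Both outputs match the worked examples given in the proposal (T1a(3)N0M0 and T1b(m)N0M0).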

  12. Differing antidepressant maintenance methodologies.

    Science.gov (United States)

    Safer, Daniel J

    2017-10-01

    The principal evidence that antidepressant medication (ADM) is an effective maintenance treatment for adults with major depressive disorder (MDD) comes from placebo substitution trials. These trials enter responders from ADM efficacy trials into randomized, double-blind placebo-controlled (RDBPC) effectiveness trials to measure the rate of MDD relapse over time. However, other randomized maintenance trial methodologies merit consideration and comparison. A systematic review of ADM randomized maintenance trials included research reports from multiple databases. Relapse rate was the main effectiveness outcome assessed. Five ADM randomized maintenance methodologies for MDD responders are described and compared for outcome. These effectiveness trials include: placebo substitution, ADM/placebo extension, ADM extension, ADM vs. psychotherapy, and treatment as usual. The placebo substitution trials for those abruptly switched to placebo resulted in unusually high (46%) rates of relapse over 6-12 months, twice the continuing ADM rate. These trials were characterized by selective screening, high attrition, an anxious anticipation of a switch to placebo, and a risk of drug withdrawal symptoms. Selectively screened ADM efficacy responders who entered into 4-12 month extension trials experienced relapse rates averaging ~10% with a low attrition rate. Non-industry sponsored randomized trials of adults with multiple prior MDD episodes who were treated with ADM maintenance for 1-2 years experienced relapse rates averaging 40%. Placebo substitution trial methodology represents only one approach to assessing ADM maintenance. Antidepressant maintenance research for adults with MDD should be evaluated for industry sponsorship, attrition, the impact of the switch to placebo, and major relapse differences in MDD subpopulations. Copyright © 2017. Published by Elsevier Inc.

  13. Failure detection system design methodology. Ph.D. Thesis

    Science.gov (United States)

    Chow, E. Y.

    1980-01-01

    The design of a failure detection and identification system consists of designing a robust residual generation process and a high performance decision making process. The design of these two processes is examined separately. Residual generation is based on analytical redundancy. Redundancy relations that are insensitive to modelling errors and noise effects are important for designing robust residual generation processes. The characterization of the concept of analytical redundancy in terms of a generalized parity space provides a framework in which a systematic approach to the determination of robust redundancy relations is developed. The Bayesian approach is adopted for the design of high performance decision processes. The FDI decision problem is formulated as a Bayes sequential decision problem. Since the optimal decision rule is incomputable, a methodology for designing suboptimal rules is proposed. A numerical algorithm is developed to facilitate the design and performance evaluation of suboptimal rules.
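
The analytical-redundancy idea behind residual generation can be sketched with a toy example (an assumed three-sensor setup, not the generalized parity-space design of the thesis):

```python
# Three sensors measure the same scalar x, so y = C * x + fault with
# C = [1, 1, 1]^T. Any matrix V whose rows are orthogonal to C (V @ C = 0)
# yields residuals r = V @ y that are zero for fault-free data, whatever
# the true value of x; a sensor fault shows up as a nonzero residual.
C = [1.0, 1.0, 1.0]
V = [[1.0, -1.0, 0.0],   # parity vectors: each row sums against C to zero
     [0.0, 1.0, -1.0]]

def residuals(y):
    return [sum(v[i] * y[i] for i in range(3)) for v in V]

x = 4.0
print(residuals([x, x, x]))          # fault-free: [0.0, 0.0]
print(residuals([x + 1.0, x, x]))    # bias fault on sensor 1: [1.0, 0.0]
```

A decision process then only has to test the residuals against a threshold; the robustness question studied in the thesis is how to choose parity vectors that keep the residuals small under modelling errors and noise as well.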

  14. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
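
As a sketch of the kind of calculation the book covers (the classical known-sigma formula for estimating a mean; the values below are illustrative):

```python
import math

# Smallest n so that a 100(1 - alpha)% confidence interval for a mean
# has half-width at most E when sigma is known: n >= (z * sigma / E)^2.
# z defaults to 1.96, the standard normal quantile for 95% confidence.
def sample_size_mean(sigma, E, z=1.96):
    return math.ceil((z * sigma / E) ** 2)

# Example: sigma = 15, desired half-width E = 2 at 95% confidence.
print(sample_size_mean(sigma=15.0, E=2.0))
```

The same reasoning, with different distributional inputs, underlies the finite-population, Bayesian and ranking-and-selection settings the book goes on to treat.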

  15. Literacy research methodologies

    CERN Document Server

    Duke, Nell K

    2012-01-01

    The definitive reference on literacy research methods, this book serves as a key resource for researchers and as a text in graduate-level courses. Distinguished scholars clearly describe established and emerging methodologies, discuss the types of questions and claims for which each is best suited, identify standards of quality, and present exemplary studies that illustrate the approaches at their best. The book demonstrates how each mode of inquiry can yield unique insights into literacy learning and teaching and how the methods can work together to move the field forward.   New to This Editi

  16. Internalism as Methodology

    Directory of Open Access Journals (Sweden)

    Terje Lohndal

    2009-10-01

    Full Text Available This paper scrutinizes the recent proposal made by Lassiter (2008) that the dichotomy between Chomskyan internalism and Dummett-type externalism is misguided and should be overcome by an approach that incorporates sociolinguistic concepts such as speakers’ dispositions to defer. We argue that Lassiter’s arguments are flawed and based on a serious misunderstanding of the internalist approach to the study of natural language, failing to appreciate its methodological nature. We conclude that Lassiter’s sociolinguistic approach is just another instance of externalist attempts with little hope of scientific achievement.

  17. The New Methodology

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In the past few years there's been a rapidly growing interest in “lightweight” methodologies. Alternatively characterized as an antidote to bureaucracy or a license to hack, they've stirred up interest all over the software landscape. In this essay I explore the reasons for lightweight methods, focusing not so much on their weight but on their adaptive nature and their people-first orientation. I also give a summary of and references to the processes in this school and consider the factors that should influence your choice of whether to go down this newly trodden path.

  18. The Pro Rata Rule Versus the Causality Rule in Insurance Law

    DEFF Research Database (Denmark)

    Lando, Henrik

    it subjects the risk averse Buyer of insurance to less variance. This implies that the pro rata rule should apply when there is significant risk for a Buyer of unintentional misrepresentation, and when the incentive to intentionally misrepresent can be curtailed through frequent verification of the Buyer......'s true type. On the other hand, when the risk of unintentional misrepresentation is small, when verification is costly, and when the Buyer is sufficiently risk averse, the Buyer conceivably may be more effectively deterred from intentional misrepresentation under the causality rule. It is argued...

  19. Linking Symbolic Interactionism and Grounded Theory Methods in a Research Design

    Directory of Open Access Journals (Sweden)

    Jennifer Chamberlain-Salaun

    2013-09-01

    Full Text Available This article focuses on Corbin and Strauss’ evolved version of grounded theory. In the third edition of their seminal text, Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory, the authors present 16 assumptions that underpin their conception of grounded theory methodology. The assumptions stem from a symbolic interactionism perspective of social life, including the themes of meaning, action and interaction, self and perspectives. As research design incorporates both methodology and methods, the authors aim to expose the linkages between the 16 assumptions and essential grounded theory methods, highlighting the application of the latter in light of the former. Analyzing the links between symbolic interactionism and essential grounded theory methods provides novice researchers and researchers new to grounded theory with a foundation from which to design an evolved grounded theory research study.

  20. A discussion of differences in preparation, performance and postreflections in participant observations within two grounded theory approaches

    DEFF Research Database (Denmark)

    Berthelsen, Connie Bøttcher; Lindhardt Damsgaard, Tove; Frederiksen, Kirsten

    2016-01-01

    This paper presents a discussion of the differences in using participant observation as a data collection method by comparing the classic grounded theory methodology of Barney Glaser with the constructivist grounded theory methodology by Kathy Charmaz. Participant observations allow nursing researchers to experience activities and interactions directly in situ. However, using participant observations as a data collection method can be done in many ways, depending on the chosen grounded theory methodology, and may produce different results. This discussion shows that the differences between using participant observations in classic and constructivist grounded theory can be considerable and that grounded theory researchers should adhere to the method descriptions of performing participant observations according to the selected grounded theory methodology to enhance the quality of research.

  1. A discussion of differences in preparation, performance and postreflections in participant observations within two grounded theory approaches.

    Science.gov (United States)

    Berthelsen, Connie Bøttcher; Lindhardt, Tove; Frederiksen, Kirsten

    2016-05-10

    This paper presents a discussion of the differences in using participant observation as a data collection method by comparing the classic grounded theory methodology of Barney Glaser with the constructivist grounded theory methodology by Kathy Charmaz. Participant observations allow nursing researchers to experience activities and interactions directly in situ. However, using participant observations as a data collection method can be done in many ways, depending on the chosen grounded theory methodology, and may produce different results. This discussion shows that the differences between using participant observations in classic and constructivist grounded theory can be considerable and that grounded theory researchers should adhere to the method descriptions of performing participant observations according to the selected grounded theory methodology to enhance the quality of research.

  2. Medicare Program; Inpatient Rehabilitation Facility Prospective Payment System for Federal Fiscal Year 2017. Final rule.

    Science.gov (United States)

    2016-08-05

    This final rule will update the prospective payment rates for inpatient rehabilitation facilities (IRFs) for federal fiscal year (FY) 2017 as required by the statute. As required by section 1886(j)(5) of the Act, this rule includes the classification and weighting factors for the IRF prospective payment system's (IRF PPS's) case-mix groups and a description of the methodologies and data used in computing the prospective payment rates for FY 2017. This final rule also revises and updates quality measures and reporting requirements under the IRF quality reporting program (QRP).

  3. Generating rules with predicates, terms and variables from the pruned neural networks.

    Science.gov (United States)

    Nayak, Richi

    2009-05-01

    Artificial neural networks (ANNs) have demonstrated good predictive performance in a wide range of applications. They are, however, not considered sufficient for knowledge representation because of their inability to represent the reasoning process succinctly. This paper proposes a novel methodology, Gyan, that represents the knowledge of a trained network in the form of restricted first-order predicate rules. The empirical results demonstrate that an equivalent symbolic interpretation in the form of rules with predicates, terms and variables can be derived describing the overall behaviour of the trained ANN with improved comprehensibility while maintaining the accuracy and fidelity of the propositional rules.

  4. Methodology for Evaluating an Adaptation of Evidence-Based Drug Abuse Prevention in Alternative Schools

    Science.gov (United States)

    Hopson, Laura M.; Steiker, Lori K. H.

    2008-01-01

    The purpose of this article is to set forth an innovative methodological protocol for culturally grounding interventions with high-risk youths in alternative schools. This study used mixed methods to evaluate original and adapted versions of a culturally grounded substance abuse prevention program. The qualitative and quantitative methods…

  5. Ares I-X Ground Diagnostic Prototype

    Science.gov (United States)

    Schwabacher, Mark A.; Martin, Rodney Alexander; Waterman, Robert D.; Oostdyk, Rebecca Lynn; Ossenfort, John P.; Matthews, Bryan

    2010-01-01

    The automation of pre-launch diagnostics for launch vehicles offers three potential benefits: improving safety, reducing cost, and reducing launch delays. The Ares I-X Ground Diagnostic Prototype demonstrated anomaly detection, fault detection, fault isolation, and diagnostics for the Ares I-X first-stage Thrust Vector Control and for the associated ground hydraulics while the vehicle was in the Vehicle Assembly Building at Kennedy Space Center (KSC) and while it was on the launch pad. The prototype combines three existing tools. The first tool, TEAMS (Testability Engineering and Maintenance System), is a model-based tool from Qualtech Systems Inc. for fault isolation and diagnostics. The second tool, SHINE (Spacecraft Health Inference Engine), is a rule-based expert system that was developed at the NASA Jet Propulsion Laboratory. We developed SHINE rules for fault detection and mode identification, and used the outputs of SHINE as inputs to TEAMS. The third tool, IMS (Inductive Monitoring System), is an anomaly detection tool that was developed at NASA Ames Research Center. The three tools were integrated and deployed to KSC, where they were interfaced with live data. This paper describes how the prototype performed during the period of time before the launch, including accuracy and computer resource usage. The paper concludes with some of the lessons that we learned from the experience of developing and deploying the prototype.

  6. Diffraction or Reflection? Sketching the Contours of Two Methodologies in Educational Research

    Science.gov (United States)

    Bozalek, Vivienne; Zembylas, Michalinos

    2017-01-01

    Internationally, an interest is emerging in a growing body of work on what has become known as "diffractive methodologies" drawing attention to ontological aspects of research. Diffractive methodologies have largely been developed in response to a dissatisfaction with practices of "reflexivity", which are seen to be grounded in…

  7. Scrum methodology in banking environment

    OpenAIRE

    Strihová, Barbora

    2015-01-01

    Bachelor thesis "Scrum methodology in banking environment" is focused on one of agile methodologies called Scrum and description of the methodology used in banking environment. Its main goal is to introduce the Scrum methodology and outline a real project placed in a bank focused on software development through a case study, address problems of the project, propose solutions of the addressed problems and identify anomalies of Scrum in software development constrained by the banking environmen...

  8. Ground Enterprise Management System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Emergent Space Technologies Inc. proposes to develop the Ground Enterprise Management System (GEMS) for spacecraft ground systems. GEMS will provide situational...

  9. 76 FR 76815 - Business Opportunity Rule

    Science.gov (United States)

    2011-12-08

    ... ``Amended Franchise Rule'' refers to the amended Franchise Rule published at 72 FR 15444 (Mar. 30, 2007) and... Disclosure Form, available at http://www.ftc.gov/bcp/workshops/bizopps/disclosure-form-report.pdf . ``Original Franchise Rule'' refers to the original Franchise Rule published at 43 FR 59614 (Dec. 21,...

  10. Inferring comprehensible business/ICT alignment rules

    NARCIS (Netherlands)

    Cumps, B.; Martens, D.; De Backer, M.; Haesen, R.; Viaene, S.; Dedene, G.; Baesens, B.; Snoeck, M.

    2009-01-01

    We inferred business rules for business/ICT alignment by applying a novel rule induction algorithm on a data set containing rich alignment information polled from 641 organisations in 7 European countries. The alignment rule set was created using AntMiner+, a rule induction technique with a reputati

  11. A Pieri rule for skew shapes

    CERN Document Server

    Assaf, Sami; Lam, Thomas

    2009-01-01

    The Pieri rule expresses the product of a Schur function and a single row Schur function in terms of Schur functions. We extend the classical Pieri rule by expressing the product of a skew Schur function and a single row Schur function in terms of skew Schur functions. Like the classical rule, our rule involves simple additions of boxes to the original skew shape.
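For orientation, the classical Pieri rule that this abstract extends can be written out explicitly; the statement below is standard background on Schur functions, not a formula quoted from the paper itself:

```latex
% Classical Pieri rule: multiplying a Schur function s_\lambda by the
% single-row Schur function s_{(r)} adds r boxes to the shape \lambda,
% no two in the same column (i.e. \mu/\lambda is a horizontal strip).
s_\lambda \cdot s_{(r)}
  \;=\;
  \sum_{\substack{\mu \supseteq \lambda \\ \mu/\lambda \text{ a horizontal strip},\; |\mu/\lambda| = r}} s_\mu
```

The paper's extension replaces $s_\lambda$ with a skew Schur function $s_{\lambda/\nu}$ and again describes the product by simple additions of boxes to the skew shape.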

  12. Evaluation of Rule Extraction Algorithms

    Directory of Open Access Journals (Sweden)

    Tiruveedula GopiKrishna

    2014-05-01

    Full Text Available For the data mining domain, the lack of explanation facilities seems to be a serious drawback for techniques based on Artificial Neural Networks or, for that matter, any technique producing opaque models. In particular, the ability to generate even limited explanations is absolutely crucial for user acceptance of such systems. Since the purpose of most data mining systems is to support decision making, the need for explanation facilities in these systems is apparent. The task for the data miner is thus to identify the complex but general relationships that are likely to carry over to production data, and the explanation facility makes this easier. The quality of the extracted rules, i.e. how well the required explanation is performed, is also a focus. In this research some important rule extraction algorithms are discussed, and their algorithmic complexity, i.e. how efficient the underlying rule extraction algorithm is, is identified
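As a concrete illustration of the kind of transparent rule model the abstract contrasts with opaque ones, here is a minimal sketch of Holte's classic 1R rule-induction algorithm; this is an illustrative baseline chosen for its simplicity, not one of the algorithms evaluated in the paper:

```python
from collections import Counter, defaultdict

def one_r(rows, target):
    """1R rule induction: for each attribute, build a rule mapping each
    of its values to the majority class, then keep the attribute whose
    rule makes the fewest training errors. The result is a single,
    fully explainable rule set."""
    best = None
    for attr in rows[0]:
        if attr == target:
            continue
        by_value = defaultdict(Counter)          # value -> class counts
        for row in rows:
            by_value[row[attr]][row[target]] += 1
        rule = {v: c.most_common(1)[0][0] for v, c in by_value.items()}
        errors = sum(sum(c.values()) - max(c.values())
                     for c in by_value.values())
        if best is None or errors < best[2]:
            best = (attr, rule, errors)
    return best  # (attribute, {value: predicted class}, training errors)

# Hypothetical toy data set for illustration only.
rows = [
    {"outlook": "sunny",    "windy": "no",  "play": "no"},
    {"outlook": "sunny",    "windy": "yes", "play": "no"},
    {"outlook": "rainy",    "windy": "no",  "play": "yes"},
    {"outlook": "overcast", "windy": "yes", "play": "yes"},
]
attr, rule, errors = one_r(rows, "play")
```

Each induced rule reads directly as an explanation ("if outlook is sunny, predict no"), which is exactly the explanation facility opaque models lack.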

  13. Integration Rules for Scattering Equations

    CERN Document Server

    Baadsgaard, Christian; Bourjaily, Jacob L; Damgaard, Poul H

    2015-01-01

    As described by Cachazo, He and Yuan, scattering amplitudes in many quantum field theories can be represented as integrals that are fully localized on solutions to the so-called scattering equations. Because the number of solutions to the scattering equations grows quite rapidly, the contour of integration involves contributions from many isolated components. In this paper, we provide a simple, combinatorial rule that immediately provides the result of integration against the scattering equation constraints for any Möbius-invariant integrand involving only simple poles. These rules have a simple diagrammatic interpretation that makes the evaluation of any such integrand immediate. Finally, we explain how these rules are related to the computation of amplitudes in the field theory limit of string theory.

  14. Integration rules for scattering equations

    Science.gov (United States)

    Baadsgaard, Christian; Bjerrum-Bohr, N. E. J.; Bourjaily, Jacob L.; Damgaard, Poul H.

    2015-09-01

    As described by Cachazo, He and Yuan, scattering amplitudes in many quantum field theories can be represented as integrals that are fully localized on solutions to the so-called scattering equations. Because the number of solutions to the scattering equations grows quite rapidly, the contour of integration involves contributions from many isolated components. In this paper, we provide a simple, combinatorial rule that immediately provides the result of integration against the scattering equation constraints for any Möbius-invariant integrand involving only simple poles. These rules have a simple diagrammatic interpretation that makes the evaluation of any such integrand immediate. Finally, we explain how these rules are related to the computation of amplitudes in the field theory limit of string theory.

  15. Comprehensive Child Welfare Information System. Final rule.

    Science.gov (United States)

    2016-06-01

    This final rule replaces the Statewide and Tribal Automated Child Welfare Information Systems (S/TACWIS) rule with the Comprehensive Child Welfare Information System (CCWIS) rule. The rule also makes conforming amendments in rules in related requirements. This rule will assist title IV-E agencies in developing information management systems that leverage new innovations and technology in order to better serve children and families. More specifically, this final rule supports the use of cost-effective, innovative technologies to automate the collection of high-quality case management data and to promote its analysis, distribution, and use by workers, supervisors, administrators, researchers, and policy makers.

  16. Conformance Testing: Measurement Decision Rules

    Science.gov (United States)

    Mimbs, Scott M.

    2010-01-01

    The goal of a Quality Management System (QMS) as specified in ISO 9001 and AS9100 is to provide assurance to the customer that end products meet specifications. Measuring devices, often called measuring and test equipment (MTE), are used to provide the evidence of product conformity to specified requirements. Unfortunately, processes that employ MTE can become a weak link to the overall QMS if proper attention is not given to the measurement process design, capability, and implementation. Documented "decision rules" establish the requirements to ensure measurement processes provide the measurement data that supports the needs of the QMS. Measurement data are used to make the decisions that impact all areas of technology. Whether measurements support research, design, production, or maintenance, ensuring the data supports the decision is crucial. Measurement data quality can be critical to the resulting consequences of measurement-based decisions. Historically, most industries required simplistic, one-size-fits-all decision rules for measurements. One-size-fits-all rules in some cases are not rigorous enough to provide adequate measurement results, while in other cases are overly conservative and too costly to implement. Ideally, decision rules should be rigorous enough to match the criticality of the parameter being measured, while being flexible enough to be cost effective. The goal of a decision rule is to ensure that measurement processes provide data with a sufficient level of quality to support the decisions being made - no more, no less. This paper discusses the basic concepts of providing measurement-based evidence that end products meet specifications. Although relevant to all measurement-based conformance tests, the target audience is the MTE end-user, which is anyone using MTE other than calibration service providers. Topics include measurement fundamentals, the associated decision risks, verifying conformance to specifications, and basic measurement
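The contrast the abstract draws between one-size-fits-all rules and risk-matched ones can be sketched with a simple guard-banded acceptance rule; this is a generic illustration of the concept, with invented parameter names, not the decision rules documented in the paper:

```python
def accept(measured, lower, upper, u95, guard_factor=1.0):
    """Guard-banded conformance decision (illustrative sketch).
    The specification limits are shrunk by guard_factor * u95, where
    u95 is the expanded (95%) measurement uncertainty, so that a 'pass'
    carries a low risk of false acceptance. guard_factor = 0 recovers
    the simplistic shared-risk rule; larger values are more conservative,
    matching the rule's rigor to the criticality of the parameter."""
    guard = guard_factor * u95
    return (lower + guard) <= measured <= (upper - guard)

# A hypothetical 10 V +/- 0.5 V spec measured with u95 = 0.1 V:
# 10.45 V is inside the spec limits but inside the guard band.
in_tolerance = accept(10.45, 9.5, 10.5, 0.1)
```

With `guard_factor=1.0` the 10.45 V reading is rejected even though it lies within the bare specification, because the measurement uncertainty makes conformance too doubtful; with `guard_factor=0.0` the same reading passes.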

  17. Multimodal hybrid reasoning methodology for personalized wellbeing services.

    Science.gov (United States)

    Ali, Rahman; Afzal, Muhammad; Hussain, Maqbool; Ali, Maqbool; Siddiqi, Muhammad Hameed; Lee, Sungyoung; Ho Kang, Byeong

    2016-02-01

    A wellness system provides wellbeing recommendations to support experts in promoting a healthier lifestyle and inducing individuals to adopt healthy habits. Adopting physical activity effectively promotes a healthier lifestyle. A physical activity recommendation system assists users to adopt daily routines to form a best practice of life by involving themselves in healthy physical activities. Traditional physical activity recommendation systems focus on general recommendations applicable to a community of users rather than specific individuals. These recommendations are general in nature and are fit for the community at a certain level, but they are not relevant to every individual based on specific requirements and personal interests. To cover this aspect, we propose a multimodal hybrid reasoning methodology (HRM) that generates personalized physical activity recommendations according to the user's specific needs and personal interests. The methodology integrates the rule-based reasoning (RBR), case-based reasoning (CBR), and preference-based reasoning (PBR) approaches in a linear combination that enables personalization of recommendations. RBR uses explicit knowledge rules from physical activity guidelines, CBR uses implicit knowledge from experts' past experiences, and PBR uses users' personal interests and preferences. To validate the methodology, a weight management scenario is considered and experimented with. The RBR part of the methodology generates goal, weight status, and plan recommendations, the CBR part suggests the top three relevant physical activities for executing the recommended plan, and the PBR part filters out irrelevant recommendations from the suggested ones using the user's personal preferences and interests. To evaluate the methodology, a baseline-RBR system is developed, which is improved first using ranged rules and ultimately using a hybrid-CBR. A comparison of the results of these systems shows that hybrid-CBR outperforms the
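The RBR-then-CBR-then-PBR pipeline the abstract describes can be sketched in miniature; every data structure and name below is invented for illustration and is not the authors' implementation:

```python
def recommend(user, rules, cases, top_k=3):
    """Toy hybrid reasoning pipeline (illustrative sketch):
    RBR: fire the first guideline rule whose condition matches -> plan.
    CBR: rank past cases by similarity to the user, keep the top_k
         activities used in similar cases.
    PBR: filter out activities the user has marked as disliked."""
    # Rule-based reasoning: explicit guideline rules yield a plan.
    plan = next(r["plan"] for r in rules if r["condition"](user))
    # Case-based reasoning: nearest past cases (here, by BMI distance).
    ranked = sorted(cases, key=lambda c: abs(c["bmi"] - user["bmi"]))
    suggestions = [c["activity"] for c in ranked[:top_k]]
    # Preference-based reasoning: respect personal interests.
    return plan, [a for a in suggestions if a not in user["dislikes"]]

# Hypothetical guideline rules, case base, and user profile.
rules = [
    {"condition": lambda u: u["bmi"] >= 25, "plan": "weight loss"},
    {"condition": lambda u: True,           "plan": "weight maintenance"},
]
cases = [
    {"bmi": 27, "activity": "brisk walking"},
    {"bmi": 31, "activity": "swimming"},
    {"bmi": 26, "activity": "cycling"},
    {"bmi": 22, "activity": "running"},
]
user = {"bmi": 28, "dislikes": {"swimming"}}
plan, activities = recommend(user, rules, cases)
```

The weight-management scenario in the paper follows this shape: the rule base sets the goal and plan, the case base proposes the top three activities, and the preference filter removes the ones the user would not adopt.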

  18. Bilinearity, rules, and prefrontal cortex

    Directory of Open Access Journals (Sweden)

    Peter Dayan

    2007-11-01

    Full Text Available Humans can be instructed verbally to perform computationally complex cognitive tasks; their performance then improves relatively slowly over the course of practice. Many skills underlie these abilities; in this paper, we focus on the particular question of a uniform architecture for the instantiation of habitual performance and the storage, recall, and execution of simple rules. Our account builds on models of gated working memory, and involves a bilinear architecture for representing conditional input-output maps and for matching rules to the state of the input and working memory. We demonstrate the performance of our model on two paradigmatic tasks used to investigate prefrontal and basal ganglia function.

  19. Incorporation of ICT in the teaching methodologies of the Specialization in Teaching program at CECAR

    Directory of Open Access Journals (Sweden)

    Asdrúbal Antonio Atencia Andrade

    2013-12-01

    Full Text Available This project addressed the incorporation of ICT into the teaching methodologies of the Specialization in Teaching program of the Corporación Universitaria del Caribe (CECAR) in the city of Sincelejo. Objective: to characterize, from a basic ICT competencies focus, the methodologies of teachers in the Specialization in Teaching program. Methodology: the study was historical-hermeneutic, and grounded theory was used following Sampieri, Fernández & Collado.

  20. Real Rules for Virtual Space

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The international community struggles to find common ground on cyber security. More than 700 representatives from 60 countries attended the London Conference on Cyberspace, hosted by the British Foreign and Commonwealth Office.

  1. Do we need methodological theory to do qualitative research?

    Science.gov (United States)

    Avis, Mark

    2003-09-01

    Positivism is frequently used to stand for the epistemological assumption that empirical science based on principles of verificationism, objectivity, and reproducibility is the foundation of all genuine knowledge. Qualitative researchers sometimes feel obliged to provide methodological alternatives to positivism that recognize their different ethical, ontological, and epistemological commitments and have provided three theories: phenomenology, grounded theory, and ethnography. The author argues that positivism was a doomed attempt to define empirical foundations for knowledge through a rigorous separation of theory and evidence; offers a pragmatic, coherent view of knowledge; and suggests that rigorous, rational empirical investigation does not need methodological theory. Therefore, qualitative methodological theory is unnecessary and counterproductive because it hinders critical reflection on the relation between methodological theory and empirical evidence.

  2. Reinventing Grounded Theory: Some Questions about Theory, Ground and Discovery

    Science.gov (United States)

    Thomas, Gary; James, David

    2006-01-01

    Grounded theory's popularity persists after three decades of broad-ranging critique. In this article three problematic notions are discussed--"theory," "ground" and "discovery"--which linger in the continuing use and development of grounded theory procedures. It is argued that far from providing the epistemic security promised by grounded theory,…

  3. Rules Extraction with an Immune Algorithm

    Directory of Open Access Journals (Sweden)

    Deqin Yan

    2007-12-01

    Full Text Available In this paper, a method of extracting rules from information systems with immune algorithms is proposed. The immune algorithm is designed around a sharing mechanism for extracting rules: the principle of sharing and competing for resources in the sharing mechanism is consistent with the relationship of sharing and rivalry among rules. In order to extract rules efficiently, new concepts of flexible confidence and rule measurement are introduced. Experiments demonstrate that the proposed method is effective.

  4. Ground water in Oklahoma

    Science.gov (United States)

    Leonard, A.R.

    1960-01-01

    One of the first requisites for the intelligent planning of utilization and control of water and for the administration of laws relating to its use is data on the quantity, quality, and mode of occurrence of the available supplies. The collection, evaluation and interpretation, and publication of such data are among the primary functions of the U.S. Geological Survey. Since 1895 the Congress has made appropriations to the Survey for investigation of the water resources of the Nation. In 1929 the Congress adopted the policy of dollar-for-dollar cooperation with the States and local governmental agencies in water-resources investigations of the U.S. Geological Survey. In 1937 a program of ground-water investigations was started in cooperation with the Oklahoma Geological Survey, and in 1949 this program was expanded to include cooperation with the Oklahoma Planning and Resources Board. In 1957 the State Legislature created the Oklahoma Water Resources Board as the principal State water agency and it became the principal local cooperator. The Ground Water Branch of the U.S. Geological Survey collects, analyzes, and evaluates basic information on ground-water resources and prepares interpretive reports based on those data. Cooperative ground-water work was first concentrated in the Panhandle counties. During World War II most work was related to problems of water supply for defense requirements. Since 1945 detailed investigations of ground-water availability have been made in 11 areas, chiefly in the western and central parts of the State. In addition, water levels in more than 300 wells are measured periodically, principally in the western half of the State. In Oklahoma current studies are directed toward determining the source, occurrence, and availability of ground water and toward estimating the quantity of water and rate of replenishment to specific areas and water-bearing formations. Ground water plays an important role in the economy of the State. It is

  5. Effective methodology to make DFM guide line

    Science.gov (United States)

    Choi, Jaeyoung; Shim, Yeonah; Yun, Kyunghee; Choi, Kwangseon; Han, Jaewon

    2009-10-01

    Design For Manufacturing (DFM) has become an important focus in the semiconductor industry as feature sizes on the chip shrink below the 0.13um technology node. Many DFM-related ideas have come up, been tried, and been adopted for a wider process window and higher device performance. As the minimum features shrink, the design rules also become more complicated, but they are still not good enough to describe certain patterns that impose a narrow process window or even device failure. These process hot-spot patterns must therefore be identified, corrected, or removed at the design step. One effort is to supply a DFM guideline to the designer or to add it to conventional DRC rules. However, it is very difficult to make a DFM guideline, because hot-spot patterns must first be detected and then confirmed to be real hot spots or not. In this study, we developed an effective methodology for making DFM guidelines. First, we use the software called nanoscope to detect hot spots on post-OPC layouts, then convert the detected hot-spot patterns into test patterns whose electrical performance can be checked, and then compare the electrical performance according to split conditions. This method is confirmed to be very effective for making DFM guidelines below the 0.13um technology node.

  6. A Grounded Theory Study of Supervision of Preservice Consultation Training

    Science.gov (United States)

    Newman, Daniel S.

    2012-01-01

    The purpose of this study was to explore a university-based supervision process for consultants-in-training (CITs) engaged in a preservice level consultation course with applied practicum experience. The study was approached from a constructivist worldview using a grounded theory methodology. Data consisted of supervision session transcripts,…

  7. University Students' Experiences of Nonmarital Breakups: A Grounded Theory

    Science.gov (United States)

    Hebert, Sarah; Popadiuk, Natalee

    2008-01-01

    Prior nonmarital breakup research has been focused on negative outcomes, rarely examining the personal growth aspects of this experience. In this study, we used a qualitative grounded theory methodology to explore the changes that university students reported experiencing as a result of a heterosexual nonmarital breakup and how those changes…

  8. Recharge estimation for transient ground water modeling.

    Science.gov (United States)

    Jyrkama, Mikko I; Sykes, Jon F; Normani, Stefano D

    2002-01-01

    Reliable ground water models require both an accurate physical representation of the system and appropriate boundary conditions. While physical attributes are generally considered static, boundary conditions, such as ground water recharge rates, can be highly variable in both space and time. A practical methodology incorporating the hydrologic model HELP3 in conjunction with a geographic information system was developed to generate a physically based and highly detailed recharge boundary condition for ground water modeling. The approach uses daily precipitation and temperature records in addition to land use/land cover and soils data. The importance of the method in transient ground water modeling is demonstrated by applying it to a MODFLOW modeling study in New Jersey. In addition to improved model calibration, the results from the study clearly indicate the importance of using a physically based and highly detailed recharge boundary condition in ground water quality modeling, where the detailed knowledge of the evolution of the ground water flowpaths is imperative. The simulated water table is within 0.5 m of the observed values using the method, while the water levels can differ by as much as 2 m using uniform recharge conditions. The results also show that the combination of temperature and precipitation plays an important role in the amount and timing of recharge in cooler climates. A sensitivity analysis further reveals that increasing the leaf area index, the evaporative zone depth, or the curve number in the model will result in decreased recharge rates over time, with the curve number having the greatest impact.
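The abstract's point that temperature and precipitation together control the amount and timing of recharge in cooler climates can be illustrated with a toy daily water balance; this is a deliberately simplified sketch with invented parameters, not the HELP3/GIS methodology the study actually uses:

```python
def daily_recharge(precip_mm, temp_c, et_mm, runoff_frac=0.2, melt_rate=2.0):
    """Toy daily water-balance recharge estimate (illustrative only).
    Precipitation on days at or below 0 C accumulates as snowpack;
    above freezing, snowmelt is released at melt_rate mm per degree-day.
    Rain plus melt, minus a fixed runoff fraction and daily
    evapotranspiration, becomes recharge. Inputs are daily series in mm
    (ET) and degrees C; returns total recharge in mm."""
    snowpack = 0.0
    recharge = 0.0
    for p, t, et in zip(precip_mm, temp_c, et_mm):
        if t <= 0.0:                      # snow accumulates, no infiltration
            snowpack += p
            continue
        melt = min(snowpack, melt_rate * t)
        snowpack -= melt
        water = (p + melt) * (1.0 - runoff_frac)
        recharge += max(0.0, water - et)  # only surplus water recharges
    return recharge

# Three hypothetical days: a snowfall day, a thaw day, a warm rainy day.
total = daily_recharge([10, 0, 5], [-5, 2, 10], [0, 1, 2])
```

Even in this toy model, recharge from the first day's precipitation is delayed until temperatures rise, which is the transient behavior a uniform recharge boundary condition cannot capture.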

  9. Fisheye Photogrammetry: Tests and Methodologies for the Survey of Narrow Spaces

    Science.gov (United States)

    Perfetti, L.; Polari, C.; Fassi, F.

    2017-02-01

    The research illustrated in this article aimed to identify a good standard methodology to survey very narrow spaces during 3D investigation of Cultural Heritage. It is an important topic in today's era of BIM modelling applied to Cultural Heritage. Spaces like staircases, corridors and passages are very common in the architectural or archaeological fields, and obtaining a 3D-oriented survey of those areas can be a very complex task when completeness of the model and high precision are requested. Photogrammetry appears to be the most promising solution in terms of versatility and manoeuvrability, also considering the quality of the required data. Fisheye lenses were studied and tested in depth because of their significant advantage in field of view compared with rectilinear lenses. This advantage alone can be crucial to reduce the total number of photos and, as a consequence, to obtain manageable data, to simplify the survey phase and to significantly reduce the elaboration time. In order to overcome the main issue that arises when using fisheye lenses, which is the lack of rules that can be employed to design the survey, a general mathematical formulation to precisely estimate the GSD (Ground Sampling Distance) for every optical projection is presented here. A complete survey of a real complex case study was performed in order to test and stress the proposed methodology, and to handle a fisheye-based survey from beginning to end: the photogrammetric survey of the Minguzzi Staircase. It is a complex service spiral-staircase located in the Duomo di Milano with a total height of 25 meters and characterized by a narrow walkable space about 70 centimetres wide.
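As a rough sketch of what a projection-aware GSD estimate looks like (this is a generic small-angle approximation for an equidistant fisheye, r = f·θ, not the general formulation derived in the paper), one pixel subtends an angle of pixel pitch over focal length, so the ground footprint scales linearly with distance:

```python
def fisheye_gsd(pixel_pitch_mm, focal_mm, distance_m):
    """Approximate ground sampling distance for an equidistant fisheye
    projection (r = f * theta), where each pixel subtends an angle of
    pixel_pitch / f radians regardless of field angle.

    pixel_pitch_mm: physical pixel size on the sensor (mm)
    focal_mm:       fisheye focal length (mm)
    distance_m:     camera-to-surface distance along the ray (m)
    Returns the approximate ground footprint of one pixel in metres."""
    pixel_angle_rad = pixel_pitch_mm / focal_mm   # angular size of one pixel
    return distance_m * pixel_angle_rad           # small-angle arc on surface

# Hypothetical numbers: 0.004 mm pixels, an 8 mm fisheye, and a surface
# 0.7 m away (about the width of the narrow staircase described above).
gsd_m = fisheye_gsd(0.004, 8.0, 0.7)
```

At 0.7 m this gives a GSD of about 0.35 mm per pixel, which is the kind of figure needed to verify that a planned camera-to-wall distance meets the survey's precision requirement.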

  10. UNCOMMON SENSORY METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladimír Vietoris

    2015-02-01

    Full Text Available Sensory science is a young but rapidly developing field of the food industry. Currently, great emphasis is given to the development of rapid data-collection techniques, the difference between consumers and trained panels is becoming blurred, and the role of sensory methodologists is to prepare ways of evaluation by which a lay panel (consumers) can achieve results identical to those of a trained panel. Currently, there are several conventional methods of sensory evaluation of food (ISO standards), but more and more sensory laboratories are developing methodologies that are less strict in the selection of evaluators, whose mechanism is easily understandable and whose results are easily interpretable. This paper deals with the mapping of marginal methods used in sensory evaluation of food (new types of profiles, CATA, TDS, napping).

  11. Albert Einstein's Methodology

    CERN Document Server

    Weinstein, Galina

    2012-01-01

    This paper discusses Einstein's methodology. 1. Einstein characterized his work as a theory of principle and reasoned that beyond kinematics, the 1905 heuristic relativity principle could offer new connections between non-kinematical concepts. 2. Einstein's creativity and inventiveness and process of thinking; invention or discovery. 3. Einstein considered his best friend Michele Besso as a sounding board and his class-mate from the Polytechnic Marcel Grossman, as his active partner. Yet, Einstein wrote to Arnold Sommerfeld that Grossman will never claim to be considered a co-discoverer of the Einstein-Grossmann theory. He only helped in guiding Einstein through the mathematical literature, but contributed nothing of substance to the results of the theory. Hence, Einstein neither considered Besso or Grossmann as co-discoverers of the relativity theory which he invented.

  12. Situating methodology within qualitative research.

    Science.gov (United States)

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  13. Detection of ground ice using ground penetrating radar method

    Institute of Scientific and Technical Information of China (English)

    Gennady M. Stoyanovich; Viktor V. Pupatenko; Yury A. Sukhobok

    2015-01-01

    The paper presents the results of a ground penetrating radar (GPR) application for the detection of ground ice. We combined an analysis of reflection traveltime curves with a frequency spectrogram analysis. We found special anomalies at specific traces in the traveltime curves and ground-boundary analysis, and obtained a ground model of the subsurface structure which allows the ground ice layer to be identified and delineated.

  14. Collison and Grounding

    DEFF Research Database (Denmark)

    Wang, G.; Ji, C.; Kuhala, P.;

    2006-01-01

    COMMITTEE MANDATE Concern for structural arrangements on ships and floating structures with regard to their integrity and adequacy in the events of collision and grounding, with the view towards risk assessment and management. Consideration shall be given to the frequency of occurrence, the proba...

  15. Power Gating Based Ground Bounce Noise Reduction

    Directory of Open Access Journals (Sweden)

    M. Uma Maheswari

    2014-08-01

    Full Text Available As low power circuits become popular, the decrease in supply voltage with technology scaling leads to an increase in leakage power. To remove these leakages and provide better power efficiency, many power gating techniques are used. However, the leakage due to the ground connection of the active part of the circuit is higher than all other leakages. As it is mainly due to the back EMF of the ground connection, it is called ground bounce noise. Different methodologies have been designed to reduce this noise. This paper presents the design of an efficient technique for ground bounce noise reduction using power gating circuits, and compares the results using the DSCH and Microwind low power tools. We analyse adders such as full adders using different types of power-gated circuits with low power VLSI design techniques, and present comparison results between the different power gating methods.

  16. Dynasting Theory: Lessons in learning grounded theory

    Directory of Open Access Journals (Sweden)

    Johnben Teik-Cheok Loy, MBA, MTS, Ph.D.

    2011-06-01

    Full Text Available This article captures the key learning lessons gleaned from the author's experience learning and developing a grounded theory for his doctoral dissertation using the classic methodology as conceived by Barney Glaser. The theory was developed through data gathered on founders and successors of Malaysian Chinese family-owned businesses. The main concern for Malaysian Chinese family businesses emerged as dynasting: the building, maintaining, and growing of the power and resources of the business within the family lineage. The core category emerged as dynasting across cultures, where founders and successors struggle to transition from traditional Chinese to hybrid cultural and modernized forms of family business from one generation to the next. The key learning lessons were categorized under five headings: (a) sorting through different versions of grounded theory, (b) educating and managing research stakeholders, (c) embracing experiential learning, (d) discovering the core category: grounded intuition, and (e) recognizing limitations and possibilities. Keywords: grounded theory, learning, dynasting, family business, Chinese

  17. All Things Out of Rule

    Science.gov (United States)

    Gregory, Nuala

    2015-01-01

    This article brings together and compares my own artistic practice of drawing/painting and the eighteenth-century novel "Tristram Shandy." In both cases, there is a free play of lines, textual or graphic, which sets "all things out of rule". A whole typology of lines is woven throughout Sterne's text and reappears,…

  18. Introduction to QCD Sum Rules

    Science.gov (United States)

    Dominguez, C. A.

    2013-08-01

    A general, and very basic introduction to QCD sum rules is presented, with emphasis on recent issues to be described at length in other papers in this issue. Collectively, these papers constitute the proceedings of the International Workshop on Determination of the Fundamental Parameters of QCD, Singapore, March 2013.

  19. New Economy - New Policy Rules?

    NARCIS (Netherlands)

    Bullard, J.; Schaling, E.

    2000-01-01

    The U.S. economy appears to have experienced a pronounced shift toward higher productivity over the last five years or so. We wish to understand the implications of such shifts for the structure of optimal monetary policy rules in simple dynamic economies. Accordingly, we begin with a standard econo

  1. A Methodology for Implementing Clinical Algorithms Using Expert-System and Database Tools

    OpenAIRE

    Rucker, Donald W.; Shortliffe, Edward H.

    1989-01-01

    The HyperLipid Advisory System is a combination of an expert system and a database that uses an augmented transition network methodology for implementing clinical algorithms. These algorithms exist as tables from which the separate expert-system rule base sequentially extracts the steps in the algorithm. The rule base assumes that the algorithm has a binary branching structure and models episodes of clinical care, but otherwise makes no assumption regarding the specific clinical domain. Hyper...

  2. Coding Issues in Grounded Theory

    Science.gov (United States)

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  4. A Generalized Carpenter's Rule Theorem for Self-Touching Linkages

    CERN Document Server

    Abbott, Timothy G; Gassend, Blaise

    2009-01-01

    The Carpenter's Rule Theorem states that any chain linkage in the plane can be folded continuously between any two configurations while preserving the bar lengths and without the bars crossing. However, this theorem applies only to strictly simple configurations, where bars intersect only at their common endpoints. We generalize the theorem to self-touching configurations, where bars can touch but not properly cross. At the heart of our proof is a new definition of self-touching configurations of planar linkages, based on an annotated configuration space and limits of nontouching configurations. We show that this definition is equivalent to the previously proposed definition of self-touching configurations, which is based on a combinatorial description of overlapping features. Using our new definition, we prove the generalized Carpenter's Rule Theorem using a topological argument. We believe that our topological methodology provides a powerful tool for manipulating many kinds of self-touching objects, such as...

  5. Rough set and rule-based multicriteria decision aiding

    Directory of Open Access Journals (Sweden)

    Roman Slowinski

    2012-08-01

    The aim of multicriteria decision aiding is to give the decision maker a recommendation concerning a set of objects evaluated from multiple points of view, called criteria. Since a rational decision maker acts with respect to his or her value system, in order to recommend the most-preferred decision one must identify the decision maker's preferences. In this paper, we focus on preference discovery from data concerning some past decisions of the decision maker. We consider the preference model in the form of a set of "if..., then..." decision rules discovered from the data by inductive learning. To structure the data prior to the induction of rules, we use the Dominance-based Rough Set Approach (DRSA). DRSA is a methodology for reasoning about data which handles ordinal evaluations of objects on the considered criteria and monotonic relationships between these evaluations and the decision. We review applications of DRSA to a large variety of multicriteria decision problems.

  6. Controlling False Positives in Association Rule Mining

    CERN Document Server

    Liu, Guimei; Wong, Limsoon

    2011-01-01

    Association rule mining is an important problem in the data mining area. It enumerates and tests a large number of rules on a dataset and outputs rules that satisfy user-specified constraints. Due to the large number of rules being tested, rules that do not represent real systematic effect in the data can satisfy the given constraints purely by random chance. Hence association rule mining often suffers from a high risk of false positive errors. There is a lack of comprehensive study on controlling false positives in association rule mining. In this paper, we adopt three multiple testing correction approaches---the direct adjustment approach, the permutation-based approach and the holdout approach---to control false positives in association rule mining, and conduct extensive experiments to study their performance. Our results show that (1) Numerous spurious rules are generated if no correction is made. (2) The three approaches can control false positives effectively. Among the three approaches, the permutation...
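    The permutation-based approach the abstract refers to can be illustrated for a single candidate rule: shuffle the class labels many times and ask how often the rule's observed confidence is matched by chance. This is a toy single-rule sketch (function and variable names are illustrative, not from the paper); the paper's methods additionally correct across the full set of mined rules.

```python
import random

def rule_permutation_pvalue(antecedent_rows, labels, target,
                            n_perm=1000, seed=0):
    """Permutation p-value for an association rule 'antecedent -> target':
    the fraction of label shuffles whose confidence matches or beats
    the observed confidence (with a +1 continuity correction)."""
    rng = random.Random(seed)

    def confidence(lbls):
        hits = sum(1 for i in antecedent_rows if lbls[i] == target)
        return hits / len(antecedent_rows)

    observed = confidence(labels)
    shuffled = list(labels)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if confidence(shuffled) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)

# A rule whose antecedent picks out exactly the 10 positive transactions
# among 100 is essentially never matched by a random label shuffle.
p = rule_permutation_pvalue(list(range(10)), ['y'] * 10 + ['n'] * 90, 'y')
```

    A rule carrying no real signal (e.g., a 50/50 label split) yields a large p-value under the same test, which is exactly the spurious-rule situation the abstract warns about.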

  7. Mechanics of Ship Grounding

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    In these notes first a simplified mathematical model is presented for analysis of ship hull loading due to grounding on relatively hard and plane sand, clay or rock sea bottoms. In a second section a more rational calculation model is described for the sea bed soil reaction forces on the sea bottom...

  8. Grounding Anger Management

    Directory of Open Access Journals (Sweden)

    Odis E. Simmons, PhD

    2017-06-01

    One of the things that drew me to grounded theory from the beginning was Glaser and Strauss' assertion in The Discovery of Grounded Theory that it was useful as a "theoretical foothold" for practical applications (p. 268). From this, when I was a Ph.D. student studying under Glaser and Strauss in the early 1970s, I devised a GT-based approach to action that I later came to call "grounded action." In this short paper I'll present a very brief sketch of an anger management program I developed in 1992 using grounded action. I began my research by attending a two-day anger management training workshop designed for training professionals in the most commonly used anger management model. Like other intervention programs I had seen, this model took a psychologizing and pathologizing approach to the issue. Following this, I sat through the full course of an anger management program that used this model, observing the reactions of the participants and the approach of the facilitator. After each session I conducted open-ended interviews with most of the participants, either individually or in groups of two or three. I had also done previous research in counseling and social work contexts that turned out to be very relevant to an anger management program design.

  9. Grounding in Instant Messaging

    Science.gov (United States)

    Fox Tree, Jean E.; Mayer, Sarah A.; Betts, Teresa E.

    2011-01-01

    In two experiments, we investigated predictions of the "collaborative theory of language use" (Clark, 1996) as applied to instant messaging (IM). This theory describes how the presence and absence of different grounding constraints causes people to interact differently across different communicative media (Clark & Brennan, 1991). In Study 1, we…

  10. Informed Grounded Theory

    Science.gov (United States)

    Thornberg, Robert

    2012-01-01

    There is a widespread idea that in grounded theory (GT) research, the researcher has to delay the literature review until the end of the analysis to avoid contamination--a dictum that might turn educational researchers away from GT. Nevertheless, in this article the author (a) problematizes the dictum of delaying a literature review in classic…

  11. TARDEC Ground Vehicle Robotics

    Science.gov (United States)

    2013-05-30

    Optionally Manned Vehicles (OMV): an OMV can be driven by a soldier, can drive a soldier, or can be remotely operated; all OMV missions are covered by these two modalities (shared driving and remote operation), together with mission payloads.

  12. SUM-RULES FOR MAGNETIC DICHROISM IN RARE-EARTH 4F-PHOTOEMISSION

    NARCIS (Netherlands)

    THOLE, BT; VANDERLAAN, G

    1993-01-01

    We present new sum rules for magnetic dichroism in spin polarized photoemission from partly filled shells which give the expectation values of the orbital and spin magnetic moments and their correlations in the ground state. We apply this to the 4f photoemission of rare earths, where the

  13. On what grounds?

    DEFF Research Database (Denmark)

    Markussen, Thomas; Krogh, Peter Gall; Bang, Anne Louise

    2015-01-01

    Research through design is a murky field and there is an increasing interest in understanding its varied practices and methodology. In the research literature that is initially reviewed in this paper two positions are located as the most dominant representing opposite opinions concerning the natu...

  14. Exploration of SWRL Rule Bases through Visualization, Paraphrasing, and Categorization of Rules

    Science.gov (United States)

    Hassanpour, Saeed; O'Connor, Martin J.; Das, Amar K.

    Rule bases are increasingly being used as repositories of knowledge content on the Semantic Web. As the size and complexity of these rule bases increases, developers and end users need methods of rule abstraction to facilitate rule management. In this paper, we describe a rule abstraction method for Semantic Web Rule Language (SWRL) rules that is based on lexical analysis and a set of heuristics. Our method results in a tree data structure that we exploit in creating techniques to visualize, paraphrase, and categorize SWRL rules. We evaluate our approach by applying it to several biomedical ontologies that contain SWRL rules, and show how the results reveal rule patterns within the rule base. We have implemented our method as a plug-in tool for Protégé-OWL, the most widely used ontology modeling software for the Semantic Web. Our tool can allow users to rapidly explore content and patterns in SWRL rule bases, enabling their acquisition and management.

  15. The diagnostic rules of peripheral lung cancer preliminary study based on data mining technique

    Institute of Scientific and Technical Information of China (English)

    Yongqian Qiang; Youmin Guo; Xue Li; Qiuping Wang; Hao Chen; Duwu Cui

    2007-01-01

    Objective: To discuss the clinical and imaging diagnostic rules of peripheral lung cancer using data mining techniques, to explore new ideas in the diagnosis of peripheral lung cancer, and to obtain early-stage technology and knowledge support for computer-aided detection (CAD). Methods: 58 cases of peripheral lung cancer confirmed by clinical pathology were collected. The data were imported into the database after the clinical and CT findings attributes were standardized and identified. The data were studied comparatively using Association Rules (AR) from the knowledge discovery process, and the Rough Set (RS) reduction algorithm and Genetic Algorithm (GA) of the generic data analysis tool ROSETTA, respectively. Results: The genetic classification algorithm of ROSETTA generated about 5,000 diagnosis rules. The RS reduction algorithm (Johnson's algorithm) generated 51 diagnosis rules and the AR algorithm generated 123 diagnosis rules. All three data mining methods identify gender, age, cough, location, lobulation sign, shape, and ground-glass density attributes as the main basis for the diagnosis of peripheral lung cancer. Conclusion: The diagnosis rules for peripheral lung cancer obtained with the three data mining techniques are consistent with clinical diagnostic rules, and these rules can also be used to build the knowledge base of an expert system. This study demonstrates the potential value of data mining technology in clinical imaging diagnosis and differential diagnosis.

  16. Revising the "Rule of Three" for inferring seizure freedom.

    Science.gov (United States)

    Westover, M Brandon; Cormier, Justine; Bianchi, Matt T; Shafi, Mouhsin; Kilbride, Ronan; Cole, Andrew J; Cash, Sydney S

    2012-02-01

    How long after starting a new medication must a patient go without seizures before they can be regarded as seizure-free? A recent International League Against Epilepsy (ILAE) task force proposed using a "Rule of Three" as an operational definition of seizure freedom, according to which a patient should be considered seizure-free following an intervention after a period without seizures has elapsed equal to three times the longest preintervention interseizure interval over the previous year. This rule was motivated in large part by statistical considerations advanced in a classic 1983 paper by Hanley and Lippman-Hand. However, strict adherence to the statistical logic of this rule generally requires waiting much longer than recommended by the ILAE task force. Therefore, we set out to determine whether an alternative approach to the Rule of Three might be possible, and under what conditions the rule may be expected to hold or would need to be extended. Probabilistic modeling and application of Bayes' rule. We find that an alternative approach to the problem of inferring seizure freedom supports using the Rule of Three in the way proposed by the ILAE in many cases, particularly in evaluating responses to a first trial of antiseizure medication, and to favorably-selected epilepsy surgical candidates. In cases where the a priori odds of success are less favorable, our analysis requires longer seizure-free observation periods before declaring seizure freedom, up to six times the average preintervention interseizure interval. The key to our approach is to take into account not only the time elapsed without seizures but also empirical data regarding the a priori probability of achieving seizure freedom conferred by a particular intervention. 
In many cases it may be reasonable to consider a patient seizure-free after they have gone without seizures for a period equal to three times the preintervention interseizure interval, as proposed on pragmatic grounds in a recent ILAE
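    The statistical core of this reasoning can be sketched with Bayes' rule: assume seizures recur as a Poisson process at the preintervention rate if the intervention failed, and combine the seizure-free observation time with a prior probability that the intervention succeeded. The function below is a minimal illustration of that logic, not the authors' actual model; the prior values used are hypothetical.

```python
import math

def prob_seizure_free(prior_success, t_over_isi):
    """Posterior probability that an intervention abolished seizures,
    after observing a seizure-free period of t_over_isi times the mean
    preintervention interseizure interval (Poisson recurrence assumed)."""
    # If the intervention failed, seizures recur at the old rate, so
    # P(no seizure in time t | failure) = exp(-t / ISI).
    p_silent_given_fail = math.exp(-t_over_isi)
    # Bayes' rule: a successful intervention always yields a silent period.
    return prior_success / (prior_success
                            + (1 - prior_success) * p_silent_given_fail)

# With even prior odds, three interseizure intervals already give ~95%
# confidence; with a pessimistic 20% prior, roughly six are needed.
p3 = prob_seizure_free(0.5, 3.0)
p6 = prob_seizure_free(0.2, 6.0)
```

    This mirrors the "three-to-six" conclusion: the less favorable the a priori odds of success conferred by the intervention, the longer the seizure-free observation period must be before declaring seizure freedom.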

  17. Revising the Rule Of Three For Inferring Seizure Freedom

    Science.gov (United States)

    Westover, M. Brandon; Cormier, Justine; Bianchi, Matt T.; Shafi, Mouhsin; Kilbride, Ronan; Cole, Andrew J.; Cash, Sydney S.

    2011-01-01

    grounds in a recent ILAE position paper, though in other commonly encountered cases a waiting time up to six times this interval is required. In this work we have provided a coherent theoretical basis for modified criterion for seizure freedom, which we call the “Rule of Three-To-Six”. PMID:22191711

  18. 76 FR 60572 - Self-Regulatory Organizations; The Options Clearing Corporation; Order Approving Proposed Rule...

    Science.gov (United States)

    2011-09-29

    ... member to which it has the largest exposure in extreme but plausible market conditions.'' The publication... proposed for the previous proposed rule change; discuss the adaptation of the methodology underlying the... 2008, which were two months of extreme volatility in the U.S. securities markets, the revised...

  19. Assessing Financial Education Methods: Principles vs. Rules-of-Thumb Approaches

    Science.gov (United States)

    Skimmyhorn, William L.; Davies, Evan R.; Mun, David; Mitchell, Brian

    2016-01-01

    Despite thousands of programs and tremendous public and private interest in improving financial decision-making, little is known about how best to teach financial education. Using an experimental approach, the authors estimated the effects of two different education methodologies (principles-based and rules-of-thumb) on the knowledge,…

  20. Only half right: species with female-biased sexual size dimorphism consistently break Rensch's rule.

    Directory of Open Access Journals (Sweden)

    Thomas J Webb

    BACKGROUND: Most animal species display sexual size dimorphism (SSD): males and females consistently attain different sizes, most frequently with females being larger than males. However, the selective mechanisms driving patterns of SSD remain controversial. Rensch's rule proposes a general scaling phenomenon for all taxa, whereby SSD increases with average body size when males are larger than females, and decreases with body size when females are larger than males. Rensch's rule appears to be general in the former case, but there is little evidence for the rule when females are larger than males. METHODOLOGY/PRINCIPAL FINDINGS: Using comprehensive data for 1291 species of birds across 30 families, we find strong support for Rensch's rule in families where males are typically larger than females, but no overall support for the rule in families with female-biased SSD. Reviewing previous studies of a broad range of taxa (arthropods, reptiles, fish, and birds) showing predominantly female-biased SSD, we conclude that Rensch's conjecture is the exception rather than the rule in such species. CONCLUSIONS/SIGNIFICANCE: The absence of consistent scaling of SSD in taxa with female-biased SSD, the most prevalent direction of dimorphism, calls into question previous general evolutionary explanations for Rensch's rule. We propose that, unlike several other ecological scaling relationships, Rensch's rule does not exist as an independent scaling phenomenon.
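    A common operational test of Rensch's rule regresses log male size on log female size across species: a slope greater than 1 indicates SSD increasing with body size. The sketch below uses hypothetical masses and plain OLS for clarity; published analyses such as this one typically use reduced major axis regression and phylogenetic comparative methods instead.

```python
import math

def ols_slope(xs, ys):
    """Ordinary least squares slope of ys on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    return sxy / sxx

# Hypothetical masses (g) for four related species with male-biased SSD
# that grows with body size (male/female ratio rising from 1.1 to 1.4).
female = [10.0, 20.0, 40.0, 80.0]
male = [11.0, 24.0, 52.0, 112.0]
slope = ols_slope([math.log(f) for f in female],
                  [math.log(m) for m in male])
# slope > 1: allometry consistent with Rensch's rule
```

    Under female-biased SSD, Rensch's rule predicts the analogous slope to be less than 1; the paper's finding is that empirical slopes in such taxa mostly fail to show this.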

  1. Gaussian quadrature for splines via homotopy continuation: Rules for C2 cubic splines

    KAUST Repository

    Barton, Michael

    2015-10-24

    We introduce a new concept for generating optimal quadrature rules for splines. To generate an optimal quadrature rule in a given (target) spline space, we build an associated source space with a known optimal quadrature rule and transfer the rule from the source space to the target one, while preserving the number of quadrature points and therefore optimality. The quadrature nodes and weights, considered as a point in a higher-dimensional space, form a zero of a particular system of polynomial equations. As the space is continuously deformed by changing the source knot vector, the quadrature rule is updated using polynomial homotopy continuation. For example, starting with C1 cubic splines with uniform knot sequences, we demonstrate the methodology by deriving the optimal rules for uniform C2 cubic spline spaces, where the rule was only conjectured to date. We validate our algorithm by showing that the resulting quadrature rule is independent of the path chosen between the target and the source knot vectors, as well as of the source rule chosen.
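    The notion of optimality at stake here — the maximum polynomial degree integrated exactly per quadrature node — can be illustrated with the classical Gauss-Legendre case, a minimal stand-in for the spline rules the paper derives:

```python
import math

# Two-point Gauss-Legendre rule on [-1, 1]: nodes at +/- 1/sqrt(3),
# weights 1. With n = 2 nodes it integrates polynomials up to degree
# 2n - 1 = 3 exactly, which is what "optimal" means for such rules.
nodes = [-1.0 / math.sqrt(3.0), 1.0 / math.sqrt(3.0)]
weights = [1.0, 1.0]

def gauss2(f):
    """Apply the two-point rule to a function f on [-1, 1]."""
    return sum(w * f(x) for w, x in zip(weights, nodes))

# The integral of x^3 + x^2 over [-1, 1] is exactly 2/3, and the
# two-point rule reproduces it to machine precision.
approx = gauss2(lambda x: x**3 + x**2)
```

    The spline rules in the paper generalize this exactness requirement to piecewise-polynomial spaces, where node positions are no longer available in closed form — hence the homotopy continuation.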

  2. Engineering radioecology: Methodological considerations

    Energy Technology Data Exchange (ETDEWEB)

    Nechaev, A.F.; Projaev, V.V. [St. Petersburg State Inst. of Technology (Russian Federation); Sobolev, I.A.; Dmitriev, S.A. [United Ecologo-Technological and Research Center on Radioactive Waste Management and Environmental Remediation, Moscow (Russian Federation)

    1995-12-31

    The term "radioecology" has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise, generally acknowledged structure, a unified methodical basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the termination of the Cold War and because of remarkable political changes in the world, it has become possible to convert the problem of environmental restoration from the scientific sphere into particularly practical terms. Already the first steps clearly showed the imperfection of existing technologies, managerial and regulatory schemes; a lack of qualified specialists, relevant methods and techniques; uncertainties in the methodology of decision-making, etc. Thus, building up (or maybe structuring) a special scientific and technological basis, which the authors call "engineering radioecology", seems to be an important task. In this paper they endeavor to substantiate this thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology.

  3. Cancer cytogenetics: methodology revisited.

    Science.gov (United States)

    Wan, Thomas S K

    2014-11-01

    The Philadelphia chromosome was the first genetic abnormality discovered in cancer (in 1960), and it was found to be consistently associated with CML. The description of the Philadelphia chromosome ushered in a new era in the field of cancer cytogenetics. Accumulating genetic data have been shown to be intimately associated with the diagnosis and prognosis of neoplasms; thus, karyotyping is now considered a mandatory investigation for all newly diagnosed leukemias. The development of FISH in the 1980s overcame many of the drawbacks of assessing the genetic alterations in cancer cells by karyotyping. Karyotyping of cancer cells remains the gold standard since it provides a global analysis of the abnormalities in the entire genome of a single cell. However, subsequent methodological advances in molecular cytogenetics based on the principle of FISH that were initiated in the early 1990s have greatly enhanced the efficiency and accuracy of karyotype analysis by marrying conventional cytogenetics with molecular technologies. In this review, the development, current utilization, and technical pitfalls of both the conventional and molecular cytogenetics approaches used for cancer diagnosis over the past five decades will be discussed.

  4. Scientific methodology applied.

    Science.gov (United States)

    Lussier, A

    1975-04-01

    The subject of this symposium is naproxen, a new drug that resulted from an investigation to find a superior anti-inflammatory agent. It was synthesized by Harrison et al. in 1970 at the Syntex Institute of Organic Chemistry and Biological Sciences. How can we chart the evolution of this or any other drug? Three steps are necessary: first, chemical studies (synthesis, analysis); second, animal pharmacology; third, human pharmacology. The last step can additionally be divided into four phases: metabolism and toxicology of the drug in normal volunteers; dose titration and initial clinical trials with sick subjects (pharmacometry); confirmatory clinical trials when the drug is accepted on the market and revaluation (familiarization trials). To discover the truth about naproxen, we must all participate actively with a critical mind, following the principles of scientific methodology. We shall find that the papers to be presented today all deal with the third step in the evaluation process--clinical pharmacology. It is quite evident that the final and most decisive test must be aimed at the most valuable target: the human being. The end product of this day's work for each of us should be the formation of an opinion based on solid scientific proofs. And let us hope that we will all enjoy fulfilling the symposium in its entire etymological meaning this evening. In vino veritas.

  5. Glycaemic index methodology.

    Science.gov (United States)

    Brouns, F; Bjorck, I; Frayn, K N; Gibbs, A L; Lang, V; Slama, G; Wolever, T M S

    2005-06-01

    The glycaemic index (GI) concept was originally introduced to classify different sources of carbohydrate (CHO)-rich foods, usually having an energy content of >80 % from CHO, to their effect on post-meal glycaemia. It was assumed to apply to foods that primarily deliver available CHO, causing hyperglycaemia. Low-GI foods were classified as being digested and absorbed slowly and high-GI foods as being rapidly digested and absorbed, resulting in different glycaemic responses. Low-GI foods were found to induce benefits on certain risk factors for CVD and diabetes. Accordingly it has been proposed that GI classification of foods and drinks could be useful to help consumers make 'healthy food choices' within specific food groups. Classification of foods according to their impact on blood glucose responses requires a standardised way of measuring such responses. The present review discusses the most relevant methodological considerations and highlights specific recommendations regarding number of subjects, sex, subject status, inclusion and exclusion criteria, pre-test conditions, CHO test dose, blood sampling procedures, sampling times, test randomisation and calculation of glycaemic response area under the curve. All together, these technical recommendations will help to implement or reinforce measurement of GI in laboratories and help to ensure quality of results. Since there is current international interest in alternative ways of expressing glycaemic responses to foods, some of these methods are discussed.
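    The "glycaemic response area under the curve" referred to above is conventionally an incremental AUC: trapezoidal area above the fasting baseline only. The following is a simplified sketch of that calculation (it clips below-baseline segments to zero rather than using the exact crossing-point geometry of the full iAUC method); function and variable names are illustrative.

```python
def incremental_auc(times, glucose, baseline=None):
    """Approximate incremental area under a glycaemic response curve:
    trapezoidal integration of glucose above the fasting baseline,
    with below-baseline excursions clipped to zero.
    times in minutes, glucose in mmol/L."""
    if baseline is None:
        baseline = glucose[0]  # fasting (time-zero) value
    area = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        y0 = max(glucose[i - 1] - baseline, 0.0)
        y1 = max(glucose[i] - baseline, 0.0)
        area += dt * (y0 + y1) / 2.0
    return area

# A response peaking at +2 mmol/L at 30 min, back to baseline at 60 min.
auc = incremental_auc([0, 15, 30, 45, 60], [5.0, 6.0, 7.0, 6.0, 5.0])
```

    The GI of a test food is then expressed as its iAUC as a percentage of the iAUC of a reference food (glucose or white bread) containing the same amount of available carbohydrate, averaged over subjects — which is why the standardization issues the review discusses matter so much.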

  6. Ground Control for Emplacement Drifts for SR

    Energy Technology Data Exchange (ETDEWEB)

    Y. Sun

    2000-04-07

    This analysis demonstrates that a satisfactory ground control system can be designed for the Yucca Mountain site, and provides the technical basis for the design of ground support systems to be used in repository emplacement and non-emplacement drifts. The repository ground support design was based on analytical methods using acquired computer codes, and focused on the final support systems. A literature review of case histories, including the lessons learned from the design and construction of the ESF, the studies on the seismic damages of underground openings, and the use of rock mass classification systems in the ground support design, was conducted (Sections 6.3.4 and 6.4). This review provided some basis for determining the inputs and methodologies used in this analysis. Stability of the supported and unsupported emplacement and non-emplacement drifts was evaluated in this analysis. The excavation effects (i.e., state of the stress change due to excavation), thermal effects (i.e., due to heat output from waste packages), and seismic effects (i.e., from potential earthquake events) were evaluated, and stress controlled modes of failure were examined for two in situ stress conditions (k_0=0.3 and 1.0) using rock properties representing rock mass categories of 1 and 5. Variation of rock mass units such as the non-lithophysal (Tptpmn) and lithophysal (Tptpll) was considered in the analysis. The focus was on the non-lithophysal unit because this unit appears to be relatively weaker and has much smaller joint spacing. Therefore, the drift stability and ground support needs were considered to be controlled by the design for this rock unit. The ground support systems for both emplacement and non-emplacement drifts were incorporated into the models to assess their performance under in situ, thermal, and seismic loading conditions. Both continuum and discontinuum modeling approaches were employed in the analyses of the rock mass behavior and in the evaluation of the

  7. Employees' and Managers' Accounts of Interactive Workplace Learning: A Grounded Theory of "Complex Integrative Learning"

    Science.gov (United States)

    Armson, Genevieve; Whiteley, Alma

    2010-01-01

    Purpose: The purpose of this paper is to investigate employees' and managers' accounts of interactive learning and what might encourage or inhibit emergent learning. Design/methodology/approach: The approach taken was a constructivist/social constructivist ontology, interpretive epistemology and qualitative methodology, using grounded theory…

  8. Qualitative Methodology in Unfamiliar Cultures

    DEFF Research Database (Denmark)

    Svensson, Christian Franklin

    2014-01-01

    on a qualitative methodology, conscious reflection on research design and objectivity is important when doing fieldwork. This case study discusses such reflections. Emphasis throughout is given to applied qualitative methodology and its contributions to the social sciences, in particular having to do with relational, emotional, and ethical issues associated with interviewing and personal observation. Although the empirical setting of this case is Southeast Asia, the various discussions and interrelatedness of methodology, theory, and empirical reflections will prove applicable to field studies throughout...

  9. Infrasonic induced ground motions

    Science.gov (United States)

    Lin, Ting-Li

    On January 28, 2004, the CERI seismic network recorded seismic signals generated by an unknown source. Our conclusion is that the acoustic waves were initiated by an explosive source near the ground surface. The meteorological temperature and effective sound speed profiles suggested existence of an efficient near-surface waveguide that allowed the acoustic disturbance to propagate to large distances. An explosion occurring in an area of forest and farms would have limited the number of eyewitnesses. Resolution of the source might be possible by experiment or by detailed analysis of the ground motion data. A seismo-acoustic array was built to investigate thunder-induced ground motions. Two thunder events with similar N-wave waveforms but different horizontal slownesses are chosen to evaluate the credibility of using thunder as a seismic source. These impulsive acoustic waves excited P and S reverberations in the near surface that depend on both the incident wave horizontal slowness and the velocity structure in the upper 30 meters. Nineteen thunder events were chosen to further investigate the seismo-acoustic coupling. The consistent incident slowness differences between acoustic pressure and ground motions suggest that ground reverberations were first initiated somewhat away from the array. Acoustic and seismic signals were used to generate the time-domain transfer function through the deconvolution technique. Possible non-linear interaction for acoustic propagation into the soil at the surface was observed. The reverse radial initial motions suggest a low Poisson's ratio for the near-surface layer. The acoustic-to-seismic transfer functions show a consistent reverberation series of the Rayleigh wave type, which has a systematic dispersion relation to incident slownesses inferred from the seismic ground velocity. 
Air-coupled Rayleigh wave dispersion was used to quantitatively constrain the near-surface site structure with constraints afforded by near-surface body

  10. Transmission pricing: paradigms and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Shirmohammadi, Dariush [Pacific Gas and Electric Co., San Francisco, CA (United States); Vieira Filho, Xisto; Gorenstin, Boris [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, Mario V.P. [Power System Research, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    In this paper we describe the principles of several paradigms and methodologies for pricing transmission services. The paper outlines some of the main characteristics of these paradigms and methodologies such as where they may be used for best results. Due to their popularity, power flow based MW-mile and short run marginal cost pricing methodologies will be covered in some detail. We conclude the paper with examples of the application of these two pricing methodologies for pricing transmission services in Brazil. (author) 25 refs., 2 tabs.

  11. Climate-friendly Default Rules

    DEFF Research Database (Denmark)

    Sunstein, Cass R.; Reisch, Lucia A.

    . The underlying reasons include the power of suggestion; inertia and procrastination; and loss aversion. If well-chosen, climate-friendly defaults are likely to have large effects in reducing the economic and environmental harms associated with various products and activities. ... between climate-friendly products or services and alternatives that are potentially damaging to the climate but less expensive? The answer may well depend on the default rule. Indeed, climate-friendly default rules may well be a more effective tool for altering outcomes than large economic incentives. In deciding whether to establish climate-friendly defaults, choice architects (subject to legal constraints) should consider both consumer welfare and a wide range of other costs and benefits. Sometimes that assessment will argue strongly in favor of climate-friendly defaults, particularly when both economic and environmental...

  12. METHODOLOGY OF PROFESSIONAL PEDAGOGICAL EDUCATION: THEORY AND PRACTICE (theoretical and methodological foundations of vocational teacher education

    Directory of Open Access Journals (Sweden)

    Evgeny M. Dorozhkin

    2014-01-01

    Full Text Available The study is aimed at justifying a new approach to the problem of vocational education development through the prism of the interdependence of research methodology and practice. This conceptual setup allows determining the main directions for modernizing teacher training for vocational schools. The authors note that the current socio-economic situation in our country has made the problem of personnel training acute. Politicians, economists and scientists alike speak of a shortage of skilled personnel, and they see the main reason for this situation in the present system of primary and secondary vocational education. In particular, they are concerned about the current practice of training the pedagogical personnel of vocational education, who are expected to restore the system of vocational education. Russia has considerable positive experience in solving this problem. The scientific-methodological centre for vocational teacher education is the Russian State Vocational Pedagogical University, under the scientific direction of Academician of the Russian Academy of Education G. M. Romantsev. Reflection on the scientific-theoretical bases of this education led the authors to the analysis and design of existing and new professional-pedagogical methodology. Methods. The fundamental position of A. M. Novikov on the generality of research (scientific) and practical activity methodology has become the theoretical platform of the present study. The conceptual field, conceptual statements and professional model are presented as a whole system (or integrating factor). This theoretical framework has determined the logic of the study and its results. Differentiating scientific and educational methodology in terms of the subject of cognitive activity has allowed identifying the main scientific and practical disciplines of vocational teacher education. The creative concept as the subject ground is instrumental analysis of

  13. Assimilating to Hierarchical Culture: A Grounded Theory Study on Communication among Clinical Nurses.

    Science.gov (United States)

    Kim, MinYoung; Oh, Seieun

    2016-01-01

    The purpose of this study was to generate a substantive model that accounts for the explanatory social processes of communication in which nurses were engaged in clinical settings in Korea. Grounded theory methodology was used in this study. A total of 15 clinical nurses participated in the in-depth interviews. "Assimilating to the hierarchical culture" emerged as the basic social process of communication in which the participants engaged in their work environments. To adapt to the cultures of their assigned wards, the nurses learned to be silent and engaged in their assimilation into the established hierarchy. The process of assimilation consisted of three phases based on the major goals that nurses worked to achieve: getting to know about unspoken rules, persevering within the culture, and acting as senior nurse. Seven strategies and actions utilized to achieve the major tasks emerged as subcategories, including receiving strong disapproval, learning by observing, going silent, finding out what is acceptable, minimizing distress, taking advantages as senior nurse, and taking responsibilities as senior nurse. The findings identified how the pattern of communication in nursing organizations affected the way in which nurses were assimilated into organizational culture, from individual nurses' perspectives. In order to improve the rigid working atmosphere and culture in nursing organizations and increase members' satisfaction with work and quality of life, managers and staff nurses need training that focuses on effective communication and encouraging peer opinion-sharing within horizontal relationships. Moreover, organization-level support should be provided to create an environment that encourages free expression.

  14. Methodology for Estimating Ingestion Dose for Emergency Response at SRS

    CERN Document Server

    Simpkins, A A

    2002-01-01

    At the Savannah River Site (SRS), emergency response models estimate dose for the inhalation and ground shine pathways. A methodology has been developed to incorporate ingestion doses into the emergency response models. The methodology follows a two-phase approach. The first phase estimates site-specific derived response levels (DRLs), which can be compared with predicted ground-level concentrations to determine if intervention is needed to protect the public. This phase uses accepted methods with little deviation from recommended guidance. The second phase uses site-specific data to estimate a "best estimate" dose to offsite individuals from ingestion of foodstuffs. While this method deviates from recommended guidance, it is technically defensible and more realistic. As guidance is updated, these methods also will need to be updated.
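
    The first-phase screening step described above amounts to a simple comparison of predicted ground-level concentrations against the derived response levels. A minimal sketch, with invented nuclide names and numbers rather than actual SRS values:

    ```python
    # Hedged sketch of DRL screening: flag each nuclide whose predicted
    # ground-level concentration exceeds its derived response level (DRL).
    # Nuclides, units, and values below are illustrative only.

    def needs_intervention(predicted, drl):
        """Return {nuclide: True if predicted concentration exceeds the DRL}."""
        return {n: predicted[n] > drl[n] for n in predicted}

    drl = {"I-131": 5.0e3, "Cs-137": 2.0e3}    # Bq/m^2, hypothetical DRLs
    pred = {"I-131": 7.1e3, "Cs-137": 4.0e2}   # model-predicted deposition
    flags = needs_intervention(pred, drl)       # only I-131 triggers review
    ```

    In practice the second "best estimate" phase would only be run for the flagged nuclides and affected sectors.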

  15. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    Science.gov (United States)

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa and those wishing to use it as their research methodology can find support through the writing of a number of Maori academics. What is not so well articulated, is the experiential voice of those who have used Kaupapa Maori as research methodology. My identity as a Maori woman…

  16. Decision Rules for Enhanced Breakout.

    Science.gov (United States)

    1987-03-20

    AD-A182 753: Decision Rules for Enhanced Breakout (U). Modern Technologies Corp., Dayton, OH. T. M. McCann. 20 Mar 87. MTC-TR-8883-02, BRMC-85-564, F33615... information to develop a priori estimates of the cost to break out specific spare parts. In addition, some recommendations were to be developed... implementing directives. Applicable AF and AFLC Regulations and Pamphlets were also reviewed to obtain information on costs and their estimated

  17. On the Fermi Golden Rule

    DEFF Research Database (Denmark)

    Jensen, Arne; Nenciu, Gheorghe

    2008-01-01

    We review and further develop the framework in [9] of the stationary theory of resonances arising by perturbation of eigenvalues that are either at a threshold or embedded in the continuum. While in [9] only non-degenerate eigenvalues were considered, here we add some results for the degenerate case. [9] A. Jensen and G. Nenciu, The Fermi Golden Rule and its form at thresholds in odd dimensions. Comm. Math. Phys. 261 (2006), 693-727.

  18. Horizons Revealed: From Methodology to Method

    Directory of Open Access Journals (Sweden)

    Turner de Sales

    2003-03-01

    Full Text Available In this article, the author reports on a method crafted to interrogate the data of a Gadamerian hermeneutic phenomenological study that explored hope seen through the eyes of a small number of Australian youth. She advocates for transparency throughout data analysis, by commencing with an explication of Gadamerian hermeneutic phenomenology, followed by a description of the manner by which the data were interrogated. It is a basic premise of this work that all too often authors have adopted thematic analysis uncritically, and have used this method of analysis without considering its fit to the philosophical or methodological orientation of the study, and this practice has remained, by and large, unchallenged. While not advocating against thematic analysis per se, the author disputes that this analytical method is appropriate for studies that are grounded by the philosophical underpinnings of Gadamerian hermeneutic phenomenology, and therefore offers a unique method of data analysis.

  19. Interpretive research methodology: broadening the dialogue.

    Science.gov (United States)

    Lowenberg, J S

    1993-12-01

    This article expands the dialogue on interpretive research methodology, locating this set of approaches within a broad historical and interdisciplinary context. Several of the most commonly held misconceptions in nursing, particularly those related to the meanings and derivations ascribed to "grounded theory," "symbolic interactionism," and "ethnography," are examined. The interpretive research approaches not only have gained broader acceptance across disciplines, but also have shifted in more radical and often less structured directions during the past decade. Several pivotal areas of these ongoing shifts are analyzed for their relevance to nursing research: the influence of critical and feminist theory and postmodernism; the ambiguity inherent in both everyday life and the research enterprise; the importance of locating the researcher; power and status inequities; the problematic aspects of language, meaning, and representation; and the emphasis on reflexivity and context as constitutive of meaning.

  20. A History of the Double-Bond Rule

    Science.gov (United States)

    Hoogenboom, Bernard E.

    1998-05-01

    The tautomeric polar systems recognized by Laar in 1886 contain an active atom that appeared to migrate from its original position. The tautomeric systems are of a general structural form and can be represented as X=Y-Z-A. Later workers recognized the same bond weakening effect in a variety of organic structures in which atom A is halogen, hydrogen, carbon, or nitrogen. Hermann Staudinger recognized the weakness of that bond, an allyl bond, in hydrocarbons and exploited the behavior for the preparation of isoprene from terpene hydrocarbons. In 1922 he formulated a generality, a rule, regarding the allyl bond reactivity. He noted that natural rubber also decomposed to form isoprene and therefore concluded that natural rubber is an unsaturated hydrocarbon, that isoprene units in natural rubber represent weakly held allyl substituents, and that natural rubber is a macromolecular combination of isoprene units. From his different experience as an industrial chemist, Otto Schmidt recognized the same bond weakening effect in hydrocarbons and in 1932 postulated the "Double-Bond Rule," stating that the presence of a double bond in a hydrocarbon has an alternating strengthening and weakening effect on single bonds throughout the molecule, diminishing with distance from the double bond. Schmidt not only understood the practical benefit of this rule, but he also offered an explanation for the Rule on theoretical grounds. Novel in its time, his theoretical explanation did not find popular acceptance, despite his considerable efforts to promote it in the literature. His concept of the Rule was supplanted by the new theory of resonance devised by Pauling and Wheland and by the implied notion of the stabilization of products by delocalization effects.

  1. Microbiological Methodology in Astrobiology

    Science.gov (United States)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered by future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. Antarctic glacier and Earth permafrost habitats, where living microbial cells have preserved viability for millennia by entering the anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is of great importance to ensure the authenticity of microorganisms (if any) in studied samples and to standardize the protocols used, in order to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as a target in the search for evidence of life, bearing in mind the scenario that living microorganisms have not been preserved and underwent mineralization. Under laboratory conditions, the processes that accompany fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble, in their morphological properties, those found in natural Earth habitats. Given the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use some previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ and comparison with the analogous data obtained for laboratory microbial cultures and

  2. Reactions to Reading 'Remaining Consistent with Method? An Analysis of Grounded Theory Research in Accounting': A Comment on Gurd

    OpenAIRE

    2008-01-01

    Purpose: This paper is a comment on Gurd's paper published in QRAM 5(2) on the use of grounded theory in interpretive accounting research. Methodology: Like Gurd, we conducted a bibliographic study on prior pieces of research claiming the use of grounded theory. Findings: We found a large diversity of ways of doing grounded theory. There are as many ways as articles. Consistent with the spirit of grounded theory, the field suggested the research questions, methods and verifiability criteria. ...

  3. THE EFFECTS OF PRELIMINARY RULINGS

    Directory of Open Access Journals (Sweden)

    Iuliana-Mădălina LARION

    2015-07-01

    Full Text Available The study analyses the effects of the preliminary rulings rendered by the Court of Justice for the judicial body that made the reference and for other bodies dealing with similar cases, for the member states, for the European Union's institutions and for the EU legal order. Starting from the binding effect of the preliminary judgment for national judicial bodies, which requires them to follow the ruling or make a new reference, to the lack of a precedent doctrine in EU law, continuing with the possibility to indirectly verify the compatibility of the national law of the member states with EU law, and ending with the administrative or legislative measures that can or must be taken by the member states, the study intends to highlight the limits, nuances and consequences of the binding effect. It mentions the contribution of the national courts and of the Court of Justice of the European Union to the development of EU law, such as clarifying autonomous notions, and it emphasizes the preliminary procedure's attributes of being a form of judicial protection of individual rights, as well as a means to review the legality of acts of EU institutions. The paper is meant to be a useful instrument for practitioners. Therefore, it also deals with the possibility and limits of asking new questions, in order to obtain reconsideration or a refinement of the legal issue, and with the problem of judicial control over the interpretation and application of the preliminary ruling by the lower court.

  4. Decentralized Ground Staff Scheduling

    DEFF Research Database (Denmark)

    Sørensen, M. D.; Clausen, Jens

    2002-01-01

    Typically, ground staff scheduling is centrally planned for each terminal in an airport. The advantage of this is that the staff is efficiently utilized, but a disadvantage is that staff spends considerable time walking between stands. In this paper a decentralized approach for ground staff scheduling is investigated. The airport terminal is divided into zones, where each zone consists of a set of stands geographically next to each other. Staff is assigned to work in only one zone and the staff scheduling is planned decentralized for each zone. The advantage of this approach is that the staff work in a smaller area of the terminal and thus spend less time walking between stands. When planning decentralized, the allocation of stands to flights influences the staff scheduling, since the workload in a zone depends on which flights are allocated to stands in the zone. Hence solving the problem
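
    The coupling the abstract describes, where the flight-to-stand allocation induces the per-zone workload, can be sketched directly. The zones, stands, flights, and task durations below are invented for illustration:

    ```python
    # Hedged sketch: given a flight-to-stand allocation, compute the staff
    # workload each zone must cover. All identifiers here are hypothetical.

    zones = {"Z1": {"S1", "S2"}, "Z2": {"S3"}}   # zone -> stands in that zone

    def zone_workload(flight_stand, task_minutes):
        """Sum staff task minutes over the flights parked at each zone's stands."""
        load = {z: 0 for z in zones}
        for flight, stand in flight_stand.items():
            for z, stands in zones.items():
                if stand in stands:
                    load[z] += task_minutes[flight]
        return load

    alloc = {"F1": "S1", "F2": "S3", "F3": "S2"}  # flight -> assigned stand
    minutes = {"F1": 45, "F2": 30, "F3": 45}      # ground-handling minutes
    load = zone_workload(alloc, minutes)          # Z1 serves F1 and F3
    ```

    A decentralized planner would then roster each zone's staff against its own `load`, which is why the stand-allocation decision feeds back into staffing.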

  5. Ibis ground calibration

    Energy Technology Data Exchange (ETDEWEB)

    Bird, A.J.; Barlow, E.J.; Tikkanen, T. [Southampton Univ., School of Physics and Astronomy (United Kingdom); Bazzano, A.; Del Santo, M.; Ubertini, P. [Istituto di Astrofisica Spaziale e Fisica Cosmica - IASF/CNR, Roma (Italy); Blondel, C.; Laurent, P.; Lebrun, F. [CEA Saclay - Sap, 91 - Gif sur Yvette (France); Di Cocco, G.; Malaguti, E. [Istituto di Astrofisica Spaziale e Fisica-Bologna - IASF/CNR (Italy); Gabriele, M.; La Rosa, G.; Segreto, A. [Istituto di Astrofisica Spaziale e Fisica- IASF/CNR, Palermo (Italy); Quadrini, E. [Istituto di Astrofisica Spaziale e Fisica-Cosmica, EASF/CNR, Milano (Italy); Volkmer, R. [Institut fur Astronomie und Astrophysik, Tubingen (Germany)

    2003-11-01

    We present an overview of results obtained from IBIS ground calibrations. The spectral and spatial characteristics of the detector planes and surrounding passive materials have been determined through a series of calibration campaigns. Measurements of pixel gain, energy resolution, detection uniformity, efficiency and imaging capability are presented. The key results obtained from the ground calibration have been: - optimization of the instrument tunable parameters, - determination of energy linearity for all detection modes, - determination of energy resolution as a function of energy through the range 20 keV - 3 MeV, - demonstration of imaging capability in each mode, - measurement of intrinsic detector non-uniformity and understanding of the effects of passive materials surrounding the detector plane, and - discovery (and closure) of various leakage paths through the passive shielding system.

  6. Autonomous Rule Creation for Intrusion Detection

    Energy Technology Data Exchange (ETDEWEB)

    Todd Vollmer; Jim Alves-Foss; Milos Manic

    2011-04-01

    Many computational intelligence techniques for anomaly based network intrusion detection can be found in literature. Translating a newly discovered intrusion recognition criteria into a distributable rule can be a human intensive effort. This paper explores a multi-modal genetic algorithm solution for autonomous rule creation. This algorithm focuses on the process of creating rules once an intrusion has been identified, rather than the evolution of rules to provide a solution for intrusion detection. The algorithm was demonstrated on anomalous ICMP network packets (input) and Snort rules (output of the algorithm). Output rules were sorted according to a fitness value and any duplicates were removed. The experimental results on ten test cases demonstrated a 100 percent rule alert rate. Out of 33,804 test packets 3 produced false positives. Each test case produced a minimum of three rule variations that could be used as candidates for a production system.
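
    The post-processing described above, where candidate rules are sorted by fitness and duplicates removed, can be sketched as follows. The fitness function here (alerts on anomalous packets minus false positives on normal traffic) is an invented stand-in for the paper's actual fitness, and the packet fields and rule names are hypothetical:

    ```python
    # Hedged sketch of candidate-rule selection: deduplicate, score by a
    # simple fitness (true alerts minus false positives), sort best-first.

    def select_rules(candidates, anomalous, normal):
        """candidates: list of (name, predicate-over-packet) pairs."""
        seen, scored = set(), []
        for name, rule in candidates:
            if name in seen:          # drop duplicate rule variants
                continue
            seen.add(name)
            fitness = (sum(rule(p) for p in anomalous)
                       - sum(rule(p) for p in normal))
            scored.append((fitness, name))
        scored.sort(reverse=True)     # best candidates first
        return scored

    anomalous = [{"proto": "icmp", "size": 9000},
                 {"proto": "icmp", "size": 4000}]
    normal = [{"proto": "icmp", "size": 64}]
    candidates = [
        ("big_icmp", lambda p: p["proto"] == "icmp" and p["size"] > 1500),
        ("any_icmp", lambda p: p["proto"] == "icmp"),
        ("big_icmp", lambda p: p["size"] > 1500),   # duplicate name, dropped
    ]
    ranked = select_rules(candidates, anomalous, normal)
    ```

    The top-ranked predicates would then be translated into Snort rule syntax for a production system, as the paper describes.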

  8. Abegg, Lewis, Langmuir, and the Octet Rule.

    Science.gov (United States)

    Jensen, William B.

    1984-01-01

    Discusses major events leading to the development of the octet rule. Three conclusions based on the work of Mendeleev, Abegg, Thompson, Kossel, Lewis, and Langmuir are considered as is the debate over the rule's validity. (JN)

  9. Statistical inference of static analysis rules

    Science.gov (United States)

    Engler, Dawson Richards (Inventor)

    2009-01-01

    Various apparatus and methods are disclosed for identifying errors in program code. Respective numbers of observances of at least one correctness rule by different code instances that relate to the at least one correctness rule are counted in the program code. Each code instance has an associated counted number of observances of the correctness rule by the code instance. Also counted are respective numbers of violations of the correctness rule by different code instances that relate to the correctness rule. Each code instance has an associated counted number of violations of the correctness rule by the code instance. A respective likelihood of the validity is determined for each code instance as a function of the counted number of observances and counted number of violations. The likelihood of validity indicates a relative likelihood that a related code instance is required to observe the correctness rule. The violations may be output in order of the likelihood of validity of a violated correctness rule.
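
    The counting scheme in the abstract can be sketched concretely. The likelihood function below (a simple observance ratio) is an invented stand-in for whatever function the patent actually specifies, and the rule names are hypothetical:

    ```python
    # Hedged sketch: rank inferred correctness rules by how likely they are
    # to be real invariants, given counts of observances vs. violations.

    def rank_rules(counts):
        """counts: {rule: (observances, violations)} from scanning the code.
        Returns (likelihood, violations, rule) tuples, most-likely first."""
        ranked = []
        for rule, (obs, viol) in counts.items():
            likelihood = obs / (obs + viol)   # fraction of instances obeying
            ranked.append((likelihood, viol, rule))
        ranked.sort(reverse=True)
        return ranked

    counts = {"lock_then_unlock": (98, 2),        # almost always observed:
              "check_null_after_alloc": (5, 5)}   # its 2 violations look like bugs
    report = rank_rules(counts)
    ```

    The intuition is that violations of a rule observed 98% of the time are probably errors, while a rule obeyed only half the time is probably not a rule at all.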

  10. Online Rule Generation Software Process Model

    National Research Council Canada - National Science Library

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    .... The software process model for rule generation using decision tree classifier refers to the various steps required to be executed for the development of a web based software model for decision rule generation...

  11. Mechanics of Ship Grounding

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    In these notes, first a simplified mathematical model is presented for analysis of ship hull loading due to grounding on relatively hard and plane sand, clay or rock sea bottoms. In a second section a more rational calculation model is described for the sea bed soil reaction forces on the sea bottom. Finally, overall hull failure is considered, first applying a quasistatic analysis model and thereafter a full dynamic model.

  12. Book Review: Essentials of Accessible Grounded Theory (Stern & Porr, 2011

    Directory of Open Access Journals (Sweden)

    Odis E. Simmons, Ph.D.

    2011-12-01

    Full Text Available Although Porr is a relative newcomer to grounded theory, Stern has been at it for many years (she received her PhD under Glaser and Strauss in 1977). She has been instrumental in introducing many students to grounded theory, particularly in the nursing field, as well as making notable contributions to grounded theory literature. As Stern's (1994) observations and insights suggested, constructivist versions of grounded theory emerged and spread in part because grounded theory was often being taught by teachers who themselves had a superficial, distorted understanding of the methodology, because they had learned it "minus mentor." Given her observations, insights, and writings, when I began reading Essentials, my expectations were high. But, after reading it, I concluded that, in some important ways, it falls short. Given Stern's considerable experience and previous contributions to grounded theory, it is ironic that Essentials contains more confusing and subtly inaccurate content than a book written for neophyte grounded theorists should. Although I think it is a noble effort with useful information, it contains material that is at variance with classic grounded theory, yet this isn't made clear to the reader. Because Stern and Porr failed to make a clear distinction between classic and other forms of grounded theory, many readers, particularly neophytes, will of course expect that what they present in this book accurately represents essential canons of all types of grounded theory, including classic. Readers will carry the understandings and misunderstandings gained from the book into their research and discussions with other neophytes and individuals who express interest in grounded theory.

  13. Methodology of risk assessment of loss of water resources due to climate changes

    Science.gov (United States)

    Israfilov, Yusif; Israfilov, Rauf; Guliyev, Hatam; Afandiyev, Galib

    2016-04-01

    For sustainable development and management of the rational use of the water resources of the Azerbaijan Republic, it is essential to forecast their changes under different scenarios of climate change and to assess the possible risks of loss of parts of these water resources. The major part of the Azerbaijani territory is located in an arid climate, and the vast majority of water is used in national economic production. Optimal use of conditioned groundwater and surface water is of great strategic importance for the economy of the country, given the overall shortage of water resources. A low annual rate of precipitation, high evaporation, and complex natural and hydrogeological conditions prevent sustainable formation of conditioned resources of ground and surface water. In addition, reserves of fresh water are not equally distributed throughout the Azerbaijani territory. The lack of a common water balance creates tension in the rational use of fresh water resources in various sectors of the national economy, especially in agriculture, and, as a result, in the food security of the republic. Moreover, the fresh water resources of the republic depend directly on climatic factors: 75-85% of the resources of ground stratum-pore water of piedmont plains and fracture-vein water of mountain regions are formed by the infiltration of rainfall and condensate water. Changes in climate parameters involve changes in the hydrological cycle of the hydrosphere and, as a rule, are reflected in its resources. Forecasting changes in the water resources of the hydrosphere under different scenarios of climate change in regional mathematical models allowed estimating the extent of their relationship and improving the quality of decisions. At the same time, additional data are needed for assessing and managing the risk of reduced water resources, for a detailed analysis, for forecasting the quantitative and qualitative parameters of resources, and also for

  14. 75 FR 82148 - Nutrition Labeling of Single-Ingredient Products and Ground or Chopped Meat and Poultry Products

    Science.gov (United States)

    2010-12-29

    ... Ground or Chopped Meat and Poultry Products; Final Rule. Federal Register / Vol. 75, No. 249... Labeling of Single-Ingredient Products and Ground or Chopped Meat and Poultry Products. AGENCY: Food Safety... (FSIS) is amending the Federal meat and poultry products inspection regulations to require nutrition...

  15. Outdoor ground impedance models.

    Science.gov (United States)

    Attenborough, Keith; Bashir, Imran; Taherzadeh, Shahram

    2011-05-01

    Many models for the acoustical properties of rigid-porous media require knowledge of parameter values that are not available for outdoor ground surfaces. The relationship used between tortuosity and porosity for stacked spheres results in five characteristic impedance models that require not more than two adjustable parameters. These models and hard-backed-layer versions are considered further through numerical fitting of 42 short range level difference spectra measured over various ground surfaces. For all but eight sites, slit-pore, phenomenological and variable porosity models yield lower fitting errors than those given by the widely used one-parameter semi-empirical model. Data for 12 of 26 grassland sites and for three beech wood sites are fitted better by hard-backed-layer models. Parameter values obtained by fitting slit-pore and phenomenological models to data for relatively low flow resistivity grounds, such as forest floors, porous asphalt, and gravel, are consistent with values that have been obtained non-acoustically. Three impedance models yield reasonable fits to a narrow band excess attenuation spectrum measured at short range over railway ballast but, if extended reaction is taken into account, the hard-backed-layer version of the slit-pore model gives the most reasonable parameter values.
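
    The numerical fitting described above, with each impedance model having at most two adjustable parameters, can be sketched as a simple grid search that minimizes squared error against a measured spectrum. The "model" below is a deliberately generic placeholder, not any of the slit-pore, phenomenological, or variable-porosity formulas from the paper, and all data are invented:

    ```python
    # Hedged sketch of two-parameter model fitting by exhaustive grid search.
    # The placeholder model and the grids are illustrative only; a real fit
    # would use an acoustic impedance model and a continuous optimizer.

    def fit_two_params(freqs, measured, model, p1_grid, p2_grid):
        """Return (fitting error, best p1, best p2) minimizing squared error."""
        best = None
        for p1 in p1_grid:
            for p2 in p2_grid:
                err = sum((model(f, p1, p2) - m) ** 2
                          for f, m in zip(freqs, measured))
                if best is None or err < best[0]:
                    best = (err, p1, p2)
        return best

    freqs = [100.0, 200.0, 400.0]               # Hz, illustrative
    model = lambda f, a, b: a + b * f           # placeholder spectrum model
    measured = [model(f, 2.0, 0.01) for f in freqs]   # synthetic "data"
    err, a, b = fit_two_params(freqs, measured, model,
                               [1.0, 2.0, 3.0], [0.0, 0.01, 0.02])
    ```

    With real level-difference spectra the residual `err` is the fitting error the paper compares across the candidate impedance models.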

  16. Optimization of CHR propagation rules: extended report

    OpenAIRE

    Van Weert, Peter

    2008-01-01

    Constraint Handling Rules (CHR) is an elegant, high-level programming language based on multi-headed, forward chaining rules. To ensure CHR propagation rules are applied at most once with the same combination of constraints, CHR implementations maintain a so-called propagation history. The performance impact of this history can be significant. We introduce several optimizations that, for the majority of CHR rules, eliminate this overhead. We formally prove their correctness, and evaluate thei...
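
    The propagation-history bookkeeping the abstract refers to can be sketched in a few lines: a propagation rule must fire at most once per combination of matched constraints, so implementations record each (rule, constraint-identifiers) key already used. This is an illustrative Python analogue, not CHR itself; normalizing the key by sorting the identifiers is an assumption that suits a symmetric rule head:

    ```python
    # Hedged sketch of a CHR-style propagation history: suppress re-firing a
    # propagation rule on a combination of constraints it has already seen.

    history = set()

    def fire_once(rule_name, constraint_ids, action):
        """Run `action` unless this rule already fired on these constraints."""
        key = (rule_name, tuple(sorted(constraint_ids)))
        if key in history:
            return False        # suppressed by the propagation history
        history.add(key)
        action()
        return True

    derived = []
    first = fire_once("transitivity", [1, 2], lambda: derived.append("path(1,3)"))
    again = fire_once("transitivity", [2, 1], lambda: derived.append("dup"))
    ```

    The set lookups are what the paper's optimizations aim to eliminate: for many rules the history check can be proven redundant and skipped entirely.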

  17. Refinements of some new efficient quadrature rules

    Science.gov (United States)

    Qayyum, A.; Shoaib, M.; Faye, I.; Kashif, A. R.

    2016-11-01

    In the field of engineering and applied mathematical sciences, minimizing approximation error is a very important task, and therefore quadrature rules are investigated regularly. In this paper, using some standard results from theoretical inequalities, e.g. the Ostrowski type inequality, some new efficient quadrature rules are introduced for n-times differentiable mappings. These quadrature rules are expected to give better results than the conventional quadrature rules.
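
    As a reminder of the kind of bound such rules build on, the classical Ostrowski inequality (stated here in its common first-derivative form, which may differ from the paper's refinements) controls the error of a one-point quadrature rule:

    ```latex
    % Classical Ostrowski inequality for f : [a,b] -> R with f' bounded:
    \left| f(x) - \frac{1}{b-a}\int_a^b f(t)\,dt \right|
      \le \left[ \frac{1}{4}
           + \frac{\bigl(x - \frac{a+b}{2}\bigr)^2}{(b-a)^2} \right]
         (b-a)\,\|f'\|_\infty ,
      \qquad x \in [a,b].
    % At x = (a+b)/2 the bracket attains its minimum 1/4, recovering the
    % midpoint-rule error bound  (1/4)(b-a)\,\|f'\|_\infty.
    ```

    Refinements of the kind the paper pursues sharpen the bracketed factor, or extend the bound to higher derivatives, to obtain tighter quadrature error estimates.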

  18. Religionsfrihed i Kina & The Rule of Law

    DEFF Research Database (Denmark)

    Christoffersen, Lisbet

    2011-01-01

    The article gives an account of current Chinese religious law, compares it with fundamental concepts of international religious law, and uses this empirical material for a reflection on the concepts Rule of Law vs. Rule by Law.

  19. Engineering uses of physics-based ground motion simulations

    Science.gov (United States)

    Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.

    2014-01-01

    This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities are related both to advancing the science and computational infrastructure needed to produce ground motion simulations, as well as to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.

  20. Taxation without representation: the illegal IRS rule to expand tax credits under the PPACA.

    Science.gov (United States)

    Adler, Jonathan H; Cannon, Michael F

    2013-01-01

    The Patient Protection and Affordable Care Act (PPACA) provides tax credits and subsidies for the purchase of qualifying health insurance plans on state-run insurance exchanges. Contrary to expectations, many states are refusing or otherwise failing to create such exchanges. An Internal Revenue Service (IRS) rule purports to extend these tax credits and subsidies to the purchase of health insurance in federal exchanges created in states without exchanges of their own. This rule lacks statutory authority. The text, structure, and history of the Act show that tax credits and subsidies are not available in federally run exchanges. The IRS rule is contrary to congressional intent and cannot be justified on other legal grounds. Because tax credit eligibility can trigger penalties on employers and individuals, affected parties are likely to have standing to challenge the IRS rule in court.

  1. Idioms-based Business Rule Extraction

    NARCIS (Netherlands)

    Smit, R

    2011-01-01

    This thesis studies the extraction of embedded business rules, using the idioms of the framework in use to identify them. Embedded business rules exist as source code in the software system, and knowledge about them may get lost. Extraction of those business rules could make them accessible and manageable.

  2. Comparison of Heuristics for Inhibitory Rule Optimization

    KAUST Repository

    Alsolami, Fawaz

    2014-09-13

    Knowledge representation and extraction are very important tasks in data mining. In this work, we propose a variety of rule-based greedy algorithms that are able to extract the knowledge contained in a given dataset as a series of inhibitory rules containing an expression “attribute ≠ value” on the right-hand side. The main goal of this paper is to determine, based on the rule characteristics of length and coverage, whether the proposed rule heuristics are statistically significantly different; if so, we aim to identify the best-performing heuristics for minimization of rule length and maximization of rule coverage. The Friedman test with Nemenyi post-hoc analysis is used to compare the greedy algorithms statistically against each other for length and coverage. The experiments are carried out on real datasets from the UCI Machine Learning Repository. For the leading heuristics, the constructed rules are compared with optimal ones obtained by a dynamic programming approach. The results are promising for the best heuristics: the average relative difference between the length (coverage) of constructed and optimal rules is at most 2.27% (7%, respectively). Furthermore, the quality of classifiers based on the sets of inhibitory rules constructed by the considered heuristics is compared, and the results show that the three best heuristics from the point of view of classification accuracy coincide with the three best-performing heuristics from the point of view of rule length minimization.

  3. 16 CFR 410.1 - The Rule.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false The Rule. 410.1 Section 410.1 Commercial Practices FEDERAL TRADE COMMISSION TRADE REGULATION RULES DECEPTIVE ADVERTISING AS TO SIZES OF VIEWABLE PICTURES SHOWN BY TELEVISION RECEIVING SETS § 410.1 The Rule. In connection with the sale of...

  4. 78 FR 54566 - Energy Labeling Rule

    Science.gov (United States)

    2013-09-05

    ... From the Federal Register Online via the Government Publishing Office FEDERAL TRADE COMMISSION 16 CFR Part 305 RIN 3084-AB03 Energy Labeling Rule AGENCY: Federal Trade Commission. ACTION: Final rule; correction. SUMMARY: The Federal Trade Commission published a final rule on July 23, 2013 revising its...

  5. 7 CFR 29.2622 - Rule 6.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Rule 6. 29.2622 Section 29.2622 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Standards Rules § 29.2622 Rule 6. A lot of tobacco on the marginal line between two colors shall...

  6. 7 CFR 29.1112 - Rule 6.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Rule 6. 29.1112 Section 29.1112 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Standards Rules § 29.1112 Rule 6. A lot of tobacco on the marginal line between two colors shall...

  7. 7 CFR 29.3607 - Rule 6.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Rule 6. 29.3607 Section 29.3607 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Standards Rules § 29.3607 Rule 6. A lot of tobacco on the marginal line between two colors shall...

  8. 7 CFR 29.3108 - Rule 5.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Rule 5. 29.3108 Section 29.3108 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Standards Rules § 29.3108 Rule 5. A lot of tobacco on the marginal line between two colors shall...

  9. 7 CFR 29.2397 - Rule 6.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Rule 6. 29.2397 Section 29.2397 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Standards Rules § 29.2397 Rule 6. A lot of tobacco on the marginal line between two colors shall...

  10. Error Analysis of Quadrature Rules. Classroom Notes

    Science.gov (United States)

    Glaister, P.

    2004-01-01

    Approaches to the determination of the error in numerical quadrature rules are discussed and compared. This article considers the problem of the determination of errors in numerical quadrature rules, taking Simpson's rule as the principal example. It suggests an approach based on truncation error analysis of numerical schemes for differential…
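Since the abstract takes Simpson's rule as its principal example, the standard O(h^4) truncation-error behaviour can be checked numerically. A minimal sketch (the integrand and interval are chosen for illustration, not taken from the article):

```python
import math

def simpson(f, a, b, n):
    """Composite Simpson's rule with n subintervals (n must be even)."""
    assert n % 2 == 0
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3.0

# The truncation error of composite Simpson's rule is O(h^4), so
# halving h should divide the error by roughly 16.
exact = 1.0 - math.cos(1.0)  # integral of sin over [0, 1]
e1 = abs(simpson(math.sin, 0.0, 1.0, 8) - exact)
e2 = abs(simpson(math.sin, 0.0, 1.0, 16) - exact)
assert 14.0 < e1 / e2 < 18.0  # close to the theoretical factor of 16
```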

  11. Binary translation using peephole translation rules

    Science.gov (United States)

    Bansal, Sorav; Aiken, Alex

    2010-05-04

    An efficient binary translator uses peephole translation rules to directly translate executable code from one instruction set to another. In a preferred embodiment, the translation rules are generated using superoptimization techniques that enable the translator to automatically learn translation rules for translating code from the source to target instruction set architecture.
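The idea of peephole translation can be sketched as pattern matching over short instruction windows. The instruction names and rules below are invented toy examples, not the patent's actual rules or instruction sets:

```python
# A toy peephole translator: each rule maps a short sequence of
# "source" instructions to an equivalent "target" sequence.
RULES = {
    ("LOAD r1", "ADD r1, 1", "STORE r1"): ("INC_MEM",),  # fuse into one op
    ("PUSH r0", "POP r0"): (),                           # no-op pair, drop it
}

def translate(code):
    out, i = [], 0
    while i < len(code):
        for pattern, replacement in RULES.items():
            if tuple(code[i:i + len(pattern)]) == pattern:
                out.extend(replacement)
                i += len(pattern)
                break
        else:
            out.append(code[i])  # no rule matched: copy the instruction as-is
            i += 1
    return out

prog = ["LOAD r1", "ADD r1, 1", "STORE r1", "PUSH r0", "POP r0", "RET"]
print(translate(prog))  # → ['INC_MEM', 'RET']
```

In the approach the abstract describes, such rule tables are not written by hand but learned automatically via superoptimization.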

  12. 31 CFR 103.85 - Issuing rulings.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Issuing rulings. 103.85 Section 103.85 Money and Finance: Treasury Regulations Relating to Money and Finance FINANCIAL RECORDKEEPING AND REPORTING OF CURRENCY AND FOREIGN TRANSACTIONS Administrative Rulings § 103.85 Issuing rulings. The...

  13. 14 CFR 437.39 - Flight rules.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Flight rules. 437.39 Section 437.39 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... Documentation § 437.39 Flight rules. An applicant must provide flight rules as required by § 437.71....

  14. Product and Quotient Rules from Logarithmic Differentiation

    Science.gov (United States)

    Chen, Zhibo

    2012-01-01

    A new application of logarithmic differentiation is presented, which provides an alternative elegant proof of two basic rules of differentiation: the product rule and the quotient rule. The proof can intrigue students, help promote their critical thinking and rigorous reasoning and deepen their understanding of previously encountered concepts. The…
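The proof the abstract refers to can be sketched in a few lines, assuming f and g are positive and differentiable so that the logarithms are defined (the general case follows by the usual sign arguments):

```latex
\begin{align*}
y = fg \;&\Rightarrow\; \ln y = \ln f + \ln g
  \;\Rightarrow\; \frac{y'}{y} = \frac{f'}{f} + \frac{g'}{g}
  \;\Rightarrow\; y' = f'g + fg',\\[4pt]
y = \frac{f}{g} \;&\Rightarrow\; \ln y = \ln f - \ln g
  \;\Rightarrow\; \frac{y'}{y} = \frac{f'}{f} - \frac{g'}{g}
  \;\Rightarrow\; y' = \frac{f'g - fg'}{g^{2}}.
\end{align*}
```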

  15. 7 CFR 29.3618 - Rule 17.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Rule 17. 29.3618 Section 29.3618 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Standards Rules § 29.3618 Rule 17. Any lot of tobacco which is not green but contains over...

  16. The Orbital Angular Momentum Sum Rule

    Science.gov (United States)

    Aslan, Fatma; Burkardt, Matthias

    2015-10-01

    As an alternative to the Ji sum rule for the quark angular momentum, a sum rule for the quark orbital angular momentum, based on a twist-3 generalized parton distribution, has been suggested. We study the validity of this sum rule in the context of scalar Yukawa interactions as well as in QED for an electron.

  17. A dynamic analysis of moving average rules

    NARCIS (Netherlands)

    Chiarella, C.; He, X.Z.; Hommes, C.H.

    2006-01-01

    The use of various moving average (MA) rules remains popular with financial market practitioners. These rules have recently become the focus of a number of empirical studies, but there have been very few studies of financial market models where some agents employ technical trading rules of the type

  18. Will Rule based BPM obliterate Process Models?

    NARCIS (Netherlands)

    Joosten, S.; Joosten, H.J.M.

    2007-01-01

    Business rules can be used directly for controlling business processes, without reference to a business process model. In this paper we propose to use business rules to specify both business processes and the software that supports them. Business rules expressed in smart mathematical notations bring

  19. NCAA Rule 48: Origins and Reactions.

    Science.gov (United States)

    Wieder, Alan

    1986-01-01

    National Collegiate Athletic Association Rule 48 sets academic standards which incoming freshmen must have met in high school in order to receive a grant-in-aid and play intercollegiate athletics. The author discusses why tougher standards are needed, how Rule 48 operates, what the problems are, and why there is opposition to the rule. (MT)

  20. Rule-based Modelling and Tunable Resolution

    Directory of Open Access Journals (Sweden)

    Russ Harmer

    2009-11-01

    Full Text Available We investigate the use of an extension of rule-based modelling for cellular signalling to create a structured space of model variants. This enables the incremental development of rule sets that start from simple mechanisms and which, by a gradual increase in agent and rule resolution, evolve into more detailed descriptions.

  1. Rule-based Modelling and Tunable Resolution

    CERN Document Server

    Harmer, Russ

    2009-01-01

    We investigate the use of an extension of rule-based modelling for cellular signalling to create a structured space of model variants. This enables the incremental development of rule sets that start from simple mechanisms and which, by a gradual increase in agent and rule resolution, evolve into more detailed descriptions.

  2. Formal Semantics of Dynamic Rules in ORM

    NARCIS (Netherlands)

    Balsters, Herman; Halpin, Terry; Meersman, R; Tari, Z; Herrero, P

    2008-01-01

    This paper provides formal semantics for an extension of the Object-Role Modeling approach that supports declaration of dynamic rules. Dynamic rules differ from static rules by pertaining to properties of state transitions, rather than to the states themselves. In this paper we restrict application

  3. A dynamic analysis of moving average rules

    NARCIS (Netherlands)

    C. Chiarella; X.Z. He; C.H. Hommes

    2006-01-01

    The use of various moving average (MA) rules remains popular with financial market practitioners. These rules have recently become the focus of a number of empirical studies, but there have been very few studies of financial market models where some agents employ technical trading rules of the type use

  4. Workshops as a Research Methodology

    Science.gov (United States)

    Ørngreen, Rikke; Levinsen, Karin

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on…

  5. A methodology for social experimentation

    DEFF Research Database (Denmark)

    Ravn, Ib

    A methodology is outlined whereby one may improve the performance of a social system to the satisfaction of its stakeholders, that is, facilitate desirable social and organizational transformations.

  6. Methodology of Law and Economics

    NARCIS (Netherlands)

    A.M. Pacces (Alessio Maria); L.T. Visscher (Louis)

    2011-01-01

    Introduction. A chapter on the methodology of law and economics, i.e. the economic analysis of law, concerns the methodology of economics. The above quote (Becker 1976, 5) shows that economics should not be defined by its subject, but by its method (also Veljanovski 2007, 19). This method

  7. Methodological Pluralism and Narrative Inquiry

    Science.gov (United States)

    Michie, Michael

    2013-01-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on "what meaning is being made" rather than "what is happening here" (quadrant 2 rather than…

  8. Choosing a Methodology: Philosophical Underpinning

    Science.gov (United States)

    Jackson, Elizabeth

    2013-01-01

    As a university lecturer, I find that a frequent question raised by Masters students concerns the methodology chosen for research and the rationale required in dissertations. This paper unpicks some of the philosophical coherence that can inform choices to be made regarding methodology and a well-thought out rationale that can add to the rigour of…

  9. Building ASIPS the Mescal methodology

    CERN Document Server

    Gries, Matthias

    2006-01-01

    A number of system designers use ASIPs rather than ASICs to implement their system solutions. This book gives a comprehensive methodology for the design of these application-specific instruction processors (ASIPs). It includes demonstrations of applications of the methodologies using the Tipi research framework.

  10. Optimal quadrature rules for odd-degree spline spaces and their application to tensor-product-based isogeometric analysis

    KAUST Repository

    Barton, Michael

    2016-03-14

    We introduce optimal quadrature rules for spline spaces that are frequently used in Galerkin discretizations to build mass and stiffness matrices. Using the homotopy continuation concept (Bartoň and Calo, 2016) that transforms optimal quadrature rules from source spaces to target spaces, we derive optimal rules for splines defined on finite domains. Starting with the classical Gaussian quadrature for polynomials, which is an optimal rule for a discontinuous odd-degree space, we derive rules for target spaces of higher continuity. We further show how the homotopy methodology handles cases where the source and target rules require different numbers of optimal quadrature points. We demonstrate it by deriving optimal rules for various odd-degree spline spaces, particularly with non-uniform knot sequences and non-uniform multiplicities. We also discuss convergence of our rules to their asymptotic counterparts, that is, the analogues of the midpoint rule of Hughes et al. (2010), that are exact and optimal for infinite domains. For spaces of low continuities, we numerically show that the derived rules quickly converge to their asymptotic counterparts as the weights and nodes of a few boundary elements differ from the asymptotic values.
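The starting point the abstract names, classical Gaussian quadrature, is "optimal" in the sense that an n-point rule integrates polynomials of degree up to 2n - 1 exactly. A minimal sketch with the hardcoded 3-point Gauss-Legendre rule on [-1, 1] (the test polynomials are chosen for illustration):

```python
import math

# 3-point Gauss-Legendre rule on [-1, 1]: nodes and weights.
nodes = [-math.sqrt(3.0 / 5.0), 0.0, math.sqrt(3.0 / 5.0)]
weights = [5.0 / 9.0, 8.0 / 9.0, 5.0 / 9.0]

def quad(f):
    """Apply the 3-point rule to f over [-1, 1]."""
    return sum(w * f(x) for w, x in zip(weights, nodes))

# An n-point Gaussian rule is exact for polynomials of degree 2n - 1,
# so with n = 3 every polynomial up to degree 5 is integrated exactly.
assert abs(quad(lambda x: x**5 + x**2) - 2.0 / 3.0) < 1e-12
# Degree 6 exceeds the exactness degree: the integral of x^6 is 2/7,
# but the rule no longer reproduces it.
assert abs(quad(lambda x: x**6) - 2.0 / 7.0) > 1e-3
```

The paper's contribution is to carry such optimal rules over, via homotopy continuation, from discontinuous polynomial spaces to spline spaces of higher continuity.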

  11. A Rule-Based System Implementing a Method for Translating FOL Formulas into NL Sentences

    Science.gov (United States)

    Mpagouli, Aikaterini; Hatzilygeroudis, Ioannis

    In this paper, we mainly present the implementation of a system that translates first order logic (FOL) formulas into natural language (NL) sentences. The motivation comes from an intelligent tutoring system teaching logic as a knowledge representation language, where it is used as a means for feedback to the students-users. FOL to NL conversion is achieved by using a rule-based approach, where we exploit the pattern matching capabilities of rules. So, the system consists of rule-based modules corresponding to the phases of our translation methodology. Facts are used in a lexicon providing lexical and grammatical information that helps in producing the NL sentences. The whole system is implemented in Jess, a java-implemented rule-based programming tool. Experimental results confirm the success of our choices.

  12. An Estimation of Distribution Algorithm with Intelligent Local Search for Rule-based Nurse Rostering

    CERN Document Server

    Uwe, Aickelin; Jingpeng, Li

    2007-01-01

    This paper proposes a new memetic evolutionary algorithm to achieve explicit learning in rule-based nurse rostering, which involves applying a set of heuristic rules for each nurse's assignment. The main framework of the algorithm is an estimation of distribution algorithm, in which an ant-miner methodology improves the individual solutions produced in each generation. Unlike our previous work (where learning is implicit), the learning in the memetic estimation of distribution algorithm is explicit, i.e. we are able to identify building blocks directly. The overall approach learns by building a probabilistic model, i.e. an estimation of the probability distribution of individual nurse-rule pairs that are used to construct schedules. The local search processor (i.e. the ant-miner) reinforces nurse-rule pairs that receive higher rewards. A challenging real world nurse rostering problem is used as the test problem. Computational results show that the proposed approach outperforms most existing approaches. It is ...

  13. Evolving Rule-Based Systems in two Medical Domains using Genetic Programming

    DEFF Research Database (Denmark)

    Tsakonas, A.; Dounias, G.; Jantzen, Jan

    2004-01-01

    We demonstrate, compare and discuss the application of two genetic programming methodologies for the construction of rule-based systems in two medical domains: the diagnosis of Aphasia's subtypes and the classification of Pap-Smear Test examinations. The first approach consists of a scheme...... that combines genetic programming and heuristic hierarchical crisp rule-base construction. The second model is composed by a grammar driven genetic programming system for the generation of fuzzy rule-based systems. Results are also compared for their efficiency, accuracy and comprehensibility, to those...... of a standard entropy based machine learning approach and to those of a standard genetic programming symbolic expression approach. In the diagnosis of subtypes of Aphasia, two models for crisp rule-bases are presented. The first one discriminates between four major types and the second attempts...

  14. Probability matching involves rule-generating ability: a neuropsychological mechanism dealing with probabilities.

    Science.gov (United States)

    Unturbe, Jesús; Corominas, Josep

    2007-09-01

    Probability matching is a nonoptimal strategy consisting of selecting each alternative in proportion to its reinforcement contingency. However, matching is related to hypothesis testing in an incidental, marginal, and methodologically disperse manner. Although some authors take it for granted, the relationship has not been demonstrated. Fifty-eight healthy participants performed a modified, bias-free probabilistic two-choice task, the Simple Prediction Task (SPT). Self-reported spurious rules were recorded and then graded by two independent judges. Participants who produced the most complex rules selected the probability matching strategy and were therefore less successful than those who did not produce rules. The close relationship between probability matching and rule generating makes SPT a complementary instrument for studying decision making, which might throw some light on the debate about irrationality. The importance of the reaction times, both before and after responding, is also discussed.
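The sense in which probability matching is nonoptimal can be made concrete with a little arithmetic (a sketch of the standard two-choice analysis, not the authors' Simple Prediction Task):

```python
# Expected accuracy on a two-choice task where option A is reinforced
# with probability p: "maximizing" always picks the likelier option;
# "probability matching" picks A with probability p.
def maximizing_accuracy(p):
    return max(p, 1.0 - p)

def matching_accuracy(p):
    return p * p + (1.0 - p) * (1.0 - p)

p = 0.75
assert maximizing_accuracy(p) == 0.75
assert matching_accuracy(p) == 0.625  # 0.75^2 + 0.25^2
# Matching is strictly worse whenever p != 0.5, which is why the
# abstract calls it a nonoptimal strategy.
assert matching_accuracy(p) < maximizing_accuracy(p)
```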

  15. Control/structure interaction design methodology

    Science.gov (United States)

    Briggs, Hugh C.; Layman, William E.

    1989-01-01

    The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight test. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme and analysts will be closely integrated to the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.

  16. Ground test for vibration control demonstrator

    Science.gov (United States)

    Meyer, C.; Prodigue, J.; Broux, G.; Cantinaud, O.; Poussot-Vassal, C.

    2016-09-01

    In the objective of maximizing comfort in Falcon jets, Dassault Aviation is developing an innovative vibration control technology. Vibrations of the structure are measured at several locations and sent to a dedicated high performance vibration control computer. Control laws are implemented in this computer to analyse the vibrations in real time, and then elaborate orders sent to the existing control surfaces to counteract vibrations. After detailing the technology principles, this paper focuses on the vibration control ground demonstration that was performed by Dassault Aviation in May 2015 on Falcon 7X business jet. The goal of this test was to attenuate vibrations resulting from fixed forced excitation delivered by shakers. The ground test demonstrated the capability to implement an efficient closed-loop vibration control with a significant vibration level reduction and validated the vibration control law design methodology. This successful ground test was a prerequisite before the flight test demonstration that is now being prepared. This study has been partly supported by the JTI CleanSky SFWA-ITD.

  17. CRUDE OIL PRICE FORECASTING WITH TEI@I METHODOLOGY

    Institute of Scientific and Technical Information of China (English)

    WANG Shouyang; YU Lean; K.K.LAI

    2005-01-01

    The difficulty of crude oil price forecasting, due to its inherent complexity, has attracted much attention from academic researchers and business practitioners. Various methods have been tried to solve the problem of forecasting crude oil prices. However, none of the existing prediction models can meet practical needs. Very recently, Wang and Yu proposed a new methodology for handling complex systems, the TEI@I methodology, by means of a systematic integration of text mining, econometrics and intelligent techniques. Within the framework of the TEI@I methodology, econometric models are used to model the linear components of the crude oil price time series (i.e., main trends), while the nonlinear components (i.e., error terms) are modelled using artificial neural network (ANN) models. In addition, the impact of irregular and infrequent future events on the crude oil price is explored using web-based text mining (WTM) and rule-based expert system (RES) techniques. Thus, a fully novel nonlinear integrated forecasting approach with error correction and judgmental adjustment is formulated to improve prediction performance within the framework of the TEI@I methodology. The proposed methodology and the novel forecasting approach are illustrated via an example.

  18. Scalable rule-based modelling of allosteric proteins and biochemical networks.

    Directory of Open Access Journals (Sweden)

    Julien F Ollivier

    Full Text Available Much of the complexity of biochemical networks comes from the information-processing abilities of allosteric proteins, be they receptors, ion-channels, signalling molecules or transcription factors. An allosteric protein can be uniquely regulated by each combination of input molecules that it binds. This "regulatory complexity" causes a combinatorial increase in the number of parameters required to fit experimental data as the number of protein interactions increases. It therefore challenges the creation, updating, and re-use of biochemical models. Here, we propose a rule-based modelling framework that exploits the intrinsic modularity of protein structure to address regulatory complexity. Rather than treating proteins as "black boxes", we model their hierarchical structure and, as conformational changes, internal dynamics. By modelling the regulation of allosteric proteins through these conformational changes, we often decrease the number of parameters required to fit data, and so reduce over-fitting and improve the predictive power of a model. Our method is thermodynamically grounded, imposes detailed balance, and also includes molecular cross-talk and the background activity of enzymes. We use our Allosteric Network Compiler to examine how allostery can facilitate macromolecular assembly and how competitive ligands can change the observed cooperativity of an allosteric protein. We also develop a parsimonious model of G protein-coupled receptors that explains functional selectivity and can predict the rank order of potency of agonists acting through a receptor. Our methodology should provide a basis for scalable, modular and executable modelling of biochemical networks in systems and synthetic biology.

  19. Cutkosky Rules from Outer Space

    CERN Document Server

    Kreimer, Dirk

    2016-01-01

    We overview recent results on the mathematical foundations of Cutkosky rules. We emphasize that the two operations of shrinking an internal edge or putting internal lines on the mass-shell are natural operations on the cubical chain complex studied in the context of geometric group theory. This, together with Cutkosky's theorem, regarded as a theorem which informs us about variations connected to the monodromy of Feynman amplitudes, allows for a systematic approach to normal and anomalous thresholds, dispersion relations and the optical theorem. In this report we follow [1] closely.

  20. Membership Rules - LHCRRB Scrutiny Group

    CERN Document Server

    2017-01-01

    The LHC Resources Scrutiny Group was created in 2001 to review and scrutinize the M&O cost estimates of the LHC Collaborations. The Scrutiny Group first met on 23 August 2001 and reported to the RRBs at its 13th Plenary meeting, in October 2001 (RRB-D-2001-8). The Scrutiny Group operates according to the procedures set out in Annex 12 of the MoUs for the M&O of the LHC experiments. This document lists the Rules of Procedure that apply to the M&O Scrutiny Group.

  1. Managing Knowledge as Business Rules

    OpenAIRE

    Anca Ioana ANDREESCU; Mircea, Marinela

    2009-01-01

    In today’s business environment, it is a certainty that the organizations that survive will be, above all, those striving to adapt quickly and at low cost to the new demands of market competition. Knowledge represented by the internal business rules of an organization can help crystallize its orientation in order to ensure a competitive advantage in the market. In this context, and in a relatively short time, a new trend in software development has arisen, extending current methods ...

  2. The Rule of Metaphor commented.

    OpenAIRE

    2015-01-01

    This paper presents the lecture given by Marie-France Begué to SIPLET (Permanent Interdisciplinary Seminar on Literature, Aesthetics and Theology) on Paul Ricoeur's The Rule of Metaphor. In it, after a general introduction, four of the studies in the book are addressed in detail: the first, “Between Rhetoric and Poetics: Aristotle”; the sixth, “The work of resemblance”; the seventh, “Metaphor and reference”; and the eighth, “Metaphor and philosophical discourse”. The main objective ...

  3. Marketing the Rule of Law

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Looking back at the last three decades of reform and opening up in China, many components of this process can now be examined in an attempt to ascertain what part they have played in the country’s political, social and economic growth. Rule of law and a harmonious society are some of the issues at stake. Professor Jiang Ping, former President of China University of Political Science and Law, discusses these and other areas of reform in an interview with China Economic Weekly.

  4. Common Ground and Delegation

    DEFF Research Database (Denmark)

    Dobrajska, Magdalena; Foss, Nicolai Juul; Lyngsie, Jacob

    Much recent research suggests that firms need to increase their level of delegation to better cope with, for example, the challenges introduced by dynamic, rapid environments and the need to engage more with external knowledge sources. However, there is less insight into the organizational preconditions of increasing delegation. We argue that key HR practices, namely hiring, training and job-rotation, are associated with delegation of decision-making authority. These practices assist in the creation of shared knowledge conditions between managers and employees. In turn, such a “common ground”

  5. Ground penetrating radar

    CERN Document Server

    Daniels, David J

    2004-01-01

    Ground-penetrating radar has come to public attention in recent criminal investigations, but has actually been a developing and maturing remote sensing field for some time. In the light of recent expansion of the technique to a wide range of applications, the need for an up-to-date reference has become pressing. This fully revised and expanded edition of the best-selling Surface-Penetrating Radar (IEE, 1996) presents, for the non-specialist user or engineer, all the key elements of this technique, which span several disciplines including electromagnetics, geophysics and signal processing. The

  6. Singlet Ground State Magnetism:

    DEFF Research Database (Denmark)

    Loidl, A.; Knorr, K.; Kjems, Jørgen;

    1979-01-01

    The magnetic Γ1–Γ4 exciton of the singlet ground state system TbP has been studied by inelastic neutron scattering above the antiferromagnetic ordering temperature. Considerable dispersion and a pronounced splitting were found in the [100] and [110] directions. Both the band width and the splitting increased rapidly as the transition temperature was approached, in accordance with the predictions of the RPA theory. The dispersion is analysed in terms of a phenomenological model using interactions up to the fourth-nearest neighbour.

  7. The LOFT Ground Segment

    DEFF Research Database (Denmark)

    Bozzo, E.; Antonelli, A.; Argan, A.;

    2014-01-01

    targets per orbit (~90 minutes), providing roughly ~80 GB of proprietary data per day (the proprietary period will be 12 months). The WFM continuously monitors about 1/3 of the sky at a time and provides data for about ~100 sources a day, resulting in a total of ~20 GB of additional telemetry. The LOFT...... we summarize the planned organization of the LOFT ground segment (GS), as established in the mission Yellow Book 1 . We describe the expected GS contributions from ESA and the LOFT consortium. A review is provided of the planned LOFT data products and the details of the data flow, archiving...

  8. A Simpler Understanding of Classic GT: How it is a fundamentally different methodology

    Directory of Open Access Journals (Sweden)

    Ólavur Christiansen

    2007-06-01

    Full Text Available The author reduces the research rationale of classic grounded theory (GT) methodology, and the consequent classic GT research procedures and stages, to their essential elements. This reduction makes it possible to compare classic GT to other research methodologies in a manner that is simpler and yet concise. This methodological analysis and synthesis was conducted while applying, and after having applied, the classic GT methodology in practice in a major project. The fundamental differences between classic GT and other adaptations of GT, as well as other qualitative-inductive research approaches, are mainly explained by the very different approaches to solving the problem of many equally justifiable interpretations of the same data, and by the consequent differences in research procedures and how they are applied. Comprehension of methodological differences in detail will always be relevant; however, an uncomplicated and still concise explanation of the differences between these methodologies is necessary. “Grounded theory” (GT) is used as a common label in the literature for very different research approaches. This simpler way of comparing the methodologies will be helpful for researchers who might want to consider several options when deciding which research methodology to use, and who need to quickly understand some of the most essential methodological elements.

  9. How Politics Shapes the Growth of Rules

    DEFF Research Database (Denmark)

    Jakobsen, Mads Leth Felsager; Mortensen, Peter Bjerre

    2015-01-01

    This article examines the impact of politics on governmental rule production. Traditionally, explanations of rule dynamics have focused on nonpolitical factors such as the self-evolvement of rules, environmental factors, and decision maker attributes. This article develops a set of hypotheses about...... when, why, and how political factors shape changes in the stock of rules. Furthermore, we test these hypotheses on a unique, new data set based on all Danish primary legislation and administrative rules from 1989 to 2011 categorized into 20 different policy domains. The analysis shows...... that the traditional Weberian “rules breed rules” explanations must be supplemented with political explanations that take party ideology and changes in the political agenda into account. Moreover, the effect of political factors is indistinguishable across changes in primary laws and changes in administrative rules...

  10. Scoping studies: advancing the methodology

    Directory of Open Access Journals (Sweden)

    O'Brien Kelly K

    2010-09-01

    Full Text Available Abstract Background Scoping studies are an increasingly popular approach to reviewing health research evidence. In 2005, Arksey and O'Malley published the first methodological framework for conducting scoping studies. While this framework provides an excellent foundation for scoping study methodology, further clarifying and enhancing this framework will help support the consistency with which authors undertake and report scoping studies and may encourage researchers and clinicians to engage in this process. Discussion We build upon our experiences conducting three scoping studies using the Arksey and O'Malley methodology to propose recommendations that clarify and enhance each stage of the framework. Recommendations include: clarifying and linking the purpose and research question (stage one); balancing feasibility with breadth and comprehensiveness of the scoping process (stage two); using an iterative team approach to selecting studies (stage three) and extracting data (stage four); incorporating a numerical summary and qualitative thematic analysis, reporting results, and considering the implications of study findings to policy, practice, or research (stage five); and incorporating consultation with stakeholders as a required knowledge translation component of scoping study methodology (stage six). Lastly, we propose additional considerations for scoping study methodology in order to support the advancement, application and relevance of scoping studies in health research. Summary Specific recommendations to clarify and enhance this methodology are outlined for each stage of the Arksey and O'Malley framework. Continued debate and development about scoping study methodology will help to maximize the usefulness and rigor of scoping study findings within healthcare research and practice.

  11. Stat-LRC: statistical rules check for variational lithography

    Science.gov (United States)

    Sreedhar, Aswin; Kundu, Sandip

    2010-03-01

    As interconnect densities increase with each technology generation, the lithographic processes required to print all features with acceptable irregularities have become more complex. Restricted design rules (RDR) and model-based Design for Manufacturability (DFM) guidelines have been added to existing Design Rule Check (DRC) software to prevent unprintable patterns from being drawn on the mask, by predicting their imprint on the wafer. Analyses of predicted patterns make it evident that edge placement errors have a continuous distribution, so a pass/fail cut-off is somewhat arbitrary. In this paper, we describe a methodology to perform Statistical Lithography Rules Check (Stat-LRC), computing design yield from the interconnect linewidth distribution under variation in the lithographic error sources. The scheme produces a list of error locations identifying polygons whose yield falls below a user-specified threshold. Overall design yield is recovered by trading off slightly poorer EPE distributions on short line runs against excellent ones. The simulation/analysis environment is fully automated, and yield recovery improvement has been demonstrated.
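The statistical-yield idea in this abstract can be sketched in a few lines: instead of a binary DRC pass/fail, each line gets a probability of printing within spec, and only polygons below a yield threshold are flagged. This is a minimal illustration under an assumed zero-mean Gaussian edge-placement-error model; the spec limit, threshold, and net names are invented, and the paper's actual lithography simulation is far more involved.

```python
from statistics import NormalDist

# Illustrative numbers only (not from the paper): spec limit on |EPE|
# in nm, and a user-specified per-polygon yield threshold.
SPEC_NM = 3.0
YIELD_THRESHOLD = 0.95

# Per-line EPE standard deviations (nm), assuming zero-mean Gaussian EPE.
lines = {"net_a": 1.0, "net_b": 2.2}

def polygon_yield(sigma):
    """Probability that |EPE| stays within the spec limit."""
    d = NormalDist(0.0, sigma)
    return d.cdf(SPEC_NM) - d.cdf(-SPEC_NM)

# Stat-LRC-style report: flag polygons whose yield falls below threshold.
failing = [name for name, sigma in lines.items()
           if polygon_yield(sigma) < YIELD_THRESHOLD]
print(failing)  # flags the line with the wider EPE distribution
```

The continuous `polygon_yield` value is what replaces the arbitrary pass/fail cut-off the abstract criticizes.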

  12. Genetic Programming for the Generation of Crisp and Fuzzy Rule Bases in Classification and Diagnosis of Medical Data

    DEFF Research Database (Denmark)

    Dounias, George; Tsakonas, Athanasios; Jantzen, Jan;

    2002-01-01

    This paper demonstrates two methodologies for the construction of rule-based systems in medical decision making. The first approach combines genetic programming with heuristic hierarchical rule-base construction. The second model is composed of a strongly-typed genetic progra...... systems. Comparisons of the systems' comprehensibility and transparency are included; for the Aphasia domain, these include previous work consisting of two neural network models....

  13. Discovering Non-Redundant Association Rules using MinMax Approximation Rules

    OpenAIRE

    R. Vijaya Prakash; Dr. A. Govardhan; Prof. SSVN. Sarma

    2012-01-01

    Frequent pattern mining is an important area of data mining used to generate association rules. The quality of the extracted frequent patterns is a major concern, as mining generates huge sets of rules, many of which are redundant. Mining non-redundant frequent patterns is therefore a central problem in association rule mining. In this paper we propose a method to eliminate redundant frequent patterns using a MinMax rule approach, generating quality association rules.
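The redundancy problem this abstract describes can be illustrated with a brute-force miner on a toy basket dataset. The pruning criterion below (drop a rule when a more general rule, with an antecedent subset and a consequent superset, reaches at least the same confidence) is one common notion of rule redundancy; the transactions and thresholds are invented, and the paper's actual MinMax construction is not reproduced here.

```python
from itertools import combinations

# Toy transaction database (invented for illustration).
transactions = [
    {"bread", "milk"},
    {"bread", "milk", "butter"},
    {"bread", "milk", "butter"},
    {"milk", "butter"},
]
MIN_SUPPORT, MIN_CONF = 0.5, 0.9

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

items = sorted({i for t in transactions for i in t})
frequent = [frozenset(c)
            for k in range(1, len(items) + 1)
            for c in combinations(items, k)
            if support(frozenset(c)) >= MIN_SUPPORT]

# Derive confident rules (antecedent -> consequent) from frequent itemsets.
rules = []
for itemset in (f for f in frequent if len(f) > 1):
    for k in range(1, len(itemset)):
        for ante in map(frozenset, combinations(itemset, k)):
            cons = itemset - ante
            conf = support(itemset) / support(ante)
            if conf >= MIN_CONF:
                rules.append((ante, cons, conf))

# A rule is redundant if a more general rule (antecedent subset,
# consequent superset) reaches at least the same confidence.
non_redundant = [
    (a, c, cf) for a, c, cf in rules
    if not any(a2 <= a and c <= c2 and (a2, c2) != (a, c) and cf2 >= cf
               for a2, c2, cf2 in rules)
]
```

On this data the rule {bread, butter} → {milk} is pruned because the more general {bread} → {milk} already holds with full confidence.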

  14. Research rules for library ethnography

    OpenAIRE

    2011-01-01

    Purpose – The aim of this paper is to introduce the second part of the theme issue on “user research and technology” and to discuss testing online digital library resources using methods from ethnography and cultural anthropology. Design/methodology/approach – This editorial reviews the literature and research design methods. Findings – Library and information science as a field is changing and the requirements for top quality research are growing more stringent. This is typical of the ...

  15. Methodological practicalities in analytical generalization

    DEFF Research Database (Denmark)

    Halkier, Bente

    2011-01-01

    In this article, I argue that the existing literature on qualitative methodologies tends to discuss analytical generalization at a relatively abstract and general theoretical level. It is, however, not particularly straightforward to “translate” such abstract epistemological principles into more...... operative methodological strategies for producing analytical generalizations in research practices. Thus, the aim of the article is to contribute to the discussions among qualitatively working researchers about generalizing by way of exemplifying some of the methodological practicalities in analytical...... and processes in producing the three different ways of generalizing: ideal typologizing, category zooming, and positioning....

  16. Prioritization methodology for chemical replacement

    Science.gov (United States)

    Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott

    1995-01-01

    This methodology serves to define a system for effective prioritization of the efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology is a semi-quantitative approach derived from quality function deployment techniques (QFD matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development, to allow appropriate identification of viable candidates and programmatic alternatives.

  17. Threshold Differences on Figure and Ground: Gelb and Granit (1923)

    Science.gov (United States)

    Kinateder, Max

    2017-01-01

    In 1923, Gelb and Granit, using a method of adjustment for a small red light, reported a lower threshold for the target when presented on a ground region than on an adjacent figural region. More recent work in perceptual organization has found precisely the opposite—a processing advantage seems to go to items presented on the figure, not the ground. Although Gelb and Granit continue to be cited for their finding, it has not previously been available as an English translation. Understanding their methodology and results is important for integrating early Gestalt theory with more recent investigations. PMID:28286640

  18. The LOFT Ground Segment

    CERN Document Server

    Bozzo, E; Argan, A; Barret, D; Binko, P; Brandt, S; Cavazzuti, E; Courvoisier, T; Herder, J W den; Feroci, M; Ferrigno, C; Giommi, P; Götz, D; Guy, L; Hernanz, M; Zand, J J M in't; Klochkov, D; Kuulkers, E; Motch, C; Lumb, D; Papitto, A; Pittori, C; Rohlfs, R; Santangelo, A; Schmid, C; Schwope, A D; Smith, P J; Webb, N A; Wilms, J; Zane, S

    2014-01-01

    LOFT, the Large Observatory For X-ray Timing, was one of the ESA M3 mission candidates that completed their assessment phase at the end of 2013. LOFT is equipped with two instruments, the Large Area Detector (LAD) and the Wide Field Monitor (WFM). The LAD performs pointed observations of several targets per orbit (~90 minutes), providing roughly ~80 GB of proprietary data per day (the proprietary period will be 12 months). The WFM continuously monitors about 1/3 of the sky at a time and provides data for about ~100 sources a day, resulting in a total of ~20 GB of additional telemetry. The LOFT Burst alert System additionally identifies on-board bright impulsive events (e.g., Gamma-ray Bursts, GRBs) and broadcasts the corresponding position and trigger time to the ground using a dedicated system of ~15 VHF receivers. All WFM data are planned to be made public immediately. In this contribution we summarize the planned organization of the LOFT ground segment (GS), as established in the mission Yellow Book 1 . We...

  19. Software Security Rules: SDLC Perspective

    Directory of Open Access Journals (Sweden)

    S. K. Pandey

    2009-10-01

    Full Text Available Software has become an integral part of everyday life. Every day, millions of people perform transactions through the internet, ATMs and mobile phones; they send email and e-greetings, and use word processors and spreadsheets for various purposes. People use software trusting that it is reliable and that the operations they perform are secure. If this software has exploitable security holes, how can it be safe to use? Security brings value to software in terms of people's trust. The value provided by secure software is of vital importance because many critical functions are entirely dependent on the software. That is why security is a serious topic which should be given proper attention during the entire SDLC, 'right from the beginning'. For the proper implementation of security in software, twenty-one security rules are proposed in this paper, along with validation results. It is found that by applying these rules as per the given implementation mechanism, most vulnerabilities are eliminated and more secure software can be built.

  20. International rules on maritime delimitation

    Directory of Open Access Journals (Sweden)

    Tubić Bojan

    2013-01-01

    Full Text Available This paper deals with international rules applicable to maritime delimitation. The importance of the sea and its resources has prompted states to regulate their boundaries in international agreements. Where it proves impossible to reach an agreement on delimitation, the dispute must be resolved; differences can arise, among other issues, over the delimitation of the territorial sea, the continental shelf and the exclusive economic zone. The UN Convention on the Law of the Sea, which comprehensively regulates this field, also contains certain rules on maritime delimitation. Besides diplomatic means, the states concerned can try to resolve a dispute through arbitration or judicial settlement. Jurisdiction over such disputes lies with the International Court of Justice and the International Tribunal for the Law of the Sea. These bodies usually apply the principle of equidistance, which can be complemented or replaced by additional criteria if special circumstances attach to the concrete case. The basic aim of the decision-making process is an equitable and acceptable solution.

  1. Biclustering Learning of Trading Rules.

    Science.gov (United States)

    Huang, Qinghua; Wang, Ting; Tao, Dacheng; Li, Xuelong

    2015-10-01

    Technical analysis with numerous indicators and patterns has been regarded as important evidence for making trading decisions in financial markets. However, it is extremely difficult for investors to find useful trading rules based on numerous technical indicators. This paper proposes the use of biclustering mining to discover effective technical trading patterns that combine indicators from historical financial data series; to our knowledge, this is the first attempt to use a biclustering algorithm on trading data. The mined patterns are regarded as trading rules and are classified into three trading actions (the buy, sell, and no-action signals) with respect to the maximum support. A modified K-nearest-neighbor (K-NN) method is applied to the classification of trading days in the testing period. The proposed method, called biclustering algorithm and K nearest neighbor (BIC-K-NN), was implemented on four historical datasets, and its average performance was compared with the conventional buy-and-hold strategy and three previously reported intelligent trading systems. Experimental results demonstrate that the proposed trading system outperforms its counterparts and will be useful for investment in various financial markets.
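The K-NN classification stage described above can be sketched independently of the biclustering step: label a new trading day by majority vote among its k nearest historical days in indicator space. The two-dimensional feature vectors and labels below are invented for illustration; the paper's mined bicluster patterns and its actual indicator set are not reproduced.

```python
import math
from collections import Counter

def knn_label(query, history, k=3):
    """Majority label among the k nearest (feature_vector, label) pairs."""
    nearest = sorted(history, key=lambda h: math.dist(h[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Invented indicator vectors for labeled historical trading days.
history = [
    ((0.9, 0.1), "buy"), ((0.8, 0.2), "buy"),
    ((0.1, 0.9), "sell"), ((0.2, 0.8), "sell"),
    ((0.5, 0.5), "no-action"),
]

# A new day close to the historical "buy" days gets the buy signal.
signal = knn_label((0.85, 0.15), history)
```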

  2. Methodological Advances in Political Gaming: The One-Person Computer Interactive, Quasi-Rigid Rule Game.

    Science.gov (United States)

    Shubik, Martin

    The main problem in computer gaming research is the initial decision of choosing the type of gaming method to be used. Free-form games lead to exciting open-ended confrontations that generate much information. However, they do not easily lend themselves to analysis because they generate far too much information and their results are seldom…

  3. Medicare Program; Inpatient Rehabilitation Facility Prospective Payment System for Federal Fiscal Year 2018. Final rule.

    Science.gov (United States)

    2017-08-03

    This final rule updates the prospective payment rates for inpatient rehabilitation facilities (IRFs) for federal fiscal year (FY) 2018 as required by the statute. As required by section 1886(j)(5) of the Social Security Act (the Act), this rule includes the classification and weighting factors for the IRF prospective payment system's (IRF PPS) case-mix groups and a description of the methodologies and data used in computing the prospective payment rates for FY 2018. This final rule also revises the International Classification of Diseases, 10th Revision, Clinical Modification (ICD-10-CM) diagnosis codes that are used to determine presumptive compliance under the "60 percent rule," removes the 25 percent payment penalty for inpatient rehabilitation facility patient assessment instrument (IRF-PAI) late transmissions, removes the voluntary swallowing status item (Item 27) from the IRF-PAI, summarizes comments regarding the criteria used to classify facilities for payment under the IRF PPS, provides for a subregulatory process for certain annual updates to the presumptive methodology diagnosis code lists, adopts the use of height/weight items on the IRF-PAI to determine patient body mass index (BMI) greater than 50 for cases of single-joint replacement under the presumptive methodology, and revises and updates measures and reporting requirements under the IRF quality reporting program (QRP).

  4. The information content of rules and rule sets and its application

    Institute of Scientific and Technical Information of China (English)

    HU Dan; LI HongXing; YU XianChuan

    2008-01-01

    The information content of rules is categorized into inner mutual information content and outer impartation information content. The conventional objective interestingness measures based on information theory are all inner mutual information: they represent the confidence of rules and the mutual information between antecedent and consequent. Moreover, almost all of these measures lose sight of the outer impartation information, which is conveyed to the user and helps the user make decisions. We put forward the viewpoint that the outer impartation information content of rules and rule sets can be represented by relations from the input universe to the output universe. With binary relations, the interaction of rules in a rule set is easily represented by the union and intersection operators. Based on the entropy of relations, the outer impartation information content of rules and rule sets is well measured. Then the conditional information content of rules and rule sets, the independence of rules and rule sets, and the inconsistent knowledge of rule sets are defined and measured. The properties of these new measures are discussed and some interesting results are proven, for example that the information content of a rule set may exceed the sum of the information content of its rules, and that the conditional information content of rules may be negative. Finally, applications of these new measures are discussed: a new method for appraising rule mining algorithms, and two rule pruning algorithms, λ-choice and RPCIC, are put forward. These new methods and algorithms are advantageous in satisfying the need for more efficient decision information.
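The relation-based view in this abstract can be made concrete on toy universes: each rule becomes a binary relation from inputs to outputs, rule sets combine via union and intersection, and an entropy is computed over the relation. The universes, rules, and the particular entropy below (average Shannon entropy of the output distribution each input induces) are illustrative stand-ins, not the paper's exact definitions.

```python
import math

# Toy input universe (invented); outputs appear inside the relations.
U = ["u1", "u2", "u3"]

# Represent a rule's outer impartation as a relation R ⊆ U × V:
# the inputs its antecedent covers, paired with the outputs it implies.
def rule_relation(inputs, outputs):
    return {(u, v) for u in inputs for v in outputs}

r1 = rule_relation(["u1", "u2"], ["v1"])
r2 = rule_relation(["u2", "u3"], ["v2"])

# Rule-set interaction via the set operators named in the abstract.
combined = r1 | r2   # union: information carried by either rule
overlap = r1 & r2    # intersection: shared information (empty here)

# One simple entropy over a relation: average Shannon entropy of the
# uniform output distribution each input induces (0 for deterministic rules).
def conditional_entropy(rel, universe):
    total = 0.0
    for u in universe:
        outs = [v for (x, v) in rel if x == u]
        if outs:
            p = 1.0 / len(outs)
            total += -len(outs) * p * math.log2(p)
    return total / len(universe)
```

Under this stand-in measure, each rule alone is deterministic (entropy 0), while their union makes input u2 ambiguous between v1 and v2.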

  5. Qualitative methodology in developmental psychology

    DEFF Research Database (Denmark)

    Demuth, Carolin; Mey, Günter

    2015-01-01

    Qualitative methodology presently is gaining increasing recognition in developmental psychology. Although the founders of developmental psychology to a large extent already used qualitative procedures, the field was long dominated by a (post) positivistic quantitative paradigm. The increasing...

  6. Nanotoxicology materials, methodologies, and assessments

    CERN Document Server

    Durán, Nelson; Alves, Oswaldo L; Zucolotto, Valtencir

    2014-01-01

    This book begins with a detailed introduction to engineered nanostructures, followed by a section on methodologies used in research on cytotoxicity and genotoxicity, and concluding with evidence for the cyto- and genotoxicity of specific nanoparticles.

  7. No crisis but methodological separatism

    DEFF Research Database (Denmark)

    Erola, Jani; Reimer, David; Räsänen, Pekka;

    2015-01-01

    This article compares methodological trends in nationally and internationally oriented sociology using data from the articles of three Nordic sociological journals: one international (Acta Sociologica), one Finnish (Sosiologia), and one Danish (Dansk Sociologi). The data consists of 943 articles ...

  8. Methodological Reflections: Inter- ethnic Research

    DEFF Research Database (Denmark)

    Singla, Rashmi

    2010-01-01

    This article reflects on the methodological and epistemological aspects of the ethical issues involved in encounters between researcher and research participants with ethnic minority background in contexts with diversity. Specific challenges involved in longitudinal research (10 - 15 years) are a...

  9. Some notes on taxonomic methodology

    NARCIS (Netherlands)

    Hammen, van der L.

    1986-01-01

    The present paper constitutes an introduction to taxonomic methodology. After an analysis of taxonomic practice, and a brief survey of kinds of attributes, the paper deals with observation, description, comparison, arrangement and classification, hypothesis construction, deduction, model, experiment

  10. Methodology and Foreground of Metallomics

    Institute of Scientific and Technical Information of China (English)

    He Bin; Jiang Guibin

    2005-01-01

    Metallomics is proposed as a new omics to follow genomics, proteomics and metabolomics. This paper gives an overview of the development of metallomics based on the introduction of the concept of metallomics and its methodology.

  11. Clustering Association Rules with Fuzzy Concepts

    Science.gov (United States)

    Steinbrecher, Matthias; Kruse, Rudolf

    Association rules constitute a widely accepted technique for identifying frequent patterns inside huge volumes of data. Practitioners prefer the straightforward interpretability of rules; however, depending on the nature of the underlying data, the number of induced rules can be intractably large. Even reasonably sized result sets may contain a large number of rules that are uninteresting to the user because they are too general, are already known, or fail other user-related intuitive criteria. We allow the user to model his conception of interestingness by means of linguistic expressions on rule evaluation measures and compound propositions of higher order (i.e., temporal changes of rule properties). Multiple such linguistic concepts can be considered a set of fuzzy patterns (Fuzzy Sets and Systems 28(3):313-331, 1988) and allow for the partition of the initial rule set into fuzzy fragments that contain rules of similar membership to a user's concept (Höppner et al., Fuzzy Clustering, Wiley, Chichester, 1999; Computational Statistics and Data Analysis 51(1):192-214, 2006; Advances in Fuzzy Clustering and Its Applications, chap. 1, pp. 3-30, Wiley, New York, 2007). With appropriate visualization methods that extend previous rule set visualizations (Foundations of Fuzzy Logic and Soft Computing, Lecture Notes in Computer Science, vol. 4529, pp. 295-303, Springer, Berlin, 2007), we allow the user to instantly assess the matching of his concepts against the rule set.

  12. Rule-Based Network Service Provisioning

    Directory of Open Access Journals (Sweden)

    Rudy Deca

    2012-10-01

    Full Text Available Due to the unprecedented development of networks, manual network service provisioning is becoming increasingly risky, error-prone, expensive, and time-consuming. To solve this problem, rule-based methods can provide adequate leverage for automating various network management tasks. This paper presents a rule-based solution for automated network service provisioning. The proposed approach captures configuration data interdependencies using high-level, service-specific, user-configurable rules. We focus on the service validation task, which is illustrated by means of a case study. Based on numerical results, we analyse the influence of network-level complexity factors and rule descriptive features on rule efficiency. This analysis shows operators how to increase rule efficiency while keeping the rules simple and the rule set compact. We present a technique that allows operators to increase error coverage, and we show that high error coverage scales well as the complexity of networks and services increases. We reassess the correlation function between specific rule efficiency and rule complexity metrics found in previous work, and show that this correlation function holds for various sizes, types, and complexities of networks and services.
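The service-validation idea above can be sketched minimally: user-configurable rules expressed as predicates capture configuration interdependencies across devices, and validation reports every violated rule per device. The device names, parameters, and the two rules below are invented for illustration; they are not the paper's rule language.

```python
# Toy device configurations (invented): two peered routers.
configs = {
    "router1": {"vlan": 10, "mtu": 1500, "peer": "router2"},
    "router2": {"vlan": 10, "mtu": 1400, "peer": "router1"},
}

# Service-specific rules as (description, predicate) pairs. Each predicate
# sees one device's config plus the full config map, so it can express
# cross-device interdependencies.
rules = [
    ("peer VLAN IDs must match",
     lambda cfg, all_cfg: cfg["vlan"] == all_cfg[cfg["peer"]]["vlan"]),
    ("peer MTUs must match",
     lambda cfg, all_cfg: cfg["mtu"] == all_cfg[cfg["peer"]]["mtu"]),
]

def validate(configs, rules):
    """Return (device, rule description) pairs for every violated rule."""
    return [(name, desc)
            for name, cfg in configs.items()
            for desc, check in rules
            if not check(cfg, configs)]

errors = validate(configs, rules)  # the MTU mismatch is flagged on both ends
```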

  13. Qualitative methodology in developmental psychology

    DEFF Research Database (Denmark)

    Demuth, Carolin; Mey, Günter

    2015-01-01

    Qualitative methodology presently is gaining increasing recognition in developmental psychology. Although the founders of developmental psychology to a large extent already used qualitative procedures, the field was long dominated by a (post) positivistic quantitative paradigm. The increasing...... recognition of the sociocultural embeddedness of human development, and of the importance to study individuals’ subjective experience, however, calls for adequate methodological procedures that allow for the study of processes of transformation across the life span. The wide range of established procedures...

  14. MOTOR VEHICLE SAFETY RESEARCH METHODOLOGY

    Directory of Open Access Journals (Sweden)

    A. Stepanov

    2015-07-01

    Full Text Available The issues of vehicle safety are considered, and a methodological approach to analyzing and solving the problem of safety management for vehicles and overall traffic is offered. The distinctive features of the organization and management of vehicle safety are shown. It is concluded that the methodological approach to solving traffic safety problems reduces to the selection and classification of safety needs.

  15. Agile Methodology - Past and Future

    Science.gov (United States)

    2011-05-01

    "Agile Methodology - Past and Future", Warren W. Tignor, SAIC. [Remainder of record is residue from a DTIC Report Documentation Page and briefing slides: a Waterfall-versus-Agile timeline from Takeuchi & Nonaka (HBR, 1986, p. 139, "rugby") to the Agile Manifesto (2001) and Scrum, with a graphic adapted from Schwaber (2007).]

  16. Investment Strategies Optimization based on a SAX-GA Methodology

    CERN Document Server

    Canelas, António M L; Horta, Nuno C G

    2013-01-01

    This book presents a new computational finance approach combining a Symbolic Aggregate approXimation (SAX) technique with an optimization kernel based on genetic algorithms (GA). While the SAX representation is used to describe the financial time series, the evolutionary optimization kernel is used to identify the most relevant patterns and generate investment rules. The proposed approach considers several different chromosome structures in order to achieve better results on the trading platform. The methodology presented in this book has great potential in investment markets.

  17. Ares I-X Ground Diagnostic Prototype

    Science.gov (United States)

    Schwabacher, Mark; Martin, Rodney; Waterman, Robert; Oostdyk, Rebecca; Ossenfort, John; Matthews, Bryan

    2010-01-01

    Automating prelaunch diagnostics for launch vehicles offers three potential benefits. First, it potentially improves safety by detecting faults that might otherwise have been missed so that they can be corrected before launch. Second, it potentially reduces launch delays by more quickly diagnosing the cause of anomalies that occur during prelaunch processing. Reducing launch delays will be critical to the success of NASA's planned future missions that require in-orbit rendezvous. Third, it potentially reduces costs by reducing both launch delays and the number of people needed to monitor the prelaunch process. NASA is currently developing the Ares I launch vehicle to bring the Orion capsule and its crew of four astronauts to low-earth orbit on their way to the moon. Ares I-X will be the first unmanned test flight of Ares I. It is scheduled to launch on October 27, 2009. The Ares I-X Ground Diagnostic Prototype is a prototype ground diagnostic system that will provide anomaly detection, fault detection, fault isolation, and diagnostics for the Ares I-X first-stage thrust vector control (TVC) and for the associated ground hydraulics while it is in the Vehicle Assembly Building (VAB) at John F. Kennedy Space Center (KSC) and on the launch pad. It will serve as a prototype for a future operational ground diagnostic system for Ares I. The prototype combines three existing diagnostic tools. The first tool, TEAMS (Testability Engineering and Maintenance System), is a model-based tool that is commercially produced by Qualtech Systems, Inc. It uses a qualitative model of failure propagation to perform fault isolation and diagnostics. We adapted an existing TEAMS model of the TVC to use for diagnostics and developed a TEAMS model of the ground hydraulics. The second tool, Spacecraft Health Inference Engine (SHINE), is a rule-based expert system developed at the NASA Jet Propulsion Laboratory. We developed SHINE rules for fault detection and mode identification. The prototype...
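The rule-based fault-detection role that SHINE plays in the prototype can be sketched as a tiny expert-system loop: each rule pairs a fault name with a condition on current telemetry, and every rule whose condition holds fires an alarm. The telemetry channels, limit values, and rules below are invented; the actual Ares I-X rule base is not described at this level in the record.

```python
# Invented telemetry snapshot for the ground hydraulics (illustrative only).
telemetry = {"hydraulic_pressure_psi": 2950.0, "reservoir_level_pct": 38.0}

# Fault-detection rules as (fault name, condition) pairs, in the spirit
# of a rule-based expert system such as SHINE.
rules = [
    ("LOW_HYDRAULIC_PRESSURE",
     lambda t: t["hydraulic_pressure_psi"] < 3000.0),
    ("LOW_RESERVOIR_LEVEL",
     lambda t: t["reservoir_level_pct"] < 40.0),
]

def detect_faults(telemetry, rules):
    """Fire every rule whose condition holds on the current telemetry."""
    return [name for name, condition in rules if condition(telemetry)]

alarms = detect_faults(telemetry, rules)
```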

  18. Writing biomedical manuscripts part I: fundamentals and general rules.

    Science.gov (United States)

    Ohwovoriole, A E

    2011-01-01

    It is a professional obligation for health researchers to investigate and communicate their findings to the medical community. Writing a publishable scientific manuscript can be a daunting task for the beginner, and even for some established researchers. Many manuscripts fail to get off the ground and/or are rejected. The writing task can be made easier, and its quality improved, by following simple rules that apply to general scientific writing. The manuscript should follow a standard structure: Abstract plus Introduction, Methods, Results, and Discussion/Conclusion, the IMRAD model. The authors must also follow well-established fundamentals of good communication in science and be systematic in approach. The manuscript must move from what is currently known to what was unknown, investigated through a hypothesis, research question or problem statement. Each section has its own structure and language of presentation. Writing a good manuscript begins with a good study design and attention to detail at every stage. Many manuscripts are rejected because of errors that can be avoided if the authors follow simple guidelines and rules. One good way to avoid disappointment in manuscript writing is to follow the established general rules along with those of the journal in which the paper is to be published. An important injunction is to make the writing precise, clear, parsimonious, and comprehensible to the intended audience. The purpose of this article is to arm and encourage potential biomedical authors with tools and rules that will enable them to write manuscripts that can stand the rigorous peer review process. The expectations of standard journals, common pitfalls, and the major elements of a manuscript are covered.

  19. Exact duality and Bjorken sum rule in heavy quark models à la Bakamjian-Thomas

    CERN Document Server

    Le Yaouanc, A; Pène, O; Raynal, J C

    1996-01-01

    The heavy mass limit of quark models based on the Bakamjian-Thomas construction reveals remarkable features. In addition to previously demonstrated properties of covariance and Isgur-Wise scaling, exact duality, leading to the Bjorken-Isgur-Wise sum rule, is proven, for the first time to our knowledge in relativistic quark models. Inelastic as well as elastic contributions to the sum rule are then discussed in terms of ground state averages of a small number of operators corresponding to the nonrelativistic dipole operator and various relativistic corrections.

  20. Designing as middle ground

    DEFF Research Database (Denmark)

    Nickelsen, Niels Christian Mossfeldt; Binder, Thomas

    2010-01-01

    The theoretical background in this chapter is science and technology studies and actor network theory, enabling investigation of heterogeneity, agency and performative effects through ‘symmetric’ analysis. The concept of design is defined as being imaginative and mindful to a number of actors...... in a network of humans and non-humans, highlighting that design objects and the designer as an authority are constructed throughout this endeavour. The illustrative case example is drawn from product development in a rubber valve factory in Jutland in Denmark. The key contribution to a general core of design...... research is an articulation of design activity taking place as a middle ground and as an intermixture between a ‘scientific’ regime of knowledge transfer and a capital ‘D’ ‘Designerly’ regime of authoring....