WorldWideScience

Sample records for method required significant

  1. Determining the significance of associations between two series of discrete events: bootstrap methods

    Energy Technology Data Exchange (ETDEWEB)

    Niehof, Jonathan T.; Morley, Steven K.

    2012-01-01

    We review and develop techniques to determine associations between series of discrete events. The bootstrap, a nonparametric statistical method, allows the determination of the significance of associations with minimal assumptions about the underlying processes. We find the key requirement for this method: one of the series must be widely spaced in time to guarantee the theoretical applicability of the bootstrap. If this condition is met, the calculated significance passes a reasonableness test. We conclude with some potential future extensions and caveats on the applicability of these methods. The techniques presented have been implemented in a Python-based software toolkit.
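
    The record names a Python-based toolkit but not its interface; the sketch below illustrates only the general bootstrap idea, with the association window, the surrogate construction from resampled inter-event intervals, and all names being assumptions of this illustration rather than the toolkit's API.

```python
import numpy as np

rng = np.random.default_rng(0)

def association_count(a, b, window):
    """Number of events in series b falling within +/-window of any event in a."""
    return int(sum(np.any(np.abs(a - t) <= window) for t in b))

def bootstrap_association_p(a, b, window, n_boot=2000):
    """One-sided bootstrap p-value for the observed association count.

    Surrogates for b are built by resampling its inter-event intervals with
    replacement, preserving the event rate while destroying any timing
    relationship with a (a simplifying assumption of this sketch).
    """
    observed = association_count(a, b, window)
    gaps = np.diff(np.sort(b))
    start = b.min()
    null = np.empty(n_boot)
    for i in range(n_boot):
        surrogate = start + np.cumsum(rng.choice(gaps, size=len(b)))
        null[i] = association_count(a, surrogate, window)
    return observed, (np.sum(null >= observed) + 1) / (n_boot + 1)

# Example: two independent event series over ~100 days
a = np.sort(rng.uniform(0, 100, 25))
b = np.sort(rng.uniform(0, 100, 40))
obs, p = bootstrap_association_p(a, b, window=0.5)
print(f"observed associations: {obs}, bootstrap p-value: {p:.3f}")
```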

  2. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    Science.gov (United States)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect the requirements with the design. It realizes the modern civil jet development concept that “requirement is the origin, design is the basis”. So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method for the requirements are introduced in detail in the study, with the hope of providing experience applicable to other civil jet product designs.

  3. Metric-based method of software requirements correctness improvement

    Directory of Open Access Journals (Sweden)

    Yaremchuk Svitlana

    2017-01-01

    The work highlights the most important principles of software reliability management (SRM). The SRM concept forms a basis for developing a method of requirements correctness improvement. The method assumes that complicated requirements contain more actual and potential design faults/defects. It applies a new metric to evaluate requirements complexity and a double sorting technique that evaluates the priority and the complexity of each particular requirement. The method improves requirements correctness through the identification of a higher number of defects with restricted resources. Practical application of the proposed method in the course of requirements review yielded a tangible technical and economic effect.
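
    The metric itself is not given in the record; the sketch below only illustrates a "double sorting" of requirements by priority and complexity, under the stated assumption that more complex requirements are more fault-prone (all values and names hypothetical).

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    rid: str
    priority: int      # 1 = highest business priority
    complexity: float  # complexity metric score; higher = more fault-prone

reqs = [
    Requirement("R1", 2, 14.0),
    Requirement("R2", 1, 5.5),
    Requirement("R3", 1, 21.0),
    Requirement("R4", 3, 2.0),
]

# Double sort: highest-priority first; within one priority level, review the
# most complex (assumed most defect-prone) requirements first, so a limited
# review budget is spent where the most defects are expected.
review_order = sorted(reqs, key=lambda r: (r.priority, -r.complexity))
print([r.rid for r in review_order])   # ['R3', 'R2', 'R1', 'R4']
```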

  4. Relating Business Goals to Architecturally Significant Requirements for Software Systems

    Science.gov (United States)

    2010-05-01

    …must respond within five seconds” [EPF 2010]. A major source of architecturally significant requirements is the set of business goals that led to the…
    …Projects for Competitive Advantage, Center for Business Practices, 1999.
    [EPF 2010] Eclipse Process Framework Project. Concept: Architecturally…

  5. Comparison of different methods to extract the required coefficient of friction for level walking.

    Science.gov (United States)

    Chang, Wen-Ruey; Chang, Chien-Chi; Matz, Simon

    2012-01-01

    The required coefficient of friction (RCOF) is an important predictor of slip incidents. Despite the wide use of the RCOF, there is no standardised method for identifying the RCOF from ground reaction forces. This article presents a comparison of the outcomes of seven different methods, derived from those reported in the literature, for identifying the RCOF from the same data. While commonly used methods are based on a normal-force threshold, a percentage of the stance phase, or time from heel contact, a newly introduced hybrid method is based on a combination of normal force, time, and direction of increase in the coefficient of friction. Although no major differences were found among these methods in more than half of the strikes, significant differences were found in a sizable portion of strikes. Potential problems with some of these methods were identified and discussed, and they appear to be overcome by the hybrid method. Practitioner summary: No standard method exists for determining the required coefficient of friction (RCOF), an important predictor of slipping. In this study, RCOF values from a single data set, obtained using various methods from the literature, differed considerably for a sizable portion of strikes. A hybrid method may yield improved results.
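
    The hybrid criteria themselves are in the paper; the sketch below implements only the common normal-force-threshold family of methods compared, with the 100 N threshold and the first-local-peak rule being assumptions of this illustration.

```python
import numpy as np

def rcof_normal_force_threshold(fx, fy, fz, fz_min=100.0):
    """RCOF by a normal-force-threshold method (one family of methods the
    paper compares; thresholds and peak-picking rules vary by author).

    fx, fy : shear ground-reaction force components (N)
    fz     : normal ground-reaction force (N)
    fz_min : samples with fz below this are ignored, avoiding the unstable
             friction ratio right at heel contact (hypothetical 100 N default)
    """
    fx, fy, fz = map(np.asarray, (fx, fy, fz))
    valid = fz > fz_min
    cof = np.sqrt(fx[valid] ** 2 + fy[valid] ** 2) / fz[valid]
    # Use the first local maximum of the friction ratio (peak shear demand
    # shortly after heel contact) rather than the global maximum, which can
    # occur at push-off and is not the slip-relevant peak.
    for i in range(1, len(cof) - 1):
        if cof[i - 1] <= cof[i] > cof[i + 1]:
            return float(cof[i])
    return float(cof.max())
```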

  6. The Sources and Methods of Engineering Design Requirement

    DEFF Research Database (Denmark)

    Li, Xuemeng; Zhang, Zhinan; Ahmed-Kristensen, Saeema

    2014-01-01

    to be defined in a new context. This paper focuses on understanding the design requirement sources at the requirement elicitation phase. It aims at proposing an improved design requirement source classification considering emerging markets and presenting current methods for eliciting requirement for each source...

  7. A Survey of Various Object Oriented Requirement Engineering Methods

    OpenAIRE

    Anandi Mahajan; Dr. Anurag Dixit

    2013-01-01

    In recent years many industries have been moving to the use of object-oriented methods for the development of large-scale information systems. The demand for the Object-Oriented approach in the development of software systems is increasing day by day. This paper is a survey of various Object-Oriented requirement engineering methods. It contains a summary of the available Object-Oriented requirement engineering methods with their relative advantages and disadvantages...

  8. Basic requirements to the methods of personnel monitoring

    International Nuclear Information System (INIS)

    Keirim-Markus, I.B.

    1981-01-01

    Requirements for personnel monitoring methods (PMM) are given as a function of the irradiation conditions. The irradiation conditions determine the types of irradiation subject to monitoring, the measurement ranges, the periodicity of monitoring, how promptly results must be obtained, and the required accuracy. PMM based on the photographic effect of ionizing radiation is the main method of mass monitoring.

  9. A fast method for the unit scheduling problem with significant renewable power generation

    International Nuclear Information System (INIS)

    Osório, G.J.; Lujano-Rojas, J.M.; Matias, J.C.O.; Catalão, J.P.S.

    2015-01-01

    Highlights: • A model for the scheduling of power systems with significant renewable power generation is provided. • A new methodology that takes information from the analysis of each scenario separately is proposed. • Based on a probabilistic analysis, unit scheduling and the corresponding economic dispatch are estimated. • A comparison with other methodologies favours the proposed approach. - Abstract: Optimal operation of power systems with high integration of renewable power sources has become difficult as a consequence of the random nature of sources such as wind energy and photovoltaic energy. Nowadays, this problem is solved using the Monte Carlo Simulation (MCS) approach, which allows important statistical characteristics of wind and solar power production to be considered, such as the correlation between consecutive observations, the diurnal profile of the forecasted power production, and the forecasting error. However, the MCS method requires the analysis of a representative number of trials, an intensive calculation task that increases considerably with the number of scenarios considered. In this paper, a model for the scheduling of power systems with significant renewable power generation, based on a scenario generation/reduction method that establishes a proportional relationship between the number of scenarios and the computational time required to analyse them, is proposed. The methodology takes information from the analysis of each scenario separately to determine the probabilistic behaviour of each generator at each hour of the scheduling problem. Then, for a given significance level, the units to be committed are selected and the load dispatch is determined. The proposed technique is illustrated through a case study and compared with a stochastic programming approach, leading to the conclusion that the proposed methodology can provide an acceptable solution in a reduced computational time.
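
    The selection rule is described only qualitatively above; the following minimal sketch shows the scenario-wise commitment-probability idea, where the per-scenario unit-commitment solutions, the stand-in probabilities, and the (1 - alpha) decision rule are all assumptions of this illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for per-scenario results: suppose a deterministic unit commitment
# has already been solved for each of S reduced scenarios, giving an on/off
# status for G units over T hours.
S, G, T = 200, 4, 24
status = rng.random((S, G, T)) < 0.6          # hypothetical UC solutions

# Probabilistic behaviour of each generator at each hour: the fraction of
# scenarios in which the unit ends up committed.
p_commit = status.mean(axis=0)                # shape (G, T)

# For a chosen significance level, fix the units to be committed; economic
# dispatch would then be run against this fixed commitment schedule.
alpha = 0.05
committed = p_commit >= (1 - alpha)           # commit near-unanimous units only
print(p_commit.round(2))
print(committed.astype(int))
```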

  10. A Method for Software Requirement Volatility Analysis Using QFD

    Directory of Open Access Journals (Sweden)

    Yunarso Anang

    2016-10-01

    Changes to software requirements are inevitable during the development life cycle. Rather than avoiding the circumstance, it is easier to accept it and find a way to anticipate those changes. This paper proposes a method to analyze requirement volatility using the Quality Function Deployment (QFD) method and an introduced degree of volatility. Customer requirements are deployed to software functions and subsequently to architectural design elements. Then, after determining the potential for change of the design elements, the degree of volatility of each software requirement is calculated. In this paper the method is described using a flow diagram, illustrated using a simple example, and evaluated using a case study.
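
    The exact degree-of-volatility formula is given in the paper, not in this record; the sketch below is one plausible reading of the two-stage deployment described, with the relationship matrices, the 9-3-1 scale, and the normalization all assumptions of this illustration.

```python
import numpy as np

# Hypothetical QFD relationship matrices on the usual 9-3-1 scale
# (9 = strong, 3 = moderate, 1 = weak, 0 = none).
req_to_func = np.array([[9, 3, 0],       # 2 customer requirements x 3 functions
                        [1, 9, 3]])
func_to_design = np.array([[9, 0],       # 3 functions x 2 design elements
                           [3, 9],
                           [0, 3]])
# Assessed potential for change of each architectural design element (0..1).
change_potential = np.array([0.8, 0.2])

# Propagate design-element change potential back through both deployments and
# normalize, giving a per-requirement volatility score.
raw = req_to_func @ func_to_design @ change_potential
degree_of_volatility = raw / raw.max()
print(degree_of_volatility.round(2))     # higher = more exposed to change
```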

  11. How to Compare the Security Quality Requirements Engineering (SQUARE) Method with Other Methods

    National Research Council Canada - National Science Library

    Mead, Nancy R

    2007-01-01

    The Security Quality Requirements Engineering (SQUARE) method, developed at the Carnegie Mellon Software Engineering Institute, provides a systematic way to identify security requirements in a software development project...

  12. Method for developing cost estimates for generic regulatory requirements

    International Nuclear Information System (INIS)

    1985-01-01

    The NRC has established a practice of performing regulatory analyses, reflecting costs as well as benefits, of proposed new or revised generic requirements. A method has been developed to assist the NRC in preparing the types of cost estimates required for this purpose and for assigning priorities in the resolution of generic safety issues. The cost of a generic requirement is defined as the net present value of the total lifetime cost incurred by the public, industry, and government in implementing the requirement for all affected plants. The method described here is for commercial light-water-reactor power plants. Estimating the cost for a generic requirement involves several steps: (1) identifying the activities that must be carried out to fully implement the requirement, (2) defining the work packages associated with the major activities, (3) identifying the individual elements of cost for each work package, (4) estimating the magnitude of each cost element, (5) aggregating individual plant costs over the plant lifetime, and (6) aggregating all plant costs and generic costs to produce a total, national, present value of lifetime cost for the requirement. The method developed addresses all six steps. In this paper, we discuss the first three.
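
    Steps (5) and (6) amount to a present-value aggregation; below is a minimal sketch of that arithmetic, with the cost figures, discount rate, and helper names all hypothetical.

```python
def npv(cash_flows, rate):
    """Net present value of (year, cost) pairs discounted to year 0."""
    return sum(c / (1.0 + rate) ** y for y, c in cash_flows)

def plant_lifetime_cost(install_cost, annual_cost, start_year, life_years, rate):
    """Step 5: aggregate one plant's implementation costs over its lifetime."""
    flows = [(start_year, install_cost)]
    flows += [(start_year + y, annual_cost) for y in range(1, life_years + 1)]
    return npv(flows, rate)

plants = [  # (install $, annual $, years to implementation, remaining life)
    (2.0e6, 5.0e4, 1, 30),
    (1.5e6, 5.0e4, 2, 25),
]
rate = 0.05
generic_costs = 3.0e5    # one-time generic engineering/review effort
# Step 6: aggregate all plant costs and generic costs into a national total.
total = generic_costs + sum(plant_lifetime_cost(i, a, s, l, rate)
                            for i, a, s, l in plants)
print(f"national present value of lifetime cost: ${total:,.0f}")
```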

  13. Classical Methods and Calculation Algorithms for Determining Lime Requirements

    Directory of Open Access Journals (Sweden)

    André Guarçoni

    The methods developed for determination of lime requirements (LR) are based on widely accepted principles. However, the formulas used for calculation have evolved little over recent decades, and in some cases there are indications of their inadequacy. The aim of this study was to compare the lime requirements calculated by three classic formulas and three algorithms, identifying those most appropriate for supplying Ca and Mg to coffee plants with the smallest likelihood of causing overliming. The database used contained 600 soil samples, which were collected in coffee plantings. The LR was estimated by the methods of base saturation, neutralization of Al³⁺, and elevation of Ca²⁺ and Mg²⁺ contents (two formulas), and by the three calculation algorithms. Averages of the lime requirements were compared, and the frequency distribution of the 600 LR estimates from each calculation method was determined. In soils with low cation exchange capacity at pH 7, the base saturation method may fail to adequately supply the plants with Ca and Mg in many situations, while the methods of Al³⁺ neutralization and elevation of Ca²⁺ and Mg²⁺ contents can result in application rates that increase the pH above the suitable range. Among the methods studied for calculating lime requirements, the algorithm that predicts reaching a defined base saturation, with adequate Ca and Mg supply and the maximum application rate limited to the H+Al value, proved to be the most efficient calculation method, and it can be recommended for use under numerous crop conditions.
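
    The base saturation method mentioned above is commonly written as the formula in the sketch below; the 0-20 cm incorporation layer and the PRNT correction are assumptions about the variant used, not details taken from the paper.

```python
def lime_requirement_base_saturation(cec, v_current, v_target, prnt=100.0):
    """Lime requirement (t/ha) by the base saturation method.

    cec       : cation exchange capacity at pH 7 (cmolc/dm3)
    v_current : current base saturation (%)
    v_target  : target base saturation (%)
    prnt      : relative neutralizing power of the liming material (%)

    Assumes the conventional 0-20 cm incorporation layer, for which
    1 cmolc/dm3 of CaCO3 equivalent corresponds to about 1 t/ha.
    """
    return cec * (v_target - v_current) / 100.0 * (100.0 / prnt)

# A low-CEC soil: the case where the paper notes the method may undersupply
# Ca and Mg for coffee even though the computed rate looks reasonable.
print(lime_requirement_base_saturation(cec=4.0, v_current=30, v_target=60))
# -> 1.2 t/ha
```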

  14. Estimation methods of eco-environmental water requirements: Case study

    Institute of Scientific and Technical Information of China (English)

    YANG Zhifeng; CUI Baoshan; LIU Jingling

    2005-01-01

    Supplying water to the ecological environment in sufficient quantity and quality is significant for the protection of diversity and the realization of sustainable development. The conception and connotation of eco-environmental water requirements, including the definition of the conception and the composition and characteristics of eco-environmental water requirements, are evaluated in this paper. The classification and estimation methods of eco-environmental water requirements are then proposed. On the basis of the study on the Huang-Huai-Hai Area, the present water use and the minimum and suitable water requirements are estimated, and the corresponding water shortage is also calculated. According to the related programs, the eco-environmental water requirements in the coming years (2010, 2030, 2050) are estimated. The results indicate that the minimum and suitable eco-environmental water requirements fluctuate with differences in function setting and in the referential standard of water resources, as does the water shortage. Moreover, the study indicates that the minimum eco-environmental water requirement of the study area ranges from 2.84×10¹⁰ m³ to 1.02×10¹¹ m³, the suitable water requirement ranges from 6.45×10¹⁰ m³ to 1.78×10¹¹ m³, the water shortage ranges from 9.1×10⁹ m³ to 2.16×10¹⁰ m³ under the minimum water requirement, and from 3.07×10¹⁰ m³ to 7.53×10¹⁰ m³ under the suitable water requirement. According to the different values of the water shortage, the water priority can be allocated. The ranges of the eco-environmental water requirements in the three coming years (2010, 2030, 2050) are 4.49×10¹⁰-1.73×10¹¹ m³, 5.99×10¹⁰-2.09×10¹¹ m³, and 7.44×10¹⁰-2.52×10¹¹ m³, respectively.

  15. Handbook of methods for risk-based analysis of technical specification requirements

    International Nuclear Information System (INIS)

    Samanta, P.K.; Vesely, W.E.

    1994-01-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements were based on deterministic analysis and engineering judgments. Experience with plant operation indicates that some elements of the requirements are unnecessarily restrictive, while others may not be conducive to safety. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Safety Assessments (PSAs). The use of risk- and reliability-based methods to improve TS requirements has gained wide interest because these methods can: quantitatively evaluate the risk and justify changes based on objective risk arguments; and provide a defensible basis for these requirements in regulatory applications. The US NRC Office of Research is sponsoring research to develop systematic risk-based methods to improve various aspects of TS requirements. The handbook of methods, which is being prepared, summarizes such risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), defenses against common-cause failures, managing plant configurations, and scheduling maintenance. For each topic, the handbook summarizes methods of analysis and data needs, outlines the insights to be gained, lists additional references, and presents examples of evaluations.
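
    As one concrete example of the AOT measures such handbooks treat, the sketch below computes the single-downtime risk contribution; the numerical values are hypothetical and the handbook's exact formulation may differ.

```python
def single_downtime_risk(r_conditional, r_baseline, downtime_hours):
    """Risk contribution of one outage taken under an AOT: the core damage
    frequency increase while the component is down, integrated over the
    downtime (often called the incremental conditional core damage
    probability, ICCDP).

    r_conditional : core damage frequency with the component down (per year)
    r_baseline    : nominal core damage frequency (per year)
    downtime_hours: expected downtime within the allowed outage time
    """
    return (r_conditional - r_baseline) * downtime_hours / 8760.0

def yearly_aot_risk(single_risk, outages_per_year):
    """Expected yearly risk from repeated use of the AOT."""
    return single_risk * outages_per_year

r_single = single_downtime_risk(r_conditional=5e-4, r_baseline=5e-5,
                                downtime_hours=72)
print(f"risk per outage: {r_single:.2e}")
print(f"yearly AOT risk: {yearly_aot_risk(r_single, 2):.2e}")
```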

  16. Handbook of methods for risk-based analysis of Technical Specification requirements

    International Nuclear Information System (INIS)

    Samanta, P.K.; Vesely, W.E.

    1993-01-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements were based on deterministic analysis and engineering judgments. Experience with plant operation indicates that some elements of the requirements are unnecessarily restrictive, while others may not be conducive to safety. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Safety Assessments (PSAs). The use of risk- and reliability-based methods to improve TS requirements has gained wide interest because these methods can: quantitatively evaluate the risk impact and justify changes based on objective risk arguments; and provide a defensible basis for these requirements in regulatory applications. The United States Nuclear Regulatory Commission (USNRC) Office of Research is sponsoring research to develop systematic risk-based methods to improve various aspects of TS requirements. The handbook of methods, which is being prepared, summarizes such risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), defenses against common-cause failures, managing plant configurations, and scheduling maintenance. For each topic, the handbook summarizes methods of analysis and data needs, outlines the insights to be gained, lists additional references, and presents examples of evaluations.

  17. The Videographic Requirements Gathering Method for Adolescent-Focused Interaction Design

    Directory of Open Access Journals (Sweden)

    Tamara Peyton

    2014-08-01

    We present a novel method for conducting requirements gathering with adolescent populations. Called videographic requirements gathering, this technique makes use of mobile phone data capture and participant creation of media images. The videographic requirements gathering method can help researchers and designers gain intimate insight into adolescent lives while simultaneously reducing power imbalances. We provide rationale for this approach, pragmatics of using the method, and advice on overcoming common challenges facing researchers and designers relying on this technique.

  18. Significance and challenges of stereoselectivity assessing methods in drug metabolism

    Directory of Open Access Journals (Sweden)

    Zhuowei Shen

    2016-02-01

    Stereoselectivity in drug metabolism can not only influence the pharmacological activities, tolerability, safety, and bioavailability of drugs directly, but also cause different kinds of drug–drug interactions. Thus, assessing stereoselectivity in drug metabolism is of great significance for pharmaceutical research and development (R&D) and rational use in the clinic. Although there are various methods available for assessing stereoselectivity in drug metabolism, many of them have shortcomings. Indirect chromatographic methods are applicable only to samples with functional groups that can be derivatized or can form a complex with a chiral selector, while direct methods based on chiral stationary phases (CSPs) are expensive. As a detector for chromatographic methods, mass spectrometry (MS) is highly sensitive and specific, although matrix interference remains a challenge to overcome. In addition, the use of nuclear magnetic resonance (NMR) and immunoassays in chiral analysis is worth noting. This review presents several typical examples of drug stereoselective metabolism and provides a literature-based evaluation of current chiral analytical techniques to show the significance and challenges of stereoselectivity assessing methods in drug metabolism.

  19. The Box-and-Dot Method: A Simple Strategy for Counting Significant Figures

    Science.gov (United States)

    Stephenson, W. Kirk

    2009-01-01

    A visual method for counting significant digits is presented. This easy-to-learn (and easy-to-teach) method, designated the box-and-dot method, uses the device of "boxing" significant figures based on two simple rules, then counting the number of digits in the boxes. (Contains 4 notes.)
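
    The two boxing rules, as the summary reads, correspond to the standard significant-figure conventions; below is a small counter implementing that reading (the function name is ours, not the article's).

```python
def significant_figures(number_str: str) -> int:
    """Count significant figures in a numeric string, following the two
    boxing rules as described in the summary: with a decimal point ("dot"),
    box from the first nonzero digit to the final digit; without one, box
    from the first nonzero digit to the last nonzero digit.
    """
    s = number_str.lstrip("+-")
    digits_only = s.replace(".", "")
    # Locate the first nonzero digit (start of the box).
    first = next((i for i, d in enumerate(digits_only) if d in "123456789"), None)
    if first is None:
        return 0                               # e.g., "0.000" boxes nothing
    if "." in s:
        return len(digits_only) - first        # box runs to the end
    last = max(i for i, d in enumerate(digits_only) if d in "123456789")
    return last - first + 1                    # trailing zeros not boxed

for example in ["0.00320", "4500", "4500.", "107.0"]:
    print(example, significant_figures(example))
# 0.00320 -> 3, 4500 -> 2, 4500. -> 4, 107.0 -> 4
```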

  20. Dosimetry using radiosensitive gels in radiotherapy: significance and methods

    International Nuclear Information System (INIS)

    Gibon, D.; Bourel, P.; Castelain, B.; Marchandise, X.; Rousseau, J.

    2001-01-01

    The goal of conformal radiotherapy is to concentrate the dose in a well-defined volume while avoiding the neighbouring healthy structures. This technique requires powerful treatment planning software and rigorous control of the estimated dosimetry. The usual dosimetric tools are not adapted to visualizing and validating complex 3D treatments. Dosimetry by radiosensitive gel permits visualization and measurement of the three-dimensional dose distribution. The objective of this work is to report on current work in this field and, based on our results and our experience, to outline prospects for optimal use of this technique. Further developments will relate to the realization of new radiosensitive gels satisfying, as well as possible, cost requirements, ease of preparation and use, magnetic resonance imaging (MRI) sensitivity, tissue equivalence, and stability. Other developments focus on scanning methods, especially MRI measurement of T1 and T2. (author)

  1. New concepts, requirements and methods concerning the periodic inspection of the CANDU fuel channels

    International Nuclear Information System (INIS)

    Denis, J.R.

    1995-01-01

    Periodic inspection of fuel channels is essential for a proper assessment of the structural integrity of these vital components of the reactor. The development of wet-channel technologies for non-destructive examination (NDE) of pressure tubes and the high technical performance and reliability of the CIGAR equipment have led, in less than 10 years, to the accumulation of a very significant volume of data concerning the flaw mechanisms and structural behaviour of CANDU fuel channels. On this basis, a new form of the CAN/CSA-N285.4 Standard for Periodic Inspection of CANDU Nuclear Power Plant components was elaborated, introducing new concepts and requirements in accord with the powerful NDE methods now available. This paper presents these concepts and requirements, and discusses the NDE methods, presently used or under development, to satisfy these requirements. Specific features regarding the fuel channel inspections of Cernavoda NGS Unit 1 are also discussed. (author)

  2. Quantitative methods for developing C2 system requirement

    Energy Technology Data Exchange (ETDEWEB)

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where material and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis, then, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods were used in a task analysis, and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefit estimates.

  3. Quality functions for requirements engineering in system development methods.

    Science.gov (United States)

    Johansson, M; Timpka, T

    1996-01-01

    Based on a grounded theory framework, this paper analyses the quality characteristics of methods to be used for requirements engineering in the development of medical decision support systems (MDSS). The results from a Quality Function Deployment (QFD) exercise, used to rank functions connected to user value, and a focus group study were presented to a validation focus group. The focus group studies take advantage of a group process to collect data for further analysis. The results describe factors considered important by the participants in the development of methods for requirements engineering in health care. Based on the findings, the content which, according to the users, a method for MDSS development should support is established.

  4. Proposal for Requirement Validation Criteria and Method Based on Actor Interaction

    Science.gov (United States)

    Hattori, Noboru; Yamamoto, Shuichiro; Ajisaka, Tsuneo; Kitani, Tsuyoshi

    We propose requirement validation criteria and a method based on the interaction between actors in an information system. We focus on the cyclical transitions of one actor's situation relative to another and clarify observable stimuli and responses based on these transitions. Both actors' situations can be listed in a state transition table, which describes the observable stimuli or responses they send or receive. Examination of the interaction between both actors in the state transition tables enables us to detect missing or defective observable stimuli or responses. Typically, this method can be applied to the examination of the interaction between a resource managed by the information system and its user. As a case study, we analyzed 332 requirement defect reports from an actual system development project in Japan. We found a certain number of defects regarding missing or defective stimuli and responses that could have been detected by the proposed method had it been used in the requirement definition phase. This means that we can reach a more complete requirement definition with the proposed method.
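
    The paper's state transition tables are not reproduced in this record; the sketch below is a hypothetical encoding of the cross-check described, matching stimuli one actor emits against what the other actor's table handles (all states and stimulus names invented for illustration).

```python
# Hypothetical state transition tables, reduced to the sets of
# (state, observable stimulus/response) pairs each actor sends or handles.
system_sends = {("lent", "due reminder"), ("idle", "recall notice")}
user_handles = {("holding", "due reminder")}

user_sends = {("holding", "return request")}
system_handles = {("lent", "return request")}

def missing_counterparts(sent, handled):
    """Stimuli one actor emits that the other actor's table never handles.
    Matching is by stimulus name; states are local to each actor."""
    handled_names = {stimulus for _, stimulus in handled}
    return [(state, s) for state, s in sent if s not in handled_names]

print(missing_counterparts(system_sends, user_handles))
# [('idle', 'recall notice')] -> a defect: no requirement defines the response
print(missing_counterparts(user_sends, system_handles))
# [] -> every user stimulus has a defined handler
```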

  5. Review of data requirements for groundwater flow and solute transport modelling and the ability of site investigation methods to meet these requirements

    International Nuclear Information System (INIS)

    McEwen, T.J.; Chapman, N.A.; Robinson, P.C.

    1990-08-01

    This report describes the data requirements for the codes that may be used in the modelling of groundwater flow and radionuclide transport during the assessment of a Nirex site for the deep disposal of low and intermediate level radioactive waste and also the site investigation methods that exist to supply the data for these codes. The data requirements for eight codes are reviewed, with most emphasis on three of the more significant codes, VANDAL, NAMMU and CHEMTARD. The largest part of the report describes and discusses the site investigation techniques and each technique is considered in terms of its ability to provide the data necessary to characterise the geological and hydrogeological environment around a potential repository. (author)

  6. Determination of fuel irradiation parameters. Required accuracies and available methods

    International Nuclear Information System (INIS)

    Mas, P.

    1977-01-01

    This paper reports on the present status of the main methods for determining the nuclear parameters of fuel irradiation in testing reactors (nuclear power, burn-up, ...). The different methods (theoretical or experimental) are reviewed: neutron measurements and calculations, gamma scanning, heat balance, etc. The required accuracies are reviewed: they are 3-5% for flux, fluence, nuclear power, burn-up, and conversion factor. These required accuracies are compared with the accuracies actually achievable, which at present are of the order of 5-20% for these parameters.

  7. Determination of material irradiation parameters. Required accuracies and available methods

    International Nuclear Information System (INIS)

    Cerles, J.M.; Mas, P.

    1978-01-01

    In this paper, the authors report the main methods for determining the nuclear parameters of material irradiation in testing reactors (nuclear power, burn-up, fluxes, fluences, ...). The different methods (theoretical or experimental) are reviewed: neutronics measurements and calculations, gamma scanning, thermal balance, etc. The required accuracies are reviewed: they are 3-5% for flux, fluence, nuclear power, burn-up, conversion factor, etc. These required accuracies are compared with the accuracies actually achievable, which at present are of the order of 5-20% for these parameters.

  8. Functional Mobility Testing: A Novel Method to Create Suit Design Requirements

    Science.gov (United States)

    England, Scott A.; Benson, Elizabeth A.; Rajulu, Sudhakar L.

    2008-01-01

    This study was performed to aid in the creation of design requirements for the next generation of space suits that more accurately describe the level of mobility necessary for a suited crewmember, through an innovative methodology utilizing functional mobility. A novel method was utilized involving the collection of kinematic data while 20 subjects (10 male, 10 female) performed pertinent functional tasks that will be required of a suited crewmember during various phases of a lunar mission. These tasks were selected based on relevance and criticality from a larger list of tasks that may be carried out by the crew. Kinematic data were processed through Vicon BodyBuilder software to calculate joint angles for the ankle, knee, hip, torso, shoulder, elbow, and wrist. Maximum functional mobility was consistently lower than maximum isolated mobility. This study suggests that conventional methods for establishing design requirements for human-systems interfaces based on maximal isolated joint capabilities may overestimate the required mobility. Additionally, this method provides a valuable means of evaluating systems created from these requirements by comparing the mobility available in a new spacesuit, or the mobility required to use a new piece of hardware, to this newly established database of functional mobility.

  9. Software Safety Analysis of Digital Protection System Requirements Using a Qualitative Formal Method

    International Nuclear Information System (INIS)

    Lee, Jang-Soo; Kwon, Kee-Choon; Cha, Sung-Deok

    2004-01-01

    The safety analysis of requirements is a key problem area in the development of software for the digital protection systems of a nuclear power plant. When specifying requirements for software of the digital protection systems and conducting safety analysis, engineers find that requirements are often known only in qualitative terms and that existing fault-tree analysis techniques provide little guidance on formulating and evaluating potential failure modes. A framework for the requirements engineering process is proposed that consists of a qualitative method for requirements specification, called the qualitative formal method (QFM), and a safety analysis method for the requirements based on causality information, called the causal requirements safety analysis (CRSA). CRSA is a technique that qualitatively evaluates causal relationships between software faults and physical hazards. This technique, extending the qualitative formal method process and utilizing information captured in the state trajectory, provides specific guidelines on how to identify failure modes and the relationship among them. The QFM and CRSA processes are described using shutdown system 2 of the Wolsong nuclear power plants as the digital protection system example

  10. Significance tests in mutagen screening: another method considering historical control frequencies

    International Nuclear Information System (INIS)

    Traut, H.

    1983-01-01

    Recently a method was devised for testing the significance of the difference between a mutation frequency observed after chemical treatment or irradiation and the historical ('stable') control frequency. Another test serving the same purpose is proposed here. Both methods are applied to several examples (experimental frequency versus historical control frequency). The results (P values) obtained agree well. (author)
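
    Neither test's formula appears in this record; for orientation, the sketch below shows one common way to compare an observed mutant count against a stable historical control rate, an exact Poisson test (not necessarily either of the author's constructions).

```python
from scipy.stats import poisson

def p_vs_historical(mutants, n_scored, control_rate):
    """One-sided exact Poisson test of an observed mutation frequency
    against a historical ('stable') control rate.

    mutants      : mutants observed in the treated group
    n_scored     : number of cells (flies, loci, ...) scored
    control_rate : historical control frequency per unit scored
    """
    expected = control_rate * n_scored
    # P(X >= mutants) under the historical-control Poisson model
    return poisson.sf(mutants - 1, expected)

# Example: 9 mutants in 10,000 scored vs a historical rate of 3 per 10,000
print(f"P = {p_vs_historical(9, 10_000, 3e-4):.4f}")
```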

  11. 5 CFR 610.404 - Requirement for time-accounting method.

    Science.gov (United States)

    2010-01-01

    ... REGULATIONS HOURS OF DUTY Flexible and Compressed Work Schedules § 610.404 Requirement for time-accounting method. An agency that authorizes a flexible work schedule or a compressed work schedule under this...

  12. Effective teaching methods in higher education: requirements and barriers

    Directory of Open Access Journals (Sweden)

    NAHID SHIRANI BIDABADI

    2016-10-01

    Introduction: Teaching is one of the main components of educational planning, which is a key factor in conducting educational plans. Despite the importance of good teaching, the outcomes are far from ideal. The present qualitative study aimed to investigate effective teaching in higher education in Iran based on the experiences of the best professors in the country and the best local professors of Isfahan University of Technology. Methods: This qualitative content analysis study was conducted through purposeful sampling. Semi-structured interviews were conducted with ten faculty members (3 of them from among the best professors in the country and 7 from among the best local professors). Content analysis was performed with MAXQDA software. The codes, categories and themes were explored through an inductive process that began from semantic units or direct quotations and moved to general themes. Results: According to the results of this study, the best teaching approach is the mixed method (student-centered together with teacher-centered) plus educational planning and previous readiness. But teachers who want to teach using this method face certain barriers and requirements: some of these requirements concern professors' behavior and some concern professors' outlook. Also, there are some major barriers, some of which are associated with the professors' practice and others with laws and regulations. Implications of these findings for teacher preparation in education are discussed. Conclusion: The present study illustrated that a good teaching method helps the students to question their preconceptions and motivates them to learn by putting them in a situation in which they come to see themselves as the authors of answers and as the agents of responsibility for change. But training through this method has some barriers and requirements. To have effective teaching, the faculty members of the universities

  13. Rapid methods for jugular bleeding of dogs requiring one technician.

    Science.gov (United States)

    Frisk, C S; Richardson, M R

    1979-06-01

    Two methods were used to collect blood from the jugular vein of dogs. In both techniques, only one technician was required. A rope with a slip knot was placed around the base of the neck to assist in restraint and act as a tourniquet for the vein. The technician used one hand to restrain the dog by the muzzle and position the head. The other hand was used for collecting the sample. One of the methods could be accomplished with the dog in its cage. The bleeding techniques were rapid, requiring approximately 1 minute per dog.

  14. Assessing Clinical Significance: Does it Matter which Method we Use?

    Science.gov (United States)

    Atkins, David C.; Bedics, Jamie D.; Mcglinchey, Joseph B.; Beauchaine, Theodore P.

    2005-01-01

    Measures of clinical significance are frequently used to evaluate client change during therapy. Several alternatives to the original method devised by N. S. Jacobson, W. C. Follette, & D. Revenstorf (1984) have been proposed, each purporting to increase accuracy. However, researchers have had little systematic guidance in choosing among…

  15. 40 CFR 63.344 - Performance test requirements and test methods.

    Science.gov (United States)

    2010-07-01

    ... electroplating tanks or chromium anodizing tanks. The sampling time and sample volume for each run of Methods 306... Chromium Anodizing Tanks § 63.344 Performance test requirements and test methods. (a) Performance test... Emissions From Decorative and Hard Chromium Electroplating and Anodizing Operations,” appendix A of this...

  16. The Ability of Different Imputation Methods to Preserve the Significant Genes and Pathways in Cancer

    Directory of Open Access Journals (Sweden)

    Rosa Aghdam

    2017-12-01

    Deciphering important genes and pathways from incomplete gene expression data could facilitate a better understanding of cancer. Different imputation methods can be applied to estimate the missing values. In our study, we evaluated various imputation methods for their performance in preserving significant genes and pathways. In the first step, 5% of genes are considered at random for two types of missingness mechanisms, ignorable and non-ignorable, with various missing rates. Next, 10 well-known imputation methods were applied to the complete datasets. The significance analysis of microarrays (SAM) method was applied to detect the significant genes in rectal and lung cancers to showcase the utility of imputation approaches in preserving significant genes. To determine the impact of different imputation methods on the identification of important genes, the chi-squared test was used to compare the proportions of overlaps between significant genes detected from original data and those detected from the imputed datasets. Additionally, the significant genes were tested for their enrichment in important pathways, using ConsensusPathDB. Our results showed that almost all the significant genes and pathways of the original dataset can be detected in all imputed datasets, indicating that there is no significant difference in the performance of the various imputation methods tested. The source code and selected datasets are available on http://profiles.bs.ipm.ir/softwares/imputation_methods/.
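
    Below is a minimal end-to-end sketch of the evaluation pipeline described, on synthetic data, with a per-gene t-test standing in for SAM and KNN imputation standing in for the ten methods compared (every dataset detail here is invented).

```python
import numpy as np
from scipy.stats import ttest_ind, chi2_contingency
from sklearn.impute import KNNImputer

rng = np.random.default_rng(2)

# Synthetic stand-in for an expression matrix: 500 genes x 20 samples,
# two groups of 10 samples; the first 50 genes are truly differential.
X = rng.normal(size=(500, 20))
X[:50, 10:] += 1.5
groups = np.array([0] * 10 + [1] * 10)

def significant_genes(data, alpha=0.05):
    """Per-gene t-test as a simple stand-in for SAM."""
    p = ttest_ind(data[:, groups == 0], data[:, groups == 1], axis=1).pvalue
    return p < alpha          # boolean mask of significant genes

orig_sig = significant_genes(X)

# Introduce 5% ignorable (completely-at-random) missingness, then impute.
mask = rng.random(X.shape) < 0.05
X_imputed = KNNImputer(n_neighbors=5).fit_transform(np.where(mask, np.nan, X))
imp_sig = significant_genes(X_imputed)

# Chi-squared comparison of agreement between the two significance calls,
# mirroring the paper's overlap criterion.
table = [[np.sum(orig_sig & imp_sig), np.sum(orig_sig & ~imp_sig)],
         [np.sum(~orig_sig & imp_sig), np.sum(~orig_sig & ~imp_sig)]]
chi2, p, _, _ = chi2_contingency(table)
print(f"{np.sum(orig_sig & imp_sig)}/{np.sum(orig_sig)} significant genes "
      f"preserved after imputation (chi-squared p = {p:.2e})")
```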

  17. The Ability of Different Imputation Methods to Preserve the Significant Genes and Pathways in Cancer.

    Science.gov (United States)

    Aghdam, Rosa; Baghfalaki, Taban; Khosravi, Pegah; Saberi Ansari, Elnaz

    2017-12-01

    Deciphering important genes and pathways from incomplete gene expression data could facilitate a better understanding of cancer. Different imputation methods can be applied to estimate the missing values. In our study, we evaluated various imputation methods for their performance in preserving significant genes and pathways. In the first step, 5% of genes are considered at random for two types of missingness mechanisms, ignorable and non-ignorable, with various missing rates. Next, 10 well-known imputation methods were applied to the complete datasets. The significance analysis of microarrays (SAM) method was applied to detect the significant genes in rectal and lung cancers to showcase the utility of imputation approaches in preserving significant genes. To determine the impact of different imputation methods on the identification of important genes, the chi-squared test was used to compare the proportions of overlaps between significant genes detected from original data and those detected from the imputed datasets. Additionally, the significant genes were tested for their enrichment in important pathways, using ConsensusPathDB. Our results showed that almost all the significant genes and pathways of the original dataset can be detected in all imputed datasets, indicating that there is no significant difference in the performance of the various imputation methods tested. The source code and selected datasets are available on http://profiles.bs.ipm.ir/softwares/imputation_methods/. Copyright © 2017. Production and hosting by Elsevier B.V.

  18. V-amylose structural characteristics, methods of preparation, significance, and potential applications

    CSIR Research Space (South Africa)

    Obiro, WC

    2012-02-01

    …and postprandial hyperglycaemia in diabetics. Various aspects of V-amylose structure, methods of preparation, factors that affect its formation, and the significance and potential applications of the V-amylose complexes are reviewed.

  19. Effective Teaching Methods in Higher Education: Requirements and Barriers.

    Science.gov (United States)

    Shirani Bidabadi, Nahid; Nasr Isfahani, Ahmmadreza; Rouhollahi, Amir; Khalili, Roya

    2016-10-01

    Teaching is one of the main components of educational planning, which is a key factor in conducting educational plans. Despite the importance of good teaching, the outcomes are far from ideal. The present qualitative study aimed to investigate effective teaching in higher education in Iran based on the experiences of the best professors in the country and the best local professors of Isfahan University of Technology. This qualitative content analysis study was conducted through purposeful sampling. Semi-structured interviews were conducted with ten faculty members (3 of them from among the best professors in the country and 7 from among the best local professors). Content analysis was performed with MAXQDA software. The codes, categories and themes were explored through an inductive process that began from semantic units or direct quotations and moved to general themes. According to the results of this study, the best teaching approach is the mixed method (student-centered together with teacher-centered) plus educational planning and previous readiness. But teachers who want to teach using this method face certain barriers and requirements: some of these requirements concern professors' behavior and some concern professors' outlook. Also, there are some major barriers, some of which are associated with the professors' practice and others with laws and regulations. Implications of these findings for teacher preparation in education are discussed. In the present study, it was illustrated that a good teaching method helps the students to question their preconceptions and motivates them to learn by putting them in a situation in which they come to see themselves as the authors of answers and as the agents of responsibility for change. But training through this method has some barriers and requirements. To have effective teaching, the faculty members of the universities should be aware of these barriers and requirements as a way to

  1. New significance test methods for Fourier analysis of geophysical time series

    Directory of Open Access Journals (Sweden)

    Z. Zhang

    2011-09-01

    When one applies the discrete Fourier transform to analyze finite-length time series, discontinuities at the data boundaries will distort its Fourier power spectrum. In this paper, based on a rigorous statistical framework, we present a new significance test method which can extract the intrinsic features of a geophysical time series very well. We show the difference in significance level compared with traditional Fourier tests by analyzing the Arctic Oscillation (AO) and the Nino3.4 time series. In the AO, we find significant peaks at about 2.8, 4.3, and 5.7 yr periods, and in Nino3.4 at about a 12 yr period, in tests against red noise. These peaks are not significant in traditional tests.
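
    The paper's new test is not reproduced in this record; for context, the sketch below implements the traditional red-noise (AR(1)) significance test that such methods are compared against, in the usual chi-squared form (all tuning values here are our own choices).

```python
import numpy as np
from scipy.stats import chi2

def rednoise_significance(x, alpha=0.95):
    """Traditional red-noise significance test for a periodogram: compare
    each Fourier power against an AR(1) theoretical spectrum scaled by a
    chi-squared quantile (2 degrees of freedom per spectral estimate).
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    power = np.abs(np.fft.rfft(x)) ** 2 / n
    freq = np.fft.rfftfreq(n)                 # cycles per sample
    # Lag-1 autocorrelation estimates the red-noise parameter.
    r1 = np.corrcoef(x[:-1], x[1:])[0, 1]
    rednoise = (1 - r1**2) / (1 - 2 * r1 * np.cos(2 * np.pi * freq) + r1**2)
    rednoise *= power.mean() / rednoise.mean()   # scale to the data variance
    threshold = rednoise * chi2.ppf(alpha, df=2) / 2
    return freq, power, threshold

# Example: AR(1) noise with a weak 10-sample cycle buried in it
rng = np.random.default_rng(3)
x = np.zeros(512)
for t in range(1, 512):
    x[t] = 0.5 * x[t - 1] + rng.normal()
x += 0.8 * np.sin(2 * np.pi * np.arange(512) / 10)
freq, power, thr = rednoise_significance(x)
print(freq[(power > thr) & (freq > 0)])   # frequencies passing the 95% test
```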

  2. Extrapleural locating method: significance in CT-guided transthoracic pulmonary biopsy

    International Nuclear Information System (INIS)

    Tang Guangjian; Wang Rengui; Liu Jianxin; Sun Jingtao

    2008-01-01

    Objective: To evaluate the usefulness of the extrapleural locating method in CT-guided transthoracic pulmonary biopsy in preventing or reducing the size of pneumothorax. Methods: One hundred and fifteen cases of CT-guided transthoracic pulmonary biopsy, with pulmonary lesions not in direct contact with the pleura, were selected. Of the 115 cases, 46 were performed with the extrapleural locating method (EPL) and 69 with the lesion edge locating method (LEL). The maximum distance between the parietal and visceral pleura (MPVD), measured on the CT image after the procedure, was taken as the index of the volume of pneumothorax. The incidence and volume of pneumothorax in both groups were compared and statistically analysed with the R x C chi-squared test. The retention time of the biopsy needle in the lung parenchyma was documented for each group and the average time calculated. Results: The incidence of pneumothorax was 45.7% (21/46, median 0.4 cm) in the EPL group and 66.7% (46/69, median 0.3 cm) in the LEL group. When the distance between the lesion and the pleura was 2 cm or less, the incidence of pneumothorax was 39.4% (13/33) in the EPL group and 73.2% (30/41) in the LEL group, and the differences in incidence and volume of pneumothorax between the two groups were statistically significant (χ²=9.981, P=0.019). When the distance was larger than 2 cm, the differences in incidence and volume of pneumothorax between the two groups were not statistically significant. The average retention time of the biopsy needle in the lung parenchyma was (7.2 ± 1.8) s in the EPL group and (58.3 ± 11.6) s in the LEL group. Conclusion: The extrapleural locating method can effectively reduce the retention time of the biopsy needle in the lung parenchyma and the incidence and volume of pneumothorax in CT-guided transthoracic pulmonary biopsy. (authors)

  3. Identifying and prioritizing customer requirements from tractor production by QFD method

    Directory of Open Access Journals (Sweden)

    H Taghizadeh

    2017-05-01

    Introduction: Discovering and understanding customer needs and expectations are considered important factors in customer satisfaction and play a vital role in maintaining a firm's standing among its competitors. Meeting customer needs, including the quality of products or services, is a critical factor in designing a successful product, so successful organizations must meet these needs. Quality Function Deployment (QFD) is a technique for studying the demands and needs of customers that places greater emphasis on the customer's interests. The QFD method generally implements various tools and methods for reaching qualitative goals, but its most important and main tool is the house of quality diagram. The Analytic Hierarchy Process (AHP) is a well-known multi-attribute decision making (MADM) method based on pairwise comparisons, used to determine the priority of the factors under study. Considering the effectiveness of the QFD method in explicating customers' demands and obtaining customer satisfaction, the researchers sought a scientific answer to the following question: how can QFD explicate the real demands and requirements of customers for the final tractor product, and how are these demands and requirements prioritized in the view of customers? Accordingly, the aim of this study was to identify and prioritize the customer requirements for the Massey Ferguson (MF 285) tractor produced by the Iran Tractor Manufacturing Company, using Student's t-test, AHP, and QFD methods. Materials and Methods: The research method was descriptive, and the statistical population included all tractor customers of the Tractor Manufacturing Company in Iran from March 2011 to March 2015. The sample size of 171 was determined with Cochran's formula. Moreover, the opinions of 20 experts were considered for determining the product's technical requirements.
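
    The AHP step named above reduces to a principal-eigenvector computation on a pairwise comparison matrix; below is a minimal sketch with hypothetical judgments (the paper's actual matrices are not given in this record).

```python
import numpy as np

# Hypothetical pairwise comparisons of three customer requirements on
# Saaty's 1-9 scale: A[i, j] = importance of requirement i relative to j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# AHP priority vector: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check: CR < 0.1 is conventionally acceptable
# (0.58 is Saaty's random index for a 3 x 3 matrix).
n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
print(weights.round(3), f"CR = {ci / 0.58:.3f}")
```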

  4. A comparison of published methods of calculation of defect significance

    International Nuclear Information System (INIS)

    Ingham, T.; Harrison, R.P.

    1982-01-01

    This paper presents some of the results obtained in a round-robin calculational exercise organised by the OECD Committee on the Safety of Nuclear Installations (CSNI). The exercise was initiated to examine practical aspects of using documented elastic-plastic fracture mechanics methods to calculate defect significance. The extent to which the objectives of the exercise were met is illustrated using solutions to 'standard' problems produced by UKAEA and CEGB using the methods given in ASME XI Appendix A, BSI PD6493, and the CEGB R/H/R6 document. Differences in critical or tolerable defect size defined using these procedures are examined in terms of their different treatments, and reasons for the discrepancies are discussed. (author)
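
    The assessment procedures compared above have evolved since 1982; to illustrate the failure-assessment-diagram idea behind the R6-type approach, the sketch below uses the modern Option-1-style curve (the 1982 documents used earlier curves, and the cut-off value here is a hypothetical material-dependent choice).

```python
import math

def fad_option1(lr):
    """Failure assessment line of the R6-type Option 1 curve (modern general
    form): the limiting Kr for a given load ratio Lr."""
    return (1 - 0.14 * lr**2) * (0.3 + 0.7 * math.exp(-0.65 * lr**6))

def defect_acceptable(kr, lr, lr_max=1.25):
    """A defect is tolerable if its assessment point (Lr, Kr) lies inside
    the failure assessment diagram; lr_max is material-dependent (the value
    here is a hypothetical mild-steel-like choice).

    kr : fracture ratio, applied stress intensity / material toughness
    lr : load ratio, applied load / plastic limit load
    """
    return lr <= lr_max and kr <= fad_option1(lr)

# Assessment points from hypothetical stress and toughness calculations:
print(defect_acceptable(kr=0.55, lr=0.80))   # True  -> defect tolerable
print(defect_acceptable(kr=0.95, lr=0.90))   # False -> further assessment
```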

  5. Quality assurance requirements and methods for high level waste package acceptability

    International Nuclear Information System (INIS)

    1992-12-01

    This document serves as guidance for assigning the items necessary to control the conditioning process in such a way that waste packages are produced in compliance with the waste acceptance requirements. It is also provided to promote the exchange of information on quality assurance requirements and on the application of quality assurance methods associated with the production of high level waste packages, to ensure that these waste packages comply with the requirements for transportation, interim storage and waste disposal in deep geological formations. The document is intended to assist both the operators of conditioning facilities and repositories and the national authorities and regulatory bodies involved in the licensing of the conditioning of high level radioactive wastes or in the development of deep underground disposal systems. The document recommends the quality assurance requirements and methods necessary to generate data for the parameters identified in IAEA-TECDOC-560 on qualitative acceptance criteria, and indicates where and when the control methods can be applied, e.g. in the operation or commissioning of a process or in the development of a waste package design. Emphasis is on control of the process, with little reliance placed on non-destructive or destructive testing. Qualitative criteria relevant to disposal of high level waste are repository dependent and are not addressed here. 37 refs, 3 figs, 2 tabs

  6. Impact significance determination-Pushing the boundaries

    International Nuclear Information System (INIS)

    Lawrence, David P.

    2007-01-01

    Impact significance determination practice tends to be highly variable. Too often insufficient consideration is given to good practice insights. Also, impact significance determinations are frequently narrowly defined addressing, for example, only individual, negative impacts, focusing on bio-physical impacts, and not seeking to integrate either the Precautionary Principle or sustainability. This article seeks to extend the boundaries of impact significance determination practice by providing an overview of good general impact significance practices, together with stakeholder roles and potential methods for addressing significance determination challenges. Relevant thresholds, criteria, contextual considerations and support methods are also highlighted. The analysis is then extended to address how impact significance determination practices change for positive as compared with negative impacts, for cumulative as compared with individual impacts, for socio-economic as compared with bio-physical impacts, when the Precautionary Principle is integrated into the process, and when sustainability contributions drive the EIA process and related impact significance determinations. These refinements can assist EIA practitioners in ensuring that the scope and nature of impact significance determinations reflect the broadened scope of emerging EIA requirements and practices. Suggestions are included for further refining and testing of the proposed changes to impact significance determination practice

  7. A Fluorine-18 Radiolabeling Method Enabled by Rhenium(I) Complexation Circumvents the Requirement of Anhydrous Conditions.

    Science.gov (United States)

    Klenner, Mitchell A; Pascali, Giancarlo; Zhang, Bo; Sia, Tiffany R; Spare, Lawson K; Krause-Heuer, Anwen M; Aldrich-Wright, Janice R; Greguric, Ivan; Guastella, Adam J; Massi, Massimiliano; Fraser, Benjamin H

    2017-05-11

    Azeotropic distillation is typically required to achieve fluorine-18 radiolabeling during the production of positron emission tomography (PET) imaging agents. However, this time-consuming process also limits fluorine-18 incorporation, due to radioactive decay of the isotope and its adsorption to the drying vessel. In addressing these limitations, the fluorine-18 radiolabeling of one model rhenium(I) complex is reported here, which is significantly improved under conditions that do not require azeotropic drying. This work could open a route towards the investigation of a simplified metal-mediated late-stage radiofluorination method, which would expand upon the accessibility of new PET and PET-optical probes. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Age most significant predictor of requiring enteral feeding in head-and-neck cancer patients

    International Nuclear Information System (INIS)

    Sachdev, Sean; Refaat, Tamer; Bacchus, Ian D; Sathiaseelan, Vythialinga; Mittal, Bharat B

    2015-01-01

    A significant number of patients treated for head and neck squamous cell cancer (HNSCC) undergo enteral tube feeding. Data suggest that avoiding enteral feeding can prevent long-term tube dependence and disuse of the swallowing mechanism, which has been linked to complications such as prolonged dysphagia and esophageal constriction. We examined detailed dosimetric and clinical parameters to better identify those at risk of requiring enteral feeding. One hundred patients with advanced stage HNSCC were retrospectively analyzed after intensity-modulated radiation therapy (IMRT) to a median dose of 70 Gy (range: 60-75 Gy) with concurrent chemotherapy in nearly all cases (97%). Patients with significant weight loss (>10%) in the setting of severely reduced oral intake were referred for placement of a percutaneous endoscopic gastrostomy (PEG) tube. Detailed DVH parameters were collected for several structures. Univariate and multivariate analyses using logistic regression were used to determine clinical and dosimetric factors associated with needing enteral feeding. Dichotomous outcomes were tested using Fisher’s exact test and continuous variables between groups using the Wilcoxon rank-sum test. Thirty-three percent of patients required placement of an enteral feeding tube. The median time to tube placement was 25 days from start of treatment, after a median dose of 38 Gy. On univariate analysis, age (p = 0.0008), the DFH (Docetaxel/5-FU/Hydroxyurea) chemotherapy regimen (p = 0.042) and b.i.d. treatment (p = 0.040) (used in limited cases on protocol) predicted need for enteral feeding. On multivariate analysis, age remained the single statistically significant factor (p = 0.003) regardless of other clinical features (e.g. BMI) and all radiation planning parameters. For patients 60 or older compared to younger adults, the odds ratio for needing enteral feeding was 4.188 (p = 0.0019). Older age was found to be the most significant risk factor for needing enteral feeding in
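
    As a rough illustration of the analysis pattern reported above (not the study's data or model), the sketch below fits a logistic regression of tube placement on age with statsmodels and reads the odds ratio off the fitted coefficient; the cohort, effect size, and variable names are simulated placeholders.

        # Hypothetical logistic regression: odds ratio per year of age for
        # needing an enteral feeding tube. All numbers are simulated.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(7)
        age = rng.normal(60, 10, 300)
        logit = -9.0 + 0.14 * age                       # older -> higher risk
        needed_tube = (rng.random(300) < 1 / (1 + np.exp(-logit))).astype(int)

        X = sm.add_constant(age)                        # intercept + age
        fit = sm.Logit(needed_tube, X).fit(disp=0)
        print("OR per year of age:", np.exp(fit.params[1]))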

  9. A Method for and Issues Associated with the Determination of Space Suit Joint Requirements

    Science.gov (United States)

    Matty, Jennifer E.; Aitchison, Lindsay

    2009-01-01

    In the design of a new space suit it is necessary to have requirements that define what mobility space suit joints should be capable of achieving, both as a system and at the component level. NASA elected to divide mobility into its constituent parts - range of motion (ROM) and torque - in an effort to develop clean design requirements that limit subject performance bias and are easily verified. Unfortunately, the measurement of mobility can be difficult to obtain. Current technologies, such as the Vicon motion capture system, allow for the relatively easy benchmarking of ROM for a wide array of space suit systems. The ROM evaluations require subjects in the suit to accurately evaluate the ranges humans can achieve in the suit. However, when it comes to torque, there are significant challenges both for benchmarking current performance and for writing requirements for future suits. This is reflected in the fact that torque definitions have been applied to very few types of space suits, and with limited success in defining all the joints accurately. This paper discusses the advantages and disadvantages of historical joint torque evaluation methods, describes more recent efforts directed at benchmarking joint torques of prototype space suits, and provides an outline for how NASA intends to address joint torque in design requirements for the Constellation Space Suit System (CSSS).

  10. Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.

    Science.gov (United States)

    Kieffer, Kevin M.; Thompson, Bruce

    As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…
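
    A minimal sketch of such a "what if" analysis, assuming a one-sample t-test and a fixed observed effect size; the effect size and sample sizes below are illustrative, not taken from the paper.

        # Holding the observed effect size fixed, recompute the t statistic
        # and p-value the same data pattern would yield at other sample sizes.
        from scipy import stats

        def what_if_p_values(cohens_d, sample_sizes):
            """Return {n: p} for a one-sample t-test with fixed effect size d."""
            results = {}
            for n in sample_sizes:
                t = cohens_d * n ** 0.5             # t = d * sqrt(n)
                p = 2 * stats.t.sf(abs(t), n - 1)   # two-tailed p-value
                results[n] = p
            return results

        # A "small" effect (d = 0.3) is non-significant at n = 20 but highly
        # significant at n = 200, purely because of sample size.
        print(what_if_p_values(0.3, [20, 50, 100, 200]))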

  11. Identification of Water Quality Significant Parameter with Two Transformation/Standardization Methods on Principal Component Analysis and Scilab Software

    Directory of Open Access Journals (Sweden)

    Jovan Putranda

    2016-09-01

    Full Text Available Water quality monitoring is prone to error in its recording or measuring process. Monitoring of river water quality aims not only to capture the dynamics of water quality, but also to provide data for river management and water pollution policy, so as to safeguard human health and sanitation requirements and to preserve biodiversity. Evaluation of water quality monitoring needs to start by identifying the important water quality parameters. This research aimed to identify the significant parameters by applying two transformation/standardization methods to the water quality data: the river Water Quality Index (WQI; Indeks Kualitas Air Sungai, IKAs) transformation/standardization, and transformation/standardization to mean 0 and variance 1, so that the variability of the water quality parameters could be aggregated with one another. Both methods were applied to water quality monitoring data whose validity and reliability had been tested. Principal Component Analysis (PCA; Analisa Komponen Utama, AKU), with the help of Scilab software, was used to process the secondary data on the water quality parameters of the Gadjah Wong river in 2004-2013. The Scilab result was cross-examined against the result from the Excel-based Biplot Add-In software. The results showed that only 18 of the 35 water quality parameters had passable data quality. The two transformation/standardization methods gave different types and numbers of significant parameters. With the mean 0, variance 1 transformation/standardization, the significant water quality parameters with respect to the mean concentration of each parameter were TDS, SO4, EC, TSS, NO3N, COD, BOD5, Grease Oil and NH3N. With the river WQI transformation/standardization, the significant water quality parameters showed the level of
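
    The mean 0, variance 1 route described above is easy to reproduce in outline. The sketch below standardizes a placeholder data matrix and inspects PCA loadings with scikit-learn rather than Scilab; the data, and the reading of large loadings as "significant" parameters, are illustrative assumptions.

        # z-score standardization followed by PCA on random placeholder data.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        X = rng.lognormal(size=(120, 9))    # 120 samples x 9 quality parameters

        # Mean 0, variance 1 so parameters on very different scales
        # (e.g. TDS vs NH3N) become comparable.
        Z = (X - X.mean(axis=0)) / X.std(axis=0)

        pca = PCA()
        pca.fit(Z)

        # Parameters with large absolute loadings on the first components
        # are candidates for "significant" parameters.
        loadings = pca.components_[:2].T
        for j, (l1, l2) in enumerate(loadings):
            print(f"parameter {j}: PC1 loading {l1:+.2f}, PC2 loading {l2:+.2f}")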

  12. Neural Interfaces for Intracortical Recording: Requirements, Fabrication Methods, and Characteristics.

    Science.gov (United States)

    Szostak, Katarzyna M; Grand, Laszlo; Constandinou, Timothy G

    2017-01-01

    Implantable neural interfaces for central nervous system research have been designed with wire, polymer, or micromachining technologies over the past 70 years. Research on biocompatible materials, ideal probe shapes, and insertion methods has resulted in building more and more capable neural interfaces. Although the trend is promising, the long-term reliability of such devices has not yet met the required criteria for chronic human application. The performance of neural interfaces in chronic settings often degrades due to the foreign body response to the implant, which is initiated by the surgical procedure and related to the probe structure and the material properties used in fabricating the neural interface. In this review, we identify the key requirements for neural interfaces for intracortical recording; describe the three different types of probes (microwire, micromachined, and polymer-based) and their materials and fabrication methods; and discuss their characteristics and related challenges.

  13. Prognostic significance of blood-brain barrier disruption in patients with severe nonpenetrating traumatic brain injury requiring decompressive craniectomy.

    Science.gov (United States)

    Ho, Kwok M; Honeybul, Stephen; Yip, Cheng B; Silbert, Benjamin I

    2014-09-01

    The authors assessed the risk factors and outcomes associated with blood-brain barrier (BBB) disruption in patients with severe, nonpenetrating, traumatic brain injury (TBI) requiring decompressive craniectomy. At 2 major neurotrauma centers in Western Australia, a retrospective cohort study was conducted among 97 adult neurotrauma patients who required an external ventricular drain (EVD) and decompressive craniectomy during 2004-2012. Glasgow Outcome Scale scores were used to assess neurological outcomes. Logistic regression was used to identify factors associated with BBB disruption, defined by a ratio of total CSF protein concentration to total plasma protein concentration > 0.007 in the earliest CSF specimen collected after TBI. Of the 252 patients who required decompressive craniectomy, 97 (39%) required an EVD to control intracranial pressure, and biochemical evidence of BBB disruption was observed in 43 (44%). Presence of disruption was associated with more severe TBI (median predicted risk for unfavorable outcome 75% vs 63%, respectively; p = 0.001) and with worse outcomes at 6, 12, and 18 months than was absence of BBB disruption (72% vs 37% unfavorable outcomes, respectively; p = 0.015). The only risk factor significantly associated with increased risk for BBB disruption was presence of nonevacuated intracerebral hematoma (> 1 cm diameter) (OR 3.03, 95% CI 1.23-7.50; p = 0.016). Although BBB disruption was associated with more severe TBI and worse long-term outcomes, when combined with the prognostic information contained in the Corticosteroid Randomization after Significant Head Injury (CRASH) prognostic model, it did not seem to add significant prognostic value (area under the receiver operating characteristic curve 0.855 vs 0.864, respectively; p = 0.453). Biochemical evidence of BBB disruption after severe nonpenetrating TBI was common, especially among patients with large intracerebral hematomas. Disruption of the BBB was associated with more severe

  14. Neural Interfaces for Intracortical Recording: Requirements, Fabrication Methods, and Characteristics

    Directory of Open Access Journals (Sweden)

    Katarzyna M. Szostak

    2017-12-01

    Full Text Available Implantable neural interfaces for central nervous system research have been designed with wire, polymer, or micromachining technologies over the past 70 years. Research on biocompatible materials, ideal probe shapes, and insertion methods has resulted in building more and more capable neural interfaces. Although the trend is promising, the long-term reliability of such devices has not yet met the required criteria for chronic human application. The performance of neural interfaces in chronic settings often degrades due to the foreign body response to the implant, which is initiated by the surgical procedure and related to the probe structure and the material properties used in fabricating the neural interface. In this review, we identify the key requirements for neural interfaces for intracortical recording; describe the three different types of probes (microwire, micromachined, and polymer-based) and their materials and fabrication methods; and discuss their characteristics and related challenges.

  15. Developing the RIAM method (rapid impact assessment matrix) in the context of impact significance assessment

    International Nuclear Information System (INIS)

    Ijaes, Asko; Kuitunen, Markku T.; Jalava, Kimmo

    2010-01-01

    In this paper the applicability of the RIAM method (rapid impact assessment matrix) is evaluated in the context of impact significance assessment. The methodological issues considered in the study are: 1) testing the possibilities of enlarging the scoring system used in the method, and 2) comparing the significance classifications of RIAM and unaided decision-making to estimate the consistency between these methods. The data used consisted of projects for which funding had been applied for via the European Union's Regional Development Trust in the area of Central Finland. Cases were evaluated with respect to their environmental, social and economic impacts using an assessment panel. The results showed that the scoring framework used in RIAM could be modified according to the problem situation at hand, which enhances its application potential. However, the changes made in criteria B did not significantly affect the final ratings of the method, which indicates the high importance of criteria A1 (importance) and A2 (magnitude) to the overall results. The significance classes obtained by the two methods diverged notably. In general, the ratings given by RIAM tended to be smaller than those given by intuitive judgement, implying that the RIAM method may be somewhat conservative in character.
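
    For orientation, the sketch below implements the standard RIAM scoring rule as usually published (group A criteria multiply, group B criteria add, and the environmental score is their product); the enlarged scoring ranges tested in the article are not reproduced, and the example values are placeholders.

        # Standard RIAM environmental score: ES = (A1 * A2) * (B1 + B2 + B3).
        def riam_score(a1, a2, b1, b2, b3):
            """a1: importance, a2: magnitude; b1-b3: permanence,
            reversibility, cumulativity."""
            return (a1 * a2) * (b1 + b2 + b3)

        # Example: an important (2), moderately negative (-2) impact that is
        # permanent (3), irreversible (3) and cumulative (3).
        print(riam_score(2, -2, 3, 3, 3))   # -> -36, a strongly negative band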

  16. Improving allowed outage time and surveillance test interval requirements: a study of their interactions using probabilistic methods

    International Nuclear Information System (INIS)

    Martorell, S.A.; Serradell, V.G.; Samanta, P.K.

    1995-01-01

    Technical Specifications (TS) define the limits and conditions for operating nuclear plants safely. We selected the Limiting Conditions for Operation (LCO) and Surveillance Requirements (SR), both within TS, as the main items to be evaluated using probabilistic methods. In particular, we focused on the Allowed Outage Time (AOT) and Surveillance Test Interval (STI) requirements in the LCO and SR, respectively. Significant operating and design experience has already accumulated, revealing several problems which require modifications to some TS rules. Developments in Probabilistic Safety Assessment (PSA) allow the effects of such modifications to AOT and STI to be evaluated from a risk point of view, and some changes have already been adopted in some plants. However, the combined effect of several changes to AOT and STI, i.e. through their interactions, is not addressed. This paper presents a methodology which encompasses, along with the definition of AOT and STI interactions, the quantification of interactions in terms of risk using PSA methods, an approach for evaluating simultaneous AOT and STI modifications, and an assessment of strategies for giving flexibility to plant operation through simultaneous changes to AOT and STI using trade-off-based risk criteria.
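
    The kind of trade-off such a methodology quantifies can be illustrated with a deliberately simplified unavailability model (not the authors' PSA model): a standby train's time-average unavailability as a function of its STI and of downtime spent under the AOT. All rates below are placeholders.

        # Simplified standby-train unavailability: testing and outage terms.
        def mean_unavailability(lam, sti_h, outage_freq_per_h, aot_h):
            """lam: standby failure rate (/h); sti_h: test interval (h);
            outage_freq_per_h: frequency of entering the LCO (/h);
            aot_h: mean downtime per LCO entry (h), bounded by the AOT."""
            q_test = lam * sti_h / 2.0          # undetected-failure term
            q_aot = outage_freq_per_h * aot_h   # maintenance-downtime term
            return q_test + q_aot

        # Lengthening the STI raises q_test while a longer AOT raises q_aot:
        # the interaction the paper evaluates with PSA.
        print(mean_unavailability(1e-5, 720, 1e-4, 24))
        print(mean_unavailability(1e-5, 2190, 1e-4, 72))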

  17. Sensitivity Analysis of Hydraulic Methods Regarding Hydromorphologic Data Derivation Methods to Determine Environmental Water Requirements

    Directory of Open Access Journals (Sweden)

    Alireza Shokoohi

    2015-07-01

    Full Text Available This paper studies the accuracy of hydraulic methods in determining environmental flow requirements. Despite the vital importance of deriving river cross sectional data for hydraulic methods, few studies have focused on the criteria for deriving these data. The present study shows that the depth of the cross section has a meaningful effect on the results obtained from hydraulic methods and that, considering fish as the index species for river habitat analysis, an optimum depth of 1 m should be assumed for deriving information from cross sections. The second important parameter required for extracting the geometric and hydraulic properties of rivers is the selection of an appropriate depth increment, ∆y. In the present research, this parameter was found to be equal to 1 cm. The uncertainty of the environmental discharge evaluation, when allocating water in areas with water scarcity, should be kept as low as possible. The Manning friction coefficient (n is an important factor in river discharge calculation. Using a range of "n" equal to 3 times the standard deviation for the study area, it is shown that the influence of the friction coefficient on the estimation of environmental flow is much less than its influence on the calculation of river discharge.
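
    The friction-coefficient sensitivity argument can be reproduced in outline by propagating a +/-3 sigma range of Manning's n through the Manning equation Q = (1/n) A R^(2/3) S^(1/2); the cross-section numbers below are placeholders, not the study reach.

        # Sensitivity of discharge to Manning's n for an illustrative section.
        def manning_discharge(n, area_m2, hyd_radius_m, slope):
            return (1.0 / n) * area_m2 * hyd_radius_m ** (2.0 / 3.0) * slope ** 0.5

        n_mean, n_sd = 0.035, 0.004
        for n in (n_mean - 3 * n_sd, n_mean, n_mean + 3 * n_sd):
            q = manning_discharge(n, area_m2=8.0, hyd_radius_m=0.9, slope=0.002)
            print(f"n = {n:.3f} -> Q = {q:.2f} m^3/s")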

  18. Evaluating statistical and clinical significance of intervention effects in single-case experimental designs: an SPSS method to analyze univariate data.

    Science.gov (United States)

    Maric, Marija; de Haan, Else; Hogendoorn, Sanne M; Wolters, Lidewij H; Huizenga, Hilde M

    2015-03-01

    Single-case experimental designs are useful methods in clinical research practice to investigate individual client progress. Their proliferation might have been hampered by methodological challenges such as the difficulty applying existing statistical procedures. In this article, we describe a data-analytic method to analyze univariate (i.e., one symptom) single-case data using the common package SPSS. This method can help the clinical researcher to investigate whether an intervention works as compared with a baseline period or another intervention type, and to determine whether symptom improvement is clinically significant. First, we describe the statistical method in a conceptual way and show how it can be implemented in SPSS. Simulation studies were performed to determine the number of observation points required per intervention phase. Second, to illustrate this method and its implications, we present a case study of an adolescent with anxiety disorders treated with cognitive-behavioral therapy techniques in an outpatient psychotherapy clinic, whose symptoms were regularly assessed before each session. We provide a description of the data analyses and results of this case study. Finally, we discuss the advantages and shortcomings of the proposed method. Copyright © 2014. Published by Elsevier Ltd.

  19. Proposed New Method of Interpretation of Infrared Ship Signature Requirements

    NARCIS (Netherlands)

    Neele, F.P.; Wilson, M.T.; Youern, K.

    2005-01-01

    A new method of deriving and defining requirements for the infrared signature of new ships is presented. The current approach is to specify the maximum allowed temperature or radiance contrast of the ship with respect to its background. At present, in most NATO countries, it is the contractor's

  20. [Intra-Articular Application of Tranexamic Acid Significantly Reduces Blood Loss and Transfusion Requirement in Primary Total Knee Arthroplasty].

    Science.gov (United States)

    Lošťák, J; Gallo, J; Špička, J; Langová, K

    2016-01-01

    PURPOSE OF THE STUDY The aim of this prospective study was to investigate the effect of topical application of tranexamic acid (TXA, Exacyl) on the amount of post-operative blood loss, and blood transfusion requirement in patients undergoing primary total knee arthroplasty (TKA). Attention was paid to early complications potentially associated with TXA administration, such as haematoma, wound exudate, or knee swelling. In addition, the economic benefit of TXA treatment was also taken into account. MATERIAL AND METHODS The study included 238 patients (85 men and 153 women) who underwent primary total knee arthroplasty (TKA) at our department between January 2013 and November 2015. A group of 119 patients (41 men and 78 women) received intra-articular TXA injections according to the treatment protocol (TXA group). A control group matched in basic characteristics to the TXA group also consisted of 119 patients. The average age in the TXA group was 69.8 years, and the most frequent indication for TKA surgery was primary knee osteoarthritis (81.5%). In each patient, the post-operative volume of blood lost from drains and the total blood loss including hidden blood loss were recorded, as well as post-operative haemoglobin and haematocrit levels. On discharge of each patient from hospital, the size and site of a haematoma; wound exudate, if present after post-operative day 4; joint swelling; range of motion and early revision surgery, if performed, were evaluated. Requirements of analgesic drugs after surgery were also recorded. RESULTS In the TXA group, blood losses from drains were significantly lower than in the control group (456.7 ± 270.8 vs 640.5 ± 448.2; p = 0.004). The median value for blood losses from drains was lower by 22% and the average value for total blood loss, including hidden losses, was also lower than in the control group (762.4 ± 345.2 ml vs 995.5 ± 457.3 ml). The difference in the total amount of blood loss between the two groups was significant (p = 0

  1. Testing the Difference of Correlated Agreement Coefficients for Statistical Significance

    Science.gov (United States)

    Gwet, Kilem L.

    2016-01-01

    This article addresses the problem of testing the difference between two correlated agreement coefficients for statistical significance. A number of authors have proposed methods for testing the difference between two correlated kappa coefficients, which require either the use of resampling methods or the use of advanced statistical modeling…
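
    A minimal resampling sketch of the problem: because both agreement coefficients are computed on the same subjects, subjects (not ratings) are resampled together. Cohen's kappa and the simulated ratings are illustrative stand-ins for whatever correlated coefficients are being compared.

        # Bootstrap CI for the difference between two correlated kappas.
        import numpy as np

        def cohens_kappa(a, b):
            cats = np.unique(np.concatenate([a, b]))
            po = np.mean(a == b)                     # observed agreement
            pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)
            return (po - pe) / (1 - pe)

        rng = np.random.default_rng(1)
        rater1 = rng.integers(0, 3, 200)
        rater2 = np.where(rng.random(200) < 0.7, rater1, rng.integers(0, 3, 200))
        rater3 = np.where(rng.random(200) < 0.6, rater1, rng.integers(0, 3, 200))

        observed = cohens_kappa(rater1, rater2) - cohens_kappa(rater1, rater3)
        diffs = []
        for _ in range(2000):
            idx = rng.integers(0, 200, 200)          # resample subjects
            diffs.append(cohens_kappa(rater1[idx], rater2[idx])
                         - cohens_kappa(rater1[idx], rater3[idx]))
        lo, hi = np.percentile(diffs, [2.5, 97.5])
        # The difference is "significant" if the CI excludes zero.
        print(f"diff {observed:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")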

  2. Method for calculating required shielding in medical x-ray rooms

    International Nuclear Information System (INIS)

    Karppinen, J.

    1997-10-01

    The new annual radiation dose limits - 20 mSv (previously 50 mSv) for radiation workers and 1 mSv (previously 5 mSv) for other persons - imply that the adequacy of existing radiation shielding must be re-evaluated. In principle, one could assume that the thicknesses of old radiation shields should be increased by about one or two half-value layers in order to comply with the new dose limits. However, the assumptions made in the earlier shielding calculations are highly conservative; the required shielding was often determined by applying the maximum high voltage of the x-ray tube to the whole workload. A more realistic calculation shows that increased shielding is typically not necessary if more practical x-ray tube voltages are used in the evaluation. We have developed a PC-based method for calculating x-ray shielding which is more realistic than the highly conservative method formerly used. The method may be used to evaluate an existing shield for compliance with the new regulations. As examples of these calculations, typical x-ray rooms are considered. The lead and concrete thickness requirements as a function of x-ray tube voltage and workload are also given in tables. (author)
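
    A minimal sketch of the classical broad-beam barrier calculation this kind of method refines: from workload, use, occupancy, distance and dose limit to a required transmission factor B, and from B to a thickness in tenth-value layers (TVLs). The numbers and the TVL value are illustrative placeholders, not the authors' PC implementation.

        # Required barrier thickness from a required transmission factor.
        import math

        def required_thickness_mm(workload_mGy_per_week, use_factor, occupancy,
                                  distance_m, limit_mGy_per_week, tvl_mm):
            # Required transmission B = P * d^2 / (W * U * T)
            b = (limit_mGy_per_week * distance_m ** 2 /
                 (workload_mGy_per_week * use_factor * occupancy))
            if b >= 1.0:
                return 0.0                      # no shielding needed
            n_tvl = math.log10(1.0 / b)         # number of tenth-value layers
            return n_tvl * tvl_mm

        # Tightening the dose limit by a factor of 5 adds log10(5) ~ 0.7 TVL,
        # i.e. roughly two half-value layers, matching the rule of thumb above.
        print(required_thickness_mm(400, 0.25, 1.0, 3.0, 0.02, tvl_mm=0.9))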

  3. Latency in Visionic Systems: Test Methods and Requirements

    Science.gov (United States)

    Bailey, Randall E.; Arthur, J. J., III; Williams, Steven P.; Kramer, Lynda J.

    2005-01-01

    A visionics device creates a pictorial representation of the external scene for the pilot. The ultimate objective of these systems may be to electronically generate a form of Visual Meteorological Conditions (VMC) to eliminate weather or time-of-day as an operational constraint and provide enhancement over actual visual conditions where eye-limiting resolution may be a limiting factor. Empirical evidence has shown that the total system delays or latencies, including those of the imaging sensors and display systems, can critically degrade their utility, usability, and acceptability. Definitions and measurement techniques are offered herein as common test and evaluation methods for latency testing in visionics device applications. Available data indicate very different latency requirements depending upon the piloting task, the role the visionics device plays in that task, and the characteristics of the visionics cockpit display device, including its resolution, field-of-regard, and field-of-view. The least stringent latency requirements will involve Head-Up Display (HUD) applications, where the visionics imagery provides situational information as a supplement to symbology guidance and command information. Conversely, the visionics system latency requirement for a large field-of-view Head-Worn Display application, providing a Virtual-VMC capability from which the pilot will derive visual guidance, will be the most stringent, having a value as low as 20 msec.

  4. Evaluation of methods to estimate the essential amino acids requirements of fish from the muscle amino acid profile

    Directory of Open Access Journals (Sweden)

    Álvaro José de Almeida Bicudo

    2014-03-01

    Full Text Available Many methods to estimate amino acid requirements based on the amino acid profile of fish have been proposed. This study evaluates the methodologies proposed by Meyer & Fracalossi (2005) and by Tacon (1989) to estimate the amino acid requirements of fish, which do not require prior knowledge of the nutritional requirement of a reference amino acid. Data on the amino acid requirements of pacu, Piaractus mesopotamicus, were used to validate the accuracy of those methods. The Meyer & Fracalossi and Tacon methodologies estimated the lysine requirement of pacu at 13 and 23% above the requirement determined using the dose-response method, respectively. The values estimated by both methods lie within the range of requirements determined for other omnivorous fish species, with the Meyer & Fracalossi (2005) method showing better accuracy.
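
    A generic sketch of profile-based estimation, assuming only that the total essential amino acid (EAA) allowance of the diet is known: each EAA requirement is taken proportional to its share of the muscle EAA profile. This illustrates the common idea; the exact Meyer & Fracalossi (2005) and Tacon (1989) formulations differ in detail, and all numbers below are hypothetical.

        # Hypothetical muscle EAA profile (g per 100 g of protein) and a
        # hypothetical total EAA allowance (% of dietary protein).
        muscle_eaa = {
            "lysine": 8.6, "arginine": 6.1, "leucine": 7.5,
            "threonine": 4.4, "methionine": 2.9,
        }
        total_eaa_allowance_pct = 14.0

        profile_sum = sum(muscle_eaa.values())
        for aa, level in muscle_eaa.items():
            req = total_eaa_allowance_pct * level / profile_sum
            print(f"{aa:<10s} estimated requirement: {req:.2f} % of dietary protein")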

  5. KidReporter : a method for engaging children in making a newspaper to gather user requirements

    NARCIS (Netherlands)

    Bekker, M.M.; Beusmans, J.; Keyson, D.V.; Lloyd, P.A.; Bekker, M.M.; Markopoulos, P.; Tsikalkina, M.

    2002-01-01

    We describe a design method, called the KidReporter method, for gathering user requirements from children. Two school classes participated in making a newspaper about a zoo, to gather requirements for the design process of an interactive educational game. The educational game was developed to

  6. Fault tree construction of hybrid system requirements using qualitative formal method

    International Nuclear Information System (INIS)

    Lee, Jang-Soo; Cha, Sung-Deok

    2005-01-01

    When specifying requirements for software controlling hybrid systems and conducting safety analysis, engineers find that requirements are often known only in qualitative terms and that existing fault tree analysis techniques provide little guidance on formulating and evaluating potential failure modes. In this paper, we propose Causal Requirements Safety Analysis (CRSA) as a technique to qualitatively evaluate the causal relationship between software faults and physical hazards. This technique, extending the qualitative formal method process and utilizing information captured in the state trajectory, provides specific guidelines on how to identify failure modes and the relationships among them. Using a simplified electrical power system as an example, we describe the step-by-step procedure of conducting CRSA. Our experience of applying CRSA to perform fault tree analysis on requirements for the Wolsong nuclear power plant shutdown system indicates that CRSA is an effective technique in assisting safety engineers.

  7. Methods for ensuring compliance with regulatory requirements: regulators and operators

    International Nuclear Information System (INIS)

    Fleischmann, A.W.

    1989-01-01

    Some of the methods of ensuring compliance with the regulatory requirements contained in various radiation protection documents, such as regulations and ICRP recommendations, are considered. These include radiation safety officers and radiation safety committees, personnel monitoring services, dissemination of information, inspection services and legislative powers of enforcement. Difficulties in ensuring compliance include outmoded legislation, and financial and personnel constraints.

  8. 42 CFR 84.146 - Method of measuring the power and torque required to operate blowers.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Method of measuring the power and torque required... RESPIRATORY PROTECTIVE DEVICES Supplied-Air Respirators § 84.146 Method of measuring the power and torque.... These are used to facilitate timing. To determine the torque or horsepower required to operate the...

  9. Reduction in requirements for allogeneic blood products: nonpharmacologic methods.

    Science.gov (United States)

    Hardy, J F; Bélisle, S; Janvier, G; Samama, M

    1996-12-01

    Various strategies have been proposed to decrease bleeding and allogeneic transfusion requirements during and after cardiac operations. This article attempts to document the usefulness, or lack thereof, of the nonpharmacologic methods available in clinical practice. Blood conservation methods were reviewed in chronological order, as they become available to patients during the perisurgical period. The literature in support of or against each strategy was reexamined critically. Avoidance of preoperative anemia and adherence to published guidelines for the practice of transfusion are of paramount importance. Intraoperatively, tolerance of low hemoglobin concentrations and use of autologous blood (predonated or harvested before bypass) will reduce allogeneic transfusions. The usefulness of plateletpheresis and retransfusion of shed mediastinal fluid remains controversial. Intraoperatively and postoperatively, maintenance of normothermia contributes to improved hemostasis. Several approaches have been shown to be effective. An efficient combination of methods can reduce, and sometimes abolish, the need for allogeneic blood products after cardiac operations, provided that all those involved in the care of cardiac surgical patients adhere thoughtfully to existing transfusion guidelines.

  10. Adjust the method of the FMEA to the requirements of the aviation industry

    Directory of Open Access Journals (Sweden)

    Andrzej FELLNER

    2015-12-01

    Full Text Available The article presents a summary of current methods used in aviation and rail transport, and proposes an adjustment of the FMEA method to the latest requirements of the airline industry. The authors propose tables of the indicators Zn, Pr and Dt needed to implement the FMEA method of risk analysis, taking into account current developments in aerospace and rail safety. They also propose acceptable limits for the RPN number, which allow threats to be classified.
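
    The RPN arithmetic the article builds on is compact enough to sketch: RPN = Zn x Pr x Dt, with an acceptability limit used to classify threats. The indicator values and the limit below are placeholders, not the authors' proposed aviation-specific tables.

        # Risk Priority Number classification against an acceptability limit.
        def classify(zn, pr, dt, rpn_limit=120):
            rpn = zn * pr * dt
            verdict = ("unacceptable - corrective action required"
                       if rpn > rpn_limit else "acceptable")
            return rpn, verdict

        failure_modes = {
            "sensor drift": (4, 5, 6),    # moderate severity, easy to detect
            "actuator jam": (9, 3, 7),    # severe, hard to detect
        }
        for name, (zn, pr, dt) in failure_modes.items():
            rpn, verdict = classify(zn, pr, dt)
            print(f"{name}: RPN = {rpn} -> {verdict}")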

  11. Significance of roentgenologic and nuclear medicine methods in diagnosis and operative indications of coronary artery disease

    Energy Technology Data Exchange (ETDEWEB)

    Felix, R [Bonn Univ. (F.R. Germany). Radiologische Klinik; Winkler, C [Bonn Univ. (F.R. Germany). Inst. fuer Klinische und Experimentelle Nuklearmedizin; Schaede, A [Bonn Univ. (F.R. Germany). Medizinische Klinik

    1976-03-01

    The significance and technique of roentgenologic and nuclear medicine methods for the evaluation of coronary artery disease and myocardial perfusion are presented. Some methods routinely used in nuclear medicine are briefly discussed with regard to the evaluation of left ventricular function.

  12. Design requirements, criteria and methods for seismic qualification of CANDU power plants

    International Nuclear Information System (INIS)

    Singh, N.; Duff, C.G.

    1979-10-01

    This report describes the requirements and criteria for the seismic design and qualification of systems and equipment in CANDU nuclear power plants. Acceptable methods and techniques for seismic qualification of CANDU nuclear power plants to mitigate the effects or the consequences of earthquakes are also described. (auth)

  13. Method of extracting significant trouble information of nuclear power plants using probabilistic analysis technique

    International Nuclear Information System (INIS)

    Shimada, Yoshio; Miyazaki, Takamasa

    2005-01-01

    In order to analyze and evaluate large amounts of trouble information from overseas nuclear power plants, it is necessary to select the information that is significant in terms of both safety and reliability. In this research, a method was developed for efficiently and simply classifying the degrees of importance of components in terms of safety and reliability, paying attention to the root-cause components appearing in the information. Regarding safety, the reactor core damage frequency (CDF), which is used in the probabilistic analysis of a reactor, was used. Regarding reliability, the automatic plant trip probability (APTP), which is used in the probabilistic analysis of automatic reactor trips, was used. These two aspects were reflected in the development of criteria for classifying the degrees of importance of components. By applying these criteria, a simple method of extracting significant trouble information from overseas nuclear power plants was developed. (author)
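
    A minimal sketch of the two-criterion screening idea: a component is retained as significant if it matters for safety (a CDF-based importance) or for reliability (an APTP-based importance). Thresholds, importance values, and component names are placeholders.

        # Keep a component if either importance measure exceeds its cutoff.
        def screen(components, cdf_cut=1e-6, aptp_cut=1e-3):
            significant = []
            for name, (d_cdf, d_aptp) in components.items():
                if d_cdf >= cdf_cut or d_aptp >= aptp_cut:
                    significant.append(name)
            return significant

        components = {
            "aux feedwater pump": (5e-6, 2e-4),    # safety-significant
            "main feedwater valve": (2e-8, 4e-3),  # trip-significant
            "drain line valve": (1e-9, 1e-5),      # screened out
        }
        print(screen(components))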

  14. A method for risk-informed safety significance categorization using the analytic hierarchy process and bayesian belief networks

    International Nuclear Information System (INIS)

    Ha, Jun Su; Seong, Poong Hyun

    2004-01-01

    A risk-informed safety significance categorization (RISSC) categorizes the structures, systems, and components (SSCs) of a nuclear power plant (NPP) into two or more groups according to their safety significance, using both probabilistic and deterministic insights. In the conventional methods for RISSC, the SSCs are quantitatively categorized according to their importance measures for the initial categorization. The final decisions (categorizations) on SSCs, however, are made qualitatively by an expert panel through discussion and adjustment of opinions, using the probabilistic insights compiled in the initial categorization process and combining them with deterministic insights. Owing to this qualitative and linear decision-making process, the conventional methods have the following demerits: (1) they are very costly in terms of time and labor; (2) it is not easy to reach a final decision when the opinions of the experts are in conflict; and (3) they contain an overlapping process due to the linear paradigm (the categorization is performed twice - first by the engineers who propose the method, and second by the expert panel). In this work, a method for RISSC using the analytic hierarchy process (AHP) and Bayesian belief networks (BBN) is proposed to overcome these demerits and to arrive at a final decision (or categorization) effectively. By using the AHP and BBN, the expert panel takes part in the early stage of the categorization (that is, the quantification process), and the safety significance based on both probabilistic and deterministic insights is quantified. According to that safety significance, SSCs are quantitatively categorized into three categories: high safety significant (Hi), potentially safety significant (Po), or low safety significant (Lo). The proposed method was applied to components such as CC-V073, CV-V530, and SI-V644 in Ulchin Unit
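
    The AHP step of such a method can be sketched generically: derive significance weights from a pairwise comparison matrix via its principal eigenvector and check judgement consistency. The matrix below is a placeholder, not panel data from the paper.

        # AHP priority weights from a 3x3 pairwise comparison matrix.
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],      # criterion 1 vs criteria 1, 2, 3
                      [1/3., 1.0, 2.0],
                      [1/5., 1/2., 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                         # normalized priority weights

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)  # consistency index
        cr = ci / 0.58                        # Saaty's random index for n = 3
        print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))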

  15. Evaluating significance in linear mixed-effects models in R.

    Science.gov (United States)

    Luke, Steven G

    2017-08-01

    Mixed-effects models are being used ever more frequently in the analysis of experimental data. However, in the lme4 package in R the standards for evaluating significance of fixed effects in these models (i.e., obtaining p-values) are somewhat vague. There are good reasons for this, but as researchers who are using these models are required in many cases to report p-values, some method for evaluating the significance of the model output is needed. This paper reports the results of simulations showing that the two most common methods for evaluating significance, using likelihood ratio tests and applying the z distribution to the Wald t values from the model output (t-as-z), are somewhat anti-conservative, especially for smaller sample sizes. Other methods for evaluating significance, including parametric bootstrapping and the Kenward-Roger and Satterthwaite approximations for degrees of freedom, were also evaluated. The results of these simulations suggest that Type 1 error rates are closest to .05 when models are fitted using REML and p-values are derived using the Kenward-Roger or Satterthwaite approximations, as these approximations both produced acceptable Type 1 error rates even for smaller samples.
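
    The paper's simulations are specific to lme4 in R; as a rough Python analogue of the likelihood-ratio approach it discusses, the sketch below compares full and null mixed models fitted by ML with statsmodels (REML likelihoods are not comparable across fixed-effect structures). Data and effect sizes are simulated placeholders.

        # Likelihood-ratio test for a fixed effect in a mixed model.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf
        from scipy import stats

        rng = np.random.default_rng(2)
        groups = np.repeat(np.arange(30), 10)           # 30 subjects x 10 trials
        x = rng.normal(size=300)
        y = 0.4 * x + rng.normal(size=30)[groups] + rng.normal(size=300)
        df = pd.DataFrame({"y": y, "x": x, "g": groups})

        full = smf.mixedlm("y ~ x", df, groups=df["g"]).fit(reml=False)
        null = smf.mixedlm("y ~ 1", df, groups=df["g"]).fit(reml=False)

        lr = 2 * (full.llf - null.llf)
        print("LRT p-value for x:", stats.chi2.sf(lr, df=1))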

  16. Thresholds for statistical and clinical significance in systematic reviews with meta-analytic methods

    DEFF Research Database (Denmark)

    Jakobsen, Janus Christian; Wetterslev, Jorn; Winkel, Per

    2014-01-01

    BACKGROUND: Thresholds for statistical significance when assessing meta-analysis results are being insufficiently demonstrated by traditional 95% confidence intervals and P-values. Assessment of intervention effects in systematic reviews with meta-analysis deserves greater rigour. METHODS: Methodologies for assessing statistical and clinical significance of intervention effects in systematic reviews were considered. Balancing simplicity and comprehensiveness, an operational procedure was developed, based mainly on The Cochrane Collaboration methodology and the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) guidelines. RESULTS: We propose an eight-step procedure for better validation of meta-analytic results in systematic reviews: (1) Obtain the 95% confidence intervals and the P-values from both fixed-effect and random-effects meta-analyses and report the most

  17. Postoperative Stiffness Requiring Manipulation Under Anesthesia Is Significantly Reduced After Simultaneous Versus Staged Bilateral Total Knee Arthroplasty.

    Science.gov (United States)

    Meehan, John P; Monazzam, Shafagh; Miles, Troy; Danielsen, Beate; White, Richard H

    2017-12-20

    adjust for relevant risk factors, the 90-day odds ratio (OR) of undergoing manipulation after simultaneous bilateral TKA was significantly lower than that for unilateral TKA (OR = 0.70; 95% confidence interval [CI], 0.57 to 0.86) and staged bilateral TKA (OR = 0.71; 95% CI, 0.57 to 0.90). Similarly, at 180 days, the odds of undergoing manipulation were significantly lower after simultaneous bilateral TKA than after both unilateral TKA (OR = 0.71; 95% CI, 0.59 to 0.84) and staged bilateral TKA (OR = 0.76; 95% CI, 0.63 to 0.93). The frequency of manipulation was significantly associated with younger age, fewer comorbidities, black race, and the absence of obesity. Although the ORs were small (close to 1), simultaneous bilateral TKA had a significantly decreased rate of stiffness requiring manipulation under anesthesia at 90 days and 180 days after knee replacement compared with that after staged bilateral TKA and unilateral TKA. Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.

  18. Method of administration of PROMIS scales did not significantly impact score level, reliability, or validity

    DEFF Research Database (Denmark)

    Bjorner, Jakob B; Rose, Matthias; Gandek, Barbara

    2014-01-01

    OBJECTIVES: To test the impact of the method of administration (MOA) on score level, reliability, and validity of scales developed in the Patient Reported Outcomes Measurement Information System (PROMIS). STUDY DESIGN AND SETTING: Two nonoverlapping parallel forms each containing eight items from ... were administered by questionnaire (PQ), personal digital assistant (PDA), or personal computer (PC), and a second form by PC, in the same administration. Method equivalence was evaluated through analyses of difference scores, intraclass correlations (ICCs), and convergent/discriminant validity. RESULTS: In difference score analyses, no significant mode differences were found and all confidence intervals were within the prespecified minimal important difference of 0.2 standard deviation. Parallel-forms reliabilities were very high (ICC = 0.85-0.93). Only one across-mode ICC was significantly lower than the same-mode ICC. Tests of validity...

  19. Identification of significant features by the Global Mean Rank test.

    Science.gov (United States)

    Klammer, Martin; Dybowski, J Nikolaj; Hoffmann, Daniel; Schaab, Christoph

    2014-01-01

    With the introduction of omics-technologies such as transcriptomics and proteomics, numerous methods for the reliable identification of significantly regulated features (genes, proteins, etc.) have been developed. Experimental practice requires these tests to successfully deal with conditions such as small numbers of replicates, missing values, non-normally distributed expression levels, and non-identical distributions of features. With the MeanRank test we aimed at developing a test that performs robustly under these conditions, while favorably scaling with the number of replicates. The test proposed here is a global one-sample location test, which is based on the mean ranks across replicates, and internally estimates and controls the false discovery rate. Furthermore, missing data is accounted for without the need of imputation. In extensive simulations comparing MeanRank to other frequently used methods, we found that it performs well with small and large numbers of replicates, feature dependent variance between replicates, and variable regulation across features on simulation data and a recent two-color microarray spike-in dataset. The tests were then used to identify significant changes in the phosphoproteomes of cancer cells induced by the kinase inhibitors erlotinib and 3-MB-PP1 in two independently published mass spectrometry-based studies. MeanRank outperformed the other global rank-based methods applied in this study. Compared to the popular Significance Analysis of Microarrays and Linear Models for Microarray methods, MeanRank performed similar or better. Furthermore, MeanRank exhibits more consistent behavior regarding the degree of regulation and is robust against the choice of preprocessing methods. MeanRank does not require any imputation of missing values, is easy to understand, and yields results that are easy to interpret. The software implementing the algorithm is freely available for academic and commercial use.
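
    A simplified sketch in the spirit of a global mean-rank test: rank the observed changes within each replicate, average the ranks per feature, and compare against the null in which ranks are uniform, followed by Benjamini-Hochberg FDR control. This illustrates the rank-aggregation idea only; the authors' MeanRank test additionally handles missing values and estimates the FDR internally.

        # Mean-rank aggregation across replicates with a normal null.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        n_feat, n_rep = 1000, 4
        X = rng.normal(size=(n_feat, n_rep))       # log-ratios, null features
        X[:20] += 1.5                              # 20 truly up-regulated

        # Rank within each replicate; rank 1 = strongest up-regulation.
        ranks = n_feat + 1 - stats.rankdata(X, axis=0)
        mean_rank = ranks.mean(axis=1)

        # Under the null, each rank is uniform on 1..n_feat, so the mean of
        # n_rep ranks is ~ N((n+1)/2, (n^2-1)/(12*n_rep)) for large n.
        mu = (n_feat + 1) / 2.0
        sd = np.sqrt((n_feat ** 2 - 1) / (12.0 * n_rep))
        p = stats.norm.cdf((mean_rank - mu) / sd)  # small mean rank -> small p

        # Benjamini-Hochberg step-up at FDR 5%.
        order = np.argsort(p)
        thresh = 0.05 * np.arange(1, n_feat + 1) / n_feat
        passed = p[order] <= thresh
        n_sig = passed.nonzero()[0].max() + 1 if passed.any() else 0
        print(f"{n_sig} features called significant")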

  20. Quantitative Nuclear Medicine Imaging: Concepts, Requirements and Methods

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-01-15

    The absolute quantification of radionuclide distribution has been a goal since the early days of nuclear medicine. Nevertheless, the apparent complexity and sometimes limited accuracy of these methods have prevented them from being widely used in important applications such as targeted radionuclide therapy or kinetic analysis. The intricacy of the effects degrading nuclear medicine images and the lack of availability of adequate methods to compensate for these effects have frequently been seen as insurmountable obstacles in the use of quantitative nuclear medicine in clinical institutions. In the last few decades, several research groups have consistently devoted their efforts to the filling of these gaps. As a result, many efficient methods are now available that make quantification a clinical reality, provided appropriate compensation tools are used. Despite these efforts, many clinical institutions still lack the knowledge and tools to adequately measure and estimate the accumulated activities in the human body, thereby using potentially outdated protocols and procedures. The purpose of the present publication is to review the current state of the art of image quantification and to provide medical physicists and other related professionals facing quantification tasks with a solid background of tools and methods. It describes and analyses the physical effects that degrade image quality and affect the accuracy of quantification, and describes methods to compensate for them in planar, single photon emission computed tomography (SPECT) and positron emission tomography (PET) images. The fast paced development of the computational infrastructure, both hardware and software, has made drastic changes in the ways image quantification is now performed. The measuring equipment has evolved from the simple blind probes to planar and three dimensional imaging, supported by SPECT, PET and hybrid equipment. Methods of iterative reconstruction have been developed to allow for

  1. Defining and determining the significance of impacts: concepts and methods

    International Nuclear Information System (INIS)

    Christensen, S.W.; Van Winkle, W.; Mattice, J.S.

    1975-01-01

    The term impact is conceptually and mathematically defined to be the difference in the state or value of an ecosystem with versus without the source of impact. Some resulting problems associated with the measurement of impacts based on comparisons of baseline and operational data are discussed briefly. The concept of a significant adverse impact on a biological system is operationally defined in terms of an adverse impact which, according to a proposed decision-tree, justifies rejection of a project or a change in its site, design, or mode of operation. A gradient of increasing difficulty in the prediction of impacts exists as the scope of the assessment is expanded to consider long-term, far-field impacts with respect to higher levels of biological organization (e.g., communities or ecosystems). The analytical methods available for predicting short-term, near-field impacts are discussed. Finally, the role of simulation modeling as an aid to professional judgment in predicting the long-term, far-field consequences of impacts is considered, and illustrated with an example. (U.S.)

  2. A method for determining customer requirement weights based on TFMF and TLR

    Science.gov (United States)

    Ai, Qingsong; Shu, Ting; Liu, Quan; Zhou, Zude; Xiao, Zheng

    2013-11-01

    'Customer requirements' (CRs) management plays an important role in enterprise systems (ESs) by processing customer-focused information. Quality function deployment (QFD) is one of the main CR analysis methods. Because CR weights are crucial input for QFD, we developed a method for determining CR weights based on the trapezoidal fuzzy membership function (TFMF) and the 2-tuple linguistic representation (TLR). To improve the accuracy of CR weights, we propose to apply the TFMF to describe CR weights so that they can be appropriately represented. Because fuzzy logic is not capable of aggregating information without loss, the TLR model is adopted as well. We first describe the basic concepts of TFMF and TLR and then introduce an approach to compute CR weights. Finally, an example is provided to explain and verify the proposed method.
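
    The TFMF building block is easy to sketch; the linguistic term and break points below are illustrative placeholders, not taken from the paper.

        # Trapezoidal fuzzy membership function.
        def trapezoidal(x, a, b, c, d):
            """Membership rises on [a, b], is 1 on [b, c], falls on [c, d];
            assumes a < b <= c < d."""
            if x <= a or x >= d:
                return 0.0
            if b <= x <= c:
                return 1.0
            if x < b:
                return (x - a) / (b - a)
            return (d - x) / (d - c)

        # Degree to which a customer rating of 6.2 (0-10 scale) belongs to a
        # hypothetical fuzzy set "important", defined as trapezoid (5, 6.5, 8, 9.5).
        print(trapezoidal(6.2, 5.0, 6.5, 8.0, 9.5))   # -> 0.8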

  3. Critical requirements of the SSTR method

    International Nuclear Information System (INIS)

    Gold, R.

    1975-08-01

    Discrepancies have been reported in absolute fission rate measurements observed with Solid State Track Recorders (SSTR) and fission chambers which lie well outside experimental error. As a result of these comparisons, the reliability of the SSTR method has been seriously questioned, and the fission chamber method has been advanced for sole use in absolute fission rate determinations. In view of the absolute accuracy already reported and well documented for the SSTR method, this conclusion is both surprising and unfortunate. Two independent methods are highly desirable. Moreover, these two methods more than complement one another, since certain in-core experiments may be amenable to either but not both techniques. Consequently, one cannot abandon the SSTR method without sacrificing crucial advantages. A critical reappraisal of certain aspects of the SSTR method is offered in the hope that the source of the current controversy can be uncovered and a long-term beneficial agreement between these two methods can therefore be established. (WHK)

  4. Using the Bootstrap Method for a Statistical Significance Test of Differences between Summary Histograms

    Science.gov (United States)

    Xu, Kuan-Man

    2006-01-01

    A new method is proposed to compare statistical differences between summary histograms, which are the histograms summed over a large ensemble of individual histograms. It consists of choosing a distance statistic for measuring the difference between summary histograms and using a bootstrap procedure to calculate the statistical significance level. Bootstrapping is an approach to statistical inference that makes few assumptions about the underlying probability distribution that describes the data. Three distance statistics are compared in this study. They are the Euclidean distance, the Jeffries-Matusita distance and the Kuiper distance. The data used in testing the bootstrap method are satellite measurements of cloud systems called cloud objects. Each cloud object is defined as a contiguous region/patch composed of individual footprints or fields of view. A histogram of measured values over footprints is generated for each parameter of each cloud object and then summary histograms are accumulated over all individual histograms in a given cloud-object size category. The results of statistical hypothesis tests using all three distances as test statistics are generally similar, indicating the validity of the proposed method. The Euclidean distance is determined to be most suitable after comparing the statistical tests of several parameters with distinct probability distributions among three cloud-object size categories. Impacts on the statistical significance levels resulting from differences in the total lengths of satellite footprint data between two size categories are also discussed.
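
    A minimal sketch of the proposed test with the Euclidean distance: sum individual histograms into two summary histograms, measure their distance, and bootstrap the pooled individual histograms to estimate how often a distance at least this large arises by chance. Random counts stand in for the cloud-object data.

        # Bootstrap significance test for the distance between summary histograms.
        import numpy as np

        rng = np.random.default_rng(4)
        bins = 20
        group_a = rng.poisson(5.0, size=(300, bins))   # 300 individual histograms
        group_b = rng.poisson(5.3, size=(220, bins))

        def summary(h):
            s = h.sum(axis=0).astype(float)
            return s / s.sum()                         # normalized summary

        def euclid(p, q):
            return np.sqrt(((p - q) ** 2).sum())

        observed = euclid(summary(group_a), summary(group_b))

        pooled = np.vstack([group_a, group_b])
        count = 0
        for _ in range(2000):
            a = pooled[rng.integers(0, len(pooled), len(group_a))]
            b = pooled[rng.integers(0, len(pooled), len(group_b))]
            if euclid(summary(a), summary(b)) >= observed:
                count += 1
        print("significance level p ~", count / 2000)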

  5. A method for the estimation of the significance of cross-correlations in unevenly sampled red-noise time series

    Science.gov (United States)

    Max-Moerbeck, W.; Richards, J. L.; Hovatta, T.; Pavlidou, V.; Pearson, T. J.; Readhead, A. C. S.

    2014-11-01

    We present a practical implementation of a Monte Carlo method to estimate the significance of cross-correlations in unevenly sampled time series of data, whose statistical properties are modelled with a simple power-law power spectral density. This implementation builds on published methods; we introduce a number of improvements in the normalization of the cross-correlation function estimate and a bootstrap method for estimating the significance of the cross-correlations. A closely related matter is the estimation of a model for the light curves, which is critical for the significance estimates. We present a graphical and quantitative demonstration that uses simulations to show how common it is to get high cross-correlations for unrelated light curves with steep power spectral densities. This demonstration highlights the dangers of interpreting them as signs of a physical connection. We show that by using interpolation and the Hanning sampling window function we are able to reduce the effects of red-noise leakage and to recover steep simple power-law power spectral densities. We also introduce the use of a Neyman construction for the estimation of the errors in the power-law index of the power spectral density. This method provides a consistent way to estimate the significance of cross-correlations in unevenly sampled time series of data.
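
    A compressed sketch of the Monte Carlo logic, evenly sampled for brevity (the paper's point is the extension to uneven sampling): simulate many pairs of unrelated light curves sharing a power-law power spectral density and ask how often their peak cross-correlation exceeds the observed one. The PSD index, lag range, and "observed" peak are placeholders.

        # Chance probability of a cross-correlation peak between two
        # unrelated red-noise light curves (Timmer & Koenig style simulation).
        import numpy as np

        rng = np.random.default_rng(5)

        def powerlaw_lightcurve(n, beta):
            freqs = np.fft.rfftfreq(n, d=1.0)[1:]
            amp = freqs ** (-beta / 2.0)
            spec = amp * (rng.normal(size=amp.size) + 1j * rng.normal(size=amp.size))
            lc = np.fft.irfft(np.concatenate([[0], spec]), n)
            return (lc - lc.mean()) / lc.std()

        def peak_ccf(x, y, max_lag=50):
            peaks = []
            for lag in range(max_lag):
                a = x[:len(x) - lag]
                b = y[lag:]
                peaks.append(abs(np.corrcoef(a, b)[0, 1]))
            return max(peaks)

        n, beta = 500, 2.0
        observed_peak = 0.55                 # placeholder "measured" value
        peaks = [peak_ccf(powerlaw_lightcurve(n, beta),
                          powerlaw_lightcurve(n, beta)) for _ in range(500)]
        print("chance probability:", np.mean(np.array(peaks) >= observed_peak))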

  6. Complying with US and European complaint handling requirements.

    Science.gov (United States)

    Donawa, M E

    1997-09-01

    The importance of customer complaints for providing valuable information on the use of medical devices is clearly reflected in United States (US) and European quality system requirements for handling complaints. However, there are significant differences in US and European complaint handling requirements. This article will discuss those differences and methods for ensuring compliance.

  7. Mission from Mars - a method for exploring user requirements for children in a narrative space

    DEFF Research Database (Denmark)

    Dindler, Christian; Ludvigsen, Martin; Lykke-Olesen, Andreas

    2005-01-01

    In this paper a particular design method is put forward as a supplement to existing descriptive approaches to current practice studies, especially suitable for gathering requirements for the design of children's technology. The Mission from Mars method was applied during the design of an electronic school bag (eBag). The three-hour collaborative session provides a first-hand insight into children's practice in a fun and intriguing way. The method is proposed as a supplement to existing descriptive design methods for interaction design and children.

  8. Organisational reviews - requirements, methods and experience. Progress report 2006

    Energy Technology Data Exchange (ETDEWEB)

    Reiman, T.; Oedewald, P.; Wahlstroem, B. [VTT, Technical Research Centre of Finland (Finland); Rollenhagen, C.; Kahlbom, U. [Maelardalen University (FI)

    2007-04-15

    Organisational reviews are important instruments in the continuous quest for improved performance. In the nuclear field there has been an increasing regulatory interest in organisational performance, because incidents and accidents often point to organisational deficiencies as one of the major precursors. Many methods for organisational reviews have been proposed, but they are mostly based on ad hoc approaches to specific problems. The absence of well-established techniques for organisational reviews has already shown to cause discussions and controversies on different levels. The aim of the OrRe project is to collect the experiences from organisational reviews carried out so far and to reflect them in a theoretical model of organisational performance. Furthermore, the project aims to reflect on the criteria for the definition of the scope and content of organisational reviews. Finally, recommendations will be made for guidance for people participating in organisational reviews. This progress report describes regulatory practices in Finland and Sweden together with some case examples of organizational reviews and assessment in both countries. Some issues of concern are raised and an outline for the next year's work is proposed. Issues of concern include the sufficient depth of the assessment, the required competence in assessments, data and criteria problems, definition of the boundaries of the system to be assessed, and the necessary internal support and organisational maturity required for successful assessments. Finally, plans for next year's work are outlined. (au)

  9. Organisational reviews - requirements, methods and experience. Progress report 2006

    International Nuclear Information System (INIS)

    Reiman, T.; Oedewald, P.; Wahlstroem, B.; Rollenhagen, C.; Kahlbom, U.

    2007-04-01

    Organisational reviews are important instruments in the continuous quest for improved performance. In the nuclear field there has been an increasing regulatory interest in organisational performance, because incidents and accidents often point to organisational deficiencies as one of the major precursors. Many methods for organisational reviews have been proposed, but they are mostly based on ad hoc approaches to specific problems. The absence of well-established techniques for organisational reviews has already shown to cause discussions and controversies on different levels. The aim of the OrRe project is to collect the experiences from organisational reviews carried out so far and to reflect them in a theoretical model of organisational performance. Furthermore, the project aims to reflect on the criteria for the definition of the scope and content of organisational reviews. Finally, recommendations will be made for guidance for people participating in organisational reviews. This progress report describes regulatory practices in Finland and Sweden together with some case examples of organizational reviews and assessment in both countries. Some issues of concern are raised and an outline for the next year's work is proposed. Issues of concern include the sufficient depth of the assessment, the required competence in assessments, data and criteria problems, definition of the boundaries of the system to be assessed, and the necessary internal support and organisational maturity required for successful assessments. Finally, plans for next year's work are outlined. (au)

  10. Testing statistical significance scores of sequence comparison methods with structure similarity

    Directory of Open Access Journals (Sweden)

    Leunissen Jack AM

    2006-10-01

    Full Text Available Abstract Background In the past years the Smith-Waterman sequence comparison algorithm has gained popularity due to improved implementations and rapidly increasing computing power. However, the quality and sensitivity of a database search are not only determined by the algorithm but also by the statistical significance testing for an alignment. The e-value is the most commonly used statistical validation method for sequence database searching. The CluSTr database and the Protein World database have been created using an alternative statistical significance test: a Z-score based on Monte-Carlo statistics. Several papers have described the superiority of the Z-score as compared to the e-value, using simulated data. We were interested in whether this could be validated when applied to existing, evolutionarily related protein sequences. Results All experiments are performed on the ASTRAL SCOP database. The Smith-Waterman sequence comparison algorithm with both e-value and Z-score statistics is evaluated, using ROC, CVE and AP measures. The BLAST and FASTA algorithms are used as reference. We find that two out of three Smith-Waterman implementations with e-value are better at predicting structural similarities between proteins than the Smith-Waterman implementation with Z-score. SSEARCH especially has very high scores. Conclusion The compute-intensive Z-score does not have a clear advantage over the e-value. The Smith-Waterman implementations give generally better results than their heuristic counterparts. We recommend using the SSEARCH algorithm combined with e-values for pairwise sequence comparisons.
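
    The Monte-Carlo Z-score idea itself is simple to sketch: compare the observed similarity score of two sequences with the score distribution obtained after shuffling one of them. difflib's ratio is used here as a self-contained stand-in for the Smith-Waterman score; the sequences are arbitrary examples.

        # Monte-Carlo Z-score from shuffled-sequence similarity scores.
        import random
        from difflib import SequenceMatcher
        from statistics import mean, stdev

        def score(a, b):
            return SequenceMatcher(None, a, b).ratio()

        seq1 = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
        seq2 = "MKTAYIAKQRQISFVKSHFSRQQEERLGLIEVH"

        observed = score(seq1, seq2)
        random.seed(8)
        shuffled_scores = []
        for _ in range(500):
            s = list(seq2)
            random.shuffle(s)             # destroys order, keeps composition
            shuffled_scores.append(score(seq1, "".join(s)))

        z = (observed - mean(shuffled_scores)) / stdev(shuffled_scores)
        print(f"observed {observed:.2f}, Z-score {z:.1f}")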

  11. Performance of methods for estimation of table beet water requirement in Alagoas

    Directory of Open Access Journals (Sweden)

    Daniella P. dos Santos

    Full Text Available ABSTRACT Optimization of water use in agriculture is fundamental, particularly in regions where water scarcity is intense, requiring the adoption of technologies that promote increased irrigation efficiency. The objective of this study was to evaluate evapotranspiration models and to estimate the crop coefficients of beet grown in a drainage lysimeter in the Agreste region of Alagoas. The experiment was conducted at the Campus of the Federal University of Alagoas - UFAL, in the municipality of Arapiraca, AL, between March and April 2014. Crop evapotranspiration (ETc) was estimated in drainage lysimeters and reference evapotranspiration (ETo) by the Penman-Monteith-FAO 56 and Hargreaves-Samani methods. The Hargreaves-Samani method presented a good performance index for ETo estimation compared with the Penman-Monteith-FAO method, indicating that it is adequate for the study area. Beet ETc showed a cumulative demand of 202.11 mm for a cumulative reference evapotranspiration of 152.00 mm. Kc values determined using the Penman-Monteith-FAO 56 and Hargreaves-Samani methods were overestimated in comparison to the Kc values of the FAO-56 standard method. With the obtained results, it is possible to correct the equations of the methods for the region, allowing for adequate irrigation management.
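
    For reference, the Hargreaves-Samani ETo estimate evaluated above needs only temperature data plus extraterrestrial radiation; a minimal sketch of the FAO-56 form of the equation (Python, illustrative inputs):

```python
import math

def eto_hargreaves_samani(tmin, tmax, ra_mm_day):
    """Reference evapotranspiration (mm/day) by Hargreaves-Samani.

    tmin/tmax are daily air temperatures in deg C; ra_mm_day is the
    extraterrestrial radiation expressed as its evaporation equivalent
    (mm/day), e.g. taken from FAO-56 tables for the site and date.
    """
    tmean = (tmin + tmax) / 2.0
    return 0.0023 * ra_mm_day * (tmean + 17.8) * math.sqrt(tmax - tmin)

# Example: tmin = 18 C, tmax = 30 C, Ra ~ 15 mm/day -> about 5 mm/day
print(eto_hargreaves_samani(18.0, 30.0, 15.0))
```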

  12. An adaptive-binning method for generating constant-uncertainty/constant-significance light curves with Fermi-LAT data

    International Nuclear Information System (INIS)

    Lott, B.; Escande, L.; Larsson, S.; Ballet, J.

    2012-01-01

    Here, we present a method enabling the creation of constant-uncertainty/constant-significance light curves with the data of the Fermi-Large Area Telescope (LAT). The adaptive-binning method enables more information to be encapsulated within the light curve than the fixed-binning method. Although primarily developed for blazar studies, it can be applied to any source. Furthermore, this method allows the starting and ending times of each interval to be calculated in a simple and quick way during a first step. The reported mean flux and spectral index (assuming the spectrum is a power-law distribution) in the interval are calculated via the standard LAT analysis during a second step. The absence of major caveats associated with this method has been established by means of Monte-Carlo simulations. We present the performance of this method in determining duty cycles as well as power-density spectra relative to the traditional fixed-binning method.
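
    The core of the adaptive-binning idea can be sketched simply: extend each time bin until the estimated relative flux uncertainty falls below a target. The sketch below is illustrative only; it assumes Poisson counting statistics rather than the full LAT likelihood analysis used by the authors.

```python
def adaptive_bins(event_times, target_rel_unc=0.15, min_counts=10):
    """Group event arrival times into bins of roughly constant relative
    uncertainty, assuming the relative flux uncertainty scales as
    1/sqrt(N) for N counts (a Poisson simplification).

    Returns a list of (t_start, t_stop, n_events) tuples.
    """
    bins, start_idx = [], 0
    for i in range(len(event_times)):
        counts = i - start_idx + 1
        if counts >= min_counts and counts ** -0.5 <= target_rel_unc:
            bins.append((event_times[start_idx], event_times[i], counts))
            start_idx = i + 1
    return bins
```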

  13. Microalbuminuria: Its significance, risk factors and methods of ...

    African Journals Online (AJOL)

    Alasia Datonye

    Male gender, hypertension, high salt (and protein) ... gender and high salt intake are also ... to be associated with a ... method has advantages and disadvantages, and the choice depends ... single voided urine samples to estimate quantitative proteinuria ... in an Urban and Periurban School, Port Harcourt, Rivers State.

  14. Delay generation methods with reduced memory requirements

    DEFF Research Database (Denmark)

    Tomov, Borislav Gueorguiev; Jensen, Jørgen Arendt

    2003-01-01

    Modern diagnostic ultrasound beamformers require delay information for each sample along the image lines. In order to avoid storing large amounts of focusing data, delay generation techniques have to be used. In connection with developing a compact beamformer architecture, recursive algorithms were ... 1) For the best parametric approach, the gate count was 2095, the maximum operation speed was 131.9 MHz, the power consumption at 40 MHz was 10.6 mW, and it requires 4 12-bit words for each image line and channel. 2) For the piecewise-linear approximation, the corresponding numbers are 1125 gates, 184.9 MHz, 7...

  15. 21 CFR 111.320 - What requirements apply to laboratory methods for testing and examination?

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 false What requirements apply to laboratory methods for testing and examination? 111.320 Section 111.320 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CURRENT GOOD MANUFACTURING...

  16. A method of formal requirements analysis for NPP I and C systems based on object-oriented visual modeling with SCR

    International Nuclear Information System (INIS)

    Koo, S. R.; Seong, P. H.

    1999-01-01

    In this work, a formal requirements analysis method for Nuclear Power Plant (NPP) I and C systems is suggested. This method uses the Unified Modeling Language (UML) for modeling systems visually and the Software Cost Reduction (SCR) formalism for checking the system models. Since an object-oriented method can analyze a document through the objects in a real system, UML models that use the object-oriented method are useful for understanding problems and communicating with everyone involved in the project. In order to analyze the requirements more formally, SCR tabular notations are derived from the UML models. To support the flow-through from UML models to SCR specifications, additional syntactic extensions to the UML notation and a converting procedure are defined. The combined method has been applied to the Dynamic Safety System (DSS). From this application, three kinds of errors were detected in the existing DSS requirements.

  17. A Study on a Control Method with a Ventilation Requirement of a VAV System in Multi-Zone

    Directory of Open Access Journals (Sweden)

    Hyo-Jun Kim

    2017-11-01

    Full Text Available The objective of this study was to propose a control method with a ventilation requirement for a variable air volume (VAV) system in multi-zone buildings. In order to control the VAV system in multi-zone, it is essential to control the terminal unit installed in each zone. A VAV terminal unit with the conventional control method, using a fixed minimum air flow, can cause indoor air quality (IAQ) issues depending on the variation in the number of occupants. This research proposes a control method with a ventilation requirement for the VAV terminal unit and AHU in multi-zone. The integrated control method, with an air flow increase model in the VAV terminal unit and AHU and an outdoor air intake rate increase model in the AHU, was based on the indoor CO2 concentration. The conventional and proposed control algorithms were compared through a TRNSYS simulation program. The proposed VAV terminal unit control method satisfies all the conditions of indoor temperature, IAQ, and stratification. An energy comparison with the conventional control method showed that the method not only satisfies indoor thermal comfort, IAQ, and stratification requirements, but also reduces energy consumption.
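
    As an illustration of CO2-based reset logic of this general kind (a simplified stand-in, not the authors' exact increase models), the terminal-unit airflow or AHU outdoor-air fraction can be interpolated between its minimum and maximum as zone CO2 rises:

```python
def co2_reset(co2_ppm, co2_low=800.0, co2_high=1000.0,
              min_frac=0.2, max_frac=1.0):
    """Proportionally reset an airflow (or outdoor-air) fraction from the
    zone CO2 concentration. Below co2_low the minimum fraction is used,
    above co2_high the maximum; in between, linear interpolation.
    The thresholds here are illustrative, not the paper's values.
    """
    if co2_ppm <= co2_low:
        return min_frac
    if co2_ppm >= co2_high:
        return max_frac
    span = (co2_ppm - co2_low) / (co2_high - co2_low)
    return min_frac + span * (max_frac - min_frac)

print(co2_reset(900.0))  # halfway between thresholds -> 0.6
```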

  18. Comparison of Rice Direct Seeding Methods (Mechanical and Manual with Transplanting Method

    Directory of Open Access Journals (Sweden)

    A Eyvani

    2014-04-01

    Full Text Available The main method of rice planting in Iran is transplanting. Due to the poor mechanization of rice production, this method is laborious and costly. The other method is direct seeding in wet lands, which is practiced on about one third of the rice cultivation area of the world. The most important problem in this method is the high labor requirement for weed control. In order to compare different rice planting methods (direct drilling, transplanting, and seed broadcasting), a manually operated rice direct seeder (drum seeder) was designed and fabricated. The research was conducted using a randomized complete block design with three treatments and three replications. Required draft force, field efficiency, effective field capacity, yield, and yield components were measured, and the treatments were compared economically. Results showed that there were significant differences among the treatments in rice yield at the 95% confidence level; the transplanting method had the maximum yield. A higher rice yield was obtained with the direct seeder than with the manual broadcasting method, but the difference in yield between these two methods was not significant, even at the 95% confidence level. The coefficient of variation of seed distribution with direct seeding was more than 20%. The labor and time requirements per hectare were reduced by factors of 7 and 20, respectively, when comparing the newly designed direct seeder with the transplanting method. The direct seeding method had the highest benefit-to-cost ratio in spite of its lower yield. Therefore, this method can be recommended for the rice growing regions.

  19. On detection and assessment of statistical significance of Genomic Islands

    Directory of Open Access Journals (Sweden)

    Chaudhuri Probal

    2008-04-01

    Full Text Available Abstract Background Many of the available methods for detecting Genomic Islands (GIs) in prokaryotic genomes use markers such as transposons, proximal tRNAs, flanking repeats etc., or they use other supervised techniques requiring training datasets. Most of these methods are primarily based on the biases in GC content or codon and amino acid usage of the islands. However, these methods either do not use any formal statistical test of significance or use statistical tests for which the critical values and the P-values are not adequately justified. We propose a method, which is unsupervised in nature and uses Monte-Carlo statistical tests based on randomly selected segments of a chromosome. Such tests are supported by precise statistical distribution theory, and consequently, the resulting P-values are quite reliable for making the decision. Results Our algorithm (named Design-Island, an acronym for Detection of Statistically Significant Genomic Island) runs in two phases. Some 'putative GIs' are identified in the first phase, and those are refined into smaller segments containing horizontally acquired genes in the refinement phase. This method is applied to the Salmonella typhi CT18 genome, leading to the discovery of several new pathogenicity, antibiotic-resistance and metabolic islands that were missed by earlier methods. Many of these islands contain mobile genetic elements such as phage-mediated genes, transposons, integrases and IS elements, confirming their horizontal acquisition. Conclusion The proposed method is based on statistical tests supported by precise distribution theory and reliable P-values, along with a technique for visualizing statistically significant islands. The performance of our method is better than that of many other well-known methods in terms of sensitivity and accuracy, and in terms of specificity it is comparable to other methods.
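
    The unsupervised Monte-Carlo test at the heart of such methods can be illustrated with GC content alone: compare a candidate window's GC fraction with the GC fractions of randomly placed windows of the same length. The sketch below is a strong simplification of the published Design-Island procedure:

```python
import random

def gc_fraction(seq):
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def gc_window_pvalue(genome, start, length, n_samples=1000):
    """Empirical two-sided p-value for the GC content of one window,
    estimated against randomly placed windows of equal length."""
    observed = gc_fraction(genome[start:start + length])
    null = []
    for _ in range(n_samples):
        pos = random.randrange(0, len(genome) - length)
        null.append(gc_fraction(genome[pos:pos + length]))
    mu = sum(null) / len(null)
    extreme = sum(1 for g in null if abs(g - mu) >= abs(observed - mu))
    return (extreme + 1) / (n_samples + 1)
```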

  20. Assessment of tritium breeding requirements for fusion power reactors

    International Nuclear Information System (INIS)

    Jung, J.

    1983-12-01

    This report presents an assessment of tritium-breeding requirements for fusion power reactors. The analysis is based on an evaluation of time-dependent tritium inventories in the reactor system. The method presented can be applied to any fusion system operating in a steady-state mode as well as in a pulsed mode. As an example, the UWMAK-I design was analyzed, and it was found that the startup inventory requirement calculated by the present method differs significantly from previously calculated values. The effect of reactor-parameter changes on the required tritium breeding ratio is also analyzed for a variety of reactor operation scenarios.

  1. Active Radiation Detectors for Use in Space Beyond Low Earth Orbit: Spatial and Energy Resolution Requirements and Methods for Heavy Ion Charge Classification

    Science.gov (United States)

    McBeth, Rafe A.

    Space radiation exposure to astronauts will need to be carefully monitored on future missions beyond low Earth orbit. NASA has proposed an updated radiation risk framework that takes into account a significant amount of radiobiological and heavy-ion track structure information. These models require active radiation detection systems to measure the energy and ion charge (Z). However, current radiation detection systems cannot meet these demands. The aim of this study was to investigate several topics that will help next-generation detection systems meet the NASA objectives. Specifically, this work investigates the required spatial resolution to avoid coincident events in a detector, the effects of energy straggling and conversion of dose from silicon to water, and methods for ion identification (Z) using machine learning. The main results of this dissertation are as follows: 1. Spatial resolution on the order of 0.1 cm is required for active space radiation detectors to have high confidence in identifying individual particles, i.e., to eliminate coincident events. 2. The energy resolution of a detector system will be limited by energy straggling effects and the conversion of dose in silicon to dose in biological tissue (water). 3. Machine learning methods show strong promise for identification of ion charge (Z) with simple detector designs.
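
    As a toy version of the machine-learning charge classification mentioned above (synthetic features with a rough Z-squared energy-deposition scaling, not the dissertation's detector model), a standard classifier can be trained on per-event energy features:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 3000
z = rng.integers(1, 27, size=n)                    # ion charge 1..26
de_dx = z**2 * (1.0 + 0.05 * rng.normal(size=n))   # Bethe-like Z^2 scaling
total_e = z**2 * (5.0 + rng.normal(size=n))        # synthetic deposited energy
X = np.column_stack([de_dx, total_e])

X_tr, X_te, z_tr, z_te = train_test_split(X, z, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, z_tr)
print("test accuracy:", clf.score(X_te, z_te))
```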

  2. Application of the AHP method to analyze the significance of the factors affecting road traffic safety

    Directory of Open Access Journals (Sweden)

    Justyna SORDYL

    2015-06-01

    Full Text Available Over the past twenty years, the number of vehicles registered in Poland has grown rapidly. At the same time, a relatively small increase in the length of the road network has been observed. The limited capacity of the available infrastructure leads to significant congestion and to an increased probability of road accidents. The overall level of road safety depends on many factors - the behavior of road users, infrastructure solutions and the development of automotive technology. Thus a detailed assessment of the importance of individual elements determining road safety is difficult. The starting point is to organize the factors by grouping them into categories which are components of the DVE system (driver - vehicle - environment). In this work, the analytic hierarchy process (AHP) method was proposed to analyze the importance of individual factors affecting road safety. It is one of the multi-criteria methods which allows us to perform a hierarchical analysis of the decision process by means of experts' opinions. Use of the AHP method enabled us to evaluate and rank the factors affecting road safety. This work attempts to link statistical data and surveys in a significance analysis of the elements determining road safety.
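
    For readers unfamiliar with AHP, the weighting step reduces to a principal-eigenvector computation on the reciprocal pairwise-comparison matrix, followed by a consistency check; a compact sketch with illustrative judgments for the DVE categories:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights and consistency ratio for an AHP matrix.

    pairwise[i][j] states how much more important criterion i is than j
    (a reciprocal matrix); weights are the normalized principal eigenvector.
    """
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)                    # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}.get(n, 1.32)  # Saaty's random index
    return w, ci / ri                                       # weights, CR

# Illustrative judgments: driver vs. vehicle vs. environment
w, cr = ahp_weights([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
print(w, cr)  # CR < 0.1 is conventionally taken as acceptable consistency
```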

  3. COMPARISON OF SPATIAL INTERPOLATION METHODS FOR WHEAT WATER REQUIREMENT AND ITS TEMPORAL DISTRIBUTION IN HAMEDAN PROVINCE (IRAN)

    Directory of Open Access Journals (Sweden)

    M. H. Nazarifar

    2014-01-01

    Full Text Available Water is the main constraint for production of agricultural crops. The temporal and spatial variations in water requirement for agricultural products are limiting factors in the study of optimum use of water resources in regional planning and management. However, due to the unfavorable distribution and density of meteorological stations, it is not possible to monitor regional variations precisely. Therefore, there is a need to estimate the evapotranspiration of crops at places where meteorological data are not available and then extend the findings from points of measurement to the regional scale. Geostatistical methods are among those that can be used for estimation of evapotranspiration at the regional scale. The present study investigates different geostatistical methods for temporal and spatial estimation of water requirements for the wheat crop in different periods. The study employs data provided by 16 synoptic and climatology meteorological stations in Hamadan province in Iran. Evapotranspiration for each month and for the growth period was determined using the Penman-Monteith and Thornthwaite methods for different water periods based on the Standardized Precipitation Index (SPI). Among the available geostatistical methods, three were selected and analyzed using GS+ software: kriging, cokriging, and inverse distance weighting. Analysis and selection of the suitable geostatistical method were performed based on two measures, namely Mean Absolute Error (MAE) and Mean Bias Error (MBE). The findings suggest that, in general, during the drought period, kriging is the proper method for estimating water requirements for six months: January, February, April, May, August, and December, while the weighted moving average is a better estimation method for March, June, September, and October. In addition, kriging is the best method for July. In normal conditions, kriging is suitable for April, August, December
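
    Of the interpolators compared above, inverse distance weighting is the simplest to state, and the MAE/MBE criteria follow directly from leave-one-out predictions; a numpy sketch:

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0):
    """Inverse-distance-weighted estimates at query points."""
    d = np.linalg.norm(xy_known[None, :, :] - xy_query[:, None, :], axis=2)
    w = np.maximum(d, 1e-12) ** -power      # guard against zero distance
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

def loo_mae_mbe(xy, z, power=2.0):
    """Leave-one-out Mean Absolute Error and Mean Bias Error for IDW."""
    preds = np.array([
        idw(np.delete(xy, i, 0), np.delete(z, i), xy[i:i + 1], power)[0]
        for i in range(len(z))
    ])
    err = preds - z
    return np.abs(err).mean(), err.mean()
```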

  4. A new spinning reserve requirement forecast method for deregulated electricity markets

    International Nuclear Information System (INIS)

    Amjady, Nima; Keynia, Farshid

    2010-01-01

    Ancillary services are necessary for maintaining the security and reliability of power systems and constitute an important part of trade in competitive electricity markets. Spinning Reserve (SR) is one of the most important ancillary services for preserving power system stability and integrity in response to contingencies and disturbances that continuously occur in the power systems. Hence, an accurate day-ahead forecast of the SR requirement helps the Independent System Operator (ISO) to conduct a reliable and economic operation of the power system. However, the SR signal has complex, non-stationary and volatile behavior along the time domain and depends greatly on system load. In this paper, a new hybrid forecast engine is proposed for SR requirement prediction. The proposed forecast engine has an iterative training mechanism composed of the Levenberg-Marquardt (LM) learning algorithm and a Real Coded Genetic Algorithm (RCGA), implemented on a Multi-Layer Perceptron (MLP) neural network. The proposed forecast methodology is examined by means of real data of the Pennsylvania-New Jersey-Maryland (PJM) electricity market and the California ISO (CAISO) controlled grid. The obtained forecast results are presented and compared with those of other SR forecast methods. (author)
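
    The paper's engine couples LM learning with a real-coded GA on an MLP; as a much simpler stand-in that shows the shape of the problem (lagged SR and load values in, next-day SR out, on synthetic data), a stock MLP regressor suffices:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
days = 500
load = 100 + 10 * np.sin(np.arange(days) / 7) + rng.normal(0, 1, days)
sr = 0.08 * load + rng.normal(0, 0.3, days)      # synthetic SR tracking load

lags = 7                                          # one week of history
X = np.column_stack([sr[i:days - lags + i] for i in range(lags)] +
                    [load[i:days - lags + i] for i in range(lags)])
y = sr[lags:]

model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000,
                     random_state=0).fit(X[:-50], y[:-50])
mape = np.mean(np.abs(model.predict(X[-50:]) - y[-50:]) / y[-50:]) * 100
print(f"holdout MAPE: {mape:.1f}%")
```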

  5. A new spinning reserve requirement forecast method for deregulated electricity markets

    Energy Technology Data Exchange (ETDEWEB)

    Amjady, Nima; Keynia, Farshid [Department of Electrical Engineering, Semnan University, Semnan (Iran)

    2010-06-15

    Ancillary services are necessary for maintaining the security and reliability of power systems and constitute an important part of trade in competitive electricity markets. Spinning Reserve (SR) is one of the most important ancillary services for preserving power system stability and integrity in response to contingencies and disturbances that continuously occur in the power systems. Hence, an accurate day-ahead forecast of the SR requirement helps the Independent System Operator (ISO) to conduct a reliable and economic operation of the power system. However, the SR signal has complex, non-stationary and volatile behavior along the time domain and depends greatly on system load. In this paper, a new hybrid forecast engine is proposed for SR requirement prediction. The proposed forecast engine has an iterative training mechanism composed of the Levenberg-Marquardt (LM) learning algorithm and a Real Coded Genetic Algorithm (RCGA), implemented on a Multi-Layer Perceptron (MLP) neural network. The proposed forecast methodology is examined by means of real data of the Pennsylvania-New Jersey-Maryland (PJM) electricity market and the California ISO (CAISO) controlled grid. The obtained forecast results are presented and compared with those of other SR forecast methods. (author)

  6. Using cognitive modeling for requirements engineering in anesthesiology

    NARCIS (Netherlands)

    Pott, C; le Feber, J

    2005-01-01

    Cognitive modeling is a complexity-reducing method to describe significant cognitive processes under a specified research focus. Here, a cognitive process model for decision making in anesthesiology is presented and applied in requirements engineering. Three decision making situations of

  7. Required doses for projection methods in X-ray diagnosis

    International Nuclear Information System (INIS)

    Hagemann, G.

    1992-01-01

    The ideal dose requirement was stated by Cohen et al. (1981) with a formula based on a parallel beam, maximum quantum yield and the Bucky grid effect, depending on the signal-to-noise ratio and object contrast. This was checked by means of contrast-detail diagrams measured with a hole phantom, and was additionally compared with measurement results obtained with acrylic glass phantoms. The optimal dose requirement is obtained by the closest technically possible approach to the ideal requirement level. Examples are given for X-ray equipment with Gd2O2S screen-film systems, for grid-screen mammography, and for new thoracic examination systems for mass screenings. Finally, a few values concerning the dose requirement, or the analogous time required, are stated for fluoroscopic screening in angiography and interventional radiology, as well as for dentistry and paediatric X-ray diagnostics. (orig./HP) [de

  8. An Efficient Method for Image and Audio Steganography using Least Significant Bit (LSB) Substitution

    Science.gov (United States)

    Chadha, Ankit; Satam, Neha; Sood, Rakshak; Bade, Dattatray

    2013-09-01

    In order to improve data hiding in multimedia data formats such as image and audio and to make the hidden message imperceptible, a novel method for steganography is introduced in this paper. It is based on Least Significant Bit (LSB) manipulation and the inclusion of redundant noise as a secret key in the message. This method is applied to data hiding in images. For data hiding in audio, the Discrete Cosine Transform (DCT) and the Discrete Wavelet Transform (DWT) are both used. All the results displayed prove to be time-efficient and effective. The algorithm is also tested for various numbers of substituted bits; for those bit counts, the Mean Square Error (MSE) and Peak-Signal-to-Noise-Ratio (PSNR) are calculated and plotted. Experimental results show that the stego-image is visually indistinguishable from the original cover-image when the number of substituted bits is small, and the steganography process does not reveal the presence of any hidden message, thus qualifying the criterion of an imperceptible message.
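
    The LSB substitution at the core of such schemes is easy to show concretely; the sketch below hides a byte string in the lowest bit of an image array (one bit per sample, with no encryption or redundant-noise key, so it is a simplification of the authors' method):

```python
import numpy as np

def lsb_embed(cover, payload):
    """Hide payload bytes in the least significant bits of cover samples."""
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()
    if bits.size > flat.size:
        raise ValueError("payload too large for cover")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits  # overwrite bit 0
    return flat.reshape(cover.shape)

def lsb_extract(stego, n_bytes):
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
stego = lsb_embed(img, b"secret")
assert lsb_extract(stego, 6) == b"secret"
```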

  9. Significance evaluation in factor graphs

    DEFF Research Database (Denmark)

    Madsen, Tobias; Hobolth, Asger; Jensen, Jens Ledet

    2017-01-01

    Given the increasing use of factor graph models in genomics and the multiple-testing issues accompanying them, accurate significance evaluation is of great importance. Results Two novel numerical approximations for evaluation of statistical significance are presented: first, a method using importance sampling; second, a method based on a saddlepoint approximation. We develop algorithms to efficiently compute the approximations and compare them to naive sampling and the normal approximation. The individual merits of the methods are analysed ... Conclusions The applicability of the saddlepoint approximation and importance sampling is demonstrated on known models in the factor graph framework. Using the two methods we can substantially improve computational cost without compromising accuracy. This contribution allows analyses of large datasets ...

  10. Evaluation of Irrigation Methods for Highbush Blueberry. I. Growth and Water Requirements of Young Plants

    Science.gov (United States)

    A study was conducted in a new field of northern highbush blueberry (Vaccinium corymbosum L. 'Elliott') to determine the effects of different irrigation methods on growth and water requirements of uncropped plants during the first 2 years after planting. The plants were grown on mulched, raised beds...

  11. Conducting organizational safety reviews - requirements, methods and experience

    International Nuclear Information System (INIS)

    Reiman, T.; Oedewald, P.; Wahlstroem, B.; Rollenhagen, C.; Kahlbom, U.

    2008-03-01

    Organizational safety reviews are part of the safety management process of power plants. They are typically performed after major reorganizations, significant incidents or according to specified review programs. Organizational reviews can also be a part of a benchmarking between organizations that aims to improve work practices. Thus, they are important instruments in proactive safety management and safety culture. Most methods that have been used for organizational reviews are based more on practical considerations than a sound scientific theory of how various organizational or technical issues influence safety. Review practices and methods also vary considerably. The objective of this research is to promote understanding on approaches used in organizational safety reviews as well as to initiate discussion on criteria and methods of organizational assessment. The research identified a set of issues that need to be taken into account when planning and conducting organizational safety reviews. Examples of the issues are definition of appropriate criteria for evaluation, the expertise needed in the assessment and the organizational motivation for conducting the assessment. The study indicates that organizational safety assessments involve plenty of issues and situations where choices have to be made regarding what is considered valid information and a balance has to be struck between focus on various organizational phenomena. It is very important that these choices are based on a sound theoretical framework and that these choices can later be evaluated together with the assessment findings. The research concludes that at its best, the organizational safety reviews can be utilised as a source of information concerning the changing vulnerabilities and the actual safety performance of the organization. In order to do this, certain basic organizational phenomena and assessment issues have to be acknowledged and considered. The research concludes with recommendations on

  12. Conducting organizational safety reviews - requirements, methods and experience

    Energy Technology Data Exchange (ETDEWEB)

    Reiman, T.; Oedewald, P.; Wahlstroem, B. [Technical Research Centre of Finland, VTT (Finland); Rollenhagen, C. [Royal Institute of Technology, KTH, (Sweden); Kahlbom, U. [RiskPilot (Sweden)

    2008-03-15

    Organizational safety reviews are part of the safety management process of power plants. They are typically performed after major reorganizations, significant incidents or according to specified review programs. Organizational reviews can also be a part of a benchmarking between organizations that aims to improve work practices. Thus, they are important instruments in proactive safety management and safety culture. Most methods that have been used for organizational reviews are based more on practical considerations than a sound scientific theory of how various organizational or technical issues influence safety. Review practices and methods also vary considerably. The objective of this research is to promote understanding on approaches used in organizational safety reviews as well as to initiate discussion on criteria and methods of organizational assessment. The research identified a set of issues that need to be taken into account when planning and conducting organizational safety reviews. Examples of the issues are definition of appropriate criteria for evaluation, the expertise needed in the assessment and the organizational motivation for conducting the assessment. The study indicates that organizational safety assessments involve plenty of issues and situations where choices have to be made regarding what is considered valid information and a balance has to be struck between focus on various organizational phenomena. It is very important that these choices are based on a sound theoretical framework and that these choices can later be evaluated together with the assessment findings. The research concludes that at its best, the organizational safety reviews can be utilised as a source of information concerning the changing vulnerabilities and the actual safety performance of the organization. In order to do this, certain basic organizational phenomena and assessment issues have to be acknowledged and considered. The research concludes with recommendations on

  13. Assessing thermochromatography as a separation method for nuclear forensics. Current capability vis-a-vis forensic requirements

    International Nuclear Information System (INIS)

    Hanson, D.E.; Garrison, J.R.; Hall, H.L.

    2011-01-01

    Nuclear forensic science has become increasingly important for global nuclear security. However, many current laboratory analysis techniques are based on methods developed without the imperative for timely analysis that underlies the post-detonation forensics mission requirements. Current analysis of actinides, fission products, and fuel-specific materials requires time-consuming chemical separation coupled with nuclear counting or mass spectrometry. High-temperature gas-phase separations have been used in the past for the rapid separation of newly created elements/isotopes and as a basis for chemical classification of those elements. We are assessing the utility of this method for rapid gas-phase separation to accelerate the separations of radioisotopes germane to post-detonation nuclear forensic investigations. The existing state of the art for thermochromatographic separations, and its applicability to nuclear forensics, will be reviewed. (author)

  14. Puberty menorrhagia Requiring Inpatient Admission

    Directory of Open Access Journals (Sweden)

    AH Khosla

    2010-06-01

    Full Text Available INTRODUCTION: Puberty menorrhagia is a significant health problem in the adolescent age group, and severe cases may require admission and blood transfusion. The aim of this study was to evaluate the causes, associated complications and management of puberty menorrhagia. METHODS: Hospital records of all patients with puberty menorrhagia requiring admission were analyzed for etiology, duration since menarche, duration of bleeding, investigation profile and management. RESULTS: There were 18 patients with puberty menorrhagia requiring hospital admission. The etiology was anovulatory bleeding in 11 patients and bleeding disorders in five, which included idiopathic thrombocytopenic purpura in three and one each with Von Willebrand disease and leukemia. Two patients had hypothyroidism as the cause. Fourteen patients presented with severe anaemia and required blood transfusion. All except one responded to oral hormonal therapy. CONCLUSIONS: Puberty menorrhagia can be associated with severe complications and may require blood transfusion. Although the most common cause is anovulation, bleeding disorders, other medical conditions and other organic causes must be ruled out in any patient with puberty menorrhagia. KEYWORDS: anovulation, bleeding disorder, puberty, menorrhagia, anaemia.

  15. SB certification handout material requirements, test methods, responsibilities, and minimum classification levels for mixture-based specification for flexible base.

    Science.gov (United States)

    2012-10-01

    A handout with tables representing the material requirements, test methods, responsibilities, and minimum classification levels for the mixture-based specification for flexible base, and details on aggregates and test methods employed, along with agency and co...

  16. [Precautions of physical performance requirements and test methods during product standard drafting process of medical devices].

    Science.gov (United States)

    Song, Jin-Zi; Wan, Min; Xu, Hui; Yao, Xiu-Jun; Zhang, Bo; Wang, Jin-Hong

    2009-09-01

    The major aim of this article is to discuss standardization and normalization of product standards for medical devices. We analyze problems related to the physical performance requirements and test methods during the product standard drafting process and make corresponding suggestions.

  17. Defining Requirements and Related Methods for Designing Sensorized Garments

    Directory of Open Access Journals (Sweden)

    Giuseppe Andreoni

    2016-05-01

    Full Text Available Designing smart garments has strong interdisciplinary implications, specifically related to user and technical requirements, but also because of the very different applications they have: medicine, sport and fitness, lifestyle monitoring, workplace and job conditions analysis, etc. This paper aims to discuss some user, textile, and technical issues to be faced in sensorized clothes development. In relation to the user, the main requirements are anthropometric, gender-related, and aesthetic. In terms of these requirements, the user's age, the target application, and fashion trends cannot be ignored, because they determine compliance with the wearable system. Regarding textile requirements, functional factors - also influencing user comfort - are elasticity and washability, while more technical properties are the stability of the chemical agents' effects for preserving the sensors' efficacy and reliability, and assuring the proper duration of the product for the complete life cycle. From the technical side, the physiological issues are the most important: skin conductance, tolerance, irritation, and the effect of sweat and perspiration are key factors for reliable sensing. Other technical features such as battery size and duration, and the form factor of the sensor collector, should be considered, as they affect aesthetic requirements, which have proven to be crucial, as well as comfort and wearability.

  18. Effects of climate change on water requirements and phenological period of major crops in Heihe River basin, China - Based on the accumulated temperature threshold method

    Science.gov (United States)

    Han, Dongmei; Xu, Xinyi; Yan, Denghua

    2016-04-01

    In recent years, global climate change has caused a serious water resources crisis throughout the world. Climate change affects crop water requirements mainly through variations in temperature: a rise in temperature directly affects the growing period and phenological phases of a crop, and thereby changes its water demand quota. Methods including the accumulated temperature threshold and the climatic tendency rate were adopted, compensating for the scarcity of phenological observations, to reveal the response of crop phenology during the growing period. Then, using the Penman-Monteith model and crop coefficients from the United Nations Food & Agriculture Organization (FAO), the paper first estimated crop water requirements in different growth periods and then quantitatively forecast crop water requirements in the Heihe River Basin, China under different climate change scenarios. Results indicate that: (i) the results of crop phenological change established with the accumulated temperature threshold method were in agreement with measured results; (ii) there were many differences in the impacts of climate warming on the water requirements of different crops, with the growth periods of wheat and corn tending to shorten; and (iii) under a temperature increase of 1°C, the start of the wheat growth period occurred 2 days earlier, the length of the total growth period shortened by 2 days, and wheat water requirements increased by 1.4 mm, whereas corn water requirements decreased by almost 0.9 mm, with the start of the corn growth period moving 3 days earlier and the total growth period shortening by 4 days. Therefore, the conflict between water supply and water demand will become more pronounced under future climate warming in the Heihe River Basin, China.
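
    The accumulated temperature threshold idea reduces to: a phenological stage begins on the first day the running sum of degrees above a base temperature crosses a stage-specific threshold. A minimal sketch (base temperature and threshold are illustrative, not the study's calibrated values):

```python
def stage_onset_day(daily_tmean, base_temp=10.0, threshold=150.0):
    """First day index at which accumulated temperature above base_temp
    reaches the stage threshold (degree-days); None if never reached."""
    acc = 0.0
    for day, t in enumerate(daily_tmean):
        acc += max(t - base_temp, 0.0)
        if acc >= threshold:
            return day
    return None

temps = [8, 9, 12, 14, 15, 16, 18, 20, 21, 22, 23, 24, 25, 26, 27, 28]
warmer = [t + 3 for t in temps]                         # uniform +3 C warming
print(stage_onset_day(temps), stage_onset_day(warmer))  # 15 vs. 13: earlier onset
```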

  19. Process qualification and control in electron beams--requirements, methods, new concepts and challenges

    International Nuclear Information System (INIS)

    Mittendorfer, J.; Gratzl, F.; Hanis, D.

    2004-01-01

    In this paper the status of process qualification and control in electron beam irradiation is analyzed in terms of requirements, concepts, methods and challenges for a state-of-the-art process control concept for medical device sterilization. Aspects from process qualification to routine process control are described together with the associated process variables. As a case study the 10 MeV beams at Mediscan GmbH are considered. Process control concepts like statistical process control (SPC) and a new concept to determine process capability are briefly discussed.
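
    The SPC and process-capability concepts mentioned here have a standard textbook form (generic formulas, not Mediscan's actual dose specification):

```python
import numpy as np

def xbar_limits(subgroup_means, sigma_within, n):
    """Center line and 3-sigma control limits for an X-bar chart."""
    center = np.mean(subgroup_means)
    margin = 3.0 * sigma_within / np.sqrt(n)
    return center - margin, center, center + margin

def cpk(data, lsl, usl):
    """Process capability index against lower/upper specification limits
    (e.g. minimum and maximum acceptable absorbed dose)."""
    mu, sigma = np.mean(data), np.std(data, ddof=1)
    return min(usl - mu, mu - lsl) / (3.0 * sigma)

doses = np.random.default_rng(2).normal(27.0, 0.8, 200)  # kGy, synthetic
print(cpk(doses, lsl=25.0, usl=40.0))  # Cpk > 1.33 is commonly deemed capable
```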

  20. Flood control design requirements and flood evaluation methods of inland nuclear power plant

    International Nuclear Information System (INIS)

    Zhang Ailing; Wang Ping; Zhu Jingxing

    2011-01-01

    The effect of flooding is one of the key safety and environmental factors in inland nuclear power plant siting. To date, laws and standard systems have been established in China for nuclear power plant site selection and flood control requirements. In this paper, the flood control standards of China and other countries are introduced. Several inland nuclear power plants are taken as examples to discuss the related flood evaluation methods in detail. Suggestions are also put forward. (authors)

  1. Application of spatial methods to identify areas with lime requirement in eastern Croatia

    Science.gov (United States)

    Bogunović, Igor; Kisic, Ivica; Mesic, Milan; Zgorelec, Zeljka; Percin, Aleksandra; Pereira, Paulo

    2016-04-01

    With acid soils making up more than 50% of all agricultural land in Croatia, soil acidity is recognized as a major problem. Low soil pH leads to a series of negative phenomena in plant production, and therefore liming, recommended on the basis of soil analysis, is a compulsory measure for the reclamation of acid soils. The need for liming is often erroneously determined only on the basis of soil pH, because the determination of cation exchange capacity, hydrolytic acidity and base saturation is a major cost to producers. Therefore, in Croatia, as in some other countries, the amount of liming material needed to ameliorate acid soils is calculated from their hydrolytic acidity. For this research, several interpolation methods were tested to identify the best spatial predictor of hydrolytic acidity. The purpose of this study was to test several interpolation methods to identify the best spatial predictor of hydrolytic acidity, and to determine the possibility of using multivariate geostatistics in order to reduce the number of samples needed for determining hydrolytic acidity, all while ensuring that the accuracy of the spatial distribution of the liming requirement is not significantly reduced. Soil pH (in KCl) and hydrolytic acidity (Y1) were determined in 1004 samples (from 0-30 cm) randomly collected in agricultural fields near Orahovica in eastern Croatia. This study tested 14 univariate interpolation models (part of the ArcGIS software package) in order to provide the most accurate spatial map of hydrolytic acidity on the basis of all samples (Y1 100%) and of datasets with 15% (Y1 85%), 30% (Y1 70%) and 50% fewer samples (Y1 50%). In parallel with the univariate interpolation methods, the precision of the spatial distribution of Y1 was tested by the co-kriging method with exchangeable acidity (pH in KCl) as a covariate. The soils in the study area had an average pH (KCl) of 4.81 and an average Y1 of 10.52 cmol+ kg-1. These data suggest that liming is necessary

  2. Guidance and methods for satisfying low specific activity material and surface contaminated object regulatory requirements

    International Nuclear Information System (INIS)

    Pope, R.B.; Shappert, L.B.; Michelhaugh, R.D.; Boyle, R.W.; Easton, E.P.; Coodk, J.R.

    1998-01-01

    The U.S. Department of Transportation (DOT) and the U.S. Nuclear Regulatory Commission (NRC) have prepared a comprehensive set of draft guidance for shippers and inspectors to use when applying the newly imposed regulatory requirements for low specific activity (LSA) material and surface contaminated objects (SCOs). These requirements represent significant departures in some areas from the manner in which these materials and objects were regulated by earlier versions of the regulations. The proper interpretation and application of the regulatory criteria can require a fairly complex set of decisions to be made. To assist those applying these regulatory requirements, a detailed set of logic-flow diagrams representing decisions related to multiple factors was prepared and included in the draft report for comment on Categorizing and Transporting Low Specific Activity Materials and Surface Contaminated Objects (DOT/NRC, 1997). These logic-flow diagrams, as developed, are specific to the U.S. regulations, but were readily adaptable to the IAEA regulations. The diagrams have been modified accordingly and tied directly to specific paragraphs in IAEA Safety Series No. 6. This paper provides the logic-flow diagrams adapted to the IAEA regulations, and demonstrates how these diagrams can be used to assist consignors and inspectors in assessing the compliance of shipments with the LSA material and SCO regulatory requirements. (authors)

  3. Statistical Significance for Hierarchical Clustering

    Science.gov (United States)

    Kimes, Patrick K.; Liu, Yufeng; Hayes, D. Neil; Marron, J. S.

    2017-01-01

    Summary Cluster analysis has proved to be an invaluable tool for the exploratory and unsupervised analysis of high dimensional datasets. Among methods for clustering, hierarchical approaches have enjoyed substantial popularity in genomics and other fields for their ability to simultaneously uncover multiple layers of clustering structure. A critical and challenging question in cluster analysis is whether the identified clusters represent important underlying structure or are artifacts of natural sampling variation. Few approaches have been proposed for addressing this problem in the context of hierarchical clustering, for which the problem is further complicated by the natural tree structure of the partition, and the multiplicity of tests required to parse the layers of nested clusters. In this paper, we propose a Monte Carlo based approach for testing statistical significance in hierarchical clustering which addresses these issues. The approach is implemented as a sequential testing procedure guaranteeing control of the family-wise error rate. Theoretical justification is provided for our approach, and its power to detect true clustering structure is illustrated through several simulation studies and applications to two cancer gene expression datasets. PMID:28099990
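
    The Monte Carlo core of such a test translates directly into code: compare a two-cluster strength statistic on the data with its distribution under a single Gaussian null fitted to the data (a SigClust-style simplification of the paper's sequential procedure):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def cluster_index(X):
    """Two-cluster index: within-cluster sum of squares over total sum of
    squares (smaller values mean stronger two-cluster structure)."""
    _, labels = kmeans2(X, 2, minit="++", seed=0)
    total = ((X - X.mean(axis=0)) ** 2).sum()
    within = sum(((X[labels == k] - X[labels == k].mean(axis=0)) ** 2).sum()
                 for k in (0, 1))
    return within / total

def cluster_pvalue(X, n_sim=200):
    """Monte Carlo p-value against a Gaussian null with the data's own
    mean and covariance."""
    obs = cluster_index(X)
    mean, cov = X.mean(axis=0), np.cov(X.T)
    rng = np.random.default_rng(0)
    null = [cluster_index(rng.multivariate_normal(mean, cov, len(X)))
            for _ in range(n_sim)]
    return (1 + sum(ci <= obs for ci in null)) / (n_sim + 1)
```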

  4. Architecturally Significant Requirements Identification, Classification and Change Management for Multi-tenant Cloud-Based Systems

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Probst, Christian W.

    2017-01-01

    We have presented a framework for requirements classification and change management focusing on distributed Platform as a Service (PaaS) and Software as a Service (SaaS) systems, as well as complex software ecosystems that are built using PaaS and SaaS, such as Tools as a Service (TaaS). We have demonstrated...

  5. Minerals Intake Distributions in a Large Sample of Iranian at-Risk Population Using the National Cancer Institute Method: Do They Meet Their Requirements?

    Science.gov (United States)

    Heidari, Zahra; Feizi, Awat; Azadbakht, Leila; Sarrafzadegan, Nizal

    2015-01-01

    Minerals are required for the body's normal function. The current study assessed the intake distribution of minerals and estimated the prevalence of inadequacy and excess among a representative sample of healthy middle-aged and elderly Iranian people. In this cross-sectional study, the second follow-up to the Isfahan Cohort Study (ICS), 1922 generally healthy people aged 40 and older were investigated. Dietary intakes were collected using 24-hour recalls and two or more consecutive food records. The distribution of mineral intake was estimated using the traditional (averaging dietary intake days) and National Cancer Institute (NCI) methods, and the results obtained from the two methods were compared. The prevalence of mineral intake inadequacy or excess was estimated using the estimated average requirement (EAR) cut-point method, the probability approach and the tolerable upper intake levels (UL). There were remarkable differences between values obtained using the traditional and NCI methods, particularly in the lower and upper percentiles of the estimated intake distributions. A high prevalence of inadequacy of magnesium (50 - 100%), calcium (21 - 93%) and zinc (30 - 55% for males > 50 years) was observed. Significant gender differences were found regarding inadequate intakes of calcium (21 - 76% for males vs. 45 - 93% for females), magnesium (92% vs. 100%), iron (0 vs. 15% for the age group 40 - 50 years) and zinc (29 - 55% vs. 0%) (all p < 0.05). Severely imbalanced intakes of magnesium, calcium and zinc were observed among the middle-aged and elderly Iranian population. Nutritional interventions and population-based education to improve healthy diets among the studied at-risk population are needed.

  6. Current lipid extraction methods are significantly enhanced adding a water treatment step in Chlorella protothecoides.

    Science.gov (United States)

    Ren, Xiaojie; Zhao, Xinhe; Turcotte, François; Deschênes, Jean-Sébastien; Tremblay, Réjean; Jolicoeur, Mario

    2017-02-11

    The water treatment helps the subsequent release of intracellular lipids in the second extraction step, thus improving the global lipid extraction yield. In addition, the water treatment positively modifies the intracellular lipid class ratios of the final extract, in which the TAG ratio is significantly increased without changes in the fatty acid composition. The novel method thus provides an efficient way to improve the lipid extraction yield of existing methods, as well as selectively favoring TAG, a lipid of the utmost interest for biodiesel production.

  7. Rubidium-strontium method

    International Nuclear Information System (INIS)

    Dubansky, A.

    1980-01-01

    The rubidium-strontium geological dating method is based on the determination of the Rb and Sr isotope ratios in rocks, mainly using mass spectrometry. The method is only practical for silicate minerals and rocks, potassium feldspars and slates. Also described is the rubidium-strontium isochron method. This, however, requires a significant amount of experimental data and the analysis of large quantities of samples, often of the order of tons. The results of rubidium-strontium dating of geological formations in the Czech Socialist Republic are tabulated. (M.S.)
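
    The isochron relation underlying the method is standard and worth stating: the measured ratios of cogenetic samples fall on a line whose slope records the age,

```latex
\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}}
  = \left(\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}}\right)_{0}
  + \frac{^{87}\mathrm{Rb}}{^{86}\mathrm{Sr}}\,\bigl(e^{\lambda t}-1\bigr)
```

    where λ is the ⁸⁷Rb decay constant; samples plotting on a common line (the isochron) share the age t, obtained from the slope e^(λt) − 1.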

  8. Methods for significance testing of categorical covariates in logistic regression models after multiple imputation: power and applicability analysis

    NARCIS (Netherlands)

    Eekhout, I.; Wiel, M.A. van de; Heymans, M.W.

    2017-01-01

    Background. Multiple imputation is a recommended method to handle missing data. For significance testing after multiple imputation, Rubin’s Rules (RR) are easily applied to pool parameter estimates. In a logistic regression model, to consider whether a categorical covariate with more than two levels

  9. Significance of blood examination in radiation workers

    International Nuclear Information System (INIS)

    Mori, Hirofumi; Nakamura, Shinobu; Ando, Atsushi; Kojima, Kazuhiko; Kikuta, Yoko.

    1978-01-01

    Blood examinations performed over the past 3 years revealed that the influence of chronic exposure to extremely small amounts of radiation (an average of 5 mrem/day) on peripheral blood is not detectable. However, the blood examination, which is prescribed at least twice a year by law, only determines whether the results are within a normal range. Therefore, even though the influence of a large amount of radiation can be detected, the examination has little significance as a means of monitoring chronic exposure to extremely small amounts of radiation. If the blood examination is to be used for detecting exposure, it is important to compare the results with previous ones from the same individuals. It is also necessary to increase the number of examinations and to study the blood in more detail. Before that, however, the standard of evaluation should be defined more clearly, because of its present obscurity. The present blood examination is useful for managing health in general, but it is not good as a monitor of chronic exposure to extremely small amounts of radiation. Therefore, a routine biological method that can monitor radiation more precisely is required. (Namekawa, K.)

  10. Dietary energy requirements of young adult men, determined by using the doubly labeled water method

    International Nuclear Information System (INIS)

    Roberts, S.B.; Heyman, M.B.; Evans, W.J.; Fuss, P.; Tsay, R.; Young, V.R.

    1991-01-01

    The authors examined the hypothesis that current recommendations on dietary energy requirements may underestimate the total energy needs of young adult men, by measuring total energy expenditure (TEE) and resting energy expenditure (REE) in 14 weight-maintaining healthy subjects leading unrestricted lives. TEE and body composition were measured by using ²H₂¹⁸O, and REE was measured by using indirect calorimetry. All subjects had sedentary full-time occupations and participated in strenuous leisure activities for 34 ± 6 (SE) min/d. TEE and REE were 14.61 ± 0.76 and 7.39 ± 0.26 MJ/d, respectively, and 202 ± 2 and 122 ± 2 kJ.kg-1.d-1. There were significant relationships between TEE and both body fat-free mass (r = 0.732, P < 0.005) and measured REE (r = 0.568, P < 0.05). Measured TEE:REE values were significantly higher than the recommended energy requirement (1.98 ± 0.09, compared with 1.55 or 1.67, P < 0.005). These results are consistent with the suggestion that the current recommended energy intake for young adult men may underestimate total energy needs.

  11. A new method to detect significant basal body temperature changes during a woman's menstrual cycle.

    Science.gov (United States)

    Freundl, Günter; Frank-Herrmann, Petra; Brown, Simon; Blackwell, Leonard

    2014-10-01

    To compare the BBT shift day identified by a computer programme based on Trigg's tracking system (TTS) from daily records of basal body temperature (BBT) values (the TTS transition day) with the BBT shift day identified from the same records using the Sensiplan® symptothermal method of natural family planning. A computer programme was written to display the daily BBT readings for 364 menstrual cycles from 51 women aged 24 to 35 years, obtained from the German Natural Family Planning (NFP) database. The TTS transition day identified from each record was then compared with the BBT shift day estimated from the same record by the Sensiplan® method. Total agreement between the methods was obtained for 81% (294/364) of the cycles, and 18% (67) of the cycles differed by ±1 day. For the 364 pairs of values distributed among the 51 women, the medians of the differences between the TTS transition day and the Sensiplan® initial day of the BBT rise (shift day) were not significantly different (χ² = 65.28, df = 50, p = 0.07205). The advantages of the tracking signal algorithm are that in many cases it was possible to identify the BBT shift day on that very day - rather than only some days later - and to estimate the probability that a transition had occurred from the different values of the tracking signal.
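
    Trigg's tracking signal itself is a classical monitor of forecast bias: the exponentially smoothed forecast error divided by the exponentially smoothed absolute error. A sketch of how it could flag a sustained BBT rise (the smoothing constant, threshold and minimum rise are illustrative, not the programme's settings):

```python
def trigg_transition_day(temps, alpha=0.2, threshold=0.5, min_rise=0.05):
    """First day index at which Trigg's tracking signal exceeds the
    threshold while the smoothed error also exceeds min_rise (deg C),
    i.e. forecasts are persistently too low, signalling an upward BBT
    shift. Returns None if no shift is detected."""
    forecast = temps[0]
    smoothed_err = smoothed_abs = 0.0
    for day, t in enumerate(temps[1:], start=1):
        err = t - forecast
        smoothed_err = alpha * err + (1 - alpha) * smoothed_err
        smoothed_abs = alpha * abs(err) + (1 - alpha) * smoothed_abs
        if (smoothed_abs > 0 and smoothed_err > min_rise
                and smoothed_err / smoothed_abs > threshold):
            return day
        forecast = alpha * t + (1 - alpha) * forecast  # exponential smoothing
    return None

bbt = [36.4, 36.45, 36.4, 36.42, 36.41, 36.7, 36.75, 36.8, 36.78, 36.82]
print(trigg_transition_day(bbt))  # -> 5, the day of the upward shift
```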

  12. METHODS FOR DETERMINING AGITATOR MIXING REQUIREMENTS FOR A MIXING & SAMPLING FACILITY TO FEED WTP (WASTE TREATMENT PLANT)

    Energy Technology Data Exchange (ETDEWEB)

    GRIFFIN PW

    2009-08-27

    The following report is a summary of work conducted to evaluate the ability of existing correlative techniques and alternative methods to accurately estimate impeller speed and power requirements for mechanical mixers proposed for use in a mixing and sampling facility (MSF). The proposed facility would accept high level waste sludges from Hanford double-shell tanks and feed uniformly mixed high level waste to the Waste Treatment Plant. Numerous methods are evaluated and discussed, and resulting recommendations provided.

  13. Difference in method of administration did not significantly impact item response

    DEFF Research Database (Denmark)

    Bjorner, Jakob B; Rose, Matthias; Gandek, Barbara

    2014-01-01

    PURPOSE: To test the impact of method of administration (MOA) on the measurement characteristics of items developed in the Patient-Reported Outcomes Measurement Information System (PROMIS). METHODS: Two non-overlapping parallel 8-item forms from each of three PROMIS domains (physical function, ...) were administered by interactive voice response (IVR), paper questionnaire (PQ), personal digital assistant (PDA), or personal computer (PC) on the Internet, and a second form by PC, in the same administration. Structural invariance, equivalence of item responses, and measurement precision were evaluated using confirmatory factor analysis and item response theory methods. RESULTS: Multigroup analyses found no significant differences in item response levels in IVR, PQ, or PDA administration as compared to PC. Availability of large item response theory-calibrated PROMIS item banks allowed for innovations in study design and analysis.

  14. Identifying significant temporal variation in time course microarray data without replicates

    Directory of Open Access Journals (Sweden)

    Porter Weston

    2009-03-01

    Full Text Available Abstract Background An important component of time course microarray studies is the identification of genes that demonstrate significant time-dependent variation in their expression levels. Until recently, available methods for performing such significance tests required replicates of individual time points. This paper describes a replicate-free method that was developed as part of a study of the estrous cycle in the rat mammary gland, in which no replicate data were collected. Results A temporal test statistic is proposed that is based on the degree to which data are smoothed when fit by a spline function. An algorithm is presented that uses this test statistic together with a false discovery rate method to identify genes whose expression profiles exhibit significant temporal variation. The algorithm is tested on simulated data and compared with another recently published replicate-free method. The simulated data consist both of genes with known temporal dependencies and of genes from a null distribution. The proposed algorithm identifies a larger percentage of the time-dependent genes for a given false discovery rate. Use of the algorithm in a study of the estrous cycle in the rat mammary gland resulted in the identification of genes exhibiting distinct circadian variation. These results were confirmed in follow-up laboratory experiments. Conclusion The proposed algorithm provides a new approach for identifying expression profiles with significant temporal variation without relying on replicates. When compared with a recently published algorithm on simulated data, the proposed algorithm appears to identify a larger percentage of time-dependent genes for a given false discovery rate. The development of the algorithm was instrumental in revealing the presence of circadian variation in the virgin rat mammary gland during the estrous cycle.
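
    A permutation flavor of the spline-based statistic can be sketched directly: fit a smoothing spline to the profile, measure the variance it explains, and compare against time-permuted versions (a simplification of the published false-discovery-rate procedure):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def smoothness_stat(times, expr, s=1.0):
    """Fraction of variance explained by a smoothing spline fit."""
    fit = UnivariateSpline(times, expr, s=s)(times)
    total = ((expr - expr.mean()) ** 2).sum()
    return 1.0 - ((expr - fit) ** 2).sum() / total

def temporal_pvalue(times, expr, n_perm=500, s=1.0):
    """Permutation p-value: is the profile smoother in time than chance?"""
    rng = np.random.default_rng(0)
    obs = smoothness_stat(times, expr, s)
    null = [smoothness_stat(times, rng.permutation(expr), s)
            for _ in range(n_perm)]
    return (1 + sum(v >= obs for v in null)) / (n_perm + 1)
```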

  15. Anticipating requirements changes-using futurology in requirements elicitation

    OpenAIRE

    Pimentel, João Henrique; Santos, Emanuel; Castro, Jaelson; Franch Gutiérrez, Javier

    2012-01-01

    It is well known that requirements changes in later phases of software development are a major source of software defects and costs; hence the need for techniques to control or reduce the number of changes during software development projects. The authors advocate the use of foresight methods as a valuable input to requirements elicitation, with the potential to decrease the number of changes that would be required after deployment by anticipating them. In this paper, the authors define a pr...

  16. The Paradox of "Structured" Methods for Software Requirements Management: A Case Study of an e-Government Development Project

    Science.gov (United States)

    Conboy, Kieran; Lang, Michael

    This chapter outlines the alternative perspectives of "rationalism" and "improvisation" within information systems development and describes the major shortcomings of each. It then discusses how these shortcomings manifested themselves within an e-government case study where a "structured" requirements management method was employed. Although this method was very prescriptive and firmly rooted in the "rational" paradigm, it was observed that users often resorted to improvised behaviour, such as privately making decisions on how certain aspects of the method should or should not be implemented.

  17. Website-based PNG image steganography using the modified Vigenere Cipher, least significant bit, and dictionary based compression methods

    Science.gov (United States)

    Rojali; Salman, Afan Galih; George

    2017-08-01

    Along with the development of information technology, various adverse and hard-to-avoid actions are emerging as well. One such action is data theft. Therefore, this study discusses cryptography and steganography, which aim to overcome these problems. This study uses the Modified Vigenère Cipher, Least Significant Bit, and Dictionary Based Compression methods. To assess performance, the Peak Signal to Noise Ratio (PSNR) method is used as an objective measure and the Mean Opinion Score (MOS) method as a subjective measure; the performance of this study is also compared to other methods such as Spread Spectrum and Pixel Value Differencing. After comparison, it can be concluded that this study provides better performance than the other methods (Spread Spectrum and Pixel Value Differencing), with a range of MSE values (0.0191622-0.05275) and PSNR (60.909 to 65.306) for a hidden file size of 18 kb, and a MOS value range (4.214 to 4.722), i.e., image quality approaching very good.
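
    As an illustration of two of the building blocks named above, a sketch of a byte-level Vigenère cipher plus 1-bit LSB embedding in an RGB PNG. The paper's specific cipher modification and its dictionary-based compression step are not reproduced, and Pillow is assumed as the image library.

    from PIL import Image
    import numpy as np

    def vigenere(data: bytes, key: bytes, decrypt: bool = False) -> bytes:
        # Plain byte-wise Vigenere; the paper uses a modified variant.
        sign = -1 if decrypt else 1
        return bytes((b + sign * key[i % len(key)]) % 256
                     for i, b in enumerate(data))

    def embed(cover_png, payload: bytes, key: bytes, out_png):
        img = np.array(Image.open(cover_png).convert("RGB"))
        secret = vigenere(len(payload).to_bytes(4, "big") + payload, key)
        bits = np.unpackbits(np.frombuffer(secret, dtype=np.uint8))
        flat = img.reshape(-1)
        if bits.size > flat.size:
            raise ValueError("payload too large for this cover image")
        flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits  # overwrite LSBs
        Image.fromarray(flat.reshape(img.shape)).save(out_png)  # PNG: lossless

    def extract(stego_png, key: bytes) -> bytes:
        flat = np.array(Image.open(stego_png).convert("RGB")).reshape(-1)
        header = vigenere(np.packbits(flat[:32] & 1).tobytes(), key, True)
        length = int.from_bytes(header, "big")
        data = np.packbits(flat[:32 + 8 * length] & 1).tobytes()
        return vigenere(data, key, True)[4:]  # drop the length prefix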

  18. Robust design requirements specification: a quantitative method for requirements development using quality loss functions

    DEFF Research Database (Denmark)

    Pedersen, Søren Nygaard; Christensen, Martin Ebro; Howard, Thomas J.

    2016-01-01

    Product requirements serve many purposes in the product development process. Most importantly, they are meant to capture and facilitate product goals and acceptance criteria, as defined by stakeholders. Accurately communicating stakeholder goals and acceptance criteria can be challenging and more...
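
    The abstract is truncated here, but the quadratic (Taguchi-style) quality loss function that such methods build on is standard; a minimal illustration, with invented numbers:

    def quality_loss(y, target, a0, d0):
        # L(y) = k * (y - target)^2, with k calibrated from one known
        # point: a deviation of d0 from the target costs a0.
        k = a0 / d0 ** 2
        return k * (y - target) ** 2

    # Example (invented): target 10.0 mm, a 0.5 mm deviation costs $40,
    # so a part measuring 10.2 mm carries an expected loss of $6.40.
    print(quality_loss(10.2, target=10.0, a0=40.0, d0=0.5))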

  19. Aeronautical Industry Requirements for Titanium Alloys

    Science.gov (United States)

    Bran, D. T.; Elefterie, C. F.; Ghiban, B.

    2017-06-01

    The project presents the requirements imposed on aviation components made from titanium-based alloys. A significant portion of aircraft pylons are manufactured from titanium alloys. Strength, weight, and reliability are the primary factors to consider in aircraft structures, and they determine the requirements to be met by any material used to construct or repair the aircraft. Many forces and structural stresses act on an aircraft both in flight and when static, and this thesis describes the environmental factors, conditions of external aggression, mechanical characteristics and loadings that must be satisfied simultaneously by a Ti-based alloy, compared to other classes of aviation alloys (e.g., Inconel superalloys and aluminum alloys). For this alloy class, the requirements concern strength-to-weight ratio, reliability, corrosion resistance, thermal expansion and so on. These characteristics additionally continue to provide new opportunities for advanced manufacturing methods.

  20. Replacing reserve requirements

    OpenAIRE

    Edward J. Stevens

    1993-01-01

    An examination of the fading significance of the Federal Reserve System's reserve requirements and the recent flowering of required clearing balances, a rapidly growing feature of Reserve Bank operations.

  1. Asymptomatic bacteriuria. Clinical significance and management.

    Science.gov (United States)

    Raz, Raul

    2003-10-01

    The clinical significance and management of asymptomatic bacteriuria (ASB) differs according to different groups of patients. ASB requires antibiotic treatment in pregnant women, children aged 5-6 years and prior to invasive genitourinary procedures. However, there is a consensus that ASB in the elderly, healthy school girls and young women, diabetic women and patients with indwelling catheters or intermittent catheterization has no clinical significance and antibiotic prescription is not indicated.

  2. METHOD FOR THE SPECIFICATION OF SOFTWARE SECURITY REQUIREMENTS AS A MEANS FOR IMPLEMENTING A SECURE SOFTWARE DEVELOPMENT PROCESS - MERSEC

    Directory of Open Access Journals (Sweden)

    Castro Mecías, L.T.

    2015-06-01

    Full Text Available Security incidents that target or use software as a means of attack often cause serious damage with legal and economic consequences. Results of a survey by Kaspersky Lab reflect that vulnerabilities in software are the main cause of security incidents in enterprises: 85% of the surveyed enterprises have reported security incidents, software vulnerabilities are their main cause, and it is further estimated that individual incidents can cause losses from $50,000 to $649,000 [1]. In this regard, academic and industry research focuses on proposals for reducing vulnerabilities and technology failures, with a positive influence on how software is developed. A development process with improved security practices should include activities from the initial phases of the software life cycle, so that security needs are identified, risk is managed, and appropriate measures are implemented. This article discusses a method for the analysis, acquisition and specification of software security requirements, built on the basis of various existing proposals and of deficiencies identified through participant observation in software development teams. Experiments performed using the proposal yield positive results regarding the reduction of security vulnerabilities and compliance with the security objectives of the software.

  3. 30 CFR 48.3 - Training plans; time of submission; where filed; information required; time for approval; method...

    Science.gov (United States)

    2010-07-01

    48.3 Training plans; time of submission; where filed; information required; time for approval; method... training plan shall be filed with the District Manager for the area in which the mine is located. (c) Each...

  4. Significance of steel electrical resistance method in the evaluation of reinforcement corrosion in cementitious systems

    Directory of Open Access Journals (Sweden)

    Krajci, L.

    2004-06-01

    Full Text Available A suitable system for detecting steel reinforcement corrosion in concrete structures contributes to reducing their maintenance costs. The steel electrical resistance method provides non-destructive monitoring of steel in cementitious systems. A specially prepared and arranged steel test specimen, acting as a corrosion sensor, is embedded in a mortar specimen. Verification tests of this method, based on chloride-induced corrosion of steel in mortars, together with visual inspection of the specimens, are presented. The significance of the steel electrical resistance method lies in expressing steel corrosion through quantitative parameters: reduction of the steel cross-section, thickness of the corroded layer, and loss of weight of the steel material. It is an integral method that allows the indirect determination of these corrosion characteristics. Comparison of the verified method with gravimetric evaluation of steel corrosion shows good agreement. Test results on mortars with calcium chloride dosages between 0.5% and 4.0% by weight of cement demonstrate the high sensitivity and reliability of the steel electrical resistance method.

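    The quantitative link the method exploits is R = ρL/A for a sensor of fixed length and resistivity, so the resistance ratio gives the remaining cross-section directly. A sketch under an assumed flat-strip geometry with single-face corrosion (the actual sensor arrangement is the one described in the paper):

    def remaining_area(a0, r0, rt):
        # R = rho * L / A with rho and L constant, so A_t = A_0 * R_0 / R_t.
        return a0 * r0 / rt

    def corroded_layer(t0_mm, w_mm, r0, rt):
        # Thickness loss of a flat strip of width w corroding on one face.
        a_t = remaining_area(t0_mm * w_mm, r0, rt)
        return t0_mm - a_t / w_mm

    # Example: a 0.20 mm x 10 mm strip whose resistance rose by 5 percent
    # has lost roughly 0.0095 mm of thickness.
    print(corroded_layer(0.20, 10.0, r0=0.100, rt=0.105))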

  5. METHODS FOR DETERMINING AGITATOR MIXING REQUIREMENTS FOR A MIXING and SAMPLING FACILITY TO FEED WTP (WASTE TREATMENT PLANT)

    International Nuclear Information System (INIS)

    Griffin, P.W.

    2009-01-01

    The following report is a summary of work conducted to evaluate the ability of existing correlative techniques and alternative methods to accurately estimate impeller speed and power requirements for mechanical mixers proposed for use in a mixing and sampling facility (MSF). The proposed facility would accept high level waste sludges from Hanford double-shell tanks and feed uniformly mixed high level waste to the Waste Treatment Plant. Numerous methods are evaluated and discussed, and resulting recommendations provided.

  6. Blast casting requires fresh assessment of methods

    Energy Technology Data Exchange (ETDEWEB)

    Pilshaw, S.R.

    1987-08-01

    The article discusses why conventional blasting operations, chiefly the explosive products, drilling and initiation methods used, are inefficient, and suggests new methods and materials to overcome the problems of conventional operations. The author suggests that the use of bulk ANFO for casting, instead of high-energy, high-density explosives with a high detonation velocity, is more effective in producing heave action. Similarly, drilling smaller blast holes than is conventional allows better distribution of the explosive load in the rock mass. The author also suggests that casting would be more efficient if the shot rows were loaded differently to produce a variable-burden blasting pattern.

  7. So much to do, so little time. To accomplish the mandatory initiatives of ARRA, healthcare organizations will require significant and thoughtful planning, prioritization and execution.

    Science.gov (United States)

    Klein, Kimberly

    2010-01-01

    The American Recovery and Reinvestment Act of 2009 (ARRA) has set forth legislation for the healthcare community to achieve adoption of electronic health records (EHR), as well as to form data standards and health information exchanges (HIE) and to comply with more stringent security and privacy controls under the HITECH Act. While the Office of the National Coordinator for Health Information Technology (ONCHIT) works on the definition of both "meaningful use" and "certification" of information technology systems, providers in particular must move forward with their IT initiatives to achieve the basic requirements for Medicare and Medicaid incentives starting in 2011 and to avoid penalties that will reduce reimbursement beginning in 2015. In addition, providers, payors, and government and non-government stakeholders will all have to balance the implementation of EHRs and work with HIEs at the same time that they upgrade their systems to comply with the ICD-10 and HIPAA 5010 code sets. Compliance deadlines for EHRs and HIEs begin in 2011, while compliance with ICD-10 diagnosis and procedure code sets is required by October 2013 and HIPAA 5010 transaction sets, with one exception, are required by January 1, 2012. To accomplish these strategic and mandatory initiatives successfully and simultaneously, healthcare organizations will require significant and thoughtful planning, prioritization and execution.

  8. Dual purpose or not? The significant factors

    International Nuclear Information System (INIS)

    Bak, W.; Roland, V.

    1999-01-01

    The development of spent fuel storage systems requires consideration of many factors in making design decisions. A significant issue affecting the design is the need to incorporate transportability of the canister or cask system design, which results in major changes to the storage system design. This paper presents a review of the significant factors affecting storage system design to incorporate transportation requirements and looks at the trends in both the United States and Europe where Transnucleaire and its US affiliated companies Transnuclear Inc., Transnuclear West and PacTec are active. A discussion is also presented relative to the pros and cons of whether the spent fuel storage system vendor should anticipate these transportation needs in the design of their systems. (author)

  9. Confidence intervals for effect sizes: compliance and clinical significance in the Journal of Consulting and clinical Psychology.

    Science.gov (United States)

    Odgaard, Eric C; Fowler, Robert L

    2010-06-01

    In 2005, the Journal of Consulting and Clinical Psychology (JCCP) became the first American Psychological Association (APA) journal to require statistical measures of clinical significance, plus effect sizes (ESs) and associated confidence intervals (CIs), for primary outcomes (La Greca, 2005). As this represents the single largest editorial effort to improve statistical reporting practices in any APA journal in at least a decade, in this article we investigate the efficacy of that change. All intervention studies published in JCCP in 2003, 2004, 2007, and 2008 were reviewed. Each article was coded for method of clinical significance, type of ES, and type of associated CI, broken down by statistical test (F, t, chi-square, r/R², and multivariate modeling). By 2008, clinical significance compliance was 75% (up from 31%), with 94% of studies reporting some measure of ES (reporting for individual statistical tests improved by eta² = .05 to .17, with reasonable CIs). Reporting of CIs for ESs also improved, although only to 40%. Also, the vast majority of reported CIs used approximations, which become progressively less accurate for smaller sample sizes and larger ESs (cf. Algina & Keselman, 2003). Changes are near asymptote for ESs and clinical significance, but CIs lag behind. As CIs for ESs are required for primary outcomes, we show how to compute CIs for the vast majority of ESs reported in JCCP, with an example of how to use CIs for ESs as a method to assess clinical significance.
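
    Since the article notes that most reported CIs used approximations, a sketch of the exact alternative for a two-sample standardized mean difference (Cohen's d): invert the noncentral t distribution. SciPy is assumed; the example values are illustrative.

    from scipy.stats import nct
    from scipy.optimize import brentq

    def cohens_d_ci(t_obs, n1, n2, conf=0.95):
        # d relates to the noncentrality parameter by d = ncp * scale.
        df = n1 + n2 - 2
        scale = (1 / n1 + 1 / n2) ** 0.5
        alpha = 1 - conf
        # Find the ncp whose tail probability at t_obs equals alpha/2;
        # nct.cdf is monotone decreasing in ncp, so brentq finds the root.
        lo = brentq(lambda ncp: nct.cdf(t_obs, df, ncp) - (1 - alpha / 2), -50, 50)
        hi = brentq(lambda ncp: nct.cdf(t_obs, df, ncp) - alpha / 2, -50, 50)
        return lo * scale, hi * scale

    # Example: t = 2.5 with 30 per group gives d ~ 0.65, CI roughly (0.12, 1.17).
    print(cohens_d_ci(2.5, 30, 30))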

  10. A novel asynchronous access method with binary interfaces

    Directory of Open Access Journals (Sweden)

    Torres-Solis Jorge

    2008-10-01

    Full Text Available Abstract Background Traditionally synchronous access strategies require users to comply with one or more time constraints in order to communicate intent with a binary human-machine interface (e.g., mechanical, gestural or neural switches. Asynchronous access methods are preferable, but have not been used with binary interfaces in the control of devices that require more than two commands to be successfully operated. Methods We present the mathematical development and evaluation of a novel asynchronous access method that may be used to translate sporadic activations of binary interfaces into distinct outcomes for the control of devices requiring an arbitrary number of commands to be controlled. With this method, users are required to activate their interfaces only when the device under control behaves erroneously. Then, a recursive algorithm, incorporating contextual assumptions relevant to all possible outcomes, is used to obtain an informed estimate of user intention. We evaluate this method by simulating a control task requiring a series of target commands to be tracked by a model user. Results When compared to a random selection, the proposed asynchronous access method offers a significant reduction in the number of interface activations required from the user. Conclusion This novel access method offers a variety of advantages over traditionally synchronous access strategies and may be adapted to a wide variety of contexts, with primary relevance to applications involving direct object manipulation.

  11. Simulation of temporal and spatial distribution of required irrigation water by crop models and the pan evaporation coefficient method

    Science.gov (United States)

    Yang, Yan-min; Yang, Yonghui; Han, Shu-min; Hu, Yu-kun

    2009-07-01

    Hebei Plain is the most important agricultural belt in North China. Intensive irrigation and low, uneven precipitation have led to severe water shortage on the plain. This study is an attempt to resolve this crucial issue of water shortage for sustainable agricultural production and water resources management. The paper models distributed regional irrigation requirement for a range of cultivated crops on the plain. Classic crop models like DSSAT-wheat/maize and COTTON2K are used in combination with the pan-evaporation coefficient method to estimate water requirements for wheat, corn, cotton, fruit trees and vegetables. The approach is more accurate than the static approach adopted in previous studies, because the combined use of crop models and the pan-evaporation coefficient method dynamically accounts for irrigation requirement at different growth stages of crops, agronomic practices, and field and climatic conditions. The simulation results show increasing Required Irrigation Amount (RIA) with time. RIA ranges from 5.08×10⁹ m³ to 14.42×10⁹ m³ for the period 1986-2006, with an annual average of 10.6×10⁹ m³. Percent average water use by wheat, fruit trees, vegetables, corn and cotton is 41%, 12%, 12%, 11%, 7% and 17% respectively. RIA for April and May (the period with the highest irrigation water use) is 1.78×10⁹ m³ and 2.41×10⁹ m³ respectively. The counties in the piedmont regions of Mount Taihang have high RIA while the central and eastern regions/counties have low irrigation requirement.
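
    The pan-evaporation coefficient chain used here converts pan readings to crop demand in two steps, ET0 = Kp × Epan and ETc = Kc × ET0; the net irrigation requirement is whatever effective rainfall does not cover. A sketch with placeholder coefficients (the study's stage-specific values come from the crop models, not from this example):

    def required_irrigation(epan_mm, rain_eff_mm, kp=0.7, kc=1.0):
        # Net irrigation requirement (mm) for one growth stage.
        et0 = kp * epan_mm      # reference evapotranspiration from pan data
        etc = kc * et0          # crop evapotranspiration at this stage
        return max(etc - rain_eff_mm, 0.0)

    # One season as (pan evaporation, effective rain, crop coefficient):
    stages = [(120, 15, 0.4), (160, 30, 0.8), (180, 45, 1.15), (90, 20, 0.6)]
    ria_mm = sum(required_irrigation(e, r, kc=k) for e, r, k in stages)
    print(round(ria_mm))   # seasonal requirement, ~196 mm in this toy case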

  12. The preparation of reports of a significant event at a uranium processing or uranium handling facility

    International Nuclear Information System (INIS)

    1988-08-01

    Licenses to operate uranium processing or uranium handling facilities require that certain events be reported to the Atomic Energy Control Board (AECB) and to other regulatory authorities. Reports of a significant event describe unusual events which had or could have had a significant impact on the safety of facility operations, the worker, the public or on the environment. The purpose of this guide is to suggest an acceptable method of reporting a significant event to the AECB and to describe the information that should be included. The reports of a significant event are made available to the public in accordance with the provisions of the Access to Information Act and the AECB's policy on public access to licensing information

  13. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process, a set of specially designed software models used to test RELAP-7.

  14. The significance of reporting to the thousandths place: Figuring out the laboratory limitations

    Directory of Open Access Journals (Sweden)

    Joely A. Straseski

    2017-04-01

    Full Text Available Objectives: A request to report laboratory values to a specific number of decimal places represents a delicate balance between clinical interpretation of a true analytical change versus laboratory understanding of analytical imprecision and significant figures. Prostate specific antigen (PSA) was used as an example to determine whether an immunoassay routinely reported to the hundredths decimal place, based on significant figure assessment in our laboratory, was capable of providing analytically meaningful results when reported to the thousandths place when requested by clinicians. Design and methods: Results of imprecision studies of a representative PSA assay (Roche MODULAR E170) employing two methods of statistical analysis are reported. Sample pools were generated with target values of 0.01 and 0.20 μg/L PSA as determined by the E170. Intra-assay imprecision studies were conducted and the resultant data were analyzed using two independent statistical methods to evaluate reporting limits. Results: These statistical methods indicated that reporting results to the thousandths place at the two assessed concentrations was an appropriate reflection of the measurement imprecision for the representative assay. This approach used two independent statistical tests to determine the ability of an analytical system to support a desired reporting level. Importantly, the data were generated during a routine intra-assay imprecision study, so this approach does not require extra data collection by the laboratory. Conclusions: Independent statistical analysis must be used to determine appropriate significant figure limitations for clinically relevant analytes. Establishing these limits is the responsibility of the laboratory and should be done prior to providing clinical results. Keywords: Significant figures, Imprecision, Prostate cancer, Prostate specific antigen, PSA
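
    One plausible way to frame the decision (an illustration only, not the two statistical methods used in the article): the quantisation error introduced by the reporting increment should stay small relative to the analytical SD from the intra-assay study.

    import numpy as np

    def supports_increment(replicates, increment, max_fraction=0.25):
        # Rounding to a step q adds quantisation error of about q/sqrt(12);
        # require it to stay under a chosen fraction of the assay SD
        # (the 25% threshold here is an arbitrary illustrative choice).
        sd = np.std(replicates, ddof=1)
        return increment / np.sqrt(12) < max_fraction * sd

    pool = [0.012, 0.009, 0.011, 0.010, 0.008, 0.013, 0.010, 0.011]  # ug/L
    print(supports_increment(pool, 0.001))  # thousandths place -> True
    print(supports_increment(pool, 0.01))   # hundredths place  -> False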

  15. Computation of spatial significance of mountain objects extracted from multiscale digital elevation models

    International Nuclear Information System (INIS)

    Sathyamoorthy, Dinesh

    2014-01-01

    The derivation of spatial significance is an important aspect of geospatial analysis and hence, various methods have been proposed to compute the spatial significance of entities based on spatial distances to other entities within the cluster. This paper studies the spatial significance of mountain objects extracted from multiscale digital elevation models (DEMs). At each scale, the value of the spatial significance index SSI of a mountain object is the minimum number of morphological dilation iterations required to occupy all the other mountain objects in the terrain. The mountain object with the lowest value of SSI is the spatially most significant mountain object, indicating that it has the shortest distance to the other mountain objects. It is observed that as the areas of the mountain objects decrease with increasing scale, the distances between the mountain objects increase, resulting in increasing values of SSI. The results obtained indicate that the strategic location of a mountain object at the centre of the terrain is more important than its size in determining its reach to other mountain objects and thus its spatial significance.
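
    The SSI definition above translates almost directly into code: dilate one object's binary mask and count the iterations until every pixel of the other objects is covered. A sketch using SciPy's morphological dilation:

    import numpy as np
    from scipy.ndimage import binary_dilation

    def ssi(obj_mask, other_masks, max_iter=10000):
        # Minimum dilation iterations for obj_mask to occupy all others.
        others = np.logical_or.reduce(other_masks)
        grown = obj_mask.copy()
        for i in range(1, max_iter + 1):
            grown = binary_dilation(grown)
            if grown[others].all():
                return i
        raise RuntimeError("other objects not reached within max_iter")

    # The spatially most significant object minimises this index:
    # scores = [ssi(m, masks[:k] + masks[k+1:]) for k, m in enumerate(masks)]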

  16. Statistical significance of cis-regulatory modules

    Directory of Open Access Journals (Sweden)

    Smith Andrew D

    2007-01-01

    Full Text Available Abstract Background It is becoming increasingly important for researchers to be able to scan through large genomic regions for transcription factor binding sites or clusters of binding sites forming cis-regulatory modules. Correspondingly, there has been a push to develop algorithms for the rapid detection and assessment of cis-regulatory modules. While various algorithms for this purpose have been introduced, most are not well suited for rapid, genome scale scanning. Results We introduce methods designed for the detection and statistical evaluation of cis-regulatory modules, modeled as either clusters of individual binding sites or as combinations of sites with constrained organization. In order to determine the statistical significance of module sites, we first need a method to determine the statistical significance of single transcription factor binding site matches. We introduce a straightforward method of estimating the statistical significance of single site matches using a database of known promoters to produce data structures that can be used to estimate p-values for binding site matches. We next introduce a technique to calculate the statistical significance of the arrangement of binding sites within a module using a max-gap model. If the module scanned for has defined organizational parameters, the probability of the module is corrected to account for organizational constraints. The statistical significance of single site matches and the architecture of sites within the module can be combined to provide an overall estimation of statistical significance of cis-regulatory module sites. Conclusion The methods introduced in this paper allow for the detection and statistical evaluation of single transcription factor binding sites and cis-regulatory modules. The features described are implemented in the Search Tool for Occurrences of Regulatory Motifs (STORM and MODSTORM software.
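
    A sketch of the single-site step described above: score a candidate window against a position weight matrix and take its p-value from the empirical score distribution over a reference (promoter) sequence set. The names and the log-odds PWM layout are illustrative assumptions, not the STORM implementation.

    import numpy as np

    BASE = {"A": 0, "C": 1, "G": 2, "T": 3}

    def pwm_score(window, pwm):
        # pwm: one row of per-base log-odds scores per motif position.
        return sum(pwm[i][BASE[b]] for i, b in enumerate(window))

    def empirical_pvalue(site, pwm, reference_seqs):
        s_obs = pwm_score(site, pwm)
        w = len(pwm)
        background = np.array([pwm_score(seq[i:i + w], pwm)
                               for seq in reference_seqs
                               for i in range(len(seq) - w + 1)])
        # Fraction of reference windows scoring at least as high.
        return (1 + (background >= s_obs).sum()) / (1 + background.size)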

  17. Pattern and security requirements engineering-based establishment of security standards

    CERN Document Server

    Beckers, Kristian

    2015-01-01

    Security threats are a significant problem for information technology companies today. This book focuses on how to mitigate these threats by using security standards and provides ways to address associated problems faced by engineers caused by ambiguities in the standards. The security standards are analysed, fundamental concepts of the security standards presented, and the relations to the elementary concepts of security requirements engineering (SRE) methods explored. Using this knowledge, engineers can build customised methods that support the establishment of security standards. Standard

  18. The biological significance of brain barrier mechanisms

    DEFF Research Database (Denmark)

    Saunders, Norman R; Habgood, Mark D; Møllgård, Kjeld

    2016-01-01

    Barrier mechanisms in the brain are important for its normal functioning and development. Stability of the brain's internal environment, particularly with respect to its ionic composition, is a prerequisite for the fundamental basis of its function, namely transmission of nerve impulses. In addition, the appropriate and controlled supply of a wide range of nutrients such as glucose, amino acids, monocarboxylates, and vitamins is also essential for normal development and function. These are all cellular functions across the interfaces that separate the brain from the rest of the internal... but more work is required to evaluate the method before it can be tried in patients. Overall, our view is that much more fundamental knowledge of barrier mechanisms and development of new experimental methods will be required before drug targeting to the brain is likely to be a successful endeavor...

  19. Untargeted metabolomic profiling plasma samples of patients with lung cancer for searching significant metabolites by HPLC-MS method

    Science.gov (United States)

    Dementeva, N.; Ivanova, K.; Kokova, D.; Kurzina, I.; Ponomaryova, A.; Kzhyshkowska, J.

    2017-09-01

    Lung cancer is one of the most common types of cancer leading to death. Consequently, the search for and identification of metabolites associated with the risk of developing cancer are very valuable. For this purpose, untargeted metabolic profiling of plasma samples collected from patients with lung cancer (n = 100) and a control group (n = 100) was conducted. After sample preparation, the plasma samples were analyzed using an LC-MS method. Biostatistical methods were applied to pre-process the data and to elicit the dominant metabolites that account for the difference between the case and control groups. At least seven significant metabolites were evaluated and annotated. Most of the identified metabolites are connected with lipid metabolism, and their combination could be useful for follow-up studies of lung cancer pathogenesis.
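
    A generic version of the biostatistics step (the study's actual pipeline is not reproduced): a per-feature nonparametric test between case and control intensity matrices, with FDR control over the many features. SciPy ≥ 1.11 is assumed for false_discovery_control.

    import numpy as np
    from scipy.stats import mannwhitneyu, false_discovery_control

    def significant_features(cases, controls, q=0.05):
        # cases/controls: (n_samples, n_features) intensity matrices.
        pvals = np.array([mannwhitneyu(cases[:, j], controls[:, j]).pvalue
                          for j in range(cases.shape[1])])
        adjusted = false_discovery_control(pvals)  # Benjamini-Hochberg
        return np.nonzero(adjusted <= q)[0]        # indices worth annotating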

  20. A simple eigenfunction convergence acceleration method for Monte Carlo

    International Nuclear Information System (INIS)

    Booth, Thomas E.

    2011-01-01

    Monte Carlo transport codes typically use a power iteration method to obtain the fundamental eigenfunction. The standard convergence rate for the power iteration method is the ratio of the first two eigenvalues, that is, k_2/k_1. Modifications to the power method have accelerated the convergence by explicitly calculating the subdominant eigenfunctions as well as the fundamental. Calculating the subdominant eigenfunctions requires using particles of negative and positive weights and appropriately canceling the negative and positive weight particles. Incorporating both negative weights and a ± weight cancellation requires a significant change to current transport codes. This paper presents an alternative convergence acceleration method that does not require modifying the transport codes to deal with the problems associated with tracking and cancelling particles of ± weights. Instead, only positive weights are used in the acceleration method. (author)
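
    A toy numerical illustration of the convergence rate quoted above (ordinary power iteration on a small matrix, not a transport code): the error shrinks like (k2/k1)^n, so a dominance ratio near one converges very slowly.

    import numpy as np

    def power_iteration(a, n_iter):
        v = np.ones(a.shape[0])
        for _ in range(n_iter):
            v = a @ v
            v /= np.linalg.norm(v)
        return v

    a = np.diag([1.00, 0.99, 0.50])   # k2/k1 = 0.99
    v = power_iteration(a, 200)
    # After 200 iterations the subdominant contamination is still about
    # 0.99**200 ~ 0.13 of the dominant component.
    print(v)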

  1. A Survey of Requirements Engineering Methods for Pervasive Services

    NARCIS (Netherlands)

    Kolos, L.; van Eck, Pascal; Wieringa, Roelf J.

    Designing and deploying ubiquitous computing systems, such as those delivering large-scale mobile services, still requires large-scale investments in both development effort as well as infrastructure costs. Therefore, in order to develop the right system, the design process merits a thorough

  2. Significant Revisions to OSHA 29 CFR 1910.269.

    Science.gov (United States)

    Neitzel, Dennis K

    2015-06-01

    The updated OSHA 29 CFR 1910.269 requirements are significant for assisting employers in their efforts to protect their employees from electrical hazards. In addition, OSHA based these revisions on the latest consensus standards and improvements in electrical safety technology. Together, the updated regulation creates a unified and up-to-date set of requirements to help employers more effectively establish safe work practices to protect their workers.

  3. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    Science.gov (United States)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
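
    A sketch of the single-sided k-method plan that such calculators implement (the plan constants n and k below are illustrative, not taken from a standard table): measure n items and accept the lot when the standardised margin between the sample mean and the specification limit is at least k.

    import numpy as np

    def accept_lot(measurements, usl, k):
        # Variables plan, sigma unknown, single upper specification limit.
        x = np.asarray(measurements, dtype=float)
        return (usl - x.mean()) / x.std(ddof=1) >= k

    rng = np.random.default_rng(1)
    sample = rng.normal(loc=9.0, scale=0.5, size=20)   # simulated lot
    print(accept_lot(sample, usl=10.0, k=1.6))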

  4. Cytogenetic chromosomal aberration dosimetry method after radiation accidents and prognostic significance of stereotypically appearing chromosomal aberrations after radiation exposure

    International Nuclear Information System (INIS)

    Bloennigen, K.A.

    1973-01-01

    The paper reports on a radiation accident involving an Iridium-192 rod with an activity of 7.8 Ci and a size of 2 × 2 × 2 mm³. The radiation source had remained in direct contact with the left hip and elbow of the examined person for a period of 45 minutes. At the points of direct exposure, physical dose values of 5,000 rad and 10,000 rad were measured, while the whole-body dose was 100-200 rad and the gonad dose 300-400 rad. These values were confirmed by observations of the clinical course and by haematological and andrological examinations. Chromosome analysis of lymphocytes produced dose values between 100 and 125 rad, in close agreement with the values determined by physical methods. The findings suggest that the relatively simple and fast method of cytogenetic dosimetry provides a useful complement to physical dosimetry. (orig./AK)

  5. The expression and clinical significance of HDGF in osteosarcoma

    Directory of Open Access Journals (Sweden)

    Chen Z

    2015-09-01

    Full Text Available Zhiguo Chen,1 Shenghai Qiu,2 Xiaofei Lu3 1Department of Orthopedics, Linyi People’s Hospital, Linyi City, Shandong Province, People’s Republic of China; 2Department of Orthopedics, People’s Hospital of Taiyuan, Taiyuan City, Shanxi Province, People’s Republic of China; 3Department of General Surgery, Jinan Central Hospital affiliated to Shandong University, Jinan City, Shandong Province, People’s Republic of China. Aim: To investigate the expression of hepatoma-derived growth factor (HDGF) in osteosarcoma (OS) and its correlation with clinicopathologic factors, prognosis, and tumor progression. Method: HDGF expression in OS tissues was detected by immunohistochemistry. The correlation between HDGF and clinicopathologic factors was analyzed by chi-square test, and the association between HDGF expression and overall survival rates was evaluated by univariate analysis using the Kaplan-Meier method. HDGF concentration in cell medium or cell lysates was detected by the enzyme-linked immunosorbent assay method. The effects of extrinsic and intrinsic HDGF on OS cell proliferation were detected by MTT assay after recombinant HDGF stimulation or HDGF knockdown, respectively. Results: The proportion of high HDGF expression was 18.69% (20/107) in OS. High HDGF expression was significantly associated with larger tumor size (P=0.004). With in vitro experiments, we demonstrated that human recombinant HDGF could activate the AKT and MAPK signaling pathways, resulting in OS cell proliferation. By knocking down HDGF expression, we proved that intrinsic HDGF is required for OS proliferation. Conclusion: High HDGF expression was significantly associated with larger OS tumor size and could promote OS cell proliferation, indicating that HDGF could be an effective biomarker and a potential drug target in OS treatment. Keywords: hepatoma-derived growth factor, osteosarcoma, tumor size, proliferation, overall survival rate

  6. 40 CFR 136.6 - Method modifications and analytical requirements.

    Science.gov (United States)

    2010-07-01

    (a) Definitions of terms used in this section. (1) Analyst means the..., oil and grease, total suspended solids, total phenolics, turbidity, chemical oxygen demand, and... Except as set forth in paragraph (b)(3) of this section, an analyst may modify an approved test procedure...

  7. Significance of appendiceal thickening in association with typhlitis in pediatric oncology patients

    International Nuclear Information System (INIS)

    McCarville, M.B.; Thompson, J.; Adelman, C.S.; Lee, M.O.; Li, C.; Alsammarae, D.; Rao, B.N.; May, M.V.; Jones, S.C.; Sandlund, J.T.

    2004-01-01

    Background: The management of pediatric oncology patients with imaging evidence of appendiceal thickening is complex because they are generally poor surgical candidates and often have confounding clinical findings. Objective: We sought to determine the significance of appendiceal thickening in pediatric oncology patients who also had typhlitis. Specifically, we evaluated the impact of this finding on the duration of typhlitis, its clinical management, and outcome. Materials and methods: From a previous review of the management of typhlitis in 90 children with cancer at our institution, we identified 4 with imaging evidence of appendiceal thickening. We compared colonic wall measurements, duration of typhlitis symptoms, management, and outcome of patients with appendiceal thickening and typhlitis to patients with typhlitis alone. Results: There was no significant difference in duration of typhlitis symptoms between patients with typhlitis only (15.6 ± 1.2 days) and those with typhlitis and appendiceal thickening (14.5 ± 5.8 days; P = 0.9). Two patients with appendiceal thickening required surgical treatment for ischemic bowel, and two were treated medically. Only one patient in the typhlitis without appendiceal thickening group required surgical intervention. There were no deaths in children with appendiceal thickening; two patients died of complications of typhlitis alone. (orig.)

  8. An outline of the systematic-dialectical method: scientific and political significance

    NARCIS (Netherlands)

    Reuten, G.; Moseley, F.; Smith, T.

    2014-01-01

    The method of systematic-dialectics (SD) is reconstructed with a focus on what institutions and processes are necessary - rather than contingent - for the capitalist system. This allows for the detection of strengths and weaknesses in the actual structure of the system. Weaknesses should be

  9. Some basic requirements for the application of electrokinetic methods for the reconstruction of masonry with rising humidity

    Energy Technology Data Exchange (ETDEWEB)

    Friese, P; Jacobasch, H J; Boerner, M

    1987-12-01

    Based on theoretical considerations of electro-osmosis, the most important requirements for the application of electrokinetic methods for drying masonry with rising humidity are described. Samples of brick masonry (brick and mortar) were examined by means of an electrokinetic measuring system (EKM) with different electrolytes (CaSO₄ and KCl) at different concentrations. It was found for all samples that the zeta potential carries a negative sign and that its absolute value approaches zero with increasing electrolyte concentration. Based on these measurements, an upper limit of the electrolyte concentration of 0.1 mol/liter is established for the application of electrokinetic methods for drying masonry.

  10. 40 CFR 141.74 - Analytical and monitoring requirements.

    Science.gov (United States)

    2010-07-01

    (a) Analytical requirements. Only the analytical method(s) specified... as set forth in the article “National Field Evaluation of a Defined Substrate Method for the...

  11. 40 CFR 52.2451 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... Quality Deterioration. (b) Regulations for preventing significant deterioration of air quality. The...

  12. 40 CFR 52.2528 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of Sections 160 through 165 of the Clean Air... Quality Deterioration. (b) Regulations for Preventing Significant Deterioration of Air Quality, the...

  13. Significant Lactic Acidosis from Albuterol

    Directory of Open Access Journals (Sweden)

    Deborah Diercks

    2018-03-01

    Full Text Available Lactic acidosis is a clinical entity that demands rapid assessment and treatment to prevent significant morbidity and mortality. With increased lactate use across many clinical scenarios, lactate values themselves cannot be interpreted apart from their appropriate clinical picture. The significance of Type B lactic acidosis is likely understated in the emergency department (ED. Given the mortality that sepsis confers, a serum lactate is an important screening study. That said, it is with extreme caution that we should interpret and react to the resultant elevated value. We report a patient with a significant lactic acidosis. Though he had a high lactate value, he did not require aggressive resuscitation. A different classification scheme for lactic acidosis that focuses on the bifurcation of the “dangerous” and “not dangerous” causes of lactic acidosis may be of benefit. In addition, this case is demonstrative of the potential overuse of lactates in the ED.

  14. Advantages and limitations of the SETS method

    International Nuclear Information System (INIS)

    Mahaffy, J.H.

    1983-01-01

    The stability-enhancing two-step (SETS) method has been used successfully in the Transient Reactor Analysis Code (TRAC) for several years. The method consists of a basic semi-implicit step combined with a stabilizer step that, taken together, eliminate the material Courant stability limit associated with standard semi-implicit numerical methods. This approach toward stability requires significantly fewer computational operations than a fully implicit method, while maintaining the first-order accuracy in space and time of its semi-implicit predecessors.

  15. ON THE IMPACT OF FLIGHT SAFETY CERTIFICATION REQUIREMENTS ON THE AERODYNAMIC EFFICIENCY OF COMMERCIAL AIRPLANES

    Directory of Open Access Journals (Sweden)

    Vladimir I. Shevyakov

    2018-01-01

    Full Text Available The article considers the implementation of aerodynamic efficiency in light of certification requirements for flight safety. Aerodynamic efficiency means high aerodynamic performance (depending on airplane size), aerodynamic performance in cruise flight, high aerodynamic performance at takeoff, and lift performance at landing. The author estimated the impact on aerodynamic efficiency of both the requirements for aerodynamic performance and the requirements for aircraft systems, noncompliance with which may result in a significant change of expected operating conditions. It is shown that the use of supercritical wing profiles may result in flight mode limitations when the required buffet margins are not met; this prevents exploiting all the advantages of the aerodynamic layout and requires special design solutions to avoid such cases. Certification requirements for flight-level pressure altitude accuracy and for icing condition warning systems are reviewed. The research presents methods of increasing aerodynamic efficiency while meeting the requirements for reduced vertical separation minima (RVSM) flights and for flight in icing conditions, including requirements for air data probes. RVSM flight requirements are met by means of efficient air data probe location: theoretical flow calculations determine areas on the airplane skin surface where static probes minimize errors depending on angle of attack and sideslip. It is shown that if certification requirements are not met and flight must take place outside the RVSM area, aerodynamic efficiency is significantly reduced and fuel consumption can increase by 10% or more. Implementation of the suggested approaches increases the competitiveness of commercial airplanes.

  16. 40 CFR 52.1165 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... deterioration of air quality. (b) Regulation for preventing significant deterioration of air quality. The...

  17. 40 CFR 52.2729 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... deterioration of air quality. (b) Regulations for preventing significant deterioration of air quality. The...

  18. 40 CFR 52.1689 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... deterioration of air quality. (b) Regulations for preventing significant deterioration of air quality. The...

  19. 40 CFR 52.1234 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... deterioration of air quality. (b) Regulations for preventing significant deterioration of air quality. The...

  20. 40 CFR 52.2827 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... deterioration of air quality. (b) Regulations for preventing significant deterioration of air quality. The...

  1. 40 CFR 52.1603 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... deterioration of air quality. (b) Regulations for preventing significant deterioration of air quality. The...

  2. 40 CFR 52.1180 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... deterioration of air quality. (b) Regulations for preventing significant deterioration of air quality. The...

  3. 40 CFR 52.2779 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... deterioration of air quality. (b) Regulations for preventing significant deterioration of air quality. The...

  4. 40 CFR 52.2676 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... deterioration of air quality. (b) Regulations for preventing significant deterioration of air quality. The...

  5. 40 CFR 52.499 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... deterioration of air quality. (b) Regulations for preventing significant deterioration of air quality. The...

  6. 40 CFR 52.2497 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... deterioration of air quality. (b) Regulations for preventing significant deterioration of air quality. The...

  7. 40 CFR 52.1485 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... include approvable procedures for preventing the significant deterioration of air quality. (b) Regulation...

  8. 40 CFR 52.1884 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    (a) The requirements of sections 160 through 165 of the Clean Air... deterioration of air quality. (b) Regulations for preventing significant deterioration of air quality. The...

  9. How significant is the ‘significant other’? Associations between significant others’ health behaviors and attitudes and young adults’ health outcomes

    Directory of Open Access Journals (Sweden)

    Berge Jerica M

    2012-04-01

    Full Text Available Abstract Background Having a significant other has been shown to be protective against physical and psychological health conditions for adults. Less is known about the period of emerging young adulthood and associations between significant others’ weight and weight-related health behaviors (e.g. healthy dietary intake, the frequency of physical activity, weight status. This study examined the association between significant others’ health attitudes and behaviors regarding eating and physical activity and young adults’ weight status, dietary intake, and physical activity. Methods This study uses data from Project EAT-III, a population-based cohort study with emerging young adults from diverse ethnic and socioeconomic backgrounds (n = 1212. Logistic regression models examining cross-sectional associations, adjusted for sociodemographics and health behaviors five years earlier, were used to estimate predicted probabilities and calculate prevalence differences. Results Young adult women whose significant others had health promoting attitudes/behaviors were significantly less likely to be overweight/obese and were more likely to eat ≥ 5 fruits/vegetables per day and engage in ≥ 3.5 hours/week of physical activity, compared to women whose significant others did not have health promoting behaviors/attitudes. Young adult men whose significant other had health promoting behaviors/attitudes were more likely to engage in ≥ 3.5 hours/week of physical activity compared to men whose significant others did not have health promoting behaviors/attitudes. Conclusions Findings suggest the protective nature of the significant other with regard to weight-related health behaviors of young adults, particularly for young adult women. Obesity prevention efforts should consider the importance of including the significant other in intervention efforts with young adult women and potentially men.

  10. Assessing Requirements Quality through Requirements Coverage

    Science.gov (United States)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from high-level requirements describing the expected behavior of the software. For validation and verification purposes, the model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. The shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software

  11. The time domain triple probe method

    International Nuclear Information System (INIS)

    Meier, M.A.; Hallock, G.A.; Tsui, H.Y.W.; Bengtson, R.D.

    1994-01-01

    A new Langmuir probe technique based on the triple probe method is being developed to provide simultaneous measurement of plasma temperature, potential, and density with the temporal and spatial resolution required to accurately characterize plasma turbulence. When the conventional triple probe method is used in an inhomogeneous plasma, local differences in the plasma measured at each probe introduce significant error into the estimation of turbulence parameters. The Time Domain Triple Probe (TDTP) method uses high-speed switching of the Langmuir probe potential, rather than spatially separated probes, to gather the triple-probe information, thus avoiding these errors. Analysis indicates that plasma response times and recent electronics technology meet the requirements to implement the TDTP method. Data reduction for TDTP measurements is to include linear and higher-order correlation analysis to estimate fluctuation-induced particle and thermal transport, as well as energy relationships between temperature, density, and potential fluctuations.
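
    For reference, a sketch of the textbook triple-probe relations that TDTP also reduces to (the standard formulas, not the switching scheme itself; the probe area, currents, and voltages below are illustrative):

    import numpy as np

    E = 1.602e-19  # elementary charge, C

    def electron_temperature_eV(v_plus, v_float):
        # Classic triple-probe result: Te [eV] = (V+ - Vf) / ln 2.
        return (v_plus - v_float) / np.log(2.0)

    def electron_density(i_sat, area_m2, te_eV, ion_mass_kg):
        # Bohm estimate: n = I_sat / (0.61 * e * A * c_s).
        c_s = np.sqrt(E * te_eV / ion_mass_kg)  # ion sound speed
        return i_sat / (0.61 * E * area_m2 * c_s)

    te = electron_temperature_eV(v_plus=18.0, v_float=4.0)       # ~20 eV
    ne = electron_density(5e-3, 1e-6, te, ion_mass_kg=1.67e-27)  # ~1.5e18 m^-3
    print(te, ne)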

  12. Frailty in Chinese Peritoneal Dialysis Patients: Prevalence and Prognostic Significance

    Directory of Open Access Journals (Sweden)

    Jack Kit-Chung Ng

    2016-10-01

    Full Text Available Background/Aims: Previous studies showed that frailty is prevalent in both pre-dialysis and dialysis patients. However, the prevalence and prognostic implication of frailty in Chinese peritoneal dialysis (PD) patients remain unknown. Methods: We used a validated questionnaire to determine the Frailty Score of 193 unselected prevalent PD patients. All patients were then followed for 2 years for their need of hospitalization and mortality. Results: Amongst the 193 patients, 134 (69.4%) met the criteria of being frail. Frailty Score significantly correlated with Charlson's comorbidity score (r = 0.40, p ... Conclusions: Frailty is prevalent among Chinese PD patients. Frail PD patients have a high risk of requiring hospitalization and their hospital stay tends to be prolonged. Early identification may allow timely intervention to prevent adverse health outcomes in this group of patients.

  13. Behavioral Change and Building Performance: Strategies for Significant, Persistent, and Measurable Institutional Change

    Energy Technology Data Exchange (ETDEWEB)

    Wolfe, Amy K.; Malone, Elizabeth L.; Heerwagen, Judith H.; Dion, Jerome P.

    2014-04-01

    The people who use Federal buildings — Federal employees, operations and maintenance staff, and the general public — can significantly impact a building’s environmental performance and the consumption of energy, water, and materials. Many factors influence building occupants’ use of resources (use behaviors), including work process requirements; ability to fulfill agency missions; new and possibly unfamiliar high-efficiency/high-performance building technologies; a lack of understanding, education, and training; inaccessible information or ineffective feedback mechanisms; and cultural norms and institutional rules and requirements, among others. While many strategies have been used to introduce new occupant use behaviors that promote sustainability and reduced resource consumption, few have been verified in the scientific literature or have properly documented case study results. This paper documents validated strategies that have been shown to encourage new use behaviors that can result in significant, persistent, and measurable reductions in resource consumption. From the peer-reviewed literature, the paper identifies relevant strategies for Federal facilities and commercial buildings that focus on the individual, groups of individuals (e.g., work groups), and institutions — their policies, requirements, and culture. The paper documents methods with evidence of success in changing use behaviors and enabling occupants to effectively interact with new technologies/designs. It also provides a case study of the strategies used at a Federal facility — Fort Carson, Colorado. The paper documents gaps in the current literature and approaches, and provides topics for future research.

  14. 47 CFR 76.54 - Significantly viewed signals; method to be followed for special showings.

    Science.gov (United States)

    2010-10-01

    ... located, in whole or in part, and on all other system community units, franchisees, and franchise.... 339(d). (j) Notwithstanding the requirements of this section, the signal of a television broadcast...

  15. 40 CFR 52.1436 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 4 2010-07-01 2010-07-01 false Significant deterioration of air quality. 52.1436 Section 52.1436 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Significant deterioration of air quality. The requirements of sections 160 through 165 of the Clean Air Act...

  16. 40 CFR 52.2303 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... quality. 52.2303 Section 52.2303 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Significant deterioration of air quality. (a) The plan submitted by Texas is approved as meeting the requirements of part C, Clean Air Act for preventing significant deterioration of air quality. The plan...

  17. 40 CFR 52.1280 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 4 2010-07-01 2010-07-01 false Significant deterioration of air quality. 52.1280 Section 52.1280 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Significant deterioration of air quality. (a) All applications and other information required pursuant to § 52...

  18. 40 CFR 52.1529 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 4 2010-07-01 2010-07-01 false Significant deterioration of air quality. 52.1529 Section 52.1529 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Significant deterioration of air quality. New Hampshire's Part Env-A 623, “Requirements for Prevention of...

  19. A Novel Polygonal Finite Element Method: Virtual Node Method

    Science.gov (United States)

    Tang, X. H.; Zheng, C.; Zhang, J. H.

    2010-05-01

    Polygonal finite element method (PFEM), which can construct shape functions on polygonal elements, provides greater flexibility in mesh generation. However, the non-polynomial form of traditional PFEM, such as the Wachspress method and the Mean Value method, leads to inexact numerical integration, since integration techniques for non-polynomial functions are immature. To overcome this shortcoming, a large number of integration points has to be used to obtain sufficiently exact results, which increases computational cost. In this paper, a novel polygonal finite element method, called the virtual node method (VNM), is proposed. The features of the present method can be listed as follows: (1) it is a PFEM with polynomial form, so Hammer and Gauss integration can be used naturally to obtain exact numerical integration; (2) the shape functions of VNM satisfy all the requirements of the finite element method. To test the performance of VNM, intensive numerical tests are carried out. It is found that, in the standard patch test, VNM achieves significantly better results than the Wachspress method and the Mean Value method. Moreover, it is observed that VNM achieves better results than triangular 3-node elements in the accuracy test.
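
    For reference, the non-polynomial Wachspress shape functions that VNM is compared against can be evaluated directly; a minimal sketch for a convex polygon follows. The VNM construction itself is not specified in the abstract, so this only illustrates the traditional PFEM baseline.

```python
import numpy as np

def tri_area(a, b, c):
    """Signed area of triangle (a, b, c)."""
    return 0.5 * ((b[0]-a[0])*(c[1]-a[1]) - (c[0]-a[0])*(b[1]-a[1]))

def wachspress(verts, x):
    """Wachspress shape functions at a point x strictly inside a convex
    polygon with counter-clockwise vertices:
    w_i = A(v_{i-1}, v_i, v_{i+1}) / (A(x, v_{i-1}, v_i) * A(x, v_i, v_{i+1})),
    normalized so the functions form a partition of unity."""
    n = len(verts)
    w = np.empty(n)
    for i in range(n):
        vm, vi, vp = verts[i - 1], verts[i], verts[(i + 1) % n]
        w[i] = tri_area(vm, vi, vp) / (tri_area(x, vm, vi) * tri_area(x, vi, vp))
    return w / w.sum()

# Example: unit square; at the centre every vertex weight is 0.25.
square = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
print(wachspress(square, np.array([0.5, 0.5])))
```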

  20. [Significance of three-dimensional reconstruction as a method of preoperative planning of laparoscopic radiofrequency ablation].

    Science.gov (United States)

    Zhang, W W; Wang, H G; Shi, X J; Chen, M Y; Lu, S C

    2016-09-01

    To discuss the significance of three-dimensional reconstruction as a method of preoperative planning of laparoscopic radiofrequency ablation (LRFA). Thirty-two LRFA cases admitted from January 2014 to December 2015 in the Department of Hepatobiliary Surgery, Chinese People's Liberation Army General Hospital were analyzed (3D-LRFA group). Three-dimensional (3D) reconstruction was used as a method of preoperative planning in the 3D-LRFA group. Another 64 LRFA cases from the same period, without three-dimensional reconstruction before the operation, were paired (LRFA group). Hepatobiliary contrast-enhanced CT scans of 3D-LRFA patients were acquired by multi-slice spiral computed tomography (MSCT), and the DICOM data were processed by IQQA(®)-Liver and IQQA(®)-guide for 3D reconstruction. Using the 3D reconstruction model, the diameter and scope of the tumor were measured; a suitable size (length and radiofrequency length) and number of RFA electrodes were chosen; the scope and effect of radiofrequency were simulated; reasonable needle track(s) were planned; and the position and angle of the laparoscopic ultrasound (LUS) probe were designed and the LUS image simulated. Data on operation and recovery were collected and analyzed. Measurement data between the two groups were compared with the t test or rank sum test, and count data with the χ(2) test or Fisher exact probability test. Tumor recurrence rate was analyzed with the Kaplan-Meier survival curve and Log-rank (Mantel-Cox) test. Compared with the LRFA group ((216.8±66.2) minutes, (389.1±183.4) s), the 3D-LRFA group ((173.3±59.4) minutes, (242.2±90.8) s) had a shorter operation time (t=-3.138, P=0.002) and a shorter mean puncture time (t=-2.340, P=0.021). There was no significant difference in blood loss (P=0.170), ablation rate (P=0.871) or incidence of complications (P=1.000). Compared with the LRFA group ((6.3±3.9) days, (330±102) U/L, (167±64) ng/L), the 3D-LRFA group ((4.3±3.1) days, (285±102) U/L, (139±43) ng/L) had a shorter post-operative stay (t=-2.527, P=0.016), less

  1. What if there were no significance tests?

    CERN Document Server

    Harlow, Lisa L; Steiger, James H

    2013-01-01

    This book is the result of a spirited debate stimulated by a recent meeting of the Society of Multivariate Experimental Psychology. Although the viewpoints span a range of perspectives, the overriding theme that emerges states that significance testing may still be useful if supplemented with some or all of the following -- Bayesian logic, caution, confidence intervals, effect sizes and power, other goodness of approximation measures, replication and meta-analysis, sound reasoning, and theory appraisal and corroboration. The book is organized into five general areas. The first presents an overview of significance testing issues that synthesizes the highlights of the remainder of the book. The next discusses the debate over whether significance testing should be rejected or retained. The third outlines various methods that may supplement current significance testing procedures. The fourth discusses Bayesian approaches and methods and the use of confidence intervals versus significance tests. The last presents the p...

  2. 40 CFR 52.2581 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... quality. 52.2581 Section 52.2581 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Significant deterioration of air quality. (a)-(c) [Reserved] (d) The requirements of sections 160 through 165... provisions for prevention of significant deterioration of air quality at 40 CFR 52.21 are applicable to the...

  3. A laser sheet self-calibration method for scanning PIV

    Science.gov (United States)

    Knutsen, Anna N.; Lawson, John M.; Dawson, James R.; Worth, Nicholas A.

    2017-10-01

    Knowledge of laser sheet position, orientation, and thickness is a fundamental requirement of scanning PIV and other laser-scanning methods. This paper describes the development and evaluation of a new laser sheet self-calibration method for stereoscopic scanning PIV, which allows the measurement of these properties from particle images themselves. The approach is to fit a laser sheet model by treating particles as randomly distributed probes of the laser sheet profile, whose position is obtained via a triangulation procedure enhanced by matching particle images according to their variation in brightness over a scan. Numerical simulations and tests with experimental data were used to quantify the sensitivity of the method to typical experimental error sources and validate its performance in practice. The numerical simulations demonstrate the accurate recovery of the laser sheet parameters over a range of different seeding densities and sheet thicknesses. Furthermore, they show that the method is robust to significant image noise and camera misalignment. Tests with experimental data confirm that the laser sheet model can be accurately reconstructed with no impairment to PIV measurement accuracy. The new method is more efficient and robust in comparison with the standard (self-) calibration approach, which requires an involved, separate calibration step that is sensitive to experimental misalignments. The method significantly improves the practicality of making accurate scanning PIV measurements and broadens its potential applicability to scanning systems with significant vibrations.
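
    A minimal sketch of the kind of profile fit the method builds on: treating triangulated particle positions as probes of the sheet, one can fit a Gaussian intensity profile to brightness-versus-position samples. The Gaussian model, the synthetic data, and the thickness convention (4 sigma) are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def sheet_profile(z, I0, z0, sigma):
    """Gaussian light-sheet intensity profile (a common model assumption)."""
    return I0 * np.exp(-0.5 * ((z - z0) / sigma) ** 2)

# Hypothetical samples: out-of-plane particle positions z (from
# triangulation) and matched particle brightnesses I at one scan step.
rng = np.random.default_rng(0)
z = np.linspace(-1.0, 1.0, 200)                      # mm, illustrative
I = sheet_profile(z, 1.0, 0.1, 0.3) + 0.02 * rng.standard_normal(z.size)

popt, _ = curve_fit(sheet_profile, z, I, p0=(I.max(), 0.0, 0.5))
I0, z0, sigma = popt
print(f"centre = {z0:.3f} mm, thickness (4*sigma) = {4 * sigma:.3f} mm")
```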

  4. Aeronautical requirements for Inconel 718 alloy

    Science.gov (United States)

    Elefterie, C. F.; Guragata, C.; Bran, D.; Ghiban, B.

    2017-06-01

    The project goal is to present the requirements imposed on aviation components made from nickel-based superalloys. A significant portion of fasteners, locking lugs, blade retainers and inserts are manufactured from Alloy 718. The thesis describes environmental factors (corrosion), conditions of external aggression (salt air, intense heat, heavy industrial pollution, high condensation, high pressure), mechanical characteristics (tensile strength, yield strength and fatigue resistance) and loadings (tension, compression loads) that must be satisfied simultaneously by the Ni-based superalloy, compared with other classes of aviation alloys (e.g., titanium alloys, aluminum alloys). For this alloy the requirements are strength durability, damage tolerance, fail safety and so on. Corrosion can be an issue, but fatigue under high-magnitude cyclic tensile loading is what limits the lifetime of the airframe. Also, the excellent malleability and weldability of the 718 system make the material tolerant of manufacturing processes. These characteristics additionally continue to provide new opportunities for advanced manufacturing methods.

  5. Front Loaded Accurate Requirements Engineering (FLARE): A Requirements Analysis Concept for the 21st Century

    National Research Council Canada - National Science Library

    Leonard, Anthony

    1997-01-01

    This thesis focuses on ways to apply requirements engineering techniques and methods during the development and evolution of DoD software systems in an effort to reduce changes to system requirements...

  6. Identified adjustability dimensions when generating a product specific requirements specification by requirements reuse

    DEFF Research Database (Denmark)

    Hauksdóttir, Dagný; Mortensen, Niels Henrik; Nielsen, Poul Erik

    2014-01-01

    An extensive state of the art is included to introduce the presented methods related to each adjustability dimension. The options for implementing each adjustability dimension in a requirements reuse approach are illustrated, along with a discussion regarding the benefits and issues resulting from each option. This discussion should help practitioners to better understand the possible methods that can be implemented and to design a user-friendly and sustainable approach. A case study, describing how the dimensions are incorporated in two requirements reuse approaches, for Danfoss Solar Inverters (SI) and Danfoss

  7. 40 CFR 141.723 - Requirements to respond to significant deficiencies identified in sanitary surveys performed by EPA.

    Science.gov (United States)

    2010-07-01

    ... deficiencies identified in sanitary surveys performed by EPA. 141.723 Section 141.723 Protection of Environment... performed by EPA, systems must respond in writing to significant deficiencies identified in sanitary survey... will address significant deficiencies noted in the survey. (d) Systems must correct significant...

  8. Significant factors for enabling knowledge sharing between government agencies within South Africa

    Directory of Open Access Journals (Sweden)

    Avain Mannie

    2013-10-01

    Objectives: This study aimed to validate the significant factors that influence the effectiveness of KM between government agencies in South Africa. The commonly identified pillars of KM in the extant literature served as a primary framework in establishing these factors. Method: Data were gathered using an electronic survey made available to different national government agencies within the security cluster. Responses were analysed using structural equation modelling. Main findings: Existing literature highlighted organisational culture, learning organisation, collaboration, subject matter experts and trust as being determinants for knowledge management. The first two were identified as the most significant factors for knowledge sharing to succeed. Conclusion: Whilst there is universal consent as to the strategic importance of KM, actionable implementation of knowledge sharing initiatives appears to be lacking. This study emphasised the fact that leaders must instil a knowledge sharing culture either through employee performance contracts or methods such as the balanced score card. The study also showed that it is imperative for leaders to acknowledge that KM is a multi-faceted discipline that offers strategic advantages. Leaders of developing countries should note that they are on a developmental journey. This requires their organisations to be learning organisations, which necessitates a change in the organisational culture and knowledge interventions through their academies of learning.

  9. The commission errors search and assessment (CESA) method

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B.; Dang, V. N

    2007-05-15

    Errors of Commission (EOCs) refer to the performance of inappropriate actions that aggravate a situation. In Probabilistic Safety Assessment (PSA) terms, they are human failure events that result from the performance of an action. This report presents the Commission Errors Search and Assessment (CESA) method and describes the method in the form of user guidance. The purpose of the method is to identify risk-significant situations with a potential for EOCs in a predictive analysis. The main idea underlying the CESA method is to catalog the key actions that are required in the procedural response to plant events and to identify specific scenarios in which these candidate actions could erroneously appear to be required. The catalog of required actions provides a basis for a systematic search of context-action combinations. To focus the search towards risk-significant scenarios, the actions that are examined in the CESA search are prioritized according to the importance of the systems and functions that are affected by these actions. The existing PSA provides this importance information; the Risk Achievement Worth or Risk Increase Factor values indicate the systems/functions for which an EOC contribution would be more significant. In addition, the contexts, i.e. PSA scenarios, for which the EOC opportunities are reviewed are also prioritized according to their importance (top sequences or cut sets). The search through these context-action combinations results in a set of EOC situations to be examined in detail. CESA has been applied in a plant-specific pilot study, which showed the method to be feasible and effective in identifying plausible EOC opportunities. This experience, as well as the experience with other EOC analyses, showed that the quantification of EOCs remains an issue. The quantification difficulties and the outlook for their resolution conclude the report. (author)
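
    A toy illustration of the prioritisation step: context-action combinations can be ranked by the PSA importance (e.g., Risk Achievement Worth) of the affected system, weighted by scenario importance. All names and values below are invented; the actual CESA screening is richer than this.

```python
# Rank context-action combinations in the spirit of the CESA search:
# actions by the RAW of the system they affect, contexts by scenario
# importance. The entries are hypothetical, for illustration only.
actions = {"isolate feedwater": 12.0, "trip RCPs": 3.5, "close MSIVs": 7.2}
contexts = {"loss of main feedwater": 0.9, "steam line break": 0.4}

pairs = [(ctx, act, w * raw)
         for ctx, w in contexts.items()
         for act, raw in actions.items()]
for ctx, act, score in sorted(pairs, key=lambda p: p[2], reverse=True):
    print(f"{ctx:<24} {act:<18} priority={score:5.1f}")
```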

  10. The commission errors search and assessment (CESA) method

    International Nuclear Information System (INIS)

    Reer, B.; Dang, V. N.

    2007-05-01

    Errors of Commission (EOCs) refer to the performance of inappropriate actions that aggravate a situation. In Probabilistic Safety Assessment (PSA) terms, they are human failure events that result from the performance of an action. This report presents the Commission Errors Search and Assessment (CESA) method and describes the method in the form of user guidance. The purpose of the method is to identify risk-significant situations with a potential for EOCs in a predictive analysis. The main idea underlying the CESA method is to catalog the key actions that are required in the procedural response to plant events and to identify specific scenarios in which these candidate actions could erroneously appear to be required. The catalog of required actions provides a basis for a systematic search of context-action combinations. To focus the search towards risk-significant scenarios, the actions that are examined in the CESA search are prioritized according to the importance of the systems and functions that are affected by these actions. The existing PSA provides this importance information; the Risk Achievement Worth or Risk Increase Factor values indicate the systems/functions for which an EOC contribution would be more significant. In addition, the contexts, i.e. PSA scenarios, for which the EOC opportunities are reviewed are also prioritized according to their importance (top sequences or cut sets). The search through these context-action combinations results in a set of EOC situations to be examined in detail. CESA has been applied in a plant-specific pilot study, which showed the method to be feasible and effective in identifying plausible EOC opportunities. This experience, as well as the experience with other EOC analyses, showed that the quantification of EOCs remains an issue. The quantification difficulties and the outlook for their resolution conclude the report. (author)

  11. Hanford Site groundwater monitoring: Setting, sources and methods

    International Nuclear Information System (INIS)

    Hartman, M.J.

    2000-01-01

    Groundwater monitoring is conducted on the Hanford Site to meet the requirements of the Resource Conservation and Recovery Act of 1976 (RCRA); Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA); U.S. Department of Energy (DOE) orders; and the Washington Administrative Code. Results of monitoring are published annually (e.g., PNNL-11989). To reduce the redundancy of these annual reports, background information that does not change significantly from year to year has been extracted from the annual report and published in this companion volume. This report includes a description of groundwater monitoring requirements, site hydrogeology, and waste sites that have affected groundwater quality or that require groundwater monitoring. Monitoring networks and methods for sampling, analysis, and interpretation are summarized. Vadose zone monitoring methods and statistical methods also are described. Whenever necessary, updates to information contained in this document will be published in future groundwater annual reports

  12. Hanford Site groundwater monitoring: Setting, sources and methods

    Energy Technology Data Exchange (ETDEWEB)

    M.J. Hartman

    2000-04-11

    Groundwater monitoring is conducted on the Hanford Site to meet the requirements of the Resource Conservation and Recovery Act of 1976 (RCRA); Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA); U.S. Department of Energy (DOE) orders; and the Washington Administrative Code. Results of monitoring are published annually (e.g., PNNL-11989). To reduce the redundancy of these annual reports, background information that does not change significantly from year to year has been extracted from the annual report and published in this companion volume. This report includes a description of groundwater monitoring requirements, site hydrogeology, and waste sites that have affected groundwater quality or that require groundwater monitoring. Monitoring networks and methods for sampling, analysis, and interpretation are summarized. Vadose zone monitoring methods and statistical methods also are described. Whenever necessary, updates to information contained in this document will be published in future groundwater annual reports.

  13. CASE METHOD: AN ACTIVE LEARNING METHODOLOGY TO ACQUIRE SIGNIFICANT LEARNING IN CHEMISTRY

    Directory of Open Access Journals (Sweden)

    Clotilde Pizarro

    2015-09-01

    Full Text Available In this paper the case methodology is applied to first-year students of the Engineering in Risk Prevention and Environment programme. For this purpose, a real case of contamination that occurred at a school called "La Greda" in the Valparaíso region is presented. The activity starts by delivering an extract of the information collected from the media, together with a brief induction on the methodology to be applied. A plenary session is then held, in which possible solutions to the problem are debated and a relationship between the case and the chemistry program is established. It is concluded that the application of the case method was a fruitful tool in the results obtained by students, since the approval rate was 75%, considerably higher than in previous years.

  14. Safety significance evaluation system

    International Nuclear Information System (INIS)

    Lew, B.S.; Yee, D.; Brewer, W.K.; Quattro, P.J.; Kirby, K.D.

    1991-01-01

    This paper reports that the Pacific Gas and Electric Company (PG and E), in cooperation with ABZ, Incorporated and Science Applications International Corporation (SAIC), investigated the use of artificial intelligence-based programming techniques to assist utility personnel in regulatory compliance problems. The result of this investigation is that artificial intelligence-based programming techniques can successfully be applied to this problem. To demonstrate this, a general methodology was developed and several prototype systems based on it were built. The prototypes address U.S. Nuclear Regulatory Commission (NRC) event reportability requirements, technical specification compliance based on plant equipment status, and quality assurance assistance. This collection of prototype modules is named the safety significance evaluation system.

  15. Maneuver Planning for Conjunction Risk Mitigation with Ground-track Control Requirements

    Science.gov (United States)

    McKinley, David

    2008-01-01

    The planning of conjunction Risk Mitigation Maneuvers (RMM) in the presence of ground-track control requirements is analyzed. Past RMM planning efforts on the Aqua, Aura, and Terra spacecraft have demonstrated that only small maneuvers are available when ground-track control requirements are maintained. Assuming small maneuvers, analytical expressions for the effect of a given maneuver on conjunction geometry are derived. The analytical expressions are used to generate a large trade space for initial RMM design. This trade space represents a significant improvement in initial maneuver planning over existing methods that employ high fidelity maneuver models and propagation.

  16. IRET: requirements for service platforms

    OpenAIRE

    Baresi, Luciano; Ripa, Gianluca; Pasquale, Liliana

    2013-01-01

    This paper describes IRENE (Indenica Requirements ElicitatioN mEthod), a methodology to elicit and model the requirements of service platforms, and IRET (IREne Tool), the Eclipse-based modeling framework we developed for IRENE.

  17. Improving the requirements process in Axiomatic Design Theory

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn

    2013-01-01

    This paper introduces a model to integrate the traditional requirements process into Axiomatic Design Theory and proposes a method to structure the requirements process. The method includes a requirements classification system to ensure that all requirements information can be included in the Axiomatic Design process, a stakeholder classification system to reduce the chances of excluding one or more key stakeholders, and a table to visualize the mapping between the stakeholders and their requirements.

  18. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and to verify their applicability to early design stages. Several methods were evaluated for their support of synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  19. Rapid quality assurance with requirements smells

    OpenAIRE

    Femmer, Henning; Méndez Fernández, Daniel; Wagner, Stefan; Eder, Sebastian

    2016-01-01

    Context: Bad requirements quality can cause expensive consequences during the software development lifecycle, especially if iterations are long and feedback comes late. Objectives: We aim at a light-weight static requirements analysis approach that allows for rapid checks immediately when requirements are written down. Method: We transfer the concept of code smells to Requirements Engineering as Requirements Smells. To evaluate the benefits and limitations, we define Requirements Smells, real...

  20. Medicinal Chemistry Projects Requiring Imaginative Structure-Based Drug Design Methods.

    Science.gov (United States)

    Moitessier, Nicolas; Pottel, Joshua; Therrien, Eric; Englebienne, Pablo; Liu, Zhaomin; Tomberg, Anna; Corbeil, Christopher R

    2016-09-20

    Computational methods for docking small molecules to proteins are prominent in drug discovery. There are hundreds, if not thousands, of documented examples, and several pertinent cases within our research program. Fifteen years ago, our first docking-guided drug design project yielded nanomolar metalloproteinase inhibitors and illustrated the potential of structure-based drug design. Subsequent applications of docking programs to the design of integrin antagonists, BACE-1 inhibitors, and aminoglycosides binding to bacterial RNA demonstrated that available docking programs needed significant improvement. At that time, docking programs primarily considered flexible ligands and rigid proteins. We demonstrated that accounting for protein flexibility, employing displaceable water molecules, and using ligand-based pharmacophores improved the docking accuracy of existing methods, enabling the design of bioactive molecules. The success prompted the development of our own program, Fitted, implementing all of these aspects. The primary motivation has always been to respond to the needs of drug design studies; the majority of the concepts behind the evolution of Fitted are rooted in medicinal chemistry projects and collaborations. Several examples follow: (1) Searching for HDAC inhibitors led us to develop methods considering drug-zinc coordination and its effect on the pKa of surrounding residues. (2) Targeting covalent prolyl oligopeptidase (POP) inhibitors prompted an update to Fitted to identify reactive groups and form bonds with a given residue (e.g., a catalytic residue) when the geometry allows it. Fitted, the first fully automated covalent docking program, was successfully applied to the discovery of four new classes of covalent POP inhibitors. As a result, efficient stereoselective syntheses of a few screening hits were prioritized rather than synthesizing large chemical libraries, yielding nanomolar inhibitors. (3) In order to study the metabolism of POP inhibitors by

  1. Correction of the significance level when attempting multiple transformations of an explanatory variable in generalized linear models

    Science.gov (United States)

    2013-01-01

    Background In statistical modeling, finding the most favorable coding for an exploratory quantitative variable involves many tests. This process involves multiple testing problems and requires the correction of the significance level. Methods For each coding, a test on the nullity of the coefficient associated with the new coded variable is computed. The selected coding corresponds to that associated with the largest statistical test (or equivalently the smallest p value). In the context of the Generalized Linear Model, Liquet and Commenges (Stat Probability Lett, 71:33–38, 2005) proposed an asymptotic correction of the significance level. This procedure, based on the score test, has been developed for dichotomous and Box-Cox transformations. In this paper, we suggest the use of resampling methods to estimate the significance level for categorical transformations with more than two levels and, by definition, those that involve more than one parameter in the model. The categorical transformation is a more flexible way to explore the unknown shape of the effect between an explanatory and a dependent variable. Results The simulations we ran in this study showed good performances of the proposed methods. These methods were illustrated using the data from a study of the relationship between cholesterol and dementia. Conclusion The algorithms were implemented using R, and the associated CPMCGLM R package is available on the CRAN. PMID:23758852
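
    The paper's algorithms are implemented in R (the CPMCGLM package); purely as an illustration of the resampling idea, the Python sketch below estimates the corrected per-coding level by permuting the outcome and recording the minimum p value over candidate codings. A two-sample t test stands in for the score test used in the paper.

```python
import numpy as np
from scipy import stats

def min_p_over_codings(x, y, cuts):
    """Smallest p value over candidate dichotomous codings of x; each
    cut point defines a 0/1 coded variable, tested here against a
    continuous outcome y with a two-sample t test (a stand-in for the
    paper's score test). Cuts must leave both groups non-empty."""
    return min(stats.ttest_ind(y[x <= c], y[x > c]).pvalue for c in cuts)

def corrected_alpha(x, y, cuts, alpha=0.05, n_perm=2000, seed=None):
    """Resampling estimate of the per-coding level that keeps the
    overall type-I error at `alpha` despite choosing the best coding."""
    rng = np.random.default_rng(seed)
    null_min_p = np.array([min_p_over_codings(x, rng.permutation(y), cuts)
                           for _ in range(n_perm)])
    return float(np.quantile(null_min_p, alpha))
```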

  2. SRS Process Facility Significance Fire Frequency

    Energy Technology Data Exchange (ETDEWEB)

    Sarrack, A.G. [Westinghouse Savannah River Company, AIKEN, SC (United States)

    1995-10-01

    This report documents the method and assumptions of a study performed to determine a site generic process facility significant fire initiator frequency and explains the proper way this value should be used.

  3. SRS Process Facility Significance Fire Frequency

    International Nuclear Information System (INIS)

    Sarrack, A.G.

    1995-10-01

    This report documents the method and assumptions of a study performed to determine a site generic process facility significant fire initiator frequency and explains the proper way this value should be used

  4. Grading of quality assurance requirements

    International Nuclear Information System (INIS)

    1991-01-01

    The present Manual provides guidance and illustrative examples for applying a method by which graded quality assurance requirements may be determined and adapted to the items and services of a nuclear power plant in conformance with the requirements of the IAEA Nuclear Safety Standards (NUSS) Code and Safety Guides on quality assurance. The Manual replaces the previous publication IAEA-TECDOC-303 on the same subject. Various methods of grading quality assurance are available in a number of Member States. During the development of the present Manual it was not considered practical to attempt to resolve the differences between those methods and it was preferred to identify and benefit from the good practices available in all the methods. The method presented in this Manual deals with the aspects of management, documentation, control, verification and administration which affect quality. 1 fig., 4 tabs

  5. Trading Robustness Requirements in Mars Entry Trajectory Design

    Science.gov (United States)

    Lafleur, Jarret M.

    2009-01-01

    One of the most important metrics characterizing an atmospheric entry trajectory in preliminary design is the size of its predicted landing ellipse. Often, requirements for this ellipse are set early in design and significantly influence both the expected scientific return from a particular mission and the cost of development. Requirements typically specify a certain probability level (σ-level) for the prescribed ellipse, and frequently this latter requirement is taken at 3σ. However, searches for the justification of 3σ as a robustness requirement suggest it is an empirical rule of thumb borrowed from non-aerospace fields. This paper presents an investigation into the sensitivity of trajectory performance to varying robustness (σ-level) requirements. The treatment of robustness as a distinct objective is discussed, and an analysis framework is presented involving the manipulation of design variables to effect trades between performance and robustness objectives. The scenario for which this method is illustrated is the ballistic entry of an MSL-class Mars entry vehicle. Here, the design variable is entry flight path angle, and the objectives are parachute deploy altitude performance and error ellipse robustness. Resulting plots show the sensitivities between these objectives and trends in the entry flight path angles required to design to these objectives. Relevance to the trajectory designer is discussed, as are potential steps for further development and use of this type of analysis.

  6. A framework for regulatory requirements and industry standards for new nuclear power plants

    International Nuclear Information System (INIS)

    Duran, Felicia A.; Camp, Allen L.; Apostolakis, George E.; Golay, Michael W.

    2000-01-01

    This paper summarizes the development of a framework for risk-based regulation and design for new nuclear power plants. Probabilistic risk assessment methods and a rationalist approach to defense in depth are used to develop a framework that can be applied to identify systematically the regulations and standards required to maintain the desired level of safety and reliability. By implementing such a framework, it is expected that the resulting body of requirements will provide a regulatory environment that will ensure protection of the public, will eliminate the burden of requirements that do not contribute significantly to safety, and thereby will improve the market competitiveness of new plants. (author)

  7. Planning for Site Transition to Long-Term Stewardship: Identification of Requirements and Issues

    International Nuclear Information System (INIS)

    Banaee, J.

    2002-01-01

    A systematic methodology is presented and applied for the identification of requirements and issues pertaining to the planning for, and transition to, long term stewardship (LTS). The method has been applied to three of the twelve identified LTS functions. The results of the application of the methodology to contaminated and uncontaminated federal real property in those three functions are presented. The issues that could be seen as impediments to the implementation of LTS are also identified for the three areas under consideration. The identified requirements are significant and in some cases complex to implement. It is clear that early and careful planning is required in all circumstances

  8. Planning for Site Transition to Long-Term Stewardship: Identification of Requirements and Issues

    Energy Technology Data Exchange (ETDEWEB)

    Banaee, J.

    2002-05-16

    A systematic methodology is presented and applied for the identification of requirements and issues pertaining to the planning for, and transition to, long term stewardship (LTS). The method has been applied to three of the twelve identified LTS functions. The results of the application of the methodology to contaminated and uncontaminated federal real property in those three functions are presented. The issues that could be seen as impediments to the implementation of LTS are also identified for the three areas under consideration. The identified requirements are significant and in some cases complex to implement. It is clear that early and careful planning is required in all circumstances.

  9. Planning for Site Transition to Long-Term Stewardship: Identification of Requirements and Issues

    Energy Technology Data Exchange (ETDEWEB)

    Banaee, Jila

    2002-08-01

    A systematic methodology is presented and applied for the identification of requirements and issues pertaining to the planning for, and transition to, long term stewardship (LTS). The method has been applied to three of the twelve identified LTS functions. The results of the application of the methodology to contaminated and uncontaminated federal real property in those three functions are presented. The issues that could be seen as impediments to the implementation of LTS are also identified for the three areas under consideration. The identified requirements are significant and in some cases complex to implement. It is clear that early and careful planning is required in all circumstances.

  10. Requirements Engineering for Pervasive Services

    NARCIS (Netherlands)

    Kolos, L.; Poulisse, Gert-Jan; van Eck, Pascal; Videira lopes, C.; Schaefer, S.; Clarke, S.; Elrad, T.; Jahnke, J.

    2005-01-01

    Developing pervasive mobile services for a mass market of end customers entails large up-front investments, and therefore a good understanding of customer requirements is of paramount importance. This paper presents an approach for developing a requirements engineering method that takes distinguishing

  11. 2 CFR 801.437 - What method do I use to communicate to a participant the requirements described in the OMB...

    Science.gov (United States)

    2010-01-01

    ... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false What method do I use to communicate to a participant the requirements described in the OMB guidance at 2 CFR 180.435? 801.437 Section 801.437 Grants... NONPROCUREMENT DEBARMENT AND SUSPENSION Responsibilities of Federal Agency Officials Regarding Transactions § 801...

  12. 40 CFR 52.21 - Prevention of significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ..., and charcoal production plants; (b) Notwithstanding the stationary source size specified in paragraph... particular change begins. (c) It has approximately the same qualitative significance for public health and... be required to meet the data acquisition and availability requirements of this section, to sample...

  13. Understand the Design Requirement in Companies

    DEFF Research Database (Denmark)

    Li, Xuemeng; Ahmed-Kristensen, Saeema

    2015-01-01

    requirements can lead to inappropriate products (Hall, et al., 2002). Understanding the nature of design requirements, and the sources from which they can or should be generated, is critical before developing methods and processes to support this process. Requirement Engineering research, originated from

  14. 49 CFR 383.111 - Required knowledge.

    Science.gov (United States)

    2010-10-01

    ... importance of proper visual search, and proper visual search methods. (6) Communication. The principles and procedures for proper communications and the hazards of failure to signal properly. (7) Speed management. The... STANDARDS; REQUIREMENTS AND PENALTIES Required Knowledge and Skills § 383.111 Required knowledge. All...

  15. Prevalence and clinical significance of extravascular incidental findings in patients undergoing CT cervico-cerebral angiography

    International Nuclear Information System (INIS)

    Crockett, Matthew Thomas; Murphy, Blathnaid; Smith, Jennifer; Kavanagh, Eoin Carl

    2015-01-01

    required acute intervention. In addition, 19% of patients with highly clinically significant incidental findings did not receive appropriate follow up. Discussion: This study has demonstrated the presence of clinically important incidental findings in a significant proportion of patients undergoing CTCCA, with a significant minority of these patients not receiving follow up. A standardised method of reporting incidental findings, such as that used in this paper, would aid radiologists and referring physicians in recording and communicating these findings.

  16. Nuclear methods for tribology

    International Nuclear Information System (INIS)

    Racolta, P.M.

    1994-01-01

    The tribological field of activity is mainly concerned with the relative movement of different machine components, friction and wear phenomena, and their dependence upon lubrication. Tribological studies on friction and wear processes are important because they lead to significant parameter improvements of engineering tools and machinery components. A review of fundamental aspects of both friction and wear phenomena is presented. A number of radioindicator-based methods have been known for almost four decades, differing mainly with respect to the mode of introducing the radioindicators into the machine part to be studied. All these methods, briefly presented in this paper, are based on the measurement of the activity of wear products and therefore require high activity levels of the part. For this reason, such determinations can be carried out only in special laboratories and under conditions which do not usually agree with the conditions of actual use. What is required is a sensitive, fast method allowing the determination of wear under any operating conditions, without the necessity of stopping and disassembling the machine. The above-mentioned requirements are the features that have made the Thin Layer Activation technique (TLA) the most widely used method applied in wear and corrosion studies in the last two decades. The TLA principle, taking into account that wear and corrosion processes are characterised by a loss of material, consists of ion beam irradiation of a well defined volume of a machine part subjected to wear. The radioactivity level changes can usually be measured by gamma-ray spectroscopy methods. A review of both the main TLA fields of application in major laboratories abroad and of the studies performed at the U-120 cyclotron of I.P.N.E.-Bucharest, together with the existing trends to extend other nuclear analytical methods to tribological studies, is presented as well. (author). 25 refs., 6 figs., 2 tabs
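
    A minimal sketch of how TLA wear readings are typically converted to a wear depth: given a calibration of residual activity fraction versus removed depth (from the known irradiation profile), the measured fraction is interpolated back to a depth. The calibration numbers below are invented for illustration.

```python
import numpy as np

# Illustrative calibration: residual activity fraction remaining after
# removing a surface layer of the given depth (hypothetical values that
# would come from the known activity-versus-depth irradiation profile).
depth = np.array([0.0, 5.0, 10.0, 20.0, 40.0])    # micrometres
act_frac = np.array([1.00, 0.93, 0.84, 0.64, 0.30])

def wear_depth(measured_fraction):
    """Interpolate worn depth from the measured residual activity
    fraction (np.interp needs ascending x, hence the reversal)."""
    return float(np.interp(measured_fraction, act_frac[::-1], depth[::-1]))

print(wear_depth(0.90))  # about 6.7 micrometres of wear in this example
```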

  17. Development of modelling method selection tool for health services management: from problem structuring methods to modelling and simulation methods.

    Science.gov (United States)

    Jun, Gyuchan T; Morris, Zoe; Eldabi, Tillal; Harper, Paul; Naseer, Aisha; Patel, Brijesh; Clarkson, John P

    2011-05-19

    There is an increasing recognition that modelling and simulation can assist in the process of designing health care policies, strategies and operations. However, the current use is limited and answers to questions such as what methods to use and when remain somewhat underdeveloped. The aim of this study is to provide a mechanism for decision makers in health services planning and management to compare a broad range of modelling and simulation methods so that they can better select and use them or better commission relevant modelling and simulation work. This paper proposes a modelling and simulation method comparison and selection tool developed from a comprehensive literature review, the research team's extensive expertise and inputs from potential users. Twenty-eight different methods were identified, characterised by their relevance to different application areas, project life cycle stages, types of output and levels of insight, and four input resources required (time, money, knowledge and data). The characterisation is presented in matrix forms to allow quick comparison and selection. This paper also highlights significant knowledge gaps in the existing literature when assessing the applicability of particular approaches to health services management, where modelling and simulation skills are scarce, let alone money and time. A modelling and simulation method comparison and selection tool is developed to assist with the selection of methods appropriate to supporting specific decision making processes. In particular it addresses the issue of which method is most appropriate to which specific health services management problem, what the user might expect to be obtained from the method, and what is required to use the method. In summary, we believe the tool adds value to the scarce existing literature on methods comparison and selection.
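
    As a toy version of the matrix-based selection described above, one can filter candidate methods by the input resources actually available. The resource scores below are invented; the paper's full matrix also covers application areas, life cycle stages and output types.

```python
# Filter candidate methods by available resources (1=low..3=high).
# Method names are plausible for this domain; the scores are invented.
methods = {
    "discrete-event simulation": {"time": 3, "money": 3, "knowledge": 3, "data": 3},
    "system dynamics":           {"time": 2, "money": 2, "knowledge": 2, "data": 1},
    "soft systems methodology":  {"time": 2, "money": 1, "knowledge": 2, "data": 1},
    "process mapping":           {"time": 1, "money": 1, "knowledge": 1, "data": 1},
}
available = {"time": 2, "money": 2, "knowledge": 2, "data": 2}

feasible = [name for name, need in methods.items()
            if all(need[r] <= available[r] for r in available)]
print(feasible)  # methods whose input requirements fit the resources
```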

  18. Clinical significance of neonatal menstruation.

    Science.gov (United States)

    Brosens, Ivo; Benagiano, Giuseppe

    2016-01-01

    Past studies have clearly shown the existence of a spectrum of endometrial progesterone responses in neonatal endometrium, varying from proliferation to full decidualization with menstrual-like shedding. The bleeding represents, similar to what occurs in adult menstruation, a progesterone withdrawal bleeding. Today, the bleeding is completely neglected and considered an uneventful episode of no clinical significance. Yet clinical studies have linked the risk of bleeding to a series of events indicating fetal distress. The potential link between the progesterone response and major adolescent disorders requires investigation by prospective studies. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  19. Adolescent Immunization Coverage and Implementation of New School Requirements in Michigan, 2010

    Science.gov (United States)

    DeVita, Stefanie F.; Vranesich, Patricia A.; Boulton, Matthew L.

    2014-01-01

    Objectives. We examined the effect of Michigan’s new school rules and vaccine coadministration on time to completion of all the school-required vaccine series, the individual adolescent vaccines newly required for sixth grade in 2010, and initiation of the human papillomavirus (HPV) vaccine series, which was recommended but not required for girls. Methods. Data were derived from the Michigan Care Improvement Registry, a statewide Immunization Information System. We assessed the immunization status of Michigan children enrolled in sixth grade in 2009 or 2010. We used univariable and multivariable Cox regression models to identify significant associations between each factor and school completeness. Results. Enrollment in sixth grade in 2010 and coadministration of adolescent vaccines at the first adolescent visit were significantly associated with completion of the vaccines required for Michigan’s sixth graders. Children enrolled in sixth grade in 2010 had higher coverage with the newly required adolescent vaccines by age 13 years than did sixth graders in 2009, but there was little difference in the rate of HPV vaccine initiation among girls. Conclusions. Education and outreach efforts, particularly regarding the importance and benefits of coadministration of all recommended vaccines in adolescents, should be directed toward health care providers, parents, and adolescents. PMID:24922144
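
    Purely as an illustration of the kind of Cox regression reported above, the sketch below fits a proportional hazards model for time to series completion using the lifelines package; the column names and the tiny data set are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical registry extract: months until the required series is
# completed, a completion indicator, cohort year, and whether the
# adolescent vaccines were coadministered at the first visit.
df = pd.DataFrame({
    "months_to_complete": [6, 14, 9, 24, 11, 18, 8, 20],
    "completed":          [1, 1, 1, 0, 1, 0, 1, 1],
    "cohort_2010":        [1, 0, 1, 0, 1, 0, 1, 0],
    "coadministered":     [1, 0, 1, 0, 0, 1, 1, 0],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_to_complete", event_col="completed")
cph.print_summary()  # hazard ratios for cohort year and coadministration
```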

  20. Bayesian mixture modeling of significant p values: A meta-analytic method to estimate the degree of contamination from H₀.

    Science.gov (United States)

    Gronau, Quentin Frederik; Duizer, Monique; Bakker, Marjan; Wagenmakers, Eric-Jan

    2017-09-01

    Publication bias and questionable research practices have long been known to corrupt the published record. One method to assess the extent of this corruption is to examine the meta-analytic collection of significant p values, the so-called p-curve (Simonsohn, Nelson, & Simmons, 2014a). Inspired by statistical research on false-discovery rates, we propose a Bayesian mixture model analysis of the p-curve. Our mixture model assumes that significant p values arise either from the null hypothesis H₀ (when their distribution is uniform) or from the alternative hypothesis H₁ (when their distribution is accounted for by a simple parametric model). The mixture model estimates the proportion of significant results that originate from H₀, but it also estimates the probability that each specific p value originates from H₀. We apply our model to 2 examples. The first concerns the set of 587 significant p values for all t tests published in the 2007 volumes of Psychonomic Bulletin & Review and the Journal of Experimental Psychology: Learning, Memory, and Cognition; the mixture model reveals that p values higher than about .005 are more likely to stem from H₀ than from H₁. The second example concerns 159 significant p values from studies on social priming and 130 from yoked control studies. The results from the yoked controls confirm the findings from the first example, whereas the results from the social priming studies are difficult to interpret because they are sensitive to the prior specification. To maximize accessibility, we provide a web application that allows researchers to apply the mixture model to any set of significant p values. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
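
    A minimal sketch of such a mixture analysis, assuming a truncated exponential for the H₁ component (the paper's exact parametric choice may differ): an EM loop estimates the H₀ proportion and per-value H₀ probabilities from a set of significant p values.

```python
import numpy as np
from scipy.optimize import brentq

ALPHA = 0.05  # significance threshold; p values lie in (0, ALPHA]

def fit_mixture(p, n_iter=200):
    """EM for p ~ pi0 * Uniform(0, ALPHA) + (1 - pi0) * TruncExp(lam).
    The truncated exponential is an assumed stand-in for the 'simple
    parametric model' under H1. Returns the estimated H0 proportion,
    the H1 rate, and P(H0 | p_i) for each p value."""
    pi0, lam = 0.5, 100.0
    f0 = 1.0 / ALPHA                                   # uniform density
    for _ in range(n_iter):
        f1 = lam * np.exp(-lam * p) / (1.0 - np.exp(-lam * ALPHA))
        r0 = pi0 * f0 / (pi0 * f0 + (1.0 - pi0) * f1)  # E-step
        pi0 = r0.mean()                                # M-step: weight
        m = np.sum((1.0 - r0) * p) / np.sum(1.0 - r0)  # H1 weighted mean
        # rate solving mean(TruncExp(lam)) = m on (0, ALPHA]
        g = lambda L: 1/L - ALPHA*np.exp(-L*ALPHA)/(1 - np.exp(-L*ALPHA)) - m
        lam = brentq(g, 1e-3, 1e4)
    f1 = lam * np.exp(-lam * p) / (1.0 - np.exp(-lam * ALPHA))
    return pi0, lam, pi0 * f0 / (pi0 * f0 + (1.0 - pi0) * f1)

# Example: 70 significant p values drawn from H0, about 30 from H1.
rng = np.random.default_rng(1)
h1 = rng.exponential(1 / 200, 400); h1 = h1[h1 < ALPHA][:30]
pi0_hat, lam_hat, _ = fit_mixture(np.concatenate([rng.uniform(0, ALPHA, 70), h1]))
print(f"estimated proportion from H0: {pi0_hat:.2f}")
```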

  1. Advanced Testing Method for Ground Thermal Conductivity

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xiaobing [ORNL; Clemenzi, Rick [Geothermal Design Center Inc.; Liu, Su [University of Tennessee (UT)

    2017-04-01

    A new method is developed that can quickly and more accurately determine the effective ground thermal conductivity (GTC) based on thermal response test (TRT) results. Ground thermal conductivity is an important parameter for sizing ground heat exchangers (GHEXs) used by geothermal heat pump systems. The conventional GTC test method usually requires a TRT for 48 hours with a very stable electric power supply throughout the entire test. In contrast, the new method reduces the required test time by 40%–60% or more, and it can determine GTC even with an unstable or intermittent power supply. Consequently, it can significantly reduce the cost of GTC testing and increase its use, which will enable optimal design of geothermal heat pump systems. Further, this new method provides more information about the thermal properties of the GHEX and the ground than previous techniques. It can verify the installation quality of GHEXs and has the potential, if developed, to characterize the heterogeneous thermal properties of the ground formation surrounding the GHEXs.
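
    For context, the conventional analysis that the new method improves on is the infinite line-source model: at large time the mean fluid temperature grows as T(t) ≈ (Q / (4πkL)) ln(t) + C, so the slope of a regression against ln(t) yields the conductivity. A sketch of that standard estimate follows; the improved method itself is not disclosed in the abstract.

```python
import numpy as np

def gtc_line_source(t_hours, T_fluid, Q_watts, L_bore_m, t_min_hours=10.0):
    """Conventional infinite-line-source estimate of ground thermal
    conductivity from a TRT: for large time,
    T(t) ~ (Q / (4*pi*k*L)) * ln(t) + const,
    so k = Q / (4*pi*L*slope), with slope from a ln(t) regression."""
    mask = t_hours >= t_min_hours            # discard early-time data
    slope, _ = np.polyfit(np.log(t_hours[mask]), T_fluid[mask], 1)
    return Q_watts / (4.0 * np.pi * L_bore_m * slope)
```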

  2. The JPL functional requirements tool

    Science.gov (United States)

    Giffin, Geoff; Skinner, Judith; Stoller, Richard

    1987-01-01

    Planetary spacecraft are complex vehicles which are built according to many thousands of requirements. Problems encountered in documenting and maintaining these requirements led to the current attempt to reduce or eliminate them with a computer-automated database, the Functional Requirements Tool. The tool, developed at JPL and in use on several JPL projects, is described. The organization and functionality of the Tool, together with an explanation of the data base inputs, their relationships, and use, are presented. Methods of interfacing with external documents, representation of tables and figures, and methods of approval and change processing are discussed. The options available for disseminating information from the Tool are identified. The implementation of the Requirements Tool is outlined, and the operation is summarized. The conclusions drawn from this work are that the Requirements Tool represents a useful addition to the System Engineer's tool kit, that it is not currently available elsewhere, and that a clear development path exists to expand the capabilities of the Tool to serve larger and more complex projects.

  3. Advanced Extravehicular Activity Pressure Garment Requirements Development

    Science.gov (United States)

    Ross, Amy

    2014-01-01

    The NASA Johnson Space Center advanced pressure garment technology development team is addressing requirements development for exploration missions. Lessons learned from the Z-2 high fidelity prototype development have reiterated that clear low-level requirements and verification methods reduce risk to the government, improve efficiency in pressure garment design efforts, and enable the government to be a smart buyer. The expectation is to provide requirements at the specification level that are validated so that their impact on pressure garment design is understood. Additionally, the team will provide defined verification protocols for the requirements. However, in reviewing exploration space suit high-level requirements, there are several gaps in the team's ability to define and verify related lower-level requirements. This paper addresses the efforts in requirement areas such as mobility/fit/comfort and environmental protection (dust, radiation, plasma, secondary impacts) to determine by what methods the requirements can be defined, and the use of those methods for verification. Gaps exist at various stages. In some cases component-level work is underway but no system-level effort has begun; in other cases no effort has been initiated to close the gap. Status of ongoing efforts and potential approaches to open gaps are discussed.

  4. Spectrum analysis on quality requirements consideration in software design documents.

    Science.gov (United States)

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source codes and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are adapted to its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
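
    A toy version of a document "spectrum": counting quality-related terms per quality characteristic gives a vector whose similarity can be compared between a requirements document and a design document. The keyword lists and example texts below are invented; the paper's actual term extraction will differ.

```python
import numpy as np

# Hypothetical keyword lists per quality characteristic.
QUALITY_TERMS = {
    "performance": ["latency", "throughput", "response"],
    "security":    ["encrypt", "authenticate", "audit"],
    "usability":   ["intuitive", "accessible", "help"],
}

def spectrum(text):
    """Normalized count of quality-related terms per characteristic."""
    t = text.lower()
    v = np.array([sum(t.count(w) for w in ws)
                  for ws in QUALITY_TERMS.values()], dtype=float)
    return v / v.sum() if v.sum() else v

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

req_doc = "The system shall encrypt data and keep response latency low."
des_doc = "Module A handles encrypt and audit; caching keeps latency low."
# A low similarity suggests the design under-represents the quality
# concerns stated in the requirements document.
print(cosine(spectrum(req_doc), spectrum(des_doc)))
```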

  5. The finite element method its basis and fundamentals

    CERN Document Server

    Zienkiewicz, Olek C; Zhu, JZ

    2013-01-01

    The Finite Element Method: Its Basis and Fundamentals offers a complete introduction to the basis of the finite element method, covering fundamental theory and worked examples in the detail required for readers to apply the knowledge to their own engineering problems and understand more advanced applications. This edition sees a significant rearrangement of the book's content to enable clearer development of the finite element method, with major new chapters and sections added to cover: Weak forms Variational forms Multi-dimensional field prob

  6. A novel String Banana Template Method for Tracks Reconstruction in High Multiplicity Events with significant Multiple Scattering and its Firmware Implementation

    CERN Document Server

    Kulinich, P; Krylov, V

    2004-01-01

    A novel String Banana Template Method (SBTM) for track reconstruction in difficult conditions is proposed and implemented for off-line analysis of relativistic heavy ion collision events. The main idea of the method is to use features of ensembles of tracks selected by 3-fold coincidence. A two-step track model is used: the first step is averaged over the selected ensemble, and the second is per-event dependent and takes into account Multiple Scattering (MS) for the particular track. SBTM relies on stored templates generated by precise Monte Carlo simulation, so it is more time-efficient in the case of a 2D spectrometer. All data required for track reconstruction in such difficult conditions can be prepared in a convenient format for fast use. Its template-based nature, and the fact that the SBTM track model is actually very close to the hits, imply that it can be implemented in a firmware processor. In this report a block diagram of a firmware-based pre-processor for track reconstruction in a CMS-like Si tracke...

  7. Toward a risk assessment of the spent fuel and high-level nuclear waste disposal system. Risk assessment requirements, literature review, methods evaluation: an interim report

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, L.D.; Hill, D.; Rowe, M.D.; Stern, E.

    1986-04-01

    This report provides background information for a risk assessment of the disposal system for spent nuclear fuel and high-level radioactive waste (HLW). It contains a literature review, a survey of the statutory requirements for risk assessment, and a preliminary evaluation of methods. The literature review outlines the state of knowledge of risk assessment and accident consequence analysis in the nuclear fuel cycle and its applicability to spent fuel and HLW disposal. The survey of statutory requirements determines the extent to which risk assessment may be needed in development of the waste-disposal system. The evaluation of methods reviews and evaluates merits and applicabilities of alternative methods for assessing risks and relates them to the problems of spent fuel and HLW disposal. 99 refs.

  8. Toward a risk assessment of the spent fuel and high-level nuclear waste disposal system. Risk assessment requirements, literature review, methods evaluation: an interim report

    International Nuclear Information System (INIS)

    Hamilton, L.D.; Hill, D.; Rowe, M.D.; Stern, E.

    1986-04-01

    This report provides background information for a risk assessment of the disposal system for spent nuclear fuel and high-level radioactive waste (HLW). It contains a literature review, a survey of the statutory requirements for risk assessment, and a preliminary evaluation of methods. The literature review outlines the state of knowledge of risk assessment and accident consequence analysis in the nuclear fuel cycle and its applicability to spent fuel and HLW disposal. The survey of statutory requirements determines the extent to which risk assessment may be needed in development of the waste-disposal system. The evaluation of methods reviews and evaluates merits and applicabilities of alternative methods for assessing risks and relates them to the problems of spent fuel and HLW disposal. 99 refs

  9. Quantifying the Clinical Significance of Cannabis Withdrawal

    Science.gov (United States)

    Allsop, David J.; Copeland, Jan; Norberg, Melissa M.; Fu, Shanlin; Molnar, Anna; Lewis, John; Budney, Alan J.

    2012-01-01

    Background and Aims Questions over the clinical significance of cannabis withdrawal have hindered its inclusion as a discrete cannabis induced psychiatric condition in the Diagnostic and Statistical Manual of Mental Disorders (DSM IV). This study aims to quantify functional impairment to normal daily activities from cannabis withdrawal, and looks at the factors predicting functional impairment. In addition the study tests the influence of functional impairment from cannabis withdrawal on cannabis use during and after an abstinence attempt. Methods and Results A volunteer sample of 49 non-treatment seeking cannabis users who met DSM-IV criteria for dependence provided daily withdrawal-related functional impairment scores during a one-week baseline phase and two weeks of monitored abstinence from cannabis with a one month follow up. Functional impairment from withdrawal symptoms was strongly associated with symptom severity (p = 0.0001). Participants with more severe cannabis dependence before the abstinence attempt reported greater functional impairment from cannabis withdrawal (p = 0.03). Relapse to cannabis use during the abstinence period was associated with greater functional impairment from a subset of withdrawal symptoms in high dependence users. Higher levels of functional impairment during the abstinence attempt predicted higher levels of cannabis use at one month follow up (p = 0.001). Conclusions Cannabis withdrawal is clinically significant because it is associated with functional impairment to normal daily activities, as well as relapse to cannabis use. Sample size in the relapse group was small and the use of a non-treatment seeking population requires findings to be replicated in clinical samples. Tailoring treatments to target withdrawal symptoms contributing to functional impairment during a quit attempt may improve treatment outcomes. PMID:23049760

  10. The clinical significance of computerized axial tomography (CAT) in consideration of conventional diagnostic methods

    International Nuclear Information System (INIS)

    Huenig, R.

    1976-01-01

    Regarding CAT of the intracranial region, the article reports on a) examination techniques, including the depiction of normal structures, b) the recognizable pathological changes, c) possibilities of contrast enhancement, d) possibilities of follow-up observation, e) limitations of the method, as well as on f) risk/benefit aspects, g) benefit/cost calculations compared with conventional methods, and h) the influence of CAT on the frequency of conventional examination methods. Regarding CAT of the extracranial region, the information available up to the meeting is reported. (orig./LH) [de

  11. Gas Analysis and Control Methods for Thermal Batteries

    Science.gov (United States)

    2013-09-01

    when using highly efficient microporous thermal insulation packages. An easily implemented method of H2 gas removal from vendor thermal batteries is... microporous thermal insulation packages (1, 4, 5) or reduce volume requirements significantly. More rigorous gas control methods combined with...measured from the DCM pressures and known internal volumes of the 3 GHS that were measured using the ideal gas law with a 10-cc internal volume SS

  12. Evaluation of a method to determine the myocardial uptake from 123I-BMIPP myocardial SPECT and its significance

    International Nuclear Information System (INIS)

    Iwase, Mikio; Toriyama, Takayuki; Itou, Masato; Shimao, Ryuichiro; Ikeda, Koshiro; Suzuki, Takeshi; Nobuta, Takaaki; Iida, Akihiko.

    1996-01-01

    We examined methods of calculating the myocardial uptake (TU) of 123I-BMIPP by SPECT, and compared TU to heart function (ejection fraction (EF), cardiac output (CO), cardiac index (CI)) calculated by left ventriculography. Forty-two patients with acute myocardial infarction were classified into 5 groups: within 1 week (I), from 1 to 2 weeks (II), from 2 weeks to 1.5 months (III), from 1.5 to 3 months (IV) and more than 3 months (V) after percutaneous transluminal coronary angioplasty (PTCA). Chest depth (Tw) was calculated by measuring the thoracic absorption rate of 123I. In calculating TU, the myocardial count was calculated from short-axis tomograms, and absorption was then corrected using Tw to calculate each value on the early-phase image (E) and the delay-phase image (D). The influence of lung uptake on the myocardial count was only 1.76%. When TU was compared to heart function, correlations were found in groups I and V; in group V especially, D-TU was significantly correlated with heart function. Among the heart function measures, CI, but not EF or CO, was significantly correlated with TU. It was suggested that the correlation between TU and heart function reflected the infarct condition before PTCA in group I, and individual differences in the recovery of fatty acid metabolism in group V. The significant correlation between D-TU and CI suggests that D-TU reflects heart function and fatty acid metabolism, although TU is influenced by differences in physical status. (author)

  13. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Derivations and Verification of Plans. Volume 1

    Science.gov (United States)

    Johnson, Kenneth L.; White, K, Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques. This recommended procedure would be used as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. This document contains the outcome of the assessment.
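
    For background, acceptance sampling by variables typically reduces to a k-factor decision rule: accept the lot when the sample mean lies at least k sample standard deviations inside the specification limit. The sketch below illustrates only that generic rule; the NESC report derives its actual (n, k) plans from chosen producer's and consumer's risks, and the numbers here are placeholders.

```python
# Hedged sketch of the acceptance-sampling-by-variables idea: accept a lot
# when the sample mean sits at least k standard deviations inside the upper
# specification limit. The k value below is a placeholder; real plans derive
# (n, k) from the specified risk levels.
import statistics

def accept_lot(measurements, upper_spec_limit, k):
    xbar = statistics.mean(measurements)
    s = statistics.stdev(measurements)
    return (upper_spec_limit - xbar) / s >= k

sample = [9.8, 10.1, 9.9, 10.0, 10.2, 9.7, 10.0, 9.9]   # placeholder data
print(accept_lot(sample, upper_spec_limit=11.0, k=1.5))
```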

  14. Theoretical Significance in Q Methodology: A Qualitative Approach to a Mixed Method

    Science.gov (United States)

    Ramlo, Susan

    2015-01-01

    Q methodology (Q) has offered researchers a unique scientific measure of subjectivity since William Stephenson's first article in 1935. Q's focus on subjectivity includes self-referential meaning and interpretation. Q is most often identified with its technique (Q-sort) and its method (factor analysis to group people); yet, it consists of a…

  15. Requirements in engineering projects

    CERN Document Server

    Fernandes, João M

    2016-01-01

    This book focuses on various topics related to engineering and management of requirements, in particular elicitation, negotiation, prioritisation, and documentation (whether with natural languages or with graphical models). The book provides methods and techniques that help to characterise, in a systematic manner, the requirements of the intended engineering system.  It was written with the goal of being adopted as the main text for courses on requirements engineering, or as a strong reference to the topics of requirements in courses with a broader scope. It can also be used in vocational courses, for professionals interested in the software and information systems domain.   Readers who have finished this book will be able to: - establish and plan a requirements engineering process within the development of complex engineering systems; - define and identify the types of relevant requirements in engineering projects; - choose and apply the most appropriate techniques to elicit the requirements of a giv...

  16. [Efficacy of the keyword mnemonic method in adults].

    Science.gov (United States)

    Campos, Alfredo; Pérez-Fabello, María José; Camino, Estefanía

    2010-11-01

    Two experiments were used to assess the efficacy of the keyword mnemonic method in adults. In Experiment 1, immediate and delayed recall (at a one-day interval) were assessed by comparing the results obtained by a group of adults using the keyword mnemonic method in contrast to a group using the repetition method. The mean age of the sample under study was 59.35 years. Subjects were required to learn a list of 16 words translated from Latin into Spanish. Participants who used keyword mnemonics that had been devised by other experimental participants of the same characteristics, obtained significantly higher immediate and delayed recall scores than participants in the repetition method. In Experiment 2, other participants had to learn a list of 24 Latin words translated into Spanish by using the keyword mnemonic method reinforced with pictures. Immediate and delayed recall were significantly greater in the keyword mnemonic method group than in the repetition method group.

  17. System and Method for Multi-Wavelength Optical Signal Detection

    Science.gov (United States)

    McGlone, Thomas D. (Inventor)

    2017-01-01

    The system and method for multi-wavelength optical signal detection enables the detection of optical signal levels significantly below those processed at the discrete circuit level by the use of mixed-signal processing methods implemented with integrated circuit technologies. The present invention is configured to detect and process small signals, which enables the reduction of the optical power required to stimulate detection networks, and lowers the required laser power to make specific measurements. The present invention provides an adaptation of active pixel networks combined with mixed-signal processing methods to provide an integer representation of the received signal as an output. The present invention also provides multi-wavelength laser detection circuits for use in various systems, such as a differential absorption light detection and ranging system.

  18. Detection of significant protein coevolution.

    Science.gov (United States)

    Ochoa, David; Juan, David; Valencia, Alfonso; Pazos, Florencio

    2015-07-01

    The evolution of proteins cannot be fully understood without taking into account the coevolutionary linkages entangling them. From a practical point of view, coevolution between protein families has been used as a way of detecting protein interactions and functional relationships from genomic information. The most common approach to inferring protein coevolution involves the quantification of phylogenetic tree similarity using a family of methodologies termed mirrortree. In spite of their success, a fundamental problem of these approaches is the lack of an adequate statistical framework to assess the significance of a given coevolutionary score (tree similarity). As a consequence, a number of ad hoc filters and arbitrary thresholds are required in an attempt to obtain a final set of confident coevolutionary signals. In this work, we developed a method for associating confidence estimators (P values) to the tree-similarity scores, using a null model specifically designed for the tree comparison problem. We show how this approach largely improves the quality and coverage (number of pairs that can be evaluated) of the detected coevolution in all the stages of the mirrortree workflow, independently of the starting genomic information. This not only leads to a better understanding of protein coevolution and its biological implications, but also to obtain a highly reliable and comprehensive network of predicted interactions, as well as information on the substructure of macromolecular complexes using only genomic information. The software and datasets used in this work are freely available at: http://csbg.cnb.csic.es/pMT/.
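
    A rough sketch of the mirrortree idea with a permutation-based P value follows: correlate the pairwise evolutionary distance matrices of two protein families over the same species, then compare the observed correlation against a null built by shuffling species labels. This simple shuffle is a stand-in for illustration; the paper's null model is specifically tailored to the tree-comparison problem and is not reproduced here.

```python
# Simplified mirrortree sketch: Pearson correlation between the upper
# triangles of two distance matrices, with a label-permutation P value.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)

def upper(m):
    iu = np.triu_indices_from(m, k=1)
    return m[iu]

def mirrortree_pvalue(dist_a, dist_b, n_perm=1000):
    obs, _ = pearsonr(upper(dist_a), upper(dist_b))
    n = dist_a.shape[0]
    count = 0
    for _ in range(n_perm):
        p = rng.permutation(n)                      # shuffle species labels
        r, _ = pearsonr(upper(dist_a[np.ix_(p, p)]), upper(dist_b))
        count += r >= obs
    return obs, (count + 1) / (n_perm + 1)

# Toy demo with two correlated symmetric "distance" matrices
n = 12
a = rng.random((n, n)); a = (a + a.T) / 2; np.fill_diagonal(a, 0)
b = a + 0.05 * rng.random((n, n)); b = (b + b.T) / 2; np.fill_diagonal(b, 0)
print(mirrortree_pvalue(a, b, n_perm=500))
```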

  19. The historical significance of oak

    Science.gov (United States)

    J. V. Thirgood

    1971-01-01

    A brief history of the importance of oak in Europe, contrasting the methods used in France and Britain to propagate the species and manage the forests for continued productivity. The significance of oak as a strategic resource during the sailing-ship era is stressed, and mention is made of the early development of oak management in North America. The international...

  20. Histological staining methods preparatory to laser capture microdissection significantly affect the integrity of the cellular RNA.

    Science.gov (United States)

    Wang, Hongyang; Owens, James D; Shih, Joanna H; Li, Ming-Chung; Bonner, Robert F; Mushinski, J Frederic

    2006-04-27

    Gene expression profiling by microarray analysis of cells enriched by laser capture microdissection (LCM) faces several technical challenges. Frozen sections yield higher quality RNA than paraffin-imbedded sections, but even with frozen sections, the staining methods used for histological identification of cells of interest could still damage the mRNA in the cells. To study the contribution of staining methods to degradation of results from gene expression profiling of LCM samples, we subjected pellets of the mouse plasma cell tumor cell line TEPC 1165 to direct RNA extraction and to parallel frozen sectioning for LCM and subsequent RNA extraction. We used microarray hybridization analysis to compare gene expression profiles of RNA from cell pellets with gene expression profiles of RNA from frozen sections that had been stained with hematoxylin and eosin (H&E), Nissl Stain (NS), and for immunofluorescence (IF) as well as with the plasma cell-revealing methyl green pyronin (MGP) stain. All RNAs were amplified with two rounds of T7-based in vitro transcription and analyzed by two-color expression analysis on 10-K cDNA microarrays. The MGP-stained samples showed the least introduction of mRNA loss, followed by H&E and immunofluorescence. Nissl staining was significantly more detrimental to gene expression profiles, presumably owing to an aqueous step in which RNA may have been damaged by endogenous or exogenous RNAases. RNA damage can occur during the staining steps preparatory to laser capture microdissection, with the consequence of loss of representation of certain genes in microarray hybridization analysis. Inclusion of RNAase inhibitor in aqueous staining solutions appears to be important in protecting RNA from loss of gene transcripts.

  1. Estimation of POL-iteration methods in fast running DNBR code

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Hyuk; Kim, S. J.; Seo, K. W.; Hwang, D. H. [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In this study, various root-finding methods are applied to the POL-iteration module in SCOMS, and POL-iteration efficiency is compared with the reference method. On the basis of these results, the optimum POL-iteration algorithm is selected. The POL requires iteration until the present local power reaches the limit power; the process of searching for the limiting power is equivalent to finding the root of a nonlinear equation. The POL iteration process in the online monitoring system used a variant of the bisection method, which is the most robust algorithm for finding the root of a nonlinear equation. The method, which includes an interval-accelerating factor and a routine for escaping ill-posed conditions, assured the robustness of the SCOMS system. The POL iteration module in SCOMS must also satisfy a minimum-calculation-time requirement. To meet this requirement, a non-iterative algorithm, a few-channel model, and a simple steam table are implemented in SCOMS to improve the calculation time. MDNBR evaluation at a given operating condition requires the DNBR calculation at all axial locations, so an increasing number of POL iterations increases the calculation load of SCOMS significantly. The calculation efficiency of SCOMS is therefore strongly dependent on the POL iteration count. In the case study, the iterative methods show superlinear convergence in finding the limiting power, while the Brent method shows quadratic convergence speed. These methods are effective and better than the reference bisection algorithm.
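
    To make the comparison concrete: the limiting-power search is a scalar root-finding problem of the form g(power) = DNBR(power) - limit = 0, so generic library routines illustrate the convergence gap the abstract describes. The toy monotone function below is a stand-in for the (non-public) thermal-hydraulic model, not SCOMS code; scipy's standard bisect and brentq routines count function evaluations for each method.

```python
# Bisection vs. Brent's method on a toy stand-in for the limiting-power search.
from scipy.optimize import bisect, brentq

def g(power):                         # toy stand-in for DNBR(power) - limit
    return (power / 100.0) ** 1.7 - 1.3

counter = {"bisect": 0, "brentq": 0}
def counted(name):
    def f(x):
        counter[name] += 1            # count function evaluations
        return g(x)
    return f

root_b = bisect(counted("bisect"), 50.0, 300.0, xtol=1e-8)
root_q = brentq(counted("brentq"), 50.0, 300.0, xtol=1e-8)
print(root_b, root_q, counter)        # brentq typically needs far fewer calls
```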

  2. Significant Tsunami Events

    Science.gov (United States)

    Dunbar, P. K.; Furtney, M.; McLean, S. J.; Sweeney, A. D.

    2014-12-01

    Tsunamis have inflicted death and destruction on the coastlines of the world throughout history. The occurrence of tsunamis and the resulting effects have been collected and studied as far back as the second millennium B.C. The knowledge gained from cataloging and examining these events has led to significant changes in our understanding of tsunamis, tsunami sources, and methods to mitigate the effects of tsunamis. The most significant, not surprisingly, are often the most devastating, such as the 2011 Tohoku, Japan earthquake and tsunami. The goal of this poster is to give a brief overview of the occurrence of tsunamis and then focus specifically on several significant tsunamis. There are various criteria to determine the most significant tsunamis: the number of deaths, amount of damage, maximum runup height, major impact on tsunami science or policy, etc. As a result, descriptions will include some of the most costly (2011 Tohoku, Japan), the most deadly (2004 Sumatra, 1883 Krakatau), and the highest runup ever observed (1958 Lituya Bay, Alaska). The discovery of the Cascadia subduction zone as the source of the 1700 Japanese "Orphan" tsunami and a future tsunami threat to the U.S. northwest coast contributed to the decision to form the U.S. National Tsunami Hazard Mitigation Program. The great Lisbon earthquake of 1755 marked the beginning of the modern era of seismology. Knowledge gained from the 1964 Alaska earthquake and tsunami helped confirm the theory of plate tectonics. The 1946 Alaska, 1952 Kuril Islands, 1960 Chile, 1964 Alaska, and the 2004 Banda Aceh tsunamis all resulted in warning centers or systems being established. The data descriptions on this poster were extracted from NOAA's National Geophysical Data Center (NGDC) global historical tsunami database. Additional information about these tsunamis, as well as water level data, can be found by accessing the NGDC website www.ngdc.noaa.gov/hazard/

  3. Illumination uniformity requirements for direct drive inertial confinement fusion

    International Nuclear Information System (INIS)

    Rothenberg, J.E.; Eimerl, D.; Key, M.H.; Weber, S.V.

    1995-01-01

    The requirements for laser uniformity are discussed in terms of the ell-mode spectrum. It is shown that the choice of smoothing methods can significantly alter this spectrum and that this choice should be made in the context of the target physics. Although two dimensional smoothing by spectral dispersion yields a high quality near field beam profile, it results in poor smoothing for low spatial frequency. The partially coherent light method (fiber smoothing) leads to superior smoothing at low spatial frequencies, but has very poor near field beam quality. As a result, it may be desirable to use partially coherent light during the driver pulse foot (at low intensity and when minimizing the laser imprint is critical) and smoothing by spectral dispersion during the main pulse

  4. AUTOMATIC SUMMARIZATION OF WEB FORUMS AS SOURCES OF PROFESSIONALLY SIGNIFICANT INFORMATION

    Directory of Open Access Journals (Sweden)

    K. I. Buraya

    2016-07-01

    Full Text Available Subject of Research. The competitive advantage of a modern specialist is the widest possible coverage of information sources useful from the point of view of obtaining relevant, professionally significant information (PSI). Among these sources, professional web forums occupy a significant place. The paper considers the problem of automatic forum text summarization, i.e. identification of those fragments that contain professionally relevant information. Method. The research is based on statistical analysis of forum texts by means of machine learning. Six web forums were selected for the research, covering aspects of technologies of various subject domains. The marking of the forums was carried out by experts. Using various machine learning methods, models were designed reflecting the functional relation between the estimated characteristics of PSI extraction quality and features of posts. The cumulative NDCG metric and its dispersion were used to assess model quality. Main Results. We have shown that request context plays an important role in assessing PSI extraction efficiency. Request contexts characteristic of PSI extraction have been selected, reflecting various interpretations of users' information needs and designated by the terms relevance and informational content. Scales for their estimation have been designed corresponding to worldwide approaches. We have experimentally confirmed that the results of forum summarization carried out manually by experts depend significantly on request context. We have shown that in the general assessment of PSI extraction efficiency, relevance is described rather well by a linear combination of features, while the informational content assessment already requires a nonlinear combination. At the same time, in the relevance assessment the leading role is played by the features connected with keywords, and at an informational content
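
    The NDCG metric used for model assessment is a standard ranking measure; as generic background (not the paper's evaluation code), a minimal implementation looks like this:

```python
# Standard NDCG@k: discounted cumulative gain of a ranking, normalized by the
# gain of the ideal (sorted) ranking of the same relevance scores.
import math

def dcg(relevances):
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances))

def ndcg(ranked_relevances, k=None):
    r = ranked_relevances[:k] if k else ranked_relevances
    ideal = sorted(ranked_relevances, reverse=True)[:len(r)]
    return dcg(r) / dcg(ideal) if dcg(ideal) > 0 else 0.0

print(ndcg([3, 2, 3, 0, 1, 2], k=5))   # toy relevance scores
```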

  5. 'Galileo Galilei-GG': design, requirements, error budget and significance of the ground prototype

    International Nuclear Information System (INIS)

    Nobili, A.M.; Bramanti, D.; Comandi, G.L.; Toncelli, R.; Polacco, E.; Chiofalo, M.L.

    2003-01-01

    'Galileo Galilei-GG' is a proposed experiment in low orbit around the Earth aiming to test the equivalence principle to the level of 1 part in 10^17 at room temperature. A unique feature of GG, which is pivotal to achieve high accuracy at room temperature, is fast rotation in supercritical regime around the symmetry axis of the test cylinders, with very weak coupling in the plane perpendicular to it. Another unique feature of GG is the possibility to fly 2 concentric pairs of test cylinders, the outer pair being made of the same material for detection of spurious effects. GG was originally designed for an equatorial orbit. The much lower launching cost for higher inclinations has made it worth redesigning the experiment for a sun-synchronous orbit. We report the main conclusions of this study, which confirms the feasibility of the original goal of the mission also at high inclination, and conclude by stressing the significance of the ground based prototype of the apparatus proposed for space

  6. TRACER - TRACING AND CONTROL OF ENGINEERING REQUIREMENTS

    Science.gov (United States)

    Turner, P. R.

    1994-01-01

    TRACER (Tracing and Control of Engineering Requirements) is a database/word processing system created to document and maintain the order of both requirements and descriptive material associated with an engineering project. A set of hierarchical documents are normally generated for a project whereby the requirements of the higher level documents levy requirements on the same level or lower level documents. Traditionally, the requirements are handled almost entirely by manual paper methods. The problem with a typical paper system, however, is that requirements written and changed continuously in different areas lead to misunderstandings and noncompliance. The purpose of TRACER is to automate the capture, tracing, reviewing, and managing of requirements for an engineering project. The engineering project still requires communications, negotiations, interactions, and iterations among people and organizations, but TRACER promotes succinct and precise identification and treatment of real requirements separate from the descriptive prose in a document. TRACER permits the documentation of an engineering project's requirements and progress in a logical, controllable, traceable manner. TRACER's attributes include the presentation of current requirements and status from any linked computer terminal and the ability to differentiate headers and descriptive material from the requirements. Related requirements can be linked and traced. The program also enables portions of documents to be printed, individual approval and release of requirements, and the tracing of requirements down into the equipment specification. Requirement "links" can be made "pending" and invisible to others until the pending link is made "binding". Individuals affected by linked requirements can be notified of significant changes with acknowledgement of the changes required. An unlimited number of documents can be created for a project and an ASCII import feature permits existing documents to be incorporated

  7. Significant Radionuclides Determination

    Energy Technology Data Exchange (ETDEWEB)

    Jo A. Ziegler

    2001-07-31

    The purpose of this calculation is to identify radionuclides that are significant to offsite doses from potential preclosure events for spent nuclear fuel (SNF) and high-level radioactive waste expected to be received at the potential Monitored Geologic Repository (MGR). In this calculation, high-level radioactive waste is included in references to DOE SNF. A previous document, "DOE SNF DBE Offsite Dose Calculations" (CRWMS M&O 1999b), calculated the source terms and offsite doses for Department of Energy (DOE) and Naval SNF for use in design basis event analyses. This calculation reproduces only DOE SNF work (i.e., no naval SNF work is included in this calculation) created in "DOE SNF DBE Offsite Dose Calculations" and expands the calculation to include DOE SNF expected to produce a high dose consequence (even though the quantity of the SNF is expected to be small) and SNF owned by commercial nuclear power producers. The calculation does not address any specific off-normal/DBE event scenarios for receiving, handling, or packaging of SNF. The results of this calculation are developed for comparative analysis to establish the important radionuclides and do not represent the final source terms to be used for license application. This calculation will be used as input to preclosure safety analyses and is performed in accordance with procedure AP-3.12Q, "Calculations", and is subject to the requirements of DOE/RW-0333P, "Quality Assurance Requirements and Description" (DOE 2000) as determined by the activity evaluation contained in "Technical Work Plan for: Preclosure Safety Analysis, TWP-MGR-SE-000010" (CRWMS M&O 2000b) in accordance with procedure AP-2.21Q, "Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities".

  8. 2 CFR 801.332 - What methods must I use to pass requirements down to participants at lower tiers with whom I...

    Science.gov (United States)

    2010-01-01

    ... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false What methods must I use to pass requirements down to participants at lower tiers with whom I intend to do business? 801.332 Section 801.332 Grants... NONPROCUREMENT DEBARMENT AND SUSPENSION Responsibilities of Participants Regarding Transactions § 801.332 What...

  9. Some Findings Concerning Requirements in Agile Methodologies

    Science.gov (United States)

    Rodríguez, Pilar; Yagüe, Agustín; Alarcón, Pedro P.; Garbajosa, Juan

    Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in the literature, in most cases within the framework of conventional methods. However, conclusions drawn for conventional methodologies are not necessarily valid for Agile; on some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements are one of the most problematic issues in agile approaches, and a better understanding of this is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and management of requirements dependencies.

  10. Quality requirements for veterinary hematology analyzers in small animals-a survey about veterinary experts' requirements and objective evaluation of analyzer performance based on a meta-analysis of method validation studies: bench top hematology analyzer.

    Science.gov (United States)

    Cook, Andrea M; Moritz, Andreas; Freeman, Kathleen P; Bauer, Natali

    2016-09-01

    Scarce information exists about quality requirements and the objective evaluation of performance of large veterinary bench top hematology analyzers. The study aimed to compare the observed total error (TEobs) derived from a meta-analysis of published method validation data to the total allowable error (TEa) for veterinary hematology variables in small animals based on experts' opinions. Ideally, TEobs should be less than TEa. The TEobs of three bench top hematology analyzers (ADVIA 2120, Sysmex XT2000iV, and CellDyn 3500) was calculated based on method validation studies published between 2005 and 2013 (n = 4). The percent TEobs = 2 * CV (%) + bias (%). The CV was derived from published studies except for the ADVIA 2120 (internal data), and bias was estimated from the regression equation. A total of 41 veterinary experts (19 diplomates, 8 residents, 10 postgraduate students, 4 anonymous specialists) responded. The proposed range of TEa was wide, but generally ≤ 20%. The TEobs was < TEa for all variables and analyzers except for canine and feline HGB (high bias, low CV) and platelet counts (high bias, high CV). Overall, veterinary bench top analyzers fulfilled experts' requirements except for HGB, due to method-related bias, and platelet counts, due to known preanalytic/analytic issues.
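
    The quoted decision rule is simple enough to show directly. The sketch below computes TEobs = 2 * CV + bias and compares it with TEa; the values are placeholders, not data from the study.

```python
# TEobs(%) = 2 * CV(%) + bias(%), compared against the expert-derived TEa.
def total_error_ok(cv_pct, bias_pct, tea_pct):
    te_obs = 2.0 * cv_pct + bias_pct
    return te_obs, te_obs <= tea_pct        # (observed total error, pass/fail)

print(total_error_ok(cv_pct=2.5, bias_pct=4.0, tea_pct=20.0))   # (9.0, True)
```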

  11. Comparative study between EDXRF and ASTM E572 methods using two-way ANOVA

    Science.gov (United States)

    Krummenauer, A.; Veit, H. M.; Zoppas-Ferreira, J.

    2018-03-01

    Comparison with a reference method is one of the necessary requirements for the validation of non-standard methods. This comparison was made using an experiment-planning technique with two-way ANOVA. In the ANOVA, the results obtained using the EDXRF method under validation were compared with the results obtained using the ASTM E572-13 standard test method. Fisher's tests (F-tests) were used for the comparative study of the elements molybdenum, niobium, copper, nickel, manganese, chromium and vanadium. For all elements, the F-tests indicate that the null hypothesis (H0) is not rejected. As a result, there is no significant difference between the methods compared. Therefore, according to this study, the EDXRF method satisfies this method-comparison requirement for validation.
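
    As a generic illustration of the study's design (a method factor crossed with an element factor), a two-way ANOVA can be run as below. The column names and numbers are invented for illustration, and statsmodels is an assumed tool choice, not necessarily the authors'.

```python
# Two-way ANOVA sketch: candidate method vs. reference method across elements.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "result":  [1.02, 0.98, 1.05, 1.01, 0.99, 1.03, 1.04, 1.00],  # toy data
    "method":  ["EDXRF", "EDXRF", "ASTM", "ASTM"] * 2,
    "element": ["Mo", "Nb"] * 4,
})
model = ols("result ~ C(method) + C(element)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # F-tests for method and element effects
```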

  12. In Their Own Words: The Significance of Participant Perceptions in Assessing Entomology Citizen Science Learning Outcomes Using a Mixed Methods Approach.

    Science.gov (United States)

    Lynch, Louise I; Dauer, Jenny M; Babchuk, Wayne A; Heng-Moss, Tiffany; Golick, Doug

    2018-02-06

    A mixed methods study was used to transcend the traditional pre-, post-test approach of citizen science evaluative research by integrating adults' test scores with their perceptions. We assessed how contributory entomology citizen science affects participants' science self-efficacy, self-efficacy for environmental action, nature relatedness and attitude towards insects. Pre- and post-test score analyses from citizen scientists (n = 28) and a control group (n = 72) were coupled with interviews (n = 11) about science experiences and entomological interactions during participation. Considering quantitative data alone, no statistically significant changes were evident in adults following participation in citizen science when compared to the control group. Citizen scientists' pre-test scores were significantly higher than the control group for self-efficacy for environmental action, nature relatedness and attitude towards insects. Interview data reveal a notable discrepancy between measured and perceived changes. In general, citizen scientists had an existing, long-term affinity for the natural world and perceived increases in their science self-efficacy, self-efficacy for environmental action, nature relatedness and attitude towards insects. Perceived influences may act independently of test scores. Scale instruments may not show impacts with variances in individuals' prior knowledge and experiences. The value of mixed methods in citizen science program evaluation is discussed.

  13. In Their Own Words: The Significance of Participant Perceptions in Assessing Entomology Citizen Science Learning Outcomes Using a Mixed Methods Approach

    Science.gov (United States)

    Lynch, Louise I.; Dauer, Jenny M.; Babchuk, Wayne A.; Heng-Moss, Tiffany

    2018-01-01

    A mixed methods study was used to transcend the traditional pre-, post-test approach of citizen science evaluative research by integrating adults’ test scores with their perceptions. We assessed how contributory entomology citizen science affects participants’ science self-efficacy, self-efficacy for environmental action, nature relatedness and attitude towards insects. Pre- and post-test score analyses from citizen scientists (n = 28) and a control group (n = 72) were coupled with interviews (n = 11) about science experiences and entomological interactions during participation. Considering quantitative data alone, no statistically significant changes were evident in adults following participation in citizen science when compared to the control group. Citizen scientists’ pre-test scores were significantly higher than the control group for self-efficacy for environmental action, nature relatedness and attitude towards insects. Interview data reveal a notable discrepancy between measured and perceived changes. In general, citizen scientists had an existing, long-term affinity for the natural world and perceived increases in their science self-efficacy, self-efficacy for environmental action, nature relatedness and attitude towards insects. Perceived influences may act independently of test scores. Scale instruments may not show impacts with variances in individual’s prior knowledge and experiences. The value of mixed methods on citizen science program evaluation is discussed. PMID:29415522

  14. No actual measurement … was required: Maxwell and Cavendish's null method for the inverse square law of electrostatics.

    Science.gov (United States)

    Falconer, Isobel

    In 1877 James Clerk Maxwell and his student Donald MacAlister refined Henry Cavendish's 1773 null experiment demonstrating the absence of electricity inside a charged conductor. This null result was a mathematical prediction of the inverse square law of electrostatics, and both Cavendish and Maxwell took the experiment as verifying the law. However, Maxwell had already expressed absolute conviction in the law, based on results of Michael Faraday's. So, what was the value to him of repeating Cavendish's experiment? After assessing whether the law was as secure as he claimed, this paper explores its central importance to the electrical programme that Maxwell was pursuing. It traces the historical and conceptual re-orderings through which Maxwell established the law by constructing a tradition of null tests and asserting the superior accuracy of the method. Maxwell drew on his developing 'doctrine of method' to identify Cavendish's experiment as a member of a wider class of null methods. By doing so, he appealed to the null practices of telegraph engineers, diverted attention from the flawed logic of the method, and sought to localise issues around the mapping of numbers onto instrumental indications, on the grounds that 'no actual measurement … was required'.

  15. Margin Requirements and Equity Option Returns

    DEFF Research Database (Denmark)

    Hitzemann, Steffen; Hofmann, Michael; Uhrig-Homburg, Marliese

    In equity option markets, traders face margin requirements both for the options themselves and for hedging-related positions in the underlying stock market. We show that these requirements carry a significant margin premium in the cross-section of equity option returns. The sign of the margin...... premium depends on demand pressure: If end-users are on the long side of the market, option returns decrease with margins, while they increase otherwise. Our results are statistically and economically significant and robust to different margin specifications and various control variables. We explain our...... findings by a model of funding-constrained derivatives dealers that require compensation for satisfying end-users’ option demand....

  16. Margin Requirements and Equity Option Returns

    DEFF Research Database (Denmark)

    Hitzemann, Steffen; Hofmann, Michael; Uhrig-Homburg, Marliese

    In equity option markets, traders face margin requirements both for the options themselves and for hedging-related positions in the underlying stock market. We show that these requirements carry a significant "margin premium" in the cross-section of equity option returns. The sign of the margin...... premium depends on demand pressure: If end-users are on the long side of the market, option returns decrease with margins, while they increase otherwise. Our results are statistically and economically significant and robust to different margin specifications and various control variables. We explain our...... findings by a model of funding-constrained derivatives dealers that require compensation for satisfying end-users’ option demand....

  17. Grand Gulf-prioritization of regulatory requirements

    International Nuclear Information System (INIS)

    Meisner, M.J.

    1993-01-01

    As cost pressures mount, Grand Gulf nuclear station (GGNS) is relying increasingly on various prioritization approaches to implement, modify, eliminate, or defer regulatory requirements. Regulatory requirements can be prioritized through the use of three measures: (1) safety (or risk) significance; (2) cost; and (3) public policy (or political) significance. This paper summarizes GGNS' efforts to implement solutions to regulatory issues using these three prioritization schemes to preserve a balance between cost and safety benefit

  18. Retinal vascular speed in retinopathy of prematurity requiring treatment.

    Science.gov (United States)

    Solans Pérez de Larraya, Ana M; Ortega Molina, José M; Fernández, José Uberos; Escudero Gómez, Júlia; Salgado Miranda, Andrés D; Chaves Samaniego, Maria J; García Serrano, José L

    2018-03-01

    To analyse the speed of temporal retinal vascularisation in preterm infants included in the screening programme for retinopathy of prematurity. A total of 185 premature infants were studied retrospectively between 2000 and 2017 in San Cecilio University Hospital of Granada, Spain. The method of binocular indirect ophthalmoscopy with indentation was used for the examination. The horizontal disc diameter was used as a unit of length. The speed of temporal retinal vascularisation (disc diameter/week) was calculated as the ratio between the extent of temporal retinal vascularisation (disc diameter) and the time in weeks. The weekly temporal retinal vascularisation (confidence interval 0-1.25 disc diameter/week) was significantly higher with no retinopathy of prematurity (0.73 ± 0.22 disc diameter/week) than in stage 1 retinopathy of prematurity (0.58 ± 0.22 disc diameter/week). It was also higher in stage 1 than in stages 2 (0.46 ± 0.14 disc diameter/week) and 3 of retinopathy of prematurity (0.36 ± 0.18 disc diameter/week). The rate of temporal retinal vascularisation (disc diameter/week) decreases as the retinopathy of prematurity stage increases. The area under the receiver operating characteristic curve was 0.85 (95% confidence interval: 0.79-0.91) for retinopathy of prematurity requiring treatment versus not requiring treatment. The best discriminative cut-off point was a speed of retinal vascularisation below which treatment of retinopathy of prematurity may be required. However, before becoming a new standard of care for treatment, it requires careful documentation, with agreement between several ophthalmologists.

  19. The significance of employee biographics in explaining employability attributes

    Directory of Open Access Journals (Sweden)

    Jo-Anne Botha

    2017-12-01

    Full Text Available Background: Employability is the capacity of employees to acquire transferable competencies and individual capacities that enable them to adapt to, be innovative in and steer their own careers in a dynamic work environment. It is clear that employers would thus look for employees who are capable of proactive adjustment and action-oriented behaviours. Aim: The aim of the study was to determine whether significant differences exist in the employability attributes of individuals from different gender, race and age groups and, if so, how such a diverse workforce should be managed. Setting: This study was conducted at a distance education institution. The sample of respondents consisted of adult learners who are pursuing further distance learning studies in the economic and management sciences field in South Africa. Methods: Correlational and inferential statistical analyses were used. A stratified random sample of 1102 mainly black and female adult learners participated in the study. Results: The employability attributes framework identified three categories of employability: interpersonal, intrapersonal and career attributes. The research indicated that significant differences exist between gender, race and age groups with regard to employability. Male and female participants differed significantly with regard to entrepreneurial orientation, proactivity and career resilience. The various race groups differed considerably regarding cultural competence and sociability of individuals. Participants older than 50 years scored the highest on self-efficacy. Conclusion and implications: The findings of this research could ensure that previously disadvantaged individuals are not further marginalised because of a lack of employability attributes and that the required employability attributes can be cultivated to ensure advancement and success in the workplace.

  20. Ethics Requirement Score: new tool for evaluating ethics in publications

    Science.gov (United States)

    dos Santos, Lígia Gabrielle; Fonseca, Ana Carolina da Costa e; Bica, Claudia Giuliano

    2014-01-01

    Objective To analyze ethical standards considered by health-related scientific journals, and to prepare the Ethics Requirement Score, a bibliometric index to be applied to scientific healthcare journals in order to evaluate criteria for ethics in scientific publication. Methods Journals related to healthcare selected from the Journal Citation Reports™ 2010 database were considered as experimental units. Parameters related to publication ethics were analyzed for each journal. These parameters were acquired by analyzing the authors' guidelines or instructions on each journal's website. The parameters considered were approval by an Internal Review Board, Declaration of Helsinki or Resolution 196/96, recommendations on plagiarism, need for application of Informed Consent Forms with the volunteers, declaration of confidentiality of patients, registration in the database for clinical trials (if applicable), conflict of interest disclosure, and funding sources statement. Each item was analyzed for its presence or absence. Results The foreign journals had a significantly higher Impact Factor than the Brazilian journals; however, no significant difference was observed in relation to the Ethics Requirement Score. There was no correlation between the Ethics Requirement Score and the Impact Factor. Conclusion Although the Impact Factor of foreign journals was considerably higher than that of the Brazilian publications, the results showed that the Impact Factor has no correlation with the proposed score. This allows us to state that the ethical requirements for publication in biomedical journals are not related to the comprehensiveness or scope of the journal. PMID:25628189

  1. Methods of evaluating SPECT images. The usefulness of the Matsuda's method by the Patlak plot method in children

    Energy Technology Data Exchange (ETDEWEB)

    Takaishi, Yasuko [Nippon Medical School, Tokyo (Japan)]; Hashimoto, Kiyoshi; Fujino, Osamu [and others]

    1998-11-01

    Single photon emission computed tomography (SPECT) is a tool to study cerebral blood flow (CBF) kinetics. There are three methods of evaluating SPECT images: visual, semi-quantitative (evaluation of the radioactivity ratio of the cerebral region to the cerebellum (R/CE) or to the thalamus (R/TH)) and quantitative (Matsuda's method by the Patlak plot method using 99mTc-hexamethylpropylene amine oxime radionuclide angiography). We evaluated SPECT images by the quantitative method in 14 patients with neurological disorders and examined the correlation of the results with those obtained by the semi-quantitative method. There was no significant correlation between the R/CE or R/TH ratio and regional CBF except in two regions. The evaluation by the semi-quantitative method may have been inappropriate, probably because the cerebellar or thalamic blood flow was not constant in each case. Evaluation by the quantitative method, on the other hand, seemed to be useful not only for the comparison of CBF among normal subjects, but also for demonstrating progressive changes of CBF in the same case. Matsuda's method by the Patlak plot method is suitable for examining children, since it does not require aortic blood sampling. (author)

  2. Waste minimization in analytical methods

    International Nuclear Information System (INIS)

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-01-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision

  3. Histological staining methods preparatory to laser capture microdissection significantly affect the integrity of the cellular RNA

    Directory of Open Access Journals (Sweden)

    Li Ming-Chung

    2006-04-01

    Full Text Available Abstract Background Gene expression profiling by microarray analysis of cells enriched by laser capture microdissection (LCM) faces several technical challenges. Frozen sections yield higher quality RNA than paraffin-imbedded sections, but even with frozen sections, the staining methods used for histological identification of cells of interest could still damage the mRNA in the cells. To study the contribution of staining methods to degradation of results from gene expression profiling of LCM samples, we subjected pellets of the mouse plasma cell tumor cell line TEPC 1165 to direct RNA extraction and to parallel frozen sectioning for LCM and subsequent RNA extraction. We used microarray hybridization analysis to compare gene expression profiles of RNA from cell pellets with gene expression profiles of RNA from frozen sections that had been stained with hematoxylin and eosin (H&E), Nissl Stain (NS), and for immunofluorescence (IF), as well as with the plasma cell-revealing methyl green pyronin (MGP) stain. All RNAs were amplified with two rounds of T7-based in vitro transcription and analyzed by two-color expression analysis on 10-K cDNA microarrays. Results The MGP-stained samples showed the least introduction of mRNA loss, followed by H&E and immunofluorescence. Nissl staining was significantly more detrimental to gene expression profiles, presumably owing to an aqueous step in which RNA may have been damaged by endogenous or exogenous RNAases. Conclusion RNA damage can occur during the staining steps preparatory to laser capture microdissection, with the consequence of loss of representation of certain genes in microarray hybridization analysis. Inclusion of RNAase inhibitor in aqueous staining solutions appears to be important in protecting RNA from loss of gene transcripts.

  4. Future US energy demands based upon traditional consumption patterns lead to requirements which significantly exceed domestic supply

    Science.gov (United States)

    1975-01-01

    Energy consumption in the United States has risen in response both to increasing population and to increasing levels of affluence. Depletion of domestic energy reserves requires consumption modulation, production of fossil fuels, more efficient conversion techniques, and large-scale transitions to non-fossil-fuel energy sources. The widening disparity between the wealthy and poor nations of the world contributes to trends that increase the likelihood of group action by the lesser developed countries to achieve political and economic goals. The formation of anticartel cartels is envisioned.

  5. Knowledge-based support system for requirement elaboration in design

    International Nuclear Information System (INIS)

    Furuta, Kazuo; Kondo, Shunsuke

    1994-01-01

    Design requirements are the seeds of every design activity, but their elicitation and formalization are not easy tasks. This paper proposes a method to support designers in the requirement elaboration process with a computer. In this method, the cognitive work space of designers is modeled by abstraction and structural hierarchies, and supporting functions for knowledge-based requirement elaboration, requirement classification and assessment of the satisfaction status of requirements are provided on this framework. A prototype system was developed and tested on a fast breeder reactor design. (author)

  6. Significance levels for studies with correlated test statistics.

    Science.gov (United States)

    Shi, Jianxin; Levinson, Douglas F; Whittemore, Alice S

    2008-07-01

    When testing large numbers of null hypotheses, one needs to assess the evidence against the global null hypothesis that none of the hypotheses is false. Such evidence typically is based on the test statistic of the largest magnitude, whose statistical significance is evaluated by permuting the sample units to simulate its null distribution. Efron (2007) has noted that correlation among the test statistics can induce substantial interstudy variation in the shapes of their histograms, which may cause misleading tail counts. Here, we show that permutation-based estimates of the overall significance level also can be misleading when the test statistics are correlated. We propose that such estimates be conditioned on a simple measure of the spread of the observed histogram, and we provide a method for obtaining conditional significance levels. We justify this conditioning using the conditionality principle described by Cox and Hinkley (1974). Application of the method to gene expression data illustrates the circumstances when conditional significance levels are needed.
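
    As background, the unconditional permutation approach that the paper refines can be sketched as follows: simulate the null distribution of the largest test statistic by permuting sample labels and recording the maximum statistic under each permutation. The max-|t| statistic and Gaussian toy data are illustrative assumptions, and the paper's conditioning on the spread of the observed histogram is not reproduced here.

```python
# Permutation-based overall significance of the max test statistic.
import numpy as np

rng = np.random.default_rng(1)

def max_stat_pvalue(data, labels, n_perm=2000):
    """data: (samples x tests) matrix; labels: 0/1 group membership."""
    def max_abs_t(y):
        a, b = data[y == 0], data[y == 1]
        t = (a.mean(0) - b.mean(0)) / np.sqrt(a.var(0) / len(a) + b.var(0) / len(b))
        return np.abs(t).max()
    obs = max_abs_t(labels)
    null = [max_abs_t(rng.permutation(labels)) for _ in range(n_perm)]
    return obs, (1 + sum(n >= obs for n in null)) / (1 + n_perm)

# Toy demo: 40 samples, 100 (correlated or not) tests, no real group effect
X = rng.normal(size=(40, 100))
y = np.array([0] * 20 + [1] * 20)
print(max_stat_pvalue(X, y, n_perm=500))
```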

  7. Engineering Requirements for crowds

    Directory of Open Access Journals (Sweden)

    Rogeiro Silva

    2015-12-01

    Full Text Available In software projects, stakeholders are highly distributed and form numerous, heterogeneous groups, online or face-to-face, constituting what could be called crowds. The development of social, cloud and mobile applications has generated a marked increase in crowd-based requirements environments. Traditional Requirements Engineering (RE) techniques face scalability issues in these settings, since they require the co-presence of stakeholders and engineers in joint meetings that cannot be held in shared physical environments. While different approaches have been introduced to partially automate RE in these contexts, a multi-method approach is still required to semi-automate all activities related to working with crowds. In this paper we propose an approach that integrates existing elicitation and requirements analysis techniques and complements them by introducing new concepts. The information is collected through direct interaction and social collaboration, and through data mining techniques.

  8. Nutritional requirements of sheep, goats and cattle in warm climates: a meta-analysis.

    Science.gov (United States)

    Salah, N; Sauvant, D; Archimède, H

    2014-09-01

    The objective of the study was to update the energy and protein requirements of growing sheep, goats and cattle in warm areas through a meta-analysis of 590 publications. Requirements were expressed on a metabolic live weight (MLW = LW^0.75) and LW^1 basis. The maintenance requirements for energy were 542.64 and 631.26 kJ ME/kg LW^0.75 for small ruminants and cattle, respectively, and the difference was significant. Small ruminants raised in a tropical climate appeared to have higher ME requirements for maintenance relative to live weight (LW) than temperate-climate ones and cattle. Maintenance requirements for protein were estimated via two approaches; for both methods, the data in which retained nitrogen (RN) was used cover the same range of variability of observations. The regression of digestible CP intake (DCPI, g/kg LW^0.75) against RN (g/kg LW^0.75) indicated that DCP requirements are significantly higher in sheep (3.36 g/kg LW^0.75) than in goats (2.38 g/kg LW^0.75), with cattle intermediate (2.81 g/kg LW^0.75), without any significant difference in the quantity of DCPI per g retained CP (RCP) (40.43). Regressing metabolisable protein (MP) or minimal digestible protein in the intestine (PDImin) against RCP showed that there was no difference between species and genotypes, neither for the intercept (maintenance = 3.51 g/kg LW^0.75 for sheep and goats v. 4.35 for cattle) nor for the slope (growth = 0.60 g MP/g RCP). The regression of DCP against ADG showed that DCP requirements did not differ among species or genotypes. These new feeding standards are derived from a wider range of nutritional conditions than existing feeding standards, as they are based on a larger database. The standards seem to be more appropriate for ruminants in warm and tropical climates around the world.
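
    The first protein approach is a straight regression whose intercept is read off as the maintenance requirement and whose slope is the cost per unit of retention. A minimal least-squares sketch with made-up numbers (not the study's data) illustrates the idea:

```python
# Regression of digestible CP intake (DCPI) on retained nitrogen (RN), both
# per kg LW^0.75; the intercept approximates the maintenance DCP requirement.
import numpy as np

rn   = np.array([0.00, 0.05, 0.10, 0.15, 0.20])   # g RN / kg LW^0.75 (toy)
dcpi = np.array([3.3, 5.4, 7.2, 9.5, 11.4])       # g DCPI / kg LW^0.75 (toy)

slope, intercept = np.polyfit(rn, dcpi, 1)        # ordinary least squares
print(f"maintenance DCP ~= {intercept:.2f} g/kg LW^0.75; "
      f"cost per g RN ~= {slope:.1f} g DCPI")
```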

  9. Significance of local cerebral glucose utilization determined by the autoradiographic (14C)deoxyglucose method in experimentally induced coma

    Energy Technology Data Exchange (ETDEWEB)

    Sakurada, O.; Kobayashi, M.; Ueno, H.; Ishii, S. (Juntendo Univ., Tokyo (Japan). School of Medicine)

    1982-01-01

    Bilateral lesions made in the midbrain reticular formation of the rat produced behavioral akinesia. These animals neither ate nor drank. EEGs of these animals usually showed high-voltage slow waves at rest. A slight EEG arousal response was demonstrated by clapping, touching and pinching only in rats with moderate impairment. Concerning the rates of local cerebral glucose utilization (LCGU) measured by means of the autoradiographic (14C)deoxyglucose method, 13 of the 28 gray structures examined exhibited significant reductions when compared with sham-operated rats. Lesions in the midbrain reticular formation resulted in reduction of LCGU in the neocortex, ventral nucleus of the thalamus, subthalamic nucleus, medial and lateral geniculate bodies, mamillary body, septal nucleus and caudate-putamen. Structures which did not show any significant change in LCGU were those related to the paleo- and archicortices. These findings suggest the existence of two types of ascending activating systems. Administration of 30 mg/kg of pentobarbital reduced LCGU diffusely throughout the brain. When thyrotropin-releasing hormone (TRH) was administered to rats with lesions in the midbrain reticular formation, reversal of the reduction of LCGU was observed in the dorsomedial nucleus of the thalamus and the mamillary body. Reversal of LCGU in the dorsomedial nucleus of the thalamus was especially significant and its level exceeded that of the sham control. This suggests that TRH might exert its function through the dorsomedial nucleus of the thalamus and the mamillary body. When TRH was administered to rats treated with pentobarbital, significant reversal was observed in the following structures: the lateral and ventral nuclei of the thalamus, dentate gyrus, caudate-putamen, nucleus accumbens, pontine gray matter, and raphe nucleus.

  10. Applicabilities of ship emission reduction methods

    Energy Technology Data Exchange (ETDEWEB)

    Guleryuz, Adem [ARGEMAN Research Group, Marine Division (Turkey)], email: ademg@argeman.org; Kilic, Alper [Istanbul Technical University, Maritime Faculty, Marine Engineering Department (Turkey)], email: enviromarineacademic@yahoo.com

    2011-07-01

    Ships, with their high consumption of fossil fuels to power their engines, are significant air polluters. Emission reduction methods therefore need to be implemented, and the aim of this paper is to assess the advantages and disadvantages of each emission reduction method. The benefits of the different methods are compared, along with their disadvantages and requirements, to determine the applicability of such solutions. The methods studied herein are direct water injection, humid air motor, sea water scrubbing, diesel particulate filter, selective catalytic reduction, design of engine components, exhaust gas recirculation and engine replacement. The results of the study showed that the usefulness of each emission reduction method depends on the particular case and that an evaluation should be carried out for each ship. This study pointed out that methods to reduce ship emissions are available but that their applicability depends on each case.

  11. An effective technique for the software requirements analysis of NPP safety-critical systems, based on software inspection, requirements traceability, and formal specification

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Yoo, Junbeom; Cha, Sung Deok; Yoo, Yeong Jae

    2005-01-01

    A thorough requirements analysis is indispensable for developing and implementing safety-critical software systems such as nuclear power plant (NPP) software systems because a single error in the requirements can generate serious software faults. However, it is very difficult to completely analyze system requirements. In this paper, an effective technique for the software requirements analysis is suggested. For requirements verification and validation (V and V) tasks, our technique uses software inspection, requirement traceability, and formal specification with structural decomposition. Software inspection and requirements traceability analysis are widely considered the most effective software V and V methods. Although formal methods are also considered an effective V and V activity, they are difficult to use properly in the nuclear fields as well as in other fields because of their mathematical nature. In this work, we propose an integrated environment (IE) approach for requirements, which is an integrated approach that enables easy inspection by combining requirement traceability and effective use of a formal method. The paper also introduces computer-aided tools for supporting IE approach for requirements. Called the nuclear software inspection support and requirements traceability (NuSISRT), the tool incorporates software inspection, requirement traceability, and formal specification capabilities. We designed the NuSISRT to partially automate software inspection and analysis of requirement traceability. In addition, for the formal specification and analysis, we used the formal requirements specification and analysis tool for nuclear engineering (NuSRS)

  12. Estimation of L-threonine requirements for Longyan laying ducks

    Directory of Open Access Journals (Sweden)

    A. M. Fouad

    2017-02-01

    Full Text Available Objective: A study was conducted to test six threonine (Thr) levels (0.39%, 0.44%, 0.49%, 0.54%, 0.59%, and 0.64%) to estimate the optimal dietary Thr requirements for Longyan laying ducks from 17 to 45 wk of age. Methods: Nine hundred Longyan ducks aged 17 wk were assigned randomly to the six dietary treatments, where each treatment comprised six replicate pens with 25 ducks per pen. Results: Increasing the Thr level enhanced egg production, egg weight, egg mass, and the feed conversion ratio (FCR) (linearly or quadratically; p<0.05). The Haugh unit score, yolk color, albumen height, and the weight, percentage, thickness, and breaking strength of the eggshell did not respond to increases in the Thr levels, but the albumen weight and its proportion increased significantly (p<0.05), whereas the yolk weight and its proportion decreased significantly as the Thr levels increased. Conclusion: According to a regression model, the optimal Thr requirement for egg production, egg mass, and FCR in Longyan ducks is 0.57%, while 0.58% is the optimal level for egg weight from 17 to 45 wk of age.
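
    The "regression model" optimum referred to in the conclusion can be illustrated with a second-order polynomial fit whose vertex is taken as the requirement estimate; the response values below are invented for illustration and are not the study's data.

      import numpy as np

      thr      = np.array([0.39, 0.44, 0.49, 0.54, 0.59, 0.64])   # dietary Thr, % of diet
      egg_mass = np.array([48.0, 50.5, 52.4, 53.6, 53.9, 53.5])   # g/d (hypothetical)

      c2, c1, c0 = np.polyfit(thr, egg_mass, 2)   # quadratic response curve
      optimum = -c1 / (2 * c2)                    # vertex of the fitted parabola
      print(f"estimated optimal Thr level ~ {optimum:.2f}%")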

  13. Significance of Supply Logistics in Big Cities

    Directory of Open Access Journals (Sweden)

    Mario Šafran

    2012-10-01

    Full Text Available The paper considers the concept and importance of supply logistics as an element in improving the storage, supply and transport of goods in big cities. There is always room for improvement in this segment of economic activities, and therefore continuous optimisation of the cargo flows from the manufacturer to the end user is important. Due to complex requirements in cargo supply and the "spoiled" end users, modern cities represent great difficulties and a big challenge for the supply organisers. The consumers' needs in big cities have developed over the recent years in such a way that they require supply of goods several times a day at precisely determined times (orders are received by e-mail, and the information transfer is therefore instantaneous). In order to successfully meet the consumers' needs in advanced economic systems, advanced methods of goods supply have been developed and improved, such as "just in time", "door-to-door", and "desk-to-desk". Regular operation of these systems requires supply logistics which includes the total throughput of materials, from receiving the raw materials or reproduction material to the delivery of final products to the end users.

  14. Clinical significance of determination of serum IGF-I, TNF-α and SS levels in patients with chronic viral hepatitis

    International Nuclear Information System (INIS)

    Zhu Peiming

    2009-01-01

    Objective: To study the clinical significance of changes of serum IGF-I, TNF-α and somatostatin (SS) levels in patients with chronic viral hepatitis. Methods: Serum IGF-I, TNF-α and SS levels were determined with RIA in 30 patients with chronic viral hepatitis (hepatitis B, n=24; hepatitis C, n=6) and 30 controls. Results: The serum IGF-I, TNF-α and SS levels in the patients were all significantly higher than those in controls (P<0.01). Conclusion: Serum IGF-I, TNF-α and SS levels were markedly increased in patients with chronic viral hepatitis; the exact mechanism and consequences of these changes require further study. (authors)

  15. Advanced Extra-Vehicular Activity Pressure Garment Requirements Development

    Science.gov (United States)

    Ross, Amy; Aitchison, Lindsay; Rhodes, Richard

    2015-01-01

    The NASA Johnson Space Center advanced pressure garment technology development team is addressing requirements development for exploration missions. Lessons learned from the Z-2 high fidelity prototype development have reiterated that clear low-level requirements and verification methods reduce risk to the government, improve efficiency in pressure garment design efforts, and enable the government to be a smart buyer. The expectation is to provide requirements at the specification level that are validated so that their impact on pressure garment design is understood. Additionally, the team will provide defined verification protocols for the requirements. However, in reviewing exploration space suit high level requirements there are several gaps in the team's ability to define and verify related lower level requirements. This paper addresses the efforts in requirement areas such as mobility/fit/comfort and environmental protection (dust, radiation, plasma, secondary impacts) to determine the method by which the requirements can be defined and use of those methods for verification. Gaps exist at various stages. In some cases component level work is underway, but no system level effort has begun; in other cases no effort has been initiated to close the gap. Status of on-going efforts and potential approaches to open gaps are discussed.

  16. A discrete ordinate response matrix method for massively parallel computers

    International Nuclear Information System (INIS)

    Hanebutte, U.R.; Lewis, E.E.

    1991-01-01

    A discrete ordinate response matrix method is formulated for the solution of neutron transport problems on massively parallel computers. The response matrix formulation eliminates iteration on the scattering source. The nodal matrices which result from the diamond-differenced equations are utilized in a factored form which minimizes memory requirements and significantly reduces the required number of arithmetic operations. The algorithm utilizes massive parallelism by assigning each spatial node to a processor. The algorithm is accelerated effectively by a synthetic method in which the low-order diffusion equations are also solved by massively parallel red/black iterations. The method has been implemented on a 16k Connection Machine-2, and S8 and S16 solutions have been obtained for fixed-source benchmark problems in X-Y geometry

  17. Cytomorphologic significance of marginal vacuoles in diffuse thyroid enlargements

    Directory of Open Access Journals (Sweden)

    Anshu Gupta

    2013-01-01

    Conclusions: A significant association was found between abundant MVs and a hyperthyroid state. Moderate/absent MVs in diffuse goiters were not found to correlate with thyroid function. Thus, all diffuse goiters with prominent MVs require hormonal evaluation to rule out hyperfunction of the thyroid.

  18. Significance of ultrasonography in selecting methods for the treatment of acute cholecystitis

    Directory of Open Access Journals (Sweden)

    Grzegorz Ćwik

    2013-09-01

    Full Text Available Surgical removal of the gallbladder is indicated in nearly all cases of complicated acute cholecystitis. In the 1990s, laparoscopic cholecystectomy became the method of choice in the treatment of cholecystolithiasis. Due to a large inflammatory reaction in the course of acute inflammation, a laparoscopic procedure is conducted in technically difficult conditions and entails the risk of complications. The aim of this paper was: (1) to analyze ultrasound images in acute cholecystitis; (2) to specify the most common causes of conversion from the laparoscopic method to open laparotomy; (3) to determine the degree to which the necessity for such a conversion may be predicted with the help of ultrasound examinations. Material and methods: In 1993–2011, in the Second Department and Clinic of General, Gastroenterological and Oncological Surgery of the Medical University in Lublin, 5,596 cholecystectomies were performed, including 4,105 laparoscopic procedures, which constituted 73.4% of all cholecystectomies. Five hundred and forty-two patients (13.2%) were qualified for the laparoscopic procedure despite manifesting typical symptoms of acute cholecystitis in ultrasound examination, which comprise: thickening of the gallbladder wall of >3 mm, inflammatory infiltration in the Calot's triangle region, gallbladder filled with stagnated or purulent contents, and mural or intramural effusion. Results: In the group of operated patients, conversion was necessary in 130 patients, i.e. in 24% of cases, in comparison with 3.8% of patients with uncomplicated cholecystolithiasis (without the signs of inflammation). The conversion most frequently occurred when the assessment of the anatomical structures of the Calot's triangle was rendered more difficult due to local inflammatory process, mural effusion and thickening of the gallbladder wall of >5 mm. The remaining changes occurred more rarely. Conclusions: Based on imaging scans, the most common causes of conversion

  19. A fuzzy model for exploiting customer requirements

    Directory of Open Access Journals (Sweden)

    Zahra Javadirad

    2016-09-01

    Full Text Available Nowadays, Quality Function Deployment (QFD) is one of the total quality management tools, in which customers' views and requirements are perceived and, using various techniques, production requirements and operations are improved. The QFD department, after identification and analysis of the competitors, takes customers' feedback to meet the customers' demands for the products compared with the competitors. In this study, a comprehensive model for assessing the importance of the customer requirements in the products or services of an organization is proposed. The proposed study uses linguistic variables, as a more comprehensive approach, to increase the precision of the expressed evaluations. The importance of these requirements specifies the strengths and weaknesses of the organization in meeting the requirements relative to competitors. The results of these experiments show that the proposed method performs better than the other methods.
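
    The abstract does not spell out its formulation, but a common way to realize linguistic variables in fuzzy QFD models is to map linguistic ratings to triangular fuzzy numbers, aggregate them across evaluators, and defuzzify to rank the customer requirements. The sketch below is one minimal, assumption-laden version; the scale, ratings and requirement names are all hypothetical.

      import numpy as np

      # Triangular fuzzy numbers (a, m, b) for an assumed linguistic scale.
      TFN = {"low": (0.0, 0.0, 0.25), "medium": (0.25, 0.5, 0.75),
             "high": (0.5, 0.75, 1.0), "very high": (0.75, 1.0, 1.0)}

      def centroid(tfn):
          # Defuzzify a triangular fuzzy number by its centroid.
          a, m, b = tfn
          return (a + m + b) / 3.0

      # Hypothetical evaluations of three customer requirements by two experts.
      ratings = {"delivery time": ["high", "very high"],
                 "price":         ["medium", "high"],
                 "packaging":     ["low", "medium"]}

      scores = {req: np.mean([centroid(TFN[r]) for r in rs])
                for req, rs in ratings.items()}
      for req, score in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(f"{req}: {score:.2f}")   # higher score = more important requirement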

  20. Post-operative analgesic requirement in non-closure and closure of peritoneum during open appendectomy

    International Nuclear Information System (INIS)

    Khan, A.W.; Maqsood, R.; Saleem, M.M.

    2017-01-01

    To compare the mean post-operative analgesic requirement in non-closure and closure of the peritoneum during open appendectomy. Study Design: Randomized controlled trial. Place and Duration of Study: Department of General Surgery, Combined Military Hospital Quetta, from 1st August 2014 to 30th April 2015. Material and Methods: A total of 60 patients were included in this study and were divided into two groups of 30 each. Patients in group A underwent open appendectomy with closure of the peritoneum while patients in group B had non-closure of the peritoneum during the same procedure. Post-operatively, pain severity was assessed on the visual analogue scale (VAS) numeric pain distress scale. For a VAS numeric pain distress score between 5 and 7, intramuscular (IM) diclofenac sodium was given, and for a score >7, intravenous (IV) tramadol was given. The final outcome was measured at day 0 and day 1. Results: Pain scores and analgesic requirements were significantly less in the non-closure group than the closure group on day 0 and day 1, showing a statistically significant difference between the two groups. Conclusion: Mean post-operative analgesic requirement is significantly less in non-closure as compared to closure of the peritoneum during open appendectomy. (author)

  2. Energy Requirements in Critically Ill Patients.

    Science.gov (United States)

    Ndahimana, Didace; Kim, Eun-Kyung

    2018-04-01

    During the management of critical illness, optimal nutritional support is an important key for achieving positive clinical outcomes. Compared to healthy people, critically ill patients have higher energy expenditure, thereby their energy requirements and risk of malnutrition being increased. Assessing individual nutritional requirement is essential for a successful nutritional support, including the adequate energy supply. Methods to assess energy requirements include indirect calorimetry (IC) which is considered as a reference method, and the predictive equations which are commonly used due to the difficulty of using IC in certain conditions. In this study, a literature review was conducted on the energy metabolic changes in critically ill patients, and the implications for the estimation of energy requirements in this population. In addition, the issue of optimal caloric goal during nutrition support is discussed, as well as the accuracy of selected resting energy expenditure predictive equations, commonly used in critically ill patients.

  3. Requirements of Inconel 718 alloy for aeronautical applications

    Science.gov (United States)

    Ghiban, Brandusa; Elefterie, Cornelia Florina; Guragata, Constantin; Bran, Dragos

    2018-02-01

    The main requirements imposed on aviation components made from nickel-based superalloys are presented in this paper. A significant portion of fasteners, locking lugs, blade retainers and inserts are manufactured from Inconel 718 alloy. The paper describes environmental factors (corrosion), conditions of external aggression (salt air, intense heat, heavy industrial pollution, high condensation, high pressure), mechanical characteristics (tensile strength, creep, density, yield strength, fracture toughness, fatigue resistance) and loadings (tension and compression loads) that must be satisfied simultaneously by the Ni-based superalloy, compared with other classes of aviation alloys (e.g. titanium alloys, aluminum alloys). For this alloy the requirements are strength, durability, damage tolerance, fail safety and so on. Corrosion can be an issue, but fatigue under high-magnitude cyclic tensile loading is what limits the lifetime of the airframe. The excellent malleability and weldability characteristics of the 718 system make the material's physical properties tolerant of manufacturing processes. These characteristics additionally continue to provide new opportunities for advanced manufacturing methods.

  4. Compliance to two city convenience store ordinance requirements

    Science.gov (United States)

    Menéndez, Cammie K Chaumont; Amandus, Harlan E; Wu, Nan; Hendricks, Scott A

    2015-01-01

    Background Robbery-related homicides and assaults are the leading cause of death in retail businesses. Robbery reduction approaches focus on compliance to Crime Prevention Through Environmental Design (CPTED) guidelines. Purpose We evaluated the level of compliance to CPTED guidelines specified by convenience store safety ordinances effective in 2010 in Dallas and Houston, Texas, USA. Methods Convenience stores were defined as businesses less than 10 000 square feet that sell grocery items. Store managers were interviewed for store ordinance requirements from August to November 2011, in a random sample of 594 (289 in Dallas, 305 in Houston) convenience stores that were open before and after the effective dates of their city’s ordinance. Data were collected in 2011 and analysed in 2012–2014. Results Overall, 9% of stores were in full compliance, although 79% reported being registered with the police departments as compliant. Compliance was consistently significantly higher in Dallas than in Houston for many requirements and by store type. Compliance was lower among single owner-operator stores compared with corporate/franchise stores. Compliance to individual requirements was lowest for signage and visibility. Conclusions Full compliance to the required safety measures is consistent with industry ‘best practices’ and evidence-based workplace violence prevention research findings. In Houston and Dallas compliance was higher for some CPTED requirements but not the less costly approaches that are also the more straightforward to adopt. PMID:26337569

  5. Digital Resonant Controller based on Modified Tustin Discretization Method

    Directory of Open Access Journals (Sweden)

    STOJIC, D.

    2016-11-01

    Full Text Available Resonant controllers are used in power converter voltage and current control due to their simplicity and accuracy. However, digital implementation of resonant controllers introduces problems related to zero and pole mapping from the continuous to the discrete time domain. Namely, some discretization methods introduce significant errors in the digital controller resonant frequency, resulting in the loss of asymptotic AC reference tracking, especially at high resonant frequencies. The delay compensation typical for resonant controllers can also be compromised. Based on the existing analysis, it can be concluded that Tustin discretization with frequency prewarping represents a preferable choice from the point of view of resonant frequency accuracy. However, this discretization method has a shortcoming in applications that require real-time frequency adaptation, since a complex trigonometric evaluation is required for each frequency change. In order to overcome this problem, in this paper a modified Tustin discretization method is proposed based on the Taylor series approximation of the frequency prewarping function. By comparing the novel discretization method with commonly used two-integrator-based proportional-resonant (PR) digital controllers, it is shown that the resulting digital controller resonant frequency and time delay compensation errors are significantly reduced for the novel controller.
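
    A minimal sketch of the idea follows (the paper's exact controller structure and approximation order may differ). An ideal resonant term R(s) = s/(s^2 + w0^2) is discretized with the bilinear substitution s = K(z-1)/(z+1); the prewarping gain K = w0/tan(w0*T/2) is replaced by its Taylor-series approximation so that no trigonometric evaluation is needed when the resonant frequency w0 is adapted at run time.

      import numpy as np

      def resonant_coeffs(w0, T, taylor=True):
          # Discretize R(s) = s/(s^2 + w0^2) by substituting s = K(z-1)/(z+1).
          if taylor:
              # Taylor approximation of the prewarping gain:
              # w0/tan(w0*T/2) ~ (2/T)*(1 - (w0*T)**2/12), trigonometry-free.
              K = (2.0 / T) * (1.0 - (w0 * T) ** 2 / 12.0)
          else:
              K = w0 / np.tan(w0 * T / 2.0)     # exact frequency prewarping
          d = K * K + w0 * w0
          b = np.array([K, 0.0, -K]) / d        # numerator of R(z)
          a = np.array([1.0, 2.0 * (w0 * w0 - K * K) / d, 1.0])  # denominator
          return b, a

      # Example: 350 Hz resonant frequency, 10 kHz sampling.
      b, a = resonant_coeffs(2 * np.pi * 350.0, 1.0 / 10000.0)
      print(b, a)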

  6. Visual soil evaluation - future research requirements

    Science.gov (United States)

    Emmet-Booth, Jeremy; Forristal, Dermot; Fenton, Owen; Ball, Bruce; Holden, Nick

    2017-04-01

    A review of Visual Soil Evaluation (VSE) techniques (Emmet-Booth et al., 2016) highlighted their established utility for soil quality assessment, though some limitations were identified. (1) The examination of aggregate size, visible intra-porosity and shape forms a key assessment criterion in almost all methods, thus limiting evaluation to structural form. The addition of criteria that examine structure holistically may be desirable. For example, structural stability can be indicated using dispersion tests or by examining soil surface crusting, while the assessment of soil colour may indirectly indicate soil organic matter content, a contributor to stability. Organic matter assessment may also indicate structural resilience, along with rooting, earthworm numbers or shrinkage cracking. (2) Soil texture may influence results or impede method deployment. Modification of procedures to account for extreme texture variation is desirable. For example, evidence of compaction in sandy or single-grain soils differs greatly from that in clayey soils. Some procedures incorporate separate classification systems or adjust deployment based on texture. (3) Research into the impacts of soil moisture content on VSE evaluation criteria is required. Criteria such as rupture resistance and shape may be affected by moisture content. It is generally recommended that methods are deployed on moist soils, and quantification of the influence of moisture variation on results is necessary. (4) Robust sampling strategies for method deployment are required. The handling of spatial variation differs between methods, but where methods can be deployed over large areas, clear instruction on sampling is required. Additionally, as emphasis has so far been placed on the agricultural production function of soil, the ability of VSE to explore structural quality in terms of carbon storage, water purification and biodiversity support also requires research. References: Emmet-Booth, J.P., Forristal, P.D., Fenton, O., Ball, B

  7. Requirements and testing methods for surfaces of metallic bipolar plates for low-temperature PEM fuel cells

    Science.gov (United States)

    Jendras, P.; Lötsch, K.; von Unwerth, T.

    2017-03-01

    To reduce emissions and to substitute combustion engines, automotive manufacturers, legislators and early adopters are pursuing hydrogen fuel cell vehicles. Up to now the focus of research has been on ensuring the functionality and increasing the durability of fuel cell components, and expensive materials were used for this purpose. Contemporary research and development try to substitute these substances with more cost-effective material combinations. The bipolar plate is a key component with the greatest influence on the volume and mass of a fuel cell stack, and it has to meet complex requirements. Bipolar plates support the bending-sensitive components of the stack, spread reactants over the active cell area and form the electrical contact to the next cell. Furthermore, bipolar plates dissipate the heat of reaction and separate one cell gas-tight from the other. Consequently, they need a low interfacial contact resistance (ICR) to the gas diffusion layer, high flexural strength, good thermal conductivity and high durability. To reduce costs, stainless steel is a favoured material for bipolar plates in automotive applications. Steel is characterized by good electrical and thermal conductivity, but the acid environment requires high chemical durability against corrosion as well. On the one hand, formation of a passivating oxide layer that increases ICR should be inhibited; on the other hand, pitting corrosion leading to an increased permeation rate may not occur. Therefore, a suitable substrate and coating combination is sought. In this study, material testing methods for bipolar plates are considered.

  8. 40 CFR 52.683 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... quality. 52.683 Section 52.683 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) The State of Idaho Rules for Control of Air Pollution in Idaho, specifically... the Clean Air Act for preventing significant deterioration of air quality. (b) The requirements of...

  9. 40 CFR 52.144 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... quality. 52.144 Section 52.144 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) The requirements of sections 160 through 165 of the Clean Act are not met... lands does not include approvable procedures for preventing the significant deterioration of air quality...

  10. 40 CFR 52.738 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... quality. 52.738 Section 52.738 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) The requirements of sections 160 through 165 of the Clean Air Act are not met... air quality. (b) Regulations for preventing significant deterioration of air quality. The provisions...

  11. 40 CFR 52.793 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... quality. 52.793 Section 52.793 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) The requirements of sections 160 through 165 of the Clean Air Act are not met... air quality. (b) Regulations for preventing significant deterioration of air quality. The provisions...

  12. 40 CFR 52.632 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... quality. 52.632 Section 52.632 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) The requirements of sections 160 through 165 of the Clean Air Act are not met... air quality. (b) Regulations for preventing significant deterioration of air quality. The provisions...

  13. Detection of EGFR and COX-2 Expression by Immunohistochemical Method on a Tissue Microarray Section in Lung Cancer and Biological Significance

    Directory of Open Access Journals (Sweden)

    Xinyun WANG

    2010-02-01

    Full Text Available Background and objective: Epidermal growth factor receptor (EGFR) and cyclooxygenase-2 (COX-2), which can regulate growth, invasion and metastasis of tumors through the relevant signaling pathways, have been detected in a variety of solid tumors. The aim of this study is to investigate the biological significance of EGFR and COX-2 expression in lung cancer and the relationship between them. Methods: The expression of EGFR and COX-2 was detected in 89 primary lung cancer tissues, 12 premalignant lesions, 12 lymph node metastases, and 10 normal lung tissues as the control by an immunohistochemical method on a tissue microarray section. Results: EGFR protein was detectable in 59.6%, 41.7%, and 66.7% of primary lung cancer tissues, premalignant lesions and lymph node metastases, respectively; COX-2 protein was detectable in 52.8%, 41.7%, and 66.7% of primary lung cancer tissues, premalignant lesions and lymph node metastases, respectively, which were significantly higher than those of the control (P<0.05). COX-2 expression was related to gross type (P<0.05). A highly positive correlation was observed between EGFR and COX-2 expression (P<0.01). Conclusion: Overexpression of EGFR and COX-2 may play an important role in the tumorigenesis, progression and malignancy of lung cancer. Detection of EGFR and COX-2 expression might be helpful for the diagnosis and prognosis of lung cancer.

  14. Space Suit Joint Torque Measurement Method Validation

    Science.gov (United States)

    Valish, Dana; Eversley, Karina

    2012-01-01

    In 2009 and early 2010, a test method was developed and performed to quantify the torque required to manipulate joints in several existing operational and prototype space suits. This was done in an effort to develop joint torque requirements appropriate for a new Constellation Program space suit system. The same test method was levied on the Constellation space suit contractors to verify that their suit design met the requirements. However, because the original test was set up and conducted by a single test operator there was some question as to whether this method was repeatable enough to be considered a standard verification method for Constellation or other future development programs. In order to validate the method itself, a representative subset of the previous test was repeated, using the same information that would be available to space suit contractors, but set up and conducted by someone not familiar with the previous test. The resultant data was compared using graphical and statistical analysis; the results indicated a significant variance in values reported for a subset of the re-tested joints. Potential variables that could have affected the data were identified and a third round of testing was conducted in an attempt to eliminate and/or quantify the effects of these variables. The results of the third test effort will be used to determine whether or not the proposed joint torque methodology can be applied to future space suit development contracts.

  15. Roles and significance of water conducting features for transport models in performance assessment

    International Nuclear Information System (INIS)

    Carrera, J.; Sanchez-Vila, X.; Medina, A.

    1999-01-01

    The term water conducting features (WCF) refers to zones of high hydraulic conductivity. In the context of waste disposal, it is further implied that they are narrow so that chances of sampling them are low. Yet, they may carry significant amounts of water. Moreover, their relatively small volumetric water content causes solutes to travel fast through them. Water-conducting features are a rather common feature of natural media. The fact that they have become a source of concern in recent years, reflects more the increased level of testing and monitoring than any intrinsic property of low permeability media. Accurate simulations of solute transport require a realistic accounting for water conducting features. Methods are presented to do so and examples are shown to illustrate these methods. Since detailed accounting of WCF's will not be possible in actual performance assessments, efforts should be directed towards typification, so as to identify the essential effects of WCF's on solute transport through different types of rocks. Field evidence suggests that, although individual WCF's may be difficult to characterize, their effects are quite predictable. (author)

  16. Assay of serum ferritin by two different radioimmunometric methods and its clinical significance

    International Nuclear Information System (INIS)

    Kaltwasser, J.P.; Werner, E.; Gesellschaft fuer Strahlen- und Umweltforschung m.b.H., Frankfurt am Main

    1977-01-01

    Serum ferritin was measured by two different radioimmunometric methods: a) the Addison assay, and b) a commercial radioimmunoassay. Iron storage in the body was determined using 59Fe. A close correlation was found between serum ferritin and iron storage in the body. (AJ) [de

  17. Applying WHO's 'workforce indicators of staffing need' (WISN) method to calculate the health worker requirements for India's maternal and child health service guarantees in Orissa State.

    Science.gov (United States)

    Hagopian, Amy; Mohanty, Manmath K; Das, Abhijit; House, Peter J

    2012-01-01

    In one district of Orissa state, we used the World Health Organization's Workforce Indicators of Staffing Need (WISN) method to calculate the number of health workers required to achieve the maternal and child health 'service guarantees' of India's National Rural Health Mission (NRHM). We measured the difference between this ideal number and current staffing levels. We collected census data, routine health information data and government reports to calculate demand for maternal and child health services. By conducting 54 interviews with physicians and midwives, and six focus groups, we were able to calculate the time required to perform necessary health care tasks. We also interviewed 10 new mothers to cross-check these estimates at a global level and get assessments of quality of care. For 18 service centres of Ganjam District, we found 357 health workers in our six cadre categories, to serve a population of 1.02 million. Total demand for the MCH services guaranteed under India's NRHM outpaced supply for every category of health worker but one. To properly serve the study population, the health workforce supply should be enhanced by 43 additional physicians, 15 nurses and 80 nurse midwives. Those numbers probably under-estimate the need, as they assume away geographic barriers. Our study established time standards in minutes for each MCH activity promised by the NRHM, which could be applied elsewhere in India by government planners and civil society advocates. Our calculations indicate significant numbers of new health workers are required to deliver the services promised by the NRHM.
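
    The core WISN arithmetic is simple enough to sketch: for each service, the annual workload is divided by a standard workload (annual available working time divided by the time standard per case, as established from the interviews), and the per-service requirements are summed. All figures below are hypothetical, not the Ganjam District data.

      # WISN-style staffing estimate with invented numbers.
      available_minutes = 220 * 8 * 60          # working days/yr x h/day x min/h

      activities = {                            # annual volume, minutes per case
          "antenatal visit":   (24000, 15),
          "facility delivery": (3000, 240),
          "immunization":      (18000, 10),
      }

      required_staff = 0.0
      for volume, minutes_per_case in activities.values():
          standard_workload = available_minutes / minutes_per_case  # cases/worker/yr
          required_staff += volume / standard_workload

      current_staff = 20
      print(f"required: {required_staff:.1f} workers, "
            f"gap: {required_staff - current_staff:.1f}")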

  18. Applied mathematical methods in nuclear thermal hydraulics

    International Nuclear Information System (INIS)

    Ransom, V.H.; Trapp, J.A.

    1983-01-01

    Applied mathematical methods are used extensively in modeling of nuclear reactor thermal-hydraulic behavior. This application has required significant extension to the state-of-the-art. The problems encountered in modeling of two-phase fluid transients and the development of associated numerical solution methods are reviewed and quantified using results from a numerical study of an analogous linear system of differential equations. In particular, some possible approaches for formulating a well-posed numerical problem for an ill-posed differential model are investigated and discussed. The need for closer attention to numerical fidelity is indicated

  19. Relationship between CT visual score and lung volume which is measured by helium dilution method and body plethysmographic method in patients with pulmonary emphysema

    International Nuclear Information System (INIS)

    Toyoshima, Hideo; Ishibashi, Masayoshi; Senju, Syoji; Tanaka, Hideki; Aritomi, Takamichi; Watanabe, Kentaro; Yoshida, Minoru

    1997-01-01

    We examined the relationship between CT visual score and pulmonary function studies in patients with pulmonary emphysema. Lung volume was measured using the helium dilution method and the body plethysmographic method. Although airflow obstruction and overinflation measured by the helium dilution method did not correlate with the CT visual score, CO diffusing capacity per alveolar volume (DLCO/VA) showed a significant negative correlation with the CT visual score (r=-0.49, p<0.05). These results suggest that DLCO/VA reflects pathologic change in pulmonary emphysema. Further, both the helium dilution method and the body plethysmographic method are required to evaluate lung volume in pulmonary emphysema because of its ventilatory unevenness. (author)

  20. Method for assessing reliability of a network considering probabilistic safety assessment

    International Nuclear Information System (INIS)

    Cepin, M.

    2005-01-01

    A method for assessing the reliability of a network is developed, which uses the features of fault tree analysis. The method is developed in such a way that growth of the network under consideration does not require a significant enlargement of the model. The method is applied to small examples of networks consisting of a small number of nodes and a small number of connections between them. The results give the network reliability. They identify equipment which is to be carefully maintained so that the network reliability is not reduced, and equipment which is a candidate for redundancy, as this would improve the network reliability significantly. (author)
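
    The paper's model is not reproduced in the abstract, but a minimal fault-tree-flavoured sketch of the computation is inclusion-exclusion over minimal cut sets, assuming independent component failures; the network links, failure probabilities and cut sets below are invented for illustration.

      # Network unreliability from (assumed) minimal cut sets.
      p_fail = {"a": 0.01, "b": 0.02, "c": 0.01, "d": 0.05}   # link failure probs
      min_cut_sets = [{"a", "b"}, {"c", "d"}, {"a", "d"}]     # hypothetical cut sets

      def prob_union(cut_sets, p):
          # P(any cut set fully failed), by inclusion-exclusion.
          total, n = 0.0, len(cut_sets)
          for mask in range(1, 2 ** n):
              chosen = [cs for i, cs in enumerate(cut_sets) if mask >> i & 1]
              union = set().union(*chosen)
              term = 1.0
              for component in union:
                  term *= p[component]
              total += (-1) ** (len(chosen) + 1) * term
          return total

      q = prob_union(min_cut_sets, p_fail)
      print(f"network unreliability ~ {q:.6f}")   # reliability = 1 - q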

  1. 40 CFR 52.884 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 3 2010-07-01 2010-07-01 false Significant deterioration of air quality. 52.884 Section 52.884 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) The requirements of section 160 through 165 of the Clean Air Act, as amended...

  2. 40 CFR 52.343 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... quality. 52.343 Section 52.343 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) The requirements of sections 160 through 165 of the Clean Air Act are not met for the following categories of sources for preventing the significant deterioration of air quality...

  3. 40 CFR 52.833 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... quality. 52.833 Section 52.833 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) The requirements of sections 160 through 165 of the Clean Air Act are met... for preventing significant deterioration of air quality. The provisions of § 52.21 except paragraph (a...

  4. 40 CFR 52.986 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... quality. 52.986 Section 52.986 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) The plan submitted by the Governor of Louisiana on August 14, 1984 (as adopted... preventing significant deterioration of air quality. (b) The requirements of sections 160 through 165 of the...

  5. 40 CFR 52.581 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 3 2010-07-01 2010-07-01 false Significant deterioration of air quality. 52.581 Section 52.581 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) All applications and other information required pursuant to § 52.21 of this...

  6. 40 CFR 52.432 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... quality. 52.432 Section 52.432 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) The requirements of sections 160 through 165 of the Clean Air Act are not met... air quality. (b) Regulation for preventing significant deterioration of air quality. The provisions of...

  7. Moral significance of phenomenal consciousness.

    Science.gov (United States)

    Levy, Neil; Savulescu, Julian

    2009-01-01

    Recent work in neuroimaging suggests that some patients diagnosed as being in the persistent vegetative state are actually conscious. In this paper, we critically examine this new evidence. We argue that though it remains open to alternative interpretations, it strongly suggests the presence of consciousness in some patients. However, we argue that its ethical significance is less than many people seem to think. There are several different kinds of consciousness, and though all kinds of consciousness have some ethical significance, different kinds underwrite different kinds of moral value. Demonstrating that patients have phenomenal consciousness--conscious states with some kind of qualitative feel to them--shows that they are moral patients, whose welfare must be taken into consideration. But only if they are subjects of a sophisticated kind of access consciousness--where access consciousness entails global availability of information to cognitive systems--are they persons, in the technical sense of the word employed by philosophers. In this sense, being a person is having the full moral status of ordinary human beings. We call for further research which might settle whether patients who manifest signs of consciousness possess the sophisticated kind of access consciousness required for personhood.

  8. Which Criteria are More Valuable in Defining Hemodynamic Significance of Patent Ductus Arteriosus in Premature Infants? Respiratory or Echocardiographic?

    Directory of Open Access Journals (Sweden)

    İrfan Oğuz Şahin

    2017-03-01

    Full Text Available Aim: Patent ductus arteriosus (PDA) is a frequent health problem in premature infants. Pharmacologic closure is recommended only for hemodynamically significant PDA (hsPDA), which is defined according to clinical and echocardiographic criteria. The aim of this study was to explore the value of commonly used criteria in defining hsPDA and predicting the required number of courses of ibuprofen treatment to close PDA in premature infants. Methods: Sixty premature infants with a gestational age of ≤33 weeks were evaluated prospectively. Clinical and echocardiographic criteria [O2 requirement, ductus diameter (DD) and left atrial-to-aortic root diameter ratio (LA:Ao)] were used to define hsPDA. Clinical improvement after pharmacologic closure of PDA and the association between the criteria and the required number of ibuprofen courses were investigated. Results: O2 requirement decreased with PDA closure but was not different between patients with hsPDA and the other patients with PDA. Also, O2 requirement was not found to be associated with the required number of ibuprofen courses. DD and LA:Ao were greater in patients with hsPDA. DD was found to be associated with the required number of courses of ibuprofen treatment. Conclusion: Although there was an improvement in O2 requirement with PDA closure, the echocardiographic criteria were found to be more valuable in defining hsPDA. DD should also be used to estimate the duration of treatment.

  9. Optimized distributed systems achieve significant performance improvement on sorted merging of massive VCF files.

    Science.gov (United States)

    Sun, Xiaobo; Gao, Jingjing; Jin, Peng; Eng, Celeste; Burchard, Esteban G; Beaty, Terri H; Ruczinski, Ingo; Mathias, Rasika A; Barnes, Kathleen; Wang, Fusheng; Qin, Zhaohui S

    2018-06-01

    Sorted merging of genomic data is a common data operation necessary in many sequencing-based studies. It involves sorting and merging genomic data from different subjects by their genomic locations. In particular, merging a large number of variant call format (VCF) files is frequently required in large-scale whole-genome sequencing or whole-exome sequencing projects. Traditional single-machine based methods become increasingly inefficient when processing large numbers of files due to the excessive computation time and Input/Output bottleneck. Distributed systems and more recent cloud-based systems offer an attractive solution. However, carefully designed and optimized workflow patterns and execution plans (schemas) are required to take full advantage of the increased computing power while overcoming bottlenecks to achieve high performance. In this study, we custom-design optimized schemas for three Apache big data platforms, Hadoop (MapReduce), HBase, and Spark, to perform sorted merging of a large number of VCF files. These schemas all adopt the divide-and-conquer strategy to split the merging job into sequential phases/stages consisting of subtasks that are conquered in an ordered, parallel, and bottleneck-free way. In two illustrating examples, we test the performance of our schemas on merging multiple VCF files into either a single TPED or a single VCF file, which are benchmarked with the traditional single/parallel multiway-merge methods, message passing interface (MPI)-based high-performance computing (HPC) implementation, and the popular VCFTools. Our experiments suggest all three schemas either deliver a significant improvement in efficiency or render much better strong and weak scalabilities over traditional methods. Our findings provide generalized scalable schemas for performing sorted merging on genetics and genomics data using these Apache distributed systems.
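
    To give a flavour of the divide-and-conquer idea (this is not the paper's optimized schema), the PySpark sketch below keys each VCF record by (chromosome, position), lets Spark range-partition and sort by that key, and writes partitions that can be concatenated into one globally sorted file. The paths and partition count are hypothetical, chromosome names sort lexicographically here, and a real merge must also reconcile VCF headers and sample columns.

      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("vcf-sorted-merge").getOrCreate()
      sc = spark.sparkContext

      def parse(line):
          # Key each record by (chromosome, position) for the global sort.
          chrom, pos, rest = line.split("\t", 2)
          return ((chrom, int(pos)), line)

      records = (sc.textFile("hdfs:///cohort/*.vcf")        # hypothetical input
                   .filter(lambda l: not l.startswith("#")) # drop VCF headers
                   .map(parse)
                   .sortByKey(numPartitions=64)             # range-partitioned sort
                   .values())

      records.saveAsTextFile("hdfs:///cohort/merged")       # hypothetical output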

  10. Methods and techniques for obtaining significant discharge measurements on high-voltage bushings

    Energy Technology Data Exchange (ETDEWEB)

    Jolley, H E.W.

    1965-05-01

    Forms of discharge tests are described and compared. The use of the Arman and Starr discharge bridge with cathode-ray-tube display is shown to be practicable for bushing testing up to 600 kV. Spurious discharge effects and the precautions necessary to eliminate them are discussed. Consideration is given to calibration methods and to the errors to be expected with various practical circuits. The problem of establishing safe discharge limits for bushings is considered on the basis of a large number of test results and on service experience.

  11. Some advanced parametric methods for assessing waveform distortion in a smart grid with renewable generation

    Science.gov (United States)

    Alfieri, Luisa

    2015-12-01

    Power quality (PQ) disturbances are becoming an important issue in smart grids (SGs) due to the significant economic consequences that they can generate on sensitive loads. However, SGs include several distributed energy resources (DERs) that can be interconnected to the grid with static converters, which can lower PQ levels. Among DERs, wind turbines and photovoltaic systems are expected to be used extensively due to the forecasted reduction in investment costs and other economic incentives. These systems can introduce significant time-varying voltage and current waveform distortions that require advanced spectral analysis methods. This paper provides an application of advanced parametric methods for assessing waveform distortions in SGs with dispersed generation. In particular, the standard International Electrotechnical Commission (IEC) method, some parametric methods (such as Prony and the Estimation of Signal Parameters via Rotational Invariance Techniques (ESPRIT)), and some hybrid methods are critically compared on the basis of their accuracy and the computational effort required.
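
    As an illustration of the parametric family discussed above, the sketch below implements a basic least-squares ESPRIT (a textbook variant, not necessarily the paper's implementation) and recovers the component frequencies of a synthetic distorted waveform; the signal, window length and model order are assumed for the example.

      import numpy as np

      def esprit(x, fs, n_components, m=60):
          # Build an m-row Hankel-style data matrix from sliding windows,
          # extract the dominant signal subspace, and solve the rotational
          # invariance equation between its two row-shifted sub-blocks.
          H = np.lib.stride_tricks.sliding_window_view(x, m).T
          U, _, _ = np.linalg.svd(H, full_matrices=False)
          Us = U[:, :n_components]
          phi = np.linalg.lstsq(Us[:-1], Us[1:], rcond=None)[0]
          w = np.angle(np.linalg.eigvals(phi))          # rad/sample
          return np.sort(np.abs(w)) * fs / (2 * np.pi)  # Hz

      fs = 3200.0
      t = np.arange(0, 0.2, 1 / fs)
      x = (np.sin(2 * np.pi * 50 * t) + 0.1 * np.sin(2 * np.pi * 250 * t)
           + 0.01 * np.random.randn(t.size))
      # Each real sinusoid contributes a +/- frequency pair, hence order 4.
      print(esprit(x, fs, n_components=4))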

  12. Incidence of blood transfusion requirement and factors associated with transfusion following liver lobectomy in dogs and cats: 72 cases (2007-2015).

    Science.gov (United States)

    Hanson, Kayla R; Pigott, Armi M; J Linklater, Andrew K

    2017-10-15

    OBJECTIVE To determine the incidence of blood transfusion, mortality rate, and factors associated with transfusion in dogs and cats undergoing liver lobectomy. DESIGN Retrospective case series. ANIMALS 63 client-owned dogs and 9 client-owned cats that underwent liver lobectomy at a specialty veterinary practice from August 2007 through June 2015. PROCEDURES Medical records were reviewed and data extracted regarding dog and cat signalment, hematologic test results before and after surgery, surgical method, number and identity of lobes removed, concurrent surgical procedures, hemoabdomen detected during surgery, incidence of blood transfusion, and survival to hospital discharge (for calculation of mortality rate). Variables were compared between patients that did and did not require transfusion. RESULTS 11 of 63 (17%) dogs and 4 of 9 cats required a blood transfusion. Mortality rate was 8% for dogs and 22% for cats. Pre- and postoperative PCV and plasma total solids concentration were significantly lower and mortality rate significantly higher in dogs requiring transfusion than in dogs not requiring transfusion. Postoperative PCV was significantly lower in cats requiring transfusion than in cats not requiring transfusion. No significant differences in any other variable were identified between dogs and cats requiring versus not requiring transfusion. CONCLUSIONS AND CLINICAL RELEVANCE Dogs and cats undergoing liver lobectomy had a high requirement for blood transfusion, and a higher requirement for transfusion should be anticipated in dogs with perioperative anemia and cats with postoperative anemia. Veterinarians performing liver lobectomies in dogs and cats should have blood products readily available.

  13. The Interview as an Approach to Elicit Requirements

    Directory of Open Access Journals (Sweden)

    Luz Marina Iriarte

    2013-07-01

    Full Text Available In many software projects, requirements elicitation is incomplete or inconsistent. One issue that contributes to this is that requirements engineers use a single elicitation method, which can cause a deficiency in the expected results. Among the factors contributing to the success of this stage of the life cycle is an adequate selection of the elicitation technique and of the other approaches needed. This article describes an experimental study of requirements elicitation in which a combination of methods and techniques was applied, and discusses the advantages of doing it this way. The results obtained allow us to conclude that, to achieve adequate elicitation, it is necessary to combine several techniques and methods.

  14. Evaluation of the effect of torsemide on warfarin dosage requirements.

    Science.gov (United States)

    Lai, Sophia; Momper, Jeremiah D; Yam, Felix K

    2017-08-01

    Background According to drug interaction databases, torsemide may potentiate the effects of warfarin. Evidence for this drug-drug interaction, however, is conflicting and the clinical significance is unknown. Objective The aim of this study is to evaluate the impact of torsemide initiation on warfarin dosage requirements. Setting This study was conducted at the Veterans Affairs Healthcare System in San Diego, California. Method A retrospective cohort study was conducted using Veterans Affairs data from patients who were converted from bumetanide to torsemide between March 2014 and July 2014. Patients were also prescribed and taking warfarin during the observation period. Warfarin dosage requirements were evaluated to determine if any changes occurred within the first 3 months of starting torsemide. Main outcome measure The primary outcome was the average weekly warfarin dose before and after torsemide initiation. Results Eighteen patients met study inclusion criteria. The weekly warfarin dose before and after initiation of torsemide was not significantly different (34 ± 15 and 34 ± 13 mg, p > 0.05). Of those eighteen patients, only two experienced elevations in INR that required a decrease in warfarin dosage after torsemide initiation. Between those two patients, dosage reductions ranged from 5.3 to 18%. Conclusion These results indicated that most patients did not require any warfarin dosage adjustments after torsemide was initiated. The potential for interaction, however, still exists. While empiric warfarin dosage adjustments are not recommended when initiating torsemide, increased monitoring is warranted to minimize the risk of adverse effects.

  15. 27 CFR 4.28 - Type designations of varietal significance.

    Science.gov (United States)

    2010-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2010-04-01 2010-04-01 false Type designations of varietal significance. 4.28 Section 4.28 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND... and separated by the required appellation of origin, the name(s) of the grape variety or varieties...

  16. 40 CFR 52.382 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 3 2010-07-01 2010-07-01 false Significant deterioration of air quality. 52.382 Section 52.382 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... deterioration of air quality. (a) The requirements of sections 160 through 165 of the Clean Air Act are not met...

  17. 40 CFR 52.1116 - Significant deterioration of air quality.

    Science.gov (United States)

    2010-07-01

    ... quality. 52.1116 Section 52.1116 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR... Significant deterioration of air quality. (a) The requirements of sections 160 through 165 of the Clean Air... deterioration of air quality. (b) The following provisions of 40 CFR 52.21 are hereby incorporated and made a...

  18. Molecular requirements for radiation-activated recombination

    International Nuclear Information System (INIS)

    Stevens, Craig W.; Zeng Ming; Stamato, Thomas; Cerniglia, George

    1997-01-01

    Purpose/Objective: The major stumbling block to successful gene therapy today is poor gene transfer. We hypothesized that ionizing radiation might activate cellular recombination, and so improve stable gene transfer. We further hypothesized that known DNA-damage-repair proteins might also be important in radiation-activated recombination. Materials and Methods: The effect of irradiation on stable gene transfer efficiency was determined in human (A549 and 39F) and rodent (NIH/3T3) cell lines. Continuous low dose rate and multiple radiation fractions were also tested. Nuclear extracts were made and the effect of irradiation on inter-plasmid recombination/ligation determined. Multiple DNA damage-repair-deficient cell lines were tested for radiation-activated recombination. Results: A significant radiation dose-dependent improvement in stable plasmid transfection (by as much as 1300-fold) is demonstrated in neoplastic and primary cells. An improvement in transient plasmid transfection is also seen, with as much as 85% of cells transiently expressing β-galactosidase (a 20- to 50-fold improvement). Stable transfection is only improved for linearized or nicked plasmids. Cells have improved gene transfer for at least 96 hours after irradiation. Both fractionated and continuous low-dose-rate irradiation are effective at improving stable gene transfer in mammalian cells, thus making relatively high radiation dose delivery clinically feasible. Inter-plasmid recombination is radiation dose dependent in nuclear extract assays, and the type of overhang (3', 5' or blunt end) significantly affects recombination efficiency and the type of product. The most common end-joining activity involves filling-in of the overhang followed by blunt-end ligation. Adenovirus is a linear, double-stranded DNA virus. We demonstrate that adenoviral infection efficiency is increased by irradiation. The duration of transgene expression is lengthened because the virus integrates with high efficiency (∼10

  19. A facile template method to synthesize significantly improved LiNi0.5Mn1.5O4 using corn stalk as a bio-template

    International Nuclear Information System (INIS)

    Liu, Guiyang; Kong, Xin; Sun, Hongyan; Wang, Baosen; Yi, Zhongzhou; Wang, Quanbiao

    2014-01-01

    In order to simplify the template method for the synthesis of cathode materials for lithium ion batteries, a facile template method using plant stalks as bio-templates has been introduced. Based on this method, LiNi0.5Mn1.5O4 spinel with significantly improved electrochemical performance has been synthesized using corn stalk as a template. X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR) and scanning electron microscopy (SEM) have been used to investigate the phase composition and micro-morphologies of the products. Charge-discharge measurements in lithium cells, cyclic voltammetry (CV) and electrochemical impedance spectroscopy (EIS) have been used to study the electrochemical performance of the products. The results indicate that the templated product exhibits higher crystallinity than the non-templated product. Both the templated and the non-templated products are combinations of the ordered space group P4332 and the disordered Fd-3m. The specific BET surface area of the templated product is about twice that of the non-templated product. Moreover, the electrochemical performance of the templated product, including specific capacity, cycling stability and rate capability, is significantly improved compared with the non-templated product, due to its higher crystallinity, larger Li+ diffusion coefficient and lower charge transfer resistance.

  20. Dictionary Pruning with Visual Word Significance for Medical Image Retrieval.

    Science.gov (United States)

    Zhang, Fan; Song, Yang; Cai, Weidong; Hauptmann, Alexander G; Liu, Sidong; Pujol, Sonia; Kikinis, Ron; Fulham, Michael J; Feng, David Dagan; Chen, Mei

    2016-02-12

    Content-based medical image retrieval (CBMIR) is an active research area for disease diagnosis and treatment but it can be problematic given the small visual variations between anatomical structures. We propose a retrieval method based on a bag-of-visual-words (BoVW) to identify discriminative characteristics between different medical images with Pruned Dictionary based on Latent Semantic Topic description. We refer to this as the PD-LST retrieval. Our method has two main components. First, we calculate a topic-word significance value for each visual word given a certain latent topic to evaluate how the word is connected to this latent topic. The latent topics are learnt, based on the relationship between the images and words, and are employed to bridge the gap between low-level visual features and high-level semantics. These latent topics describe the images and words semantically and can thus facilitate more meaningful comparisons between the words. Second, we compute an overall-word significance value to evaluate the significance of a visual word within the entire dictionary. We designed an iterative ranking method to measure overall-word significance by considering the relationship between all latent topics and words. The words with higher values are considered meaningful with more significant discriminative power in differentiating medical images. We evaluated our method on two public medical imaging datasets and it showed improved retrieval accuracy and efficiency.
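
    The abstract does not give the exact update rules, but the coupling it describes between topic significance and word significance lends itself to a HITS-style mutual-reinforcement iteration. The sketch below is an illustrative stand-in, not the paper's PD-LST algorithm: the topic-word matrix, the update rule and the pruning threshold are all assumptions.

```python
import numpy as np

def word_significance(topic_word, n_iter=50):
    """HITS-style mutual reinforcement over a (topics x words) matrix in which
    topic_word[t, w] plays the role of the topic-word significance value.
    Returns one overall significance score per visual word."""
    word_score = np.ones(topic_word.shape[1])
    for _ in range(n_iter):
        topic_score = topic_word @ word_score      # topics backed by strong words
        topic_score /= np.linalg.norm(topic_score)
        word_score = topic_word.T @ topic_score    # words backed by strong topics
        word_score /= np.linalg.norm(word_score)
    return word_score

rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(200), size=20)           # 20 latent topics, 200 visual words
scores = word_significance(P)
pruned_dictionary = np.argsort(scores)[-100:]      # keep the 100 strongest words
```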

  1. Computer-Aided Identification and Validation of Privacy Requirements

    Directory of Open Access Journals (Sweden)

    Rene Meis

    2016-05-01

    Full Text Available Privacy is a software quality that is closely related to security. The main difference is that security properties aim at the protection of assets that are crucial for the considered system, whereas privacy aims at the protection of personal data that are processed by the system. The identification of privacy protection needs in complex systems is a hard and error-prone task. Stakeholders whose personal data are processed might be overlooked, or the sensitivity and the need of protection of the personal data might be underestimated. The later personal data and the needs to protect them are identified during the development process, the more expensive it is to fix these issues, because the needed changes of the system-to-be often affect many functionalities. In this paper, we present a systematic method to identify the privacy needs of a software system based on a set of functional requirements by extending the problem-based privacy analysis (ProPAn) method. Our method is tool-supported and automated where possible to reduce the effort that has to be spent on the privacy analysis, which is especially important when considering complex systems. The contribution of this paper is a semi-automatic method to identify the relevant privacy requirements for a software-to-be based on its functional requirements. The considered privacy requirements address all dimensions of privacy that are relevant for software development. As our method is solely based on the functional requirements of the system-to-be, we enable users of our method to identify the privacy protection needs that have to be addressed by the software-to-be at an early stage of the development. As an initial evaluation of our method, we show its applicability on a small electronic health system scenario.

  2. Clinically significant bleeding in incurable cancer patients: effectiveness of hemostatic radiotherapy

    International Nuclear Information System (INIS)

    Cihoric, Nikola; Crowe, Susanne; Eychmüller, Steffen; Aebersold, Daniel M; Ghadjar, Pirus

    2012-01-01

    This study was performed to evaluate the outcome after hemostatic radiotherapy (RT) of significant bleeding in incurable cancer patients. Patients treated by hemostatic RT between November 2006 and February 2010 were retrospectively analyzed. Bleeding was assessed according to the World Health Organization (WHO) scale (grade 0 = no bleeding, 1 = petechial bleeding, 2 = clinically significant bleeding, 3 = bleeding requiring transfusion, 4 = bleeding associated with fatality). The primary endpoint was bleeding at the end of RT. Key secondary endpoints included overall survival (OS) and acute toxicity. The bleeding scores before and after RT were compared using the Wilcoxon signed-rank test. Time-to-event endpoints were estimated using the Kaplan-Meier method. Overall, 62 patients were analyzed, including 1 patient whose benign cause of bleeding was pseudomyxoma peritonei. Median age was 66 (range, 37–93) years. Before RT, bleeding was graded as 2 and 3 in 24 (39%) and 38 (61%) patients, respectively. A median dose of 20 (range, 5–45) Gy of hemostatic RT was applied to the bleeding site. At the end of RT, there was a statistically significant reduction in bleeding (p < 0.001); it was graded as 0 (n = 39), 1 (n = 12), 2 (n = 6), 3 (n = 4) and 4 (n = 1). With a median follow-up of 19.3 (range, 0.3–19.3) months, the 6-month OS rate was 43%. Forty patients (65%) died, 5 due to bleeding. No grade 3 or above acute toxicity was observed. Hemostatic RT appears to be a safe and effective treatment that achieves a clinically meaningful and statistically significant reduction of bleeding in incurable cancer patients
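
    For readers unfamiliar with the Wilcoxon signed-rank test used here, the following minimal sketch shows how paired before/after WHO bleeding grades can be compared in Python; the grades are invented for illustration and are not the study data.

```python
from scipy.stats import wilcoxon

# Hypothetical WHO bleeding grades (0-4) per patient, before and after RT.
before = [3, 2, 3, 3, 2, 3, 2, 3, 3, 2]
after  = [0, 0, 1, 2, 0, 1, 0, 2, 0, 1]

stat, p = wilcoxon(before, after)   # paired, non-parametric
print(f"Wilcoxon signed-rank: W={stat}, p={p:.4f}")
```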

  3. Numerical Feynman integrals with physically inspired interpolation: Faster convergence and significant reduction of computational cost

    Directory of Open Access Journals (Sweden)

    Nikesh S. Dattani

    2012-03-01

    Full Text Available One of the most successful methods for calculating reduced density operator dynamics in open quantum systems, one that can give numerically exact results, uses Feynman integrals. However, when simulating the dynamics for a given amount of time, the number of time steps that can realistically be used with this method is always limited; therefore, one often obtains an approximation of the reduced density operator at a sparse grid of points in time. Instead of relying only on ad hoc interpolation methods (such as splines) to estimate the system density operator in between these points, I propose a method that uses physical information to assist with this interpolation. This method is tested on a physically significant system, on which its use allows important qualitative features of the density operator dynamics to be captured with as little as two time steps in the Feynman integral. This method allows for an enormous reduction in the amount of memory and CPU time required for approximating density operator dynamics within a desired accuracy. Since this method does not change the way the Feynman integral itself is calculated, the value of the density operator approximation at the points in time used to discretize the Feynman integral will be the same whether or not this method is used, but its approximation in between these points in time is considerably improved by this method. A list of ways in which this proposed method can be further improved is presented in the last section of the article.
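
    As a concrete (and deliberately crude) illustration of physics-assisted interpolation, the sketch below spline-interpolates density-matrix samples on a sparse time grid and then re-imposes two physical constraints, Hermiticity and unit trace. The paper's physically inspired scheme is more sophisticated; this only shows where such constraints would enter, and the example trajectory is invented.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def interpolate_rho(t_sparse, rhos, t_fine):
    """Spline-interpolate each density-matrix element over time, then re-impose
    Hermiticity and unit trace at every interpolated point."""
    spline = CubicSpline(t_sparse, np.asarray(rhos), axis=0)
    out = spline(t_fine)
    out = 0.5 * (out + np.conj(np.swapaxes(out, -1, -2)))          # Hermiticity
    return out / np.trace(out, axis1=-2, axis2=-1)[:, None, None]  # unit trace

t_sparse = np.linspace(0.0, 1.0, 5)               # sparse Feynman-integral grid
rhos = [np.diag([0.5 + 0.1 * t, 0.5 - 0.1 * t]) for t in t_sparse]
rho_fine = interpolate_rho(t_sparse, rhos, np.linspace(0.0, 1.0, 50))
```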

  4. The energetic significance of cooking.

    Science.gov (United States)

    Carmody, Rachel N; Wrangham, Richard W

    2009-10-01

    While cooking has long been argued to improve the diet, the nature of the improvement has not been well defined. As a result, the evolutionary significance of cooking has variously been proposed as being substantial or relatively trivial. In this paper, we evaluate the hypothesis that an important and consistent effect of cooking food is a rise in its net energy value. The pathways by which cooking influences net energy value differ for starch, protein, and lipid, and we therefore consider plant and animal foods separately. Evidence of compromised physiological performance among individuals on raw diets supports the hypothesis that cooked diets tend to provide more energy. Mechanisms contributing to energy being gained from cooking include increased digestibility of starch and protein, reduced costs of digestion for cooked versus raw meat, and reduced energetic costs of detoxification and defence against pathogens. If cooking consistently improves the energetic value of foods through such mechanisms, its evolutionary impact depends partly on the relative energetic benefits of non-thermal processing methods used prior to cooking. We suggest that if non-thermal processing methods such as pounding were used by Lower Palaeolithic Homo, they likely provided an important increase in energy gain over unprocessed raw diets. However, cooking has critical effects not easily achievable by non-thermal processing, including the relatively complete gelatinisation of starch, efficient denaturing of proteins, and killing of food-borne pathogens. This means that however sophisticated the non-thermal processing methods were, cooking would have conferred incremental energetic benefits. While much remains to be discovered, we conclude that the adoption of cooking would have led to an important rise in energy availability. For this reason, we predict that cooking had substantial evolutionary significance.

  5. Testing for significance of phase synchronisation dynamics in the EEG.

    Science.gov (United States)

    Daly, Ian; Sweeney-Reed, Catherine M; Nasuto, Slawomir J

    2013-06-01

    A number of tests exist to check for statistical significance of phase synchronisation within the Electroencephalogram (EEG); however, the majority suffer from a lack of generality and applicability. They may also fail to account for temporal dynamics in the phase synchronisation, regarding synchronisation as a constant state instead of a dynamical process. Therefore, a novel test is developed for identifying the statistical significance of phase synchronisation based upon a combination of work characterising temporal dynamics of multivariate time-series and Markov modelling. We show how this method is better able to assess the significance of phase synchronisation than a range of commonly used significance tests. We also show how the method may be applied to identify and classify significantly different phase synchronisation dynamics in both univariate and multivariate datasets.
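
    The following sketch shows a conventional surrogate-based significance test for the phase-locking value, computed via the Hilbert transform. It treats synchronisation as a constant state over the window, which is precisely the simplification the paper's Markov-based test is designed to overcome; the signals and the circular-shift surrogate scheme are assumptions for illustration.

```python
import numpy as np
from scipy.signal import hilbert

def plv(x, y):
    """Phase-locking value of two signals via the Hilbert transform."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

def plv_significance(x, y, n_surr=200, seed=0):
    """Circular-shift surrogate test: shifting one signal destroys any phase
    relation while preserving its spectrum. Note this regards synchronisation
    as constant over the window -- the simplification the paper moves beyond."""
    rng = np.random.default_rng(seed)
    observed = plv(x, y)
    surr = [plv(x, np.roll(y, rng.integers(1, len(y)))) for _ in range(n_surr)]
    return observed, float(np.mean(np.array(surr) >= observed))   # empirical p

t = np.linspace(0, 10, 2000)
x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.random.default_rng(1).standard_normal(t.size)
y = np.sin(2 * np.pi * 3 * t + 0.4)
print(plv_significance(x, y))
```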

  6. A coupling method for a cardiovascular simulation model which includes the Kalman filter.

    Science.gov (United States)

    Hasegawa, Yuki; Shimayoshi, Takao; Amano, Akira; Matsuda, Tetsuya

    2012-01-01

    Multi-scale models of the cardiovascular system provide new insight that was unavailable with in vivo and in vitro experiments. For the cardiovascular system, multi-scale simulations provide a valuable perspective in analyzing the interaction of three phenomena occurring at different spatial scales: circulatory hemodynamics, ventricular structural dynamics, and myocardial excitation-contraction. In order to simulate these interactions, multi-scale cardiovascular simulation systems couple models that simulate different phenomena. However, coupling methods require a significant amount of calculation, since a system of non-linear equations must be solved at each timestep. Therefore, we propose a coupling method which decreases the amount of calculation by using the Kalman filter. In our method, the Kalman filter calculates approximations to the solution of the system of non-linear equations at each timestep. The approximations are then used as initial values for solving the system of non-linear equations. The proposed method decreases the number of iterations required by 94.0% compared to the conventional strong coupling method. When compared with a smoothing spline predictor, the proposed method required 49.4% fewer iterations.
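
    The sketch below illustrates the warm-starting idea on a toy scalar coupling equation rather than the paper's hemodynamics-structure-excitation models: a constant-velocity Kalman filter tracks the converged solutions, and its one-step prediction seeds the non-linear solver at the next timestep. All model details here are invented for illustration.

```python
import numpy as np
from scipy.optimize import fsolve

# Toy coupled system: at each timestep, solve F(x; t) = 0 for the coupling
# variable x (a stand-in for the interface unknowns of the real models).
def F(x, t):
    return x**3 + x - (2.0 + np.sin(0.1 * t))

# Constant-velocity Kalman filter over the solution trajectory:
# state s = [x, dx/dt]; the "measurement" is each converged solution.
A = np.array([[1.0, 1.0], [0.0, 1.0]])        # state transition (dt = 1)
H = np.array([[1.0, 0.0]])                    # we observe x only
Q, R = 1e-4 * np.eye(2), np.array([[1e-6]])   # process / measurement noise
s, P = np.zeros(2), np.eye(2)

for t in range(100):
    s, P = A @ s, A @ P @ A.T + Q             # predict...
    x = fsolve(F, s[0], args=(t,))[0]         # ...and warm-start the solver
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    s = s + (K @ np.array([[x - s[0]]])).ravel()   # measurement update
    P = (np.eye(2) - K @ H) @ P
```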

  7. Efficient methods of piping cleaning

    Directory of Open Access Journals (Sweden)

    Orlov Vladimir Aleksandrovich

    2014-01-01

    Full Text Available The article contains an analysis of efficient methods of cleaning the piping of water supply and sanitation systems. Special attention is paid to the ice cleaning method, in the course of which biological film and various mineral and organic deposits are removed by an ice crust built up on the inner surface of water supply and drainage pipes. These impurities are responsible for the deterioration of the organoleptic properties of the transported drinking water or for narrowing the cross-section of drainage pipes. The co-authors emphasize that, compared to other methods of pipe cleaning, the use of ice has a number of advantages owing to the relative simplicity and cheapness of the process, its economic efficiency and the lack of environmental risk. The equipment for performing ice cleaning is presented, along with its technological options, the duration of cleansing operations, and the volumes of removed pollution per unit length of water supply and drainage pipelines. It is noted that ice cleaning requires careful planning, both when preparing the ice and when feeding it into the pipe, and there are specific requirements to its quality. In particular, when cleaning a drinking water system, the ice applied should be hygienically clean and meet sanitary requirements. In pilot projects, quantitative and qualitative analyses of the sediments adsorbed by the ice have been conducted, along with measurements of the temperature and duration of the process. The degree of pollution of the pipeline was estimated from the volume of sediment removed per kilometre of pipeline. Cleaning pipelines using ice can be considered one of the trenchless technologies and a significant alternative to traditional methods of cleaning pipes. The method can be applied in urban pipeline systems of drinking water supply with diameters of 100—600 mm, and also to diversion collectors. Today about 450 km of pipelines in the world have been subjected to the ice cleaning method. The ice cleaning method is simple

  8. Requirements to nuclear power to significantly contribute to decarbonization

    Energy Technology Data Exchange (ETDEWEB)

    Zipper, Reinhard [Forschungs- und Technologieberatung Zipper (FTBZ), Haan (Germany)

    2016-06-15

    By the end of the UN Climate Conference 2015 in Paris, all members agreed to take action with the goal of limiting climate warming to less than 2 °C compared to the pre-industrial level. CO{sub 2} emissions are considered to be the main driver of climate change. This implies that all kinds of energy supply must be reengineered to avoid the deployment of fossil fuels as far as practicable. Nuclear energy has a minimal CO{sub 2} footprint and, with the ability to adjust flexibly to the individual demands of supply regions, it may be favored to play a major role in decarbonization. However, on this path some principal obstacles must be removed to gain general public, policy and industry acceptance.

  9. Multi-level significance of vulnerability indicators. Case study: Eastern Romania

    Science.gov (United States)

    Stanga, I. C.; Grozavu, A.

    2012-04-01

    Vulnerability assessment most frequently aims to emphasize the internal fragility of a system in comparison with a reference standard, with similar systems, or in relation to a given hazard. Internal fragility, either biophysical or structural, may affect the capacity to predict, to prepare for, to cope with or to recover from a disaster. Thus, vulnerability is linked to resilience and adaptive capacity. From the local level to the global one, vulnerability factors and the corresponding indicators differ, and their significance must be tested and validated in a well-structured conceptual and methodological framework. In this paper, the authors aim to show the real vulnerability of rural settlements in Eastern Romania in a multi-level approach. The research area, Tutova Hills, covers about 3,421 sq. km and has more than 200,000 inhabitants in 421 villages characterized by deficient accessibility, lack of endowments, subsistence agriculture, high pressure on the natural environment (especially on forest and soil resources), poverty and an aging population. Factors that could influence the vulnerability of these rural settlements have been inventoried and assigned to groups through a cluster analysis: habitat and technical urban facilities, infrastructure, economic, social and demographic indicators, environmental quality, management of emergency situations, etc. Firstly, the main difficulty was to convert qualitative variables into quantitative indicators and to standardize all values to make mathematical and statistical processing of the data possible. Secondly, the great variability of the vulnerability factors, their different measuring units and their high amplitude of variation require different methods of standardization in order to obtain values between zero (minimum vulnerability) and one (maximum vulnerability), as sketched below. Final vulnerability indicators were selected and integrated in a general scheme, according to their significance resulting from an appropriate factor analysis: linear and
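
    A common way to obtain such zero-to-one scores is min-max standardization, inverted for indicators where a higher raw value means lower vulnerability. The sketch below is a generic illustration with invented indicator values; it is not the authors' actual indicator set or weighting scheme.

```python
import numpy as np

def standardize(values, inverted=False):
    """Min-max scale an indicator to [0, 1], where 1 = maximum vulnerability.
    For indicators where a higher raw value means LOWER vulnerability
    (e.g., road density), the scale is inverted."""
    v = np.asarray(values, dtype=float)
    scaled = (v - v.min()) / (v.max() - v.min())
    return 1.0 - scaled if inverted else scaled

# Hypothetical indicators for five villages:
aging_index  = standardize([0.8, 1.4, 2.1, 0.9, 1.7])
road_density = standardize([1.2, 0.3, 0.8, 2.0, 0.5], inverted=True)
vulnerability = np.mean([aging_index, road_density], axis=0)   # equal weights
```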

  10. Precise method for correcting count-rate losses in scintillation cameras

    International Nuclear Information System (INIS)

    Madsen, M.T.; Nickles, R.J.

    1986-01-01

    Quantitative studies performed with scintillation detectors often require corrections for lost data because of the finite resolving time of the detector. Methods that monitor losses by means of a reference source or pulser have unacceptably large statistical fluctuations associated with their correction factors. Analytic methods that model the detector as a paralyzable system require an accurate estimate of the system resolving time. Because the apparent resolving time depends on many variables, including the window setting, source distribution, and the amount of scattering material, significant errors can be introduced by relying on a resolving time obtained from phantom measurements. These problems can be overcome by curve-fitting the data from a reference source to a paralyzable model in which the true total count rate in the selected window is estimated from the observed total rate. The resolving time becomes a free parameter in this method which is optimized to provide the best fit to the observed reference data. The fitted curve has the inherent accuracy of the reference source method with the precision associated with the observed total image count rate. Correction factors can be simply calculated from the ratio of the true reference source rate and the fitted curve. As a result, the statistical uncertainty of the data corrected by this method is not significantly increased
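
    For a paralyzable detector, the observed rate m relates to the true rate n as m = n·exp(-n·tau). The sketch below fits tau to (true, observed) reference pairs and derives multiplicative correction factors; it is a simplified stand-in for the paper's procedure, which estimates the true rate from the observed total rate during the fit, and the data here are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def paralyzable(n_true, tau):
    """Observed rate of a paralyzable detector with resolving time tau."""
    return n_true * np.exp(-n_true * tau)

# Synthetic reference data: known true rates (cps) vs. observed rates.
rng = np.random.default_rng(1)
n_true = np.array([1e3, 5e3, 2e4, 5e4, 1e5, 2e5])
m_obs = paralyzable(n_true, tau=3e-6) * (1 + 0.01 * rng.standard_normal(6))

(tau_fit,), _ = curve_fit(paralyzable, n_true, m_obs, p0=[1e-6])
correction = n_true / paralyzable(n_true, tau_fit)   # multiplicative loss factors
```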

  11. Requirements for a quality measurement instrument for semantic standards

    NARCIS (Netherlands)

    Folmer, E.J.A.; Krukkert, D.; Oude Luttighuis, P.; Hillegersberg van, J. van

    2010-01-01

    This study describes requirements for an instrument to measure the quality of semantic standards. A situational requirements engineering method was used, resulting in a goal-tree in which requirements are structured. This structure shows requirements related to the input of the instrument; stating

  12. A novel ultra-performance liquid chromatography hyphenated with quadrupole time of flight mass spectrometry method for rapid estimation of total toxic retronecine-type of pyrrolizidine alkaloids in herbs without requiring corresponding standards.

    Science.gov (United States)

    Zhu, Lin; Ruan, Jian-Qing; Li, Na; Fu, Peter P; Ye, Yang; Lin, Ge

    2016-03-01

    Nearly 50% of naturally occurring pyrrolizidine alkaloids (PAs) are hepatotoxic, and the majority of hepatotoxic PAs are retronecine-type PAs (RET-PAs). However, quantitative measurement of PAs in herbs/foodstuffs is often difficult because most reference PAs are unavailable. In this study, a rapid, selective, and sensitive UHPLC-QTOF-MS method was developed for the estimation of RET-PAs in herbs without requiring corresponding standards. This method is based on our previously established characteristic and diagnostic mass fragmentation patterns and the use of retrorsine for calibration. The use of a single RET-PA (i.e. retrorsine) for constructing the calibration curve was justified by the high similarity, with no significant differences, among calibration curves constructed from the peak areas of extracted ion chromatograms of the fragment ions at m/z 120.0813 or 138.0919 versus the concentrations of five representative RET-PAs. The developed method was successfully applied to measure the total content of toxic RET-PAs of diversified structures in fifteen potential PA-containing herbs. Copyright © 2014 Elsevier Ltd. All rights reserved.
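
    In its simplest reading, the single-standard calibration reduces to one linear fit of fragment-ion peak area against retrorsine concentration, which is then applied to every detected RET-PA. The sketch below illustrates that logic with invented numbers; it is not the validated assay.

```python
import numpy as np

# Hypothetical retrorsine calibration: concentration (ng/mL) vs. peak area of
# the extracted ion chromatogram of the m/z 120.0813 fragment.
conc = np.array([5.0, 10.0, 50.0, 100.0, 500.0])
area = np.array([1.1e4, 2.2e4, 1.08e5, 2.2e5, 1.1e6])
slope, intercept = np.polyfit(conc, area, 1)

def estimate_ret_pa(peak_area):
    """Estimate any RET-PA from the retrorsine line, relying on the shared
    response of the diagnostic m/z 120/138 fragments across RET-PAs."""
    return (peak_area - intercept) / slope

# Total toxic RET-PA content = sum over all peaks showing the diagnostic ions:
total = sum(estimate_ret_pa(a) for a in [3.4e4, 8.9e4, 1.2e5])
```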

  13. Limitations of the mnemonic-keyword method.

    Science.gov (United States)

    Campos, Alfredo; González, María Angeles; Amor, Angeles

    2003-10-01

    The effectiveness of the mnemonic-keyword method was investigated in 4 experiments in which participants were required to learn the 1st-language (L1, Spanish) equivalents of a list of 30 2nd-language words (L2, Latin). Experiments 1 (adolescents) and 2 (adults) were designed to assess whether the keyword method was more effective than the rote method; the researcher supplied the keyword, and the participants were allowed to pace themselves through the list. Experiments 3 (adolescents) and 4 (adults) were similar to Experiments 1 and 2 except that the participants were also supplied with a drawing that illustrated the relationship between the keyword and the L1 target word. All the experiments were performed with groups of participants in their classrooms (i.e., not in a laboratory context). In all experiments, the rote method was significantly more effective than was the keyword method.

  14. Adding Timing Requirements to the CODARTS Real-Time Software Design Method

    DEFF Research Database (Denmark)

    Bach, K.R.

    The CODARTS software design method considers how concurrent, distributed and real-time applications can be designed. Although accounting for the important issues of tasks and communication, the method does not provide means for expressing the timeliness of the tasks and communication directly...

  15. Active Learning with Rationales for Identifying Operationally Significant Anomalies in Aviation

    Science.gov (United States)

    Sharma, Manali; Das, Kamalika; Bilgic, Mustafa; Matthews, Bryan; Nielsen, David Lynn; Oza, Nikunj C.

    2016-01-01

    A major focus of the commercial aviation community is discovery of unknown safety events in flight operations data. Data-driven unsupervised anomaly detection methods are better at capturing unknown safety events compared to rule-based methods which only look for known violations. However, not all statistical anomalies that are discovered by these unsupervised anomaly detection methods are operationally significant (e.g., represent a safety concern). Subject Matter Experts (SMEs) have to spend significant time reviewing these statistical anomalies individually to identify a few operationally significant ones. In this paper we propose an active learning algorithm that incorporates SME feedback in the form of rationales to build a classifier that can distinguish between uninteresting and operationally significant anomalies. Experimental evaluation on real aviation data shows that our approach improves detection of operationally significant events by as much as 75% compared to the state-of-the-art. The learnt classifier also generalizes well to additional validation data sets.
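
    A minimal pool-based active learning loop with uncertainty sampling conveys the flavor of the approach; the rationale mechanism of the paper (reweighting the features an SME cites) is omitted here, and the classifier choice, query budget and synthetic data are all assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def active_learning_loop(X, y_oracle, n_init=10, n_queries=30, seed=0):
    """Pool-based uncertainty sampling: repeatedly ask the 'SME' (oracle) to
    label the anomaly the classifier is least certain about. Assumes both
    classes appear in the initial random sample."""
    rng = np.random.default_rng(seed)
    labeled = list(rng.choice(len(X), n_init, replace=False))
    clf = None
    for _ in range(n_queries):
        clf = LogisticRegression(max_iter=1000).fit(X[labeled], y_oracle[labeled])
        proba = clf.predict_proba(X)[:, 1]
        uncertainty = -np.abs(proba - 0.5)     # closest to 0.5 = least certain
        uncertainty[labeled] = -np.inf         # never re-query a labeled point
        labeled.append(int(np.argmax(uncertainty)))
    return clf, labeled

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 5))              # statistical anomalies (features)
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # hidden operational significance
clf, labeled = active_learning_loop(X, y)
```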

  16. Cold Vacuum Drying (CVD) Facility Technical Safety Requirements

    International Nuclear Information System (INIS)

    KRAHN, D.E.

    2000-01-01

    The Technical Safety Requirements (TSRs) for the Cold Vacuum Drying Facility define acceptable conditions, safe boundaries, the bases thereof, and the management and administrative controls required to ensure safe operation during receipt of multi-canister overpacks (MCOs) containing spent nuclear fuel, removal of free water from the MCOs using the cold vacuum drying process, and inerting and testing of the MCOs before transport to the Canister Storage Building. Controls required for public safety, significant defense in depth, significant worker safety, and for maintaining radiological and toxicological consequences below risk evaluation guidelines are included

  17. Band structure calculation of GaSe-based nanostructures using empirical pseudopotential method

    International Nuclear Information System (INIS)

    Osadchy, A V; Obraztsova, E D; Volotovskiy, S G; Golovashkin, D L; Savin, V V

    2016-01-01

    In this paper we present the results of computer simulation of the band structure of GaSe-based nanostructures using the empirical pseudopotential method. Calculations were performed using specially developed software that allows simulations to be run on a computing cluster. Application of this method significantly reduces the demands on computing resources compared to traditional approaches based on ab initio techniques, while providing adequate, comparable results. The use of cluster computing makes it possible to obtain information for structures that require an explicit account of a significant number of atoms, such as quantum dots and quantum pillars. (paper)
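
    The essence of the empirical pseudopotential method is to expand the wavefunction in plane waves and diagonalise a Hamiltonian whose potential enters only through a few fitted Fourier components. The one-dimensional sketch below shows this machinery with invented form factors; it is not the GaSe-specific model or the authors' cluster implementation.

```python
import numpy as np

a = 5.0                                    # lattice constant (arbitrary units)
G = 2 * np.pi / a * np.arange(-5, 6)       # 11 reciprocal-lattice vectors
V = {2 * np.pi / a: -0.20,                 # empirical form factors V(|G|)
     4 * np.pi / a: +0.05}                 # (values invented for illustration)

def bands(k, n_bands=4):
    """Diagonalise H_{GG'} = (1/2)(k+G)^2 delta_{GG'} + V(G - G')."""
    H = np.diag(0.5 * (k + G) ** 2)
    for i in range(len(G)):
        for j in range(len(G)):
            for g, v in V.items():
                if np.isclose(abs(G[i] - G[j]), g):
                    H[i, j] += v           # empirical Fourier component
    return np.linalg.eigvalsh(H)[:n_bands]

ks = np.linspace(-np.pi / a, np.pi / a, 51)
E = np.array([bands(k) for k in ks])       # lowest four bands across the BZ
```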

  18. Reduction in the Number of Comparisons Required to Create Matrix of Expert Judgment in the Comet Method

    Directory of Open Access Journals (Sweden)

    Sałabun Wojciech

    2014-09-01

    Full Text Available Multi-criteria decision-making (MCDM) methods are associated with the ranking of alternatives based on expert judgments made using a number of criteria. In the MCDM field, the distance-based approach is one popular method for obtaining a final ranking. One of the newest MCDM methods that uses the distance-based approach is the Characteristic Objects Method (COMET). In this method, the preferences of each alternative are obtained on the basis of the distance from the nearest characteristic objects and their values. For this purpose, the domain and fuzzy number sets for all the considered criteria are determined. The characteristic objects are obtained as the combinations of the crisp values of all the fuzzy numbers. The preference values of all the characteristic objects are determined based on the tournament method and the principle of indifference. Finally, the fuzzy model is constructed and used to calculate the preference values of the alternatives. In this way, a multi-criteria model is created that is free of the rank reversal phenomenon. In this approach, a matrix of expert judgments must be created; for this purpose, an expert has to compare all the characteristic objects with each other. The number of necessary comparisons grows quadratically with the number of objects. This study proposes an improvement of the COMET method that uses the transitivity of pairwise comparisons, as sketched below. Three numerical examples are used to illustrate the efficiency of the proposed improvement with respect to results from the original approach. The proposed improvement significantly reduces the number of comparisons necessary to create the matrix of expert judgments.
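
    A minimal sketch of the transitivity idea: before asking the expert about a pair, try to deduce the answer from already-known comparisons (if i beats k and k beats j, then i beats j). The oracle, the six-object example and the omission of tie propagation are assumptions; the paper's procedure is more complete.

```python
import numpy as np

def build_judgments(n, ask_expert):
    """Fill the expert-judgment matrix, inferring an entry by transitivity
    before querying the expert. Ties (0.5) are not propagated in this sketch."""
    M = np.full((n, n), np.nan)
    np.fill_diagonal(M, 0.5)
    queries = 0
    for i in range(n):
        for j in range(i + 1, n):
            val = None
            for k in range(n):
                if M[i, k] == 1.0 and M[k, j] == 1.0:
                    val = 1.0                       # i beats k, k beats j
                elif M[i, k] == 0.0 and M[k, j] == 0.0:
                    val = 0.0                       # i loses to k, k loses to j
            if val is None:
                val = ask_expert(i, j)              # 1.0, 0.5 or 0.0
                queries += 1
            M[i, j], M[j, i] = val, 1.0 - val
    return M, queries

# Six characteristic objects ranked by a hidden utility; the oracle stands in
# for the expert.
utility = [3, 1, 5, 2, 6, 4]
M, queries = build_judgments(6, lambda i, j: float(utility[i] > utility[j]))
print(queries, "expert comparisons instead of", 6 * 5 // 2)   # 8 instead of 15
```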

  19. Transforming Multidisciplinary Customer Requirements to Product Design Specifications

    Science.gov (United States)

    Ma, Xiao-Jie; Ding, Guo-Fu; Qin, Sheng-Feng; Li, Rong; Yan, Kai-Yin; Xiao, Shou-Ne; Yang, Guang-Wu

    2017-09-01

    With the increasing complexity of complex mechatronic products, it is necessary to involve multidisciplinary design teams; thus, the traditional customer requirements modeling for a single-discipline team becomes difficult to apply in a multidisciplinary team and project, since team members with various disciplinary backgrounds may have different interpretations of the customers' requirements. A new synthesized multidisciplinary customer requirements modeling method is provided for obtaining and describing a common understanding of customer requirements (CRs) and, more importantly, transforming them into detailed and accurate product design specifications (PDS) so as to interact with different team members effectively. A case study of designing a high-speed train verifies the rationality and feasibility of the proposed multidisciplinary requirements modeling method for complex mechatronic product development. The proposed research offers instruction for realizing customer-driven personalized customization of complex mechatronic products.

  20. Functional Mobility Testing: A Novel Method to Establish Human System Interface Design Requirements

    Science.gov (United States)

    England, Scott A.; Benson, Elizabeth A.; Rajulu, Sudhakar

    2008-01-01

    Across all fields of human-system interface design, it is vital to possess a sound methodology dictating the constraints on the system based on the capabilities of the human user. These limitations may be based on strength, mobility, dexterity, cognitive ability, etc., and combinations thereof. Data collected in an isolated environment to determine, for example, maximal strength or maximal range of motion would indeed be adequate for establishing not-to-exceed design limitations; however, these restraints on the system may be excessive compared with what is actually needed. Resources may potentially be saved by having a technique to determine the minimum measurements a system must accommodate. This paper specifically deals with the creation of a novel methodology for establishing mobility requirements for a new generation of space suit design concepts. Historically, the Space Shuttle and International Space Station vehicle and space hardware design requirements documents, such as the Man-Systems Integration Standards and the International Space Station Flight Crew Integration Standard, explicitly stated that designers should strive to provide the maximum joint range of motion capabilities exhibited by a minimally clothed human subject. In the course of developing the Human-Systems Integration Requirements (HSIR) for the new space exploration initiative (Constellation), an effort was made to redefine the mobility requirements in the interest of safety and cost. Systems designed for manned space exploration can receive compounded gains from simplified designs that are both initially less expensive to produce and lighter, and thereby cheaper to launch.

  1. Computer-Aided Identification and Validation of Intervenability Requirements

    Directory of Open Access Journals (Sweden)

    Rene Meis

    2017-03-01

    Full Text Available Privacy as a software quality is becoming more important these days and should not be underestimated during the development of software that processes personal data. The privacy goal of intervenability, in contrast to unlinkability (including anonymity and pseudonymity), has so far received little attention in research. Intervenability aims for the empowerment of end-users by keeping their personal data, and how it is processed by the software system, under their control. Several surveys have pointed out that the lack of intervenability options is a central privacy concern of end-users. In this paper, we systematically assess the privacy goal of intervenability and set up a software requirements taxonomy that relates the identified intervenability requirements with a taxonomy of transparency requirements. Furthermore, we provide a tool-supported method to identify intervenability requirements from the functional requirements of a software system. This tool-supported method provides the means to elicit and validate intervenability requirements in a computer-aided way. Our combined taxonomy of intervenability and transparency requirements gives a detailed view of the privacy goal of intervenability and its relation to transparency. We validated the completeness of our taxonomy by comparing it to the relevant literature, which we derived based on a systematic literature review. The proposed method for the identification of intervenability requirements shall support requirements engineers in eliciting and documenting intervenability requirements in compliance with the EU General Data Protection Regulation.

  2. Endovascular management for significant iatrogenic portal vein bleeding.

    Science.gov (United States)

    Kim, Jong Woo; Shin, Ji Hoon; Park, Jonathan K; Yoon, Hyun-Ki; Ko, Gi-Young; Gwon, Dong Il; Kim, Jin Hyoung; Sung, Kyu-Bo

    2017-11-01

    Background Despite conservative treatment, hemorrhage from an intrahepatic branch of the portal vein can cause hemodynamic instability requiring urgent intervention. Purpose To retrospectively report the outcomes of hemodynamically significant portal vein bleeding after endovascular management. Material and Methods During a period of 15 years, four patients (2 men, 2 women; median age, 70.5 years) underwent angiography and embolization for iatrogenic portal vein bleeding. Causes of hemorrhage, angiographic findings, endovascular treatment, and complications were reported. Results Portal vein bleeding occurred after percutaneous liver biopsy (n = 2), percutaneous radiofrequency ablation (n = 1), and percutaneous cholecystostomy (n = 1). The median time interval between angiography and percutaneous procedure was 5 h (range, 4-240 h). Common hepatic angiograms including indirect mesenteric portograms showed active portal vein bleeding into the peritoneal cavity with (n = 1) or without (n = 2) an arterioportal (AP) fistula, and portal vein pseudoaneurysm alone with an AP fistula (n = 1). Successful transcatheter arterial embolization (n = 2) or percutaneous transhepatic portal vein embolization (n = 2) was performed. Embolic materials were n-butyl cyanoacrylate alone (n = 2) or in combination with gelatin sponge particles and coils (n = 2). There were no major treatment-related complications or patient mortality within 30 days. Conclusion Patients with symptomatic or life-threatening portal vein bleeding following liver-penetrating procedures can successfully be managed with embolization.

  3. Understanding your users a practical guide to user requirements methods, tools, and techniques

    CERN Document Server

    Baxter, Kathy

    2005-01-01

    Today many companies are employing a user-centered design (UCD) process, but for most companies, usability begins and ends with the usability test. Although usability testing is a critical part of an effective user-centered life cycle, it is only one component of the UCD process. This book is focused on the requirements gathering stage, which often receives less attention than usability testing, but is equally as important. Understanding user requirements is critical to the development of a successful product. Understanding Your Users is an easy to read, easy to implement, how-to guide on

  4. Improvement of spatial discretization error on the semi-analytic nodal method using the scattered source subtraction method

    International Nuclear Information System (INIS)

    Yamamoto, Akio; Tatsumi, Masahiro

    2006-01-01

    In this paper, the scattered source subtraction (SSS) method is newly proposed to improve the spatial discretization error of the semi-analytic nodal method with the flat-source approximation. In the SSS method, the scattered source is subtracted from both sides of the diffusion or transport equation so that the spatial variation of the source term becomes small. The same neutron balance equation is still used in the SSS method. Since the SSS method just modifies the coefficients of the node coupling equations (those used to evaluate the response of partial currents), its implementation is easy. The validity of the present method is verified through test calculations carried out in PWR multi-assembly configurations. The calculation results show that the SSS method can significantly improve the spatial discretization error. Since the SSS method has no negative impact on execution time, convergence behavior or memory requirements, it is useful for reducing the spatial discretization error of the semi-analytic nodal method with the flat-source approximation. (author)
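
    My reading of the abstract, in a one-group sketch (this is a reconstruction of the idea, not the paper's exact formulation): moving the spatially varying scattered source from the right-hand side into the removal term leaves a source that the flat-source approximation represents much better.

```latex
\begin{align*}
  % within-group balance with the scattering source on the right-hand side:
  -D\,\nabla^{2}\phi(\mathbf{r}) + \Sigma_{t}\,\phi(\mathbf{r})
      &= Q(\mathbf{r}) + \Sigma_{s}\,\phi(\mathbf{r}) \\
  % SSS: subtract the spatially varying scattered source from both sides,
  % leaving a right-hand side the flat-source approximation handles well:
  -D\,\nabla^{2}\phi(\mathbf{r}) + \left(\Sigma_{t}-\Sigma_{s}\right)\phi(\mathbf{r})
      &= Q(\mathbf{r})
\end{align*}
```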

  5. Waste Management System Requirement document

    International Nuclear Information System (INIS)

    1990-04-01

    This volume defines the top level technical requirements for the Monitored Retrievable Storage (MRS) facility. It is designed to be used in conjunction with Volume 1, General System Requirements. Volume 3 provides a functional description expanding the requirements allocated to the MRS facility in Volume 1 and, when appropriate, elaborates on requirements by providing associated performance criteria. Volumes 1 and 3 together convey a minimum set of requirements that must be satisfied by the final MRS facility design without unduly constraining individual design efforts. The requirements are derived from the Nuclear Waste Policy Act of 1982 (NWPA), the Nuclear Waste Policy Amendments Act of 1987 (NWPAA), the Environmental Protection Agency's (EPA) Environmental Standards for the Management and Disposal of Spent Nuclear Fuel (40 CFR 191), NRC Licensing Requirements for the Independent Storage of Spent Nuclear and High-Level Radioactive Waste (10 CFR 72), and other federal statutory and regulatory requirements, and major program policy decisions. This document sets forth specific requirements that will be fulfilled. Each subsequent level of the technical document hierarchy will be significantly more detailed and provide further guidance and definition as to how each of these requirements will be implemented in the design. Requirements appearing in Volume 3 are traceable into the MRS Design Requirements Document. Section 2 of this volume provides a functional breakdown for the MRS facility. 1 tab

  6. Assessing Statistically Significant Heavy-Metal Concentrations in Abandoned Mine Areas via Hot Spot Analysis of Portable XRF Data.

    Science.gov (United States)

    Kim, Sung-Min; Choi, Yosoon

    2017-06-18

    To develop appropriate measures to prevent soil contamination in abandoned mining areas, an understanding of the spatial variation of the potentially toxic trace elements (PTEs) in the soil is necessary. For the purpose of effective soil sampling, this study uses hot spot analysis, which calculates a z-score based on the Getis-Ord Gi* statistic to identify a statistically significant hot spot sample. To constitute a statistically significant hot spot, a feature with a high value should also be surrounded by other features with high values. Using relatively cost- and time-effective portable X-ray fluorescence (PXRF) analysis, sufficient input data are acquired from the Busan abandoned mine and used for hot spot analysis. To calibrate the PXRF data, which have a relatively low accuracy, the PXRF analysis data are transformed using the inductively coupled plasma atomic emission spectrometry (ICP-AES) data. The transformed PXRF data of the Busan abandoned mine are classified into four groups according to their normalized content and z-scores: high content with a high z-score (HH), high content with a low z-score (HL), low content with a high z-score (LH), and low content with a low z-score (LL). The HL and LH cases may be due to measurement errors. Additional or complementary surveys are required for the areas surrounding these suspect samples or for significant hot spot areas. The soil sampling is conducted according to a four-phase procedure in which the hot spot analysis and proposed group classification method are employed to support the development of a sampling plan for the following phase. Overall, 30, 50, 80, and 100 samples are investigated and analyzed in phases 1-4, respectively. The method implemented in this case study may be utilized in the field for the assessment of statistically significant soil contamination and the identification of areas for which an additional survey is required.
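
    A minimal implementation of the Getis-Ord Gi* z-score with a binary distance-band weight matrix is sketched below; the coordinates, the element (Pb), the distance band and the significance cut-off are invented for illustration.

```python
import numpy as np

def getis_ord_gi_star(x, coords, d):
    """z-scores of the Getis-Ord Gi* statistic with a binary distance-band
    weight matrix (each point is its own neighbour, hence the 'star')."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    W = (dist <= d).astype(float)                   # includes the diagonal
    xbar, s = x.mean(), x.std()                     # population std, as in Gi*
    wi = W.sum(axis=1)
    num = W @ x - xbar * wi
    den = s * np.sqrt((n * (W ** 2).sum(axis=1) - wi ** 2) / (n - 1))
    return num / den                                # ~N(0,1) under randomness

rng = np.random.default_rng(42)
coords = rng.uniform(0, 100, size=(80, 2))          # 80 PXRF sampling points
pb = rng.lognormal(3, 1, size=80)                   # hypothetical Pb contents
z = getis_ord_gi_star(pb, coords, d=15.0)
hot = z > 1.96                                      # significant hot spots (95%)
```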

  8. Hanford analytical services quality assurance requirements documents. Volume 1: Administrative Requirements

    International Nuclear Information System (INIS)

    Hyatt, J.E.

    1997-01-01

    Hanford Analytical Services Quality Assurance Requirements Document (HASQARD) is issued by the Analytical Services, Program of the Waste Management Division, US Department of Energy (US DOE), Richland Operations Office (DOE-RL). The HASQARD establishes quality requirements in response to DOE Order 5700.6C (DOE 1991b). The HASQARD is designed to meet the needs of DOE-RL for maintaining a consistent level of quality for sampling and field and laboratory analytical services provided by contractor and commercial field and laboratory analytical operations. The HASQARD serves as the quality basis for all sampling and field/laboratory analytical services provided to DOE-RL through the Analytical Services Program of the Waste Management Division in support of Hanford Site environmental cleanup efforts. This includes work performed by contractor and commercial laboratories and covers radiological and nonradiological analyses. The HASQARD applies to field sampling, field analysis, and research and development activities that support work conducted under the Hanford Federal Facility Agreement and Consent Order Tri-Party Agreement and regulatory permit applications and applicable permit requirements described in subsections of this volume. The HASQARD applies to work done to support process chemistry analysis (e.g., ongoing site waste treatment and characterization operations) and research and development projects related to Hanford Site environmental cleanup activities. This ensures a uniform quality umbrella to analytical site activities predicated on the concepts contained in the HASQARD. Using HASQARD will ensure data of known quality and technical defensibility of the methods used to obtain that data. The HASQARD is made up of four volumes: Volume 1, Administrative Requirements; Volume 2, Sampling Technical Requirements; Volume 3, Field Analytical Technical Requirements; and Volume 4, Laboratory Technical Requirements. Volume 1 describes the administrative requirements

  9. Large Scale Computing and Storage Requirements for Biological and Environmental Research

    Energy Technology Data Exchange (ETDEWEB)

    DOE Office of Science, Biological and Environmental Research Program Office (BER),

    2009-09-30

    In May 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of Biological and Environmental Research (BER) held a workshop to characterize HPC requirements for BER-funded research over the subsequent three to five years. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. Chief among them: scientific progress in BER-funded research is limited by current allocations of computational resources. Additionally, growth in mission-critical computing -- combined with new requirements for collaborative data manipulation and analysis -- will demand ever increasing computing, storage, network, visualization, reliability and service richness from NERSC. This report expands upon these key points and adds others. It also presents a number of "case studies" as significant representative samples of the needs of science teams within BER. Workshop participants were asked to codify their requirements in this "case study" format, summarizing their science goals, methods of solution, current and 3-5 year computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, "multi-core" environment that is expected to dominate HPC architectures over the next few years.

  10. On the computational assessment of white matter hyperintensity progression: difficulties in method selection and bias field correction performance on images with significant white matter pathology

    Energy Technology Data Exchange (ETDEWEB)

    Valdes Hernandez, Maria del C.; Gonzalez-Castro, Victor; Wang, Xin; Doubal, Fergus; Munoz Maniega, Susana; Wardlaw, Joanna M. [Centre for Clinical Brain Sciences, Department of Neuroimaging Sciences, Edinburgh (United Kingdom); Ghandour, Dina T. [University of Edinburgh, College of Medicine and Veterinary Medicine, Edinburgh (United Kingdom); Armitage, Paul A. [University of Sheffield, Department of Cardiovascular Sciences, Sheffield (United Kingdom)

    2016-05-15

    Subtle inhomogeneities in the scanner's magnetic fields (B{sub 0} and B{sub 1}) alter the intensity levels of structural magnetic resonance imaging (MRI), affecting the volumetric assessment of WMH changes. Here, we investigate the influence that (1) correcting the images for B{sub 1} inhomogeneities (i.e. bias field correction (BFC)) and (2) the selection of the WMH change assessment method can have on longitudinal analyses of WMH progression, and we discuss possible solutions. We used brain structural MRI from 46 mild stroke patients scanned at stroke onset and 3 years later. We tested three BFC approaches: FSL-FAST, N4 and exponentially entropy-driven homomorphic unsharp masking (E{sup 2}D-HUM), and analysed their effect on the measured WMH change. Separately, we tested two methods to assess WMH changes: measuring WMH volumes independently at both time points semi-automatically (MCMxxxVI) and subtracting intensity-normalised FLAIR images at both time points following image gamma correction. We then combined the BFC with the computational method that performed best across the whole sample to assess WMH changes. Analysis of the difference in the variance-to-mean intensity ratio in normal tissue between BFC and uncorrected images, together with visual inspection, showed that all BFC methods altered the WMH appearance and distribution, but FSL-FAST in general performed more consistently across the sample and MRI modalities. The WMH volume change over 3 years obtained with MCMxxxVI with vs. without FSL-FAST BFC did not differ significantly (median (IQR) 3.2 (6.3) ml with BFC vs. 2.9 (7.4) ml without BFC, p = 0.5), but both differed significantly from the WMH volume change obtained by subtracting the post-processed FLAIR images without BFC (7.6 (8.2) ml, p < 0.001). This latter method considerably inflated the WMH volume change, as subtle WMH at baseline that became more intense at follow-up were counted as increases in the volumetric change. Measurement of WMH volume change remains

  11. Significant decimal digits for energy representation on short-word computers

    International Nuclear Information System (INIS)

    Sartori, E.

    1989-01-01

    The general belief that single precision floating point numbers always have at least seven significant decimal digits on short-word computers such as IBM machines is erroneous. Seven significant digits are required, however, for representing the energy variable in nuclear cross-section data sets containing sharp p-wave resonances at 0 Kelvin. It is suggested either that the energy variable be stored in double precision or that cross-section resonances be reconstructed to room temperature or higher on short-word computers
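
    The effect is easy to demonstrate: IEEE-754 single precision carries roughly seven significant decimal digits, so two energy grid points bracketing a narrow resonance can collapse to the same 32-bit value. The energies below are invented for illustration.

```python
import numpy as np

# Two energy grid points straddling a narrow resonance, differing only in the
# eighth significant decimal digit:
e1, e2 = 1.0523456e6, 1.05234565e6         # eV
print(np.float64(e2) - np.float64(e1))     # ~0.05 eV: the grid survives
print(np.float32(e2) - np.float32(e1))     # 0.0: both round to the same float
```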

  12. Erroneous analyses of interactions in neuroscience: a problem of significance

    NARCIS (Netherlands)

    Nieuwenhuis, S.; Forstmann, B.U.; Wagenmakers, E.-J.

    2011-01-01

    In theory, a comparison of two experimental effects requires a statistical test on their difference. In practice, this comparison is often based on an incorrect procedure involving two separate tests in which researchers conclude that effects differ when one effect is significant (P < 0.05) but the
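
    The correct procedure the authors describe reduces to a statistical test on the difference of the two effects. A minimal sketch with invented effect sizes and standard errors:

```python
import numpy as np
from scipy.stats import norm

# Effect estimates (e.g., slopes) and standard errors from two conditions:
b1, se1 = 0.28, 0.10    # "significant"     (z = 2.8)
b2, se2 = 0.15, 0.10    # "not significant" (z = 1.5)

# Erroneous: conclude the effects differ because one test passed P < 0.05
# and the other did not. Correct: test the difference directly.
z = (b1 - b2) / np.hypot(se1, se2)
p = 2 * norm.sf(abs(z))  # z ~ 0.92, p ~ 0.36: no evidence the effects differ
```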

  13. Testing Significance Testing

    Directory of Open Access Journals (Sweden)

    Joachim I. Krueger

    2018-04-01

    Full Text Available The practice of Significance Testing (ST remains widespread in psychological science despite continual criticism of its flaws and abuses. Using simulation experiments, we address four concerns about ST and for two of these we compare ST’s performance with prominent alternatives. We find the following: First, the 'p' values delivered by ST predict the posterior probability of the tested hypothesis well under many research conditions. Second, low 'p' values support inductive inferences because they are most likely to occur when the tested hypothesis is false. Third, 'p' values track likelihood ratios without raising the uncertainties of relative inference. Fourth, 'p' values predict the replicability of research findings better than confidence intervals do. Given these results, we conclude that 'p' values may be used judiciously as a heuristic tool for inductive inference. Yet, 'p' values cannot bear the full burden of inference. We encourage researchers to be flexible in their selection and use of statistical methods.

  14. A Novel Least Significant Bit First Processing Parallel CRC Circuit

    Directory of Open Access Journals (Sweden)

    Xiujie Qu

    2013-01-01

    Full Text Available In the HDLC serial communication protocol, CRC calculation can process either the most or the least significant bit of the data first. Nowadays most CRC calculations are based on most significant bit (MSB) first processing. An algorithm for least significant bit (LSB) first processing parallel CRC is proposed in this paper. Based on the general expression of LSB-first processing serial CRC, and using the state equation method of linear systems, we derive a recursive formula by mathematical deduction. The recursive formula is applicable to any number of bits processed in parallel and any generator polynomial. According to the formula, we present the parallel circuit for CRC calculation and implement it with VHDL on an FPGA. The results verify the accuracy and effectiveness of this method.
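
    For reference, LSB-first (reflected) processing in its serial, bit-at-a-time form looks as follows; the parallel recursion derived in the paper must reproduce this result several bits at a time. The sketch uses CRC-32 rather than a specific HDLC polynomial.

```python
import zlib

def crc32_lsb_first(data: bytes, poly: int = 0xEDB88320) -> int:
    """Bit-serial CRC-32, processing the least significant bit of each byte
    first (the 'reflected' algorithm); poly is the bit-reversed generator."""
    crc = 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ poly if crc & 1 else crc >> 1
    return crc ^ 0xFFFFFFFF

assert crc32_lsb_first(b"123456789") == zlib.crc32(b"123456789")  # 0xCBF43926
```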

  15. Factoring local sequence composition in motif significance analysis.

    Science.gov (United States)

    Ng, Patrick; Keich, Uri

    2008-01-01

    We recently introduced a biologically realistic and reliable significance analysis of the output of a popular class of motif finders. In this paper we further improve our significance analysis by incorporating local base composition information. Relying on realistic biological data simulation, as well as on FDR analysis applied to real data, we show that our method is significantly better than the increasingly popular practice of using the normal approximation to estimate the significance of a finder's output. Finally we turn to leveraging our reliable significance analysis to improve the actual motif finding task. Specifically, endowing a variant of the Gibbs Sampler with our improved significance analysis we demonstrate that de novo finders can perform better than has been perceived. Significantly, our new variant outperforms all the finders reviewed in a recently published comprehensive analysis of the Harbison genome-wide binding location data. Interestingly, many of these finders incorporate additional information such as nucleosome positioning and the significance of binding data.

  16. A New Method for Quick and Easy Hemolymph Collection from Apidae Adults.

    Directory of Open Access Journals (Sweden)

    Grzegorz Borsuk

    Full Text Available Bio-analysis of insects is increasingly dependent on highly sensitive methods that require high-quality biological material, such as hemolymph. However, it is difficult to collect fresh and uncontaminated hemolymph from adult bees, since they are very active, have the potential to sting, and because hemolymph is rapidly melanized. Here we aimed to develop and test a quick and easy method for sterile and contamination-free hemolymph sampling from adult Apidae. Our novel antennae method for hemolymph sampling (AMHS) entailed the detachment of an antenna, followed by the application of delicate pressure to the bee's abdomen. This resulted in the appearance of a drop of hemolymph at the base of the detached antenna, which was then aspirated using an automatic pipette. Larger insect size corresponded to easier and faster hemolymph sampling, and to a greater sample volume. We obtained 80-100 μL of sterile non-melanized hemolymph in 1 minute from one Bombus terrestris worker, in 6 minutes from 10 Apis mellifera workers, and in 15 minutes from 18 Apis cerana workers (±0.5 minutes). Compared to the most popular method of hemolymph collection, in which hemolymph is sampled by puncturing the dorsal sinus of the thorax with a capillary (TCHS), significantly fewer bees were required to collect 80-100 μL of hemolymph using our novel AMHS. Moreover, the time required for hemolymph collection was significantly shorter using the AMHS than the TCHS, which protects the acquired hemolymph against melanization, thus providing the highest quality material for biological analysis.

  17. The Robin Hood method - A novel numerical method for electrostatic problems based on a non-local charge transfer

    International Nuclear Information System (INIS)

    Lazic, Predrag; Stefancic, Hrvoje; Abraham, Hrvoje

    2006-01-01

    We introduce a novel numerical method, named the Robin Hood method, of solving electrostatic problems. The approach of the method is closest to the boundary element methods, although significant conceptual differences exist with respect to this class of methods. The method achieves equipotentiality of conducting surfaces by iterative non-local charge transfer. For each of the conducting surfaces, non-local charge transfers are performed between surface elements, which differ the most from the targeted equipotentiality of the surface. The method is tested against analytical solutions and its wide range of application is demonstrated. The method has appealing technical characteristics. For the problem with N surface elements, the computational complexity of the method essentially scales with N^α, where α < 2, the required computer memory scales with N, while the error of the potential decreases exponentially with the number of iterations for many orders of magnitude of the error, without the presence of the Critical Slowing Down. The Robin Hood method could prove useful in other classical or even quantum problems. Some future development ideas for possible applications outside electrostatics are addressed.
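
    As a rough Python illustration of the charge-transfer idea (not the authors' implementation), the sketch below applies a Robin Hood-style iteration to a toy 2D conductor: the element with the highest potential gives charge to the element with the lowest until the surface is numerically equipotential. The elliptical geometry, the crude self-interaction term and the stopping rule are illustrative assumptions.

```python
import numpy as np

# Toy conductor: N point-like surface elements on an ellipse.
N = 200
theta = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
pos = np.stack([2.0 * np.cos(theta), np.sin(theta)], axis=1)

spacing = np.linalg.norm(pos - np.roll(pos, 1, axis=0), axis=1)
dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
np.fill_diagonal(dist, 0.5 * spacing)   # ad hoc self-interaction scale
K = 1.0 / dist                          # Coulomb kernel, V = K @ q

q = np.full(N, 1.0 / N)                 # uniform start, total charge fixed at 1
V = K @ q
for it in range(100_000):
    i, j = int(np.argmax(V)), int(np.argmin(V))
    if V[i] - V[j] < 1e-9 * V.mean():
        break                           # equipotential to numerical tolerance
    # Transfer exactly enough charge to equalize the two extreme potentials.
    dq = (V[i] - V[j]) / (K[i, i] + K[j, j] - 2.0 * K[i, j])
    q[i] -= dq                          # take from the element that is "too high"...
    q[j] += dq                          # ...and give to the one that is "too low"
    V += dq * (K[:, j] - K[:, i])       # O(N) incremental potential update
print(it, (V.max() - V.min()) / V.mean())
```

    The incremental update of V keeps each iteration at O(N) cost, which hints at why the full method's complexity can stay below O(N²) per sweep while its memory need scales with N.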

  18. Tumor significant dose

    International Nuclear Information System (INIS)

    Supe, S.J.; Nagalaxmi, K.V.; Meenakshi, L.

    1983-01-01

    In the practice of radiotherapy, various concepts like NSD, CRE, TDF, and BIR are being used to evaluate the biological effectiveness of the treatment schedules on the normal tissues. This has been accepted since the tolerance of the normal tissue is the limiting factor in the treatment of cancers. At present, when various schedules are tried, attention is therefore paid to the biological damage of the normal tissues only, and it is expected that the damage to the cancerous tissues will be extensive enough to control the cancer. An attempt is made in the present work to evaluate the concept of tumor significant dose (TSD), which represents the damage to the cancerous tissue. Strandquist, in the analysis of a large number of cases of squamous cell carcinoma, found that for the 5 fractions/week treatment the total dose required to bring about the same damage to the cancerous tissue is proportional to T^(-0.22), where T is the overall time over which the dose is delivered. Using this finding, the TSD was defined as D × N^(-p) × T^(-q), where D is the total dose, N the number of fractions, T the overall time, and p and q the exponents to be suitably chosen. The values of p and q are adjusted such that p + q ≤ 0.24, with p varying from 0.0 to 0.24 and q from 0.0 to 0.22. Cases of cancer of the cervix uteri treated between 1978 and 1980 in the V. N. Cancer Centre, Kuppuswamy Naidu Memorial Hospital, Coimbatore, India were analyzed on the basis of these formulations. These data, coupled with clinical experience, were used to choose a formula for the TSD. Further, the dose schedules used in the British Institute of Radiology fractionation studies were also used to propose that the tumor significant dose is represented by D × N^(-0.18) × T^(-0.06)
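
    As a worked example of the proposed formula with the exponents the abstract settles on (p = 0.18, q = 0.06), a hypothetical schedule of 60 Gy in 30 fractions over 42 days gives:

```python
def tsd(D: float, N: int, T: float, p: float = 0.18, q: float = 0.06) -> float:
    """Tumor significant dose: TSD = D * N**(-p) * T**(-q)."""
    return D * N ** -p * T ** -q

# Hypothetical schedule: 60 Gy total dose, 30 fractions, 42 days overall time.
print(round(tsd(60.0, 30, 42.0), 1))  # ~26.0, in the same units as D
```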

  19. A neural network method to correct bidirectional effects in water-leaving radiance

    Science.gov (United States)

    Fan, Yongzhen; Li, Wei; Voss, Kenneth J.; Gatebe, Charles K.; Stamnes, Knut

    2017-02-01

    The standard method to convert the measured water-leaving radiances from the observation direction to the nadir direction developed by Morel and coworkers requires knowledge of the chlorophyll concentration (CHL). Also, the standard method was developed for open ocean water, which makes it unsuitable for turbid coastal waters. We introduce a neural network method to convert the water-leaving radiance (or the corresponding remote sensing reflectance) from the observation direction to the nadir direction. This method does not require any prior knowledge of the water constituents or the inherent optical properties (IOPs). This method is fast, accurate and can be easily adapted to different remote sensing instruments. Validation using NuRADS measurements in different types of water shows that this method is suitable for both open ocean and coastal waters. In open ocean or chlorophyll-dominated waters, our neural network method produces corrections similar to those of the standard method. In turbid coastal waters, especially sediment-dominated waters, a significant improvement was obtained compared to the standard method.

  20. 20 CFR 650.2 - Federal law requirements.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Federal law requirements. 650.2 Section 650.2 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR STANDARD FOR APPEALS... Security Act requires that a State law include provision for: Such methods of administration * * * as are...

  1. Distributed and Collaborative Requirements Elicitation Based on Social Intelligence

    NARCIS (Netherlands)

    Wen, Bin; Luo, Z.; Liang, P.

    2012-01-01

    Requirements are the formal expression of users' needs, and requirements elicitation is the activity of collecting them. Traditional acquisition methods, such as interview, observation and prototyping, are unsuited for the service-oriented software development featuring in

  2. Waste management system requirements document

    International Nuclear Information System (INIS)

    1991-02-01

    This volume defines the top level requirements for the Mined Geologic Disposal System (MGDS). It is designed to be used in conjunction with Volume 1 of the WMSR, General System Requirements. It provides a functional description expanding the requirements allocated to the MGDS in Volume 1 and elaborates on each requirement by providing associated performance criteria as appropriate. Volumes 1 and 4 of the WMSR provide a minimum set of requirements that must be satisfied by the final MGDS design. This document sets forth specific requirements that must be fulfilled. It is not the intent or purpose of this top level document to describe how each requirement is to be satisfied in the final MGDS design. Each subsequent level of the technical document hierarchy must provide further guidance and definition as to how each of these requirements is to be implemented in the design. It is expected that each subsequent level of requirements will be significantly more detailed. Section 2 of this volume provides a functional description of the MGDS. Each function is addressed in terms of requirements and performance criteria. Section 3 provides a list of controlling documents. Each document cited in a requirement of Section 2 is included in this list and is incorporated into this document as a requirement on the final system. The WMSR addresses only federal requirements (i.e., laws, regulations and DOE orders). State and local requirements are not addressed. However, it will be specifically noted at the potentially affected WMSR requirements where additional or more stringent regulations could be imposed by a state or local administering agency over the cited federal requirements

  3. Display Parameters and Requirements

    Science.gov (United States)

    Bahadur, Birendra

    The following sections are included: * INTRODUCTION * HUMAN FACTORS * Anthropometry * Sensory * Cognitive * Discussions * THE HUMAN VISUAL SYSTEM - CAPABILITIES AND LIMITATIONS * Cornea * Pupil and Iris * Lens * Vitreous Humor * Retina * RODS - NIGHT VISION * CONES - DAY VISION * RODS AND CONES - TWILIGHT VISION * VISUAL PIGMENTS * MACULA * BLOOD * CHOROID COAT * Visual Signal Processing * Pathways to the Brain * Spatial Vision * Temporal Vision * Colour Vision * Colour Blindness * DICHROMATISM * Protanopia * Deuteranopia * Tritanopia * ANOMALOUS TRICHROMATISM * Protanomaly * Deuteranomaly * Tritanomaly * CONE MONOCHROMATISM * ROD MONOCHROMATISM * Using Colour Effectively * COLOUR MIXTURES AND THE CHROMATICITY DIAGRAM * Colour Matching Functions and Chromaticity Co-ordinates * CIE 1931 Colour Space * CIE PRIMARIES * CIE COLOUR MATCHING FUNCTIONS AND CHROMATICITY CO-ORDINATES * METHODS FOR DETERMINING TRISTIMULUS VALUES AND COLOUR CO-ORDINATES * Spectral Power Distribution Method * Filter Method * CIE 1931 CHROMATICITY DIAGRAM * ADDITIVE COLOUR MIXTURE * CIE 1976 Chromaticity Diagram * CIE Uniform Colour Spaces and Colour Difference Formulae * CIELUV OR L*u*v* * CIELAB OR L*a*b* * CIE COLOUR DIFFERENCE FORMULAE * Colour Temperature and CIE Standard Illuminants and source * RADIOMETRIC AND PHOTOMETRIC QUANTITIES * Photopic (Vλ) and Scotopic (Vλ') Luminous Efficiency Function * Photometric and Radiometric Flux * Luminous and Radiant Intensities * Incidence: Illuminance and Irradiance * Exitance or Emittance (M) * Luminance and Radiance * ERGONOMIC REQUIREMENTS OF DISPLAYS * ELECTRO-OPTICAL PARAMETERS AND REQUIREMENTS * Contrast and Contrast Ratio * Luminance and Brightness * Colour Contrast and Chromaticity * Glare * Other Aspects of Legibility * SHAPE AND SIZE OF CHARACTERS * DEFECTS AND BLEMISHES * FLICKER AND DISTORTION * ANGLE OF VIEW * Switching Speed * Threshold and Threshold Characteristic * Measurement Techniques For Electro-optical Parameters * RADIOMETRIC

  4. Using the longest significance run to estimate region-specific p-values in genetic association mapping studies

    Directory of Open Access Journals (Sweden)

    Yang Hsin-Chou

    2008-05-01

    Full Text Available Abstract Background Association testing is a powerful tool for identifying disease susceptibility genes underlying complex diseases. Technological advances have yielded a dramatic increase in the density of available genetic markers, necessitating an increase in the number of association tests required for the analysis of disease susceptibility genes. As such, multiple-test corrections have become a critical issue. However, the conventional statistical corrections for locus-specific multiple tests usually result in lower power as the number of markers increases. Alternatively, we propose here the application of the longest significant run (LSR) method to estimate a region-specific p-value to provide an index for the most likely candidate region. Results An advantage of the LSR method relative to procedures based on genotypic data is that only p-value data are needed, and hence it can be applied extensively to different study designs. In this study the proposed LSR method was compared with commonly used methods such as Bonferroni's method and the FDR-controlling method. We found that while all methods provide good control over the false positive rate, LSR has much better power and false discovery rate. In the analysis of real psoriasis and asthma disease data, the LSR method successfully identified important candidate regions and replicated the results of previous association studies. Conclusion The proposed LSR method provides an efficient exploratory tool for the analysis of sequences of dense genetic markers. Our results show that the LSR method has better power and a lower false discovery rate compared with locus-specific multiple tests.
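
    The run statistic itself is simple to sketch: the LSR is the longest stretch of consecutive markers whose locus-specific p-values fall below a threshold. In the sketch below, the threshold of 0.05 is an assumption, the p-values are placeholders, and the permutation step used to attach a region-specific p-value is a crude stand-in for the paper's analysis.

```python
import numpy as np

def longest_significant_run(pvals, alpha=0.05):
    """Length of the longest run of consecutive p-values below alpha."""
    best = run = 0
    for p in pvals:
        run = run + 1 if p < alpha else 0
        best = max(best, run)
    return best

rng = np.random.default_rng(42)
p_region = rng.uniform(size=500)          # placeholder locus-specific p-values
observed = longest_significant_run(p_region)

# Region-specific p-value by permuting marker order (a simple null model).
null = [longest_significant_run(rng.permutation(p_region)) for _ in range(2000)]
print(observed, np.mean([n >= observed for n in null]))
```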

  5. Runge-Kutta methods with minimum storage implementations

    KAUST Repository

    Ketcheson, David I.

    2010-03-01

    Solution of partial differential equations by the method of lines requires the integration of large numbers of ordinary differential equations (ODEs). In such computations, storage requirements are typically one of the main considerations, especially if a high order ODE solver is required. We investigate Runge-Kutta methods that require only two storage locations per ODE. Existing methods of this type require additional memory if an error estimate or the ability to restart a step is required. We present a new, more general class of methods that provide error estimates and/or the ability to restart a step while still employing the minimum possible number of memory registers. Examples of such methods are found to have good properties. © 2009 Elsevier Inc. All rights reserved.
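
    The storage pattern in question is easiest to see in a classic two-register scheme. The sketch below implements Williamson's well-known third-order low-storage Runge-Kutta method, which keeps only the solution u and one auxiliary register S per ODE; the paper's methods generalize this idea while adding error estimation and restart capability.

```python
import numpy as np

# Williamson's classic 2N-storage RK3 coefficients.
A = (0.0, -5.0 / 9.0, -153.0 / 128.0)
B = (1.0 / 3.0, 15.0 / 16.0, 8.0 / 15.0)
C = (0.0, 1.0 / 3.0, 3.0 / 4.0)            # stage times implied by A and B

def low_storage_rk3_step(f, u, t, dt):
    S = np.zeros_like(u)                    # the single auxiliary register
    for a, b, c in zip(A, B, C):
        S = a * S + dt * f(t + c * dt, u)   # overwrite S in place
        u = u + b * S                       # overwrite u in place
    return u

# Example: u' = -u, exact solution exp(-t).
u, t, dt = np.array([1.0]), 0.0, 0.01
for _ in range(100):
    u = low_storage_rk3_step(lambda s, y: -y, u, t, dt)
    t += dt
print(u[0], np.exp(-1.0))                   # third-order agreement
```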

  6. A simple method for determining split renal function from dynamic 99mTc-MAG3 scintigraphic data

    Energy Technology Data Exchange (ETDEWEB)

    Wesolowski, Michal J.; Watson, Gage; Wanasundara, Surajith N.; Babyn, Paul [University of Saskatchewan, Department of Medical Imaging, Saskatoon, SK (Canada); Conrad, Gary R. [University of Kentucky College of Medicine, Department of Radiology, Lexington, KY (United States); Samal, Martin [Charles University Prague and the General University Hospital in Prague, Department of Nuclear Medicine, First Faculty of Medicine, Praha 2 (Czech Republic); Wesolowski, Carl A. [University of Saskatchewan, Department of Medical Imaging, Saskatoon, SK (Canada); Memorial University of Newfoundland, Department of Radiology, St. John's, NL (Canada)

    2016-03-15

    Commonly used methods for determining split renal function (SRF) from dynamic scintigraphic data require extrarenal background subtraction and additional correction for intrarenal vascular activity. The use of these additional regions of interest (ROIs) can produce inaccurate results and be challenging, e.g. if the heart is out of the camera field of view. The purpose of this study was to evaluate a new method for determining SRF called the blood pool compensation (BPC) technique, which is simple to implement, does not require extrarenal background correction and intrinsically corrects for intrarenal vascular activity. In the BPC method SRF is derived from a parametric plot of the curves generated by one blood-pool and two renal ROIs. Data from 107 patients who underwent 99mTc-MAG3 scintigraphy were used to determine SRF values. Values calculated using the BPC method were compared to those obtained with the integral (IN) and Patlak-Rutland (PR) techniques using Bland-Altman plotting and Passing-Bablok regression. The interobserver variability of the BPC technique was also assessed for two observers. The SRF values obtained with the BPC method did not differ significantly from those obtained with the PR method and showed no consistent bias, while SRF values obtained with the IN method showed significant differences with some bias in comparison to those obtained with either the PR or BPC method. No significant interobserver variability was found between two observers calculating SRF using the BPC method. The BPC method requires only three ROIs to produce reliable estimates of SRF, was simple to implement, and in this study yielded statistically equivalent results to the PR method with appreciable interobserver agreement. As such, it adds a new reliable method for quality control of monitoring relative kidney function. (orig.)

  7. Waste Encapsulation and Storage Facility interim operational safety requirements

    CERN Document Server

    Covey, L I

    2000-01-01

    The Interim Operational Safety Requirements (IOSRs) for the Waste Encapsulation and Storage Facility (WESF) define acceptable conditions, safe boundaries, bases thereof, and management or administrative controls required to ensure safe operation during receipt and inspection of cesium and strontium capsules from private irradiators; decontamination of the capsules and equipment; surveillance of the stored capsules; and maintenance activities. Controls required for public safety, significant defense-in-depth, significant worker safety, and for maintaining radiological consequences below risk evaluation guidelines (EGs) are included.

  8. A method to measure internal contact angle in opaque systems by magnetic resonance imaging.

    Science.gov (United States)

    Zhu, Weiqin; Tian, Ye; Gao, Xuefeng; Jiang, Lei

    2013-07-23

    Internal contact angle is an important parameter for internal wettability characterization. However, due to the limitation of optical imaging, methods available for contact angle measurement are only suitable for transparent or open systems. For most practical situations that require contact angle measurement in opaque or enclosed systems, the traditional methods are not effective. Based upon this requirement, a method suitable for contact angle measurement in nontransparent systems is developed by employing MRI technology. In this article, the method is demonstrated by measuring internal contact angles in opaque cylindrical tubes. The method also proves feasible in transparent situations and in opaque capillary systems. By using this method, contact angles in opaque systems can be measured successfully, which is significant for understanding wetting behaviors in nontransparent systems and calculating interfacial parameters in enclosed systems.

  9. The European Utility Requirement Document

    International Nuclear Information System (INIS)

    Roche, I.I.

    1999-01-01

    The major European electricity producers have been working on a common requirements document for future LWR plants since 1992. They aim at requirements jointly acceptable to the owners, the public and the authorities. Thus the designers can develop standard LWR designs acceptable everywhere in Europe, and the utilities can open their consultations to vendors on common bases. Such standardisation promotes an improvement of generation costs and of safety: acceptance by the public and the authorities should be improved as well, and significant savings are expected in development and construction costs. Since the early stages of the project, the EUR group has grown significantly. It now includes utilities from nine European countries, and utilities from two other European countries are joining the group. Specific cooperation agreements are also in progress with a few extra-European partners

  10. ASSESSING SELF-STUDY WORK’S SIGNIFICANT SKILLS FOR SUCCESSFUL LEARNING IN THE HIGHER SCHOOL

    Directory of Open Access Journals (Sweden)

    Galina V. Milovanova

    2017-06-01

    Full Text Available Introduction: the problem of organizing students' independent work/self-study is not new, but the changes in higher education over the last two decades show that the experience accumulated in the traditional educational model can be applied only when it is reworked for present-day conditions. The article analyses the innovative component of the educational process in terms of a significant increase in the volume of compulsory independent work in the university. Particular attention is paid to determining the levels of formation of skills for independent work in terms of students' readiness for its implementation. The aim of the research is to identify the most significant skills of independent work for successful study at the university. Materials and Methods: the research is based on general scholarly methods: analysis, comparison, generalisation. A questionnaire survey was carried out and a correlation analysis of the results was presented. Mathematical statistics methods in Excel were used for processing the survey data. Results: the article focused on the relevance of forming students' ability to work independently in the learning process. Requirements for professionals recognize the need for knowledge and skills but, more importantly, the ability and readiness to supplement this knowledge and remain in a state of continuous education and self-education. In turn, readiness for self-education cannot exist without independent work. Students' attitudes toward independent work and their skill levels in the gnostic, design, structural, organisational and communicative blocks were identified as a result of the research. Discussion and Conclusions: the levels of formation of skills for independent work influence the success of learning. There is a correlation between indicators of achievement and the ability to work independently. Organisation and communication skills have significant

  11. Significance analysis of lexical bias in microarray data

    Directory of Open Access Journals (Sweden)

    Falkow Stanley

    2003-04-01

    Full Text Available Abstract Background Genes that are determined to be significantly differentially regulated in microarray analyses often appear to have functional commonalities, such as being components of the same biochemical pathway. This results in certain words being under- or overrepresented in the list of genes. Distinguishing between biologically meaningful trends and artifacts of annotation and analysis procedures is of the utmost importance, as only true biological trends are of interest for further experimentation. A number of sophisticated methods for identification of significant lexical trends are currently available, but these methods are generally too cumbersome for practical use by most microarray users. Results We have developed a tool, LACK, for calculating the statistical significance of apparent lexical bias in microarray datasets. The frequency of a user-specified list of search terms in a list of genes which are differentially regulated is assessed for statistical significance by comparison to randomly generated datasets. The simplicity of the input files and user interface targets the average microarray user who wishes to have a statistical measure of apparent lexical trends in analyzed datasets without the need for bioinformatics skills. The software is available as Perl source or a Windows executable. Conclusion We have used LACK in our laboratory to generate biological hypotheses based on our microarray data. We demonstrate the program's utility using an example in which we confirm significant upregulation of SPI-2 pathogenicity island of Salmonella enterica serovar Typhimurium by the cation chelator dipyridyl.
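
    The resampling idea behind this kind of significance calculation can be sketched in a few lines (in Python rather than the tool's Perl, and with hypothetical gene names and annotations): count how often a random gene list of the same size matches the search term at least as often as the observed list.

```python
import random

def term_count(genes, annotations, term):
    """How many genes in the list carry the term in their annotation."""
    return sum(term in annotations[g].lower() for g in genes)

def lexical_bias_pvalue(hit_genes, all_genes, annotations, term, n_perm=10000):
    observed = term_count(hit_genes, annotations, term)
    k = len(hit_genes)
    exceed = 0
    for _ in range(n_perm):
        sample = random.sample(all_genes, k)      # random dataset of equal size
        exceed += term_count(sample, annotations, term) >= observed
    return (exceed + 1) / (n_perm + 1)            # empirical p-value

# Hypothetical demo: 30 of 500 genes are annotated "ribosomal protein".
genes = {f"g{i}": ("ribosomal protein" if i < 30 else "hypothetical")
         for i in range(500)}
print(lexical_bias_pvalue([f"g{i}" for i in range(20)],
                          list(genes), genes, "ribosomal"))
```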

  12. The importance of the keyword-generation method in keyword mnemonics.

    Science.gov (United States)

    Campos, Alfredo; Amor, Angeles; González, María Angeles

    2004-01-01

    Keyword mnemonics is, under certain conditions, an effective approach for learning foreign-language vocabulary. It appears to be effective for words with high image vividness but not for words with low image vividness. In this study, two experiments were performed to assess the efficacy of a new keyword-generation procedure (peer generation). In Experiment 1, a sample of 363 high-school students was randomly divided into four groups. The subjects were required to learn L1 equivalents of a list of 16 Latin words (8 with high image vividness, 8 with low image vividness), using a) the rote method, or the keyword method with b) keywords and images generated and supplied by the experimenter, c) keywords and images generated by themselves, or d) keywords and images previously generated by peers (i.e., subjects with similar sociodemographic characteristics). Recall was tested immediately and one week later. For high-vividness words, recall was significantly better in the keyword groups than in the rote method group. For low-vividness words, learning method had no significant effect. Experiment 2 was basically identical, except that the word lists comprised 32 words (16 high-vividness, 16 low-vividness). In this experiment, the peer-generated-keyword group showed significantly better recall of high-vividness words than the rote method group and the subject-generated-keyword group; again, however, learning method had no significant effect on recall of low-vividness words.

  13. SUPERVISION OF CREDIT INSTITUTIONS SIGNIFICANT RISKS TO FINANCIAL STABILITY

    Directory of Open Access Journals (Sweden)

    LUCIAN-ION MEDAR

    2014-12-01

    Full Text Available The financial stability of the Romanian banking system is determined by the constant supervision of credit institutions' significant risks. Romania's accession to the Banking Union requires the signing of a protocol between the central bank and the European Central Bank regarding prudential supervision to ensure financial stability. This means that, from next year, the central bank will impose a new supervision regime on credit institutions in our country, especially on those credit institutions that do not fall under European supervisors according to the procedures of the ECB. Through this study we propose to specify the main elements of the management of significant risks to ensure financial stability.

  14. Use of dynamic grid adaption in the ASWR-method

    International Nuclear Information System (INIS)

    Graf, U.; Romstedt, P.; Werner, W.

    1985-01-01

    A dynamic grid adaption method has been developed for use with the ASWR-method. The method automatically adapts the number and position of the spatial mesh points as the solution of hyperbolic or parabolic vector partial differential equations progresses in time. The mesh selection algorithm is based on the minimization of the L2-norm of the spatial discretization error. The method permits accurate calculation of the evolution of inhomogeneities like wave fronts, shock layers and other sharp transitions, while generally using a coarse computational grid. The number of required mesh points is significantly reduced relative to a fixed Eulerian grid. Since the mesh selection algorithm is computationally inexpensive, a corresponding reduction of computing time results
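
    The equidistribution idea behind such mesh selection can be sketched for a 1D profile; the toy below is a stand-in, not the ASWR implementation, and uses an arclength-type monitor as a proxy for the L2 discretization error. Points are placed so that equal amounts of the monitor fall between neighbours, which clusters them at sharp fronts.

```python
import numpy as np

def adapt_mesh(x, u, n_new):
    """Redistribute n_new mesh points to equidistribute a gradient monitor."""
    w = np.sqrt(1.0 + np.gradient(u, x) ** 2)          # arclength-type monitor
    s = np.concatenate([[0.0],
                        np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))])
    targets = np.linspace(0.0, s[-1], n_new)           # equal monitor per cell
    return np.interp(targets, s, x)

x = np.linspace(0.0, 1.0, 200)
u = np.tanh(50.0 * (x - 0.5))                          # sharp front at x = 0.5
x_new = adapt_mesh(x, u, 40)                           # points cluster at the front
print(np.diff(x_new).min(), np.diff(x_new).max())
```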

  15. Potential Significance of the EU Water Framework Directive to China

    Institute of Scientific and Technical Information of China (English)

    Lars Skov Andersen; Martin Griffiths

    2009-01-01

    The European Union Water Framework Directive (EU WFD) is a unique piece of legislation, which may be of great significance to on-going reforms of the water sector in China. First and foremost it unites 27 European member states behind a common goal, which is "to achieve good chemical and ecological status" of all water bodies across the EU. Other significant characteristics of the EU WFD are that (1) it sets a clear timeframe with a number of time-bound actions for member states to achieve the goal, but leaves it to member states to achieve this goal in a decentralised process, which makes allowance for the different socio-economic conditions, (2) it defines the river basin as the management unit for water, thus departing from the traditional fragmented management by administrative units, and it appoints a single competent authority for water management within each river basin, thus facilitating resolution of sector conflicts, (3) it requires a financial and economic analysis of the costs of implementing the EU WFD to enable decision makers to assess whether the required improvements are affordable to government and to the population within the river basin, and (4) it requires a structured process for information and consultation with stakeholders and the public throughout the planning and implementation process.

  16. Statistical methods applied to gamma-ray spectroscopy algorithms in nuclear security missions.

    Science.gov (United States)

    Fagan, Deborah K; Robinson, Sean M; Runkle, Robert C

    2012-10-01

    Gamma-ray spectroscopy is a critical research and development priority for a range of nuclear security missions, specifically the interdiction of special nuclear material involving the detection and identification of gamma-ray sources. We categorize existing methods by the statistical methods on which they rely and identify methods that have yet to be considered. Current methods estimate the effect of counting uncertainty but in many cases do not address larger sources of decision uncertainty, which may be significantly more complex. Thus, significantly improving algorithm performance may require greater coupling between the problem physics that drives data acquisition and the statistical methods that analyze such data. Untapped statistical methods, such as Bayesian model averaging and hierarchical and empirical Bayes methods, could reduce decision uncertainty by rigorously and comprehensively incorporating all sources of uncertainty. Application of such methods should further meet the needs of nuclear security missions by improving upon the existing numerical infrastructure for which these analyses have not been conducted. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Trends of significant operating events in assessing programmatic issues

    International Nuclear Information System (INIS)

    Lanning, W.D.

    1990-01-01

    This paper describes one part of the U. S. Nuclear Regulatory Commission's (NRC's) program for evaluating significant events and the process for identifying trends that may be indicative of programmatic weaknesses at operating nuclear power plants. A database management system was developed to permit analyses of significant operating events, events of potential safety significance, and certain reactor scrams. The analyses were based on events and problems reported by telephone to the NRC by licensees within hours of the events and, therefore, provided current operational data trend information. The regulatory requirements for reporting significant events, the screening criteria, and the process for identifying outliers for formal evaluation are described herein. This process contributed to an understanding of the underlying causes for events and problems. Examples are included of operating experience assessments that identified plants with a poor operating experience history that was attributable to procedural inadequacies, operator training deficiencies, inadequate root cause analysis, and inadequate control and planning of activities

  18. Exploring matrix factorization techniques for significant genes identification of Alzheimer’s disease microarray gene expression data

    Directory of Open Access Journals (Sweden)

    Hu Xiaohua

    2011-07-01

    Full Text Available Abstract Background The wide use of high-throughput DNA microarray technology provides an increasingly detailed view of the human transcriptome, from hundreds to thousands of genes. Although biomedical researchers typically design microarray experiments to explore specific biological contexts, the relationships between genes are hard to identify because the data are complex, noisy and high-dimensional, and analyses are often hindered by low statistical power. The main challenge now is to extract valuable biological information from the colossal amount of data to gain insight into biological processes and the mechanisms of human disease. Overcoming this challenge requires mathematical and computational methods that are versatile enough to capture the underlying biological features and simple enough to be applied efficiently to large datasets. Methods Unsupervised machine learning approaches provide new and efficient analysis of gene expression profiles. In our study, two unsupervised knowledge-based matrix factorization methods, independent component analysis (ICA) and nonnegative matrix factorization (NMF), are integrated to identify significant genes and related pathways in a microarray gene expression dataset of Alzheimer's disease. The advantage of these two approaches is that they can be performed as a biclustering method by which genes and conditions can be clustered simultaneously. Furthermore, they can group genes into different categories for identifying related diagnostic pathways and regulatory networks. The difference between the two methods lies in that ICA assumes statistical independence of the expression modes, while NMF needs positivity constraints to generate localized gene expression profiles. Results In our work, we performed the FastICA and non-smooth NMF methods on DNA microarray gene expression data of Alzheimer's disease, respectively. The simulation results show that both methods can clearly classify severe AD samples from control samples, and
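
    A rough sketch of the two factorizations on a genes-by-samples matrix is given below, using scikit-learn on synthetic data; the dataset, component count and gene-ranking rule are placeholders, and sklearn's standard NMF stands in for the paper's non-smooth NMF variant.

```python
import numpy as np
from sklearn.decomposition import FastICA, NMF

rng = np.random.default_rng(0)
X = np.abs(rng.normal(size=(1000, 40)))   # genes x samples; NMF needs X >= 0

ica = FastICA(n_components=5, random_state=0)
S = ica.fit_transform(X)                  # gene weights per expression mode

nmf = NMF(n_components=5, init="nndsvd", random_state=0, max_iter=500)
W = nmf.fit_transform(X)                  # localized gene expression profiles
H = nmf.components_                       # sample loadings per component

# Genes with extreme weights in a component are candidate significant genes.
top_genes = np.argsort(-np.abs(S[:, 0]))[:20]
print(top_genes)
```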

  19. Fitting methods to paradigms: are ergonomics methods fit for systems thinking?

    Science.gov (United States)

    Salmon, Paul M; Walker, Guy H; M Read, Gemma J; Goode, Natassia; Stanton, Neville A

    2017-02-01

    The issues being tackled within ergonomics problem spaces are shifting. Although existing paradigms appear relevant for modern day systems, it is worth questioning whether our methods are. This paper asks whether the complexities of systems thinking, a currently ubiquitous ergonomics paradigm, are outpacing the capabilities of our methodological toolkit. This is achieved through examining the contemporary ergonomics problem space and the extent to which ergonomics methods can meet the challenges posed. Specifically, five key areas within the ergonomics paradigm of systems thinking are focused on: normal performance as a cause of accidents, accident prediction, system migration, systems concepts and ergonomics in design. The methods available for pursuing each line of inquiry are discussed, along with their ability to respond to key requirements. In doing so, a series of new methodological requirements and capabilities are identified. It is argued that further methodological development is required to provide researchers and practitioners with appropriate tools to explore both contemporary and future problems. Practitioner Summary: Ergonomics methods are the cornerstone of our discipline. This paper examines whether our current methodological toolkit is fit for purpose given the changing nature of ergonomics problems. The findings provide key research and practice requirements for methodological development.

  20. The adaptive significance of ontogenetic colour change in a tropical python

    OpenAIRE

    Wilson, David; Heinsohn, Robert; Endler, John A

    2006-01-01

    Ontogenetic colour change is typically associated with changes in size, vulnerability or habitat, but assessment of its functional significance requires quantification of the colour signals from the receivers' perspective. The tropical python, Morelia viridis, is an ideal species to establish the functional significance of ontogenetic colour change. Neonates hatch either yellow or red and both the morphs change to green with age. Here, we show that colour change from red or yellow to green pr...

  1. Atherogenic index of plasma: a significant indicator for the onset of ...

    African Journals Online (AJOL)

    Background: Hypertension and menopause are independent risk factors for dyslipidaemia. In normotensive subjects, menopause causes significant alteration in lipid levels that may require medical intervention. It is thus expected that this alteration will become more relevant in hypertensive menopausal subjects.

  2. The significance of amlodipine on autonomic nervous system adjustment (ANSA method: A new approach in the treatment of hypertension

    Directory of Open Access Journals (Sweden)

    Milovanović Branislav

    2009-01-01

    Full Text Available Introduction. Cardiovascular autonomic modulation is altered in patients with essential hypertension. Objective. To evaluate acute and long-term effects of amlodipine on cardiovascular autonomic function and haemodynamic status in patients with mild essential hypertension. Methods. Ninety patients (43 male, mean age 52.12±10.7 years) with mild hypertension were tested before, 30 minutes after the first 5 mg oral dose of amlodipine and three weeks after monotherapy with amlodipine. A comprehensive study protocol was followed, including finger blood pressure variability (BPV) and heart rate variability (HRV) beat-to-beat analysis with impedance cardiography, ECG with software-based short-term HRV and nonlinear analysis, 24-hour Holter ECG monitoring with QT and HRV analysis, 24-hour blood pressure (BP) monitoring with systolic and diastolic BPV analysis, cardiovascular autonomic reflex tests, cold pressor test, and mental stress test. The patients were also divided into sympathetic and parasympathetic groups, depending on the predominance in short-term spectral analysis of sympathovagal balance according to low-frequency and high-frequency values. Results. We confirmed a significant systolic and diastolic BP reduction, and a reduction of pulse pressure during day, night and early morning hours. A reduction of supraventricular and ventricular ectopic beats during the night was also achieved with therapy, but without statistical significance. The increase in sympathetic activity in the early phase of amlodipine therapy was not statistically significant, and persistence of sympathetic predominance after a few weeks of therapy was detected based on the results of short-term spectral HRV analysis. All time-domain parameters of long-term HRV analysis were decreased, as was the low-frequency component amongst the spectral parameters. Amlodipine reduced baroreflex sensitivity after three weeks of therapy, but increased it immediately after the administration of the first dose. Conclusion. The results

  3. Modified small angle magnetization rotation method in multilayer magnetic microwires

    International Nuclear Information System (INIS)

    Torrejon, J.; Badini, G.; Pirota, K.; Vazquez, M.

    2007-01-01

    The small angle magnetization rotation (SAMR) technique is a widely used method to quantify magnetostriction in elongated ultrasoft magnetic materials. In the present work, we introduce a significant optimization of the method, particularly a simplification of the required equipment, profiting from the very peculiar characteristics of a recently introduced family of multilayer magnetic microwires consisting of a soft magnetic core, an insulating intermediate layer and a hard magnetic outer layer. The modified SAMR method introduced here is used not only to determine the saturation magnetostriction constant of the soft magnetic nucleus but also the magnetoelastic and magnetostatic coupling. This new method has great potential in multifunctional sensor applications

  4. A massively parallel discrete ordinates response matrix method for neutron transport

    International Nuclear Information System (INIS)

    Hanebutte, U.R.; Lewis, E.E.

    1992-01-01

    In this paper a discrete ordinates response matrix method is formulated with anisotropic scattering for the solution of neutron transport problems on massively parallel computers. The response matrix formulation eliminates iteration on the scattering source. The nodal matrices that result from the diamond-differenced equations are utilized in a factored form that minimizes memory requirements and significantly reduces the number of arithmetic operations required per node. The red-black solution algorithm utilizes massive parallelism by assigning each spatial node to one or more processors. The algorithm is accelerated by a synthetic method in which the low-order diffusion equations are also solved by massively parallel red-black iterations. The method is implemented on a 16K Connection Machine-2, and S8 and S16 solutions are obtained for fixed-source benchmark problems in x-y geometry
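
    The red-black idea generalizes beyond transport; the toy sketch below applies it to a simple diffusion (Poisson) problem, updating all nodes of one checkerboard colour simultaneously, which is exactly the property that maps the sweep onto massively parallel hardware. Grid size, source and sweep count are arbitrary choices for illustration.

```python
import numpy as np

n = 64
phi = np.zeros((n, n))                    # zero Dirichlet boundaries
src = np.ones((n, n))                     # uniform source term
h2 = (1.0 / (n - 1)) ** 2

red = (np.add.outer(np.arange(n), np.arange(n)) % 2 == 0)
interior = np.zeros_like(red)
interior[1:-1, 1:-1] = True

for sweep in range(500):
    for colour in (red, ~red):            # all same-colour nodes update at once
        nbr = np.zeros_like(phi)
        nbr[1:-1, 1:-1] = (phi[:-2, 1:-1] + phi[2:, 1:-1] +
                           phi[1:-1, :-2] + phi[1:-1, 2:])
        update = 0.25 * (nbr + h2 * src)  # 5-point Gauss-Seidel update
        mask = colour & interior
        phi[mask] = update[mask]
print(phi.max())
```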

  5. The significance of appearance in physician-nurse collaboration

    DEFF Research Database (Denmark)

    Konradsen, H.; Christensen, O.M.; Berthelsen, C.

    2009-01-01

    INTRODUCTION: According to nurses' assessment, physician-nurse collaboration is problematic. The aim of the study was to investigate whether nurses believe physicians' appearance is significant for their ability to collaborate. MATERIAL AND METHODS: This is a single-blinded, quasi-experimental study, with in-depth interviews to achieve harmonic interaction leading to a prolific and close future collaboration. The Jefferson Scale of Attitudes Toward Physician-Nurse Collaboration will be used for baseline and follow-up study of the nurses' assessment. RESULTS: Due to ethical considerations, researchers had difficulties finding surgeons prepared to perform procedures aiming at weakening the physicians' physical...

  6. Job requirements compared to dental school education: impact of a case-based learning curriculum.

    Science.gov (United States)

    Keeve, Philip L; Gerhards, Ute; Arnold, Wolfgang A; Zimmer, Stefan; Zöllner, Axel

    2012-01-01

    Case-based learning (CBL) is suggested as a key educational method of knowledge acquisition to improve dental education. The purpose of this study was to assess graduates of a patient-oriented, case-based learning (CBL)-based curriculum with regard to key competencies required in their professional activity. 407 graduates from a patient-oriented, case-based learning (CBL) dental curriculum who graduated between 1990 and 2006 were eligible for this study. 404 graduates were contacted between 2007 and 2008 to self-assess nine competencies as required at their day-to-day work and as taught in dental school on a 6-point Likert scale. Baseline demographics and clinical characteristics were presented as mean ± standard deviation (SD) for continuous variables. To determine whether dental education sufficiently covers the job requirements of physicians, we calculated the mean difference ∆ between the ratings of competencies as required in day-to-day work and as taught in medical school by subtracting one from the other (a negative mean difference ∆ indicates a deficit; a positive mean difference ∆ indicates a surplus). Spearman's rank correlation coefficient was calculated to reveal statistical significance (p<0.05). A surplus was found for "learning/working" (∆+0.08), whereas "Problem-solving skills" (∆-0.07), "Psycho-social competence" (∆-0.66) and "Business competence" (∆-2.86) needed improvement in the CBL-based curriculum. CBL demonstrated benefits with regard to competencies which were highly required in the job of dentists. Psycho-social and business competence deserve closer attention in future curricular development.

  7. The efficiency of different estimation methods of hydro-physical limits

    Directory of Open Access Journals (Sweden)

    Emma María Martínez

    2012-12-01

    Full Text Available The soil water available to crops is defined by specific values of water potential limits. Underlying the estimation of hydro-physical limits, identified as permanent wilting point (PWP) and field capacity (FC), is the selection of a suitable method based on a multi-criteria analysis that is not always clear and defined. In this kind of analysis, the time required for measurements must be taken into consideration as well as other external measurement factors, e.g., the reliability and suitability of the study area, measurement uncertainty, cost, effort and labour invested. In this paper, the efficiency of different methods for determining hydro-physical limits is evaluated by using indices that allow for the calculation of efficiency in terms of effort and cost. The analysis evaluates both direct determination methods (pressure plate, PP, and water activity meter, WAM) and indirect estimation methods (pedotransfer functions, PTFs). The PTFs must be validated for the area of interest before use, but the time and cost associated with this validation are not included in the cost of analysis. Compared to the other methods, the combined use of PP and WAM to determine hydro-physical limits differs significantly in time and cost required and quality of information. For direct methods, increasing sample size significantly reduces cost and time. This paper assesses the effectiveness of combining a general analysis based on efficiency indices and more specific analyses based on the different influencing factors, which were considered separately so as not to mask potential benefits or drawbacks that are not evidenced in efficiency estimation.

  8. Risk-Informed SSCs Categorization: Elicitation Method of Expert's Opinion

    International Nuclear Information System (INIS)

    Hwang, Mee Jeong; Yang, Joon Eon; Kim, Kil Yoo

    2005-01-01

    Regulation has been performed in a deterministic way since nuclear power plants began operating. However, some SSCs identified as safety-significant by the deterministic approach turned out to be of low or no safety significance, and some SSCs identified as non-safety-significant turned out to be highly safety-significant according to the results of PSA. Considering these risk insights, Regulatory Guide 1.174 and 10CFR50.69 were drawn up, and we can re-categorize the SSCs according to their safety significance. Therefore, study of and interest in risk-informed SSC re-categorization and treatment have continued. The objective of this regulatory initiative is to adjust the scope of equipment subject to special regulatory treatment to better focus licensee and regulatory attention and resources on equipment that has safety significance. Most current regulations define the plant equipment necessary to meet the deterministic regulatory basis as 'safety-related.' This equipment is subject to special treatment regulations. Other plant equipment is categorized as 'non-safety related' and is not subject to a select number of special treatment requirements or a subset of those requirements. However, risk information is not a magic tool for making a decision but a supporting tool for categorizing SSCs. This is because only small parts of a plant are modeled in the PSA model. Thus, engineering and deterministic judgments are also used for risk-informed SSC categorization, and expert opinion elicitation is very important for risk-informed SSC categorization. Therefore, we need a rational method to elicit experts' opinions, and in this study we developed a systematic method of expert elicitation to categorize nuclear power plants' SSCs. Current practices for SSC categorization in the USA and the existing methods for expert elicitation were surveyed, and a more systematic way of eliciting and combining expert opinions was developed. To validate the developed method

  9. Statistical Significance and Effect Size: Two Sides of a Coin.

    Science.gov (United States)

    Fan, Xitao

    This paper suggests that statistical significance testing and effect size are two sides of the same coin; they complement each other, but do not substitute for one another. Good research practice requires that both should be taken into consideration to make sound quantitative decisions. A Monte Carlo simulation experiment was conducted, and a…
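
    The point is easy to see numerically: hold the effect size fixed and let the sample size vary, and the p-value alone tells two different stories. Below is a small hedged illustration (two-sample t-test with made-up numbers, not the paper's simulation).

```python
from math import sqrt
from scipy import stats

d = 0.2                           # Cohen's d, a small effect, held constant
for n in (20, 400):               # per-group sample size
    t = d * sqrt(n / 2.0)         # t statistic implied by d and n
    p = 2 * stats.t.sf(t, df=2 * n - 2)
    print(n, round(p, 4))         # d stays 0.2; p crosses 0.05 only for large n
```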

  10. THE EQUALITY PRINCIPLE REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    CLAUDIA ANDRIŢOI

    2013-05-01

    Full Text Available The problem premises and the objectives followed: the idea of inserting the equality principle between the freedom and justice principles is manifested in positive law in two stages, as a general idea of all judicial norms and as a requirement of the owner of a subjective right of the applicants of an objective law. Equality in the face of the law and of public authorities cannot involve the idea of standardization, of uniformity, of enlisting all citizens under the mark of the same judicial regime, regardless of their natural or socio-professional situation. Through the Beijing Platform and the position documents of the European Commission we have defined the integrative approach to equality as representing an active and visible integration of the gender perspective in all sectors and at all levels. The research methods used are the conceptualist method, the logical method and the intuitive method, necessary as means of reasoning in order to argue our demonstration. We have to underline the fact that the system analysis of the research methods of the judicial phenomenon does not agree with "value ranking", because one value cannot be generalized in relation to another. At the same time, we must fight against methodological extremism. The final purpose of this study is the reaching of the perfecting/excellence stage by all individuals through the promotion of equality and freedom. This supposes that the existence of a non-discrimination favourable frame (fairness) represents a means and a condition of self-determination, and the state of perfection/excellency is a result of this self-determination; the condition necessary for obtaining this non-discrimination frame for all of us, in conditions of freedom for all individuals, represents the same condition that promotes the state of perfection/excellency. In conclusion we may state that the equality principle represents a true catalyst of the

  11. Ergonomic requirements to control room design - evaluation method

    International Nuclear Information System (INIS)

    Hinz, W.

    1985-01-01

    The method of evaluation introduced is the result of work carried out by the sub-committee 'Control Room Design' of the Engineering Standards Committee in DIN Standards, Ergonomy. This committee compiles standards for the design of control rooms (instrumentation and control) for the monitoring and operation of process engineering cycles. With the agreement of the committee - whom we wish to take the opportunity of thanking at this point for their constructive collaboration - a planned partial standard will be introduced thematically in the following, in order that knowledge gained from the discussion can be included in further work on the subject. The matter in question is a procedure for the qualitative evaluation of the duties to be performed under the control of operators in order that an assessment can be made of existing control concepts or such concepts as are to be found in the draft phase. (orig./GL) [de

  12. Review of nuclear piping seismic design requirements

    International Nuclear Information System (INIS)

    Slagis, G.C.; Moore, S.E.

    1994-01-01

    Modern-day nuclear plant piping systems are designed with a large number of seismic supports and snubbers that may be detrimental to plant reliability. Experimental tests have demonstrated the inherent ruggedness of ductile steel piping under seismic loading. Present methods to predict seismic loads on piping are based on linear-elastic analysis methods with low damping. These methods overpredict the seismic response of ductile steel pipe. The ASME Boiler and Pressure Vessel Code Section III stress limits for piping systems are based on considerations of static loads and hence are overly conservative. Appropriate stress limits for seismic loads on piping should be incorporated into the code to allow more flexible piping designs. The existing requirements and methods for seismic design of piping systems, including inherent conservatisms, are explained to provide a technical foundation for modifications to those requirements. 30 refs., 5 figs., 3 tabs

  13. Polynomial regression analysis and significance test of the regression function

    International Nuclear Information System (INIS)

    Gao Zhengming; Zhao Juan; He Shengping

    2012-01-01

    In order to analyze the decay heating power of a certain radioactive isotope per kilogram with the polynomial regression method, the paper first demonstrates the broad usage of polynomial functions and deduces their parameters with the ordinary least squares estimate. Then a significance test method for the polynomial regression function is derived, considering the similarity between the polynomial regression model and the multivariable linear regression model. Finally, polynomial regression analysis and a significance test of the polynomial function are applied to the decay heating power of the isotope per kilogram in accordance with the authors' real work. (authors)
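
    A minimal sketch of the procedure (with synthetic decay-heat data standing in for the authors' measurements) is an OLS polynomial fit followed by the overall F-test of the regression:

```python
import numpy as np
from scipy import stats

# Synthetic placeholder data: noisy exponential decay vs. time.
t = np.linspace(0.0, 10.0, 25)
y = 5.0 * np.exp(-0.3 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)

k = 3                                        # polynomial degree
coef = np.polyfit(t, y, k)                   # ordinary least squares estimate
yhat = np.polyval(coef, t)

ss_reg = np.sum((yhat - y.mean()) ** 2)      # regression sum of squares
ss_res = np.sum((y - yhat) ** 2)             # residual sum of squares
F = (ss_reg / k) / (ss_res / (t.size - k - 1))
p = stats.f.sf(F, k, t.size - k - 1)         # overall significance of the fit
print(F, p)                                  # small p: regression is significant
```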

  14. Measuring larval nematode contamination on cattle pastures: Comparing two herbage sampling methods.

    Science.gov (United States)

    Verschave, S H; Levecke, B; Duchateau, L; Vercruysse, J; Charlier, J

    2015-06-15

    Assessing levels of pasture larval contamination is frequently used to study the population dynamics of the free-living stages of parasitic nematodes of livestock. Direct quantification of infective larvae (L3) on herbage is the most applied method to measure pasture larval contamination. However, herbage collection remains labour intensive and there is a lack of studies addressing the variation induced by the sampling method and the required sample size. The aim of this study was (1) to compare two different sampling methods in terms of pasture larval count results and time required to sample, (2) to assess the amount of variation in larval counts at the level of sample plot, pasture and season, respectively and (3) to calculate the required sample size to assess pasture larval contamination with a predefined precision using random plots across pasture. Eight young stock pastures of different commercial dairy herds were sampled in three consecutive seasons during the grazing season (spring, summer and autumn). On each pasture, herbage samples were collected through both a double-crossed W-transect with samples taken every 10 steps (method 1) and four randomly located plots of 0.16 m² with collection of all herbage within the plot (method 2). The average (± standard deviation, SD) pasture larval contamination using sampling methods 1 and 2 was 325 (± 479) and 305 (± 444) L3/kg dry herbage (DH), respectively. Large discrepancies in pasture larval counts of the same pasture and season were often seen between methods, but no significant difference (P = 0.38) in larval counts between methods was found. Less time was required to collect samples with method 2. This difference in collection time between methods was most pronounced for pastures with a surface area larger than 1 ha. The variation in pasture larval counts from samples generated by random plot sampling was mainly due to the repeated measurements on the same pasture in the same season (residual variance

  15. Application of an efficient Bayesian discretization method to biomedical data

    Directory of Open Access Journals (Sweden)

    Gopalakrishnan Vanathi

    2011-07-01

    Full Text Available Abstract Background Several data mining methods require data that are discrete, and other methods often perform better with discrete data. We introduce an efficient Bayesian discretization (EBD) method for optimal discretization of variables that runs efficiently on high-dimensional biomedical datasets. The EBD method consists of two components, namely, a Bayesian score to evaluate discretizations and a dynamic programming search procedure to efficiently search the space of possible discretizations. We compared the performance of EBD to Fayyad and Irani's (FI) discretization method, which is commonly used for discretization. Results On 24 biomedical datasets obtained from high-throughput transcriptomic and proteomic studies, the classification performances of the C4.5 classifier and the naïve Bayes classifier were statistically significantly better when the predictor variables were discretized using EBD over FI. EBD was statistically significantly more stable to the variability of the datasets than FI. However, EBD was less robust, though not statistically significantly so, than FI and produced slightly more complex discretizations than FI. Conclusions On a range of biomedical datasets, a Bayesian discretization method (EBD) yielded better classification performance and stability but was less robust than the widely used FI discretization method. The EBD discretization method is easy to implement, permits the incorporation of prior knowledge and belief, and is sufficiently fast for application to high-dimensional data.

  16. Environmental assessments and findings of no significant impact--FDA. Notice.

    Science.gov (United States)

    1998-05-18

    The Food and Drug Administration (FDA) is announcing that it has reviewed environmental assessments (EA's) and issued findings of no significant impact (FONSI's) relating to the 167 new drug applications (NDA's) and supplemental applications listed in this document. FDA is publishing this notice because Federal regulations require public notice of the availability of environmental documents.

  17. Significance and clinical value of the transitional zone volume (TZV ...

    African Journals Online (AJOL)

    M. El Ghoneimy

    2017-01-12

    Jan 12, 2017 ... Objective: The aim of this work was to evaluate the significance and clinical value of the TZI, which has been a point of ... Conclusion: Estimating the transition zone volume during TRUS is a reasonable way to obtain the required ... Besides the IPSS score, a complete medical and surgical history was also.

  18. Whole-organ and segmental stiffness measured with liver magnetic resonance elastography in healthy adults: significance of the region of interest.

    Science.gov (United States)

    Rusak, Grażyna; Zawada, Elżbieta; Lemanowicz, Adam; Serafin, Zbigniew

    2015-04-01

    MR elastography (MRE) is a recent non-invasive technique that provides in vivo data on the viscoelasticity of the liver. Since the method is not well established, several different protocols have been proposed that differ in their results. The aim of the study was to analyze the variability of stiffness measurements in different regions of the liver. Twenty healthy adults aged 24-45 years were recruited. The examination was performed using a mechanical excitation of 64 Hz. MRE images were fused with axial T2WI breath-hold images (thickness 10 mm, spacing 10 mm). Stiffness was measured as a mean value of each cross section of the whole liver, on a single largest cross section, in the right lobe, and in ROIs (50 pix.) placed in the center of the left lobe, segments 5/6, 7, 8, and the parahilar region. Whole-liver stiffness ranged from 1.56 to 2.75 kPa. Mean segmental stiffness differed significantly between the tested regions (range from 1.55 ± 0.28 to 2.37 ± 0.32 kPa; P < 0.0001, ANOVA). Within-method variability of measurements ranged from 14% (whole liver and segment 8) to 26% (segment 7). Within-subject variability ranged from 13% to 31%. Results of measurement within segment 8 were closest to the whole-liver method (ICC, 0.84). Stiffness of the liver presented significant variability depending on the region of measurement. The most reproducible method is averaging of cross sections of the whole liver. There was significant variability in stiffness among subjects considered healthy, which requires further investigation.

  19. Graphical user interface prototyping for distributed requirements engineering

    CERN Document Server

    Scheibmayr, Sven

    2014-01-01

    Finding and understanding the right requirements is essential for every software project. This book deals with the challenge to improve requirements engineering in distributed software projects. The use of graphical user interface (GUI) prototypes can help stakeholders in such projects to elicit and specify high quality requirements. The research objective of this study is to develop a method and a software artifact to support the activities in the early requirements engineering phase in order to overcome some of the difficulties and improve the quality of the requirements, which should eventu

  20. Zooplankton - Study methods, importance and significant observations

    Digital Repository Service at National Institute of Oceanography (India)

    Gajbhiye, S.N.

    density, shorter life span, drifting nature, high group/species diversity and different tolerance to the stress, they are being used as the indicator organisms for the physical, chemical and biological processes in the aquatic ecosystem. In the deeper...

  1. Method Verification Requirements for an Advanced Imaging System for Microbial Plate Count Enumeration.

    Science.gov (United States)

    Jones, David; Cundell, Tony

    2018-01-01

    The Growth Direct™ System that automates the incubation and reading of membrane filtration microbial counts on soybean-casein digest, Sabouraud dextrose, and R2A agar differs from the traditional method only in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification and not a full method validation. LAY ABSTRACT: The Growth Direct™ System that automates the incubation and reading of microbial counts on membranes on solid agar differs from the traditional method only in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation time. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification and not a full method validation. © PDA, Inc. 2018.

  2. Energy Storage Requirements for PV Power Ramp Rate Control in Northern Europe

    Directory of Open Access Journals (Sweden)

    Julius Schnabel

    2016-01-01

    Full Text Available Photovoltaic (PV) generators suffer from fluctuating output power due to the highly fluctuating primary energy source. With significant PV penetration, these fluctuations can lead to power system instability and power quality problems. The use of energy storage systems as fluctuation compensators has been proposed as a means to mitigate these problems. In this paper, the behavior of PV power fluctuations in Northern European climatic conditions and requirements for sizing the energy storage systems to compensate them have been investigated and compared to similar studies done in Southern European climate. These investigations have been performed through simulations that utilize measurements from the Tampere University of Technology solar PV power station research plant in Finland. An enhanced energy storage charging control strategy has been developed and tested. Energy storage capacity, power, and cycling requirements have been derived for different PV generator sizes and power ramp rate requirements. The developed control strategy leads to lesser performance requirements for the energy storage systems compared to the methods presented earlier. Further, some differences in the operation of PV generators in Northern and Southern European climates have been detected.

  3. Septic Pulmonary Embolism Requiring Critical Care: Clinicoradiological Spectrum, Causative Pathogens and Outcomes

    Science.gov (United States)

    Chou, Deng-Wei; Wu, Shu-Ling; Chung, Kuo-Mou; Han, Shu-Chen; Cheung, Bruno Man-Hon

    2016-01-01

    OBJECTIVES: Septic pulmonary embolism is an uncommon but life-threatening disorder. However, data on patients with septic pulmonary embolism who require critical care have not been well reported. This study elucidated the clinicoradiological spectrum, causative pathogens and outcomes of septic pulmonary embolism in patients requiring critical care. METHODS: The electronic medical records of 20 patients with septic pulmonary embolism who required intensive care unit admission between January 2005 and December 2013 were reviewed. RESULTS: Multiple organ dysfunction syndrome developed in 85% of the patients, and acute respiratory failure was the most common organ failure (75%). The most common computed tomographic findings included a feeding vessel sign (90%), peripheral nodules without cavities (80%) or with cavities (65%), and peripheral wedge-shaped opacities (75%). The most common primary source of infection was liver abscess (40%), followed by pneumonia (25%). The two most frequent causative pathogens were Klebsiella pneumoniae (50%) and Staphylococcus aureus (35%). Compared with survivors, nonsurvivors had significantly higher serum creatinine, arterial partial pressure of carbon dioxide, and Acute Physiology and Chronic Health Evaluation II and Sequential Organ Failure Assessment scores, and they were significantly more likely to have acute kidney injury, disseminated intravascular coagulation and lung abscesses. The in-hospital mortality rate was 30%. Pneumonia was the most common cause of death, followed by liver abscess. CONCLUSIONS: Patients with septic pulmonary embolism who require critical care, especially those with pneumonia or liver abscess, face high mortality. Early diagnosis, appropriate antibiotic therapy, surgical intervention and respiratory support are essential. PMID:27759843

  4. Septic Pulmonary Embolism Requiring Critical Care: Clinicoradiological Spectrum, Causative Pathogens and Outcomes

    Directory of Open Access Journals (Sweden)

    Deng-Wei Chou

    Full Text Available OBJECTIVES: Septic pulmonary embolism is an uncommon but life-threatening disorder. However, data on patients with septic pulmonary embolism who require critical care have not been well reported. This study elucidated the clinicoradiological spectrum, causative pathogens and outcomes of septic pulmonary embolism in patients requiring critical care. METHODS: The electronic medical records of 20 patients with septic pulmonary embolism who required intensive care unit admission between January 2005 and December 2013 were reviewed. RESULTS: Multiple organ dysfunction syndrome developed in 85% of the patients, and acute respiratory failure was the most common organ failure (75%). The most common computed tomographic findings included a feeding vessel sign (90%), peripheral nodules without cavities (80%) or with cavities (65%), and peripheral wedge-shaped opacities (75%). The most common primary source of infection was liver abscess (40%), followed by pneumonia (25%). The two most frequent causative pathogens were Klebsiella pneumoniae (50%) and Staphylococcus aureus (35%). Compared with survivors, nonsurvivors had significantly higher serum creatinine, arterial partial pressure of carbon dioxide, and Acute Physiology and Chronic Health Evaluation II and Sequential Organ Failure Assessment scores, and they were significantly more likely to have acute kidney injury, disseminated intravascular coagulation and lung abscesses. The in-hospital mortality rate was 30%. Pneumonia was the most common cause of death, followed by liver abscess. CONCLUSIONS: Patients with septic pulmonary embolism who require critical care, especially those with pneumonia or liver abscess, face high mortality. Early diagnosis, appropriate antibiotic therapy, surgical intervention and respiratory support are essential.

  5. Identify and Manage the Software Requirements Volatility

    OpenAIRE

    Khloud Abd Elwahab; Mahmoud Abd EL Latif; Sherif Kholeif

    2016-01-01

    Management of software requirements volatility throughout the development life cycle is a very important task. It helps the team control significant impacts across the project (cost, time and effort), and it keeps the project on track toward finally satisfying the user, which is the main success criterion for a software project. In this research paper, we have analysed the root causes of requirements volatility through a proposed framework presenting the requirements volatility causes and how...

  6. Transit Traffic Analysis Zone Delineating Method Based on Thiessen Polygon

    Directory of Open Access Journals (Sweden)

    Shuwei Wang

    2014-04-01

    Full Text Available A green transportation system composed of transit, buses and bicycles could play a significant role in alleviating traffic congestion. However, the inaccuracy of current transit ridership forecasting methods is having a negative impact on the development of urban transit systems. Traffic Analysis Zone (TAZ) delineation is a fundamental and essential step in ridership forecasting, but the existing delineation methods in four-step models have problems reflecting the travel characteristics of urban transit. This paper aims to develop a Transit Traffic Analysis Zone delineation method as a supplement to traditional TAZs in transit service analysis. The deficiencies of current TAZ delineation methods were analyzed, and the requirements of Transit Traffic Analysis Zones (TTAZs) were summarized. Considering these requirements, Thiessen polygons were introduced into TTAZ delineation. In order to validate its feasibility, Beijing was then taken as an example to delineate TTAZs, followed by a spatial analysis of office buildings within a TTAZ and of transit station departure passengers. The analysis shows that TTAZs based on Thiessen polygons can reflect transit travel characteristics and merit further in-depth research.
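
    The core geometric step, assigning every location to its nearest transit station, is exactly a Voronoi (Thiessen) tessellation. A minimal sketch using scipy's Voronoi follows; the station coordinates are made up for illustration, and a real TTAZ workflow would also clip the open edge polygons to the study boundary.

    ```python
    import numpy as np
    from scipy.spatial import Voronoi

    # hypothetical station coordinates in projected metres
    stations = np.array([[0, 0], [1000, 200], [400, 900],
                         [1500, 1200], [800, 1600]], dtype=float)
    vor = Voronoi(stations)

    # each Voronoi region approximates one TTAZ: all points closer to
    # that station than to any other station
    for i, region_idx in enumerate(vor.point_region):
        region = vor.regions[region_idx]
        if -1 in region:   # unbounded region at the edge of the point set
            print(f"station {i}: open polygon, clip to the study boundary")
        else:
            print(f"station {i}: vertices\n{vor.vertices[region]}")
    ```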

  7. Unbiased proteomics analysis demonstrates significant variability in mucosal immune factor expression depending on the site and method of collection.

    Directory of Open Access Journals (Sweden)

    Kenzie M Birse

    Full Text Available Female genital tract secretions are commonly sampled by lavage of the ectocervix and vaginal vault or via a sponge inserted into the endocervix for evaluating inflammation status and immune factors critical for HIV microbicide and vaccine studies. This study uses a proteomics approach to comprehensively compare the efficacy of these methods, which sample from different compartments of the female genital tract, for the collection of immune factors. Matching sponge and lavage samples were collected from 10 healthy women and were analyzed by tandem mass spectrometry. Data were analyzed by a combination of differential protein expression analysis, hierarchical clustering and pathway analysis. Of the 385 proteins identified, endocervical sponge samples collected nearly twice as many unique proteins as cervicovaginal lavage (111 vs. 61), with 55% of proteins (213) common to both. Each method/site identified 73 unique proteins that have roles in host immunity according to their gene ontology. Sponge samples enriched for specific inflammation pathways including acute phase response proteins (p = 3.37×10⁻²⁴) and LXR/RXR immune activation pathways (p = 8.82×10⁻²²), while the 'role of IL-17A in psoriasis' pathway (p = 5.98×10⁻⁴) and the complement system pathway (p = 3.91×10⁻³) were enriched in lavage samples. Many host defense factors were differentially enriched (p < 0.05) between sites, including known/potential antimicrobial factors (n = 21), S100 proteins (n = 9), and immune regulatory factors such as serpins (n = 7). Immunoglobulins (n = 6) were collected at comparable abundance at each site, although 25% of those identified were unique to sponge samples. This study demonstrates significant differences in the types and quantities of immune factors and inflammation pathways collected by each sampling technique. Therefore, clinical studies that measure mucosal immune activation or factors assessing HIV transmission should utilize

  8. Current Methods to Detoxify Fly Ash from Waste Incineration

    Energy Technology Data Exchange (ETDEWEB)

    Hallgren, Christine; Stroemberg, Birgitta [TPS Termiska Processer AB, Nykoeping (Sweden)

    2004-07-01

    Fly ash from waste incineration contains large amounts of heavy metals and dioxins, which will cause a significant disposal problem within the coming years. The amount of fly ash produced in Sweden is currently approximately 60,000 tons/y. New technological options for the decontamination and/or inertization of incinerator fly ash are being developed with the objective of rendering a product that can be reused or, at least, be deposited at standard landfill sites with no risk. Many of these technologies have been tested at industrial scale or in pilot projects. The proposed alternatives include: Thermal treatments; Immobilization/stabilization by cement-based techniques; Wet chemical treatments (extractions, immobilizations); Microbiological treatments. Of these, thermal treatments are the most promising solution. Depending on the temperature, thermal treatments are classified into two main types: 1) low-temperature (below 600 deg C) thermal treatments and 2) high-temperature (above 1200 deg C) thermal treatments (vitrification). Most dioxins can be successfully destroyed at temperatures up to 400 deg C under oxygen-deficient conditions and at temperatures up to 600 deg C under oxidising conditions. However, most heavy metals remain in the fly ash after low-temperature treatment. At a temperature of 900 deg C most heavy metals can also be removed in a 10% HCl atmosphere by forming volatile metal chlorides (CT-Fluapur process). During vitrification processes the fly ash melts and forms an inert glassy slag. The product does not leach any significant amount of heavy metals and is free from dioxin. The volume of the fly ash is significantly reduced. The product can be landfilled at low cost or used as construction material. The properties of the product depend on the cooling process and on additives such as sand, limestone or waste glass. A series of vitrification methods at industrial or pilot scale, using different furnaces, are studied. Among these, plasma

  9. Thermal test requirements and their verification by different test methods

    International Nuclear Information System (INIS)

    Droste, B.; Wieser, G.; Probst, U.

    1993-01-01

    The paper discusses the parameters influencing the thermal test conditions for type B-packages. Criteria for different test methods (by analytical as well as by experimental means) will be developed. A comparison of experimental results from fuel oil pool and LPG fire tests will be given. (J.P.N.)

  10. New weighting methods for phylogenetic tree reconstruction using multiple loci.

    Science.gov (United States)

    Misawa, Kazuharu; Tajima, Fumio

    2012-08-01

    Efficient determination of evolutionary distances is important for the correct reconstruction of phylogenetic trees. The performance of the pooled distance required for reconstructing a phylogenetic tree can be improved by applying large weights to appropriate distances for reconstructing phylogenetic trees and small weights to inappropriate distances. We developed two weighting methods, the modified Tajima-Takezaki method and the modified least-squares method, for reconstructing phylogenetic trees from multiple loci. By computer simulations, we found that both of the new methods were more efficient in reconstructing correct topologies than the no-weight method. Hence, we reconstructed hominoid phylogenetic trees from mitochondrial DNA using our new methods, and found that the levels of bootstrap support were significantly increased by the modified Tajima-Takezaki and by the modified least-squares method.
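
    The general idea of down-weighting noisy loci can be illustrated with an inverse-variance weighted pooled distance; the weights below are a textbook choice for illustration, not the authors' modified Tajima-Takezaki or modified least-squares weights.

    ```python
    import numpy as np

    def pooled_distance(d, var):
        """Pooled evolutionary distance across loci, weighting each
        per-locus estimate d[i] by 1/var[i] so noisy loci count less."""
        w = 1.0 / np.asarray(var, dtype=float)
        return float(np.sum(w * np.asarray(d)) / np.sum(w))

    # three loci with different estimation variances
    print(pooled_distance([0.12, 0.15, 0.10], [0.001, 0.004, 0.002]))
    ```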

  11. Economic analysis vs. capital-recovery requirements of power reactor decommissioning

    International Nuclear Information System (INIS)

    Ferguson, J.S.

    1980-01-01

    As a consultant to electric utilities the author often becomes involved in the development of policy for capital recovery and in the determination of depreciation rates that will implement the policy. Utility capital recovery is controlled by generally accepted depreciation accounting practices and by regulatory commission accounting rules and, as a result, can differ significantly from engineering economics. Those involved with decommissioning of power reactors should be aware of the depreciation accounting and regulatory framework that dictates capital recovery requirements, whether their involvement is related to engineering economics or capital recovery. This presentation defines that framework, points out several significant implications (particularly tax), describes several conforming capital-recovery methods, describes several techniques that have been used with the decommissioning component in economic analysis of alternative energy sources, and discusses why those involved in economic analysis should learn the accounting and regulatory framework for capital recovery

  12. Human Body 3D Posture Estimation Using Significant Points and Two Cameras

    Science.gov (United States)

    Juang, Chia-Feng; Chen, Teng-Chang; Du, Wei-Chin

    2014-01-01

    This paper proposes a three-dimensional (3D) human posture estimation system that locates 3D significant body points based on 2D body contours extracted from two cameras without using any depth sensors. The 3D significant body points that are located by this system include the head, the center of the body, the tips of the feet, the tips of the hands, the elbows, and the knees. First, a linear support vector machine- (SVM-) based segmentation method is proposed to distinguish the human body from the background in red, green, and blue (RGB) color space. The SVM-based segmentation method uses not only normalized color differences but also included angle between pixels in the current frame and the background in order to reduce shadow influence. After segmentation, 2D significant points in each of the two extracted images are located. A significant point volume matching (SPVM) method is then proposed to reconstruct the 3D significant body point locations by using 2D posture estimation results. Experimental results show that the proposed SVM-based segmentation method shows better performance than other gray level- and RGB-based segmentation approaches. This paper also shows the effectiveness of the 3D posture estimation results in different postures. PMID:24883422
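
    The segmentation features named in the abstract (normalized color differences plus the included angle between a pixel and the background, used to suppress shadows) map naturally onto a linear SVM. A minimal sketch with scikit-learn, trained on synthetic pixels since the paper's data are not available:

    ```python
    import numpy as np
    from sklearn.svm import LinearSVC

    def pixel_features(frame, background):
        """frame, background: (N, 3) RGB arrays of corresponding pixels."""
        diff = (frame - background) / 255.0                 # normalized color differences
        dot = np.einsum('ij,ij->i', frame, background)
        norms = (np.linalg.norm(frame, axis=1)
                 * np.linalg.norm(background, axis=1)) + 1e-9
        angle = np.arccos(np.clip(dot / norms, -1.0, 1.0))  # included angle (shadow cue)
        return np.column_stack([diff, angle])

    rng = np.random.default_rng(0)
    bg = rng.uniform(0.0, 255.0, (200, 3))
    body = np.clip(bg + rng.normal(60.0, 20.0, (200, 3)), 0, 255)  # body: new colors
    shadow = bg * 0.6                                              # shadow: darker, same hue
    X = np.vstack([pixel_features(body, bg), pixel_features(shadow, bg)])
    y = np.array([1] * 200 + [0] * 200)    # 1 = human body, 0 = background/shadow

    clf = LinearSVC(C=1.0).fit(X, y)
    print(clf.score(X, y))
    ```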

  13. Methodical investigations on the determination of metabolic lysine requirements in broiler chickens. 1

    International Nuclear Information System (INIS)

    Bergner, H.; Nguyen Thi Nhan; Wilke, A.

    1987-01-01

    For the estimation of lysine requirement, 128 male broiler chickens were used at an age of 7 to 21 days posthatching. They received a lysine-deficient diet composed of wheat and wheat gluten. To this basal diet, L-lysine·HCl was supplemented successively, resulting in 8 lysine levels ranging from 5.8 to 23.3 g lysine per kg dry matter (DM) (2.2 to 8.7 g lysine per 16 g N). At the end of the two-week feeding period of the experimental diets, ¹⁴C-lysine was injected intravenously 1.5 and 5.5 hours after feed withdrawal. During the following 4 hours the excretion of CO₂ and ¹⁴CO₂ was measured. The highest daily gain of 21.5 g was observed in animals fed 13.3 g lysine/kg DM. Lysine concentrations exceeding 18.3 g/kg DM depressed body weight gain. The CO₂ excretion was not influenced by lysine intake. ¹⁴CO₂ excretion was low with diets low in lysine content and increased 3 to 4 times with diets meeting the lysine requirement. Based on measurements 1.5 to 5.5 hours after feed withdrawal, the saturation value for lysine was reached at 13.3 g/kg DM. This value was lowered (10.8 g/kg DM), however, if the estimation was carried out 5.5 to 9.5 hours after feed withdrawal. These results suggest a higher metabolic lysine requirement during the earlier period after feed intake. Both the reduced weight gain and the non-linearity in ¹⁴CO₂ excretion with diets exceeding a lysine content of 18.3 g/kg DM indicate a limited capacity of the organism to degrade excessive lysine. According to the results, a lysine requirement between 10.8 and 13.3 g/kg DM (27% CP and 660 EFU_hen/kg DM) was estimated for broiler chickens 3 weeks posthatching. (author)

  14. Towards an Automated Requirements-driven Development of Smart Cyber-Physical Systems

    Directory of Open Access Journals (Sweden)

    Jiri Vinarek

    2016-03-01

    Full Text Available The Invariant Refinement Method for Self Adaptation (IRM-SA) is a design method targeting development of smart Cyber-Physical Systems (sCPS). It allows for a systematic translation of the system requirements into the system architecture expressed as an ensemble-based component system (EBCS). However, since the requirements are captured using natural language, there exists the danger of their misinterpretation due to natural language requirements' ambiguity, which could eventually lead to design errors. Thus, automation and validation of the design process is desirable. In this paper, we (i) analyze the translation process of natural language requirements into the IRM-SA model, (ii) identify individual steps that can be automated and/or validated using natural language processing techniques, and (iii) propose suitable methods.

  15. Path Planning with a Lazy Significant Edge Algorithm (LSEA)

    Directory of Open Access Journals (Sweden)

    Joseph Polden

    2013-04-01

    Full Text Available Probabilistic methods have been proven to be effective for robotic path planning in a geometrically complex environment. In this paper, we propose a novel approach, which utilizes a specialized roadmap expansion phase, to improve lazy probabilistic path planning. This expansion phase analyses roadmap connectivity information to bias sampling towards objects in the workspace that have not yet been navigated by the robot. A new method to reduce the number of samples required to navigate narrow passages is also proposed and tested. Experimental results show that the new algorithm is more efficient than the traditional path planning methodologies. It was able to generate solutions for a variety of path planning problems faster, using fewer samples to arrive at a valid solution.

  16. Prevalence and clinical significance of cathepsin G antibodies in systemic sclerosis

    Directory of Open Access Journals (Sweden)

    M. Favaro

    2011-09-01

    Full Text Available Objectives: To evaluate the prevalence and clinical significance of cathepsin G antibodies in patients affected with systemic sclerosis (SSc, scleroderma). Methods: 115 patients affected by SSc, 55 (47.8%) with diffuse scleroderma (dSSc) and 60 (52.2%) with limited scleroderma (lSSc), were tested for cathepsin G antibodies by an ELISA method. Moreover, these sera were evaluated by indirect immunofluorescence (IIF) on ethanol- and formalin-fixed human neutrophils. Results: By means of the ELISA method, 16 (13.9%) patients were found to be seropositive for anti-cathepsin G, 2 (12.5%) of which showed a perinuclear fluorescence pattern (P-ANCA) and 4 (25%) an atypical ANCA staining, while 10 (62.5%) were negative on IIF. The IIF on scleroderma sera revealed 5 (4.3%) P-ANCA and 18 (15.7%) atypical ANCA patterns. The anti-cathepsin G antibodies significantly prevailed in scleroderma sera (p = 0.02) when their frequency was compared with that of healthy controls, while they were not significantly associated with any clinical or serological features of the SSc patients. Conclusions: The anti-cathepsin G antibodies were significantly frequent in scleroderma sera; however, no clinical correlations were found. Thus, the significance of their presence in SSc still needs to be clarified.

  17. Ranking of risk significant components for the Davis-Besse Component Cooling Water System

    International Nuclear Information System (INIS)

    Seniuk, P.J.

    1994-01-01

    Utilities that run nuclear power plants are responsible for testing the pumps and valves, specified by the American Society of Mechanical Engineers (ASME), that are required for safe shutdown, for mitigating the consequences of an accident, and for maintaining the plant in a safe condition. These inservice components are tested according to ASME Codes, either the earlier requirements of the ASME Boiler and Pressure Vessel Code, Section XI, or the more recent requirements of the ASME Operation and Maintenance Code, Section IST. These codes dictate test techniques and frequencies regardless of the component failure rate or significance of failure consequences. A probabilistic risk assessment or probabilistic safety assessment may be used to evaluate the component importance for inservice test (IST) risk ranking, which is a combination of failure rate and failure consequences. Resources for component testing during the normal quarterly verification test or postmaintenance test are expensive. Normal quarterly testing may cause component unavailability. Outage testing may increase outage cost with no real benefit. This paper identifies the importance ranking of risk-significant components in the Davis-Besse component cooling water system. Identifying the ranking of these risk-significant IST components adds technical insight for developing the appropriate test technique and test frequency.

  18. Ocean disposal of heat generating radioactive waste backfilling requirements

    International Nuclear Information System (INIS)

    1986-07-01

    This report describes the backfilling requirements arising from the disposal of HGW in deep ocean sediments. The two disposal options considered are the drilled emplacement method and the free fall penetrator method. The materials best suited for filling the voids in the two options are reviewed. Candidate materials are selected following a study of the property requirements of each backfill. Placement methods for the candidate materials, as well as the means available for verifying the quality of the filling, are presented. Finally, an assessment of the overall feasibility of each placement method is given. The main conclusion is that, although the proposed methods are feasible, further work is necessary to test each of the proposed filling methods in inactive trials. Moreover, it is difficult to envisage how two of the backfilling operations in the drilled emplacement option could be verified by non-destructive methods. (author)

  19. Meeting the latest qualification requirements for Class 1E protection system equipment: a practical approach

    International Nuclear Information System (INIS)

    Daigle, R.P.; Breen, R.J.

    1977-01-01

    The requirements for qualifying Class 1E equipment for Nuclear Power Generating Stations were significantly revised in 1974 and 1975. These new requirements reflect the desire of the industry to provide improved methods of determining the qualification of this vital equipment. The revised standards do, in fact, meet these industry goals in a generally acceptable manner. The Nuclear Regulatory Commission is presently requiring utilities to comply with these revised standards and regulatory guides in order to obtain the necessary permits. Manufacturers are developing and implementing programs to comply with the new requirements. One of the more difficult new qualification requirements is aging to achieve an advanced-life condition. The objectives and methods described for aging are difficult for much of the equipment within the Protection System. The use of thermal and vibrational techniques to simulate aging is valid for some components (e.g., capacitors, transistors, cable and/or motor insulation) but may be neither valid nor practical for many items (e.g., complete instrument systems). A seemingly obvious, although rarely followed, approach to new or revised standards is to refrain from making any commitment until the standards are thoroughly understood. Often too hasty a decision is made by a utility (concerned about licensing) or a manufacturer (concerned about being competitive) to commit to new requirements. Consequently, the broad range of interpretations that usually develops for a given set of requirements may result in difficult relations between organizations. This paper deals with solutions for qualification in a practical sense, with emphasis on the aging issue, and does not elaborate on seismic qualification.

  20. Surface-Based fMRI-Driven Diffusion Tractography in the Presence of Significant Brain Pathology: A Study Linking Structure and Function in Cerebral Palsy

    Science.gov (United States)

    Cunnington, Ross; Boyd, Roslyn N.; Rose, Stephen E.

    2016-01-01

    Diffusion MRI (dMRI) tractography analyses are difficult to perform in the presence of brain pathology. Automated methods that rely on cortical parcellation for structural connectivity studies often fail, while manually defining regions is extremely time-consuming and can introduce human error. Both methods also make assumptions about structure-function relationships that may not hold after cortical reorganisation. Seeding tractography with functional-MRI (fMRI) activation is an emerging method that reduces these confounds, but inherent smoothing of fMRI signal may result in the inclusion of irrelevant pathways. This paper describes a novel fMRI-seeded dMRI-analysis pipeline based on surface-meshes that reduces these issues and utilises machine-learning to generate task-specific white matter pathways, minimising the requirement for manually-drawn ROIs. We directly compared this new strategy to a standard voxelwise fMRI-dMRI approach, by investigating correlations between clinical scores and dMRI metrics of thalamocortical and corticomotor tracts in 31 children with unilateral cerebral palsy. The surface-based approach successfully processed more participants (87%) than the voxel-based approach (65%), and provided significantly more-coherent tractography. Significant correlations between dMRI metrics and five clinical scores of function were found for the more superior regions of these tracts. These significant correlations were stronger and more frequently found with the surface-based method (15/20 investigated were significant; R² = 0.43–0.73) than the voxelwise analysis (2 significant correlations; R² = 0.38 and 0.49). More restricted fMRI signal, better-constrained tractography, and the novel track-classification method all appeared to contribute toward these differences. PMID:27487011

  1. Significant Returns in Engagement and Performance with a Free Teaching App

    Science.gov (United States)

    Green, Alan

    2016-01-01

    Pedagogical research shows that teaching methods other than traditional lectures may result in better outcomes. However, lecture remains the dominant method in economics, likely due to high implementation costs of methods shown to be effective in the literature. In this article, the author shows significant benefits of using a teaching app for…

  2. Dynamic contrast-enhanced magnetic resonance imaging: a non-invasive method to evaluate significant differences between malignant and normal tissue

    International Nuclear Information System (INIS)

    Rudisch, Ansgar; Kremser, Christian; Judmaier, Werner; Zunterer, Hildegard; DeVries, Alexander F.

    2005-01-01

    Purpose: An ever recurring challenge in diagnostic radiology is the differentiation between non-malignant and malignant tissue. Based on evidence that the microcirculation of normal, non-malignant tissue differs from that of malignant tissue, the goal of this study was to assess the reliability of dynamic contrast-enhanced Magnetic Resonance Imaging (dcMRI) for differentiating these two entities. Materials and methods: dcMRI data of rectum carcinoma and gluteus maximus muscles were acquired in 41 patients. Using a fast T1-mapping sequence on a 1.5-T whole body scanner, T1-maps were dynamically retrieved before, during and after constant-rate i.v. infusion of a contrast medium (CM). On the basis of the acquired data sets, PI-values were calculated on a pixel-by-pixel basis. The relevance of spatial heterogeneities of microcirculation was investigated by relative frequency histograms of the PI-values. Results: A statistically significant difference between malignant and normal tissue was found for the mean PI-value (P < 0.001; 8.95 ml/min/100 g ± 2.45 versus 3.56 ml/min/100 g ± 1.20). Additionally, relative frequency distributions of PI-values with equal class intervals of 2.5 ml/min/100 g revealed significant differences between the histograms of muscles and rectum carcinoma. Conclusion: We could show that microcirculation differences between malignant and normal, non-malignant tissue can be reliably assessed by non-invasive dcMRI. Therefore, dcMRI holds great promise as an aid in cancer assessment, especially in patients where biopsy is contraindicated.

  3. 40 CFR 465.03 - Monitoring and reporting requirements.

    Science.gov (United States)

    2010-07-01

    ... less than 30 days in advance of the scheduled production and shall provide the chemical analysis of the... this regulation. (a) Periodic analyses for cyanide are not required when both of the following... method required for determination of petroleum hydrocarbons (non-polar material) is given under the...

  4. Methods for calculating energy and current requirements for industrial electron beam processing

    International Nuclear Information System (INIS)

    Cleland, M.R.; Farrell, J.P.

    1976-01-01

    The practical problems of determining electron beam parameters for industrial irradiation processes are discussed. To assist the radiation engineer in this task, the physical aspects of electron beam absorption are briefly described. Formulas are derived for calculating the surface dose in the treated material using the electron energy, beam current and the area throughput rate of the conveyor. For thick absorbers, electron transport results are used to obtain the depth-dose distributions. From these the average dose in the material, D̄, and the beam power utilization efficiency, F_p, can be found by integration over the distributions. These concepts can be used to relate the electron beam power to the mass throughput rate. Qualitatively, the thickness of the material determines the beam energy, the area throughput rate and surface dose determine the beam current, while the mass throughput rate and average depth-dose determine the beam power requirements. Graphs are presented showing these relationships as a function of electron energy from 0.2 to 4.0 MeV for polystyrene. With this information, the determination of electron energy and current requirements is a relatively simple procedure.
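
    The energy-balance relationship behind these formulas is simple: beam power P [kW] = E [MeV] × I [mA], and since 1 kGy = 1 kJ/kg, the average dose D̄ [kGy] equals F_p × P divided by the mass throughput rate [kg/s]. A small sketch follows; the numbers and the default F_p are illustrative assumptions, not the paper's tabulated values.

    ```python
    def beam_current_required(e_mev, dose_avg_kgy, m_rate_kg_s, f_p=0.7):
        """Beam current [mA] needed for an average dose at a mass throughput.

        Uses D_avg [kGy] = F_p * P [kW] / m_rate [kg/s], P [kW] = E [MeV] * I [mA].
        """
        p_kw = dose_avg_kgy * m_rate_kg_s / f_p   # required beam power
        return p_kw / e_mev

    # e.g. 25 kGy average dose into 0.1 kg/s of product at 1.5 MeV
    print(round(beam_current_required(1.5, 25.0, 0.1), 2))   # ~2.38 mA
    ```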

  5. On the relative significance of lithospheric weakening mechanisms for sustained plate tectonics

    Science.gov (United States)

    Araceli Sanchez-Maes, Sophia

    2018-01-01

    Plate tectonics requires the bending of strong plates at subduction zones, which is difficult to achieve without a secondary weakening mechanism. Two classes of weakening mechanisms have been proposed for the generation of ongoing plate tectonics, distinguished by whether or not they require water. Here we show that the energy budget of global subduction zones offers a simple yet decisive test of their relative significance. Theoretical studies of mantle convection suggest that bending dissipation occupies only 10-20% of the total dissipation in the mantle, and our results indicate that the hydrous mechanism in the shallow part of plates is essential to satisfy the requirement. Thus, surface oceans are required for the long-term operation of plate tectonics on terrestrial worlds. Establishing this necessary and observable condition for sustained plate tectonics carries important implications for planetary habitability at large.

  6. Sampling methods

    International Nuclear Information System (INIS)

    Loughran, R.J.; Wallbrink, P.J.; Walling, D.E.; Appleby, P.G.

    2002-01-01

    Methods for the collection of soil samples to determine levels of ¹³⁷Cs and other fallout radionuclides, such as excess ²¹⁰Pb and ⁷Be, will depend on the purposes (aims) of the project, site and soil characteristics, analytical capacity, the total number of samples that can be analysed and the sample mass required. The latter two will depend partly on detector type and capabilities. A variety of field methods have been developed for different field conditions and circumstances over the past twenty years, many of them inherited or adapted from soil science and sedimentology. The use of ¹³⁷Cs in erosion studies has been widely developed, while the application of fallout ²¹⁰Pb and ⁷Be is still developing. Although it is possible to measure these nuclides simultaneously, it is common for experiments to be designed around the use of ¹³⁷Cs alone. Caesium studies typically involve comparison of the inventories found at eroded or sedimentation sites with that of a 'reference' site. An accurate characterization of the depth distribution of these fallout nuclides is often required in order to apply and/or calibrate the conversion models. However, depending on the tracer involved, the depth distribution, and thus the sampling resolution required to define it, differs. For example, a depth resolution of 1 cm is often adequate when using ¹³⁷Cs. However, fallout ²¹⁰Pb and ⁷Be commonly have very strong surface maxima that decrease exponentially with depth, and fine depth increments are required at or close to the soil surface. Consequently, different depth-incremental sampling methods are required when using different fallout radionuclides. Geomorphic investigations also frequently require determination of the depth-distribution of fallout nuclides on slopes and depositional sites as well as their total inventories.

  7. 1 CFR 21.21 - General requirements: References.

    Science.gov (United States)

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false General requirements: References. 21.21 Section... to test methods or consensus standards produced by a Federal agency that have replaced or preempted private or voluntary test methods or consensus standards in a subject matter area. (5) The reference is to...

  8. Reverse Schreinemakers Method for Experimental Analysis of Mixed-Solvent Electrolyte Systems

    DEFF Research Database (Denmark)

    Fosbøl, Philip Loldrup; Thomsen, Kaj; Stenby, Erling Halfdan

    2009-01-01

    A method based on Schreinemakers's tie-line theory of 1893 is derived for determining the composition and phase amounts in solubility experiments for multi-solvent electrolyte systems. The method uses the lever rule in reverse compared to Schreinemakers's wet residue method, and is therefore called the reverse Schreinemakers (RS) method. The method is based on simple mass balance principles similar to the wet residues method. It allows for accurate determination of the mixed-solvent phase composition even though part of the solvent may precipitate as complexes between solvent and salt. Discrepancies from determining the composition of salt mixtures by pH titration are discussed, and the derived method significantly improves the obtained result from titration. Furthermore, the method reduces the required experimental work needed for analysis of phase composition. The method is applicable to multi-solvent electrolyte systems.
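
    The lever-rule geometry can be shown in a few lines: each experiment gives a tie line through the saturated-solution point and the overall mixture point, and the solid-phase composition lies at the intersection of two such extrapolated lines. The coordinates below are made-up mass fractions for illustration only, not data from the paper.

    ```python
    import numpy as np

    def tie_line_intersection(s1, m1, s2, m2):
        """Solid composition where two extrapolated tie lines meet.
        Each line passes through a saturated-solution point s_i and the
        corresponding overall mixture point m_i (mass-fraction coordinates)."""
        d1 = np.subtract(m1, s1)
        d2 = np.subtract(m2, s2)
        # solve s1 + t1*d1 = s2 + t2*d2 for (t1, t2)
        A = np.column_stack([d1, -d2])
        t = np.linalg.lstsq(A, np.subtract(s2, s1), rcond=None)[0]
        return np.asarray(s1) + t[0] * d1

    # two hypothetical experiments, coordinates = (mass frac. salt, mass frac. water)
    solid = tie_line_intersection([0.20, 0.70], [0.41, 0.505],
                                  [0.30, 0.60], [0.54, 0.38])
    print(solid)   # ~[0.90, 0.05], the estimated solid-phase composition
    ```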

  9. On Structuring Subjective Judgements: Originality, Significance and Rigour in RAE2008

    Science.gov (United States)

    Johnston, Ron

    2008-01-01

    The 2008 United Kingdom Research Assessment Exercise will involve the evaluation of thousands of individual research outputs. The Funding Councils set three criteria for those evaluations--Originality, rigour and significance--and required each output to be placed into a fivefold categorisation of excellence, using absolute rather than relative…

  10. General review of quality assurance system requirements. The utility or customer requirement

    International Nuclear Information System (INIS)

    Fowler, J.L.

    1976-01-01

    What are the customer's Quality Assurance requirements, and how does he convey these to his contractor or apply them to himself? Many documents have been prepared, mostly by countries with high technology availability, and it is significant to note that many of the documents, particularly those of the United States of America, were prepared for nuclear safety-related plant, but the logic of these documents applies equally to heavy engineering projects that are cost effective, and this is the current thinking and practice within the CEGB (Central Electricity Generating Board). Some documents have legislative backing, others rely on contractual disciplines, but they all appear to repeat the same basic requirements, so why does one continue to write more documents? The basic problem is that customers have to satisfy differing national legislative, economic and commercial requirements and, like all discerning customers, wish to reserve the right to satisfy their own needs, which are very often highly specialized. The CEGB are aware of this problem and are actively co-operating with most of the national and international authorities who are leading in this field, with a view to obtaining compatibility of requirements, but there still remains the problem of satisfying national custom and practice. (author)

  11. Derivation of a No-Significant-Risk-Level (NSRL) for diethanolamine (DEA).

    Science.gov (United States)

    Wang, Bingxuan; Amacher, David E; Whittaker, Margaret H

    2014-02-01

    Diethanolamine (DEA) has been listed on the State of California's Proposition 65 List. This listing is based in part on tumors reported in a National Toxicology Program (NTP) 2-year dermal carcinogenicity study in mice, which found clear evidence of carcinogenic activity in B6C3F₁ mice based on increased incidences of liver neoplasms in both sexes, and increased incidences of renal tubule neoplasms in males. Although considerable controversy exists on the relevance of the NTP study to humans, industries are obligated to comply with the Proposition 65 labeling requirement and drinking water discharge prohibition, unless they are able to demonstrate that DEA levels in their products are below a specific No Significant Risk Level (NSRL). The State of California has not published an NSRL for DEA. In this article, an NSRL of 5.6 μg/day and a life-stage-adjusted NSRL(adj) of 1.4 μg/day are derived from the NTP carcinogenicity study using a benchmark dose modeling method based on the incidence of hepatocellular carcinomas in female mice, in accordance with the guidelines of California EPA. Copyright © 2013 Elsevier Inc. All rights reserved.
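
    The Proposition 65 arithmetic linking a cancer slope factor to an NSRL is a one-liner: the NSRL is the intake producing a 1-in-100,000 lifetime cancer risk. The sketch below back-solves an illustrative human slope factor so that the output matches the article's 5.6 μg/day; the slope value itself is an assumption, not a figure taken from the paper.

    ```python
    def nsrl_ug_per_day(slope_human, risk=1e-5, body_weight_kg=70.0):
        """NSRL [ug/day]: daily intake at the no-significant-risk level.

        slope_human: cancer slope factor [(mg/kg-day)^-1], e.g. from BMD modeling.
        """
        dose_mg_kg_day = risk / slope_human
        return dose_mg_kg_day * body_weight_kg * 1000.0

    print(nsrl_ug_per_day(0.125))   # 5.6 ug/day with this illustrative slope
    ```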

  12. Impact significance determination-Designing an approach

    International Nuclear Information System (INIS)

    Lawrence, David P.

    2007-01-01

    The question of how best to go about determining the significance of impacts has, to date, only been addressed in a partial and preliminary way. The assumption tends to be made that it is either only necessary to provide explicit, justified reasons for a judgment about significance and/or to explicitly apply a prescribed procedure-a procedure usually involving the staged application of thresholds and/or criteria. The detailed attributes, strengths and limitations of such approaches and possible alternative approaches have yet to be explored systematically. This article addresses these deficiencies by analyzing the characteristics, specific methods and positive and negative tendencies of three general impact significance determination approaches-the technical approach, the collaborative approach and the reasoned argumentation approach. A range of potential composite approaches are also described. With an enhanced understanding of these approaches, together with potential combinations, EIA practitioners and other EIA participants can be in a better position to select an approach appropriate to their needs, to reinforce the positive tendencies and offset the negative tendencies of the selected approach and to combine the best qualities of more than one approach

  13. A Fast LMMSE Channel Estimation Method for OFDM Systems

    Directory of Open Access Journals (Sweden)

    Zhou Wen

    2009-01-01

    Full Text Available A fast linear minimum mean square error (LMMSE) channel estimation method has been proposed for Orthogonal Frequency Division Multiplexing (OFDM) systems. In comparison with the conventional LMMSE channel estimation, the proposed channel estimation method does not require the statistic knowledge of the channel in advance and avoids the inverse operation of a large dimension matrix by using the fast Fourier transform (FFT) operation. Therefore, the computational complexity can be reduced significantly. The normalized mean square errors (NMSEs) of the proposed method and the conventional LMMSE estimation have been derived. Numerical results show that the NMSE of the proposed method is very close to that of the conventional LMMSE method, which is also verified by computer simulation. In addition, computer simulation shows that the performance of the proposed method is almost the same with that of the conventional LMMSE method in terms of bit error rate (BER).
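
    The computational trick described, replacing the LMMSE matrix inverse with FFTs, can be sketched by approximating the channel frequency-correlation matrix as circulant so the Wiener filter is diagonal in the DFT basis. This is a generic illustration of that idea (the correlation profile, SNR and the 16-QAM constant beta = 17/9 are assumptions), not the paper's exact estimator.

    ```python
    import numpy as np

    def lmmse_fft(h_ls, r_first_col, snr, beta=17/9):
        """Smooth least-squares channel estimates h_ls with an FFT-diagonalized
        LMMSE filter W = R (R + (beta/snr) I)^-1, R approximated as circulant."""
        lam = np.maximum(np.fft.fft(r_first_col).real, 0.0)  # eigenvalues of R
        gain = lam / (lam + beta / snr)                      # per-mode Wiener gain
        return np.fft.ifft(gain * np.fft.fft(h_ls))

    # toy usage on N subcarriers with an assumed correlation profile
    N = 64
    k = np.minimum(np.arange(N), N - np.arange(N))           # circular lag
    r = 0.9 ** k
    rng = np.random.default_rng(1)
    h_true = np.fft.fft(np.r_[rng.standard_normal(8), np.zeros(N - 8)]) / np.sqrt(8)
    h_ls = h_true + 0.2 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    h_hat = lmmse_fft(h_ls, r, snr=25.0)
    print(np.mean(np.abs(h_hat - h_true) ** 2) < np.mean(np.abs(h_ls - h_true) ** 2))
    ```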

  14. Roles and methods of performance evaluation of hospital academic leadership.

    Science.gov (United States)

    Zhou, Ying; Yuan, Huikang; Li, Yang; Zhao, Xia; Yi, Lihua

    2016-01-01

    The rapidly advancing implementation of public hospital reform urgently requires the identification and classification of a pool of exceptional medical specialists, corresponding with incentives to attract and retain them, providing a nucleus of distinguished expertise to ensure public hospital preeminence. This paper examines the significance of academic leadership, from a strategic management perspective, including various tools, methods and mechanisms used in the theory and practice of performance evaluation, and employed in the selection, training and appointment of academic leaders. Objective methods of assessing leadership performance are also provided for reference.

  15. An anaesthetic pre-operative assessment clinic reduces pre-operative inpatient stay in patients requiring major vascular surgery.

    LENUS (Irish Health Repository)

    O'Connor, D B

    2012-02-01

    BACKGROUND: Patients undergoing major vascular surgery (MVS) require extensive anaesthetic assessment. This can require extended pre-operative stays. AIMS: We investigated whether a newly established anaesthetic pre-operative assessment clinic (PAC) would reduce the pre-operative inpatient stay, avoid unnecessary investigations and facilitate day-before-surgery (DBS) admissions for patients undergoing MVS. PATIENTS AND METHODS: For the year preceding and the year following the establishment of the PAC, the records of patients undergoing open or endovascular aortic aneurysm repair, carotid endarterectomy and infra-inguinal bypass were reviewed to measure pre-operative length of stay (LoS). RESULTS: Pre-operative LoS was significantly reduced in the study period (1.85 vs. 4.2 days, P < 0.0001). Only 12 out of 61 patients in 2007 were admitted on the DBS, and this increased to 33 out of 63 patients (P = 0.0002). No procedure was cancelled for medical reasons. CONCLUSION: The PAC has facilitated accurate outpatient anaesthetic assessment for patients requiring MVS. The pre-operative in-patient stay has been significantly reduced.

  16. Histological staining methods preparatory to laser capture microdissection significantly affect the integrity of the cellular RNA

    OpenAIRE

    Wang, Hongyang; Owens, James D; Shih, Joanna H; Li, Ming-Chung; Bonner, Robert F; Mushinski, J Frederic

    2006-01-01

    Abstract Background Gene expression profiling by microarray analysis of cells enriched by laser capture microdissection (LCM) faces several technical challenges. Frozen sections yield higher-quality RNA than paraffin-embedded sections, but even with frozen sections, the staining methods used for histological identification of cells of interest could still damage the mRNA in the cells. To study the contribution of staining methods to degradation of results from gene expression profiling of LCM...

  17. An Approach for Integrating the Prioritization of Functional and Nonfunctional Requirements

    Directory of Open Access Journals (Sweden)

    Mohammad Dabbagh

    2014-01-01

    Full Text Available Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with the two state-of-the-art-based approaches, analytic hierarchy process (AHP) and hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time-consumption while preserving the quality of the results obtained by our proposed approach at a high level of agreement in comparison with the results produced by the other two approaches.

  18. An approach for integrating the prioritization of functional and nonfunctional requirements.

    Science.gov (United States)

    Dabbagh, Mohammad; Lee, Sai Peck

    2014-01-01

    Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with the two state-of-the-art-based approaches, analytic hierarchy process (AHP) and hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time-consumption while preserving the quality of the results obtained by our proposed approach at a high level of agreement in comparison with the results produced by the other two approaches.

  19. Use of simple transport equations to estimate waste package performance requirements

    International Nuclear Information System (INIS)

    Wood, B.J.

    1982-01-01

    A method of developing waste package performance requirements for specific nuclides is described. The method is based on: Federal regulations concerning permissible concentrations in solution at the point of discharge to the accessible environment; a simple and conservative transport model; baseline and potential worst-case release scenarios. Use of the transport model enables calculation of maximum permissible release rates within a repository in basalt for each of the scenarios. The maximum permissible release rates correspond to performance requirements for the engineered barrier system. The repository was assumed to be constructed in a basalt layer. For the cases considered, including a well drilled into an aquifer 1750 m from the repository center, little significant advantage is obtained from a 1000-yr as opposed to a 100-yr waste package. A 1000-yr waste package is of importance only for nuclides with half-lives much less than 100 yr which travel to the accessible environment in much less than 1000 yr. Such short travel times are extremely unlikely for a mined repository. Among the actinides, the most stringent maximum permissible release rates are for ²³⁶U and ²³⁴U. A simple solubility calculation suggests, however, that these performance requirements can be readily met by the engineered barrier system. Under the reducing conditions likely to occur in a repository located in basalt, uranium would be sufficiently insoluble that no solution could contain more than about 0.01% of the maximum permissible concentration at saturation. The performance requirements derived from the one-dimensional modeling approach are conservative by at least one to two orders of magnitude. More quantitative three-dimensional modeling at specific sites should enable relaxation of the performance criteria derived in this study. 12 references, 8 figures, 8 tables
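
    The screening calculation described can be condensed to one line: a release rate R from the engineered barriers, diluted into a groundwater flow Q and decaying over the travel time, must arrive below the maximum permissible concentration (MPC). A rough sketch with illustrative parameter values (not the study's data) follows.

    ```python
    import math

    def max_permissible_release(mpc_ci_m3, q_m3_yr, half_life_yr, travel_time_yr):
        """Release rate [Ci/yr] that just meets the MPC at the discharge point,
        crediting only dilution in flow Q and radioactive decay in transit."""
        decay = math.exp(math.log(2.0) / half_life_yr * travel_time_yr)
        return mpc_ci_m3 * q_m3_yr * decay

    # a long-lived nuclide gets almost no decay credit over a 10,000-yr travel
    # time, so the barrier requirement is essentially MPC * Q
    print(max_permissible_release(3e-6, 1e4, 2.45e5, 1e4))
    ```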

  20. A goal-oriented requirements modelling language for enterprise architecture

    NARCIS (Netherlands)

    Quartel, Dick; Engelsman, W.; Jonkers, Henk; van Sinderen, Marten J.

    2009-01-01

    Methods for enterprise architecture, such as TOGAF, acknowledge the importance of requirements engineering in the development of enterprise architectures. Modelling support is needed to specify, document, communicate and reason about goals and requirements. Current modelling techniques for

  1. Environmental impact statement requirements for CNEA

    International Nuclear Information System (INIS)

    Ciurciolo, Melisa N.; Mender, J. A.

    2009-01-01

    The purpose of this paper is to describe the legal framework on Environmental Impact Assessment (EIA) regarding the activities of the National Atomic Energy Commission (Comision Nacional de Energia Atomica, CNEA), and particularly, the Procedure for Internal Management of Environmental Impact Statements of CNEA (PN-PR-027). According to the distribution of powers stated in article 41 of the National Constitution, the environmental legal framework is constituted by National minimum standards for environmental protection and complementary provincial and municipal regulations. As a result, the EIA legal framework is not uniform across the Nation, and therefore, it differs according to the jurisdiction in which the activity subject to EIA is developed. Notwithstanding, the General Statute of the Environment (25.675) requires an EIA for any project or activity developed in the National territory that may cause significant degradation to the environment or any of its components, or significantly affect the population's quality of life. Since CNEA develops activities throughout the National territory, it is not possible to determine a uniform legal EIA framework for the entire Institution. Consequently, the binding requirements for Environmental Impact Statements (EISs) of CNEA activities differ among the activities developed in the different locations and atomic centers. In order to achieve a uniform environmental performance in CNEA, it has been considered necessary to standardize internally the binding requirements for EISs by means of a procedure written within the framework of the Institution's Environmental Management System. The purpose of the Procedure for Internal Management of Environmental Impact Statements is to determine the requirements to be complied with by the atomic centers, locations and enterprises associated with CNEA, regarding EIS Management. This Procedure shall apply to those projects and activities subjected to EIA, according to a

  2. REQUIREMENTS FOR IMAGE QUALITY OF EMERGENCY SPACECRAFTS

    Directory of Open Access Journals (Sweden)

    A. I. Altukhov

    2015-05-01

    Full Text Available The paper deals with a method for the formation of quality requirements for images of emergency spacecraft. The images are obtained by remote sensing of the near-earth orbital environment in the visible range of electromagnetic radiation. The method is based on jointly taking into account the conditions of the space survey, the characteristics of the surveillance equipment, the main design features of the observed spacecraft and the orbital inspection tasks. Method. The quality score is the predicted linear resolution of the image, which makes it possible to form a complete view of the pictorial properties of the space image obtained by the electro-optical system of the observing satellite. Requirements on the numerical value of this indicator are formulated on the basis of the properties of the remote sensing system, which forms images in the conditions of outer space, and the properties of the observed emergency spacecraft: its dimensions, platform construction and on-board equipment placement. To implement the method, the authors have developed a predictive model of the requirements on linear resolution for images of emergency spacecraft, making it possible to select the intervals of space shooting and obtain the satellite images required for quality interpretation. Main results. To verify the functionality of the proposed model, we have carried out calculations of the numerical values of the linear resolution of the image that ensure the successful task of determining gross structural damage to spacecraft and identifying changes in their spatial orientation. As input data, dimensions and geometric primitives corresponding to the shapes of the inspected spacecraft "Resurs-P", "Canopus-B" and "Electro-L" were used. Numerical values of the linear resolution of the images have been obtained that ensure the successful solution of the task of determining gross structural damage to spacecraft.
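    The record does not give the predictive model itself; as a first-order illustration of how a linear resolution requirement constrains the survey geometry, the Python sketch below uses only the pinhole-camera relation (the pixel pitch and focal length are hypothetical, and a real estimate would also fold in smear, diffraction and lighting conditions).

        def linear_resolution_m(range_m, pixel_pitch_m, focal_length_m):
            """Target-plane footprint of one detector pixel at a given range --
            a minimal stand-in for the paper's predictive model."""
            return range_m * pixel_pitch_m / focal_length_m

        # Hypothetical electro-optical payload: 5 um pixels, 0.5 m focal length.
        for rng in (100.0, 500.0, 2000.0):   # inspection ranges in metres
            res_cm = linear_resolution_m(rng, 5e-6, 0.5) * 100
            print(f"{rng:6.0f} m -> {res_cm:5.1f} cm per pixel")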

  3. The significance and perspectives of role playing as a teaching method in process oriented nursing education

    OpenAIRE

    小松, 万喜子; 冨岡, 詔子; 山崎, 章恵; 柳澤, 節子; 百瀬, 由美子

    1999-01-01

    This study reports the results of students' responses to role playing, which has been introduced as a teaching method in adult nursing clinical practice. About 80% of the 79 participating students positively evaluated their experiences in role playing as useful to their clinical practice, while about 20% evaluated it negatively as not useful. The overall results suggest the usefulness of role playing as a method to enhance students' general readiness for their clinical practice. As students with negativ...

  4. Detecting significant changes in protein abundance

    Directory of Open Access Journals (Sweden)

    Kai Kammers

    2015-06-01

    Full Text Available We review and demonstrate how an empirical Bayes method, shrinking a protein's sample variance towards a pooled estimate, leads to far more powerful and stable inference for detecting significant changes in protein abundance compared to ordinary t-tests. Using examples from isobaric mass-labelled proteomic experiments, we show how to analyze data from multiple experiments simultaneously, and discuss the effects of missing data on the inference. We also present easy-to-use open-source software for normalization of mass spectrometry data and inference based on moderated test statistics.
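    As an illustration of the shrinkage at the heart of the method, the Python sketch below computes a limma-style moderated t-statistic for a single protein; the prior parameters d0 and s0_sq are assumed known here, whereas the published approach estimates them jointly from all proteins in the data set.

        import numpy as np

        def moderated_t(x, y, d0=4.0, s0_sq=0.04):
            """Empirical-Bayes moderated t-statistic for one protein: the
            sample variance is shrunk towards the prior estimate s0_sq,
            which carries d0 prior degrees of freedom."""
            nx, ny = len(x), len(y)
            d = nx + ny - 2                                   # residual df
            s_sq = ((nx - 1) * np.var(x, ddof=1) +
                    (ny - 1) * np.var(y, ddof=1)) / d         # pooled variance
            s_tilde_sq = (d0 * s0_sq + d * s_sq) / (d0 + d)   # shrunken variance
            se = np.sqrt(s_tilde_sq * (1 / nx + 1 / ny))
            return (np.mean(x) - np.mean(y)) / se             # ~ t with d0 + d df

        rng = np.random.default_rng(0)
        print(moderated_t(rng.normal(0.5, 0.2, 4), rng.normal(0.0, 0.2, 4)))

    With only a handful of replicates per condition, the extra d0 degrees of freedom are what stabilize the denominator and give the power gain over an ordinary t-test.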

  5. Non-stationarities significantly distort short-term spectral, symbolic and entropy heart rate variability indices

    International Nuclear Information System (INIS)

    Magagnin, Valentina; Bassani, Tito; Bari, Vlasta; Turiel, Maurizio; Porta, Alberto; Maestri, Roberto; Pinna, Gian Domenico

    2011-01-01

    Autonomic regulation is non-invasively estimated from heart rate variability (HRV). Many methods utilized to assess autonomic regulation require stationarity of HRV recordings. However, non-stationarities are frequently present even during well-controlled experiments, thus potentially biasing HRV indices. The aim of our study is to quantify the potential bias of spectral, symbolic and entropy HRV indices due to non-stationarities. We analyzed HRV series recorded in healthy subjects during uncontrolled daily life activities typical of 24 h Holter recordings and during predetermined levels of robotic-assisted treadmill-based physical exercise. A stationarity test checking the stability of the mean and variance over short HRV series (about 300 cardiac beats) was utilized to distinguish stationary periods from non-stationary ones. Spectral, symbolic and entropy indices evaluated solely over stationary periods were contrasted with those derived from all the HRV segments. When indices were calculated solely over stationary series, we found that (i) during both uncontrolled daily life activities and controlled physical exercise, the entropy-based complexity indices were significantly larger; (ii) during uncontrolled daily life activities, the spectral and symbolic indices linked to sympathetic modulation were significantly smaller and those associated with vagal modulation were significantly larger; (iii) while during uncontrolled daily life activities, the variance of spectral, symbolic and entropy rate indices was significantly larger, during controlled physical exercise, it was smaller. The study suggests that non-stationarities increase the likelihood of overestimating the contribution of sympathetic control and affect the power of statistical tests utilized to discriminate conditions and/or groups.
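    The paper's exact stationarity test is not reproduced in the abstract; the Python sketch below shows the general shape of such a screen over a ~300-beat RR series, checking stability of the mean with a one-way ANOVA and of the variance with Levene's test across segments (the segment count and significance level are assumptions for illustration).

        import numpy as np
        from scipy import stats

        def is_stationary(rr_ms, n_segments=10, alpha=0.05):
            """Crude screen: accept the series as stationary only if neither
            the segment means nor the segment variances differ significantly."""
            segs = np.array_split(np.asarray(rr_ms, dtype=float), n_segments)
            _, p_mean = stats.f_oneway(*segs)
            _, p_var = stats.levene(*segs)
            return p_mean > alpha and p_var > alpha

        rng = np.random.default_rng(1)
        steady = 800 + 25 * rng.standard_normal(300)     # stable RR series (ms)
        drifting = steady + np.linspace(0, 120, 300)     # slow trend in the mean
        print(is_stationary(steady), is_stationary(drifting))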

  6. Management competencies required in the transition from a technician to a supervisor

    Directory of Open Access Journals (Sweden)

    Sibongile R. Mahlangu (Kubheka)

    2015-05-01

    Full Text Available Orientation: Technicians are frequently promoted to supervisory positions based on their technical abilities, with scant attention focused on developing management competencies. This oversight often poses significant challenges. The effective transition from technician to supervisor is important in any organisation. Research objective: The primary objective is to identify and verify the competencies that are required for a technician and for a supervisory position; the secondary objective is to identify the gap that must be filled with relevant training interventions to enable technicians to make an effective transition to a supervisory position. Motivation for this study: The identification of the management competencies required by a technician who makes a career change to a supervisory position. Research method: A sequential mixed-method approach was used to enable the two-phase data collection process: phase one was the quantitative phase and phase two was the qualitative phase. Main findings: The overall findings confirm that there are management competencies on which technicians require training and development before being promoted to a supervisory position. Implication: Organisations need to identify the key competencies for a technician and a supervisor and implement development or training interventions that are essential for successfully transitioning an employee from the level of a technician to the level of a supervisor. Contribution: Organisations need to implement essential development or training interventions focused on developing management competencies, and put in place support interventions such as coaching, job shadowing, mentoring and networking.

  7. Clinical significance of the fabella

    International Nuclear Information System (INIS)

    Dodevski, A.; Lazarova-Tosovska, D.; Zhivadinovik, J.; Lazareska, M.

    2012-01-01

    Full text: Introduction: There is a variable number of sesamoid bones in the human body; one of them is the fabella, located in the tendon of the gastrocnemius muscle. The aim of this study was to investigate the frequency of occurrence of the fabella in the Macedonian population and to discuss the clinical importance of this bone. Materials and methods: We retrospectively examined radiographs of 53 patients who had knee exams undertaken for a variety of clinical reasons, performed as part of their medical treatment. Over a time span of six months, 53 patients (38 males and 15 females, age range 19-60 years, mean age 36.7±12.3 years) were examined. Results: In seven (13.2%) of the 53 analyzed reports, a fabella was found in the lateral tendon of the gastrocnemius muscle. We did not find a significant gender or side difference in the appearance of the fabella. Conclusion: Although anatomic studies have emphasized a lack of significance of the fabella, this bone has been associated with a spectrum of pathology affecting the knee, such as fabellar syndrome, peroneal nerve injury and fracture. We should think of this sesamoid bone while performing diagnostic and surgical procedures.

  8. Cognition and procedure representational requirements for predictive human performance models

    Science.gov (United States)

    Corker, K.

    1992-01-01

    Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that describe procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode, in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulative and executable representation of aircrew and procedures that is generally applicable to crew/procedure task analysis. The representation supports developed methods of intent inference, and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods.

  9. Tillage methods and mulch on water saving and yield of spring maize in Chitwan

    Directory of Open Access Journals (Sweden)

    Ishwari Prasad Upadhyay

    2016-12-01

    Full Text Available Tillage methods and mulch influence the productivity and water requirement of spring maize; hence a field experiment was conducted at the National Maize Research Program, Rampur, in the spring seasons of 2011 and 2012 with the objective of evaluating different tillage methods with and without mulch on the water requirement and grain yield of spring maize. The experiment was laid out in a two-factor factorial randomized complete block design with three replications. The treatments consisted of tillage methods (permanent bed, zero tillage and conventional tillage) and mulch (with and without). Irrigation timing was fixed at the knee-high stage, tasseling stage and milking/dough stage. Data on number of plants, number of ears, thousand-grain weight and grain yield were recorded and analysed using GenStat. The two years' combined results showed that tillage methods and mulch significantly influenced the grain yield and water requirement of spring maize. The maize grain yield was highest in permanent beds with mulch (4626 kg ha⁻¹) followed by zero tillage with mulch (3838 kg ha⁻¹), whereas the total water applied during the crop period was highest in conventional tillage without mulch, followed by conventional tillage with mulch. The permanent bed with mulch increased the yield and reduced the water requirement of spring maize in Chitwan.

  10. Impacts modeling using the SPH particulate method. Case study; Modelisation d'impacts par la methode particulaire SPH. Etude de cas

    Energy Technology Data Exchange (ETDEWEB)

    Debord, R

    1999-07-01

    The aim of this study is the modeling of the impact of melted metal on the reactor vessel head in the case of a core-meltdown accident. Modeling using the classical finite-element method alone is not sufficient; it requires coupling with particulate methods in order to take into account the behaviour of the corium. After a general introduction to particulate methods, the Nabor and SPH (smoothed particle hydrodynamics) methods are described. Then, the theoretical and numerical reliability of the SPH method is assessed using simple cases. In particular, the number of neighbours significantly influences the precision of the calculations. Also, the mesh of the structure must be adapted to the mesh of the fluid in order to reduce edge effects. Finally, this study has shown that the values of the artificial velocity coefficients used in the simulation of the BERDA test performed by FZK Karlsruhe (Germany) are not correct. The domain of use of these coefficients was determined for a low-speed impact. (J.S.)
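    To make the neighbour-count and edge-effect remarks concrete, here is a self-contained Python sketch of 1D SPH summation density with the standard cubic-spline kernel (the study's own 3D impact setting is far richer; this only isolates the two numerical effects named in the abstract).

        import numpy as np

        def cubic_spline_kernel(r, h):
            """Standard 1D cubic-spline SPH kernel W(r, h)."""
            q = np.abs(r) / h
            sigma = 2.0 / (3.0 * h)                      # 1D normalization
            w = np.where(q < 1.0, 1 - 1.5 * q**2 + 0.75 * q**3,
                np.where(q < 2.0, 0.25 * (2 - q)**3, 0.0))
            return sigma * w

        def sph_density(x, m, h):
            """Summation density rho_i = sum_j m_j W(|x_i - x_j|, h)."""
            dx = x[:, None] - x[None, :]
            return (m[None, :] * cubic_spline_kernel(dx, h)).sum(axis=1)

        x = np.linspace(0.0, 1.0, 101)       # uniformly spaced particles
        m = np.full_like(x, 1.0 / 100)       # unit total mass => rho ~ 1
        for h in (0.01, 0.02, 0.05):         # more neighbours as h grows
            rho = sph_density(x, m, h)
            print(f"h={h}: interior rho ~ {rho[50]:.4f}, edge rho ~ {rho[0]:.4f}")

    The interior density is reproduced almost exactly once enough neighbours fall inside the 2h kernel support, while the value at the boundary is roughly halved by kernel deficiency — the edge effect that makes matching the structure and fluid discretizations important.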

  11. Role of Barium Swallow in Diagnosing Clinically Significant Anastomotic Leak following Esophagectomy

    Directory of Open Access Journals (Sweden)

    Simon Roh

    2016-04-01

    Full Text Available Background: Barium swallow is performed following esophagectomy to evaluate the anastomosis for detection of leaks and to assess the emptying of the gastric conduit. The aim of this study was to evaluate the reliability of the barium swallow study in diagnosing anastomotic leaks following esophagectomy. Methods: Patients who underwent esophagectomy from January 2000 to December 2013 at our institution were investigated. Barium swallow was routinely done between days 5–7 to detect a leak. These results were compared to clinically determined leaks (defined by neck wound infection requiring jejunal feeds and/or parenteral nutrition during the postoperative period). The sensitivity and specificity of barium swallow in diagnosing clinically significant anastomotic leaks was determined. Results: A total of 395 esophagectomies were performed (mean age, 62.2 years). The indications for the esophagectomy were as follows: malignancy (n=320), high-grade dysplasia (n=14), perforation (n=27), benign stricture (n=7), achalasia (n=16), and other (n=11). A variety of techniques were used, including transhiatal (n=351), McKeown (n=35), and Ivor Lewis (n=9) esophagectomies. Operative mortality was 2.8% (n=11). Three hundred and sixty-eight patients (93%) underwent barium swallow study after esophagectomy. Clinically significant anastomotic leak was identified in 36 patients (9.8%). Barium swallow was able to detect only 13/36 clinically significant leaks. The sensitivity of the swallow in diagnosing a leak was 36% and specificity was 97%. The positive and negative predictive values of barium swallow study in detecting leaks were 59% and 93%, respectively. Conclusion: Barium swallow is an insensitive but specific test for detecting leaks at the cervical anastomotic site after esophagectomy.
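    The reported percentages follow directly from the 2x2 diagnostic table. In the Python sketch below, TP = 13 and FN = 23 come from the abstract, while FP and TN are back-calculated estimates from the quoted 97% specificity among the 368 - 36 = 332 swallowed patients without a clinical leak, so treat them as approximate.

        tp, fn = 13, 23          # detected / missed clinical leaks (reported)
        fp, tn = 9, 323          # back-calculated from ~97% specificity

        sensitivity = tp / (tp + fn)       # 13/36   ~ 36%
        specificity = tn / (tn + fp)       # 323/332 ~ 97%
        ppv = tp / (tp + fp)               # 13/22   ~ 59%
        npv = tn / (tn + fn)               # 323/346 ~ 93%
        print(f"sens {sensitivity:.0%}, spec {specificity:.0%}, "
              f"PPV {ppv:.0%}, NPV {npv:.0%}")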

  12. Functional requirements regarding medical registries--preliminary results.

    Science.gov (United States)

    Oberbichler, Stefan; Hörbst, Alexander

    2013-01-01

    The term medical registry is used to reference tools and processes that support clinical or epidemiologic research or provide a data basis for decisions regarding health care policies. In spite of this wide range of applications, the term registry and the functional requirements which a registry should support are not clearly defined. This work presents preliminary results of a literature review to discover the functional requirements which constitute a registry. To extract these requirements, a set of peer-reviewed articles was collected. This set of articles was screened using methods from qualitative research. Up to now, most discovered functional requirements focus on data quality (e.g., preventing transcription errors by conducting automatic domain checks).

  13. A methodology to describe process control requirements

    International Nuclear Information System (INIS)

    Carcagno, R.; Ganni, V.

    1994-01-01

    This paper presents a methodology to describe process control requirements for helium refrigeration plants. The SSC requires a greater level of automation for its refrigeration plants than is common in the cryogenics industry, and traditional methods (e.g., written descriptions) used to describe process control requirements are not sufficient. The methodology presented in this paper employs tabular and graphic representations in addition to written descriptions. The resulting document constitutes a tool for efficient communication among the different people involved in the design, development, operation, and maintenance of the control system. The methodology is not limited to helium refrigeration plants, and can be applied to any process with similar requirements. The paper includes examples.

  14. Significantly improving nuclear resonance fluorescence non-destructive assay by using the integral resonance transmission method and photofission

    International Nuclear Information System (INIS)

    Angell, Christopher T.; Hayakawa, Takehito; Shizuma, Toshiyuki; Hajima, Ryoichi

    2013-01-01

    Non-destructive assay (NDA) of ²³⁹Pu in spent nuclear fuel or melted fuel using a γ-ray beam is possible using self-absorption and the integral resonance transmission method. The method uses nuclear resonance absorption, where resonances in ²³⁹Pu remove photons from the beam, and the selective absorption is detected by measuring the decrease in scattering in a witness target placed in the beam after the fuel, consisting of the isotope of interest, namely ²³⁹Pu. The method is isotope-specific, and can use photofission or scattered γ-rays to assay the ²³⁹Pu. It overcomes several problems related to NDA of melted fuel, including the radioactivity of the fuel and the unknown composition and geometry. This talk will explain the general method and how photofission can be used to assay specific isotopes, and present example calculations. (author)

  15. An analytical approach to customer requirement information processing

    Science.gov (United States)

    Zhou, Zude; Xiao, Zheng; Liu, Quan; Ai, Qingsong

    2013-11-01

    'Customer requirements' (CRs) management is a key component of customer relationship management (CRM). By processing customer-focused information, CRs management plays an important role in enterprise systems (ESs). Although two main CRs analysis methods, quality function deployment (QFD) and Kano model, have been applied to many fields by many enterprises in the past several decades, the limitations such as complex processes and operations make them unsuitable for online businesses among small- and medium-sized enterprises (SMEs). Currently, most SMEs do not have the resources to implement QFD or Kano model. In this article, we propose a method named customer requirement information (CRI), which provides a simpler and easier way for SMEs to run CRs analysis. The proposed method analyses CRs from the perspective of information and applies mathematical methods to the analysis process. A detailed description of CRI's acquisition, classification and processing is provided.

  16. Significant wave height retrieval from synthetic radar images

    NARCIS (Netherlands)

    Wijaya, Andreas Parama; van Groesen, Embrecht W.C.

    2014-01-01

    In many offshore activities radar imagery is used to observe and predict ocean waves. An important issue in analyzing the radar images is to resolve the significant wave height. Different from 3DFFT methods that use an estimate related to the square root of the signal-to-noise ratio of radar images,

  17. Optimized Clustering Estimators for BAO Measurements Accounting for Significant Redshift Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Ashley J. [Portsmouth U., ICG; Banik, Nilanjan [Fermilab; Avila, Santiago [Madrid, IFT; Percival, Will J. [Portsmouth U., ICG; Dodelson, Scott [Fermilab; Garcia-Bellido, Juan [Madrid, IFT; Crocce, Martin [ICE, Bellaterra; Elvin-Poole, Jack [Jodrell Bank; Giannantonio, Tommaso [Cambridge U., KICC; Manera, Marc [Cambridge U., DAMTP; Sevilla-Noarbe, Ignacio [Madrid, CIEMAT

    2017-05-15

    We determine an optimized clustering statistic to be used for galaxy samples with significant redshift uncertainty, such as those that rely on photometric redshifts. To do so, we study the BAO information content as a function of the orientation of galaxy clustering modes with respect to their angle to the line-of-sight (LOS). The clustering along the LOS, as observed in a redshift-space with significant redshift uncertainty, has contributions from clustering modes with a range of orientations with respect to the true LOS. For redshift uncertainty $\sigma_z \geq 0.02(1+z)$ we find that while the BAO information is confined to transverse clustering modes in the true space, it is spread nearly evenly in the observed space. Thus, measuring clustering in terms of the projected separation (regardless of the LOS) is an efficient and nearly lossless compression of the signal for $\sigma_z \geq 0.02(1+z)$. For reduced redshift uncertainty, a more careful consideration is required. We then use more than 1700 realizations of galaxy simulations mimicking the Dark Energy Survey Year 1 sample to validate our analytic results and optimized analysis procedure. We find that using the correlation function binned in projected separation, we can achieve uncertainties that are within 10 per cent of those predicted by Fisher matrix forecasts. We predict that DES Y1 should achieve a 5 per cent distance measurement using our optimized methods. We expect the results presented here to be important for any future BAO measurements made using photometric redshift data.
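    The projected-separation statistic the authors advocate reduces, for a single pair, to a transverse comoving distance that discards the line-of-sight split entirely. A minimal Python sketch (small-angle approximation; the comoving distances and angle are invented for illustration):

        import numpy as np

        def projected_separation(chi1, chi2, angle_rad):
            """Comoving transverse separation of a galaxy pair, ignoring the
            line-of-sight component -- the nearly lossless compression for
            sigma_z >= 0.02(1+z). chi1, chi2 are comoving distances (Mpc/h)
            inferred from (photometric) redshifts."""
            return 0.5 * (chi1 + chi2) * angle_rad   # small-angle approximation

        # Hypothetical pair roughly 1 degree apart at chi ~ 2300 Mpc/h.
        print(projected_separation(2290.0, 2310.0, np.radians(1.0)))  # ~40 Mpc/h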

  18. Extending enterprise architecture modelling with business goals and requirements

    NARCIS (Netherlands)

    Engelsman, W.; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten J.

    The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling

  19. Vacuum-venipuncture skills: time required and importance of tube order

    Directory of Open Access Journals (Sweden)

    Fujii C

    2013-08-01

    Full Text Available Chieko Fujii, Faculty of Nursing and Medical Care, Keio University, Fujisawa, Japan. Background: The purpose of this study was to assess specific vacuum-venipuncture skills and the influence of the time involved in skin puncture and blood collection. Methods: Thirty subjects undergoing venipuncture in which video analysis was possible were included. These procedures were carried out by four nurses and recorded with a digital camera. Venipuncture skills classified by our observations were delineated on the basis of frame-by-frame video images, and a graph of x and y coordinates was created. Results: With the first blood-collection tube, strong blood flow required the practitioner to push the tube back in to compensate for the strong repulsive force in approximately 46% of cases. By the third blood-collection tube, the blood flow had weakened; therefore, the tube was moved up and down. In cases that required a second venipuncture, the tube had already been pierced, so the time required to fill it to 5 mL was significantly longer. Conclusion: The hand movement of the practitioner is adjusted according to blood flow. Reflex movement in response to strong blood flow may increase the risk of pushing the needle through the vein with excessive force. The time required to fill the tube varies with the nurse, tube order, and level of venipuncture skill. Keywords: blood collection, blood-collection tube, clinical practice, venipuncture skill

  20. Two-level method with coarse space size independent convergence

    Energy Technology Data Exchange (ETDEWEB)

    Vanek, P.; Brezina, M. [Univ. of Colorado, Denver, CO (United States); Tezaur, R.; Krizkova, J. [UWB, Plzen (Czech Republic)

    1996-12-31

    The basic disadvantage of the standard two-level method is the strong dependence of its convergence rate on the size of the coarse-level problem. In order to obtain the optimal convergence result, one is limited to using a coarse space which is only a few times smaller than the size of the fine-level one. Consequently, the asymptotic cost of the resulting method is the same as in the case of using a coarse-level solver for the original problem. Today's two-level domain decomposition methods typically offer an improvement by yielding a rate of convergence which depends on the ratio of fine and coarse level only polylogarithmically. However, these methods require the use of local subdomain solvers for which straightforward application of iterative methods is problematic, while the usual application of direct solvers is expensive. We suggest a method that significantly diminishes these difficulties.
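    The abstract does not spell out the authors' construction, but the generic two-level idea it improves upon can be sketched: a fine-level smoother plus a coarse-grid correction through a prolongation operator. In the Python sketch below, the damped-Jacobi smoother, the Galerkin coarse operator and the piecewise-constant coarsening are all illustrative choices, not the paper's method.

        import numpy as np

        def two_level_preconditioner(A, P, omega=0.5):
            """Generic additive two-level preconditioner: a damped-Jacobi
            smoother on the fine level plus a coarse-grid correction through
            the prolongation matrix P (n x nc)."""
            D_inv = 1.0 / np.diag(A)
            Ac = P.T @ A @ P                # coarse-level (Galerkin) operator
            Ac_inv = np.linalg.inv(Ac)      # small coarse space: direct solve
            def apply(r):
                return omega * D_inv * r + P @ (Ac_inv @ (P.T @ r))
            return apply

        # 1D Poisson matrix; piecewise-constant coarsening onto nc aggregates.
        n, nc = 64, 8
        A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
        P = np.zeros((n, nc))
        for i in range(n):
            P[i, i * nc // n] = 1.0
        M = two_level_preconditioner(A, P)
        r = np.random.default_rng(2).standard_normal(n)
        z = M(r)                            # ready to plug into a Krylov loop
        print(z.shape)

    The cost/convergence tension described in the abstract lives in the choice of nc: a larger coarse space gives faster convergence but a more expensive coarse solve.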

  1. Genome-wide identification of significant aberrations in cancer genome.

    Science.gov (United States)

    Yuan, Xiguo; Yu, Guoqiang; Hou, Xuchu; Shih, Ie-Ming; Clarke, Robert; Zhang, Junying; Hoffman, Eric P; Wang, Roger R; Zhang, Zhen; Wang, Yue

    2012-07-27

    Somatic Copy Number Alterations (CNAs) in human genomes are present in almost all human cancers. Systematic efforts to characterize such structural variants must effectively distinguish significant consensus events from random background aberrations. Here we introduce Significant Aberration in Cancer (SAIC), a new method for characterizing and assessing the statistical significance of recurrent CNA units. Three main features of SAIC include: (1) exploiting the intrinsic correlation among consecutive probes to assign a score to each CNA unit instead of single probes; (2) performing permutations on CNA units that preserve correlations inherent in the copy number data; and (3) iteratively detecting Significant Copy Number Aberrations (SCAs) and estimating an unbiased null distribution by applying an SCA-exclusive permutation scheme. We test and compare the performance of SAIC against four peer methods (GISTIC, STAC, KC-SMART, CMDS) on a large number of simulation datasets. Experimental results show that SAIC outperforms peer methods in terms of larger area under the Receiver Operating Characteristic curve and increased detection power. We then apply SAIC to analyze structural genomic aberrations acquired in four real cancer genome-wide copy number data sets (ovarian cancer, metastatic prostate cancer, lung adenocarcinoma, glioblastoma). When compared with previously reported results, SAIC successfully identifies most SCAs known to be of biological significance and associated with oncogenes (e.g., KRAS, CCNE1, and MYC) or tumor suppressor genes (e.g., CDKN2A/B). Furthermore, SAIC identifies a number of novel SCAs in these copy number data that encompass tumor related genes and may warrant further studies. Supported by a well-grounded theoretical framework, SAIC has been developed and used to identify SCAs in various cancer copy number data sets, providing useful information to study the landscape of cancer genomes. Open-source and platform-independent SAIC software is
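    SAIC's permutation scheme is only summarized above; the Python sketch below illustrates just the unit-level resampling idea — permuting whole CNA units rather than single probes preserves the within-unit correlation that motivates feature (1). The sign-flip null and the max-statistic are simplifying assumptions, not the published algorithm (which also iterates, excluding detected SCAs from the null).

        import numpy as np

        def unit_permutation_pvalue(unit_scores, n_perm=10000, seed=0):
            """P-value for the largest CNA-unit score against a null built by
            resampling whole units (here: random sign flips of unit scores)."""
            rng = np.random.default_rng(seed)
            observed = unit_scores.max()
            null_max = np.empty(n_perm)
            for b in range(n_perm):
                signs = rng.choice([-1.0, 1.0], size=unit_scores.size)
                null_max[b] = (signs * unit_scores).max()
            return (1 + np.sum(null_max >= observed)) / (1 + n_perm)

        scores = np.array([0.2, 0.1, 0.4, 2.9, 0.3, 0.25])  # one outlying unit
        print(unit_permutation_pvalue(scores))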

  2. Predicting respiratory motion signals for image-guided radiotherapy using multi-step linear methods (MULIN)

    International Nuclear Information System (INIS)

    Ernst, Floris; Schweikard, Achim

    2008-01-01

    Forecasting of respiration motion in image-guided radiotherapy requires algorithms that can accurately and efficiently predict target location. Improved methods for respiratory motion forecasting were developed and tested. MULIN, a new family of prediction algorithms based on linear expansions of the prediction error, was developed and tested. Computer-generated data with a prediction horizon of 150 ms was used for testing in simulation experiments. MULIN was compared to Least Mean Squares-based predictors (LMS; normalized LMS, nLMS; wavelet-based multiscale autoregression, wLMS) and a multi-frequency Extended Kalman Filter (EKF) approach. The in vivo performance of the algorithms was tested on data sets of patients who underwent radiotherapy. The new MULIN methods are highly competitive, outperforming the LMS and the EKF prediction algorithms in real-world settings and performing similarly to optimized nLMS and wLMS prediction algorithms. On simulated, periodic data the MULIN algorithms are outperformed only by the EKF approach due to its inherent advantage in predicting periodic signals. In the presence of noise, the MULIN methods significantly outperform all other algorithms. The MULIN family of algorithms is a feasible tool for the prediction of respiratory motion, performing as well as or better than conventional algorithms while requiring significantly lower computational complexity. The MULIN algorithms are of special importance wherever high-speed prediction is required. (orig.)
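    The MULIN update formulas are not given in the abstract; for orientation, the Python sketch below implements the normalized-LMS (nLMS) baseline against which MULIN is benchmarked. The sampling rate, filter order and step size are chosen purely for illustration.

        import numpy as np

        def nlms_predict(signal, horizon=6, order=8, mu=0.5, eps=1e-6):
            """Predict signal[t + horizon] from the most recent `order`
            samples with a normalized-LMS filter, updating the weights only
            with targets that have already been observed (causal training)."""
            w = np.zeros(order)
            preds = np.full_like(signal, np.nan)
            for t in range(order + horizon, len(signal)):
                # error of the prediction made `horizon` steps ago, whose
                # target signal[t] has just become available
                x_past = signal[t - horizon - order:t - horizon][::-1]
                err = signal[t] - w @ x_past
                w += mu * err * x_past / (eps + x_past @ x_past)
                if t + horizon < len(signal):
                    x_now = signal[t - order:t][::-1]
                    preds[t + horizon] = w @ x_now
            return preds

        t = np.arange(2000) * 0.025                    # 40 Hz sampling
        resp = np.sin(2 * np.pi * 0.25 * t)            # ~4 s breathing cycle
        resp += 0.05 * np.random.default_rng(3).standard_normal(t.size)
        p = nlms_predict(resp)                         # horizon 6 ~ 150 ms
        err = p[500:] - resp[500:]
        print("RMS prediction error:", np.sqrt(np.nanmean(err ** 2)))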

  3. Predicting respiratory motion signals for image-guided radiotherapy using multi-step linear methods (MULIN)

    Energy Technology Data Exchange (ETDEWEB)

    Ernst, Floris; Schweikard, Achim [University of Luebeck, Institute for Robotics and Cognitive Systems, Luebeck (Germany)

    2008-06-15

    Forecasting of respiration motion in image-guided radiotherapy requires algorithms that can accurately and efficiently predict target location. Improved methods for respiratory motion forecasting were developed and tested. MULIN, a new family of prediction algorithms based on linear expansions of the prediction error, was developed and tested. Computer-generated data with a prediction horizon of 150 ms was used for testing in simulation experiments. MULIN was compared to Least Mean Squares-based predictors (LMS; normalized LMS, nLMS; wavelet-based multiscale autoregression, wLMS) and a multi-frequency Extended Kalman Filter (EKF) approach. The in vivo performance of the algorithms was tested on data sets of patients who underwent radiotherapy. The new MULIN methods are highly competitive, outperforming the LMS and the EKF prediction algorithms in real-world settings and performing similarly to optimized nLMS and wLMS prediction algorithms. On simulated, periodic data the MULIN algorithms are outperformed only by the EKF approach due to its inherent advantage in predicting periodic signals. In the presence of noise, the MULIN methods significantly outperform all other algorithms. The MULIN family of algorithms is a feasible tool for the prediction of respiratory motion, performing as well as or better than conventional algorithms while requiring significantly lower computational complexity. The MULIN algorithms are of special importance wherever high-speed prediction is required. (orig.)

  4. Unmanned Aerial Vehicles unique cost estimating requirements

    Science.gov (United States)

    Malone, P.; Apgar, H.; Stukes, S.; Sterk, S.

    Unmanned Aerial Vehicles (UAVs), also referred to as drones, are aerial platforms that fly without a human pilot onboard. UAVs are controlled autonomously by a computer in the vehicle or under the remote control of a pilot stationed at a fixed ground location. There are a wide variety of drone shapes, sizes, configurations, complexities, and characteristics. Use of these devices by the Department of Defense (DoD), NASA, civil and commercial organizations continues to grow. UAVs are commonly used for intelligence, surveillance, reconnaissance (ISR). They are also used for combat operations, and civil applications, such as firefighting, non-military security work, surveillance of infrastructure (e.g. pipelines, power lines and country borders). UAVs are often preferred for missions that require sustained persistence (over 4 hours in duration), or are “too dangerous, dull or dirty” for manned aircraft. Moreover, they can offer significant acquisition and operations cost savings over traditional manned aircraft. Because of these unique characteristics and missions, UAV estimates require some unique estimating methods. This paper describes a framework for estimating UAV systems total ownership cost including hardware components, software design, and operations. The challenge of collecting data, testing the sensitivities of cost drivers, and creating cost estimating relationships (CERs) for each key work breakdown structure (WBS) element is discussed. The autonomous operation of UAVs is especially challenging from a software perspective.
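    A concrete instance of the CER idea mentioned above is a power-law fit of cost against a driver such as airframe weight. In the Python sketch below both the driver and the data points are invented; real UAV CERs would be fit per WBS element, with drivers chosen by sensitivity testing.

        import numpy as np

        def fit_cer(weights_kg, costs_musd):
            """Fit cost = a * weight^b by least squares in log space."""
            b, log_a = np.polyfit(np.log(weights_kg), np.log(costs_musd), 1)
            return np.exp(log_a), b

        w = np.array([150.0, 400.0, 1100.0, 4500.0])   # made-up weights (kg)
        c = np.array([2.1, 4.8, 10.5, 33.0])           # made-up costs ($M)
        a, b = fit_cer(w, c)
        print(f"cost ~ {a:.3f} * W^{b:.2f}; predicted at 2000 kg: "
              f"{a * 2000**b:.1f} $M")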

  5. A method for determining the spent-fuel contribution to transport cask containment requirements

    International Nuclear Information System (INIS)

    Sanders, T.L.; Seager, K.D.; Rashid, Y.R.; Barrett, P.R.; Malinauskas, A.P.; Einziger, R.E.; Jordan, H.; Reardon, P.C.

    1992-11-01

    This report examines containment requirements for spent-fuel transport containers under normal and hypothetical accident conditions. A methodology is described that estimates the probability of rod failure and the quantity of radioactive material released from breached rods. This methodology characterizes the dynamic environment of the cask and its contents and deterministically models the peak stresses that are induced in spent-fuel cladding by the mechanical and thermal dynamic environments. The peak stresses are evaluated in relation to probabilistic failure criteria for ductile tearing and for material fracture at generated or preexisting part-through-wall cracks in fuel rods. Activity concentrations in the cask cavity are predicted from estimates of the fraction of gases, volatiles, and fuel fines that are released when the rod cladding is breached. Containment requirements based on the source term are calculated in terms of maximum permissible volumetric leak rates from the cask. Calculations are included for representative cask designs.

  6. Photogrammetric methods of measurement in industrial applications

    International Nuclear Information System (INIS)

    Godding, R.; Groene, A.; Heinrich, G.; Schneider, C.T.

    1993-01-01

    Methods for 3D measurement are required for widely varied applications in the industrial field, including tasks of quality assurance and plant monitoring, among others. It should be possible to apply the process flexibly; it should require interruptions of production that are as short as possible and should meet the required accuracies. These requirements can be met by photogrammetric methods of measurement. The article introduces these methods and shows their capabilities using selected examples (e.g., the replacement of large components in a pressurized water reactor, and aircraft measurements). (orig./DG)

  7. Risk based maintenance: Resource requirements and organizational challenges

    International Nuclear Information System (INIS)

    Weerakkody, S.D.

    2000-01-01

    10CFR50.65 'Requirements for Monitoring the Effectiveness of Maintenance at Nuclear Power Plants' required licensees to monitor the performance or condition of structures, systems, or components (SSCs) against licensee established goals, in a manner sufficient to provide reasonable assurance that such SSCs are capable of fulfilling their intended functions. The goals were required to be commensurate with safety significance and operating experience. Northeast Utilities relied upon PRAs to implement 10CFR50.65, which is also referred to as the 'Maintenance Rule'. The Maintenance Rule changed some aspects of maintenance of structures, systems, and components (SSC) at nuclear power plants. One objective of the rule was to focus the maintenance resources based on risk significance of components. This paper will discuss the organizational challenges and resource requirements associated with implementation of the Maintenance Rule at nuclear facilities that are supported by the Northeast Utilities Services Company (NUSCo). The paper will discuss (a) how these challenges were addressed, (b) the resources required for ongoing efforts to support the Maintenance Rule, and (c) several key safety benefits derived from the implementation of the Maintenance Rule. (author)

  8. An industrial case study in reconstructing requirements views

    NARCIS (Netherlands)

    Lormans, M.; Van Deursen, A.; Gross, H.G.

    2008-01-01

    Requirements views, such as coverage and status views, are an important asset for monitoring and managing software development projects. We have developed a method that automates the process of reconstructing these views, and we have built a tool, REQANALYST, that supports this method. This paper

  9. Domain decomposed preconditioners with Krylov subspace methods as subdomain solvers

    Energy Technology Data Exchange (ETDEWEB)

    Pernice, M. [Univ. of Utah, Salt Lake City, UT (United States)

    1994-12-31

    Domain decomposed preconditioners for nonsymmetric partial differential equations typically require the solution of problems on the subdomains. Most implementations employ exact solvers to obtain these solutions. Consequently work and storage requirements for the subdomain problems grow rapidly with the size of the subdomain problems. Subdomain solves constitute the single largest computational cost of a domain decomposed preconditioner, and improving the efficiency of this phase of the computation will have a significant impact on the performance of the overall method. The small local memory available on the nodes of most message-passing multicomputers motivates consideration of the use of an iterative method for solving subdomain problems. For large-scale systems of equations that are derived from three-dimensional problems, memory considerations alone may dictate the need for using iterative methods for the subdomain problems. In addition to reduced storage requirements, use of an iterative solver on the subdomains allows flexibility in specifying the accuracy of the subdomain solutions. Substantial savings in solution time is possible if the quality of the domain decomposed preconditioner is not degraded too much by relaxing the accuracy of the subdomain solutions. While some work in this direction has been conducted for symmetric problems, similar studies for nonsymmetric problems appear not to have been pursued. This work represents a first step in this direction, and explores the effectiveness of performing subdomain solves using several transpose-free Krylov subspace methods, GMRES, transpose-free QMR, CGS, and a smoothed version of CGS. Depending on the difficulty of the subdomain problem and the convergence tolerance used, a reduction in solution time is possible in addition to the reduced memory requirements. The domain decomposed preconditioner is a Schur complement method in which the interface operators are approximated using interface probing.
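    A minimal illustration of the trade-off studied here, written in Python with SciPy (the rtol keyword follows recent SciPy releases; older versions spell it tol): an additive, block-Jacobi-style preconditioner whose two subdomain solves are themselves inner GMRES runs at a loose tolerance. The operator, the two-subdomain split and all tolerances are invented for illustration, and a genuinely variable inner solve would, strictly speaking, call for a flexible outer Krylov method.

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import LinearOperator, gmres

        n = 200   # nonsymmetric 1D convection-diffusion-like operator
        A = diags([-1.2, 2.0, -0.8], [-1, 0, 1], shape=(n, n)).tocsr()
        halves = [slice(0, n // 2), slice(n // 2, n)]

        def make_preconditioner(inner_tol):
            def apply(r):
                z = np.zeros_like(r)
                for s in halves:   # inexact subdomain solves via inner GMRES
                    z[s], _ = gmres(A[s, s], r[s], rtol=inner_tol, maxiter=50)
                return z
            return LinearOperator((n, n), matvec=apply)

        b = np.ones(n)
        for tol in (1e-10, 1e-2):        # near-exact vs cheap subdomain solves
            count = [0]
            cb = lambda rk, count=count: count.__setitem__(0, count[0] + 1)
            x, info = gmres(A, b, M=make_preconditioner(tol), rtol=1e-8,
                            callback=cb, callback_type="pr_norm")
            print(f"inner tol {tol:g}: {count[0]} outer iterations")

    Comparing the outer iteration counts for the two inner tolerances exposes exactly the question the paper studies: how much preconditioner quality can be traded away before the savings on subdomain solves are lost to extra outer iterations.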

  10. 75 FR 12581 - Notice of Availability of Environmental Assessment and Finding of No Significant Impact for...

    Science.gov (United States)

    2010-03-16

    ... Environmental Assessment and Finding of No Significant Impact for Exemption From 10 CFR 30, 40, and 70... Assessment and Finding of No Significant Impact for Exemption from Commencement of Construction Requirements... has reached a Finding of No Significant Impact. II. Summary of the Environmental Assessment Background...

  11. Tensor-GMRES method for large sparse systems of nonlinear equations

    Science.gov (United States)

    Feng, Dan; Pulliam, Thomas H.

    1994-01-01

    This paper introduces a tensor-Krylov method, the tensor-GMRES method, for large sparse systems of nonlinear equations. This method is a coupling of tensor model formation and solution techniques for nonlinear equations with Krylov subspace projection techniques for unsymmetric systems of linear equations. Traditional tensor methods for nonlinear equations are based on a quadratic model of the nonlinear function, a standard linear model augmented by a simple second order term. These methods are shown to be significantly more efficient than standard methods both on nonsingular problems and on problems where the Jacobian matrix at the solution is singular. A major disadvantage of the traditional tensor methods is that the solution of the tensor model requires the factorization of the Jacobian matrix, which may not be suitable for problems where the Jacobian matrix is large and has a 'bad' sparsity structure for an efficient factorization. We overcome this difficulty by forming and solving the tensor model using an extension of a Newton-GMRES scheme. Like traditional tensor methods, we show that the new tensor method has significant computational advantages over the analogous Newton counterpart. Consistent with Krylov subspace based methods, the new tensor method does not depend on the factorization of the Jacobian matrix. As a matter of fact, the Jacobian matrix is never needed explicitly.
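    The tensor model itself is not reproduced here; for orientation, the Python sketch below shows the plain Newton-GMRES baseline that tensor-GMRES extends, with matrix-free finite-difference Jacobian-vector products so that the Jacobian is never formed or factored. The test problem and all tolerances are invented, and the rtol keyword follows recent SciPy releases.

        import numpy as np
        from scipy.sparse.linalg import LinearOperator, gmres

        def newton_gmres(F, x0, tol=1e-10, max_newton=30):
            """Inexact Newton: each step solves J(x) s = -F(x) by GMRES with
            finite-difference J(x) @ v products (no explicit Jacobian)."""
            x = x0.copy()
            for _ in range(max_newton):
                fx = F(x)
                if np.linalg.norm(fx) < tol:
                    break
                eps = 1e-7 * max(1.0, np.linalg.norm(x))
                Jv = lambda v, fx=fx, x=x, eps=eps: (F(x + eps * v) - fx) / eps
                J = LinearOperator((x.size, x.size), matvec=Jv)
                s, _ = gmres(J, -fx, rtol=1e-4)      # inexact Newton step
                x = x + s
            return x

        # Small invented test system: x_i^3 + x_i - 1 - 0.1 * mean(x) = 0.
        F = lambda x: x**3 + x - 1.0 - 0.1 * np.mean(x)
        print(newton_gmres(F, np.zeros(50))[:3])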

  12. Transient heat conduction in multi-layer walls: An efficient strategy for Laplace's method

    Energy Technology Data Exchange (ETDEWEB)

    Maestre, Ismael R.; Cubillas, Paloma R. [Escuela Politecnica Superior de Algeciras, University of Cadiz, Algeciras (Spain); Perez-Lombard, Luis [Escuela Superior de Ingenieros, University of Seville (Spain)

    2010-04-15

    Enhancing the load calculation tools in building simulation programs requires an in-depth revision and fine tuning of the load calculation assumptions prior to the addition of the HVAC system modelling routines. Of special interest is the analysis of transient heat conduction through multi-layer walls where, in order to improve the coupling between the passive elements of the building and the HVAC systems, an improvement of the time resolution of the calculation becomes critical. Several methods have historically been used, although recently Laplace's method has been displaced by the State Space method. This paper proposes a new strategy for fine time resolution in the calculation of the response factors through Laplace's method, including a comparison with the performance of the State Space method when used to calculate conduction transfer functions. Our analysis shows that, in order to achieve similar accuracy with both approaches, the State Space method requires significant additional computational time. (author)

  13. Significance and radioimmunoassay of gastric inhibitory polypeptide

    International Nuclear Information System (INIS)

    Zheng Ping; Zeng Minde; Yuan Jimin

    1995-01-01

    We have established a GIP radioimmunoassay with high sensitivity and specificity, with labelling by the Iodogen method and purification by HPLC. Using this method, the plasma GIP level was measured in 64 cases: 10 normal individuals, 25 cases of diabetes and 29 cases of liver cirrhosis. The results showed that the plasma GIP level was significantly increased in patients with liver cirrhosis and correlated with the degree of impairment of glucose tolerance.

  14. Systems and context modeling approach to requirements analysis

    Science.gov (United States)

    Ahuja, Amrit; Muralikrishna, G.; Patwari, Puneet; Subhrojyoti, C.; Swaminathan, N.; Vin, Harrick

    2014-08-01

    Ensuring completeness and correctness of the requirements for a complex system such as the SKA is challenging. Current system engineering practice includes developing a stakeholder needs definition, a concept of operations, and defining system requirements in terms of use cases and requirements statements. We present a method that enhances this current practice into a collection of system models with mutual consistency relationships. These include stakeholder goals, needs definition and system-of-interest models, together with a context model that participates in the consistency relationships among these models. We illustrate this approach by using it to analyze the SKA system requirements.

  15. Significance of perfectionism in understanding different forms of insomnia

    Directory of Open Access Journals (Sweden)

    Totić-Poznanović Sanja

    2012-01-01

    Full Text Available Introduction. Studies consistently show a connection between perfectionism, as a multidimensional construct, and various psychological and psychopathological states and characteristics. However, studies that analyze the connection between this concept and sleep disturbances, especially modalities of insomnia, are rare. Objective. The aim of this study was to examine whether dimensions of perfectionism can explain different forms of insomnia: difficulties initiating sleep (early insomnia), difficulties during sleep (middle insomnia), waking in the early hours of the morning (late insomnia) and dissatisfaction with sleep quality (subjective insomnia). Methods. The sample consisted of 254 students of the School of Medicine in Belgrade. The predictive significance of nine perfectionism dimensions, measured by Frost's and Hewitt's and Flett's multidimensional perfectionism scales, for the four modalities of insomnia, measured by a structured questionnaire, was analyzed by the multiple linear regression method. Results. Perfectionism dimensions are significant predictors of each of the tested forms of insomnia. Doubt about actions significantly predicts early insomnia; other-oriented perfectionism at the negative pole and socially prescribed perfectionism underlie difficulties during sleep, while organization and parental criticism underlie late insomnia. Significant predictors of subjective insomnia are personal standards, organization, and other-oriented perfectionism at the negative pole. Three of the nine analyzed dimensions were not confirmed as significant: concern over mistakes, parental expectations and self-oriented perfectionism. Conclusion. Various aspects of perfectionism can be considered a vulnerability factor for understanding some forms of insomnia. Of all the forms of insomnia tested, perfectionism as a personality trait proved to be most significant for understanding subjective insomnia.

  16. Feature selection based on SVM significance maps for classification of dementia

    NARCIS (Netherlands)

    E.E. Bron (Esther); M. Smits (Marion); J.C. van Swieten (John); W.J. Niessen (Wiro); S. Klein (Stefan)

    2014-01-01

    Support vector machine significance maps (SVM p-maps) previously showed clusters of significantly different voxels in dementia-related brain regions. We propose a novel feature selection method for classification of dementia based on these p-maps. In our approach, the SVM p-maps are
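    A sketch of the select-then-retrain pattern in Python with scikit-learn appears below. Here a mass-univariate t-test stands in for the SVM significance map itself (which the paper derives from permutations of the SVM weight map), and the data, threshold and dimensions are synthetic; only the overall pattern is the point.

        import numpy as np
        from scipy import stats
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(4)
        n, d, informative = 80, 2000, 50
        X = rng.standard_normal((n, d))
        y = np.repeat([0, 1], n // 2)
        X[y == 1, :informative] += 0.8     # group difference in a few "voxels"

        _, pvals = stats.ttest_ind(X[y == 0], X[y == 1], axis=0)
        mask = pvals < 0.001               # the "significance map" threshold
        clf = make_pipeline(StandardScaler(), LinearSVC())
        clf.fit(X[:, mask], y)
        print(mask.sum(), "voxels kept; training accuracy",
              clf.score(X[:, mask], y))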

  17. Comparison of topical use of protamine and tranexamic acid in surgical patients requiring cardio-pulmonary bypass

    International Nuclear Information System (INIS)

    Siddiqeh, M.; Siddiqi, R.; Ali, N.; Iqbal, A.; Younus, Z.; Haq, I.U.

    2015-01-01

    To determine the effectiveness of local protamine in reducing post-operative blood loss compared to local tranexamic acid. Study Design: Randomized controlled trial. Place and Duration of Study: Armed Forces Institute of Cardiology/National Institute of Heart Diseases, Rawalpindi, from January 2011 to September 2011. Patients and Methods: One hundred and twenty cardiac surgical patients were randomly divided into two equal groups, one receiving local protamine and the other receiving local tranexamic acid before chest closure. Efficacy was measured as post-operative blood loss and the requirement for blood and blood products in the post-surgical ICU. Results: Average blood loss in the protamine group was significantly less (252.97 ml) compared to the tranexamic acid group (680.67 ml). The number of patients requiring no post-operative blood transfusion was significantly higher in the protamine group (76.7%) compared to the tranexamic acid group (53.3%). Conclusion: Local protamine is more effective in reducing post-operative blood loss than local tranexamic acid. (author)

  18. Monitoring requirements for assessment of internal dose

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    1985-01-01

    Data obtained by routine personnel monitoring is usually not a sufficient basis for estimation of dose. Collected data must be interpreted carefully and supplemented with appropriate information before reasonably accurate estimates of dose (i.e., accurate enough to indicate whether or not personnel are exposed in excess of recommended limits) can be developed. When the exposure is of sufficient magnitude that a rather precise estimate of dose is needed, the health physicist will bring to bear on the problem other, more refined, methods of dosimetry. These might include a reconstruction of the incident and, for internal emitters, an extensive series of in vivo measurements or analyses of excreta. Thus, cases of special significance must often be evaluated using techniques and resources beyond those routinely employed. This is not a criticism of most routine monitoring programs. These programs are usually carefully designed in a manner commensurate with the degree of exposure routinely encountered and the requirement of a practical program of radiation protection. 10 refs

  19. Premedication with oral Dextromethorphan reduces intra-operative Morphine requirement

    Directory of Open Access Journals (Sweden)

    R Talakoub

    2005-09-01

    Full Text Available Background: Intra-operative pain has adverse effects on hemodynamic parameters. Owing to the complications of opioids for pain relief, the use of non-opioid medication is preferred. The purpose of this study was to investigate the effect of oral dextromethorphan premedication on intra-operative morphine requirement. Methods: After approval of the Ethics committee and informed consent, 40 adult patients of American Society of Anesthesiologists Physical Status I and II, scheduled under general anesthesia for elective laparotomy, were selected and randomly allocated to two equal groups. In group A, oral dextromethorphan (60 mg) was administered at 10 PM and 6 AM preoperatively. In group B, placebo (dextrose) was administered. After induction of general anesthesia and before skin incision, intravenous morphine (0.01 mg/kg) was administered. During surgery, when systolic blood pressure or heart rate increased more than 20% above the preoperative baseline, 0.01 mg/kg morphine was administered. At the end of surgery, the total prescribed morphine (mg/kg) and the maximal increases in systolic, diastolic and mean arterial blood pressure and heart rate relative to the baseline values were calculated and statistically compared with Student's t-test. Results: The mean dose of morphine administered during surgery was significantly less in group A than in group B (P<0.0001). Also, the maximal increases in systolic, diastolic and mean arterial blood pressure were significantly less in group A (p<0.003, p<0.004, p<0.0001, respectively). There was no significant difference in maximal heart rate increase between the two groups (p<0.114). Conclusion: Oral dextromethorphan premedication may decrease intra-operative morphine requirement and reduce the maximal increase in systolic and mean arterial blood pressure during surgery. Key words: Dextromethorphan, Morphine, Intra-operative, Premedication, Hemodynamics

  20. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized as a motivation for the need for, and a basis for developing requirements for, the next generation of HRA methods. It is argued that a model-based approach that provides explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed which, due to its complexity and input requirements, can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with current limitations in the models, data, and computer implementation.