WorldWideScience

Sample records for significant methodological limitations

  1. Methodology, theoretical framework and scholarly significance: An ...

    African Journals Online (AJOL)

    Methodology, theoretical framework and scholarly significance: An overview ... Keywords: Legal Research, Methodology, Theory, Pedagogy, Legal Training, Scholarship ...

  2. CASE METHOD. ACTIVE LEARNING METHODOLOGY TO ACQUIRE SIGNIFICANT LEARNING IN CHEMISTRY

    Directory of Open Access Journals (Sweden)

    Clotilde Pizarro

    2015-09-01

    Full Text Available In this paper the case method is applied to first-year students of the Engineering in Risk Prevention and Environment programme. A real contamination incident that occurred at a school called "La Greda" in the Valparaíso region is presented. The application begins by delivering an extract of the information collected from the media, together with a brief induction on the methodology to be applied. A plenary session is then held in which possible solutions to the problem are debated and a relationship is established between the case and the chemistry syllabus. It is concluded that the application of the case method was a fruitful tool in terms of the results obtained by the students, since the approval rate was 75%, considerably higher than in previous years.

  3. Sulfonylurea herbicides – methodological challenges in setting aquatic limit values

    DEFF Research Database (Denmark)

    Rosenkrantz, Rikke Tjørnhøj; Baun, Anders; Kusk, Kresten Ole

    According to the EU Water Framework Directive, the resulting Water Quality Standards (WQSs) are below the analytical quantification limit, making it difficult to verify compliance with the limit values. However, several methodological concerns may be raised in relation to the very low effect concentrations ... and rimsulfuron. The following parameters were varied during testing: pH, exposure duration, temperature and light/dark cycle. Preliminary results show that a decrease in pH causes an increase in toxicity for all compounds. Exposure to a high concentration for 24 hours caused a reduction in growth rate ... for setting limit values for SUs, or whether more detailed information should be gained by taking methodological considerations into account ...

  4. An EPRI methodology for determining and monitoring simulator operating limits

    International Nuclear Information System (INIS)

    Eichelberg, R.; Pellechi, M.; Wolf, B.; Colley, R.

    1989-01-01

    Of paramount concern to nuclear utilities today is whether their plant-referenced simulator(s) comply with ANSI/ANS 3.5-1985. Of special interest is Section 4.3 of the Standard which requires, in part, that a means be provided to alert the instructor when certain parameters approach values indicative of events beyond the implemented model or known plant behavior. EPRI established Research Project 2054-2 to develop a comprehensive plan for determining, monitoring, and implementing simulator operating limits. As part of the project, a survey was conducted to identify the current/anticipated approach each of the sampled utilities was using to meet the requirements of Section 4.3. A preliminary methodology was drafted and host utilities interviewed. The interview process led to redefining the methodology. This paper covers the objectives of the EPRI project, survey responses, overview of the methodology, resource requirements and conclusions

  5. Temperature biofeedback and sleep: limited findings and methodological challenges

    Directory of Open Access Journals (Sweden)

    De Koninck J

    2012-10-01

    Full Text Available Geneviève Forest,1,2 Cameron van den Heuvel,3 Kurt Lushington,4 Joseph De Koninck2

    1Sleep Laboratory, Département de Psychoéducation et de Psychologie, Université du Québec en Outaouais, Gatineau, Québec, Canada; 2Sleep and Dreams Laboratory, School of Psychology, University of Ottawa, Ottawa, Ontario, Canada; 3Research Branch, University of Adelaide, South Australia, Australia; 4School of Psychology, Social Work and Social Policy, University of South Australia, South Australia, Australia

    Abstract: Given the close link between body temperature and sleep, the perspective of manipulating core and peripheral temperature by self-regulation techniques is very appealing. We report here on a series of attempts, conducted independently in two laboratories, to use self-regulation (biofeedback) of oral (central) and hand (peripheral) temperature, and we measured the impact on sleep-onset latency, sleep architecture, and circadian phase. We found that hand temperature biofeedback was more successful than oral temperature biofeedback. Moreover, an increase in hand temperature was associated with reduced sleep-onset latency. However, most participants found the procedure difficult to implement. The temperature response to biofeedback was reduced in the aged and weakest at the time of sleep onset, and there was no systematic relationship between the change in temperature and the change in sleep latency. Methodological limitations and individual differences may account for these results. Recommendations for future research are presented.

    Keywords: biofeedback, core body temperature, sleep, circadian rhythm, sleep onset

  6. Development of an accident sequence precursor methodology and its application to significant accident precursors

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seung Hyun; Park, Sung Hyun; Jae, Moo Sung [Dept. of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of)]

    2017-03-15

    The systematic management of plant risk is crucial for enhancing the safety of nuclear power plants and for designing new nuclear power plants. Accident sequence precursor (ASP) analysis may be able to provide the risk significance of operational experience by using probabilistic risk assessment to evaluate an operational event quantitatively in terms of its impact on core damage. In this study, an ASP methodology for two operation modes, full power and low power/shutdown operation, has been developed and applied to significant accident precursors that may occur during the operation of nuclear power plants. Two operational events, loss of feedwater and steam generator tube rupture, are identified as ASPs. Therefore, the ASP methodology developed in this study may contribute to identifying plant risk significance as well as to enhancing the safety of nuclear power plants when applied systematically.

  7. The Development and Significance of Standards for Smoking-Machine Methodology

    Directory of Open Access Journals (Sweden)

    Baker R

    2014-12-01

    Full Text Available Bialous and Yach have recently published an article in Tobacco Control in which they claim that all smoking-machine standards stem from a method developed unilaterally by the tobacco industry within the Cooperation Centre for Scientific Research Relative to Tobacco (CORESTA). Using a few highly selective quotations from internal tobacco company memos, they allege, inter alia, that the tobacco industry has changed the method to suit its own needs, that because humans do not smoke like machines the standards are of little value, and that the tobacco industry has unjustifiably made health claims about low "tar" cigarettes. The objectives of this paper are to review the development of smoking-machine methodology and standards and the involvement of the relevant parties, to outline the significance of the results, and to explore the validity of Bialous and Yach's claims. The large volume of published scientific information on the subject, together with other information in the public domain, has been consulted. When this information is taken into account it becomes obvious that the very narrow and restricted literature base of Bialous and Yach's analysis has resulted in them, perhaps inadvertently, making factual errors, drawing wrong conclusions and writing inaccurate statements on many aspects of the subject. The first smoking-machine standard was specified by the Federal Trade Commission (FTC), a federal government agency in the USA, in 1966. The CORESTA Recommended Method, similar in many aspects to that of the FTC, was developed in the late 1960s and published in 1969. Small differences in the butt lengths, smoke collection and analytical procedures in methods used in various countries including Germany, Canada and the UK, developed later, resulted in about a 10% difference in smoke "tar" yields. These differences in methodology were harmonised in a common International Organisation for Standardisation (ISO) Standard Method in 1991, after a considerable amount ...

  8. Estimating significances of differences between slopes: A new methodology and software

    Directory of Open Access Journals (Sweden)

    Vasco M. N. C. S. Vieira

    2013-09-01

    Full Text Available Determining the significance of slope differences is a common requirement in studies of self-thinning, ontogeny and sexual dimorphism, among others. This has long been carried out testing for the overlap of the bootstrapped 95% confidence intervals of the slopes. However, the numerical random re-sampling with repetition favours the occurrence of re-combinations yielding largely diverging slopes, widening the confidence intervals and thus increasing the chances of overlooking significant differences. To overcome this problem a permutation test simulating the null hypothesis of no differences between slopes is proposed. This new methodology, when applied both to artificial and factual data, showed an enhanced ability to differentiate slopes.
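
    The permutation approach described above can be sketched in a few lines: pool the (x, y) pairs of both groups, repeatedly shuffle them between groups under the null hypothesis of equal slopes, and compare the observed slope difference against the permuted differences. This is an illustrative sketch, not the authors' software; the function names and the OLS slope estimator are our own choices.

```python
import numpy as np

def slope(x, y):
    """Ordinary least-squares slope of y on x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return np.cov(x, y, bias=True)[0, 1] / np.var(x)

def slope_permutation_test(x1, y1, x2, y2, n_perm=10000, seed=0):
    """Two-sided permutation test for H0: equal slopes.

    Under H0 the group labels are exchangeable, so (x, y) pairs are
    shuffled between the two groups and the slope difference is
    recomputed for each permutation.
    """
    rng = np.random.default_rng(seed)
    observed = abs(slope(x1, y1) - slope(x2, y2))
    pooled = np.column_stack([np.concatenate([x1, x2]),
                              np.concatenate([y1, y2])])
    n1 = len(x1)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)          # shuffles rows, keeping pairs intact
        d = abs(slope(pooled[:n1, 0], pooled[:n1, 1]) -
                slope(pooled[n1:, 0], pooled[n1:, 1]))
        if d >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)   # permutation p-value
```

    Unlike the bootstrap-overlap test criticized in the abstract, resampling here is done without replacement, so no permutation can produce the degenerate recombinations that inflate bootstrap confidence intervals.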

  9. Significance of the proportion of binucleate cells in the micronucleus assay; A methodological study

    Energy Technology Data Exchange (ETDEWEB)

    Imamura, Masahiro; Edgren, M.R. (Karolinska Inst., Stockholm (Sweden). Dept. of Radiation Physics)

    1994-03-01

    Using treatment with cytochalasin-B (Cyt-B) for the induction of a cytokinetic block, the significance of the proportion of binucleate cells (BNC) in the micronucleus (MN) assay was investigated in a methodological study. A Chinese hamster cell line V79 was used in which MN were induced by radiation. In complementary tests the radiation effect in inducing MN was enhanced by depletion of the cellular glutathione content with buthionine sulfoximine (BSO). The data indicated that the concentration of Cyt-B is the major factor which determines the proportion of BNC. This proportion was shown to be independent of radiation dose and of BSO. Furthermore, the MN frequency was not related to the percentage of BNC. Therefore, a high proportion of BNC may be practical for the MN assay, but may not make the technique more accurate. (author).

  10. 40 CFR Appendix B to Part 72 - Methodology for Conversion of Emissions Limits

    Science.gov (United States)

    2010-07-01

    ... site-specific heat rates and capacities to develop conversions for Btu per hour. ... For the purposes of the Acid Rain Program, all emissions limits must be ...

  11. Application of a new methodology to evaluate DNB limits based on statistical propagation of uncertainties

    International Nuclear Information System (INIS)

    Machado, Marcio Dornellas

    1998-09-01

    One of the most important thermal-hydraulic safety parameters is the DNBR (Departure from Nucleate Boiling Ratio). The methodology currently in use at Eletronuclear to determine the DNBR is extremely conservative and may result in penalties to the reactor power due to an increased plugging level of steam generator tubes. This work uses a new methodology to evaluate the DNBR, named mini-RTDP. The standard methodology (STDP) currently in use establishes a design limit value which cannot be surpassed. This limit value is determined taking into account the uncertainties of the empirical correlation used in the COBRA IIC/MIT code, modified for Angra 1 conditions. The correlation used is Westinghouse's W-3, and the minimum DNBR (MDNBR) value cannot be less than 1.3. The new methodology reduces the excessive level of conservatism associated with the parameters used in the DNBR calculation, which take their most unfavorable values in the STDP methodology, by using their best-estimate values instead. The final goal is to obtain a new DNBR design limit which will provide a margin gain due to the more realistic parameter values used in the methodology. (author)
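
    The statistical propagation of uncertainties that mini-RTDP-type methodologies rely on can be illustrated with a simple Monte Carlo sketch: sample each uncertain input from its distribution, propagate it through a sensitivity model, and read the design limit off the lower tail of the resulting DNBR distribution. All distributions, standard deviations and the sensitivity model below are invented for illustration; they are not Angra 1 data.

```python
import numpy as np

def dnbr_design_limit(nominal_dnbr=2.0, n=100_000, seed=42):
    """Monte Carlo sketch of a best-estimate DNBR limit.

    Uncertain inputs are modeled as multiplicative factors on a
    nominal DNBR (all distributions illustrative only).
    """
    rng = np.random.default_rng(seed)
    corr_err = rng.normal(1.0, 0.06, n)  # critical heat flux correlation
    power    = rng.normal(1.0, 0.02, n)  # core power measurement
    flow     = rng.normal(1.0, 0.03, n)  # coolant flow measurement
    # Crude sensitivity model: DNBR roughly scales with flow / power.
    dnbr = nominal_dnbr * corr_err * flow / power
    # One-sided lower bound (a 95/95-style tolerance limit in spirit):
    return np.percentile(dnbr, 5)
```

    Compared with a deterministic calculation that stacks every parameter at its most unfavorable value, the statistical limit recovers margin because the worst cases of independent parameters rarely coincide.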

  12. The significance of some methodological effects on filtration and ingestion rates of the rotifer Brachionus plicatilis

    Science.gov (United States)

    Schlosser, H. J.; Anger, K.

    1982-06-01

    Filtration rate (F) and ingestion rate (I) were measured in the rotifer Brachionus plicatilis feeding on the flagellate Dunaliella spec. and on yeast cells (Saccharomyces cerevisiae). 60-min experiments in rotating bottles served as a standard for testing methodological effects on the levels of F and I. A lack of rotation reduced F values by 40%, and a rise in temperature from 18° to 23.5 °C increased them by 42%. Ingestion rates increased significantly up to a particle (yeast) concentration of ca. 600-800 cells·µl⁻¹; beyond this threshold they remained constant, whereas filtration rates decreased. Rotifer density (up to 1000 ind·ml⁻¹) and previous starvation (up to 40 h) did not significantly influence food uptake rates. The duration of the experiment proved to have the most significant effect on F and I values: in 240-min experiments, these values were on average more than 90% lower than in 15-min experiments. From this finding it is concluded that ingestion rates obtained from short-term experiments (60 min or less) cannot be used in energy budgets, because they severely overestimate the actual long-term feeding capacity of the rotifers. At the lower end of the particle size spectrum (2 to 3 µm) there are not only food cells, but apparently also contaminating faecal particles. Their number increased with increasing duration of the experiments and led to an underestimation of F and I. Elemental analyses of the rotifers and their food suggest that B. plicatilis can ingest up to 0.6 mJ, or ca. 14% of its own body carbon, within 15 min. The long-term average was estimated as 3.4 mJ·ind⁻¹·d⁻¹, or ca. 75% of body carbon·d⁻¹.
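
    The rates F and I discussed above are conventionally derived from the exponential depletion of food cells in the experimental bottles. A minimal sketch, assuming the standard clearance-rate formula (not necessarily the exact computation used by Schlosser and Anger):

```python
import math

def filtration_rate(c0, ct, volume_ul, n_animals, t_min):
    """Clearance-based filtration rate (µl · ind⁻¹ · min⁻¹).

    Standard exponential-depletion formula, assuming cells are removed
    in proportion to their concentration and do not grow during the
    short incubation: F = V / (N·t) · ln(c0 / ct).
    """
    return volume_ul / (n_animals * t_min) * math.log(c0 / ct)

def ingestion_rate(c0, ct, volume_ul, n_animals, t_min):
    """Ingestion rate (cells · ind⁻¹ · min⁻¹) = F × mean concentration."""
    f = filtration_rate(c0, ct, volume_ul, n_animals, t_min)
    c_mean = (c0 - ct) / math.log(c0 / ct)   # log-mean concentration
    return f * c_mean
```

    Note that the two formulas combine to the exact identity I = V·(c0 − ct)/(N·t), which is why I plateaus while F falls once the animals reach their maximum ingestion rate at high particle concentrations.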

  13. Application of the risk-informed methodology for APR1400 P-T limits curve

    Energy Technology Data Exchange (ETDEWEB)

    Kim, K.; Namgung, I. [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2014-07-01

    A reactor pressure vessel (RPV) in a nuclear power plant has operational limits of pressure and temperature to prevent a potential drastic propagation of cracks due to brittle fracture. We call it a pressure-temperature limits curve (P-T limits curve). Appendix G of the American Society of Mechanical Engineers (ASME) Boiler and Pressure Vessel Code, Section XI, provides deterministic procedures to develop the P-T limits curve for the reactor pressure vessel. Recently, an alternative risk-informed methodology has been added in the ASME Code. Risk-informed means that we can consider insights from a probabilistic risk assessment by using this methodology. This alternative methodology provides a simple procedure to develop risk-informed P-T limits for heat up, cool down, and hydrostatic test events. The risk-informed P-T limits curve is known to provide more operational flexibility, particularly for reactor pressure vessels with relatively high irradiation levels and radiation sensitive materials. In this paper, we developed both the deterministic and a risk-informed P-T limits curve for an APR1400 reactor using Appendix G of the ASME Code, Section XI and compare the results in terms of additional operational margin. (author)
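
    The deterministic Appendix G procedure builds the P-T limits curve from a lower-bound fracture toughness that grows exponentially with the margin between metal temperature and RT_NDT. A simplified sketch in US customary units, using the commonly cited toughness form; the thermal-stress term is omitted, and `k_per_ksi` is a hypothetical vessel-specific constant, not APR1400 data:

```python
import math

def k_ic(temp_f, rt_ndt_f):
    """Lower-bound fracture toughness (ksi·√in), in the commonly cited
    ASME Section XI, Appendix G form:
    K_Ic = 33.2 + 20.734 · exp[0.02 · (T − RT_NDT)]."""
    return 33.2 + 20.734 * math.exp(0.02 * (temp_f - rt_ndt_f))

def allowable_pressure(temp_f, rt_ndt_f, k_per_ksi, safety_factor=2.0):
    """Allowable pressure (ksi), assuming the pressure-induced stress
    intensity factor scales linearly with pressure (K_Ip = k_per_ksi · p)
    and neglecting the thermal term: SF · K_Ip ≤ K_Ic."""
    return k_ic(temp_f, rt_ndt_f) / (safety_factor * k_per_ksi)
```

    Sweeping `temp_f` upward traces the characteristic exponential shape of a P-T limits curve; irradiation embrittlement raises RT_NDT and shifts the whole curve to higher temperatures, which is why high-fluence vessels benefit most from the risk-informed alternative.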

  14. Theoretical Significance in Q Methodology: A Qualitative Approach to a Mixed Method

    Science.gov (United States)

    Ramlo, Susan

    2015-01-01

    Q methodology (Q) has offered researchers a unique scientific measure of subjectivity since William Stephenson's first article in 1935. Q's focus on subjectivity includes self-referential meaning and interpretation. Q is most often identified with its technique (Q-sort) and its method (factor analysis to group people); yet, it consists of a…

  15. Methodology to identify risk-significant components for inservice inspection and testing

    International Nuclear Information System (INIS)

    Anderson, M.T.; Hartley, R.S.; Jones, J.L. Jr.; Kido, C.; Phillips, J.H.

    1992-08-01

    Periodic inspection and testing of vital system components should be performed to ensure the safe and reliable operation of Department of Energy (DOE) nuclear processing facilities. Probabilistic techniques may be used to help identify and rank components by their relative risk. A risk-based ranking would allow varied DOE sites to implement inspection and testing programs in an effective and cost-efficient manner. This report describes a methodology that can be used to rank components, while addressing multiple risk issues

  16. Significant aspects of the external event analysis methodology of the Jose Cabrera NPP PSA

    International Nuclear Information System (INIS)

    Barquin Duena, A.; Martin Martinez, A.R.; Boneham, P.S.; Ortega Prieto, P.

    1994-01-01

    This paper describes the following advances in the methodology for Analysis of External Events in the PSA of the Jose Cabrera NPP: In the Fire Analysis, a version of the COMPBRN3 code, modified by Empresarios Agrupados according to the guidelines of Appendix D of NUREG/CR-5088, has been used. Generic cases were modelled and general conclusions obtained, applicable to fire propagation in closed areas. The damage times obtained were appreciably lower than those obtained with the previous version of the code. The Flood Analysis methodology is based on the construction of event trees to represent flood propagation dependent on the condition of the communication paths between areas, and trees showing propagation stages as a function of affected areas and damaged mitigation equipment. To determine the temporal evolution of the flood area level, the CAINZO-EA code has been developed, adapted to specific plant characteristics. In both the Fire and Flood Analyses a quantification methodology has been adopted which consists of analysing the damage caused at each stage of growth or propagation and identifying, in the Internal Events models, the gates, basic events or headers to which failure (probability 1) due to damage is assigned. (Author)

  17. Methodological development for selection of significant predictors explaining fatal road accidents.

    Science.gov (United States)

    Dadashova, Bahar; Arenas-Ramírez, Blanca; Mira-McWilliams, José; Aparicio-Izquierdo, Francisco

    2016-05-01

    Identification of the most relevant factors for explaining road accident occurrence is an important issue in road safety research, particularly for future decision-making processes in transport policy. However, model selection for this particular purpose is still an open research question. In this paper we propose a methodological development for model selection which addresses both the explanatory variable selection and adequate model selection issues. A variable selection procedure, the TIM (two-input model) method, is carried out by combining neural network design and statistical approaches. The error structure of the fitted model is assumed to follow an autoregressive process. All models are estimated using the Markov chain Monte Carlo method, where the model parameters are assigned non-informative prior distributions. The final model is built using the results of the variable selection. For the application of the proposed methodology, the number of fatal accidents in Spain during 2000-2011 was used. This indicator has experienced the largest reduction internationally during the indicated years, making it an interesting time series from a road safety policy perspective. Hence the identification of the variables that have affected this reduction is of particular interest for future decision making. The results of the variable selection process show that the selected variables are main subjects of road safety policy measures. Published by Elsevier Ltd.

  18. Marrying Step Feed with Secondary Clarifier Improvements to Significantly Increase Peak Wet Weather Treatment Capacity: An Integrated Methodology.

    Science.gov (United States)

    Daigger, Glen T; Siczka, John S; Smith, Thomas F; Frank, David A; McCorquodale, J A

    2017-08-01

    The need to increase the peak wet weather secondary treatment capacity of the City of Akron, Ohio, Water Reclamation Facility (WRF) provided the opportunity to test an integrated methodology for maximizing the peak wet weather secondary treatment capacity of activated sludge systems. An initial investigation, consisting of process modeling of the secondary treatment system and computational fluid dynamics (CFD) analysis of the existing relatively shallow secondary clarifiers (3.3 and 3.7 m sidewater depth in 30.5 m diameter units), indicated that a significant increase in capacity from 416 000 to 684 000 m³/d or more was possible by adding step feed capabilities to the existing bioreactors and upgrading the existing secondary clarifiers. One of the six treatment units at the WRF was modified, and an extensive 2-year testing program was conducted to determine the total peak wet weather secondary treatment capacity achievable. The results demonstrated that a peak wet weather secondary treatment capacity approaching 974 000 m³/d is possible as long as secondary clarifier solids and hydraulic loadings can be separately controlled using the step feed capability provided. Excellent sludge settling characteristics are routinely experienced at the City of Akron WRF, raising concerns that the identified peak wet weather capacity could not be maintained should sludge settling characteristics deteriorate for some reason. CFD analysis indicated that the impact of deteriorating sludge settling characteristics could be mitigated, and the identified peak capacity maintained, by further use of the step feed capability to reduce secondary clarifier solids loading rates at the identified high surface overflow rates. The results also demonstrated that effluent limits not only for total suspended solids (TSS) and five-day carbonaceous biochemical oxygen demand (cBOD5) could be ...
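
    The two clarifier loadings that step feed decouples can be computed directly from flow, mixed-liquor concentration and clarifier area. A minimal sketch with invented example numbers (not Akron WRF design data):

```python
import math

def surface_overflow_rate(q_m3d, area_m2):
    """Hydraulic loading, SOR (m³·m⁻²·d⁻¹) = Q / A."""
    return q_m3d / area_m2

def solids_loading_rate(q_m3d, q_ras_m3d, mlss_mg_per_l, area_m2):
    """Solids loading, SLR (kg·m⁻²·d⁻¹). Since mg/L ≡ g/m³:
    SLR = (Q + Q_RAS) · MLSS / 1000 / A."""
    return (q_m3d + q_ras_m3d) * mlss_mg_per_l / 1000.0 / area_m2

# One 30.5 m diameter clarifier, as in the units described above.
area = math.pi * (30.5 / 2) ** 2   # ≈ 731 m²
```

    Step feed lowers the MLSS concentration reaching the clarifier at high flow, so the SLR can be held down even while the SOR rises with the wet weather flow, which is the separate control the abstract describes.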

  19. Methodology of strength calculation under alternating stresses using the diagram of limiting amplitudes

    Science.gov (United States)

    Konovodov, V. V.; Valentov, A. V.; Kukhar, I. S.; Retyunskiy, O. Yu; Baraksanov, A. S.

    2016-08-01

    The work proposes an algorithm for strength calculation under alternating stresses using a developed methodology for constructing the diagram of limiting amplitudes. The overall safety factor is defined by the suggested formula. Strength calculations of components working under alternating stresses are, in the great majority of cases, conducted as checking calculations. This is primarily because the overall fatigue strength reduction factor (Kσg or Kτg) can only be chosen approximately during component design, as at this stage of the work the engineer has only an approximate idea of the component's size and shape.
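
    The calculation described above is conventionally organised in two steps: a partial safety factor for each stress component read off the limiting-amplitude (Haigh) diagram, and a combined factor for simultaneous bending and torsion. A sketch of the standard textbook formulas, not necessarily the authors' suggested formula; the numerical inputs below are invented:

```python
import math

def partial_safety_factor(endurance_limit, k_eff, amp, psi, mean):
    """Safety factor for one stress component from the limiting-amplitude
    diagram, in the standard textbook form: n = σ₋₁ / (K·σa + ψ·σm),
    where K is the effective fatigue strength reduction factor."""
    return endurance_limit / (k_eff * amp + psi * mean)

def overall_safety_factor(n_sigma, n_tau):
    """Combined factor for simultaneous bending and torsion
    (Gough-Pollard relation): n = nσ·nτ / √(nσ² + nτ²)."""
    return n_sigma * n_tau / math.sqrt(n_sigma**2 + n_tau**2)
```

    With nσ = 3 and nτ = 4, for example, the Gough-Pollard relation gives n = 12/5 = 2.4, always below the smaller of the two partial factors.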

  20. Density limit investigations near and significantly above the Greenwald limit on the tokamaks TEXTOR-94 and RTP

    International Nuclear Information System (INIS)

    Rapp, J.; Koslowski, H.R.; Pospieszczyk, A.; Salzedas, F.; Vries, P.C. de; Schueller, F.C.; Hokin, S.; Messiaen, A.M.

    2001-01-01

    Ignition scenarios like those developed for ITER require plasma densities close to or above the Greenwald limit. It is generally observed that exceeding this limit may lead to a degradation of plasma confinement or to a violent end of the discharge. The achievable density limit and the related processes, such as radiative instabilities and MHD phenomena, which eventually lead to disruption, have been investigated in the limiter tokamaks TEXTOR-94 and RTP. (author)
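
    The Greenwald limit itself is a simple scaling: the line-averaged density limit, in units of 10²⁰ m⁻³, equals the plasma current in MA divided by the plasma cross-sectional area π·a². The machine parameters below are illustrative, not the exact TEXTOR-94 values:

```python
import math

def greenwald_density(plasma_current_ma, minor_radius_m):
    """Greenwald density limit n_G = I_p / (π a²), in 10^20 m^-3."""
    return plasma_current_ma / (math.pi * minor_radius_m ** 2)

# Illustrative TEXTOR-94-sized discharge: I_p = 0.35 MA, a = 0.46 m
n_g = greenwald_density(0.35, 0.46)   # ≈ 0.53 × 10^20 m^-3
```

    A fractional Greenwald number n̄/n_G above 1 is what the title means by operation "significantly above" the limit.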

  1. Methodology and significance of studies of atmospheric deposition in highway runoff

    Science.gov (United States)

    Colman, John A.; Rice, Karen C.; Willoughby, Timothy C.

    2001-01-01

    Atmospheric deposition and the processes that cause and alter atmospheric deposition in relation to highway surfaces and runoff were evaluated nationwide. Wet deposition is more easily monitored than dry deposition, and data on wet deposition are available for major elements and water properties (constituents affecting acid deposition) from the inter-agency National Atmospheric Deposition Program/National Trends Network (NADP/NTN). Many trace constituents (metals and organic compounds) of interest in highway runoff loads, however, are not included in the NADP/NTN. Dry deposition, which constitutes a large part of total atmospheric deposition for many constituents in highway runoff loads, is difficult to monitor accurately, and dry-deposition rates are not widely available.

    Many of the highway-runoff investigations that have addressed atmospheric-deposition sources have had flawed investigative designs or problems with methodology. Some results may be incorrect because of reliance on time-aggregated data collected during a period of changing atmospheric emissions. None of the investigations used methods that could accurately quantify the part of the highway runoff load that can be attributed to ambient atmospheric deposition. Lack of information about accurate ambient deposition rates and runoff loads was part of the problem. Samples collected to compute the rates and loads were collected without clean-sampling methods or sampler protocols, and without quality-assurance procedures that could validate the data. Mass-budget calculations comparing deposition and runoff did not consider loss of deposited material during on-highway processing. Loss of deposited particles from highway travel lanes could be large, as has been determined in labeled-particle studies, because of resuspension caused by turbulence from passing traffic. Although a cause of resuspension of large particles, traffic turbulence may increase the rate of deposition for small particles and ...

  2. Impact limiters for radioactive materials transport packagings: a methodology for assessment

    International Nuclear Information System (INIS)

    Mourao, Rogerio Pimenta

    2002-01-01

    This work aims at establishing a methodology for the design assessment of a cellular-material-filled impact limiter to be used as part of a radioactive material transport packaging. This methodology comprises the selection of the cellular material, its structural characterization by mechanical tests, the development of a case study in the nuclear field, the preliminary determination of the best cellular material density for the case study, the execution of the case study test and its numerical simulation using the finite element method. Among the several materials used as shock absorbers in packagings, polyurethane foam was chosen, particularly the foam obtained from the castor oil plant (Ricinus communis), a non-polluting and renewable source. The case study carried out was the 9 m drop test of a package prototype containing radioactive wastes incorporated in a cement matrix, considered one of the most severe tests prescribed by the Brazilian and international transport standards. Prototypes with the foam density pre-determined as ideal, as well as prototypes using lighter and heavier foams, were tested for comparison. The results obtained validate the methodology, in that expectations regarding the ideal foam density were confirmed by the drop tests and the numerical simulation. (author)

  3. Applying GRADE-CERQual to qualitative evidence synthesis findings-paper 3: how to assess methodological limitations.

    Science.gov (United States)

    Munthe-Kaas, Heather; Bohren, Meghan A; Glenton, Claire; Lewin, Simon; Noyes, Jane; Tunçalp, Özge; Booth, Andrew; Garside, Ruth; Colvin, Christopher J; Wainwright, Megan; Rashidian, Arash; Flottorp, Signe; Carlsen, Benedicte

    2018-01-25

    The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group. The approach has been developed to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's methodological limitations component. We developed the methodological limitations component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual methodological limitations component within several qualitative evidence syntheses before agreeing on the current definition and principles for application. When applying CERQual, we define methodological limitations as the extent to which there are concerns about the design or conduct of the primary studies that contributed evidence to an individual review finding. In this paper, we describe the methodological limitations component and its rationale and offer guidance on how to assess the methodological limitations of a review finding as part of the CERQual approach. This guidance outlines the information required to assess the methodological limitations component, the steps that need to be taken to assess the methodological limitations of data contributing to a review finding, and examples of methodological limitations assessments, in the context of the CERQual ...

  4. The significance of reporting to the thousandths place: Figuring out the laboratory limitations

    Directory of Open Access Journals (Sweden)

    Joely A. Straseski

    2017-04-01

    Full Text Available Objectives: A request to report laboratory values to a specific number of decimal places represents a delicate balance between the clinical interpretation of a true analytical change and the laboratory's understanding of analytical imprecision and significant figures. Prostate-specific antigen (PSA) was used as an example to determine whether an immunoassay routinely reported to the hundredths decimal place, based on significant-figure assessment in our laboratory, was capable of providing analytically meaningful results when reported to the thousandths place when requested by clinicians. Design and methods: Results of imprecision studies of a representative PSA assay (Roche MODULAR E170), employing two methods of statistical analysis, are reported. Sample pools were generated with target values of 0.01 and 0.20 μg/L PSA as determined by the E170. Intra-assay imprecision studies were conducted and the resultant data were analyzed using two independent statistical methods to evaluate reporting limits. Results: These statistical methods indicated that reporting results to the thousandths place at the two assessed concentrations was an appropriate reflection of the measurement imprecision for the representative assay. This approach used two independent statistical tests to determine the ability of an analytical system to support a desired reporting level. Importantly, the data were generated during a routine intra-assay imprecision study, so this approach does not require extra data collection by the laboratory. Conclusions: Independent statistical analysis must be used to determine appropriate significant-figure limitations for clinically relevant analytes. Establishing these limits is the responsibility of the laboratory and should be determined prior to providing clinical results. Keywords: Significant figures, Imprecision, Prostate cancer, Prostate specific antigen, PSA
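
    The underlying question — does the assay's imprecision support another reported digit? — can be framed with a simple heuristic: a digit is analytically meaningful only if the intra-assay SD is smaller than half a unit in that decimal place. This rule of thumb and the replicate values below are our illustration, not one of the two statistical methods actually used in the study:

```python
import statistics

def supports_decimal_place(replicates, place):
    """True if the intra-assay SD is below half a unit in the given
    decimal place (place=3 → thousandths, i.e. 0.001 units)."""
    half_unit = 0.5 * 10.0 ** (-place)
    return statistics.stdev(replicates) < half_unit

# Replicates of a low pool (target 0.01 µg/L PSA; values invented)
pool = [0.0100, 0.0102, 0.0098, 0.0101, 0.0099]
```

    For this invented pool the SD is roughly 0.00016 µg/L, so reporting to the thousandths place is supported while a fourth decimal place would be noise.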

  5. On the Question of Methodological Support of Research on Relationships of Interpersonal Significance in Kindergarten Groups

    Directory of Open Access Journals (Sweden)

    Iliyn V.A.

    2016-03-01

    Full Text Available The paper focuses on the importance of in-depth research on child-child interpersonal relationships in kindergarten groups (in particular, research employing an algorithm developed by M.Yu. Kondratyev for defining the integral status of an individual). Although relationships with significant adults are certainly essential for preschool children, interpersonal relationships on the child-child level to a great extent shape the content of the social situation of development in general. Still, when it comes to revealing the status and role position of the child in the structure of interpersonal relationships within the kindergarten group, there is the challenge of defining the informal intragroup structure of power in a contact community (due to age specifics). The paper suggests how this challenge may be addressed and provides a version of the technique suitable for preschoolers that helps overcome the age restrictions implied by the original technique. The paper also reports on the outcomes of pilot testing of this version, which proved its heuristic value: for instance, the outcomes show a high degree of correlation with rankings of kindergarten group members by their influence upon peers, carried out by teachers working in these groups.

  6. Physiologically Based Pharmacokinetic Modeling: Methodology, Applications, and Limitations with a Focus on Its Role in Pediatric Drug Development

    Directory of Open Access Journals (Sweden)

    Feras Khalil

    2011-01-01

    Full Text Available The concept of physiologically based pharmacokinetic (PBPK) modeling was introduced years ago, but until recently it saw little practical use. However, interest in and implementation of this modeling technique have grown, as evidenced by the increased number of publications in this field. This paper briefly demonstrates the methodology, applications, and limitations of PBPK modeling, with special attention to the use of PBPK models in pediatric drug development; some examples are described in detail. Although PBPK models do have limitations, the potential benefit of the PBPK modeling technique is substantial. PBPK models can be applied to investigate drug pharmacokinetics under different physiological and pathological conditions or in different age groups, to support decision-making during drug discovery, to provide data that can save time and resources (perhaps most importantly in early drug development phases and in pediatric clinical trials), and potentially to help clinical trials become more “confirmatory” rather than “exploratory”.

  7. Development and application of a methodology for the analysis of significant human related event trends in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, H.Y.

    1981-01-01

    A methodology is developed to identify and flag significant trends related to the safety and availability of U.S. commercial nuclear power plants. The development is intended to aid in reducing the likelihood of human error. To ensure that the methodology can be easily adapted to various classification schemes for operating data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme is selected for the methodology. Significance criteria were developed for human-initiated events affecting the systems and for events caused by human deficiencies. Clustering analysis was used to verify the learning trend in multidimensional histograms. A computer code based on the K-Means algorithm is developed and applied to find the learning period, in which error rates decrease monotonically with plant age. The Freeman-Tukey (F-T) deviates are used to select generic problems, identified by a large positive deviate (here approximately above 2.0). The identified generic problems are: decision errors, which are highly associated with reactor startup operations in the learning period of PWR plants (PWRs); response errors, which are highly associated with Secondary Non-Nuclear Systems (SNS) in PWRs; and significant errors affecting systems caused by response actions, which are highly associated with the startup reactor mode in BWRs

  8. Cognitive models of executive functions development: methodological limitations and theoretical challenges

    Directory of Open Access Journals (Sweden)

    Florencia Stelzer

    2014-01-01

    Full Text Available Executive functions (EF) have been defined as a series of higher-order cognitive processes which allow the control of thought, behavior and affect in pursuit of a goal. Such processes show a lengthy postnatal development which matures completely only by the end of adolescence. In this article we review some of the main models of EF development during childhood. The aim of this work is to describe the state of the art on the topic, identifying the main theoretical difficulties and methodological limitations associated with the different proposed paradigms. Finally, some suggestions are given for coping with such difficulties, emphasizing that the development of an ontology of EF could be a viable alternative to counter them. We believe that future research should be directed toward the development of that ontology.

  9. Methodological limitations of psychosocial interventions in patients with an implantable cardioverter-defibrillator (ICD): A systematic review

    Directory of Open Access Journals (Sweden)

    Ockene Ira S

    2009-12-01

    Full Text Available Abstract Background Despite the potentially life-saving benefits of the implantable cardioverter-defibrillator (ICD), a significant group of patients experiences emotional distress after ICD implantation. Different psychosocial interventions have been employed to improve this condition, but previous reviews have suggested that methodological issues may limit the validity of such interventions. Aim: To review the methodology of previously published studies of psychosocial interventions in ICD patients, according to CONSORT statement guidelines for non-pharmacological interventions, and provide recommendations for future research. Methods We electronically searched the PubMed, PsycInfo and Cochrane databases. To be included, studies needed to be published in a peer-reviewed journal between 1980 and 2008, to involve a human population aged 18+ years and to have an experimental design. Results Twelve studies met the eligibility criteria. Samples were generally small. Interventions were very heterogeneous; most studies used cognitive behavioural therapy (CBT) and exercise programs either as unique interventions or as part of a multi-component program. Overall, studies showed a favourable effect on anxiety (6/9) and depression (4/8). CBT appeared to be the most effective intervention. There was no effect on the number of shocks and arrhythmic events, probably because studies were not powered to detect such an effect. Physical functioning improved in the three studies evaluating this outcome. Lack of information about the indication for ICD implantation (primary vs. secondary prevention), limited or no information regarding use of anti-arrhythmic (9/12) and psychotropic (10/12) treatment, and lack of assessments of providers' treatment fidelity (12/12) and patients' adherence to the intervention (11/12) were the most common methodological limitations.
Conclusions Overall, this review supports preliminary evidence of a positive effect of psychosocial interventions in ICD patients.

  10. Scoping review identifies significant number of knowledge translation theories, models and frameworks with limited use.

    Science.gov (United States)

    Strifler, Lisa; Cardoso, Roberta; McGowan, Jessie; Cogo, Elise; Nincic, Vera; Khan, Paul A; Scott, Alistair; Ghassemi, Marco; MacDonald, Heather; Lai, Yonda; Treister, Victoria; Tricco, Andrea C; Straus, Sharon E

    2018-04-13

    To conduct a scoping review of knowledge translation (KT) theories, models and frameworks that have been used to guide dissemination or implementation of evidence-based interventions targeted to prevention and/or management of cancer or other chronic diseases. We used a comprehensive multistage search process from 2000-2016, which included traditional bibliographic database searching, searching using names of theories, models and frameworks, and cited reference searching. Two reviewers independently screened the literature and abstracted data. We found 596 studies reporting on the use of 159 KT theories, models or frameworks. A majority (87%) of the identified theories, models or frameworks were used in five or fewer studies, with 60% used once. The theories, models and frameworks were most commonly used to inform planning/design, implementation and evaluation activities, and least commonly used to inform dissemination and sustainability/scalability activities. Twenty-six were used across the full implementation spectrum (from planning/design to sustainability/scalability) either within or across studies. All were used for at least individual-level behavior change, while 48% were used for organization-level, 33% for community-level and 17% for system-level change. We found a significant number of KT theories, models and frameworks with a limited evidence base describing their use. Copyright © 2018. Published by Elsevier Inc.

  11. Extreme fluxes in solar energetic particle events: Methodological and physical limitations

    International Nuclear Information System (INIS)

    Miroshnichenko, L.I.; Nymmik, R.A.

    2014-01-01

    In this study, all available data on the largest solar proton events (SPEs), or extreme solar energetic particle (SEP) events, for the period from 1561 up to now are analyzed. Under consideration are the observational, methodological and physical problems of energy-spectrum presentation for SEP fluxes (fluences) near the Earth's orbit. Special attention is paid to the study of the distribution function for extreme fluences of SEPs by their sizes. The authors present advances in at least three aspects: 1) a form of the distribution function that was previously obtained from the data for three cycles of solar activity has been completely confirmed by the data for 41 solar cycles; 2) early estimates of extremely large fluences in the past have been critically revised, and their values were found to be overestimated; and 3) extremely large SEP fluxes are shown to obey a probabilistic distribution, so the concept of an “upper limit flux” does not carry any strict physical sense although it serves as an important empirical restriction. SEP fluxes may only be characterized by the relative probabilities of their appearance, and there is a sharp break in the spectrum in the range of large fluences (or low probabilities). It is emphasized that modern observational data and methods of investigation do not allow, for the present, the precise resolution of the problem of the spectrum break or the estimation of the maximum potentialities of solar accelerator(s). This limitation considerably restricts the extrapolation of the obtained results to the past and future for application to the epochs with different levels of solar activity. - Highlights: • All available data on the largest solar proton events (SPEs) are analyzed. • Distribution function obtained for 3 last cycles is confirmed for 41 solar cycles. • Estimates of extremely large fluences in the past are found to be overestimated. • Extremely large SEP fluxes are shown to obey a probabilistic distribution.

  12. Quantification of the least limiting water range in an oxisol using two methodological strategies

    Directory of Open Access Journals (Sweden)

    Wagner Henrique Moreira

    2014-12-01

    Full Text Available The least limiting water range (LLWR) has been used as an indicator of soil physical quality as it represents, in a single parameter, the soil physical properties directly linked to plant growth, with the exception of temperature. The usual procedure for obtaining the LLWR involves determination of the water retention curve (WRC) and the soil resistance to penetration curve (SRC) in soil samples with undisturbed structure in the laboratory. Determination of the WRC and SRC using field measurements (in situ) is preferable, but requires appropriate instrumentation. The objective of this study was to determine the LLWR from the data collected for determination of the WRC and SRC in situ using portable electronic instruments, and to compare those determinations with the ones made in the laboratory. Samples were taken from the 0.0-0.1 m layer of a Latossolo Vermelho distrófico (Oxisol). Two methods were used for quantification of the LLWR: the traditional, with measurements made in soil samples with undisturbed structure; and in situ, with measurements of water content (θ), soil water potential (Ψ), and soil resistance to penetration (SR) through the use of sensors. The in situ measurements of θ, Ψ and SR were taken over a period of four days of soil drying. At the same time, samples with undisturbed structure were collected for determination of bulk density (BD). Due to the limitations of measurement of Ψ by tensiometer, additional determinations of θ were made with a psychrometer (in the laboratory) at the Ψ of -1500 kPa. The results show that it is possible to determine the LLWR by the θ, Ψ and SR measurements using the suggested approach and instrumentation. The quality of fit of the SRC was similar in both strategies. In contrast, the θ and Ψ in situ measurements, associated with those measured with a psychrometer, produced a better WRC description. The estimates of the LLWR were similar in both methodological strategies. The quantification of
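    Once the four water-content limits are known, the LLWR computation itself reduces to intersecting them. A minimal sketch under assumed, illustrative values (the function name and the numbers are hypothetical, not the study's data):

    ```python
    def llwr(theta_fc, theta_afp, theta_wp, theta_sr):
        """Least limiting water range: the window between the most restrictive
        upper limit (field capacity vs. air-filled porosity) and the most
        restrictive lower limit (wilting point vs. critical penetration
        resistance), all as volumetric water contents."""
        upper = min(theta_fc, theta_afp)
        lower = max(theta_wp, theta_sr)
        return max(upper - lower, 0.0)  # zero when the limits cross

    # Illustrative volumetric water contents (m3/m3)
    print(round(llwr(theta_fc=0.32, theta_afp=0.34, theta_wp=0.18, theta_sr=0.21), 3))  # 0.11
    ```

    Whether the limiting values come from laboratory curves or in situ sensors, as compared in the study, only changes how the four inputs are estimated, not this final step.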

  13. FCγ Chimeric Receptor-Engineered T Cells: Methodology, Advantages, Limitations, and Clinical Relevance

    Directory of Open Access Journals (Sweden)

    Giuseppe Sconocchia

    2017-04-01

    . Third, the off-target effect of CD16-CR T cells may be controlled by withdrawing the mAb administration. The goal of this manuscript was threefold. First, we review the current state-of-the-art of preclinical CD16-CR T cell technology. Second, we describe its in vitro and in vivo antitumor activity. Finally, we compare the advantages and limitations of the CD16-CR T cell technology with those of CAR T cell methodology.

  14. Evidence that DNA excision-repair in xeroderma pigmentosum group A is limited but biologically significant

    International Nuclear Information System (INIS)

    Hull, D.R.; Kantor, G.J.

    1983-01-01

    The loss of pyrimidine dimers in nondividing populations of an excision-repair deficient xeroderma pigmentosum group A strain (XP12BE) was measured throughout long periods (up to 5 months) following exposure to low doses of ultraviolet light (UV, 254 nm) using a UV endonuclease-alkaline sedimentation assay. Excision of about 90% of the dimers induced by 1 J/m² occurred during the first 50 days. The rate curve has some similarities with that of normal excision-repair proficient cultures that may not be coincidental. Rate curves for both XP12BE and normal cultures are characterized by a fast and slow component, with both rate constants for the XP12BE cultures (0.15 day⁻¹ and 0.025 day⁻¹) a factor of 10 smaller than those observed for the respective components of normal cell cultures. The slow components for both XP12BE and normal cultures extrapolate to about 30% of the initial number of dimers. No further excision was detected throughout an additional 90-day period even though the cultures were capable of excision-repair of other newly-introduced pyrimidine dimers. We conclude that nondividing XP12BE cells, in addition to having a slower repair rate, cannot repair some of the UV-induced DNA damage. The repair in XP12BE is shown to have biological significance as detected by a cell-survival assay and dose-fractionation techniques. Nondividing XP12BE cells are more resistant to UV when irradiated chronically than when irradiated acutely with the same total dose. (orig.)

  15. Methodologies for the practical determination and use of method detection limits

    International Nuclear Information System (INIS)

    Rucker, T.L.

    1995-01-01

    Method detection limits have often been misunderstood and misused. The basic definitions developed by Lloyd Currie and others have been combined with assumptions that are inappropriate for many types of radiochemical analyses. A practical way of determining detection limits based on Currie's basic definition is presented that removes the reliance on those assumptions and that accounts for the total measurement uncertainty. Examples of proper and improper use of detection limits are also presented, including detection limits reported by commercial software for gamma spectroscopy and neutron activation analyses. (author) 6 refs.; 2 figs
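    Currie's basic definition referenced here distinguishes the critical level L_C (the decision threshold) from the a-priori detection limit L_D. For paired blank counting with Poisson statistics and α = β = 0.05, the familiar closed forms are sketched below; this illustrates the standard textbook formulation, not the author's specific procedure:

    ```python
    import math

    def currie_limits(blank_counts):
        """Critical level L_C and a-priori detection limit L_D, in net counts,
        for a paired blank with Poisson statistics and alpha = beta = 0.05."""
        sigma_b = math.sqrt(2.0 * blank_counts)  # SD of the net (sample - blank) count
        l_c = 1.645 * sigma_b                    # decision threshold
        l_d = 2.71 + 3.29 * sigma_b              # a-priori detection limit
        return l_c, l_d

    l_c, l_d = currie_limits(100.0)
    print(round(l_c, 1), round(l_d, 1))  # 23.3 49.2
    ```

    The abstract's point is that these closed forms rest on assumptions (e.g., well-behaved blank statistics) that do not hold for many radiochemical analyses, which is why the author argues for an approach based on the total measurement uncertainty.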

  16. Interest and limits of the six sigma methodology in medical laboratory.

    Science.gov (United States)

    Scherrer, Florian; Bouilloux, Jean-Pierre; Calendini, Ors'Anton; Chamard, Didier; Cornu, François

    2017-02-01

    The mandatory accreditation of clinical laboratories in France provides an incentive to develop real tools to measure the performance of management methods and to optimize the management of internal quality controls. Six sigma methodology is an approach commonly applied to software quality management and discussed in numerous publications. This paper discusses the primary factors that influence the sigma index (the choice of the total allowable error, the approach used to address bias) and compares the performance of different analyzers on the basis of the sigma index. The six sigma strategy can be applied to the policy management of internal quality control in a laboratory; a comparison of four analyzers demonstrates that there is no single superior analyzer in clinical chemistry. Similar sigma results are obtained whether the approach toward bias is based on the EQAS or on the IQC. The main difficulty in using the six sigma methodology lies in the absence of official guidelines for the definition of the acceptable total error. Despite this drawback, our comparison study suggests that difficulties with defined analytes do not vary with the analyzer used.
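    The sigma index discussed here is conventionally computed from the total allowable error (TEa), the bias, and the imprecision (CV), all in percent. A minimal sketch with illustrative numbers (the TEa, bias, and CV values are hypothetical, and the choice of each is exactly the sensitivity the paper examines):

    ```python
    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """Sigma index for an IQC level: (TEa - |bias|) / CV, all in percent.
        Higher sigma means more room between analytical error and the allowable
        total error; ~6 is conventionally 'world class', <3 is poor."""
        return (tea_pct - abs(bias_pct)) / cv_pct

    # e.g. TEa = 10%, bias = 1.5% (estimated from EQAS or IQC), CV = 2%
    print(round(sigma_metric(10.0, 1.5, 2.0), 2))  # 4.25
    ```

    Because TEa appears in the numerator, the absence of official guidelines for choosing it, noted in the abstract, directly shifts the sigma value and hence the apparent analyzer ranking.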

  17. Unsolicited written narratives as a methodological genre in terminal illness: challenges and limitations.

    Science.gov (United States)

    O'Brien, Mary R; Clark, David

    2012-02-01

    Stories about illness have proven invaluable in helping health professionals understand illness experiences. Such narratives have traditionally been solicited by researchers through interviews and the collection of personal writings, including diaries. These approaches are, however, researcher driven; the impetus for the creation of the story comes from the researcher and not the narrator. In recent years there has been exponential growth in illness narratives created by individuals, of their own volition, and made available for others to read in print or as Internet accounts. We sought to determine whether it was possible to identify such material for use as research data to explore the subject of living with the terminal illness amyotrophic lateral sclerosis/motor neuron disease--the contention being that these accounts are narrator driven and therefore focus on issues of greatest importance to the affected person. We encountered and sought to overcome a number of methodological and ethical challenges, which is our focus here.

  18. Phenotypic variance, plasticity and heritability estimates of critical thermal limits depend on methodological context

    DEFF Research Database (Denmark)

    Chown, Steven L.; Jumbam, Keafon R.; Sørensen, Jesper Givskov

    2009-01-01

    used during assessments of critical thermal limits to activity. To date, the focus of work has almost exclusively been on the effects of rate variation on mean values of the critical limits. 2.  If the rate of temperature change used in an experimental trial affects not only the trait mean but also its...... this is the case for critical thermal limits using a population of the model species Drosophila melanogaster and the invasive ant species Linepithema humile. 4.  We found that effects of the different rates of temperature change are variable among traits and species. However, in general, different rates...... of temperature change resulted in different phenotypic variances and different estimates of heritability, presuming that genetic variance remains constant. We also found that different rates resulted in different conclusions regarding the responses of the species to acclimation, especially in the case of L...

  19. Structured Observation of School Administrator Work Activities: Methodological Limitations and Recommendations for Research, Part 1.

    Science.gov (United States)

    Pitner, Nancy J.; Russell, James S.

    1986-01-01

    This paper critically reviews studies of administrator work activities which follow the work of Henry Mintzberg (1973), concentrating on these shortcomings of the method: (1) procedural difficulties in coding; (2) design limitations of classifying activities; (3) inadequate testing of Mintzberg's hypotheses; and (4) failure to explore antecedents…

  20. Proposal for a methodology for the determination of the Detection Limit: Case of bulks detector for the Chlorothalonil

    International Nuclear Information System (INIS)

    Gonzalez G, J.; Parra M, C.M.; Romero R, R.M.

    1998-01-01

    As is usually found in the literature, the Lower Detection Limit for an analytical method is obtained by means of a simple statistical exercise whose value is seldom attained in practice. For this reason a method was designed for achieving a Practical Lower Detection Limit: the lowest concentration of the compound that could be detected, S = 2R, with a probability of at least 0.90. A methodological design was formulated to obtain a Practical Lower Detection Limit that is not time-consuming for the analyst. It is presented using an example and, additionally, its benefits are discussed versus the unattainable values obtained by the traditional use of the standard deviation of a series of measurements of the compound at concentrations near zero, multiplied by the Student's t for a one-tailed test with n-1 degrees of freedom
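    The "traditional use" this abstract argues against is the classical replicate-based computation: the standard deviation of n low-level spike measurements multiplied by the one-tailed Student's t at n−1 degrees of freedom. A sketch with hypothetical replicate data (the t value is hard-coded for n = 7 at the 99% level commonly used for this calculation):

    ```python
    import statistics

    # One-tailed 99% Student's t for n - 1 = 6 degrees of freedom
    T_99_DF6 = 3.143

    def traditional_mdl(spike_replicates):
        """Classical method detection limit: t(n-1, 0.99) * SD of the replicates."""
        return T_99_DF6 * statistics.stdev(spike_replicates)

    reps = [0.021, 0.019, 0.023, 0.020, 0.018, 0.022, 0.021]  # mg/L, hypothetical
    print(round(traditional_mdl(reps), 4))  # ~0.0054
    ```

    The authors' point is that a limit derived this way is rarely achievable in routine practice, motivating their probability-based (P ≥ 0.90) practical limit instead.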

  1. Bringing translation out of the shadows: translation as an issue of methodological significance in cross-cultural qualitative research.

    Science.gov (United States)

    Wong, Josephine Pui-Hing; Poon, Maurice Kwong-Lai

    2010-04-01

    Translation is an integral component of cross-cultural research that has remained invisible. It is commonly assumed that translation is an objective and neutral process, in which the translators are "technicians" in producing texts in different languages. Drawing from the field of translation studies and the findings of a translation exercise conducted with three bilingual Cantonese-English translators, the authors highlight some of the methodological issues about translation in cross-cultural qualitative research. They argue that only by making translation visible and through open dialogue can researchers uncover the richness embedded in the research data and facilitate multiple ways of knowing.

  2. Laboratory methodologies for indicators of iron status: strengths, limitations, and analytical challenges.

    Science.gov (United States)

    Pfeiffer, Christine M; Looker, Anne C

    2017-12-01

    Biochemical assessment of iron status relies on serum-based indicators, such as serum ferritin (SF), transferrin saturation, and soluble transferrin receptor (sTfR), as well as erythrocyte protoporphyrin. These indicators present challenges for clinical practice and national nutrition surveys, and often iron status interpretation is based on the combination of several indicators. The diagnosis of iron deficiency (ID) through SF concentration, the most commonly used indicator, is complicated by concomitant inflammation. sTfR concentration is an indicator of functional ID that is not an acute-phase reactant, but challenges in its interpretation arise because of the lack of assay standardization, common reference ranges, and common cutoffs. It is unclear which indicators are best suited to assess excess iron status. The value of hepcidin, non-transferrin-bound iron, and reticulocyte indexes is being explored in research settings. Serum-based indicators are generally measured on fully automated clinical analyzers available in most hospitals. Although international reference materials have been available for years, the standardization of immunoassays is complicated by the heterogeneity of antibodies used and the absence of physicochemical reference methods to establish "true" concentrations. From 1988 to 2006, the assessment of iron status in NHANES was based on the multi-indicator ferritin model. However, the model did not indicate the severity of ID and produced categorical estimates. More recently, iron status assessment in NHANES has used the total body iron stores (TBI) model, in which the log ratio of sTfR to SF is assessed. Together, sTfR and SF concentrations cover the full range of iron status. The TBI model better predicts the absence of bone marrow iron than SF concentration alone, and TBI can be analyzed as a continuous variable. Additional consideration of methodologies, interpretation of indicators, and analytic standardization is important for further
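    The TBI model referenced above combines sTfR and SF into a single log-ratio index. A sketch of the commonly cited Cook et al. formulation follows; treat the constants as an assumption recalled from that publication rather than the survey's exact implementation (sTfR in mg/L, SF in μg/L):

    ```python
    import math

    def total_body_iron(stfr_mg_l, ferritin_ug_l):
        """Body iron (mg/kg) from the sTfR/SF log ratio (Cook et al. formulation):
        TBI = -(log10(sTfR*1000 / SF) - 2.8229) / 0.1207.
        Positive values estimate storage iron; negative values a tissue deficit."""
        ratio = stfr_mg_l * 1000.0 / ferritin_ug_l
        return -(math.log10(ratio) - 2.8229) / 0.1207

    # e.g. sTfR = 5 mg/L, SF = 30 ug/L (illustrative inputs)
    print(round(total_body_iron(5.0, 30.0), 2))  # ~4.98 mg/kg
    ```

    Because the ratio uses both indicators, the model covers the full range of iron status and yields a continuous variable, which is the advantage over the categorical ferritin model noted in the abstract.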

  3. Comparative risk-benefit-cost effectiveness in nuclear and alternate power sources: methodology, perspective, limitations

    International Nuclear Information System (INIS)

    Vinck, W.; Van Reijen, G.; Maurer, H.; Volta, G.

    1980-01-01

    A critical survey is given of the use of quantitative risk assessment in defining acceptable safety limits, and of its use together with cost-benefit analyses for decision making. The paper indicates uncertainties and even unknowns in risk assessment, in particular when the whole fuel cycle for energy production is considered. It is made clear that risk perception must also be considered in decisions on risk acceptance. A difficult issue here is the potential for low-probability/large-consequence accidents. Examples are given, suggestions for improvement are made and perspectives are outlined

  4. Limited-angle x-ray luminescence tomography: methodology and feasibility study

    International Nuclear Information System (INIS)

    Carpenter, C M; Pratx, G; Sun, C; Xing, L

    2011-01-01

    X-ray luminescence tomography (XLT) has recently been proposed as a new imaging modality for biological imaging applications. This modality utilizes phosphor nanoparticles which emit near-infrared luminescence when excited by x-ray photons. The advantage of this modality is that it uniquely combines the high sensitivity of radioluminescent nanoparticles and the high spatial localization of collimated x-ray beams. Currently, XLT has been demonstrated using x-ray spatial encoding to resolve the imaging volume. However, there are applications where the x-ray excitation may be limited by geometry, where increased temporal resolution is desired, or where a lower dose is mandatory. This paper extends the utility of XLT to meet these requirements by incorporating a photon propagation model into the reconstruction algorithm in an x-ray limited-angle (LA) geometry. This enables such applications as image-guided surgery, where the ability to resolve lesions at depths of several centimeters can be the key to successful resection. The hybrid x-ray/diffuse optical model is first formulated and then demonstrated in a breast-sized phantom, simulating a breast lumpectomy geometry. Both numerical and experimental phantoms are tested, with lesion-simulating objects of various sizes and depths. Results show localization accuracy with median error of 2.2 mm, or 4% of object depth, for small 2-14 mm diameter lesions positioned from 1 to 4.5 cm in depth. This compares favorably with fluorescence optical imaging, which is not able to resolve such small objects at this depth. The recovered lesion size has lower size bias in the x-ray excitation direction than in the optical direction, which is expected due to the increased optical scatter. However, the technique is shown to be quite invariant in recovered size with respect to depth, as the standard deviation is less than 2.5 mm. Sensitivity is a function of dose; radiological doses are found to provide sufficient recovery for μg mL⁻¹

  5. Tolerance limits and methodologies for IMRT measurement-based verification QA: Recommendations of AAPM Task Group No. 218.

    Science.gov (United States)

    Miften, Moyed; Olch, Arthur; Mihailidis, Dimitris; Moran, Jean; Pawlicki, Todd; Molineu, Andrea; Li, Harold; Wijesooriya, Krishni; Shi, Jie; Xia, Ping; Papanikolaou, Nikos; Low, Daniel A

    2018-04-01

    Patient-specific IMRT QA measurements are important components of processes designed to identify discrepancies between calculated and delivered radiation doses. Discrepancy tolerance limits are neither well defined nor consistently applied across centers. The AAPM TG-218 report provides a comprehensive review aimed at improving the understanding and consistency of these processes as well as recommendations for methodologies and tolerance limits in patient-specific IMRT QA. The performance of the dose difference/distance-to-agreement (DTA) and γ dose distribution comparison metrics are investigated. Measurement methods are reviewed and followed by a discussion of the pros and cons of each. Methodologies for absolute dose verification are discussed and new IMRT QA verification tools are presented. Literature on the expected or achievable agreement between measurements and calculations for different types of planning and delivery systems are reviewed and analyzed. Tests of vendor implementations of the γ verification algorithm employing benchmark cases are presented. Operational shortcomings that can reduce the γ tool accuracy and subsequent effectiveness for IMRT QA are described. Practical considerations including spatial resolution, normalization, dose threshold, and data interpretation are discussed. Published data on IMRT QA and the clinical experience of the group members are used to develop guidelines and recommendations on tolerance and action limits for IMRT QA. Steps to check failed IMRT QA plans are outlined. Recommendations on delivery methods, data interpretation, dose normalization, the use of γ analysis routines and choice of tolerance limits for IMRT QA are made with a focus on detecting differences between calculated and measured doses via the use of robust analysis methods and an in-depth understanding of IMRT verification metrics. The recommendations are intended to improve the IMRT QA process and establish consistent and comparable IMRT QA
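    The γ comparison metric analyzed in the report can be illustrated in one dimension: for each measured point, take the minimum over the reference distribution of the combined dose-difference / distance-to-agreement term. This is a simplified global-normalization sketch with assumed 3%/3 mm criteria and no interpolation, unlike a clinical implementation:

    ```python
    import math

    def gamma_1d(ref, meas, dta_mm=3.0, dd_pct=3.0):
        """1-D global gamma: ref and meas are lists of (position_mm, dose) pairs.
        gamma <= 1 at a measured point means it passes the dd_pct/dta_mm criteria."""
        norm = max(d for _, d in ref)          # global normalization dose
        dd_tol = norm * dd_pct / 100.0         # dose tolerance in dose units
        gammas = []
        for xm, dm in meas:
            gammas.append(min(
                math.sqrt(((xm - xr) / dta_mm) ** 2 + ((dm - dr) / dd_tol) ** 2)
                for xr, dr in ref
            ))
        return gammas

    # Hypothetical 4-point profiles (mm, dose)
    ref = [(0.0, 100.0), (1.0, 98.0), (2.0, 95.0), (3.0, 90.0)]
    meas = [(0.0, 101.0), (1.0, 97.0), (2.0, 96.0), (3.0, 91.0)]
    g = gamma_1d(ref, meas)
    print(round(sum(gi <= 1.0 for gi in g) / len(g), 2))  # passing rate: 1.0
    ```

    The operational shortcomings the report describes, such as coarse spatial resolution and low-dose thresholds, enter precisely through this discrete minimization, which can overestimate γ when reference points are sparse.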

  6. Lung scans with significant perfusion defects limited to matching pleural effusions have a low probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Datz, F.L.; Bedont, R.A.; Taylor, A.

    1985-01-01

    Patients with a pleural effusion on chest x-ray often undergo a lung scan to exclude pulmonary embolism (PE). According to other studies, when the scan shows a perfusion defect equal in size to a radiographic abnormality on chest x-ray, the scan should be classified as indeterminate or intermediate probability for PE. However, since those studies dealt primarily with alveolar infiltrates rather than pleural effusions, the authors undertook a retrospective study to determine the probability of PE in patients with pleural effusion and a matching perfusion defect. The authors reviewed 451 scans and x-rays of patients studied for suspected PE. Of those, 53 had moderate or large perfusion defects secondary to pleural effusion without other significant (>25% of a segment) defects on the scan. Final diagnosis was confirmed by pulmonary angiography (16), thoracentesis (40), venography (11), other radiographic and laboratory studies, and clinical course. Of the 53 patients, only 2 had venous thrombotic disease. One patient had PE on pulmonary angiography; the other had thrombophlebitis on venography. The remainder of the patients had effusions due to congestive heart failure (12), malignancy (12), infection (7), trauma (7), collagen vascular disease (7), sympathetic effusion (3) and unknown etiology (3). The authors conclude that lung scans with significant perfusion defects limited to matching pleural effusions on chest x-ray have a low probability for PE

  7. Laparoscopic limited Heller myotomy without anti-reflux procedure does not induce significant long-term gastroesophageal reflux.

    Science.gov (United States)

    Zurita Macías Valadez, L C; Pescarus, R; Hsieh, T; Wasserman, L; Apriasz, I; Hong, D; Gmora, S; Cadeddu, M; Anvari, M

    2015-06-01

    Laparoscopic Heller myotomy with partial fundoplication is the gold standard treatment for achalasia. Laparoscopic limited Heller myotomy (LLHM) with no anti-reflux procedure is another possible option. A review of prospectively collected data was performed on patients who underwent LLHM from January 1998 to December 2012. Evaluation included gastroscopy, esophageal manometry, 24-h pH-metry, and the Short Form (36) Health Survey (SF-36) questionnaire at baseline and 6 months, as well as the global symptom score at baseline, 6 months, and 5 years post-surgery. Comparison between outcomes was performed with a paired Student's t-test. 126 patients underwent LLHM. Of these, 60 patients had complete pre- and post-operative motility studies. 57 % were female; mean patient age was 45.7 years, with a mean follow-up of 10.53 months. Mean operative time was 56.1 min, and the average length of stay was 1.7 days. At 6 months, a significant decrease in the lower esophageal sphincter resting pressure (29.1 vs. 7.1 mmHg; p < 0.001) and nadir (16.4 vs. 4.3 mmHg; p < 0.001) was observed. Normal esophageal acid exposure (total pH < 4 %) was observed in 68.3 % of patients. Nevertheless, of the remaining 31.7 % with abnormal pH-metry, only 21.6 % were clinically symptomatic, and all were properly controlled with medical treatment without requiring anti-reflux surgery. Significant improvement in all pre-operative symptoms was observed at 6 months and maintained over 5 years. Dysphagia score was reduced from 9.8 pre-operatively to 2.6 at 5 years (p < 0.001), heartburn score from 3.82 to 2 (p < 0.01), and regurgitation score from 7.5 to 0.8 (p < 0.001). Only one patient (0.8 %) presented with recurrent dysphagia requiring reoperation. LLHM without anti-reflux procedure is an effective long-term treatment for achalasia and does not cause symptomatic GERD in three quarters of patients. The remaining patients are well controlled on anti-reflux medications. It is believed that similar clinical

  8. Generally applicable limits on intakes of uranium based on its chemical toxicity and the radiological significance of intakes at those limits

    International Nuclear Information System (INIS)

    Thorne, M C; Wilson, J

    2015-01-01

    Uranium is chemically toxic and radioactive, and both considerations have to be taken into account when limiting intakes of the element, in the context of both occupational and public exposures. Herein, the most recent information available on the chemical toxicity and biokinetics of uranium is used to propose new standards for limiting intakes of the element. The approach adopted allows coherent standards to be set for ingestion and inhalation of different chemical forms of the element by various age groups. It also allows coherent standards to be set for occupational and public exposures (including exposures of different age groups) and for various exposure regimes (including short-term and chronic exposures). The proposed standards are more restrictive than those used previously, but are less restrictive than the Minimal Risk Levels proposed recently by the US Agency for Toxic Substances and Disease Registry. Having developed a set of proposed limits based solely on chemical toxicity considerations, the radiological implications of exposure at those proposed limits are investigated for natural, depleted and enriched uranium. (paper)

  9. Evaluating the impact of public space investments with limited time and funds: (methodological) lessons from a Swiss case study

    Energy Technology Data Exchange (ETDEWEB)

    Barjak, F.

    2016-07-01

    The paper suggests a methodology for evaluating innovation support policies and funding in the space sector. Previous evaluations have suggested methodologies which require considerable time and resources. Our approach combines data collection at the organisational level, through standardised interviews, and at the project level, through an online survey; both are relatively quick to implement and less costly. We demonstrate that valid results can be obtained with such an approach. (Author)

  10. Methodology applied by the Petroleum, Natural Gas and Bio fuels National Agency for detection of cartels - their limits and perspectives; Metodology adotada pela Agencia Nacional do Petroleo, Gas Natural e Biocombustiveis para detectecao de carteis - seus limites e perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Pedra, Douglas Pereira; Sales, Orlando de Araujo Vilela; Baran, Patricia Huguenin; Paiva, Rodrigo Milao de [Agencia Nacional do Petroleo, Gas Natural e Biocombustiveis (ANP), Rio de Janeiro, RJ (Brazil). Regulacao de Petroleo, seus Derivados, Alcool Combustivel e Gas Natural; Bicalho, Lucia Maria de Oliveira Navegantes [Universidade Federal do Rio de Janeiro (PPE/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Planejamento Energetico

    2010-07-01

    This paper presents the methodology currently used by the Competition Defense Coordination of the National Petroleum Agency to detect signs of cartels in the Brazilian fuel market, the limits of that methodology given the instruments presently at the Agency's disposal, and the prospects for change arising from the eventual availability of new instruments.

  11. The significance of nitrogen limited condition in the initiation of lipid biosynthesis in Aurantiochytrium sp. SW1

    Science.gov (United States)

    Haladu, Zangoma Maryam; Ibrahim, Izyanti; Hamid, Aidil Abdul

    2018-04-01

    The manner of the onset of lipid synthesis in Aurantiochytrium sp. SW1, as well as the possible role of NAD+-dependent isocitrate dehydrogenase (NAD+:ICDH) in the initiation of lipid biosynthesis, was studied. The initiation of lipid synthesis in this organism was not associated with the cessation of growth, but commenced in the early phase of growth. A substantial amount of lipid (30 %, g/g biomass) was accumulated during the active growth phase at 48 h, with the growth rate decreasing from 0.11 g/L/h during active growth to 0.02 g/L/h in the limited growth phase. At that point the activity of NAD+:ICDH was still detectable, although it decreased slightly from 25 nmol/min/mg at 24 h to 20 nmol/min/mg at 48 h. Analysis of the ammonium sulfate-fractionated NAD+:ICDH showed that the enzyme was not completely dependent on adenosine monophosphate (AMP) for its activity, although the presence of AMP increased the enzyme's affinity towards its substrate (isocitrate), indicated by the low Km value of the enzyme for isocitrate, while citrate acted as an inhibitor of the enzyme only at high concentration. The probable implications of these properties for the regulation of lipid synthesis are discussed.
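The observation that AMP lowers the enzyme's Km (i.e., raises its affinity for isocitrate) can be made concrete with the Michaelis-Menten equation; all parameter values below are invented for illustration and are not taken from the study:

```python
def mm_rate(s, vmax, km):
    """Michaelis-Menten rate: v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

# Hypothetical parameters: AMP lowers Km (higher affinity), Vmax unchanged.
vmax = 25.0                     # nmol/min/mg, same order as the abstract
km_no_amp, km_amp = 2.0, 0.5    # mM, illustrative values only
s = 0.5                         # subsaturating isocitrate concentration, mM
v_no_amp = mm_rate(s, vmax, km_no_amp)
v_amp = mm_rate(s, vmax, km_amp)
print(v_no_amp < v_amp)         # -> True: lower Km gives a higher rate at low [S]
```

At saturating substrate both cases approach the same Vmax; the affinity difference matters only at low substrate concentrations.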

  12. Methodological Tools for the Assessment of Ecological and Socio-Economic Environment in the Region within the Limits of the Sustainability of Biosphere

    Directory of Open Access Journals (Sweden)

    Aleksey Yuryevich Davankov

    2016-12-01

    The article is devoted to the study of the ecological and socio-economic environment and to the development of an effective methodological tool for assessing its stability. This tool makes it possible to ascertain the level of economic activity of the regions within the limits of the sustainability of the biosphere. In the article, the regional system is considered as the totality of industrial enterprises, social infrastructure and natural environment creating a specific territorial ecological and socio-economic environment, whose stability depends on the level of economic activity as measured against the capacity of the territorial ecosystem. The use of a technique for the comparative assessment of the energy indicators of economic activity creating a specific ecological and socio-economic environment of the region, together with an indicator of the ecological capacity of the territory, is substantiated. The ecological capacity of the territory enables a better estimate of the level of sustainability of the region within the limits of the sustainability of the biosphere. This method makes it possible to forecast the development of the studied territory by measuring the general energy flow on the basis of closed material and energy flows. The research revealed an indicator of the sustainability of the ecological and socio-economic environment of the Ural Federal District: the Yamalo-Nenets Autonomous District is the most stable and the Chelyabinsk region the least stable, which is associated with both natural conditions and the specificities of economic structure. The labour productivity indicator, expressed in energy units, revealed regions with rich natural resources. It was found that in these regions there are significant material flows in the electricity industry, which leads to a large proportion of greenhouse gas emissions. The assessment of the demographic capacity fully correlates with the calculations of the stability indicator of the regional system and the analysis of labour

  13. Report on an Assessment of the Application of EPP Results from the Strain Limit Evaluation Procedure to the Prediction of Cyclic Life Based on the SMT Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jetter, R. I. [R. I. Jetter Consulting, Pebble Beach, CA (United States); Messner, M. C. [Argonne National Lab. (ANL), Argonne, IL (United States); Sham, T. -L. [Argonne National Lab. (ANL), Argonne, IL (United States); Wang, Y. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-08-01

    The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT data based approach for creep-fatigue damage evaluation into the EPP methodology to avoid the separate evaluation of creep and fatigue damage and eliminate the requirement for stress classification in current methods; thus greatly simplifying evaluation of elevated temperature cyclic service. This methodology should minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, analytical studies and evaluation of thermomechanical test results continued in FY17. This report presents the results of those studies. An EPP strain limits methodology assessment was based on recent two-bar thermal ratcheting test results on 316H stainless steel in the temperature range of 405 to 705 °C. Strain range predictions from the EPP evaluation of the two-bar tests were also evaluated and compared with the experimental results. The role of sustained primary loading on cyclic life was assessed using the results of pressurized SMT data from tests on Alloy 617 at 950 °C. A viscoplastic material model was used in an analytic simulation of two-bar tests to compare with EPP strain limits assessments using isochronous stress strain curves that are consistent with the viscoplastic material model. A finite element model of a prior 304H stainless steel Oak Ridge National Laboratory (ORNL) nozzle-to-sphere test was developed and used for an EPP strain limits and creep-fatigue code case damage evaluations. A theoretical treatment of a recurring issue with convergence criteria for plastic shakedown illustrated the role of computer machine precision in EPP calculations.

  14. New insights on therapeutic touch: a discussion of experimental methodology and design that resulted in significant effects on normal human cells and osteosarcoma.

    Science.gov (United States)

    Monzillo, Eloise; Gronowicz, Gloria

    2011-01-01

    Our purpose is to discuss the study design and innovative approaches that led to finding significant effects of one energy medicine therapy, Therapeutic Touch (TT), on cells. In the original published studies, TT was shown to significantly increase human osteoblast DNA synthesis, differentiation, and mineralization; to increase in a dose-dependent manner the growth of other human cell types; and to decrease the differentiation and mineralization of a human osteosarcoma-derived cell line. The features of the study's methodology and design that contributed to the success of the findings were as follows. (1) A basic level of skill and maturity of the TT practitioner was quantified, in a test administered to all TT practitioners, for producing observable and replicable outcomes; only those practitioners who passed the test were selected for the study. (2) The practitioners were required to keep a journal, which appeared to promote their ability to stay centered and to replicate their treatments over months of cell experimentation. (3) The origin of the cells the practitioners were treating was explained to them, although they were blinded to cell type during the experiments. (4) Only early-passage cells were used, to maintain a stable cell phenotype. (5) Standard protocols for performing TT in the room were followed to ensure reproducible conditions. (6) Placebo controls and untreated controls were used for each experiment. (7) The principal investigator and the technicians performing the assays were blinded to the experimental groups, and all assays and procedures were well established in the laboratory prior to the start of the TT experiments. The absence of studies on the human biofield from mainstream scientific literature is also discussed by describing the difficulties encountered in publishing. These roadblocks contribute to our lack of understanding of the human biofield and energy medicine modalities in science. In conclusion, this report seeks to encourage well

  15. Review of IAEA recommendations on the principles and methodologies for limiting releases of radioactive effluents to the environment

    International Nuclear Information System (INIS)

    Ahmed, J.U.

    1988-01-01

    The limitation of radioactive releases is governed by the basic principles of radiation protection as presented in ICRP Publication No. 26 and IAEA Safety Series No. 9. Under its current programme on release limitation, the IAEA issued Safety Series No. 77 on principles for release limitation and Safety Series No. 67 on protection against transboundary radiation exposures. A Safety Guide on global upper bounds is now nearly ready for publication, and, to guide the application of Safety Series No. 77, four documents are in various stages of completion

  16. Double contrast barium enema: technique, indications, results and limitations of a conventional imaging methodology in the MDCT virtual endoscopy era.

    Science.gov (United States)

    Rollandi, Gian Andrea; Biscaldi, Ennio; DeCicco, Enzo

    2007-03-01

    The double contrast barium enema of the colon remains a widespread conventional radiological technique and allows the diagnosis of neoplastic and inflammatory pathology. From the 1970s onward, a major effort was undertaken to simplify, perfect and codify the method of the double contrast barium enema: Altaras from Germany, Miller from the USA and Cittadini from Italy are responsible for the perfection of this technique over the last 30 years. Tailored patient preparation, a perfect technique of execution and precise radiological documentation are essential steps for obtaining a reliable examination. The main limitation of the double contrast enema is that it assesses pathology only from the mucosal surface. In the evaluation of neoplastic pathology the main limitation concerns staging of the "T" parameter, and evaluation of the "N" and "M" parameters is even more limited. Today the double contrast technique remains a refined, sensitive and specific diagnostic method; however, its diagnostic results cannot compete with the new multislice CT techniques (CT-enteroclysis and virtual colonoscopy), which can examine both the lumen and the wall of the colon. The double contrast enema is a cheap and simple examination, but in the near future a progressive replacement of conventional radiology by the new multislice techniques is to be expected, because cross-sectional imaging is more frequently able to detect the causes of symptoms, whether of colonic or non-colonic origin.

  17. Double contrast barium enema: Technique, indications, results and limitations of a conventional imaging methodology in the MDCT virtual endoscopy era

    International Nuclear Information System (INIS)

    Rollandi, Gian Andrea; Biscaldi, Ennio; DeCicco, Enzo

    2007-01-01

    The double contrast barium enema of the colon remains a widespread conventional radiological technique and allows the diagnosis of neoplastic and inflammatory pathology. From the 1970s onward, a major effort was undertaken to simplify, perfect and codify the method of the double contrast barium enema: Altaras from Germany, Miller from the USA and Cittadini from Italy are responsible for the perfection of this technique over the last 30 years. Tailored patient preparation, a perfect technique of execution and precise radiological documentation are essential steps for obtaining a reliable examination. The main limitation of the double contrast enema is that it assesses pathology only from the mucosal surface. In the evaluation of neoplastic pathology the main limitation concerns staging of the 'T' parameter, and evaluation of the 'N' and 'M' parameters is even more limited. Today the double contrast technique remains a refined, sensitive and specific diagnostic method; however, its diagnostic results cannot compete with the new multislice CT techniques (CT-enteroclysis and virtual colonoscopy), which can examine both the lumen and the wall of the colon. The double contrast enema is a cheap and simple examination, but in the near future a progressive replacement of conventional radiology by the new multislice techniques is to be expected, because cross-sectional imaging is more frequently able to detect the causes of symptoms, whether of colonic or non-colonic origin

  18. Methodology for the identification of significant environmental aspects of oil refining process; Metodologia para identifcacao de aspectos ambientais significativos nos processos de refino de petroleo

    Energy Technology Data Exchange (ETDEWEB)

    Ugaya, Cassia Maria Lie; Henschel, Jefferson [Centro Federal de Educacao Tecnologica do Parana (CEFET-PR), Curitiba, PR (Brazil)

    2004-07-01

    The oil-producing sector has been developing several actions to reduce its environmental impacts; an example is the implementation of Environmental Management Systems (EMS). An EMS requires the identification of the environmental aspects which are significant, but it does not specify a methodology for doing so. Some authors suggest Life Cycle Assessment (LCA), as it is scientific, rigorous and reproducible. Life Cycle Assessment aims at reducing the environmental impacts generated by a product, process or service, from the extraction of natural resources to final disposal. This work seeks to adapt this methodology to the demands, needs and inherent reality of an oil refinery. An analysis of the natural resource inputs and of the residues of the most common pollutants for each process has been carried out. The resulting equations were then gathered into a computer program based on Delphi, which is being tested. (author)

  19. Bandwidth based methodology for designing a hybrid energy storage system for a series hybrid electric vehicle with limited all electric mode

    Science.gov (United States)

    Shahverdi, Masood

    The cost and fuel economy of hybrid electric vehicles (HEVs) are significantly dependent on the power-train energy storage system (ESS). A series HEV with a minimal all-electric mode (AEM) permits minimizing the size and cost of the ESS. This manuscript, pursuing the minimal-size tactic, introduces a bandwidth-based methodology for designing an efficient ESS. First, for a mid-size reference vehicle, a parametric study is carried out over various minimal-size ESSs, both hybrid (HESS) and non-hybrid (ESS), to find the highest fuel economy. The results show that a specific type of high-power battery with 4.5 kWh capacity can be selected as the winning candidate for further minimization. In a second study, following the twin goals of maximizing fuel economy (FE) and improving consumer acceptance, a sports-car-class series HEV (SHEV) was considered as a potential application requiring even more ESS minimization. The challenge with this vehicle is to reduce the ESS size below 4.5 kWh, because the available space allocation is, by volume, only one fourth of the battery size allowed in the mid-size study. Therefore, an advanced bandwidth-based controller is developed that allows a hybridized Subaru BRZ model to be realized with a light ESS; the result is a SHEV with 1.13 kWh ESS capacity. In a third study, the objective is to find optimum SHEV designs under the minimal-AEM assumption which cover the design space between the fuel economies of the mid-size car study and the sports car study. Maximizing FE while minimizing ESS cost is more aligned with customer acceptance in the current state of the market. The techniques applied to manage the power flow between the energy sources of the power-train significantly affect the results of this optimization. A Pareto frontier, including ESS cost and FE, for a SHEV with limited AEM is introduced using an advanced bandwidth-based control strategy teamed with duty-ratio control. This controller
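The abstract does not spell out the controller internals. One common reading of a "bandwidth-based" strategy, sketched hypothetically below, splits the demanded power by frequency with a low-pass filter so that a slow source covers the low-frequency share and the battery covers the fast remainder; the filter time constant and the demand trace are invented:

```python
import numpy as np

def bandwidth_split(p_demand, dt, tau):
    """Split a power-demand trace into a low-frequency share (slow source)
    and a high-frequency remainder (battery) using a discrete first-order
    low-pass filter with time constant tau."""
    alpha = dt / (tau + dt)              # discrete filter coefficient
    p_low = np.empty_like(p_demand)
    p_low[0] = p_demand[0]
    for k in range(1, len(p_demand)):
        p_low[k] = p_low[k - 1] + alpha * (p_demand[k] - p_low[k - 1])
    return p_low, p_demand - p_low       # (slow component, fast component)

t = np.arange(0, 60, 0.1)                                        # seconds
demand = 20 + 5 * np.sin(2 * np.pi * t / 30) + 2 * np.sin(2 * np.pi * 2 * t)  # kW, synthetic
slow, fast = bandwidth_split(demand, dt=0.1, tau=5.0)
print(np.allclose(slow + fast, demand))  # -> True: the shares sum to the demand
```

The cut-off (set by tau) is the design knob: a longer tau pushes more of the transient power into the battery, which drives the ESS sizing trade-off discussed above.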

  20. Potential and limitations of X-Ray micro-computed tomography in arthropod neuroanatomy: A methodological and comparative survey

    Science.gov (United States)

    Sombke, Andy; Lipke, Elisabeth; Michalik, Peter; Uhl, Gabriele; Harzsch, Steffen

    2015-01-01

    Classical histology or immunohistochemistry combined with fluorescence or confocal laser scanning microscopy are common techniques in arthropod neuroanatomy, and these methods often require time-consuming and difficult dissections and sample preparations. Moreover, these methods are prone to artifacts due to compression and distortion of tissues, which often result in information loss and especially affect the spatial relationships of the examined parts of the nervous system in their natural anatomical context. Noninvasive approaches such as X-ray micro-computed tomography (micro-CT) can overcome such limitations and have been shown to be a valuable tool for understanding and visualizing internal anatomy and structural complexity. Nevertheless, knowledge about the potential of this method for analyzing the anatomy and organization of nervous systems, especially of taxa with smaller body size (e.g., many arthropods), is limited. This study set out to analyze the brains of selected arthropods with micro-CT, and to compare these results with available histological and immunohistochemical data. Specifically, we explored the influence of different sample preparation procedures. Our study shows that micro-CT is highly suitable for analyzing arthropod neuroarchitecture in situ and allows specific neuropils to be distinguished within the brain to extract quantitative data such as neuropil volumes. Moreover, data acquisition is considerably faster compared with many classical histological techniques. Thus, we conclude that micro-CT is highly suitable for targeting neuroanatomy, as it reduces the risk of artifacts and is faster than classical techniques. J. Comp. Neurol. 523:1281–1295, 2015. © 2015 Wiley Periodicals, Inc. PMID:25728683

  1. Atom counting in HAADF STEM using a statistical model-based approach: methodology, possibilities, and inherent limitations.

    Science.gov (United States)

    De Backer, A; Martinez, G T; Rosenauer, A; Van Aert, S

    2013-11-01

    In the present paper, a statistical model-based method to count the number of atoms of monotype crystalline nanostructures from high resolution high-angle annular dark-field (HAADF) scanning transmission electron microscopy (STEM) images is discussed in detail together with a thorough study on the possibilities and inherent limitations. In order to count the number of atoms, it is assumed that the total scattered intensity scales with the number of atoms per atom column. These intensities are quantitatively determined using model-based statistical parameter estimation theory. The distribution describing the probability that intensity values are generated by atomic columns containing a specific number of atoms is inferred on the basis of the experimental scattered intensities. Finally, the number of atoms per atom column is quantified using this estimated probability distribution. The number of atom columns available in the observed STEM image, the number of components in the estimated probability distribution, the width of the components of the probability distribution, and the typical shape of a criterion to assess the number of components in the probability distribution directly affect the accuracy and precision with which the number of atoms in a particular atom column can be estimated. It is shown that single atom sensitivity is feasible taking the latter aspects into consideration. © 2013 Elsevier B.V. All rights reserved.
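The core assumption of the method is that the total scattered intensity scales with the number of atoms in a column. The toy sketch below illustrates only that scaling step, with an assumed, known per-atom increment and invented numbers; the actual method estimates the component locations with a Gaussian mixture model and selects the number of components with an order-selection criterion, which is omitted here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic column intensities: n atoms -> intensity ~ n * mu plus noise.
mu, sigma = 1.0, 0.08                    # per-atom increment and noise (invented)
true_counts = rng.integers(1, 6, size=500)
intensities = true_counts * mu + rng.normal(0.0, sigma, size=500)

# Simplified estimator: with the per-atom increment assumed known,
# assign each column the nearest whole number of atoms.
est_counts = np.rint(intensities / mu).astype(int)
accuracy = np.mean(est_counts == true_counts)
print(accuracy > 0.95)                   # -> True for well-separated components
```

As the abstract notes, precision degrades when the component width (sigma) grows relative to the per-atom spacing, which is exactly when the full mixture-model machinery becomes necessary.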

  2. Calculation of limits for significant unidirectional changes in two or more serial results of a biomarker based on a computer simulation model

    DEFF Research Database (Denmark)

    Lund, Flemming; Petersen, Per Hyltoft; Fraser, Callum G

    2015-01-01

    BACKGROUND: Reference change values (RCVs) were introduced more than 30 years ago and provide objective tools for assessment of the significance of differences in two consecutive results from an individual. However, in practice, more results are usually available for monitoring, and the RCV concept needs to be extended to series of results. Using simulated data from healthy individuals, series of up to 20 results from an individual were generated with different values for the within-subject biological variation plus the analytical variation. Each new result in a series was compared to the initial measurement result. Limits for significant unidirectional changes in these successive serial results can be obtained from the presented factors: the first result is multiplied by the appropriate factor for increase or decrease, which gives the limits for a significant difference.
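For background, the classical two-result RCV referred to above has a standard textbook form; the sketch below shows only that formula (the paper's factors for longer series are not reproduced in the abstract):

```python
import math

def rcv(cv_a, cv_i, z=1.96):
    """Classical two-sided 95% reference change value, in percent:
    RCV = sqrt(2) * z * sqrt(CVa^2 + CVi^2),
    where CVa is the analytical and CVi the within-subject biological CV."""
    return math.sqrt(2) * z * math.sqrt(cv_a**2 + cv_i**2)

# Example: analytical CV 3%, within-subject biological CV 4%
print(round(rcv(3.0, 4.0), 1))  # -> 13.9
```

A difference between two consecutive results exceeding ~13.9% of the first would then be flagged as significant at the chosen probability; the paper's point is that this two-result rule needs modified factors when many serial results are compared to the first.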

  3. Simple serum markers for significant liver inflammation in chronic hepatitis B patients with an alanine aminotransferase level lower than 2 times upper limit of normal

    Directory of Open Access Journals (Sweden)

    LI Qiang

    2016-06-01

    Objective: To investigate simple serum markers for significant liver inflammation in chronic hepatitis B (CHB) patients with an alanine aminotransferase (ALT) level of <2 times the upper limit of normal (ULN). Methods: The clinical data of 278 CHB patients with ALT <2×ULN (ULN=40 U/L) were analyzed retrospectively. Significant liver inflammation was defined as a liver inflammatory activity grade (G) ≥2. The t-test was used for comparison of normally distributed continuous data between groups, and the Kruskal-Wallis rank sum test was used for non-normally distributed continuous data; the chi-square test was used for comparison of categorical data between groups. Multivariate logistic regression analysis was used to identify independent predictors of significant liver inflammation in CHB patients with ALT <2×ULN. The receiver operating characteristic (ROC) curve was used to evaluate the diagnostic value of serum markers for significant liver inflammation. Results: Of the 278 CHB patients enrolled, 175 (62.9%) had no significant liver inflammation (G0-1 group) and 103 (37.1%) had significant liver inflammation (G2-4 group). There were significant differences between the two groups in ALT, aspartate aminotransferase, alkaline phosphatase, gamma-glutamyl transpeptidase (GGT), albumin, globulin, prothrombin time (PT), platelet count, absolute neutrophil count, hyaluronic acid (HA), glycocholic acid, type Ⅲ procollagen, and collagen type Ⅳ (ⅣC) (all P<0.05). The multivariate regression analysis showed that GGT, PT, ⅣC, and HA were independent predictors of significant liver inflammation in CHB patients with ALT <2×ULN (OR=1.015, 1.600, 1.151, and 1.014; P=0.008, 0.021, 0.003, and 0.018). The areas under the ROC curve for GGT, PT, ⅣC, and HA in diagnosing significant liver inflammation were 0.804, 0.722, 0.707, and 0.632, respectively. The cut-off value of 48.5 U/L for GGT to predict significant liver inflammation had a specificity of 90.3% and a negative

  4. Medico-economic evaluation of healthcare products. Methodology for defining a significant impact on French health insurance costs and selection of benchmarks for interpreting results.

    Science.gov (United States)

    Dervaux, Benoît; Baseilhac, Eric; Fagon, Jean-Yves; Biot, Claire; Blachier, Corinne; Braun, Eric; Debroucker, Frédérique; Detournay, Bruno; Ferretti, Carine; Granger, Muriel; Jouan-Flahault, Chrystel; Lussier, Marie-Dominique; Meyer, Arlette; Muller, Sophie; Pigeon, Martine; De Sahb, Rima; Sannié, Thomas; Sapède, Claudine; Vray, Muriel

    2014-01-01

    Decree No. 2012-1116 of 2 October 2012 on the medico-economic assignments of the French National Authority for Health (Haute autorité de santé, HAS) significantly alters the conditions for accessing the health products market in France. This paper presents a theoretical framework for interpreting the results of the economic evaluation of health technologies and summarises the facts available in France for developing benchmarks that will be used to interpret incremental cost-effectiveness ratios. The literature review shows that it is difficult to determine a threshold value, but that it is also difficult to interpret incremental cost-effectiveness ratio (ICER) results without one. In this context, round table participants favour a pragmatic approach based on "benchmarks" as opposed to a threshold value, based on an interpretative and normative perspective, i.e. benchmarks that can change over time based on feedback. © 2014 Société Française de Pharmacologie et de Thérapeutique.

  5. A New Methodology for Open Pit Slope Design in Karst-Prone Ground Conditions Based on Integrated Stochastic-Limit Equilibrium Analysis

    Science.gov (United States)

    Zhang, Ke; Cao, Ping; Ma, Guowei; Fan, Wenchen; Meng, Jingjing; Li, Kaihui

    2016-07-01

    Using the Chengmenshan Copper Mine as a case study, a new methodology for open pit slope design in karst-prone ground conditions is presented based on integrated stochastic-limit equilibrium analysis. The numerical modeling and optimization design procedure comprises drill core data collection, karst cave stochastic model generation, SLIDE simulation and bisection-method optimization. Borehole investigations were performed, and the statistical results show that the length of the karst caves fits a negative exponential distribution model, whereas the length of carbonatite does not exactly follow any standard distribution. The inverse transform method and the acceptance-rejection method are used to reproduce the lengths of the karst caves and of the carbonatite, respectively. A code for karst cave stochastic model generation, named KCSMG, is developed. The stability of the rock slope containing the stochastic karst cave model is analyzed by combining the KCSMG code with the SLIDE program. This approach is then applied to study the effect of karst caves on the stability of the open pit slope, and a procedure to optimize the open pit slope angle is presented.
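The inverse transform step mentioned for the negative-exponential cave lengths can be sketched as follows; the mean length used here is a made-up value, not data from the mine:

```python
import math
import random

def sample_cave_length(mean_length, u=None):
    """Inverse-transform sampling from a negative exponential distribution:
    F(x) = 1 - exp(-x / mean)  =>  x = -mean * ln(1 - u),  u ~ U(0, 1)."""
    if u is None:
        u = random.random()
    return -mean_length * math.log(1.0 - u)

random.seed(42)
mean = 4.0  # hypothetical mean karst-cave length, m
samples = [sample_cave_length(mean) for _ in range(100_000)]
est_mean = sum(samples) / len(samples)
print(abs(est_mean - mean) < 0.1)  # -> True: sample mean recovers the target
```

The acceptance-rejection method named in the abstract serves the same role for the carbonatite lengths, whose empirical distribution has no closed-form inverse CDF.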

  6. Polyethylene imine/graphene oxide layer-by-layer surface functionalization for significantly improved limit of detection and binding kinetics of immunoassays on acrylate surfaces.

    Science.gov (United States)

    Miyazaki, Celina M; Mishra, Rohit; Kinahan, David J; Ferreira, Marystela; Ducrée, Jens

    2017-10-01

    Antibody immobilization on polymeric substrates is a key manufacturing step for microfluidic devices that implement sample-to-answer automation of immunoassays. In this work, a simple and versatile method to bio-functionalize poly(methylmethacrylate) (PMMA), a common material of such "Lab-on-a-Chip" systems, is proposed; using the Layer-by-Layer (LbL) technique, we assemble nanostructured thin films of poly(ethylene imine) (PEI) and graphene oxide (GO). The wettability of PMMA surfaces was significantly augmented by the surface treatment with the (PEI/GO)5 film, with an 81% reduction of the contact angle, while the surface roughness increased by 600%, thus clearly enhancing wettability and antibody binding capacity. When applied to enzyme-linked immunosorbent assays (ELISAs), the limit of detection was notably improved from 340 pg/mL on commercial-grade polystyrene (PS) and 230 pg/mL on plain PMMA surfaces to 130 pg/mL on (PEI/GO)5-treated PMMA. Furthermore, the accelerated antibody adsorption kinetics on the LbL films of GO allowed incubation times to be substantially shortened, e.g. for anti-rat IgG adsorption from 2 h on conventional surfaces down to 15 min on treated ones. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Isotopic dilution methods to determine the gross transformation rates of nitrogen, phosphorus, and sulfur in soil: a review of the theory, methodologies, and limitations

    International Nuclear Information System (INIS)

    Di, H. J.; Cameron, K. C.; McLaren, R. G.

    2000-01-01

    The rates at which nutrients are released to, and removed from, the mineral nutrient pool are important in regulating the nutrient supply to plants. These nutrient transformation rates need to be taken into account when developing nutrient management strategies for economical and sustainable production. A method that is gaining popularity for determining the gross transformation rates of nutrients in the soil is the isotopic dilution technique. The technique involves labelling a soil mineral nutrient pool, e.g. NH₄⁺, NO₃⁻, PO₄³⁻, or SO₄²⁻, and monitoring the changes with time of the size of the labelled nutrient pool and the excess tracer abundance (atom %, if a stable isotope tracer is used) or specific activity (if a radioisotope is used) in the nutrient pool. Because of the complexity of the concepts and procedures involved, the method has sometimes been used incorrectly, and results misinterpreted. This paper discusses the isotopic dilution technique, including the theoretical background, the methodologies to determine the gross flux rates of nitrogen, phosphorus, and sulfur, and the limitations of the technique. The assumptions, conceptual models, experimental procedures, and confounding factors are discussed. Possible effects on the results of factors such as the uniformity of tracer distribution in the soil, changes in soil moisture content, substrate concentration and aeration status, and the duration of the experiment are also discussed. The influx and out-flux transformation rates derived from this technique are often contributed to by several processes simultaneously, and thus cannot always be attributed to a particular nutrient transformation process. Despite these constraints and possible confounding factors, the technique is a valuable tool that can provide important quantitative information on nutrient dynamics in the soil-plant system. Copyright (2000) CSIRO Publishing
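
    The analytical core of the technique can be illustrated with the classic Kirkham-Bartholomew solution for gross rates, which the isotope-dilution literature builds on. The sketch below assumes uniform tracer distribution, constant rates over the measurement interval, and a changing pool size (M0 ≠ Mt); all numerical values in the test are hypothetical:

```python
import math

def gross_rates(M0, Mt, A0, At, t):
    """Gross influx (e.g. mineralization) and outflux (consumption) rates
    from a single isotope-dilution interval (Kirkham-Bartholomew solution).

    M0, Mt : labelled pool sizes at time 0 and t (e.g. mg N / kg soil)
    A0, At : excess tracer abundance (atom % excess) at time 0 and t
    t      : elapsed time (e.g. days)
    Requires M0 != Mt; assumes uniform labelling and constant rates."""
    influx = (M0 - Mt) / t * math.log(A0 / At) / math.log(M0 / Mt)
    outflux = influx + (M0 - Mt) / t   # consumption closes the mass balance
    return influx, outflux
```

    The influx rate so obtained lumps together all processes adding unlabelled nutrient to the pool, which is exactly the attribution caveat the review raises.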

  8. A methodology for calculating transport emissions in cities with limited traffic data: Case study of diesel particulates and black carbon emissions in Murmansk.

    Science.gov (United States)

    Kholod, N; Evans, M; Gusev, E; Yu, S; Malyshev, V; Tretyakova, S; Barinov, A

    2016-03-15

    This paper presents a methodology for calculating exhaust emissions from on-road transport in cities with low-quality traffic data and outdated vehicle registries. The methodology consists of data collection approaches and emission calculation methods. For data collection, the paper suggests using video survey and parking lot survey methods developed for the International Vehicular Emissions model. Additional sources of information include data from the largest transportation companies, vehicle inspection stations, and official vehicle registries. The paper suggests using the European Computer Programme to Calculate Emissions from Road Transport (COPERT) 4 model to calculate emissions, especially in countries that implemented European emissions standards. If available, the local emission factors should be used instead of the default COPERT emission factors. The paper also suggests additional steps in the methodology to calculate emissions only from diesel vehicles. We applied this methodology to calculate black carbon emissions from diesel on-road vehicles in Murmansk, Russia. The results from Murmansk show that diesel vehicles emitted 11.7 tons of black carbon in 2014. The main factors determining the level of emissions are the structure of the vehicle fleet and the level of vehicle emission controls. Vehicles without controls emit about 55% of black carbon emissions. Copyright © 2015 Elsevier B.V. All rights reserved.
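
    The bottom-up logic of a COPERT-style inventory ultimately reduces to summing vehicle count × annual activity × emission factor over fleet categories. A toy sketch of that arithmetic; the categories, fleet sizes, mileages and black-carbon factors below are invented for illustration and are not the Murmansk data or COPERT defaults:

```python
# category -> (vehicle count, annual km per vehicle, g BC emitted per km)
fleet = {
    "diesel car, Euro 3":     (12000, 15000, 0.030),
    "diesel bus, no control": (  800, 60000, 0.250),
    "diesel truck, Euro 2":   ( 3000, 40000, 0.120),
}

def total_emissions_tonnes(fleet):
    """Bottom-up emission estimate: sum N_i * VKT_i * EF_i over categories,
    converted from grams to tonnes."""
    grams = sum(count * km * ef for count, km, ef in fleet.values())
    return grams / 1e6
```

    The structure makes the abstract's conclusion easy to see: a small category of uncontrolled vehicles with high emission factors can dominate the total.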

  9. Determining significance in social impact assessments (SIA) by applying both technical and participatory approaches: Methodology development and application in a case study of the concentrated solar power plant NOORO I in Morocco

    International Nuclear Information System (INIS)

    Terrapon-Pfaff, Julia; Fink, Thomas; Viebahn, Peter; Jamea, El Mostafa

    2017-01-01

    One of the main objectives of impact assessments is to identify potentially significant impacts. However, determining this significance has received very limited attention as a procedural step in social impact assessments. Consequently, only limited research and documentation exists on approaches, survey tools and evaluation methods, especially with regard to participatory approaches and combined participatory-technical approaches. This study aims to address this research gap by developing and applying a joint participatory and technical impact significance evaluation. The approach is applied in a case study which analysed the livelihood impacts of the large-scale concentrated solar power plant NOORO I in Ouarzazate, Morocco. The analysis shows that although different approaches and significance criteria must be applied when involving both local stakeholders and experts, the linked analysis offers more robust results and an improved basis for decision-making. Furthermore, it was observed in the case study that impacts affecting the social, cultural and political spheres were more often considered significant than impacts affecting the physical and material livelihood dimensions. Regarding sustainability assessments of large-scale renewable energy plants, these findings underline the importance (as for other large-scale infrastructure developments) of placing greater emphasis on the inclusion of social aspects in impact assessments. - Highlights: •Significance evaluation in social impact assessments lacks participatory aspects. •Combined participatory and technical impact significance evaluation developed •Application in a case study to analyse the livelihood impacts of a CSP plant •Involvement of local stakeholders and technical experts results in robust findings. •Social, cultural and political issues are often more relevant than material aspects.

  10. Vector-based RNA interference against vascular endothelial growth factor-A significantly limits vascularization and growth of prostate cancer in vivo.

    Science.gov (United States)

    Wannenes, Francesca; Ciafré, Silvia Anna; Niola, Francesco; Frajese, Gaetano; Farace, Maria Giulia

    2005-12-01

    RNA interference technology is emerging as a very potent tool to obtain a cellular knockdown of a desired gene. In this work we used vector-based RNA interference to inhibit vascular endothelial growth factor (VEGF) expression in prostate cancer in vitro and in vivo. We demonstrated that transduction with a plasmid carrying a small interfering RNA targeting all isoforms of VEGF dramatically impairs the expression of this growth factor in the human prostate cancer cell line PC3. As a consequence, PC3 cells lose their ability to induce one of the fundamental steps of angiogenesis, namely the formation of a tube-like network in vitro. Most importantly, our "therapeutic" vector is able to impair tumor growth rate and vascularization in vivo. We show that a single injection of naked plasmid into the developing neoplastic mass significantly decreases microvessel density in an androgen-refractory prostate xenograft and is able to sustain a long-term slowing down of tumor growth. In conclusion, our results confirm the basic role of VEGF in the angiogenic development of prostate carcinoma, and suggest that the use of our vector-based RNA interference approach to inhibit angiogenesis could be an effective tool in view of future gene therapy applications for prostate cancer.

  11. Brief report on a systematic review of youth violence prevention through media campaigns: Does the limited yield of strong evidence imply methodological challenges or absence of effect?

    Science.gov (United States)

    Cassidy, Tali; Bowman, Brett; McGrath, Chloe; Matzopoulos, Richard

    2016-10-01

    We present a brief report on a systematic review which identified, assessed and synthesized the existing evidence of the effectiveness of media campaigns in reducing youth violence. Search strategies made use of terms for youth, violence and a range of terms relating to the intervention. An array of academic databases and websites was searched. Although media campaigns to reduce violence are widespread, only six studies met the inclusion criteria. There is little strong evidence to support a direct link between media campaigns and a reduction in youth violence. Several studies measure proxies for violence such as empathy or opinions related to violence, but the link between these measures and violence perpetration is unclear. Nonetheless, some evidence suggests that a targeted and context-specific campaign, especially when combined with other measures, can reduce violence. However, such campaigns are less cost-effective to replicate over large populations than generalised campaigns. It is unclear whether the paucity of evidence represents a null effect or methodological challenges with evaluating media campaigns. Future studies need to be carefully planned to accommodate methodological difficulties as well as to identify the specific elements of campaigns that work, especially in low- and middle-income countries. Copyright © 2016 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  12. The normal limits, subclinical significance, related metabolic derangements and distinct biological effects of body site-specific adiposity in relatively healthy population.

    Directory of Open Access Journals (Sweden)

    Chun-Ho Yun

    Full Text Available BACKGROUND: The accumulation of visceral adipose tissue that occurs with normal aging is associated with increased cardiovascular risks. However, the clinical significance, biological effects, and related cardiometabolic derangements of body-site-specific adiposity in a relatively healthy population have not been well characterized. MATERIALS AND METHODS: In this cross-sectional study, we consecutively enrolled 608 asymptomatic subjects (mean age: 47.3 years, 27% female) from 2050 subjects undergoing an annual health survey in Taiwan. We measured pericardial (PCF) and thoracic peri-aortic (TAT) adipose tissue volumes by 16-slice multi-detector computed tomography (MDCT) (Aquarius 3D Workstation, TeraRecon, San Mateo, CA, USA) and related these to clinical characteristics, body fat composition (Tanita 305 Corporation, Tokyo, Japan), coronary calcium score (CCS), serum insulin, high-sensitivity C-reactive protein (Hs-CRP) level and circulating leukocyte count. Metabolic risk was scored by Adult Treatment Panel III guidelines. RESULTS: TAT, PCF, and total body fat composition all increased with aging and higher metabolic scores (all p<0.05). Only TAT, however, was associated with higher circulating leukocyte counts (ß-coef.: 0.24, p<0.05), serum insulin (ß-coef.: 0.17, p<0.05) and high-sensitivity C-reactive protein (ß-coef.: 0.24, p<0.05). These relationships persisted after adjustment in multivariable models (all p<0.05). A TAT volume of 8.29 ml yielded the largest area under the receiver operating characteristic curve (AUROC: 0.79, 95%CI: 0.74-0.83) to identify metabolic syndrome. TAT but not PCF correlated with higher coronary calcium score after adjustment for clinical variables (all p<0.05). CONCLUSION: In our study, we observe that age-related body-site-specific accumulation of adipose tissue may have distinct biological effects. Compared to other adiposity measures, peri-aortic adiposity is more tightly associated with cardiometabolic risk profiles and

  13. HOME Income Limits

    Data.gov (United States)

    Department of Housing and Urban Development — HOME Income Limits are calculated using the same methodology that HUD uses for calculating the income limits for the Section 8 program. These limits are based on HUD...

  14. Methodology for determination of the operational limits in a atmospheric oven as a function of the load characteristics; Metodologia para determinacao dos limites operacionais de um forno atmosferico em funcao das caracteristicas da carga

    Energy Technology Data Exchange (ETDEWEB)

    Mundstock, Rene [Refinaria Alberto Pasqualini (REFAP), Canoas, RS (Brazil); Correa, Eduardo Coelho [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2000-07-01

    The necessity of increasing crude processing with ever longer operation periods is resulting in Atmospheric Distillation Units running under severe conditions, requiring even more rigorous operational procedures. The crude fired heaters are the critical equipment in this context, because they are the most susceptible to failures if certain limits are exceeded. This work originated from the tube failure analysis of the fired heater F-101 of the Alberto Pasqualini Refinery (REFAP), which occurred in July 1998 due to internal coking, with an emergency unit shut-down. During this analysis, it was verified that the operational limit of a fired heater cannot be determined only by its design thermal duty; the influence of the crude composition must also be taken into account. The composition determines, through its film coefficient, the region of the radiation zone of the fired heater where crude vaporization begins and the highest temperatures occur. The work conclusions resulted in the revision of the Unit operational procedures and the definition of new crude processing limits. The fired heater has operated without problems since then. (author)

  15. Cochrane Qualitative and Implementation Methods Group guidance series-paper 3: methods for assessing methodological limitations, data extraction and synthesis, and confidence in synthesized qualitative findings.

    Science.gov (United States)

    Noyes, Jane; Booth, Andrew; Flemming, Kate; Garside, Ruth; Harden, Angela; Lewin, Simon; Pantoja, Tomas; Hannes, Karin; Cargo, Margaret; Thomas, James

    2018-05-01

    The Cochrane Qualitative and Implementation Methods Group develops and publishes guidance on the synthesis of qualitative and mixed-method implementation evidence. Choice of appropriate methodologies, methods, and tools is essential when developing a rigorous protocol and conducting the synthesis. Cochrane authors who conduct qualitative evidence syntheses have thus far used a small number of relatively simple methods to address similarly written questions. Cochrane has invested in methodological work to develop new tools and to encourage the production of exemplar reviews to show the value of more innovative methods that address a wider range of questions. In this paper, in the series, we report updated guidance on the selection of tools to assess methodological limitations in qualitative studies and methods to extract and synthesize qualitative evidence. We recommend application of Grades of Recommendation, Assessment, Development, and Evaluation-Confidence in the Evidence from Qualitative Reviews to assess confidence in qualitative synthesized findings. This guidance aims to support review authors to undertake a qualitative evidence synthesis that is intended to be integrated subsequently with the findings of one or more Cochrane reviews of the effects of similar interventions. The review of intervention effects may be undertaken concurrently with or separate to the qualitative evidence synthesis. We encourage further development through reflection and formal testing. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.
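
    Quantitating an individual exposure from an aberration frequency conventionally means fitting a linear-quadratic calibration curve Y = c + αD + βD² to irradiated samples and inverting it for an observed yield. A sketch of the inversion step; the coefficients here are illustrative placeholders, not a laboratory calibration:

```python
import math

def dose_from_aberration_yield(y, c=0.001, alpha=0.03, beta=0.06):
    """Invert a linear-quadratic calibration Y = c + alpha*D + beta*D**2
    to estimate absorbed dose D (Gy) from an observed aberration yield
    per cell. Coefficients are illustrative, not a real calibration."""
    if y <= c:
        return 0.0   # at or below the spontaneous background frequency
    # Positive root of beta*D**2 + alpha*D - (y - c) = 0
    disc = alpha**2 + 4.0 * beta * (y - c)
    return (-alpha + math.sqrt(disc)) / (2.0 * beta)
```

    The background term c is exactly the "signal to noise" problem the abstract describes: at low doses the radiation-induced excess disappears into the spontaneous frequency.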

  17. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs

  18. Fundamental energy limits of SET-based Brownian NAND and half-adder circuits. Preliminary findings from a physical-information-theoretic methodology

    Science.gov (United States)

    Ercan, İlke; Suyabatmaz, Enes

    2018-06-01

    The saturation in the efficiency and performance scaling of conventional electronic technologies brings about the development of novel computational paradigms. Brownian circuits are among the promising alternatives that can exploit fluctuations to increase the efficiency of information processing in nanocomputing. A Brownian cellular automaton, where signals propagate randomly and are driven by local transition rules, can be made computationally universal by embedding arbitrary asynchronous circuits on it. One of the potential realizations of such circuits is via single electron tunneling (SET) devices, since SET technology enables the simulation of noise and fluctuations in a fashion similar to Brownian search. In this paper, we perform a physical-information-theoretic analysis of the efficiency limitations of Brownian NAND and half-adder circuits implemented using SET technology. The method we employ here establishes a solid ground for studying the computational and physical features of this emerging technology on an equal footing, and yields fundamental lower bounds that provide valuable insights into how far its efficiency can be improved in principle. In order to provide a basis for comparison, we also analyze a NAND gate and half-adder circuit implemented in complementary metal oxide semiconductor technology to show how the fundamental bound of the Brownian circuit compares against a conventional paradigm.
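
    The canonical reference point for fundamental dissipation bounds of this kind is Landauer's principle: erasing one bit of information costs at least k_B·T·ln 2 of energy. The sketch below computes that generic floor; it is a baseline for comparison, not the paper's SET-specific or circuit-specific bound:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (2019 SI exact value)

def landauer_limit(temperature_kelvin, bits_erased):
    """Minimum energy dissipated by logically irreversible erasure:
    E >= k_B * T * ln(2) per bit (Landauer's principle)."""
    return K_B * temperature_kelvin * math.log(2.0) * bits_erased
```

    At room temperature this floor is a few zeptojoules per bit, many orders of magnitude below the switching energies of present CMOS gates, which is what leaves headroom for paradigms such as Brownian circuits.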

  19. Comparative analysis of in vivo T cell depletion with radiotherapy, combination chemotherapy, and the monoclonal antibody Campath-1G, using limiting dilution methodology

    International Nuclear Information System (INIS)

    Theobald, M.; Hoffmann, T.; Bunjes, D.; Heit, W.

    1990-01-01

    We have investigated the efficacy of standard conditioning regimens for bone marrow transplantation in depleting functional T lymphocytes in vivo and have compared it with the efficacy of the monoclonal antibody Campath-1G. Using limiting dilution techniques the frequencies of proliferating T cell precursors (PTL), cytotoxic T cell precursors (CTL-p), helper T cell precursors (HTL-p), and mature helper T cells (HTL) were determined before and after treatment. Both total body irradiation and combination chemotherapy with busulfan/cyclophosphamide were highly efficient at depleting PTL, CTL-p, and HTL-p (0-4 days) but spared HTL to a variable extent (0-99.5%). In the majority of patients treated with Campath-1G a similar degree of PTL, CTL-p, and HTL-p depletion was achieved, and, in addition, HTL were effectively removed (greater than 95.5%). These results suggest that Campath-1G could be successfully employed in depleting radio- and chemotherapy-resistant host T lymphocytes prior to T-depleted bone marrow transplantation
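
    Precursor frequencies in limiting-dilution analysis are conventionally estimated from the Poisson zero term: if a culture well is negative exactly when it received no functional precursor, the fraction of negative wells is F0 = exp(-f·n), so f = -ln(F0)/n. A minimal single-dose sketch (a full analysis fits across several dilutions); the well counts below are hypothetical:

```python
import math

def precursor_frequency(cells_per_well, negative_wells, total_wells):
    """Estimate precursor frequency f from the Poisson zero term of
    limiting-dilution analysis: F0 = exp(-f * n)  =>  f = -ln(F0) / n,
    where n is the number of cells plated per well."""
    if not 0 < negative_wells <= total_wells:
        raise ValueError("need at least one negative well and none in excess")
    f0 = negative_wells / total_wells
    return -math.log(f0) / cells_per_well
```

    At the classic 37% negative-well point, f ≈ 1/n, which is why titrations are designed to bracket that dilution.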

  20. Chronic early life stress induced by limited bedding and nesting (LBN) material in rodents: critical considerations of methodology, outcomes and translational potential.

    Science.gov (United States)

    Walker, Claire-Dominique; Bath, Kevin G; Joels, Marian; Korosi, Aniko; Larauche, Muriel; Lucassen, Paul J; Morris, Margaret J; Raineki, Charlis; Roth, Tania L; Sullivan, Regina M; Taché, Yvette; Baram, Tallie Z

    2017-09-01

    The immediate and long-term effects of exposure to early life stress (ELS) have been documented in humans and animal models. Even relatively brief periods of stress during the first 10 days of life in rodents can impact later behavioral regulation and the vulnerability to develop adult pathologies, in particular an impairment of cognitive functions and neurogenesis, but also modified social, emotional, and conditioned fear responses. The development of preclinical models of ELS exposure allows the examination of mechanisms and testing of therapeutic approaches that are not possible in humans. Here, we describe limited bedding and nesting (LBN) procedures, with models that produce altered maternal behavior ranging from fragmentation of care to maltreatment of infants. The purpose of this paper is to discuss important issues related to the implementation of this chronic ELS procedure and to describe some of the most prominent endpoints and consequences, focusing on areas of convergence between laboratories. Effects on the hypothalamic-pituitary adrenal (HPA) axis, gut axis and metabolism are presented in addition to changes in cognitive and emotional functions. Interestingly, recent data have suggested a strong sex difference in some of the reported consequences of the LBN paradigm, with females being more resilient in general than males. As both the chronic and intermittent variants of the LBN procedure have profound consequences on the offspring with minimal external intervention from the investigator, this model is advantageous ecologically and has a large translational potential. In addition to the direct effect of ELS on neurodevelopmental outcomes, exposure to adverse early environments can also have intergenerational impacts on mental health and function in subsequent generation offspring. Thus, advancing our understanding of the effect of ELS on brain and behavioral development is of critical concern for the health and wellbeing of both the current

  1. [Comparison of annual risk for tuberculosis infection (1994-2001) in school children in Djibouti: methodological limitations and epidemiological value in a hyperendemic context].

    Science.gov (United States)

    Bernatas, J J; Mohamed Ali, I; Ali Ismaël, H; Barreh Matan, A

    2008-12-01

    The purpose of this report was to describe a tuberculin survey conducted in 2001 to assess the trend in the annual risk of tuberculosis infection in Djibouti and compare the resulting data with those obtained in a previous survey conducted in 1994. In 2001 cluster sampling allowed selection of 5599 school children between the ages of 6 and 10 years, including 31.2% (1747/5599) without a BCG vaccination scar. In this sample the annual risks of infection (ARI) estimated using cutoff points of 6 mm, 10 mm, and 14 mm corrected by a factor of 1/0.82, and a mode value (18 mm) determined according to the "mirror" method, were 4.67%, 3.64%, 3.19% and 2.66% respectively. The distribution of positive tuberculin skin reaction sizes was significantly different from the normal law. In 1994 a total of 5257 children were selected using the same method. The distribution of positive reactions was not significantly different from the Gaussian distribution, and 28.6% (1505/5257) did not have a BCG scar. The ARI estimated using cutoff points of 6 mm, 10 mm, and 14 mm corrected by a factor of 1/0.82, and a mode value (17 mm) determined according to the "mirror" method, were 2.68%, 2.52%, 2.75% and 3.32% respectively. Tuberculin skin reaction size among positive skin test reactors was correlated with the presence of a BCG scar, and its mean was significantly higher among children with a BCG scar. The proportion of positive skin test reactors was also higher in the BCG scar group regardless of the cutoff point selected. Comparison of prevalence rates and ARI values did not allow any clear conclusion to be drawn, mainly because of a drastic difference in the positive reaction distribution profiles between the two studies. The distribution of skin test reaction sizes in the 1994 study could be modelled by a Gaussian distribution, while that in the 2001 study could not. A partial explanation for the positive reaction distribution observed in the 2001 study might be the existence of cross-reactions with environmental
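
    The standard conversion from survey prevalence to an annual risk of infection assumes a constant yearly risk over each child's lifetime, so that the probability of escaping infection for a years is (1 - ARI)^a. A sketch of that formula; the prevalence and mean age below are illustrative values, not the Djibouti figures:

```python
def annual_risk_of_infection(prevalence, mean_age):
    """Classic tuberculin-survey estimator: ARI = 1 - (1 - P)**(1/a),
    where P is the infection prevalence at mean age a, assuming a
    constant annual risk over the children's lifetime."""
    if not 0.0 <= prevalence < 1.0 or mean_age <= 0.0:
        raise ValueError("need 0 <= P < 1 and a positive mean age")
    return 1.0 - (1.0 - prevalence) ** (1.0 / mean_age)
```

    The report's methodological point sits upstream of this arithmetic: the estimate is only as good as the prevalence fed into it, which depends on the tuberculin cutoff chosen and on cross-reactions.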

  2. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  3. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  4. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  5. Safety significance evaluation system

    International Nuclear Information System (INIS)

    Lew, B.S.; Yee, D.; Brewer, W.K.; Quattro, P.J.; Kirby, K.D.

    1991-01-01

    This paper reports that the Pacific Gas and Electric Company (PG and E), in cooperation with ABZ, Incorporated and Science Applications International Corporation (SAIC), investigated the use of artificial intelligence-based programming techniques to assist utility personnel in regulatory compliance problems. The result of this investigation is that artificial intelligence-based programming techniques can successfully be applied to this problem. To demonstrate this, a general methodology was developed and several prototype systems based on this methodology were developed. The prototypes address U.S. Nuclear Regulatory Commission (NRC) event reportability requirements, technical specification compliance based on plant equipment status, and quality assurance assistance. This collection of prototype modules is named the safety significance evaluation system

  6. Los límites de la Evidencia Científica o idoneidad metodológica en la investigación en Terapias Complementarias. The limits of scientific evidence, or methodological suitability in research on Complementary Therapies

    Directory of Open Access Journals (Sweden)

    Paloma Echevarría Pérez

    2008-12-01

    some therapies are based more on empiricism than on positivism. It outlines two mixed methodological proposals. Nursing took up scientific evidence as a way to consolidate itself as a discipline, but it is necessary to know its limits, because nursing care implies comprehensive and personal care. There is a need to investigate the CT and related topics without fear, and to go deeper into the qualitative and social point of view; this does not mean abandoning their "scientific" character.

  7. MIRD methodology

    International Nuclear Information System (INIS)

    Rojo, Ana M.; Gomez Parada, Ines

    2004-01-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in estimating the dose to organs and tissues due to the incorporation of radioactive materials. Since then, 'MIRD Dose Estimate Reports' (from 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for absorbed dose calculations in different tissues is explained.

  8. PSA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Magne, L

    1997-12-31

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs.

  9. PSA methodology

    International Nuclear Information System (INIS)

    Magne, L.

    1996-01-01

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs

  10. The Significance of HIV ‘Blips’ in Resource-Limited Settings: Is It the Same? Analysis of the Treat Asia HIV Observational Database (TAHOD) and the Australian HIV Observational Database (AHOD)

    Science.gov (United States)

    Kanapathipillai, Rupa; McManus, Hamish; Kamarulzaman, Adeeba; Lim, Poh Lian; Templeton, David J.; Law, Matthew; Woolley, Ian

    2014-01-01

    Introduction: The magnitude and frequency of HIV viral load (VL) blips in resource-limited settings have not previously been assessed. This study was undertaken in a cohort from a high-income country (Australia), the Australian HIV Observational Database (AHOD), and a cohort from Asian countries of varying national income per capita, the TREAT Asia HIV Observational Database (TAHOD). Methods: Blips were defined as a detectable VL (≥ 50 copies/mL) preceded and followed by an undetectable VL (< 50 copies/mL); virological failure (VF) was defined as two consecutive VL ≥ 50 copies/mL. Cox proportional hazard models of time to first VF after entry were developed. Results: 5040 patients (AHOD n = 2597 and TAHOD n = 2521) were included; 910 (18%) of patients ever experienced blips: 744 (21%) of high-income and 166 (11%) of middle/low-income participants. 711 (14%) experienced blips prior to virological failure: 559 (16%) of high-income and 152 (10%) of middle/low-income participants. VL testing occurred at a median frequency of 175 and 91 days in middle/low- and high-income sites, respectively. Longer time to VF occurred in middle/low-income sites compared with high-income sites (adjusted hazards ratio (AHR) 0.41). Blips were not significantly associated with time to virological failure (p = 0.360 for blips 50–≤1000, p = 0.309 for blips 50–≤400 and p = 0.300 for blips 50–≤200 copies/mL). 209 of 866 (24%) patients were switched to an alternate regimen in the setting of a blip. Conclusion: Although a lower proportion of blips occurred in low/middle-income settings, the association between blips and virological failure did not differ significantly between settings. Nonetheless, a substantial number of participants were switched to alternative regimens in the setting of blips. PMID:24516527
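    The blip and failure definitions above are simple to operationalize. A minimal sketch, using made-up viral load data and assuming only the 50 copies/mL threshold stated in the abstract:

```python
# Sketch of the blip and virological-failure definitions used above:
# a blip is a detectable viral load (>= 50 copies/mL) preceded and followed
# by an undetectable one (< 50 copies/mL); two consecutive detectable
# results count as virological failure. Data below are made up.
THRESHOLD = 50  # copies/mL

def classify(vl_series):
    """Return (blip indices, index of first virological failure or None)."""
    blips = []
    for i in range(1, len(vl_series) - 1):
        if (vl_series[i] >= THRESHOLD
                and vl_series[i - 1] < THRESHOLD
                and vl_series[i + 1] < THRESHOLD):
            blips.append(i)
    failure = None
    for i in range(len(vl_series) - 1):
        if vl_series[i] >= THRESHOLD and vl_series[i + 1] >= THRESHOLD:
            failure = i + 1
            break
    return blips, failure

blips, failure = classify([20, 400, 20, 20, 600, 900, 30])
# -> the isolated 400 is a blip; the consecutive 600/900 mark failure
```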

  11. Survey of Dynamic PSA Methodologies

    International Nuclear Information System (INIS)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung; Kim, Taewan

    2015-01-01

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method that can take advantage of the aforementioned technologies is dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are substantial, it has drawn less interest from the industrial and regulatory viewpoint. The authors expect this work can contribute to a better understanding of dynamic PSA in terms of algorithms, practice, and applicability. In this paper, an overview of dynamic PSA is given. Most of the methodologies share similar concepts. Among them, DDET seems to be the backbone for most methodologies, since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  12. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method that can take advantage of the aforementioned technologies is dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are substantial, it has drawn less interest from the industrial and regulatory viewpoint. The authors expect this work can contribute to a better understanding of dynamic PSA in terms of algorithms, practice, and applicability. In this paper, an overview of dynamic PSA is given. Most of the methodologies share similar concepts. Among them, DDET seems to be the backbone for most methodologies, since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  13. Radiotracer methodology

    International Nuclear Information System (INIS)

    Eng, R.R.

    1988-01-01

    In 1923, George Hevesy demonstrated the distribution of radioactive lead in the horsebean plant. This early demonstration of the potential use of radiotracers in biology was reinforced when J.G. Hamilton and colleagues used iodine-131 for diagnostic purposes in patients. Then in 1950 Cassen et al. designed the first scintillation counter for measuring radioiodine in the body, using calcium tungstate crystals coupled to a photomultiplier tube. This was followed by the development of the Anger camera, which permitted visualization of radiotracer distribution in biological systems. From these significant early discoveries to the present, many advances have been made. They include the discovery and production of many useful radioisotopes; the formulation of these radioisotopes into useful radiotracers; the advent of first-, second-, and third-generation instrumentation for monitoring in vitro and in vivo distributions of new radiotracers; and the application of this knowledge to allow us to better understand physiological processes and treat disease states. Radiotracer techniques are integral to numerous techniques described in this volume. Autoradiography, nuclear scintigraphy, positron emission tomography, and single-photon emission computed tomography (SPECT) all depend on an understanding of radiotracer techniques for proper use of these probe devices.

  14. Comparison of methodologies for automatic generation of limits and drainage networks for hydrographic basins [Comparação entre metodologias para geração automática de limites e redes de drenagem em bacia hidrográfica]

    Directory of Open Access Journals (Sweden)

    Samantha A. Alcaraz

    2009-08-01

    The objective of this work was to compare methodologies for the automatic generation of basin limits and drainage networks, using geographical information systems, for basins of low relief variation such as the Dourados catchment area. Various data/process combinations were assessed, especially the ArcHydro and AVSWAT interfaces used to process 50 m resolution DTMs formed from the interpolation of digitized contour lines using ArcInfo, ArcView and Spring GIS, and a 90 m resolution SRTM DTM acquired by radar interferometry. Their accuracy was estimated based upon the pre-processing of small basic sub-basin units of different relief variations, before applying the best combinations to the entire Dourados basin. The accuracy of the automatic stream network generation and watershed delineation depends essentially on the quality of the raw digital terrain model. The selection of the most suitable one then depends on the aims of the user and on the work scale.
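    Automatic drainage network generation from a DTM typically starts by assigning each cell a D8 flow direction. A minimal sketch of that step, with a hypothetical 3×3 elevation grid (illustrative only, not the ArcHydro or AVSWAT implementation):

```python
# Minimal D8 flow-direction sketch: each DTM cell drains to its
# steepest-descent neighbour. This is the first step that interfaces such
# as ArcHydro or AVSWAT perform when deriving drainage networks from a
# DTM (illustrative only, not their actual implementation).
import math

def d8_direction(dem, r, c):
    """Return the (dr, dc) offset of the steepest-descent neighbour of
    cell (r, c), or None if the cell is a pit (no downhill neighbour)."""
    best, best_drop = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(dem) and 0 <= cc < len(dem[0]):
                # drop per unit distance (diagonal neighbours are sqrt(2) away)
                drop = (dem[r][c] - dem[rr][cc]) / math.hypot(dr, dc)
                if drop > best_drop:
                    best, best_drop = (dr, dc), drop
    return best

dem = [[9, 8, 7],
       [8, 6, 5],
       [7, 5, 3]]
direction = d8_direction(dem, 1, 1)  # centre cell drains toward the low corner
```

Repeating this for every cell yields the flow-direction grid from which flow accumulation, stream networks, and watershed boundaries are then derived.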

  15. The 5-HT1A Receptor PET Radioligand 11C-CUMI-101 Has Significant Binding to α1-Adrenoceptors in Human Cerebellum, Limiting Its Use as a Reference Region.

    Science.gov (United States)

    Shrestha, Stal S; Liow, Jeih-San; Jenko, Kimberly; Ikawa, Masamichi; Zoghbi, Sami S; Innis, Robert B

    2016-12-01

    Prazosin, a potent and selective α1-adrenoceptor antagonist, displaces 25% of 11C-CUMI-101 ([O-methyl-11C]2-(4-(4-(2-methoxyphenyl)piperazin-1-yl)butyl)-4-methyl-1,2,4-triazine-3,5(2H,4H)dione) binding in monkey cerebellum. We sought to estimate the percentage contamination of 11C-CUMI-101 binding to α1-adrenoceptors in human cerebellum under in vivo conditions. In vitro receptor-binding techniques were used to measure α1-adrenoceptor density and the affinity of CUMI-101 for these receptors in human, monkey, and rat cerebellum. Binding potential (maximum number of binding sites × affinity [1/dissociation constant]) was determined using in vitro homogenate binding assays in human, monkey, and rat cerebellum. 3H-prazosin was used to determine the maximum number of binding sites, as well as the dissociation constant of 3H-prazosin and the inhibition constant of CUMI-101. α1-adrenoceptor density and the affinity of CUMI-101 for these receptors were similar across species. Cerebellar binding potentials were 3.7 for humans, 2.3 for monkeys, and 3.4 for rats. Reasoning by analogy, 25% of 11C-CUMI-101 uptake in human cerebellum reflects binding to α1-adrenoceptors, suggesting that the cerebellum is of limited usefulness as a reference tissue for quantification in human studies. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
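    The binding potential used in the abstract is simply receptor density times affinity, i.e. Bmax divided by the dissociation constant Kd. A trivial helper makes the definition concrete; the Bmax and Kd values below are illustrative, not the paper's measurements:

```python
# The abstract defines binding potential as Bmax * (1/Kd): receptor density
# times affinity. The Bmax/Kd values below are illustrative only.
def binding_potential(bmax_nM, kd_nM):
    """BP = Bmax / Kd (both in the same concentration units, e.g. nM)."""
    return bmax_nM / kd_nM

bp = binding_potential(37.0, 10.0)  # -> 3.7, the order reported for human cerebellum
```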

  16. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  17. Methodology for calculating the thickness free of sigma phase in duplex stainless steels large section parts during hiperquenching; Metodologia para el calculo de espesores limite libres de fase sigma durante el hipertemple en piezas de aceros duplex de gran seccion

    Energy Technology Data Exchange (ETDEWEB)

    Jimbert, P.; Guraya, T.; Torregary, A.; Bravo, P.

    2013-06-01

    To achieve the mechanical properties and corrosion resistance desired in the duplex stainless steels used by the petrochemical and nuclear industries, parts are subjected to a hyperquenching heat treatment from about 1050 °C. This avoids the risk of intermetallic precipitation, which drastically reduces the properties of these materials. However, with the increasing depth at which deposits are present, the thicknesses of such pipes have increased, raising the level of demand on the whole manufacturing process, including the heat treatment. To avoid the precipitation of intermetallic phases such as sigma phase, it is necessary to know the cooling profile at the centre of the workpiece, and for this purpose knowing the value of the surface heat transfer coefficient (h) is essential. This coefficient changes during hyperquenching, and its value is determined experimentally, as it depends on several process parameters. Studies reveal that its value stabilizes within a few seconds. We can then assume that, to know the cooling profile at the centre of large sections, it is only necessary to know the stabilized value of h. However, all the studies found in the literature refer to diameters smaller than 100 mm. This paper develops a methodology to predict the precipitation of intermetallic phases in duplex stainless steel parts of large thickness in industrial facilities from the calculation of h. This methodology allows the cooling profiles to be calculated without wasting any workpiece, using one or more reference pieces instrumented with thermocouples and a subsequent simulation with ANSYS. (Author)
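    To see how the stabilized h enters a cooling profile, a lumped-capacitance sketch is useful. This simplification is valid only for small Biot numbers; for the large sections the paper addresses, the centre profile must come from the instrumented reference pieces and FE simulation it describes. All parameter values below are illustrative.

```python
# Lumped-capacitance sketch of how the surface heat transfer coefficient h
# sets the cooling profile during quenching:
#   T(t) = T_inf + (T0 - T_inf) * exp(-t / tau),  tau = rho * cp / (h * A/V).
# Valid only for small Biot numbers; values below are illustrative.
import math

def temperature(t_s, T0=1050.0, T_inf=25.0, h=1500.0, A_over_V=20.0,
                rho=7800.0, cp=500.0):
    """Part temperature [deg C] after t_s seconds of quenching.

    h in W/(m2 K), A_over_V in 1/m, rho in kg/m3, cp in J/(kg K)."""
    tau = rho * cp / (h * A_over_V)  # thermal time constant [s]
    return T_inf + (T0 - T_inf) * math.exp(-t_s / tau)

# With these numbers tau = 130 s, so the part approaches the bath
# temperature after a few time constants.
```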

  18. Insights into PRA methodologies

    International Nuclear Information System (INIS)

    Gallagher, D.; Lofgren, E.; Atefi, B.; Liner, R.; Blond, R.; Amico, P.

    1984-08-01

    Probabilistic Risk Assessments (PRAs) for six nuclear power plants were examined to gain insight into how the choice of analytical methods can affect the results of PRAs. The PRA scope considered was limited to internally initiated accident sequences through core melt. For twenty methodological topic areas, a baseline or minimal methodology was specified. The choice of methods for each topic in the six PRAs was characterized in terms of the incremental level of effort above the baseline. A higher level of effort generally reflects a higher level of detail or a higher degree of sophistication in the analytical approach to a particular topic area. The impact on results was measured in terms of how additional effort beyond the baseline level changed the relative importance and ordering of dominant accident sequences compared to what would have been observed had methods corresponding to the baseline level of effort been employed. This measure of impact is a more useful indicator of how methods affect perceptions of plant vulnerabilities than changes in core melt frequency would be. However, the change in core melt frequency was used as a secondary measure of impact for nine topics where availability of information permitted. Results are presented primarily in the form of effort-impact matrices for each of the twenty topic areas. A suggested effort-impact profile for future PRAs is presented.

  19. Safety class methodology

    International Nuclear Information System (INIS)

    Donner, E.B.; Low, J.M.; Lux, C.R.

    1992-01-01

    DOE Order 6430.1A, General Design Criteria (GDC), requires that DOE facilities be evaluated with respect to ''safety class items.'' Although the GDC defines safety class items, it does not provide a methodology for selecting them. The methodology described in this paper was developed to assure that safety class items at the Savannah River Site (SRS) are selected in a consistent and technically defensible manner. Safety class items are those in the highest of four categories determined to be of special importance to nuclear safety, and merit appropriately higher-quality design, fabrication, and industrial test standards and codes. The identification of safety class items is approached using a cascading strategy that begins at the 'safety function' level (i.e., a cooling function, ventilation function, etc.) and proceeds down to the system, component, or structure level. Thus, the items that are required to support a safety function are safety class items (SCIs). The basic steps in this procedure apply to the determination of SCIs both for new project activities and for operating facilities. The GDC lists six characteristics of SCIs to be considered as a starting point for safety item classification. They are as follows: 1. Those items whose failure would produce exposure consequences that would exceed the guidelines in Section 1300-1.4, ''Guidance on Limiting Exposure of the Public,'' at the site boundary or nearest point of public access. 2. Those items required to maintain operating parameters within the safety limits specified in the Operational Safety Requirements during normal operations and anticipated operational occurrences. 3. Those items required for nuclear criticality safety. 4. Those items required to monitor the release of radioactive material to the environment during and after a Design Basis Accident. 5. Those items required to achieve, and maintain, the facility in a safe shutdown condition. 6. Those items that control Safety Class Items listed above.

  20. Constructivism: a naturalistic methodology for nursing inquiry.

    Science.gov (United States)

    Appleton, J V; King, L

    1997-12-01

    This article will explore the philosophical underpinnings of the constructivist research paradigm. Despite its increasing popularity in evaluative health research studies there is limited recognition of constructivism in popular research texts. Lincoln and Guba's original approach to constructivist methodology is outlined and a detailed framework for nursing research is offered. Fundamental issues and concerns surrounding this methodology are debated and differences between method and methodology are highlighted.

  1. PIXE methodology of rare earth element analysis and its applications

    International Nuclear Information System (INIS)

    Ma Xinpei

    1992-01-01

    The Proton Induced X-ray Emission (PIXE) methodology for rare earth element (REE) analysis is discussed, including the significance of REE analysis, the principles of PIXE applied to REEs, the selection of characteristic X-rays for lanthanide-series elements, the deconvolution of highly overlapped PIXE spectra, and the minimum detection limit (MDL) for REEs. Some practical applications are presented, and the special features of PIXE analysis of high-purity REE chemicals are discussed. (author)
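    The minimum detection limit mentioned above is conventionally estimated with a 3-sigma criterion on the background under the characteristic X-ray peak. This is the generic formulation, not a formula taken from the paper, and the numbers below are illustrative:

```python
# A common minimum-detection-limit (MDL) estimate used in PIXE practice:
#   MDL = 3 * sqrt(N_B) / S
# where N_B is the background count under the characteristic X-ray peak
# and S is the sensitivity (peak counts per unit concentration).
# Generic 3-sigma criterion; the numbers below are illustrative.
import math

def mdl_ppm(background_counts, counts_per_ppm):
    """3-sigma minimum detection limit in ppm."""
    return 3.0 * math.sqrt(background_counts) / counts_per_ppm

limit = mdl_ppm(background_counts=400, counts_per_ppm=120.0)  # -> 0.5 ppm
```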

  2. How to emerge from the conservatism in clinical research methodology?

    Science.gov (United States)

    Kotecki, Nuria; Penel, Nicolas; Awada, Ahmad

    2017-09-01

    Despite recent changes in clinical research methodology, many challenges remain in drug development methodology. Advances in molecular biology and cancer treatments have changed the clinical research landscape. Thus, we have moved from empirical clinical oncology to molecular and immunological therapeutic approaches. Along with this move, adapted dose-limiting toxicity definitions, endpoints, and dose escalation methods have been proposed. Moreover, the classical frontier between phase I, phase II, and phase III has become unclear, in particular for immunological approaches, so investigators are facing major challenges in drug development methodology. We propose to individualize clinical research using innovative approaches to significantly improve patient outcomes, targeting what are considered unmet needs. Integrating a high level of translational research and performing well-designed biomarker studies with great potential for clinical practice are of utmost importance. This could be performed within new models of clinical research networks and by building strong collaboration between academia, cooperative groups, on-site investigators, and pharmaceutical companies.

  3. Application and licensing requirements of the Framatome ANP RLBLOCA methodology

    International Nuclear Information System (INIS)

    Martin, R.P.; Dunn, B.M.

    2004-01-01

    The Framatome ANP Realistic Large-Break LOCA methodology (FANP RLBLOCA) is an analysis approach approved by the US NRC for supporting the licensing basis of 3- and 4-loop Westinghouse PWRs and CE 2x4 PWRs. It was developed consistent with the NRC's Code Scaling, Applicability, and Uncertainty (CSAU) methodology for performing best-estimate large-break LOCA analyses. The CSAU methodology consists of three key elements, with the second and third elements addressing uncertainty identification and application. Unique to the CSAU methodology is the use of engineering judgment and the Phenomena Identification and Ranking Table (PIRT), defined in the first element, to lay the groundwork for achieving the ultimate goal of quantifying the total uncertainty in predicted measures of interest associated with the large-break LOCA. It is the PIRT that not only directs the methodology development, but also directs the methodology review. While the FANP RLBLOCA methodology was generically approved, a plant-specific application is customized in two ways, addressing how the unique plant characterization 1) is translated to code input and 2) relates to the unique methodology licensing requirements. Related to the former, plants are required by 10 CFR 50.36 to define a technical specification limiting condition for operation based on the following criteria: 1. Installed instrumentation that is used in the control room to detect, and indicate, a significant abnormal degradation of the reactor coolant pressure boundary. 2. A process variable, design feature, or operating restriction that is an initial condition of a design basis accident or transient analysis that either assumes the failure of or presents a challenge to the integrity of a fission product barrier. 3. A structure, system, or component that is part of the primary success path and which functions or actuates to mitigate a design basis accident or transient that either assumes the failure of or presents a challenge to the integrity of a

  4. A development methodology for scientific software

    International Nuclear Information System (INIS)

    Cort, G.; Barrus, D.M.; Goldstone, J.A.; Miller, L.; Nelson, R.O.; Poore, R.V.

    1985-01-01

    We present the details of a software development methodology that addresses all phases of the software life cycle, yet is well suited for application by small projects with limited resources. The methodology has been developed at the Los Alamos Weapons Neutron Research (WNR) Facility and was utilized during the recent development of the WNR Data Acquisition Command Language. The methodology emphasizes the development and maintenance of comprehensive documentation for all software components. The impact of the methodology upon software quality and programmer productivity is assessed.

  5. The Significance of Normativity

    DEFF Research Database (Denmark)

    Presskorn-Thygesen, Thomas

    and Weber. It does so by situating Durkheim and Weber in the context of Neo-Kantian philosophy, which prevailed among their contemporaries, and the chapter thereby reveals a series of under-thematised similarities not only with regard to their methodological positions, but also in their conception of social...... of social theory. In pursuing this overall research agenda, the dissertation contributes to a number of specific research literatures. Following two introductory and methodological chapters, Chapter 3 thus critically examines the analysis of normativity suggested in the recent attempts at transforming...... the methods of neo-classical economics into a broader form of social theory. The chapter thereby contributes to the critical discourses, particularly in philosophy of science, that challenge the validity of neo-classical economics and its underlying conception of practical rationality. In examining...

  6. Methodological themes and variations

    International Nuclear Information System (INIS)

    Tetlock, P.E.

    1989-01-01

    This paper reports on the tangible progress that has been made in clarifying the underlying processes that affect both the likelihood of war in general and of nuclear war in particular. It also illustrates how difficult it is to make progress in this area. Nonetheless, what has been achieved should not be minimized. We have learned a good deal on both the theoretical and the methodological fronts and, perhaps, most important, we have learned a good deal about the limits of our knowledge. Knowledge of our ignorance---especially in a policy domain where confident, even glib, causal assertions are so common---can be a major contribution in itself. The most important service the behavioral and social sciences can currently provide to the policy making community may well be to make thoughtful skepticism respectable: to sensitize those who make key decisions to the uncertainty surrounding our understanding of international conflict and to the numerous qualifications that now need to be attached to simple causal theories concerning the origins of war

  7. Current limiters

    Energy Technology Data Exchange (ETDEWEB)

    Loescher, D.H. [Sandia National Labs., Albuquerque, NM (United States). Systems Surety Assessment Dept.; Noren, K. [Univ. of Idaho, Moscow, ID (United States). Dept. of Electrical Engineering

    1996-09-01

    The current that flows between the electrical test equipment and the nuclear explosive must be limited to safe levels during electrical tests conducted on nuclear explosives at the DOE Pantex facility. The safest way to limit the current is to use batteries that can provide only acceptably low current into a short circuit; unfortunately this is not always possible. When it is not possible, current limiters, along with other design features, are used to limit the current. Three types of current limiters, the fuse blower, the resistor limiter, and the MOSFET-pass-transistor limiter, are used extensively in Pantex test equipment. Detailed failure mode and effects analyses were conducted on these limiters. Two other types of limiters were also analyzed. It was found that there is no single best type of limiter for all applications. The fuse blower has advantages when many circuits must be monitored, a low insertion voltage drop is important, and size and weight must be kept low. However, this limiter has many failure modes that can lead to the loss of overcurrent protection. The resistor limiter is simple and inexpensive, but is normally usable only on circuits for which the nominal current is less than a few tens of milliamperes. The MOSFET limiter can be used on high-current circuits, but it has a number of single-point failure modes that can lead to a loss of protective action. Because bad component placement or poor wire routing can defeat any limiter, placement and routing must be designed carefully and documented thoroughly.

  8. Methodological remarks on contraction theory

    DEFF Research Database (Denmark)

    Jouffroy, Jerome; Slotine, Jean-Jacques E.

    Because contraction analysis stems from a differential and incremental framework, the nature and methodology of contraction-based proofs are significantly different from those of their Lyapunov-based counterparts. This paper specifically studies this issue, and illustrates it by revisiting some classical examples traditionally addressed using Lyapunov theory. Even in these cases, contraction tools can often yield significantly simplified analysis. The examples include adaptive control, robotics, and a proof of convergence of the deterministic Extended Kalman Filter.
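    For context, the basic contraction condition from the contraction-analysis literature (a standard statement, not a result specific to this record) reads:

```latex
% Standard contraction condition: for \dot{x} = f(x,t) and a uniformly
% positive definite metric M(x,t), the system is contracting if
\[
  \frac{\partial f}{\partial x}^{\top} M
  + M \, \frac{\partial f}{\partial x}
  + \dot{M} \preceq -2\beta M ,
  \qquad \beta > 0 ,
\]
% in which case all trajectories converge to one another exponentially
% at rate \beta.
```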

  9. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    We have developed an environment, based upon robust, existing, open-source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies made possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high-level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalance; the second is a low-level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud-resolving code.

  10. Tumor significant dose

    International Nuclear Information System (INIS)

    Supe, S.J.; Nagalaxmi, K.V.; Meenakshi, L.

    1983-01-01

    In the practice of radiotherapy, various concepts like NSD, CRE, TDF, and BIR are used to evaluate the biological effectiveness of treatment schedules on the normal tissues. This is accepted because the tolerance of the normal tissue is the limiting factor in the treatment of cancers. At present, when various schedules are tried, attention is therefore paid only to the biological damage of the normal tissues, and it is expected that the damage to the cancerous tissue will be extensive enough to control the cancer. An attempt is made in the present work to evaluate the concept of tumor significant dose (TSD), which represents the damage to the cancerous tissue. Strandquist, in the analysis of a large number of cases of squamous cell carcinoma, found that for the 5 fractions/week treatment, the total dose required to bring about the same damage to the cancerous tissue is proportional to T^(-0.22), where T is the overall time over which the dose is delivered. Using this finding, the TSD was defined as D×N^(-p)×T^(-q), where D is the total dose, N the number of fractions, T the overall time, and p and q are exponents to be suitably chosen. The values of p and q are adjusted such that p+q ≤ 0.24, with p varying from 0.0 to 0.24 and q from 0.0 to 0.22. Cases of cancer of the cervix uteri treated between 1978 and 1980 in the V. N. Cancer Centre, Kuppuswamy Naidu Memorial Hospital, Coimbatore, India were analyzed on the basis of these formulations. These data, coupled with clinical experience, were used for the choice of a formula for the TSD. Further, the dose schedules used in the British Institute of Radiology fractionation studies were also used to propose that the tumor significant dose is represented by D×N^(-0.18)×T^(-0.06).
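As an illustration (the schedules below are invented for this sketch, not taken from the paper's data), the proposed formula TSD = D×N^(-0.18)×T^(-0.06) can be evaluated for two hypothetical fractionation schedules:

```python
def tumor_significant_dose(total_dose, fractions, overall_time,
                           p=0.18, q=0.06):
    """Tumor significant dose TSD = D * N^(-p) * T^(-q).

    total_dose   -- total dose D
    fractions    -- number of fractions N
    overall_time -- overall treatment time T in days
    p, q         -- exponents; the abstract proposes p=0.18, q=0.06
    """
    return total_dose * fractions ** (-p) * overall_time ** (-q)

# Two hypothetical schedules delivering the same total dose:
# 30 fractions over 42 days versus 20 fractions over 28 days.
tsd_a = tumor_significant_dose(60.0, 30, 42)
tsd_b = tumor_significant_dose(60.0, 20, 28)
```

Because both exponents are negative, the shorter, less fractionated schedule yields the higher TSD for the same total dose.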

  11. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings of the joint SKI/SKB work on scenario development presented in SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each methodology is explained in this report, along with examples of its application. One chapter is devoted to a comparison between the two most promising methodologies, namely Influence diagrams and the RES methodology. In conclusion, a combination of parts of the Influence diagram and RES methodologies is likely to be a promising approach. 26 refs
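As a toy illustration of the RES matrix idea (all scores below are invented for this sketch), an interaction matrix reserves the leading diagonal for the system variables and scores, off the diagonal, the influence of variable i on variable j; row and column sums then rank how strongly each variable drives, or is driven by, the rest of the system:

```python
# Toy RES interaction matrix for three variables (scores 0-4,
# invented for illustration). Entry M[i][j] scores the influence
# of variable i on variable j; the diagonal is reserved for the
# variables themselves.
M = [[0, 3, 1],
     [2, 0, 4],
     [1, 2, 0]]

cause  = [sum(row) for row in M]        # how strongly each variable drives others
effect = [sum(col) for col in zip(*M)]  # how strongly each variable is driven
```

Plotting cause against effect for each variable is the usual way such a matrix is read: dominant variables sit high on the cause axis, subordinate ones high on the effect axis.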

  12. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  13. Surreptitious, Evolving and Participative Ontology Development: An End-User Oriented Ontology Development Methodology

    Science.gov (United States)

    Bachore, Zelalem

    2012-01-01

    Ontology is not only considered to be the backbone of the semantic web but also plays a significant role in distributed and heterogeneous information systems. However, ontology still faces limited application and adoption to date. One of the major problems is that prevailing engineering-oriented methodologies for building ontologies do not…

  14. Quench limits

    International Nuclear Information System (INIS)

    Sapinski, M.

    2012-01-01

    With thirteen beam-induced quenches and numerous Machine Development tests, the current knowledge of LHC magnet quench limits still contains many unknowns. Various approaches to determining the quench limits are reviewed and the results of the tests are presented. An attempt is made to reconstruct a coherent picture emerging from these results. The available methods for computing quench levels are presented, together with the dedicated particle shower simulations necessary to understand the tests. Future experiments, needed to reach a better understanding of quench limits as well as of limits for machine operation, are investigated. Possible strategies for setting BLM (Beam Loss Monitor) thresholds are discussed. (author)

  15. Dose limits

    International Nuclear Information System (INIS)

    Fitoussi, L.

    1987-12-01

    The dose limit is defined as the level of harmfulness which must not be exceeded, so that an activity can be exercised in a regular manner without running a risk unacceptable to man and society. The paper examines the effects of radiation, categorised as stochastic and non-stochastic. Dose limits for workers and the public are discussed.

  16. Introduction to LCA Methodology

    DEFF Research Database (Denmark)

    Hauschild, Michael Z.

    2018-01-01

    In order to offer the reader an overview of the LCA methodology in the preparation of the more detailed description of its different phases, a brief introduction is given to the methodological framework according to the ISO 14040 standard and the main elements of each of its phases. Emphasis...

  17. Methodologies, languages and tools

    International Nuclear Information System (INIS)

    Amako, Katsuya

    1994-01-01

    This is a summary of the "Methodologies, Languages and Tools" session at the CHEP'94 conference. All the contributions on methodologies and languages are relevant to the object-oriented approach. Other topics presented relate to various software tools in the down-sized computing environment.

  18. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Menopause and Methodological Doubt

    Science.gov (United States)

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  20. VEM: Virtual Enterprise Methodology

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a virtual enterprise methodology (VEM) that outlines activities to consider when setting up and managing virtual enterprises (VEs). As a methodology the VEM helps companies to ask the right questions when preparing for and setting up an enterprise network, which works...

  1. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  2. The Methodology of Magpies

    Science.gov (United States)

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  3. Application of fault current limiters

    Energy Technology Data Exchange (ETDEWEB)

    Neumann, A.

    2007-11-30

    This report presents the results of a study commissioned by the Department for Business, Enterprise and Regulatory Reform (BERR; formerly the Department of Trade and Industry) into the application of fault current limiters in the UK. The study reviewed the current state of fault current limiter (FCL) technology and the regulatory position in relation to all types of current limiters. It identified significant research and development work with respect to medium voltage FCLs and a move to high voltage. Appropriate FCL technologies being developed include: solid state breakers; superconducting FCLs (including superconducting transformers); magnetic FCLs; and active network controllers. Commercialisation of these products depends on successful field tests and experience, plus material development in the case of high temperature superconducting FCL technologies. The report describes FCL techniques, the current state of FCL technologies, practical applications and the future outlook for FCL technologies, distribution fault level analysis and an outline methodology for assessing the materiality of the fault level problem. A roadmap is presented that provides an 'action agenda' to advance the fault level issues associated with low carbon networks.

  4. Implementation impacts of PRL methodology

    International Nuclear Information System (INIS)

    Caudill, J.A.; Krupa, J.F.; Meadors, R.E.; Odum, J.V.; Rodrigues, G.C.

    1993-02-01

    This report responds to a DOE-SR request to evaluate the impacts from implementation of the proposed Plutonium Recovery Limit (PRL) methodology. The PRL Methodology is based on cost minimization for decisions to discard or recover plutonium contained in scrap, residues, and other plutonium bearing materials. Implementation of the PRL methodology may result in decisions to declare as waste certain plutonium bearing materials originally considered to be a recoverable plutonium product. Such decisions may have regulatory impacts, because any material declared to be waste would immediately be subject to provisions of the Resource Conservation and Recovery Act (RCRA). The decision to discard these materials will have impacts on waste storage, treatment, and disposal facilities. Current plans for the de-inventory of plutonium processing facilities have identified certain materials as candidates for discard based upon the economic considerations associated with extending the operating schedules for recovery of the contained plutonium versus potential waste disposal costs. This report evaluates the impacts of discarding those materials as proposed by the F Area De-Inventory Plan and compares the De-Inventory Plan assessments with conclusions from application of the PRL. The impact analysis was performed for those materials proposed as potential candidates for discard by the De-Inventory Plan. The De-Inventory Plan identified 433 items, containing approximately 1% of the current SRS Pu-239 inventory, as not appropriate for recovery as the site moves to complete the mission of F-Canyon and FB-Line. The materials were entered into storage awaiting recovery as product under the Department's previous Economic Discard Limit (EDL) methodology, which valued plutonium at its incremental cost of production in reactors. An application of Departmental PRLs to the subject 433 items revealed that approximately 40% of them would continue to be potentially recoverable as product plutonium.

  5. Inverse Limits

    CERN Document Server

    Ingram, WT

    2012-01-01

    Inverse limits provide a powerful tool for constructing complicated spaces from simple ones. They also turn the study of a dynamical system consisting of a space and a self-map into a study of a (likely more complicated) space and a self-homeomorphism. In four chapters along with an appendix containing background material the authors develop the theory of inverse limits. The book begins with an introduction through inverse limits on [0,1] before moving to a general treatment of the subject. Special topics in continuum theory complete the book. Although it is not a book on dynamics, the influen
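For reference, the standard definition the book builds on can be stated as follows (standard notation, not quoted from the book): given spaces $X_i$ and continuous bonding maps $f_i : X_{i+1} \to X_i$, the inverse limit is

```latex
\varprojlim \{X_i, f_i\}
  = \Bigl\{ (x_1, x_2, x_3, \dots) \in \prod_{i=1}^{\infty} X_i
      \;:\; x_i = f_i(x_{i+1}) \text{ for all } i \ge 1 \Bigr\}
```

equipped with the subspace topology inherited from the product. For inverse limits on $[0,1]$, every $X_i = [0,1]$ and the complexity of the resulting continuum comes entirely from the bonding maps.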

  6. Methodology for projecting the limits of nuclear power growth

    International Nuclear Information System (INIS)

    Francis, J.M.; Omberg, R.P.

    1981-06-01

    A scenario using only the most conservative, and yet reasonable, assumptions on GNP growth is constructed, and from this, electrical growth is inferred. Implicit in this technique is the assumption that most new energy demand will arise from the industrial sector. Thus, in the commercial and residential sectors, increasing demand by consumers is offset by new conservation techniques for little net change in energy demand. Consequently, this approach emphasizes the need for conservation as well as the need for new generating capability. The emphasis on coal and nuclear power is described

  7. The ExternE project: methodology, objectives and limitations

    International Nuclear Information System (INIS)

    Rabl, A.; Spadaro, J.V.

    2002-01-01

    This paper presents a summary of recent studies on external costs of energy systems, in particular the ExternE (External Costs of Energy) Project of the European Commission. To evaluate the impact and damage cost of a pollutant, one needs to carry out an impact pathway analysis; this involves the calculation of increased pollutant concentrations in all affected regions due to an incremental emission (e.g. μg/m³ of particles, using models of atmospheric dispersion and chemistry), followed by the calculation of physical impacts (e.g. number of cases of asthma due to these particles, using a dose-response function). The entire so-called fuel chain (or fuel cycle) is evaluated and compared on the basis of delivered end use energy. Even though the uncertainties are large, the results provide substantial evidence that the classical air pollutants (particles, NOx and SOx) from the combustion of fossil fuels impose a heavy toll, in addition to the cost of global warming. The external costs are especially large for coal; even for 'good current technology' they may be comparable to the price of electricity. For natural gas the external costs are about a third to a half of coal. The external costs of nuclear are small compared to the price of electricity (at most a few %), and so are the external costs of most renewable energy systems. (authors)
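For a single health endpoint, the impact pathway computation described above reduces to summing incremental exposure times a dose-response slope over receptor regions and monetizing the resulting cases. A minimal sketch (all numbers and field names are illustrative, not ExternE values):

```python
# Minimal sketch of the impact-pathway damage sum for one health
# endpoint. Field names and all numbers are illustrative only.
def impact_pathway_damage(receptors, unit_cost):
    """Monetized damage summed over receptor regions.

    receptors -- list of dicts with
      delta_c : incremental pollutant concentration (ug/m3)
      slope   : dose-response slope (cases per person per ug/m3)
      pop     : exposed population
    unit_cost -- monetary cost per case
    """
    cases = sum(r["delta_c"] * r["slope"] * r["pop"] for r in receptors)
    return cases * unit_cost

# One receptor region: +1 ug/m3 over a population of one million,
# with an assumed slope of 1e-5 cases/person/(ug/m3) and 1000 per case.
damage = impact_pathway_damage(
    [{"delta_c": 1.0, "slope": 1e-5, "pop": 1_000_000}],
    unit_cost=1000.0,
)
```

In a full assessment the receptor list would span all affected regions produced by the dispersion model, and the sum would be repeated per pollutant and per endpoint.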

  8. Soft systems methodology: other voices

    OpenAIRE

    Holwell, Sue

    2000-01-01

    This issue of Systemic Practice and Action Research, celebrating the work of Peter Checkland, in the particular nature and development of soft systems methodology (SSM), would not have happened unless the work was seen by others as being important. No significant contribution to thinking happens without a secondary literature developing. Not surprisingly, many commentaries have accompanied the ongoing development of SSM. Some of these are insightful, some full of errors, and some include both...

  9. Artificial Intelligence Techniques and Methodology

    OpenAIRE

    Carbonell, Jaime G.; Sleeman, Derek

    1982-01-01

    Two closely related aspects of artificial intelligence that have received comparatively little attention in the recent literature are research methodology, and the analysis of computational techniques that span multiple application areas. We believe both issues to be increasingly significant as Artificial Intelligence matures into a science and spins off major application efforts. It is imperative to analyze the repertoire of AI methods with respect to past experience, utility in new domains,...

  11. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    D.G. Horton

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  12. Detecting Novelty and Significance

    Science.gov (United States)

    Ferrari, Vera; Bradley, Margaret M.; Codispoti, Maurizio; Lang, Peter J.

    2013-01-01

    Studies of cognition often use an “oddball” paradigm to study effects of stimulus novelty and significance on information processing. However, an oddball tends to be perceptually more novel than the standard, repeated stimulus as well as more relevant to the ongoing task, making it difficult to disentangle effects due to perceptual novelty and stimulus significance. In the current study, effects of perceptual novelty and significance on ERPs were assessed in a passive viewing context by presenting repeated and novel pictures (natural scenes) that either signaled significant information regarding the current context or not. A fronto-central N2 component was primarily affected by perceptual novelty, whereas a centro-parietal P3 component was modulated by both stimulus significance and novelty. The data support an interpretation that the N2 reflects perceptual fluency and is attenuated when a current stimulus matches an active memory representation and that the amplitude of the P3 reflects stimulus meaning and significance. PMID:19400680

  13. Significant NRC Enforcement Actions

    Data.gov (United States)

    Nuclear Regulatory Commission — This dataset provides a list of Nuclear Regulatory Commission (NRC) issued significant enforcement actions. These actions, referred to as "escalated", are issued by...

  14. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly forced by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers. ABSTRACT: Design Methodology shall be seen as our understanding of how to design; it is an early (emerging in the late 60ies) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today…

  15. GPS system simulation methodology

    Science.gov (United States)

    Ewing, Thomas F.

    1993-01-01

    The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.

  16. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  17. Nonlinear Image Denoising Methodologies

    National Research Council Canada - National Science Library

    Yufang, Bao

    2002-01-01

    In this thesis, we propose a theoretical as well as practical framework to combine geometric prior information to a statistical/probabilistic methodology in the investigation of a denoising problem...

  18. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    "Now viewed as its own scientific discipline, clinical trial methodology encompasses the methods required for the protection of participants in a clinical trial and the methods necessary to provide...

  19. A methodology for acquiring qualitative knowledge for probabilistic graphical models

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders L.

    2004-01-01

    We present a practical and general methodology that simplifies the task of acquiring and formulating qualitative knowledge for constructing probabilistic graphical models (PGMs). The methodology efficiently captures and communicates expert knowledge, and has significantly eased the model...

  20. Methodology for technical risk assessment

    International Nuclear Information System (INIS)

    Waganer, L.M.; Zuckerman, D.S.

    1983-01-01

    A methodology has been developed for and applied to the assessment of the technical risks associated with an evolving technology. This methodology, originally developed for fusion by K. W. Billman and F. R. Scott at EPRI, has been applied to assess the technical risk of a fuel system for a fusion reactor. Technical risk is defined as the risk that a particular technology or component which is currently under development will not achieve a set of required technical specifications (i.e. probability of failure). The individual steps in the technical risk assessment are summarized. The first step in this methodology is to clearly and completely quantify the technical requirements for the particular system being examined. The next step is to identify and define subsystems and various options which appear capable of achieving the required technical performance. The subsystem options are then characterized regarding subsystem functions, interface requirements with the subsystems and systems, important components, developmental obstacles and technical limitations. Key technical subsystem performance parameters are identified which directly or indirectly relate to the system technical specifications. Past, existing and future technical performance data from subsystem experts are obtained by using a Bayesian Interrogation technique. The input data is solicited in the form of probability functions. Thus the output performance of the system is expressed as probability functions
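If the expert-elicited probability functions are reduced to a single probability that each subsystem meets its specification, and the subsystems are assumed independent, a system-level technical risk follows directly. A minimal sketch (the independence assumption and the use of point probabilities rather than full probability functions are simplifications for illustration):

```python
import math

def technical_risk(p_success):
    """Probability the system fails its technical specification, given
    independent probabilities that each subsystem meets its own spec.

    Simplifying assumptions: subsystem successes are independent, and
    each probability function is collapsed to a point estimate.
    """
    return 1.0 - math.prod(p_success)

# Three subsystems judged 90%, 80% and 95% likely to meet spec:
risk = technical_risk([0.90, 0.80, 0.95])
```

The full methodology described in the abstract instead propagates entire probability functions, so the output is a distribution over system performance rather than a single failure probability.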

  1. Methodology for astronaut reconditioning research.

    Science.gov (United States)

    Beard, David J; Cook, Jonathan A

    2017-01-01

    Space medicine offers some unique challenges, especially in terms of research methodology. A specific challenge for astronaut reconditioning involves identification of what aspects of terrestrial research methodology hold and which require modification. This paper reviews this area and presents appropriate solutions where possible. It is concluded that spaceflight rehabilitation research should remain question/problem driven and is broadly similar to the terrestrial equivalent on small populations, such as rare diseases and various sports. Astronauts and Medical Operations personnel should be involved at all levels to ensure feasibility of research protocols. There is room for creative and hybrid methodology but careful systematic observation is likely to be more achievable and fruitful than complex trial based comparisons. Multi-space agency collaboration will be critical to pool data from small groups of astronauts with the accepted use of standardised outcome measures across all agencies. Systematic reviews will be an essential component. Most limitations relate to the inherent small sample size available for human spaceflight research. Early adoption of a co-operative model for spaceflight rehabilitation research is therefore advised. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Methodology of sustainability accounting

    Directory of Open Access Journals (Sweden)

    O.H. Sokil

    2017-03-01

    Full Text Available Modern challenges of the theory and methodology of accounting are realized through the formation and implementation of new concepts, the purpose of which is to meet the needs of users in standard and unique information. The development of a methodology for sustainability accounting is a key aspect of the management of an economic entity. The purpose of the article is to form the methodological bases of accounting for sustainable development and determine its goals, objectives, object, subject, methods, functions and key aspects. The author analyzes the theoretical bases of the definition and considers the components of the traditional accounting methodology. A generalized structural diagram of the methodology for accounting for sustainable development is offered in the article. The complex of methods and principles of sustainable development accounting, for both standard and non-standard provisions, has been systematized. The new system of theoretical and methodological provisions of accounting for sustainable development is justified in the context of determining its purpose, objective, subject, object, methods, functions and key aspects.

  3. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330MWt SMART (System-integrated Modular Advanced ReacTor) which has been developed by Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, further validated final safety analysis methodology will be developed. Current licensing safety analysis methodology of the Westinghouse and KSNPP PWRs operating and under development in Korea as well as the Russian licensing safety analysis methodology for the integral reactors have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating or KNGR (Korean Next Generation Reactor) under construction in Korea. Detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary to secondary pipe break and the small break loss of coolant accident. SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. Validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended for the nuclear regulatory authority to establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  4. IS THERE A NEED FOR THE POST-NON-CLASSICAL METHODOLOGY IN PEDAGOGY?

    Directory of Open Access Journals (Sweden)

    Vladislav L. Benin

    2014-01-01

Full Text Available The publication continues the discussion, started by Yu.V. Larina in “Education in Search of the Congruity Principle”, concerning the modern methodology of pedagogical science; it identifies the criteria of the given principle along with the limitations of the post-non-classical approaches to the humanities. Methods: The methodology involves the analysis of existing viewpoints, formalization of the characteristics of post-non-classical science, and reflection on the pedagogical principle of cultural conformity. Results: The research outcomes demonstrate that the gradual undermining of fundamental science results in erosion of the methodological background. In the case of interdisciplinary subjects, a researcher is forced to integrate different methods and techniques, which provokes further disruption of the methodology. Scientific novelty: The author classifies the main characteristics of post-non-classical science and extrapolates them to the humanities, and concludes that researchers’ training quality is gradually declining due to the lack of methodological clarity and aggressive forms of science vulgarization, leading to the spontaneous development of a clipping methodology. Practical significance: Implementation of the research findings can activate both theoretical and methodological aspects of teachers’ training and self-education.

  5. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
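The combination step described above (building protection plus population distribution) can be sketched as a population-weighted exposure calculation. This is my own minimal illustration, not code or data from the LLNL report; the protection factors and population fractions are invented placeholders.

```python
# Sketch of the core Regional Shelter Analysis combination step:
# population-weighted exposure given building protection factors (PF).
# All numeric values are illustrative placeholders, not data from the report.

def mean_exposure(outdoor_dose, groups):
    """groups: list of (population_fraction, protection_factor) pairs."""
    return outdoor_dose * sum(frac / pf for frac, pf in groups)

# Hypothetical daytime posture: 70% in offices (PF 10),
# 20% in light residential buildings (PF 2), 10% outdoors (PF 1).
day = [(0.70, 10.0), (0.20, 2.0), (0.10, 1.0)]
print(mean_exposure(100.0, day))  # mean dose per 100 units of outdoor dose
```

Different population postures (e.g., night vs. day) would simply swap in a different `groups` list, which is how the method supports the scenario comparisons mentioned above.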

  6. The policy trail methodology

    DEFF Research Database (Denmark)

    Holford, John; Larson, Anne; Melo, Susana

In recent years, the “policy trail” has been proposed as a methodology appropriate to the shifting and fluid governance of lifelong learning in the late modern world (Holford et al. 2013, Holford et al. 2013, Cort 2014). The contemporary environment is marked by multi-level governance (global … of ‘policy trail’, arguing that it can overcome ‘methodological nationalism’ and link structure and agency in research on the ‘European educational space’. The ‘trail’ metaphor, she suggests, captures the intentionality and the erratic character of policy. The trail connects sites and brings about change …, but – although policy may be intended to be linear, with specific outcomes – policy often has to bend, and sometimes meets insurmountable obstacles. This symposium outlines and develops the methodology, but also reports on research undertaken within a major FP7 project (LLLight’in’Europe, 2012-15) which made use …

  7. Changing methodologies in TESOL

    CERN Document Server

    Spiro, Jane

    2013-01-01

Covering core topics from vocabulary and grammar to teaching writing, speaking and listening, this textbook shows you how to link research to practice in TESOL methodology. It emphasises how current understandings have impacted on the language classroom worldwide and investigates the meaning of 'methods' and 'methodology' and the importance of these for the teacher, as well as the underlying assumptions and beliefs teachers bring to bear in their practice. By introducing you to language teaching approaches, you will explore the way these are influenced by developments in our understanding of l

  8. Creativity in phenomenological methodology

    DEFF Research Database (Denmark)

    Dreyer, Pia; Martinsen, Bente; Norlyk, Annelise

    2014-01-01

Nursing research is often concerned with lived experiences in human life using phenomenological and hermeneutic approaches. These empirical studies may use different creative expressions and art-forms to describe and enhance an embodied and personalised understanding of lived experiences. Drawing on the methodologies of van Manen, Dahlberg, Lindseth & Norberg, the aim of this paper is to argue that the increased focus on creativity and arts in research methodology is valuable to gain a deeper insight into lived experiences. We illustrate this point through examples from empirical nursing studies, and discuss … may support a respectful renewal of phenomenological research traditions in nursing research.

  9. Significant Tsunami Events

    Science.gov (United States)

    Dunbar, P. K.; Furtney, M.; McLean, S. J.; Sweeney, A. D.

    2014-12-01

Tsunamis have inflicted death and destruction on the coastlines of the world throughout history. The occurrence of tsunamis and their resulting effects have been collected and studied as far back as the second millennium B.C. The knowledge gained from cataloging and examining these events has led to significant changes in our understanding of tsunamis, tsunami sources, and methods to mitigate their effects. The most significant, not surprisingly, are often the most devastating, such as the 2011 Tohoku, Japan earthquake and tsunami. The goal of this poster is to give a brief overview of the occurrence of tsunamis and then focus specifically on several significant tsunamis. There are various criteria for determining the most significant tsunamis: the number of deaths, the amount of damage, the maximum runup height, whether the event had a major impact on tsunami science or policy, etc. As a result, descriptions will include some of the most costly (2011 Tohoku, Japan), the most deadly (2004 Sumatra, 1883 Krakatau), and the highest runup ever observed (1958 Lituya Bay, Alaska). The discovery of the Cascadia subduction zone as the source of the 1700 Japanese "Orphan" tsunami, and a future tsunami threat to the U.S. northwest coast, contributed to the decision to form the U.S. National Tsunami Hazard Mitigation Program. The great Lisbon earthquake of 1755 marked the beginning of the modern era of seismology. Knowledge gained from the 1964 Alaska earthquake and tsunami helped confirm the theory of plate tectonics. The 1946 Alaska, 1952 Kuril Islands, 1960 Chile, 1964 Alaska, and 2004 Banda Aceh tsunamis all resulted in warning centers or systems being established. The data descriptions on this poster were extracted from NOAA's National Geophysical Data Center (NGDC) global historical tsunami database. Additional information about these tsunamis, as well as water level data, can be found by accessing the NGDC website www.ngdc.noaa.gov/hazard/

  10. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, run over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer-assisted software engineering (CASE) tools, drawing on actual installation of and experimentation with some specific tools.

  11. How Six Sigma Methodology Improved Doctors' Performance

    Science.gov (United States)

    Zafiropoulos, George

    2015-01-01

    Six Sigma methodology was used in a District General Hospital to assess the effect of the introduction of an educational programme to limit unnecessary admissions. The performance of the doctors involved in the programme was assessed. Ishikawa Fishbone and 5 S's were initially used and Pareto analysis of their findings was performed. The results…

  12. Computer Network Operations Methodology

    Science.gov (United States)

    2004-03-01

… means of their computer information systems. Disrupt – this type of attack focuses on disrupting, as "attackers might surreptitiously reprogram enemy …" … by reprogramming the computers that control distribution within the power grid. A disruption attack introduces disorder and inhibits the effective … between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that …

  13. SCI Hazard Report Methodology

    Science.gov (United States)

    Mitchell, Michael S.

    2010-01-01

    This slide presentation reviews the methodology in creating a Source Control Item (SCI) Hazard Report (HR). The SCI HR provides a system safety risk assessment for the following Ares I Upper Stage Production Contract (USPC) components (1) Pyro Separation Systems (2) Main Propulsion System (3) Reaction and Roll Control Systems (4) Thrust Vector Control System and (5) Ullage Settling Motor System components.

  14. A Functional HAZOP Methodology

    DEFF Research Database (Denmark)

    Liin, Netta; Lind, Morten; Jensen, Niels

    2010-01-01

    A HAZOP methodology is presented where a functional plant model assists in a goal oriented decomposition of the plant purpose into the means of achieving the purpose. This approach leads to nodes with simple functions from which the selection of process and deviation variables follow directly...

  15. Complicating Methodological Transparency

    Science.gov (United States)

    Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.

    2016-01-01

    A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…

  16. Methodological Advances in Dea

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    textabstractWe survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools

  17. NUSAM Methodology for Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Snell, Mark K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-07-01

    This document provides a methodology for the performance-based assessment of security systems designed for the protection of nuclear and radiological materials and the processes that produce and/or involve them. It is intended for use with both relatively simple installations and with highly regulated complex sites with demanding security requirements.

  18. MIRD methodology. Part 1

    International Nuclear Information System (INIS)

    Rojo, Ana M.

    2004-01-01

    This lecture develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this first part, the basic concepts and the main equations are presented. The ICRP Dosimetric System is also explained. (author)
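The main MIRD equation mentioned above expresses the absorbed dose to a target organ as the sum, over source organs, of cumulated activity times an S value. The following is my own illustrative sketch of that bookkeeping; the organ names, Ã values and S values are invented placeholders, not data from the lecture.

```python
# Illustrative sketch of the central MIRD relation:
# D(target) = sum over sources of Ã(source) * S(target <- source).
# All numeric values are hypothetical placeholders.

def absorbed_dose(cumulated_activity, s_values):
    """cumulated_activity: {source organ: Ã in MBq·h};
    s_values: {source organ: S(target <- source) in mGy per MBq·h}."""
    return sum(cumulated_activity[src] * s_values[src]
               for src in cumulated_activity)

a_tilde = {"liver": 80.0, "kidneys": 30.0}          # hypothetical Ã values
s_to_target = {"liver": 1.2e-3, "kidneys": 4.0e-4}  # hypothetical S values
print(absorbed_dose(a_tilde, s_to_target))          # dose in mGy
```

In practice the S values come from published MIRD tabulations for a given radionuclide and anatomical model; the sketch only shows how the pieces combine.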

  19. Response Surface Methodology

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.

    2014-01-01

    Abstract: This chapter first summarizes Response Surface Methodology (RSM), which started with Box and Wilson’s article in 1951 on RSM for real, non-simulated systems. RSM is a stepwise heuristic that uses first-order polynomials to approximate the response surface locally. An estimated polynomial
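The first-order step of RSM described above can be sketched in a few lines: fit a first-degree polynomial to a small factorial design and read the steepest-ascent direction from the estimated coefficients. This is my own minimal illustration under assumed data, not code from the chapter.

```python
# Minimal RSM-style sketch: fit y ≈ b0 + b1*x1 + b2*x2 by least squares
# to a 2^2 factorial design plus a centre point, then take the
# steepest-ascent direction proportional to (b1, b2).
# The response values are hypothetical.

import numpy as np

X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0]], dtype=float)
y = np.array([10.0, 14.0, 12.0, 16.0, 13.0])   # hypothetical responses

A = np.column_stack([np.ones(len(X)), X])      # design matrix [1, x1, x2]
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b0, b1, b2 = coef
print(b1, b2)  # steepest ascent moves proportionally to (b1, b2)
```

In the stepwise heuristic, one would move the design centre along (b1, b2), re-run the design, and switch to a second-order model once curvature dominates.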

  20. MIRD methodology. Part 2

    International Nuclear Information System (INIS)

    Gomez Parada, Ines

    2004-01-01

    This paper develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this second part, different methods for the calculation of the accumulated activity are presented, together with the effective half life definition. Different forms of Retention Activity curves are also shown. (author)
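For the simplest retention model (mono-exponential washout), the effective half-life and cumulated activity mentioned above reduce to two short formulas. The sketch below is my own illustration with made-up input values, not material from the paper.

```python
import math

# Effective half-life combines physical decay and biological clearance:
#   1/T_eff = 1/T_phys + 1/T_biol
# For mono-exponential retention the cumulated activity is
#   Ã = ∫ A0·exp(-λ_eff·t) dt = A0 · T_eff / ln 2
# Input values below are illustrative only.

def effective_half_life(t_phys, t_biol):
    return 1.0 / (1.0 / t_phys + 1.0 / t_biol)

def cumulated_activity(a0, t_eff):
    return a0 * t_eff / math.log(2)

t_eff = effective_half_life(8.0, 24.0)   # e.g. physical 8 d, biological 24 d
print(t_eff)                             # ≈ 6 days
print(cumulated_activity(100.0, t_eff))  # MBq·days for A0 = 100 MBq
```

More realistic retention curves (sums of exponentials) are handled by integrating each component the same way and summing the results.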

  1. Testing Significance Testing

    Directory of Open Access Journals (Sweden)

    Joachim I. Krueger

    2018-04-01

    Full Text Available The practice of Significance Testing (ST remains widespread in psychological science despite continual criticism of its flaws and abuses. Using simulation experiments, we address four concerns about ST and for two of these we compare ST’s performance with prominent alternatives. We find the following: First, the 'p' values delivered by ST predict the posterior probability of the tested hypothesis well under many research conditions. Second, low 'p' values support inductive inferences because they are most likely to occur when the tested hypothesis is false. Third, 'p' values track likelihood ratios without raising the uncertainties of relative inference. Fourth, 'p' values predict the replicability of research findings better than confidence intervals do. Given these results, we conclude that 'p' values may be used judiciously as a heuristic tool for inductive inference. Yet, 'p' values cannot bear the full burden of inference. We encourage researchers to be flexible in their selection and use of statistical methods.
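A toy version of the kind of simulation experiment described in this abstract can be run in a few lines: draw samples under a true and a false null hypothesis and compare how often p < .05 occurs. This sketch is my own, not the authors' code, and it simplifies to a one-sample z-test with known unit variance.

```python
# Simulation sketch: p values are uniform when the null is true
# (so p < .05 about 5% of the time) and concentrate near zero when
# the null is false. Effect sizes and sample sizes are illustrative.

import random, math

def p_value_one_sample(xs, mu0=0.0):
    """Two-sided z-test p value, assuming known sd = 1 (a simplification)."""
    n = len(xs)
    z = (sum(xs) / n - mu0) * math.sqrt(n)
    return math.erfc(abs(z) / math.sqrt(2))  # P(|Z| > z) for standard normal

random.seed(1)

def rejection_rate(effect, trials=2000, n=25):
    hits = 0
    for _ in range(trials):
        xs = [random.gauss(effect, 1.0) for _ in range(n)]
        hits += p_value_one_sample(xs) < 0.05
    return hits / trials

print(rejection_rate(0.0))  # close to .05 when the null is true
print(rejection_rate(0.5))  # much higher when the null is false (power)
```

The contrast between the two rates is the sense in which low p values "are most likely to occur when the tested hypothesis is false."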

  2. Predicting significant torso trauma.

    Science.gov (United States)

    Nirula, Ram; Talmor, Daniel; Brasel, Karen

    2005-07-01

    Identification of motor vehicle crash (MVC) characteristics associated with thoracoabdominal injury would advance the development of automatic crash notification systems (ACNS) by improving triage and response times. Our objective was to determine the relationships between MVC characteristics and thoracoabdominal trauma to develop a torso injury probability model. Drivers involved in crashes from 1993 to 2001 within the National Automotive Sampling System were reviewed. Relationships between torso injury and MVC characteristics were assessed using multivariate logistic regression. Receiver operating characteristic curves were used to compare the model to current ACNS models. There were a total of 56,466 drivers. Age, ejection, braking, avoidance, velocity, restraints, passenger-side impact, rollover, and vehicle weight and type were associated with injury (p < 0.05). The area under the receiver operating characteristic curve (83.9) was significantly greater than current ACNS models. We have developed a thoracoabdominal injury probability model that may improve patient triage when used with ACNS.
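The model comparison reported above uses the area under the ROC curve. A minimal way to compute AUC for any risk score is the rank (Mann-Whitney) formulation sketched below; the labels and scores are invented for illustration and have nothing to do with the study's data.

```python
# AUC via the Mann-Whitney formulation: the probability that a randomly
# chosen positive case receives a higher score than a randomly chosen
# negative case (ties count as half). Data below are hypothetical.

def auc(labels, scores):
    """labels: 1 = injured, 0 = not injured; scores: predicted risk."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.7, 0.4, 0.6, 0.3, 0.2, 0.1]  # hypothetical model outputs
print(auc(labels, scores))
```

Comparing two models then amounts to computing this statistic for each model's scores on the same cases, which is how an area of 83.9% can be judged against existing ACNS models.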

  3. Gas revenue increasingly significant

    International Nuclear Information System (INIS)

    Megill, R.E.

    1991-01-01

This paper briefly describes the wellhead prices of natural gas compared to crude oil over the past 70 years. Although natural gas prices have never reached price parity with crude oil, the relative value of a gas BTU has been increasing. This is one of the reasons that the total amount of money coming from natural gas wells is becoming more significant. From 1920 to 1955, revenue at the wellhead for natural gas was only about 10% of the money received by producers; most of the money needed for exploration, development, and production came from crude oil. At present, however, over 40% of the money from the upstream portion of the petroleum industry is from natural gas. As a result, in a few short years natural gas may account for 50% of the revenues generated from wellhead production facilities.

  4. Uranium chemistry: significant advances

    International Nuclear Information System (INIS)

    Mazzanti, M.

    2011-01-01

The author reviews recent progress in uranium chemistry achieved in CEA laboratories. Like its neighbours in the Mendeleev chart, uranium undergoes hydrolysis, oxidation and disproportionation reactions, which make the chemistry of these species in water highly complex. The study of the chemistry of uranium in an anhydrous medium has made it possible to correlate the structural and electronic differences observed in the interaction of uranium(III) and the lanthanides(III) with nitrogen or sulfur molecules with the effectiveness of these molecules in An(III)/Ln(III) separation via liquid-liquid extraction. Recent work on the redox reactivity of trivalent uranium U(III) in an organic medium with molecules such as water or the azide ion (N3-) in stoichiometric quantities has led to extremely interesting uranium aggregates, in particular those involved in actinide migration in the environment or in aggregation problems in the fuel processing cycle. Another significant advance was the discovery of a compound containing the uranyl ion with oxidation state (V), UO2+, obtained by oxidation of uranium(III). Recently chemists have succeeded in blocking the disproportionation reaction of uranyl(V) and in stabilizing polymetallic complexes of uranyl(V), opening the way to a systematic study of the reactivity and the electronic and magnetic properties of uranyl(V) compounds. (A.C.)

  5. Meaning and significance of

    Directory of Open Access Journals (Sweden)

    Ph D Student Roman Mihaela

    2011-05-01

Full Text Available The concept of "public accountability" is a challenge for political science: a new concept in this area, still in full debate and development both in theory and in practice. This paper is a theoretical approach presenting some definitions, relevant meanings and the significance of the concept in political science. The importance of this concept is that although it was originally used as a tool to improve the effectiveness and efficiency of public governance, it has gradually become an end in itself. "Accountability" has become an image of good governance, first in the United States of America, then in the European Union. Nevertheless, the concept is vaguely defined and provides ambiguous images of good governance. This paper begins with the presentation of some general meanings of the concept as they emerge from specialized dictionaries and encyclopaedias, and continues with the meanings developed in political science. The concept of "public accountability" is rooted in the economics and management literature, and is becoming increasingly relevant in today's political science, both in theory and discourse and in the practice of formulating and evaluating public policies. A first conclusion that emerges from the analysis of the evolution of this term is that it requires conceptual clarification in political science. A clear definition will then enable an appropriate model for improving the system of public accountability in formulating and assessing public policies, in order to implement a system of assessment and monitoring thereof.

  6. Review and evaluation of paleohydrologic methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Foley, M.G.; Zimmerman, D.A.; Doesburg, J.M.; Thorne, P.D.

    1982-12-01

    A literature review was conducted to identify methodologies that could be used to interpret paleohydrologic environments. Paleohydrology is the study of past hydrologic systems or of the past behavior of an existing hydrologic system. The purpose of the review was to evaluate how well these methodologies could be applied to the siting of low-level radioactive waste facilities. The computer literature search queried five bibliographical data bases containing over five million citations of technical journals, books, conference papers, and reports. Two data-base searches (United States Geological Survey - USGS) and a manual search were also conducted. The methodologies were examined for data requirements and sensitivity limits. Paleohydrologic interpretations are uncertain because of the effects of time on hydrologic and geologic systems and because of the complexity of fluvial systems. Paleoflow determinations appear in many cases to be order-of-magnitude estimates. However, the methodologies identified in this report mitigate this uncertainty when used collectively as well as independently. That is, the data from individual methodologies can be compared or combined to corroborate hydrologic predictions. In this manner, paleohydrologic methodologies are viable tools to assist in evaluating the likely future hydrology of low-level radioactive waste sites.

  7. Review and evaluation of paleohydrologic methodologies

    International Nuclear Information System (INIS)

    Foley, M.G.; Zimmerman, D.A.; Doesburg, J.M.; Thorne, P.D.

    1982-12-01

    A literature review was conducted to identify methodologies that could be used to interpret paleohydrologic environments. Paleohydrology is the study of past hydrologic systems or of the past behavior of an existing hydrologic system. The purpose of the review was to evaluate how well these methodologies could be applied to the siting of low-level radioactive waste facilities. The computer literature search queried five bibliographical data bases containing over five million citations of technical journals, books, conference papers, and reports. Two data-base searches (United States Geological Survey - USGS) and a manual search were also conducted. The methodologies were examined for data requirements and sensitivity limits. Paleohydrologic interpretations are uncertain because of the effects of time on hydrologic and geologic systems and because of the complexity of fluvial systems. Paleoflow determinations appear in many cases to be order-of-magnitude estimates. However, the methodologies identified in this report mitigate this uncertainty when used collectively as well as independently. That is, the data from individual methodologies can be compared or combined to corroborate hydrologic predictions. In this manner, paleohydrologic methodologies are viable tools to assist in evaluating the likely future hydrology of low-level radioactive waste sites

  8. Significant Radionuclides Determination

    Energy Technology Data Exchange (ETDEWEB)

    Jo A. Ziegler

    2001-07-31

    The purpose of this calculation is to identify radionuclides that are significant to offsite doses from potential preclosure events for spent nuclear fuel (SNF) and high-level radioactive waste expected to be received at the potential Monitored Geologic Repository (MGR). In this calculation, high-level radioactive waste is included in references to DOE SNF. A previous document, ''DOE SNF DBE Offsite Dose Calculations'' (CRWMS M&O 1999b), calculated the source terms and offsite doses for Department of Energy (DOE) and Naval SNF for use in design basis event analyses. This calculation reproduces only DOE SNF work (i.e., no naval SNF work is included in this calculation) created in ''DOE SNF DBE Offsite Dose Calculations'' and expands the calculation to include DOE SNF expected to produce a high dose consequence (even though the quantity of the SNF is expected to be small) and SNF owned by commercial nuclear power producers. The calculation does not address any specific off-normal/DBE event scenarios for receiving, handling, or packaging of SNF. The results of this calculation are developed for comparative analysis to establish the important radionuclides and do not represent the final source terms to be used for license application. This calculation will be used as input to preclosure safety analyses and is performed in accordance with procedure AP-3.12Q, ''Calculations'', and is subject to the requirements of DOE/RW-0333P, ''Quality Assurance Requirements and Description'' (DOE 2000) as determined by the activity evaluation contained in ''Technical Work Plan for: Preclosure Safety Analysis, TWP-MGR-SE-000010'' (CRWMS M&O 2000b) in accordance with procedure AP-2.21Q, ''Quality Determinations and Planning for Scientific, Engineering, and Regulatory Compliance Activities''.

  9. Soft Systems Methodology

    Science.gov (United States)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation, to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques, core tenets described through a wide range of settings.

  10. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

As part of learning at the Nordic Workshop of Evidence-based Medicine, we have read with interest the practice guidelines for central venous access, published in your Journal in 2012.1 We appraised the quality of this guideline using the checklist developed by The Evidence-Based Medicine Working Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest.

  11. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Asja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  12. Steganography: LSB Methodology

    Science.gov (United States)

    2012-08-02

In computer science, steganography is the science … This progress report discusses the J. Fridrich, M. Gojan and R. Du paper titled "Reliable detection of LSB steganography in grayscale and color images" (in J. Dittmann, K. Nahrstedt, and P. Wohlmacher, editors, Proceedings of the ACM, Special…).
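The LSB technique this record concerns can be illustrated in a few lines: hide message bits in the least significant bit of each cover byte and read them back. This is my own minimal sketch over a plain bytearray (a stand-in for pixel data), not code from the report.

```python
# Minimal LSB steganography illustration: each cover byte's least
# significant bit is overwritten with one message bit (LSB-first per byte),
# changing each byte's value by at most 1.

def embed(cover: bytearray, message: bytes) -> bytearray:
    bits = [(byte >> i) & 1 for byte in message for i in range(8)]
    assert len(bits) <= len(cover), "cover too small for the message"
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit  # overwrite only the LSB
    return stego

def extract(stego: bytearray, n_bytes: int) -> bytes:
    out = bytearray()
    for b in range(n_bytes):
        byte = 0
        for i in range(8):
            byte |= (stego[b * 8 + i] & 1) << i
        out.append(byte)
    return bytes(out)

cover = bytearray(range(48))      # stand-in for grayscale pixel values
stego = embed(cover, b"hi")
print(extract(stego, 2))          # b'hi'
```

Detection methods such as the one by Fridrich et al. cited above exploit the statistical traces this bit-flipping leaves in the pixel-value histogram.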

  13. Soil Radiological Characterisation Methodology

    International Nuclear Information System (INIS)

    Attiogbe, Julien; Aubonnet, Emilie; De Maquille, Laurence; De Moura, Patrick; Desnoyers, Yvon; Dubot, Didier; Feret, Bruno; Fichet, Pascal; Granier, Guy; Iooss, Bertrand; Nokhamzon, Jean-Guy; Ollivier Dehaye, Catherine; Pillette-Cousin, Lucien; Savary, Alain

    2014-12-01

This report presents the general methodology and best-practice approaches, combining proven existing techniques for sampling and characterisation, to assess the contamination of soils prior to remediation. It is based on feedback from projects conducted by the main French nuclear stakeholders involved in the field of remediation and dismantling (EDF, CEA, AREVA and IRSN). The application of this methodology will enable project managers to obtain the elements necessary for drawing up the files associated with remediation operations, as required by the regulatory authorities. It is applicable to each of the steps necessary for piloting remediation work-sites, depending on the objectives targeted (release into the public domain, re-use, etc.). The main part describes the statistical methodology applied: exploratory data analysis, variography, and the identification and location of singular points. The results obtained permit a mapping that identifies the contaminated surface and subsurface areas. The report frames radiological site characterisation from the initial investigations (historical and functional analysis) through to checking that the remediation objectives have been met. An example application follows, drawn from feedback from the remediation of a contaminated site at the Fontenay aux Roses facility. It is supplemented by a glossary of the main terms used in the field, drawn from various publications and international standards. This technical report supports the ISO Standard ISO/TC 85/SC 5 N 18557, 'Sampling and characterisation principles for soils, buildings and infrastructures contaminated by radionuclides for remediation purposes'. (authors) [fr

  14. Advantages of Westinghouse BWR control rod drop accidents methodology utilizing integrated POLCA-T code

    International Nuclear Information System (INIS)

    Panayotov, Dobromir

    2008-01-01

The paper focuses on the activities pursued by Westinghouse in the development and licensing of the POLCA-T code Control Rod Drop Accident (CRDA) methodology. The comprehensive CRDA methodology, which utilizes the PHOENIX4/POLCA7/POLCA-T calculation chain, provides for a complete cycle-specific analysis. The methodology consists of the determination of control rod (CR) candidates that could cause a significant reactivity excursion if dropped at any point in the fuel cycle, the selection of limiting initial conditions for the CRDA transient simulation, and the transient simulation itself. The Westinghouse methodology utilizes state-of-the-art methods. Unnecessary conservatisms in the methodology have been avoided to allow accurate prediction of the margin to design bases. This is mainly achieved by using the POLCA-T code for dynamic CRDA evaluations. The code belongs to the same calculation chain that is used for core design, so the very same reactor, core, cycle and fuel database is used. This also allows a reduction in the uncertainties of the input data and parameters that determine the energy deposition in the fuel. Uncertainty treatment, very selective use of conservatisms, selection of the initial conditions for limiting-case analyses, and incorporation of the licensed fuel performance code's models into the POLCA-T code are also among the means of performing realistic CRDA transient analyses. (author)

  15. Methodology for evaluation of diagnostic performance

    International Nuclear Information System (INIS)

    Metz, C.E.

    1992-01-01

    Effort in this project during the past year has focused on the development, refinement, and distribution of computer software that will allow current Receiver Operating Characteristic (ROC) methodology to be used conveniently and reliably by investigators in a variety of evaluation tasks in diagnostic medicine; and on the development of new ROC methodology that will broaden the spectrum of evaluation tasks and/or experimental settings to which the fundamental approach can be applied. Progress has been limited by the amount of financial support made available to the project
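The core ROC summary such software computes can be sketched in a few lines: the area under the ROC curve equals the probability that a randomly chosen positive case scores higher than a randomly chosen negative one, with ties counted as one half. This is a generic illustration, not the project's software.

```python
def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney pairwise comparison:
    fraction of (positive, negative) pairs ranked correctly (ties = 0.5)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC of 0.5 corresponds to a diagnostic test no better than chance; 1.0 to perfect discrimination.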

  16. Online Public Access Catalog User Studies: A Review of Research Methodologies, March 1986-November 1989.

    Science.gov (United States)

    Seymour, Sharon

    1991-01-01

    Review of research methodologies used in studies of online public access catalog (OPAC) users finds that a variety of research methodologies--e.g., surveys, transaction log analysis, interviews--have been used with varying degrees of expertise. It is concluded that poor research methodology resulting from limited training and resources limits the…

  17. Two methodologies for optical analysis of contaminated engine lubricants

    International Nuclear Information System (INIS)

    Aghayan, Hamid; Yang, Jun; Bordatchev, Evgueni

    2012-01-01

The performance, efficiency and lifetime of modern combustion engines significantly depend on the quality of the engine lubricants. However, contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and lubricant quality. Therefore, direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electro-magnetic, chemical and optical properties, have been intensively utilized in the engine condition monitoring systems and sensors developed within the last decade. Such sensors for the measurement of engine lubricant properties can be used to detect a functional limit of the in-use lubricant, increase the drain interval and reduce the environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in engine lubricants. The methodologies are based on optical analysis of the distortion effect when an object image is obtained through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies is in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant-object optical appearance, in which an a priori known periodic structure of the object is distorted by the contaminated lubricant. In the object shape-based optical analysis, several parameters of an acquired optical image, such as the gray-scale intensity of lubricant and object, shape width at object and lubricant levels, object relative intensity and width non-uniformity coefficient, are newly proposed. Variations in contaminant concentration and the use of different contaminants lead to changes in these parameters, which are measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. auto- and cross-correlation functions, auto- and cross-spectrums, transfer function

  18. CHARACTERIZATION OF SMALL AND MEDIUM ENTERPRISES (SMES OF POMERANIAN REGION IN SIX SIGMA METHODOLOGY APPLICATION

    Directory of Open Access Journals (Sweden)

    2011-12-01

Background: Six Sigma relates both to product characteristics and to the parameters of the actions needed to obtain those products. It is also a multi-step, cyclic process aimed at improvements leading toward a global standard close to perfection. There is growing interest in the Six Sigma methodology among smaller organizations, but there are still too few publications presenting results from the sector of small and medium enterprises, especially ones based on sound empirical data. It was already noticed during the preliminary research that only a small part of the companies from this sector in the Pomeranian region use elements of this methodology. Methods: The companies were divided into groups by the type of their activities and by employment size. Questionnaires were sent to 150 randomly selected organizations in two steps and were addressed to senior managers. The questionnaire asked for basic information about the company, the level of knowledge and practical application of the Six Sigma methodology, opinions about improvements of processes occurring in the company, and opinions about training in the Six Sigma methodology. Results: The following hypotheses were proposed and statistically verified: The lack of adequate knowledge of the Six Sigma methodology in SMEs limits the possibility to effectively monitor and improve processes - accepted. The use of the statistical tools of the Six Sigma methodology requires broad action to popularize this knowledge among national SMEs - accepted. The level of awareness of the importance, as well as the practical use, of the Six Sigma methodology in manufacturing SMEs is higher than in SMEs providing services - rejected, the levels are equal. The level of knowledge and use of the Six Sigma methodology in medium manufacturing companies is significantly higher than in small manufacturing companies - accepted. The level of the knowledge and the application
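For readers unfamiliar with the metric behind such surveys, a common Six Sigma calculation converts a defect rate (defects per million opportunities, DPMO) into a short-term sigma level using the conventional 1.5-sigma shift. The sketch below is a generic illustration, not part of the study.

```python
from statistics import NormalDist

def sigma_level(defects, opportunities):
    """Short-term sigma level from an observed defect rate.

    DPMO = defects per million opportunities; the 1.5-sigma shift is the
    conventional allowance for long-term process drift.
    """
    dpmo = 1e6 * defects / opportunities
    return NormalDist().inv_cdf(1.0 - dpmo / 1e6) + 1.5
```

A classic reference point: 3.4 DPMO corresponds to the "six sigma" quality level, while roughly 308,500 DPMO corresponds to a two-sigma process.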

  19. New scoring methodology improves the sensitivity of the Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) in clinical trials.

    Science.gov (United States)

    Verma, Nishant; Beretvas, S Natasha; Pascual, Belen; Masdeu, Joseph C; Markey, Mia K

    2015-11-12

As currently used, the Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) has low sensitivity for measuring Alzheimer's disease progression in clinical trials. A major reason behind the low sensitivity is its sub-optimal scoring methodology, which can be improved to obtain better sensitivity. Using item response theory, we developed a new scoring methodology (ADAS-CogIRT) for the ADAS-Cog, which addresses several major limitations of the current scoring methodology. The sensitivity of the ADAS-CogIRT methodology was evaluated using clinical trial simulations as well as a negative clinical trial, which had shown evidence of a treatment effect. The ADAS-Cog was found to measure impairment in three cognitive domains: memory, language, and praxis. The ADAS-CogIRT methodology required significantly fewer patients and shorter trial durations as compared to the current scoring methodology when both were evaluated in simulated clinical trials. When validated on data from a real clinical trial, the ADAS-CogIRT methodology had higher sensitivity than the current scoring methodology in detecting the treatment effect. The proposed scoring methodology significantly improves the sensitivity of the ADAS-Cog in measuring the progression of cognitive impairment in clinical trials focused on the mild-to-moderate Alzheimer's disease stage. This provides a boost to the efficiency of clinical trials, requiring fewer patients and shorter durations for investigating disease-modifying treatments.
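As a hedged sketch of the item-response-theory idea underlying such scoring (not the published ADAS-CogIRT implementation, which fits a multidimensional model), a two-parameter logistic model scores a patient by maximum-likelihood estimation of a latent ability from item responses. The grid-search estimator and item parameters below are illustrative assumptions.

```python
import math

def p_correct(theta, a, b):
    """Two-parameter logistic IRT model: a = discrimination, b = difficulty."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def estimate_theta(responses, items):
    """Grid-search maximum-likelihood estimate of latent ability theta.

    responses: 1/0 per item; items: list of (a, b) parameter pairs.
    """
    grid = [x / 100.0 for x in range(-400, 401)]
    def loglik(theta):
        ll = 0.0
        for u, (a, b) in zip(responses, items):
            p = p_correct(theta, a, b)
            ll += math.log(p if u else 1.0 - p)
        return ll
    return max(grid, key=loglik)
```

Unlike a raw sum score, the IRT estimate weights items by how informative they are at the patient's ability level, which is one source of the sensitivity gain the abstract reports.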

  20. Agile methodology selection criteria: IT start-up case study

    Science.gov (United States)

    Micic, Lj

    2017-05-01

Project management in modern IT companies is often based on agile methodologies, which have several advantages compared to traditional methodologies such as waterfall. Since clients sometimes change a project during development, it is crucial for an IT company to choose carefully which methodology to implement, and whether it will be based mostly on one methodology or on a combination of several. There are several modern and frequently used methodologies, among which Scrum, Kanban and XP (extreme programming) are the most common. Sometimes companies mostly use the tools and procedures of one methodology, but quite often they use some combination of them. Because these methodologies are just frameworks, they allow companies to adapt them to their specific projects as well as to other constraints. These methodologies are in limited use in Bosnia, but more and more IT companies are starting to adopt them, not only because they are common practice among clients abroad, but also because they are becoming the only option for delivering a quality product on time. However, it is always challenging to decide which methodology, or combination of several, a company should implement, and how to connect it to its own projects, organizational framework and HR management. This paper presents a case study based on a local IT start-up and delivers a solution based on a theoretical framework and the practical limitations of the case company.

  1. Novel limiter pump topologies

    International Nuclear Information System (INIS)

    Schultz, J.H.

    1981-01-01

The use of limiter pumps as the principal plasma exhaust system of a magnetic confinement fusion device promises significant simplification when compared to previously investigated divertor-based systems. Further simplifications, such as the integration of the exhaust system with a radio-frequency heating system and with the main reactor shield and structure, are investigated below. The integrity of limiters in a reactor environment is threatened by many mechanisms, the most severe of which may be erosion by sputtering. Two novel topologies are suggested which allow high erosion without limiter failure

  3. EVOLUTION OF THEORETICAL AND METHODOLOGICAL FOUNDATIONS OF COMPARATIVE ECONOMICS

    OpenAIRE

    N. Grazhevska

    2014-01-01

The article traces the stages of evolution of the theoretical and methodological foundations of comparative economics. The author highlights algorithms of comparative analysis, as well as the theoretical and methodological limitations of four research programs of the new comparative economics. The article justifies the necessity of a comprehensive comparative study of the major trends and contradictions in the development of national economies in the era of globalization.

  4. Methodology for reliability, economic and environmental assessment of wave energy

    International Nuclear Information System (INIS)

    Thorpe, T.W.; Muirhead, S.

    1994-01-01

    As part of the Preliminary Actions in Wave Energy R and D for DG XII's Joule programme, methodologies were developed to facilitate assessment of the reliability, economics and environmental impact of wave energy. This paper outlines these methodologies, their limitations and areas requiring further R and D. (author)

  5. A theoretical and experimental investigation of propeller performance methodologies

    Science.gov (United States)

    Korkan, K. D.; Gregorek, G. M.; Mikkelson, D. C.

    1980-01-01

This paper briefly covers aspects related to propeller performance by means of a review of propeller methodologies; a presentation of wind tunnel propeller performance data taken in the NASA Lewis Research Center 10 x 10 wind tunnel; a discussion of the predominant limitations of existing propeller performance methodologies; and a brief review of airfoil developments appropriate for propeller applications.

  6. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology.

  7. Detection of significant protein coevolution.

    Science.gov (United States)

    Ochoa, David; Juan, David; Valencia, Alfonso; Pazos, Florencio

    2015-07-01

The evolution of proteins cannot be fully understood without taking into account the coevolutionary linkages entangling them. From a practical point of view, coevolution between protein families has been used as a way of detecting protein interactions and functional relationships from genomic information. The most common approach to inferring protein coevolution involves the quantification of phylogenetic tree similarity using a family of methodologies termed mirrortree. In spite of their success, a fundamental problem of these approaches is the lack of an adequate statistical framework to assess the significance of a given coevolutionary score (tree similarity). As a consequence, a number of ad hoc filters and arbitrary thresholds are required in an attempt to obtain a final set of confident coevolutionary signals. In this work, we developed a method for associating confidence estimators (P values) to the tree-similarity scores, using a null model specifically designed for the tree comparison problem. We show how this approach largely improves the quality and coverage (number of pairs that can be evaluated) of the detected coevolution in all the stages of the mirrortree workflow, independently of the starting genomic information. This not only leads to a better understanding of protein coevolution and its biological implications, but also to obtaining a highly reliable and comprehensive network of predicted interactions, as well as information on the substructure of macromolecular complexes, using only genomic information. The software and datasets used in this work are freely available at: http://csbg.cnb.csic.es/pMT/. Contact: pazos@cnb.csic.es. Supplementary data are available at Bioinformatics online.
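A minimal sketch of the mirrortree idea with a permutation-based significance estimate might look as follows. The data layout (distance matrices as dicts keyed by frozenset species pairs), function names, and the label-shuffling null model are illustrative assumptions, not the pMT implementation.

```python
import random

def pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def mirrortree_pvalue(d1, d2, n_perm=999, seed=0):
    """Tree-similarity score (correlation of inter-species distance matrices)
    plus an upper-tail permutation P value.

    d1, d2: dicts mapping frozenset({sp_a, sp_b}) to a distance, over the
    same species set. The null model shuffles species labels of the second
    matrix and recomputes the correlation.
    """
    species = sorted({s for pair in d1 for s in pair})
    pairs = [(a, b) for i, a in enumerate(species) for b in species[i + 1:]]
    x = [d1[frozenset(p)] for p in pairs]
    observed = pearson(x, [d2[frozenset(p)] for p in pairs])
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_perm):
        shuffled = species[:]
        rng.shuffle(shuffled)
        relabel = dict(zip(species, shuffled))
        y = [d2[frozenset((relabel[a], relabel[b]))] for a, b in pairs]
        if pearson(x, y) >= observed:
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)
```

The `(hits + 1) / (n_perm + 1)` estimator avoids reporting a P value of exactly zero from a finite number of permutations.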

  8. Health economic evaluation: important principles and methodology.

    Science.gov (United States)

    Rudmik, Luke; Drummond, Michael

    2013-06-01

To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations.
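Two of the building blocks mentioned above, discounting and the incremental cost-effectiveness ratio (extra cost per QALY gained), can be sketched in a few lines. The 3% default discount rate and the function names are illustrative assumptions, not taken from the article.

```python
def discounted_total(yearly_values, rate=0.03):
    """Present value of a yearly stream (year 0 is undiscounted)."""
    return sum(v / (1.0 + rate) ** t for t, v in enumerate(yearly_values))

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)
```

A decision maker would compare the resulting cost-per-QALY figure against a willingness-to-pay threshold; both costs and QALYs are normally discounted before the ratio is taken.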

  9. A methodology to describe process control requirements

    International Nuclear Information System (INIS)

    Carcagno, R.; Ganni, V.

    1994-01-01

    This paper presents a methodology to describe process control requirements for helium refrigeration plants. The SSC requires a greater level of automation for its refrigeration plants than is common in the cryogenics industry, and traditional methods (e.g., written descriptions) used to describe process control requirements are not sufficient. The methodology presented in this paper employs tabular and graphic representations in addition to written descriptions. The resulting document constitutes a tool for efficient communication among the different people involved in the design, development, operation, and maintenance of the control system. The methodology is not limited to helium refrigeration plants, and can be applied to any process with similar requirements. The paper includes examples

  10. Methodology for evaluation of industrial CHP production

    International Nuclear Information System (INIS)

    Pavlovic, Nenad V.; Studovic, Milovan

    2000-01-01

At the end of the century, industry switched from being exclusively a power consumer to being a power consumer-producer, one of the players on the deregulated power market. Consequently, the goals of industrial plant optimization have to change, creating new challenges that industrial management has to face. The paper reviews the authors' methodology for the evaluation of industrial power production on the deregulated power market. The methodology takes the economic efficiency of industrial CHP facilities as the main criterion for evaluation. Energy and ecological efficiency are used as additional criteria, in which social goals can implicitly be found. The methodology also identifies key and limiting factors for CHP production in industry. It can be successfully applied using available commercial software for energy simulation in CHP plants and economic evaluation. (Authors)

  11. Case Study Research Methodology

    Directory of Open Access Journals (Sweden)

    Mark Widdowson

    2011-01-01

Commenting on the lack of case studies published in modern psychotherapy publications, the author reviews the strengths of case study methodology and responds to common criticisms, before providing a summary of types of case studies including clinical, experimental and naturalistic. Suggestions are included for developing systematic case studies and brief descriptions are given of a range of research resources relating to outcome and process measures. Examples of a pragmatic case study design and a hermeneutic single-case efficacy design are given and the paper concludes with some ethical considerations and an exhortation to the TA community to engage more widely in case study research.

  12. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
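A standard worked example of the kind of methodology the book covers is the normal-approximation sample size for comparing two means. The sketch below assumes a two-sided two-sample z-test with equal known variances; it is a generic illustration, not taken from the book.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Sample size per arm for detecting a mean difference delta with a
    two-sided two-sample z-test: n = 2 * ((z_{1-a/2} + z_{power}) * sigma / delta)^2."""
    z = NormalDist().inv_cdf
    n = 2.0 * ((z(1.0 - alpha / 2.0) + z(power)) * sigma / delta) ** 2
    return ceil(n)
```

For a standardized effect size of 0.5 (delta = 0.5, sigma = 1) at 80% power and 5% two-sided alpha, this gives the familiar 63 subjects per arm; exact t-test calculations give slightly larger values.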

  13. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    The reports comprising this volume concern the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in granites in Cornwall, with particular reference to the effect of structures imposed by discontinuities on the engineering behaviour of rock masses. The topics covered are: in-situ stress measurements using (a) the hydraulic fracturing method, or (b) the US Bureau of Mines deformation probe; scanline discontinuity survey - coding form and instructions, and data; applicability of geostatistical estimation methods to scalar rock properties; comments on in-situ stress at the Carwynnen test mine and the state of stress in the British Isles. (U.K.)

  14. Microphysics evolution and methodology

    International Nuclear Information System (INIS)

    Dionisio, J.S.

    1990-01-01

A few general features of the evolution of microphysics and their relationship with its methodology are briefly surveyed. Several pluri-disciplinary and interdisciplinary aspects of microphysics research are also discussed in the present scientific context. The need for an equilibrium between individual tendencies and the collective constraints required by team work, already formulated thirty years ago by Frederic Joliot, is particularly stressed in the present conjuncture of nuclear research, which favours very large team projects and discourages individual initiatives. The increasing importance of the science of science (due to its multiple social, economic and ecological aspects) and the stronger competition between national and international tendencies in scientific (and technical) cooperation are also discussed. (author)

  15. MIRD methodology; Metodologia MIRD

    Energy Technology Data Exchange (ETDEWEB)

    Rojo, Ana M [Autoridad Regulatoria Nuclear, Buenos Aires (Argentina); Gomez Parada, Ines [Sociedad Argentina de Radioproteccion, Buenos Aires (Argentina)

    2004-07-01

The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of the USA in 1960 to assist the medical community in estimating the dose to organs and tissues due to the incorporation of radioactive materials. Since then, the 'MIRD Dose Estimate Reports' (Nos. 1 to 12) and 'Pamphlets', of great utility for dose calculations, have been published. The MIRD system was planned essentially for the calculation of doses received by patients during nuclear medicine diagnostic procedures. The MIRD methodology for absorbed dose calculations in different tissues is explained.
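The core MIRD schema can be sketched in one line: the absorbed dose to a target organ is the sum, over source organs, of the cumulated activity in each source times the tabulated S factor for that source-target pair. The dict-based layout and toy numbers below are illustrative assumptions.

```python
def absorbed_dose(cumulated_activity, s_factors, target):
    """MIRD schema: D(target) = sum over sources of A_tilde(source) * S(target <- source).

    cumulated_activity: {source organ: A_tilde in Bq*s}
    s_factors: {(target, source): S in Gy per Bq*s}
    """
    return sum(a_tilde * s_factors[(target, source)]
               for source, a_tilde in cumulated_activity.items())
```

In practice the cumulated activity comes from time-activity curves measured or modelled for each source organ, and the S factors from published MIRD tabulations for a reference phantom.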

  16. Beam optimization: improving methodology

    International Nuclear Information System (INIS)

    Quinteiro, Guillermo F.

    2004-01-01

Different optimization techniques commonly used in biology and food technology allow a systematic and complete analysis of response functions. In spite of the great interest in medical and nuclear physics in the problem of optimizing mixed beams, little attention has been given to sophisticated mathematical tools. Indeed, many techniques are perfectly suited to the typical problem of beam optimization. This article is intended as a guide to the use of two methods, namely Response Surface Methodology and Simplex, which are expected to speed up the optimization process and, at the same time, give more insight into the relationships among the dependent variables controlling the response
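As a toy illustration of the Response Surface Methodology idea (not from the article), one can fit a quadratic response surface to measured responses by least squares and locate its stationary point; a single factor is used here for brevity, whereas real beam-optimization problems involve several.

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = b0 + b1*x + b2*x^2 via the normal equations."""
    rows = [[1.0, x, x * x] for x in xs]
    A = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    c = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    # Gaussian elimination with partial pivoting
    for i in range(3):
        p = max(range(i, 3), key=lambda k: abs(A[k][i]))
        A[i], A[p] = A[p], A[i]
        c[i], c[p] = c[p], c[i]
        for k in range(i + 1, 3):
            f = A[k][i] / A[i][i]
            A[k] = [a - f * b for a, b in zip(A[k], A[i])]
            c[k] -= f * c[i]
    b = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):  # back substitution
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, 3))) / A[i][i]
    return b

def stationary_point(b):
    """x where the fitted response is extremal: dy/dx = b1 + 2*b2*x = 0."""
    return -b[1] / (2.0 * b[2])
```

The Simplex method mentioned in the abstract takes a different, derivative-free route to the same optimum by repeatedly reflecting the worst vertex of a simplex of trial points.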

  17. Literacy research methodologies

    CERN Document Server

    Duke, Nell K

    2012-01-01

    The definitive reference on literacy research methods, this book serves as a key resource for researchers and as a text in graduate-level courses. Distinguished scholars clearly describe established and emerging methodologies, discuss the types of questions and claims for which each is best suited, identify standards of quality, and present exemplary studies that illustrate the approaches at their best. The book demonstrates how each mode of inquiry can yield unique insights into literacy learning and teaching and how the methods can work together to move the field forward.   New to This Editi

  18. Methodology of site studies

    International Nuclear Information System (INIS)

    Caries, J.C.; Hugon, J.; Grauby, A.

    1980-01-01

This methodology consists of an essentially dynamic, estimative and follow-up analysis of the impact of discharges on all the environmental compartments, whether natural or not, that play a part in the protection of man and his environment. It applies at two levels, to wit: the choice of site, or the detailed study of the site selected. Two examples of its application are developed, namely: at the choice-of-site level in the case of marine sites, and at the detailed-study level of the chosen site in that of a riverside site.

  19. Alternative pricing methodologies

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    With the increased interest in competitive market forces and growing recognition of the deficiencies in current practices, FERC and others are exploring alternatives to embedded cost pricing. A number of these alternatives are discussed in this chapter. Marketplace pricing, discussed briefly here, is the subject of the next chapter. Obviously, the pricing formula may combine several of these methodologies. One utility of which the authors are aware is seeking a price equal to the sum of embedded costs, opportunity costs, line losses, value of service, FERC's percentage adder formula and a contract service charge

  20. No significant fuel failures (NSFF)

    International Nuclear Information System (INIS)

    Domaratzki, Z.

    1979-01-01

    It has long been recognized that no emergency core cooling system (ECCS) could be absolutely guaranteed to prevent fuel failures. In 1976 the Atomic Energy Control Board decided that the objective for an ECCS should be to prevent fuel failures, but if the objective could not be met it should be shown that the consequences are acceptable for dual failures comprising any LOCA combined with an assumed impairment of containment. Out of the review of the Bruce A plant came the definition of 'no significant fuel failures': for any postulated LOCA combined with any one mode of containment impairment the resultant dose to a person at the edge of the exclusion zone is less than the reference dose limits for dual failures

  1. Methodology of site protection studies

    International Nuclear Information System (INIS)

    Farges, L.

    1980-01-01

Preliminary studies preceding the building of a nuclear facility aim at assessing the choice of a site and establishing operating and control procedures. These studies are of two types: studies on the impact of the environment on the nuclear facility to be constructed, and studies on the impact of the nuclear facility on the environment. A methodology providing a framework for studies of the second type is presented. These studies are undertaken to choose suitable sites for nuclear facilities. After a preliminary selection of a site based on a first estimate, a detailed site study is undertaken. The procedure consists of five successive phases, namely: (1) an inquiry assessing the initial state of the site; (2) an initial synthesis of the accumulated information for assessing the health and safety consequences of releases; (3) laboratory and field studies simulating the movement of waste products for a quantitative assessment of effects; (4) a final synthesis for laying down the release limits and radiological control methods; and (5) conclusions based on comparing the data of the final synthesis to the limits prescribed by regulations. These five phases are outlined. The role of periodic reassessments after the facility has been in operation for some time is explained. (M.G.B.)

  2. Radiological Characterization Methodology for INEEL-Stored Remote-Handled Transuranic (RH TRU) Waste from Argonne National Laboratory-East

    International Nuclear Information System (INIS)

    Kuan, P.; Bhatt, R.N.

    2003-01-01

An Acceptable Knowledge (AK)-based radiological characterization methodology is being developed for RH TRU waste generated from ANL-E hot cell operations performed on fuel elements irradiated in the EBR-II reactor. The methodology relies on AK of the composition of the fresh fuel elements, their irradiation history, and the waste generation and collection processes. Radiological characterization of the waste involves estimating the quantities of significant fission products and transuranic isotopes in the waste. Methods based on reactor physics principles are used to obtain these estimates. Because of the availability of AK and the robustness of the calculation methods, the AK-based characterization methodology offers a superior alternative to traditional waste assay techniques. Using the methodology, it is shown that the radiological parameters of a test batch of ANL-E waste are well within the proposed WIPP Waste Acceptance Criteria limits

  3. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  5. Safeguarding the fuel cycle: Methodologies

    International Nuclear Information System (INIS)

    Gruemm, H.

    1984-01-01

The effectiveness of IAEA safeguards is characterized by the extent to which they achieve their basic purpose - credible verification that no nuclear material is diverted from peaceful uses. This effectiveness depends, inter alia but significantly, on manpower, in terms of both the number and the qualifications of inspectors. Staff increases will be required to improve effectiveness further, if this is requested by Member States, as well as to take into account new facilities expected to come under safeguards in the future. However, they are difficult to achieve due to financial constraints set by the IAEA budget. As a consequence, much has been done and is being undertaken to improve the utilization of available manpower, including standardization of inspection procedures; improvement of management practices and training; rationalization of planning, reporting, and evaluation of inspection activities; and development of new equipment. This article focuses on certain aspects of the verification methodology presently used and asks: are any modifications of this methodology conceivable that would lead to economies of manpower without loss of effectiveness? It has been stated in this context that present safeguards approaches are ''facility-oriented'' and that the adoption of a ''fuel cycle-oriented approach'' might bring about the desired savings. Many studies have been devoted to this very interesting suggestion. Up to this moment, no definite answer is available and further studies will be necessary to come to a conclusion. In what follows, the essentials of the problem are explained and some possible paths to a solution are discussed

  6. RHIC Data Correlation Methodology

    International Nuclear Information System (INIS)

    Michnoff, R.; D'Ottavio, T.; Hoff, L.; MacKay, W.; Satogata, T.

    1999-01-01

    A requirement for RHIC data plotting software and physics analysis is the correlation of data from all accelerator data gathering systems. Data correlation provides the capability for a user to request a plot of multiple data channels vs. time, and to make meaningful time-correlated data comparisons. The task of data correlation for RHIC requires careful consideration because data acquisition triggers are generated from various asynchronous sources including events from the RHIC Event Link, events from the two Beam Sync Links, and other unrelated clocks. In order to correlate data from asynchronous acquisition systems a common time reference is required. The RHIC data correlation methodology will allow all RHIC data to be converted to a common wall clock time, while still preserving native acquisition trigger information. A data correlation task force team, composed of the authors of this paper, has been formed to develop data correlation design details and provide guidelines for software developers. The overall data correlation methodology will be presented in this paper.
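
    The core idea can be sketched in a few lines: map each source's native trigger count to a common wall-clock time while keeping the native trigger information. The names, epochs, and tick periods below are hypothetical, not the actual RHIC software.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    source: str         # e.g. "event_link" or "beam_sync" (hypothetical labels)
    trigger_count: int  # native acquisition trigger number (preserved)
    value: float

def to_wall_clock(sample, epoch_s, tick_period_s):
    """Map a native trigger count to a common wall-clock time (seconds).

    epoch_s: wall-clock time at which this source's counter read zero.
    tick_period_s: seconds per trigger tick for this source.
    """
    return epoch_s + sample.trigger_count * tick_period_s

# Two channels acquired on unrelated triggers:
a = Sample("event_link", trigger_count=1000, value=1.7)
b = Sample("beam_sync", trigger_count=720, value=2.4)

t_a = to_wall_clock(a, epoch_s=0.0, tick_period_s=0.001)
t_b = to_wall_clock(b, epoch_s=0.28, tick_period_s=0.001)
# t_a == t_b: the two values fall at the same wall-clock time and can be
# plotted together, while each Sample keeps its native trigger information.
```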

  7. Intelligent systems engineering methodology

    Science.gov (United States)

    Fouse, Scott

    1990-01-01

    An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.

  8. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of specified analysis methodologies for the performance-related design basis events (PRDBE). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects of the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation modes suitable to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the developed control logic for each operation mode, the system thermal hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of PRDBEs chosen based on each operation mode, the transitions among them, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details for SMART performance analysis are specified based on the current SMART design, can be utilized as a guide for the detailed performance analysis.
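
    The step of comparing analyzed process parameters against their allowable ranges per event can be illustrated with a minimal sketch; the event name, parameters, and limits below are invented for illustration and are not SMART design values.

```python
# Hypothetical event and parameter limits; not actual SMART design values.
ALLOWABLE = {
    "load_step": {
        "pressurizer_pressure_MPa": (14.0, 16.0),
        "core_outlet_temp_C": (300.0, 330.0),
    },
}

def check_prdbe(event, analyzed):
    """Return the analyzed process parameters outside their allowable range."""
    out_of_range = []
    for name, (lo, hi) in ALLOWABLE[event].items():
        if not lo <= analyzed[name] <= hi:
            out_of_range.append(name)
    return out_of_range

violations = check_prdbe("load_step", {
    "pressurizer_pressure_MPa": 15.2,   # within range
    "core_outlet_temp_C": 335.0,        # exceeds the upper limit
})
```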

  9. Relative Hazard Calculation Methodology

    International Nuclear Information System (INIS)

    DL Strenge; MK White; RD Stenner; WB Andrews

    1999-01-01

    The methodology presented in this document was developed to provide a means of calculating the RH ratios to use in developing useful graphic illustrations. The RH equation, as presented in this methodology, is primarily a collection of key factors relevant to understanding the hazards and risks associated with projected risk management activities. The RH equation has the potential for much broader application than generating risk profiles. For example, it can be used to compare one risk management activity with another, instead of just comparing it to a fixed baseline as was done for the risk profiles. If the appropriate source term data are available, it could be used in its non-ratio form to estimate absolute values of the associated hazards. These estimated values of hazard could then be examined to help understand which risk management activities are addressing the higher hazard conditions at a site. Graphics could be generated from these absolute hazard values to compare high-hazard conditions. If the RH equation is used in this manner, care must be taken to specifically define and qualify the estimated absolute hazard values (e.g., identify which factors were considered and which ones tended to drive the hazard estimation).
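
    A minimal sketch of using the RH equation in ratio form, assuming a purely multiplicative collection of factors; the factor names and values are hypothetical, not the document's actual RH equation.

```python
import math

def hazard(factors):
    # Hazard as a product of key factors (illustrative multiplicative form).
    return math.prod(factors.values())

def relative_hazard(activity, baseline):
    """RH ratio of a risk-management activity to a fixed baseline."""
    return hazard(activity) / hazard(baseline)

baseline = {"source_term": 100.0, "release_fraction": 0.1, "toxicity": 2.0}
activity = {"source_term": 50.0, "release_fraction": 0.1, "toxicity": 2.0}

rh = relative_hazard(activity, baseline)
# Halving the source term halves the hazard relative to the baseline.
```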

  10. Scrum methodology in banking environment

    OpenAIRE

    Strihová, Barbora

    2015-01-01

    Bachelor thesis "Scrum methodology in banking environment" is focused on one of the agile methodologies, called Scrum, and on a description of the methodology as used in a banking environment. Its main goal is to introduce the Scrum methodology, outline through a case study a real project placed in a bank and focused on software development, address the problems of the project, propose solutions to those problems, and identify anomalies of Scrum in software development constrained by the banking environmen...

  11. Experimental Economics: Some Methodological Notes

    OpenAIRE

    Fiore, Annamaria

    2009-01-01

    The aim of this work is to present, in a self-contained paper, some methodological aspects as they are received in the current experimental literature. The purpose has been to make a critical review of some very influential papers dealing with methodological issues. In other words, the idea is to have a single paper where people first approaching experimental economics can find summarised some of the most important methodological issues. In particular, the focus is on some methodological prac...

  12. A design methodology for unattended monitoring systems

    International Nuclear Information System (INIS)

    SMITH, JAMES D.; DELAND, SHARON M.

    2000-01-01

    The authors presented a high-level methodology for the design of unattended monitoring systems, focusing on a system to detect diversion of nuclear materials from a storage facility. The methodology is composed of seven interrelated analyses: Facility Analysis, Vulnerability Analysis, Threat Assessment, Scenario Assessment, Design Analysis, Conceptual Design, and Performance Assessment. The design of the monitoring system is iteratively improved until it meets a set of pre-established performance criteria. The methodology presented here is based on other, well-established system analysis methodologies, and hence they believe it can be adapted to other verification or compliance applications. In order to make this approach more generic, however, there needs to be more work on techniques for establishing evaluation criteria and associated performance metrics. They found that defining general-purpose evaluation criteria for verifying compliance with international agreements was a significant undertaking in itself. They finally focused on diversion of nuclear material in order to simplify the problem so that they could work out an overall approach for the design methodology. However, general guidelines for the development of evaluation criteria are critical for a general-purpose methodology. A poor choice in evaluation criteria could result in a monitoring system design that solves the wrong problem.

  13. A methodology for spectral wave model evaluation

    Science.gov (United States)

    Siqueira, S. A.; Edwards, K. L.; Rogers, W. E.

    2017-12-01

    Model evaluation is accomplished by comparing bulk parameters (e.g., significant wave height, energy period, and mean square slope (MSS)) calculated from the model energy spectra with those calculated from buoy energy spectra. Quality control of the observed data and choice of the frequency range from which the bulk parameters are calculated are critical steps in ensuring the validity of the model-data comparison. The compared frequency range of each observation and the analogous model output must be identical, and the optimal frequency range depends in part on the reliability of the observed spectra. National Data Buoy Center 3-m discus buoy spectra are unreliable above 0.3 Hz due to a non-optimal buoy response function correction. As such, the upper end of the spectrum should not be included when comparing a model to these data. Biofouling of Waverider buoys must be detected, as it can harm the hydrodynamic response of the buoy at high frequencies, thereby rendering the upper part of the spectrum unsuitable for comparison. An important consideration is that the intentional exclusion of high frequency energy from a validation due to data quality concerns (above) can have major implications for validation exercises, especially for parameters such as the third and fourth moments of the spectrum (related to Stokes drift and MSS, respectively); final conclusions can be strongly altered. We demonstrate this by comparing outcomes with and without the exclusion, in a case where a Waverider buoy is believed to be free of biofouling. Determination of the appropriate frequency range is not limited to the observed spectra. Model evaluation involves considering whether all relevant frequencies are included. Guidance to make this decision is based on analysis of observed spectra. Two model frequency lower limits were considered. Energy in the observed spectrum below the model lower limit was calculated for each. For locations where long swell is a component of the wave
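
    The bulk parameters discussed above follow from moments of the energy spectrum, m_n = ∫ f^n E(f) df, with significant wave height Hs = 4√m0 and energy period Te = m_-1/m0. A sketch of computing them over a restricted frequency range (the spectrum is synthetic and the cutoffs illustrative):

```python
import numpy as np

def spectral_moment(f, E, n, f_min, f_max):
    """n-th spectral moment, integral of f**n * E(f) df over [f_min, f_max]."""
    m = (f >= f_min) & (f <= f_max)
    x, y = f[m], f[m] ** n * E[m]
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))  # trapezoid rule

# Synthetic narrow spectrum peaked near 0.1 Hz:
f = np.linspace(0.03, 0.5, 500)
E = np.exp(-(((f - 0.1) / 0.03) ** 2))

# Restrict the comparison to f <= 0.3 Hz, as for 3-m discus buoy data:
m0 = spectral_moment(f, E, 0, 0.03, 0.3)
m_neg1 = spectral_moment(f, E, -1, 0.03, 0.3)
hs = 4.0 * np.sqrt(m0)  # significant wave height (in the spectrum's units)
te = m_neg1 / m0        # energy period, roughly 10 s for this peak
```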

  14. CIAU methodology and BEPU applications

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.

    2009-01-01

    Best-estimate calculation results from complex thermal-hydraulic system codes (like Relap5, Cathare, Athlet, Trace, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore the use of best-estimate codes within reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and deficiencies of those codes. Uncertainties may have different origins, ranging from the approximation of the models, to the approximation of the numerical solution, to the lack of precision of the values adopted for boundary and initial conditions. The amount of uncertainty that affects a calculation may strongly depend upon the codes and the modeling techniques (i.e. the code users). A consistent and robust uncertainty methodology must be developed taking all of the above aspects into consideration. The CIAU (Code with the capability of Internal Assessment of Uncertainty) and UMAE (Uncertainty Methodology based on Accuracy Evaluation) methods have been developed by the University of Pisa (UNIPI) in the framework of long-lasting research activities started in the 1980s and involving several researchers. CIAU is extensively discussed in the available technical literature, Refs. [1, 2, 3, 4, 5, 6, 7], and tens of additional relevant papers, which provide comprehensive details about the method, can be found in the bibliography lists of the above references. Therefore, the present paper supplies only 'spot information' about CIAU and focuses mostly on applications to some cases of industrial interest. In particular, the application of CIAU to the OECD BEMUSE (Best Estimate Methods Uncertainty and Sensitivity Evaluation, [8, 9]) project is discussed, and a critical comparison with other uncertainty methods (in relation to items like: sources of uncertainties, selection of the input parameters and quantification of

  15. Simulation enabled safeguards assessment methodology

    International Nuclear Information System (INIS)

    Bean, Robert; Bjornard, Trond; Larson, Tom

    2007-01-01

    It is expected that nuclear energy will be a significant component of future supplies. New facilities, operating under a strengthened international nonproliferation regime will be needed. There is good reason to believe virtual engineering applied to the facility design, as well as to the safeguards system design will reduce total project cost and improve efficiency in the design cycle. Simulation Enabled Safeguards Assessment Methodology (SESAME) has been developed as a software package to provide this capability for nuclear reprocessing facilities. The software architecture is specifically designed for distributed computing, collaborative design efforts, and modular construction to allow step improvements in functionality. Drag and drop wire-frame construction allows the user to select the desired components from a component warehouse, render the system for 3D visualization, and, linked to a set of physics libraries and/or computational codes, conduct process evaluations of the system they have designed. (authors)

  17. Geotechnical site assessment methodology

    International Nuclear Information System (INIS)

    Tunbridge, L.W.; Richards, L.R.

    1985-09-01

    A final report summarizing the research conducted on geotechnical site assessment methodology at the Carwynnen test mine in Cornwall. The geological setting of the test site in the Cornubian granite batholith is described. The effect of structure imposed by discontinuities on the engineering behaviour of rock masses is discussed and the scanline survey method of obtaining data on discontinuities in the rock mass is described. The applicability of some methods of statistical analysis for discontinuity data is reviewed. The requirement for remote geophysical methods of characterizing the mass is discussed and experiments using seismic and ultrasonic velocity measurements are reported. Methods of determining the in-situ stresses are described and the final results of a programme of in-situ stress measurements using the overcoring and hydrofracture methods are reported. (author)
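
    As an illustration of how scanline survey data feed statistical analysis of discontinuities, the widely used Priest-Hudson relation estimates a theoretical RQD from the mean discontinuity frequency λ, assuming negative-exponentially distributed spacings; the survey numbers below are invented, not Carwynnen data.

```python
import math

def discontinuity_frequency(n_intersections, scanline_length_m):
    """Mean discontinuity frequency (per metre) from a scanline survey."""
    return n_intersections / scanline_length_m

def rqd_priest_hudson(lam):
    """Theoretical RQD (%) for negative-exponential spacings:
    RQD = 100 * exp(-0.1*lam) * (1 + 0.1*lam)."""
    return 100.0 * math.exp(-0.1 * lam) * (1.0 + 0.1 * lam)

lam = discontinuity_frequency(n_intersections=60, scanline_length_m=10.0)
rqd = rqd_priest_hudson(lam)  # about 88% for 6 discontinuities per metre
```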

  18. UNCOMMON SENSORY METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladimír Vietoris

    2015-02-01

    Sensory science is a young but rapidly developing field of the food industry. Currently, great emphasis is given to the production of rapid data-collection techniques, the difference between consumers and a trained panel is becoming obscured, and the role of sensory methodologists is to prepare ways of evaluation by which a lay panel (consumers) can achieve results identical to those of a trained panel. There are several conventional methods of sensory evaluation of food (ISO standards), but more and more sensory laboratories are developing methodologies that are less strict in the selection of evaluators, whose mechanism is easily understandable, and whose results are easily interpretable. This paper deals with the mapping of marginal methods used in the sensory evaluation of food (new types of profiles, CATA, TDS, napping).

  19. Situating methodology within qualitative research.

    Science.gov (United States)

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  20. Methodology for quantitative evaluation of diagnostic performance

    International Nuclear Information System (INIS)

    Metz, C.

    1981-01-01

    Of various approaches that might be taken to the diagnostic performance evaluation problem, Receiver Operating Characteristic (ROC) analysis holds great promise. Further development of the methodology for a unified, objective, and meaningful approach to evaluating the usefulness of medical imaging procedures is done by consideration of statistical significance testing, optimal sequencing of correlated studies, and analysis of observer performance.
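
    The empirical area under the ROC curve can be computed directly as the Wilcoxon-Mann-Whitney statistic; a minimal sketch with invented rating data:

```python
def roc_auc(scores_pos, scores_neg):
    """Empirical area under the ROC curve: the probability that a randomly
    chosen positive case is rated above a negative one (ties count half).
    Equivalent to the Wilcoxon-Mann-Whitney statistic."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Invented reader ratings: diseased cases tend to be rated higher.
auc = roc_auc([0.9, 0.8, 0.7], [0.6, 0.5, 0.8])
```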

  1. A performance assessment methodology for low-level waste facilities

    International Nuclear Information System (INIS)

    Kozak, M.W.; Chu, M.S.Y.; Mattingly, P.A.

    1990-07-01

    A performance assessment methodology has been developed for use by the US Nuclear Regulatory Commission in evaluating license applications for low-level waste disposal facilities. This report provides a summary of background reports on the development of the methodology and an overview of the models and codes selected for the methodology. The overview includes discussions of the philosophy and structure of the methodology and a sequential procedure for applying the methodology. Discussions are provided of models and associated assumptions that are appropriate for each phase of the methodology, the goals of each phase, data required to implement the models, significant sources of uncertainty associated with each phase, and the computer codes used to implement the appropriate models. In addition, a sample demonstration of the methodology is presented for a simple conceptual model. 64 refs., 12 figs., 15 tabs

  2. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary business process modeling approaches is identified, and general modeling principles that can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is described. The most critical points of business process modeling identified are process states, process hierarchy, and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main features of the methodology are explained, together with the significant problems that were met during the project. Concluding from these problems, together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  3. Going beyond methodological presentism

    DEFF Research Database (Denmark)

    Schmidt, Garbi

    2017-01-01

    Denmark is an example of a country where the idea of historical ethnic homogeneity stands strong. This article challenges this historical presentism: the scholarly and societal tendency to understand social phenomena within a limited contemporary framework, neglecting possible effects and similar...

  4. Methodological Problems of Nanotechnoscience

    Science.gov (United States)

    Gorokhov, V. G.

    Recently, we have reported on the definitions of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline; it also evolves in a “nonclassical” way. Nanoontology, or the nano scientific world view, serves as a methodological orientation for choosing the theoretical means and methods for solving scientific and engineering problems. This makes it possible to change from one explanation and scientific world view to another without any problems. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to systems engineering as the analysis and design of large-scale, complex man/machine systems, but applied to micro- and nanosystems. Nano systems engineering, like macro systems engineering, includes not only systems design but also complex research. The design orientation influences the change of priorities in complex research and of the relation to knowledge, not only “knowledge about something” but also knowledge as the means of activity: from the beginning, control and restructuring of matter at the nanoscale is a necessary element of nanoscience.

  5. Engineering radioecology: Methodological considerations

    International Nuclear Information System (INIS)

    Nechaev, A.F.; Projaev, V.V.; Sobolev, I.A.; Dmitriev, S.A.

    1995-01-01

    The term "radioecology" has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise, generally acknowledged structure, a unified methodical basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and by some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the termination of the Cold War and because of remarkable political changes in the world, it has become possible to convert the problem of environmental restoration from the scientific sphere into particularly practical terms. Already the first steps clearly showed the imperfection of existing technologies and managerial and regulatory schemes; a lack of qualified specialists, relevant methods and techniques; uncertainties in the methodology of decision-making, etc. Thus, building up (or maybe structuring) a special scientific and technological basis, which the authors call "engineering radioecology", seems to be an important task. In this paper they endeavor to substantiate this thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology.

  6. Chemical migration and risk assessment methodology

    International Nuclear Information System (INIS)

    Onishi, Y.; Brown, S.M.; Olsen, A.R.; Parkhurst, M.A.

    1981-01-01

    To provide a scientific basis for risk assessment and decision making, the Chemical Migration and Risk Assessment (CMRA) Methodology was developed to simulate overland and instream toxic contaminant migration and fate, and to predict the probability of acute and chronic impacts on aquatic biota. The simulation results indicated that the time between pesticide application and the subsequent runoff-producing event was the most important factor determining the amount of alachlor. The study also revealed that sediment transport has important effects on contaminant migration when sediment concentrations in receiving streams are high or contaminants are highly susceptible to adsorption by sediment. Although the capabilities of the CMRA methodology were only partially tested in this study, the results demonstrate that the methodology can be used as a scientific decision-making tool for toxic chemical regulations, a research tool to evaluate the relative significance of various transport and degradation phenomena, and a tool to examine the effectiveness of toxic chemical control practices.

  7. Demonstration of an infiltration evaluation methodology

    International Nuclear Information System (INIS)

    Smyth, J.D.; Gee, G.W.; Kincaid, C.T.; Nichols, W.M.; Bresler, E.

    1990-07-01

    An Infiltration Evaluation Methodology (IEM) was developed for the US Nuclear Regulatory Commission (NRC) by Pacific Northwest Laboratory (PNL) to provide a consistent, well formulated approach for evaluating drainage through engineered covers at low-level radioactive waste (LLW) sites. The methodology is designed to help evaluate the ability of proposed waste site covers to minimize drainage for LLW site license applications and for sites associated with the Uranium Mill Tailings Remedial Action (UMTRA) program. The objective of this methodology is to estimate the drainage through an engineered burial site cover system. The drainage estimate can be used as an input to a broader performance assessment methodology currently under development by the NRC. The methodology is designed to simulate, at the field scale, significant factors and hydrologic conditions which determine or influence estimates of infiltration, long-term moisture content profiles, and drainage from engineered covers and barriers. The IEM developed under this study acknowledges the uncertainty inherent in soil properties and quantifies the influence of such uncertainty on the estimates of drainage in engineered cover systems at waste disposal sites. 6 refs., 1 fig

  8. Methodology for developing new test methods

    Directory of Open Access Journals (Sweden)

    A. I. Korobko

    2017-06-01

    The paper describes a methodology for developing new test methods and for forming solutions in the development of new test methods. The basis of the methodology is formed by individual elements of the systems and process approaches. They contribute to the development of an effective research strategy for the object, the study of interrelations, and the synthesis of an adequate model of the test method. The effectiveness of the developed test method is determined by the correct choice of the set of concepts, their interrelations, and their mutual influence; this allows the assigned tasks to be solved and the goal to be achieved. The methodology is based on the use of fuzzy cognitive maps. The question of which method the model for the formation of solutions should be based on is considered. The methodology provides for recording a model for a new test method in the form of a finite set of objects that are significant for the characteristics of the test method. A causal relationship is then established between the objects. Further, the values of fitness indicators, the observability of the method, and the metrological tolerance for each indicator are established. The work is aimed at the overall goal of ensuring the quality of tests by improving the methodology for developing test methods.
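
    A fuzzy cognitive map of the kind the methodology relies on can be sketched as an iterated, sigmoid-squashed weighted sum over concept activations; the concepts and weights below are invented for illustration and are not from the paper.

```python
import math

def fcm_step(state, weights, slope=1.0):
    """One update of a fuzzy cognitive map: each concept's new activation is
    a sigmoid-squashed weighted sum of the current activations, where
    weights[j][i] is the causal influence of concept j on concept i."""
    n = len(state)
    return [
        1.0 / (1.0 + math.exp(-slope * sum(weights[j][i] * state[j]
                                           for j in range(n))))
        for i in range(n)
    ]

# Three invented concepts forming a 0 -> 1 -> 2 causal chain:
W = [[0.0, 0.7, 0.0],
     [0.0, 0.0, 0.8],
     [0.0, 0.0, 0.0]]
state = [1.0, 0.5, 0.5]
for _ in range(20):  # iterate until the activations settle
    state = fcm_step(state, W)
```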

  9. Dosimetric methodology of the ICRP

    International Nuclear Information System (INIS)

    Eckerman, K.F.

    1994-01-01

    Establishment of guidance for the protection of workers and members of the public from radiation exposures necessitates estimation of the radiation dose to tissues of the body at risk. The dosimetric methodology formulated by the International Commission on Radiological Protection (ICRP) is intended to be responsive to this need. While developed for radiation protection, elements of the methodology are often applied in addressing other radiation issues; e.g., risk assessment. This chapter provides an overview of the methodology, discusses its recent extension to age-dependent considerations, and illustrates specific aspects of the methodology through a number of numerical examples

  10. Transmission pricing: paradigms and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Shirmohammadi, Dariush [Pacific Gas and Electric Co., San Francisco, CA (United States); Vieira Filho, Xisto; Gorenstin, Boris [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, Mario V.P. [Power System Research, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    In this paper we describe the principles of several paradigms and methodologies for pricing transmission services. The paper outlines some of the main characteristics of these paradigms and methodologies, such as the circumstances in which each may be used to best effect. Due to their popularity, power-flow-based MW-mile and short-run marginal cost pricing methodologies will be covered in some detail. We conclude the paper with examples of the application of these two pricing methodologies for pricing transmission services in Brazil. (author) 25 refs., 2 tabs.
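
    A basic MW-mile charge of the kind referred to here sums, over the lines a transaction uses, |flow| x length x a unit cost rate; the flows, lengths, and rate below are invented for illustration.

```python
def mw_mile_charge(flows_mw, lengths_miles, cost_per_mw_mile):
    """Basic MW-mile charge: sum over lines of |flow| * length * unit cost.

    flows_mw[i] is the MW flow the transaction induces on line i, as given
    by a power flow calculation (sign indicates direction).
    """
    return sum(abs(f) * length * cost_per_mw_mile
               for f, length in zip(flows_mw, lengths_miles))

# A transaction inducing 50 MW on a 100-mile line and 20 MW (in the
# opposite direction) on a 40-mile line, at a $1/MW-mile rate:
charge = mw_mile_charge([50.0, -20.0], [100.0, 40.0], 1.0)  # -> 5800.0
```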

  11. Analytical methodology for nuclear safeguards

    International Nuclear Information System (INIS)

    Ramakumar, K.L.

    2011-01-01

    This paper attempts to briefly describe the available analytical methodologies and to highlight some of the challenges and expectations from the nuclear material accounting and control (NUMAC) point of view.

  12. The Debate over Inclusive Fitness as a Debate over Methodologies

    NARCIS (Netherlands)

    Rubin, Hannah

    This article analyzes the recent debate surrounding inclusive fitness and argues that certain limitations ascribed to it by critics—such as requiring weak selection or providing dynamically insufficient models—are better thought of as limitations of the methodological framework most often used with

  13. Country report: a methodology

    International Nuclear Information System (INIS)

    Colin, A.

    2013-01-01

    This paper describes a methodology that could be applied to establish a country report. In the framework of nuclear non-proliferation appraisal and IAEA safeguards implementation, it is important to be able to assess the potential existence of undeclared nuclear materials and activities, as well as undeclared facilities, in the country under review. In our view, a country report should aim at providing detailed information on nuclear-related activities for each country examined taken 'as a whole', such as nuclear development, scientific and technical capabilities, etc. In order to study a specific country, we need to know whether there is already an operating civil nuclear programme. If there is, we have to check carefully whether nuclear material could be diverted, whether declared facilities are misused, and whether undeclared facilities are operated or undeclared activities conducted with the aim of manufacturing a nuclear weapon. If there is not, we should pay attention to the development of a civil nuclear project. A country report is based on a wide span of information, most of the time coming from open sources but sometimes also from confidential or private ones. It is therefore important to carefully check the nature and the credibility of these sources through cross-check examination. Eventually, it is necessary to merge information from different sources and apply an expertise filter. We have at our disposal many powerful tools to help us assess, understand, and evaluate the situation (cartography, imagery, bibliometry, etc.). These tools allow us to offer the best conclusions possible. The paper is followed by the slides of the presentation. (author)

  14. Microbiological Methodology in Astrobiology

    Science.gov (United States)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered by future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. Antarctic glaciers and Earth permafrost habitats, where microbial cells have preserved viability for millennia by entering an anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is of great importance to ensure the authenticity of microorganisms (if any) in studied samples and to standardize the protocols used, so as to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life will most likely come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as targets in the search for evidence of life, bearing in mind the scenario that living microorganisms were not preserved and underwent mineralization. Under laboratory conditions, the processes that accompany the fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble in their morphological properties those found in natural Earth habitats. Given the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use some previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ, and comparison with the analogous data obtained for laboratory microbial cultures and

  15. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    Science.gov (United States)

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa and those wishing to use it as their research methodology can find support through the writing of a number of Maori academics. What is not so well articulated, is the experiential voice of those who have used Kaupapa Maori as research methodology. My identity as a Maori woman…

  16. INAA - application and limitation

    International Nuclear Information System (INIS)

    Heydorn, K.

    1990-01-01

    The uncertainties associated with performing Instrumental Neutron Activation Analysis (INAA) are discussed in relation to their category. The Comité International des Poids et Mesures (CIPM) distinguishes between uncertainties according to how their contribution to the overall uncertainty is evaluated. INAA is a potentially definitive method if all sources of uncertainty can be accounted for. The limitation of the method is reached when uncertainties that cannot be accounted for assume significance, and the method cannot be brought into statistical control. (orig.)
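
    The CIPM distinction the abstract refers to separates Type A uncertainties (evaluated statistically, e.g. counting statistics) from Type B uncertainties (evaluated by other means, e.g. calibration data); both are combined in quadrature. A minimal sketch, with a hypothetical uncertainty budget not taken from the paper:

```python
import math

def combined_uncertainty(type_a, type_b):
    """Root-sum-square combination of standard uncertainties.

    type_a: uncertainties evaluated statistically (e.g. counting statistics)
    type_b: uncertainties evaluated by other means (e.g. calibration data)
    """
    return math.sqrt(sum(u * u for u in type_a + type_b))

# Hypothetical budget for a single INAA measurement (relative uncertainties, %)
type_a = [1.2]          # counting statistics
type_b = [0.5, 0.8]     # flux monitoring, detector efficiency
u_c = combined_uncertainty(type_a, type_b)
print(round(u_c, 2))  # 1.53
```

    An unaccounted-for source would simply be a missing term in the budget, which is why the combined value can understate the true uncertainty when the method is out of statistical control.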

  17. Ranking Accounting Authors and Departments in Accounting Education: Different Methodologies--Significantly Different Results

    Science.gov (United States)

    Bernardi, Richard A.; Zamojcin, Kimberly A.; Delande, Taylor L.

    2016-01-01

    This research tests whether Holderness Jr., D. K., Myers, N., Summers, S. L., & Wood, D. A. [(2014). "Accounting education research: Ranking institutions and individual scholars." "Issues in Accounting Education," 29(1), 87-115] accounting-education rankings are sensitive to a change in the set of journals used. It provides…

  18. Theoretical and methodological significance of Information and Communication Technology in educational practice.

    NARCIS (Netherlands)

    Mooij, Ton

    2016-01-01

    In September 1998 the Research Network ‘ICT in Education and Training’ was initiated at the conference of the European Educational Research Association (EERA). The new network reflected the recognition and growing importance of information and communication technology (ICT) with respect to education

  19. Verbal protocols as methodological resources: research evidence

    Directory of Open Access Journals (Sweden)

    Alessandra Baldo

    2012-01-01

    This article aims at reflecting on the use of verbal protocols as a methodological resource in qualitative research, more specifically on the aspect regarded as the main limitation of a study about lexical inferencing in L2 (BALDO; VELASQUES, 2010): its subjective trait. The article begins with a brief literature review on protocols, followed by a description of the study in which they were employed as methodological resources. Based on that, protocol subjectivity is illustrated through samples of unparalleled data classification, carried out independently by two researchers. In the final section, the path followed to minimize the problem is presented, intending to contribute to improving efficiency in the use of verbal protocols in future research.

  20. Methodologies for problem identification and solution

    International Nuclear Information System (INIS)

    Drury, C.G.

    This paper describes a methodology for bringing together knowledge on how humans work (ergonomics) and knowledge on a particular system (operating experience), using the concept of the purposive human/machine system. A Task/Operator/Machine/Environment (TOME) system is one in which information from the machine is passed to the operator through displays and the machine receives information from the operator via controls. The human in the system can be described as a set of potentially limiting subsystems. When the muscular-skeletal, physiological, sensory, or attentional limitations of the human are exceeded one has a 'human failure'. The overloading of limiting subsystems may also be described as mismatches between task demands and human capabilities. Detection and analysis of human/system mismatches are central to solving man-machine problems

  1. Improved USGS methodology for assessing continuous petroleum resources

    Science.gov (United States)

    Charpentier, Ronald R.; Cook, Troy A.

    2010-01-01

    This report presents an improved methodology for estimating volumes of continuous (unconventional) oil and gas resources within the United States and around the world. The methodology is based on previously developed U.S. Geological Survey methodologies that rely on well-scale production data. Improvements were made primarily to how the uncertainty about estimated ultimate recoveries is incorporated in the estimates. This is particularly important when assessing areas with sparse or no production data, because the new methodology allows better use of analog data from areas with significant discovery histories.
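
    The well-scale approach the abstract describes can be illustrated with a toy Monte Carlo aggregation; the lognormal EUR distribution, parameter values and fractile labels below are illustrative assumptions, not the USGS procedure itself:

```python
import random

def simulate_play(n_wells, mu, sigma, trials=5000, seed=1):
    """Monte Carlo aggregation of per-well estimated ultimate recoveries (EURs).

    Each well's EUR is drawn from a lognormal distribution (a common
    assumption for continuous resources); totals are summed per trial.
    """
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.lognormvariate(mu, sigma) for _ in range(n_wells))
        for _ in range(trials)
    )

    def fractile(f):
        return totals[int(f * (trials - 1))]

    # Low, median and high estimates of the aggregate resource
    return fractile(0.05), fractile(0.50), fractile(0.95)

# Hypothetical play: 100 wells, median EUR of 1 unit, wide uncertainty.
low, median, high = simulate_play(100, mu=0.0, sigma=1.0)
print(low < median < high)  # True
```

    In a sparsely drilled area, `mu` and `sigma` would be borrowed from an analog play with a significant discovery history, which is exactly where the handling of EUR uncertainty matters most.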

  2. A methodology for social experimentation

    DEFF Research Database (Denmark)

    Ravn, Ib

    A methodology is outlined whereby one may improve the performance of a social system to the satisfaction of its stakeholders, that is, facilitate desirable social and organizational transformations.

  3. Workshops as a Research Methodology

    Science.gov (United States)

    Ørngreen, Rikke; Levinsen, Karin

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on…

  4. Methodological Pluralism and Narrative Inquiry

    Science.gov (United States)

    Michie, Michael

    2013-01-01

    This paper considers how the integral theory model of Nancy Davis and Laurie Callihan might be enacted using a different qualitative methodology, in this case the narrative methodology. The focus of narrative research is shown to be on "what meaning is being made" rather than "what is happening here" (quadrant 2 rather than…

  5. Building ASIPS the Mescal methodology

    CERN Document Server

    Gries, Matthias

    2006-01-01

    A number of system designers use ASIPs rather than ASICs to implement their system solutions. This book gives a comprehensive methodology for the design of these application-specific instruction-set processors (ASIPs). It includes demonstrations of applications of the methodology using the Tipi research framework.

  6. A methodology for software documentation

    OpenAIRE

    Torres Júnior, Roberto Dias; Ahlert, Hubert

    2000-01-01

    With the growing complexity of window-based software and the use of object orientation, the development of software is getting more complex than ever. Based on that, this article presents a methodology for software documentation and analyzes our experience and how this methodology can aid software maintenance.

  7. Reverse engineering of inductive fault current limiters

    Energy Technology Data Exchange (ETDEWEB)

    Pina, J M; Neves, M Ventim; Rodrigues, A L [Centre of Technology and Systems Faculdade de Ciencias e Tecnologia, Nova University of Lisbon Monte de Caparica, 2829-516 Caparica (Portugal); Suarez, P; Alvarez, A, E-mail: jmmp@fct.unl.p [' Benito Mahedero' Group of Electrical Applications of Superconductors Escuela de IngenierIas Industrials, University of Extremadura Avenida de Elvas s/n, 06006 Badajoz (Spain)

    2010-06-01

    The inductive fault current limiter is less compact and harder to scale to high-voltage networks than the resistive one. Nevertheless, its simple construction and mechanical robustness make it attractive in low-voltage grids. Thus, it might be an enabling technology for the advent of microgrids: low-voltage networks with dispersed generation, controllable loads and energy storage. A new methodology for reverse engineering of inductive fault current limiters, based on the independent analysis of iron cores and HTS cylinders, is presented in this paper. Their electromagnetic characteristics are used to predict the devices' hysteresis loops and, consequently, their dynamic behavior. Previous models based on the separate analysis of the limiters' components were already derived, e.g. transformer-like equivalent models. Nevertheless, the assumptions usually made may limit these models' application, as shown in the paper. The proposed methodology obviates these limitations. Results are validated through simulations.

  8. Reverse engineering of inductive fault current limiters

    International Nuclear Information System (INIS)

    Pina, J M; Neves, M Ventim; Rodrigues, A L; Suarez, P; Alvarez, A

    2010-01-01

    The inductive fault current limiter is less compact and harder to scale to high-voltage networks than the resistive one. Nevertheless, its simple construction and mechanical robustness make it attractive in low-voltage grids. Thus, it might be an enabling technology for the advent of microgrids: low-voltage networks with dispersed generation, controllable loads and energy storage. A new methodology for reverse engineering of inductive fault current limiters, based on the independent analysis of iron cores and HTS cylinders, is presented in this paper. Their electromagnetic characteristics are used to predict the devices' hysteresis loops and, consequently, their dynamic behavior. Previous models based on the separate analysis of the limiters' components were already derived, e.g. transformer-like equivalent models. Nevertheless, the assumptions usually made may limit these models' application, as shown in the paper. The proposed methodology obviates these limitations. Results are validated through simulations.

  9. Fire safety analysis: methodology

    International Nuclear Information System (INIS)

    Kazarians, M.

    1998-01-01

    From a review of the fires that have occurred in nuclear power plants and the results of fire risk studies completed over the last 17 years, we can conclude that internal fires in nuclear power plants can be an important contributor to plant risk. Methods and data are available to quantify the fire risk. These methods and data have been subjected to a series of reviews and detailed scrutiny and have been applied to a large number of plants. There is no doubt that we do not know everything about fire and its impact on a nuclear power plant. However, this lack of knowledge or uncertainty can be quantified and can be used in the decision-making process. In other words, the methods entail uncertainties and limitations that are not insurmountable, and there is little or no basis for the results of a fire risk analysis to fail to support a decision process.

  10. Percutaneous Penetration - Methodological Considerations

    DEFF Research Database (Denmark)

    Holmgaard, Rikke; Benfeldt, Eva; Nielsen, Jesper B

    2014-01-01

    developed to replace methods involving experimental animals. The results obtained from these methods are determined not only by the chemical or product tested, but to a significant degree also by the experimental set-up and decisions made by the investigator during the planning phase. The present Mini...

  11. Moynihan: A Methodological Note

    Science.gov (United States)

    Swan, L. Alex

    1974-01-01

    Presents an analysis of the significant parts of the census data Moynihan used to support his argument to determine whether the conclusions he reached are supported by such data, finding that he presents no data to substantiate his argument that black social problems are a function of family instability. (Author/JM)

  12. METHODOLOGY OF TECHNIQUE PREPARATION FOR LOW VISION JAVELIN THROWERS

    Directory of Open Access Journals (Sweden)

    Milan Matić

    2013-07-01

    The javelin throw for people with disabilities has been expanding for several years, and world records have been improving year after year. An essential part of the preparation of low-vision javelin throwers is mastering the elements of technique that are crucial for achieving better results. The method of theoretical analysis and descriptive and comparative survey methods were applied. Relevant knowledge in the area of low-vision javelin throwing was analyzed, systematized and interpreted theoretically, and then applied to a top javelin thrower, which served as a base for an innovative approach in methodology and practice with disabled people. Due to visual impairment, coordination and balance are challenged; this limitation is what drives the differences in methodology explained in this article. Apart from goals focused on improving condition and competition results, more specialized goals should be considered, e.g. improving orientation, balance and the socialization process for people with low vision. The special approach used in technique preparation brought significant improvement in the technique of the famous Paralympian Grlica Miloš; in addition to the improved technique, he achieved better results at major competitions and won several valuable prizes worldwide. The area of sport for disabled people is not sufficiently present in the practice of sports professionals; more articles and scientific surveys on this topic are needed to further improve work and results with these athletes.

  13. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    International Nuclear Information System (INIS)

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)
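
    The contrast the abstract draws between Limit-of-the-Envelope (LOE) and best-estimate-plus-uncertainty analysis can be sketched with a toy model; the consequence function, parameter ranges and percentile choice below are invented for illustration and have nothing to do with the actual Bruce B analyses:

```python
import random

# Toy consequence model: the response grows with each parameter deviation
# (parameters normalized so that 1.0 is the limiting value).
def consequence(p1, p2, p3):
    return 100 + 20 * p1 + 15 * p2 + 10 * p3  # arbitrary illustrative response

# Limit of the Envelope: every significant parameter is assumed to deviate
# to its limiting value simultaneously.
loe = consequence(1.0, 1.0, 1.0)

# Best estimate and uncertainty: sample parameters over plausible ranges
# and take a high percentile of the resulting consequence distribution.
random.seed(0)
samples = sorted(
    consequence(random.uniform(0, 1), random.uniform(0, 1), random.uniform(0, 1))
    for _ in range(10000)
)
p95 = samples[int(0.95 * len(samples))]  # 95th-percentile consequence

print(loe > p95)  # True: the LOE estimate exceeds the probabilistic bound
```

    The gap between `loe` and `p95` is the margin that compounding conservatism hides, which is the motivation the abstract gives for the best-estimate methodology.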

  14. AEGIS methodology and a perspective from AEGIS methodology demonstrations

    International Nuclear Information System (INIS)

    Dove, F.H.

    1981-03-01

    Objectives of AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) are to develop the capabilities needed to assess the post-closure safety of waste isolation in geologic formations; demonstrate these capabilities on reference sites; apply the assessment methodology to assist the NWTS program in site selection and waste package and repository design; and perform repository site analyses for the licensing needs of NWTS. This paper summarizes the AEGIS methodology and the experience gained from methodology demonstrations, and provides an overview of the following areas: estimation of the response of a repository to perturbing geologic and hydrologic events; estimation of the transport of radionuclides from a repository to man; and assessment of uncertainties.

  15. Management methodology for pressure equipment

    Science.gov (United States)

    Bletchly, P. J.

    Pressure equipment constitutes a significant investment in capital and a major proportion of potential high-risk plant in many operations and this is particularly so in an alumina refinery. In many jurisdictions pressure equipment is also subject to statutory regulation that imposes obligations on Owners of the equipment with respect to workplace safety. Most modern technical standards and industry codes of practice employ a risk-based approach to support better decision making with respect to pressure equipment. For a management system to be effective it must demonstrate that risk is being managed within acceptable limits.

  16. EVOLUTION OF THEORETICAL AND METHODOLOGICAL FOUNDATIONS OF COMPARATIVE ECONOMICS

    Directory of Open Access Journals (Sweden)

    N. Grazhevska

    2014-12-01

    The article reveals the stages of evolution of the theoretical and methodological foundations of comparative economics. The author highlights algorithms of comparative analysis as well as the theoretical and methodological limitations of four research programs of the new comparative economics. The article justifies the necessity of a comprehensive comparative study of the major trends and contradictions in the development of national economies in the era of globalization.

  17. The JET belt limiter tiles

    International Nuclear Information System (INIS)

    Deksnis, E.

    1988-09-01

    The belt limiter system, comprising two full toroidal rings of limiter tiles, was installed in JET in 1987. It consists of water-cooled fins with the limiter material in the form of tiles in between. The tiles are designed to absorb heat fluxes during irradiation without the surface temperature exceeding 2000 °C and to radiate this heat between pulses to the water-cooled sink, whose temperature is lower than that of the vacuum vessel. An important feature of the design is to maximise the area of the radiating surface facing the water-cooled fin. This leads to a tile depth much greater than the width of the tile facing the heat flux. Limiter tiles intercept particles flowing out of the plasma through the area between the two belt limiter rings and through the remaining surface area of the plasma column. Power deposition on a limiter tile depends strongly on the shape of the plasma and the edge plasma properties, as well as on the surface profile of the tiles. This paper discusses the methodology that was followed in producing an optimized surface profile for the tiles. This shaped profile has the feature that the resulting power deposition profile is roughly similar for a wide range of plasma parameters. (author)

  18. Open Government Data Publication Methodology

    Directory of Open Access Journals (Sweden)

    Jan Kucera

    2015-04-01

    Public sector bodies hold a significant amount of data that might be of potential interest to citizens and businesses. However, the re-use potential of this data is still largely untapped because the data is not always published in a way that allows its easy discovery, understanding and re-use. Open Government Data (OGD) initiatives aim at increasing the availability of machine-readable data provided under an open license, and therefore these initiatives might facilitate re-use of government data, which in turn might lead to increased transparency and economic growth. However, recent studies show that still only a portion of the data provided by public sector bodies is truly open. Public sector bodies face a number of challenges when publishing OGD, and they need to address the relevant risks. Therefore there is a need for best practices and methodologies for the publication of OGD that would provide the responsible persons with clear guidance on how OGD initiatives should be implemented and how the known challenges and risks should be addressed.

  19. An Improved Cambridge Filter Pad Extraction Methodology to Obtain More Accurate Water and “Tar” Values: In Situ Cambridge Filter Pad Extraction Methodology

    Directory of Open Access Journals (Sweden)

    Ghosh David

    2014-07-01

    Previous investigations by others and internal investigations at Philip Morris International (PMI) have shown that the standard trapping and extraction procedure used for conventional cigarettes, defined in the International Standard ISO 4387 (Cigarettes -- Determination of total and nicotine-free dry particulate matter using a routine analytical smoking machine), is not suitable for high-water-content aerosols. Errors occur because of water losses during the opening of the Cambridge filter pad holder to remove the filter pad, as well as during the manual handling of the filter pad, and because the commercially available filter pad holder, which is constructed of plastic, may adsorb water. This results in inaccurate values for the water content, and erroneous and overestimated values for Nicotine-Free Dry Particulate Matter (NFDPM). A modified 44 mm Cambridge filter pad holder and extraction equipment that supports in situ extraction methodology have been developed and tested. The principle of the in situ extraction methodology is to avoid any of the above-mentioned water losses by extracting the loaded filter pad while it is kept in the Cambridge filter pad holder, which is hermetically sealed by two caps. This is achieved by flushing the extraction solvent numerous times through the hermetically sealed Cambridge filter pad holder by means of an in situ extractor. The in situ methodology showed a significantly more complete water recovery, resulting in more accurate NFDPM values for high-water-content aerosols compared to the standard ISO methodology. The work presented in this publication demonstrates that the in situ extraction methodology applies to a wider range of smoking products and smoking regimens, whereas the standard ISO methodology only applies to a limited range of smoking products and smoking regimens, e.g., conventional cigarettes smoked under the ISO smoking regimen. In cases where a comparison of yields between the PMI HTP and

  20. A Generalizable Methodology for Quantifying User Satisfaction

    Science.gov (United States)

    Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung

    Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction, and discuss how they can be exploited to improve user experience and optimize resource allocation.
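
    The survival curves that this kind of session-time analysis builds on can be estimated with the standard Kaplan-Meier product-limit estimator; the following pure-Python sketch uses invented session data (censoring marks users still connected when measurement stopped), and is not the paper's model:

```python
def kaplan_meier(durations, observed):
    """Kaplan-Meier survival estimate S(t) for session times.

    durations: session lengths; observed: 1 if the session ended
    (an "event"), 0 if the observation was censored (user still connected).
    """
    event_times = sorted(set(d for d, o in zip(durations, observed) if o))
    surv, curve = 1.0, {}
    for t in event_times:
        at_risk = sum(1 for d in durations if d >= t)
        events = sum(1 for d, o in zip(durations, observed) if d == t and o)
        surv *= 1.0 - events / at_risk  # product-limit update
        curve[t] = surv
    return curve

# Hypothetical session times (minutes); the 30-minute session is censored.
durations = [5, 10, 10, 20, 30]
observed = [1, 1, 1, 1, 0]
curve = kaplan_meier(durations, observed)
print(curve[10])  # 0.4
```

    A longer median survival time under better network conditions would then be read as higher user satisfaction, which is the gist of the passive-measurement approach.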

  1. An Innovative Synthesis Methodology for Process Intensification

    DEFF Research Database (Denmark)

    Lutze, Philip

    to improve a process. However, to date only a limited number have achieved implementation in industry, such as reactive distillation, dividing wall columns and reverse flow reactors. A reason for this is that the identification of the best PI option is neither simple nor systematic. That is to decide where......‐based solution approach. Starting from an analysis of existing processes, the methodology generates a set of PI process options. Subsequently, the initial search space is reduced through an ordered sequence of steps. As the search space decreases, more process details are added, increasing the complexity...

  2. A Micropillar Compression Methodology for Ductile Damage Quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  3. A micropillar compression methodology for ductile damage quantification

    NARCIS (Netherlands)

    Tasan, C.C.; Hoefnagels, J.P.M.; Geers, M.G.D.

    2012-01-01

    Microstructural damage evolution is reported to influence significantly the failures of new high-strength alloys. Its accurate quantification is, therefore, critical for (1) microstructure optimization and (2) continuum damage models to predict failures of these materials. As existing methodologies

  4. Methodological practicalities in analytical generalization

    DEFF Research Database (Denmark)

    Halkier, Bente

    2011-01-01

    In this article, I argue that the existing literature on qualitative methodologies tends to discuss analytical generalization at a relatively abstract and general theoretical level. It is, however, not particularly straightforward to “translate” such abstract epistemological principles into more operative methodological strategies for producing analytical generalizations in research practices. Thus, the aim of the article is to contribute to the discussions among qualitatively working researchers about generalizing by way of exemplifying some of the methodological practicalities in analytical generalization. Theoretically, the argumentation in the article is based on practice theory. The main part of the article describes three different examples of ways of generalizing on the basis of the same qualitative data material. There is a particular focus on describing the methodological strategies.

  5. Nanotoxicology materials, methodologies, and assessments

    CERN Document Server

    Durán, Nelson; Alves, Oswaldo L; Zucolotto, Valtencir

    2014-01-01

    This book begins with a detailed introduction to engineered nanostructures, followed by a section on methodologies used in research on cytotoxicity and genotoxicity, and concluding with evidence for the cyto- and genotoxicity of specific nanoparticles.

  6. Reflective Methodology: The Beginning Teacher

    Science.gov (United States)

    Templeton, Ronald K.; Siefert, Thomas E.

    1970-01-01

    Offers a variety of specific techniques which will help the beginning teacher to implement reflective methodology and create an inquiry-centered classroom atmosphere, at the same time meeting the many more pressing demands of first-year teaching. (JES)

  7. Does Metformin Reduce Cancer Risks? Methodologic Considerations.

    Science.gov (United States)

    Golozar, Asieh; Liu, Shuiqing; Lin, Joeseph A; Peairs, Kimberly; Yeh, Hsin-Chieh

    2016-01-01

    The substantial burden of cancer and diabetes and the association between the two conditions has been a motivation for researchers to look for targeted strategies that can simultaneously affect both diseases and reduce their overlapping burden. In the absence of randomized clinical trials, researchers have taken advantage of the availability and richness of administrative databases and electronic medical records to investigate the effects of drugs on cancer risk among diabetic individuals. The majority of these studies suggest that metformin could potentially reduce cancer risk. However, the validity of this purported reduction in cancer risk is limited by several methodological flaws either in the study design or in the analysis. Whether metformin use decreases cancer risk relies heavily on the availability of valid data sources with complete information on confounders, accurate assessment of drug use, appropriate study design, and robust analytical techniques. The majority of the observational studies assessing the association between metformin and cancer risk suffer from methodological shortcomings and efforts to address these issues have been incomplete. Future investigations on the association between metformin and cancer risk should clearly address the methodological issues due to confounding by indication, prevalent user bias, and time-related biases. Although the proposed strategies do not guarantee a bias-free estimate for the association between metformin and cancer, they will reduce synthesis of and reporting of erroneous results.

  8. The significance of music in teaching Turkish

    Directory of Open Access Journals (Sweden)

    Feyzan Göher Vural

    2018-03-01

    The whole history of teaching foreign languages is, in a way, a search for the best methodologies of doing so. Methods of teaching can be seen as systems providing the optimal ways for students to acquire basic knowledge of a foreign language. The existing wide variety of methods suggests choosing those most efficient for particular individuals. Each of the methodologies used relies on a number of theoretical underpinnings. This article advances proposals for using music in teaching Turkish as a foreign language, based on the principles of suggestopedia, the psychology of positive teaching developed by G. Lozanov. It is claimed that suggestopedia helps students learn a foreign language 3 to 5 times faster than those who rely on traditional methods of study. In suggestopedia, colloquial dialogues in a foreign language are pronounced and/or listened to in the same way as music, and music, without lyrics, is used as a facilitating and motivating factor. However, teaching based on suggestopedia alone will not suffice, since in its classical form it is limited to adult audiences, and small ones at that. Other peculiarities of students and groups also have to be accounted for, such as their linguistic background: if their first language is a Turkic one, they can be expected to learn Turkish faster, since they are already familiar with the melodic mode of the language. It can nevertheless be postulated that the use of songs in language learning helps develop skills such as vocabulary, rules of grammar and correct pronunciation. Songs facilitate the teaching of new words, grammatical rules, directions, layout, numbers and demonstrative adjectives. This is both useful and interesting for all categories of learners, children as well as adults.

  9. Methodologies used in Project Management

    OpenAIRE

    UNGUREANU, Adrian; UNGUREANU, Anca

    2014-01-01

    Undoubtedly, a methodology properly defined and strictly followed for project management provides a firm guarantee that the work will be done on time, on budget and according to specifications. A project management methodology is, in simple terms, a “must-have” for avoiding failure and reducing risks, because it is one of the critical success factors, alongside the basic skills of the management team. It is the simple way to guide the team through the design and execution phases, processes and tasks throughout...

  10. Methodology for ranking restoration options

    DEFF Research Database (Denmark)

    Jensen, Per Hedemann

    1999-01-01

    techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated...

  11. Did the corporatization of Portuguese hospitals significantly change their productivity?

    Science.gov (United States)

    Ferreira, Diogo; Marques, Rui Cunha

    2015-04-01

    This paper aims to investigate whether the market structure reforms in the Portuguese health system have improved hospital performance and productivity. A robust non-parametric Malmquist index is applied to measure group performance. The significance of the results is examined using a conditional and non-conditional subsampling bootstrap-based methodology, enhanced by the likelihood cross-validation criterion based on the k-nearest-neighbours method. The sample contains information on 216 non-corporatized and 176 corporatized Portuguese hospitals for the period 2002–2009. Five models were applied, based on three study dimensions (internment, emergencies and doctor visits). The results show that although corporatized hospitals presented the highest efficiency consistency, they also had the lowest levels of productivity, while the hospitals under the traditional administrative public management system were the ones with the best average performance. However, several best practices were also found in all groups, with the limited companies often dominated by both non-corporatized and public-enterprise entities. Consistent output ranges in which each group dominates the others were also identified. It was possible to conclude that the more autonomy a hospital had from the Ministry of Health, the lower was its productivity.
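
    The significance testing described above can be illustrated with a generic percentile bootstrap (a sketch, not the paper's conditional subsampling procedure; the function name and resampling scheme are illustrative assumptions): resample the hospital-level productivity indices and call a group's mean Malmquist index significantly different from 1 if 1 falls outside the bootstrap confidence interval.

```python
import random

def bootstrap_ci(values, n_boot=2000, alpha=0.05, seed=0):
    # Percentile-bootstrap confidence interval for a group mean
    # (e.g. a mean Malmquist productivity index). The group's index is
    # "significantly different from 1" if 1 lies outside the interval.
    rng = random.Random(seed)
    mean = lambda xs: sum(xs) / len(xs)
    stats = sorted(mean([rng.choice(values) for _ in values])
                   for _ in range(n_boot))
    lo = stats[int((alpha / 2) * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

    For a group of indices clustered above 1, the lower bound stays above 1 and the productivity gain is declared significant; a group scattered around 1 yields an interval containing 1.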

  12. Magnetic resonance imaging methodology

    International Nuclear Information System (INIS)

    Moser, Ewald; Stadlbauer, Andreas; Windischberger, Christian; Quick, Harald H.; Ladd, Mark E.

    2009-01-01

    Magnetic resonance (MR) methods are non-invasive techniques to provide detailed, multi-parametric information on human anatomy, function and metabolism. Sensitivity, specificity, spatial and temporal resolution may, however, vary depending on hardware (e.g., field strength, gradient strength and speed) and software (optimised measurement protocols and parameters for the various techniques). Furthermore, multi-modality imaging may enhance specificity to better characterise complex disease patterns. Positron emission tomography (PET) is an interesting, largely complementary modality, which might be combined with MR. Despite obvious advantages, combining these rather different physical methods may also pose challenging problems. At this early stage, it seems that PET quality may be preserved in the magnetic field and, if an adequate detector material is used for the PET, MR sensitivity should not be significantly degraded. Again, this may vary for the different MR techniques, whereby functional and metabolic MR is more susceptible than standard anatomical imaging. Here we provide a short introduction to MR basics and MR techniques, also discussing advantages, artefacts and problems when MR hardware and PET detectors are combined. In addition to references for more detailed descriptions of MR fundamentals and applications, we provide an early outlook on this novel and exciting multi-modality approach to PET/MR. (orig.)

  13. A study on the advanced statistical core thermal design methodology

    International Nuclear Information System (INIS)

    Lee, Seung Hyuk

    1992-02-01

    A statistical core thermal design methodology for generating the limit DNBR and the nominal DNBR is proposed and used in assessing the best-estimate thermal margin in a reactor core. Firstly, the Latin Hypercube Sampling Method instead of the conventional Experimental Design Technique is utilized as an input sampling method for a regression analysis, to evaluate its sampling efficiency. Secondly, and as the main topic, the Modified Latin Hypercube Sampling and Hypothesis Test Statistics method is proposed as a substitute for the current statistical core thermal design method. This new methodology adopts a 'Modified Latin Hypercube Sampling Method' which uses the mean values of each interval of the input variables instead of random values, to avoid the extreme cases that arise in the tail areas of some parameters. Next, the independence between the input variables is verified through a 'Correlation Coefficient Test' for statistical treatment of their uncertainties. The distribution type of the DNBR response is then determined through a 'Goodness of Fit Test'. Finally, the limit DNBR with one-sided 95% probability and 95% confidence level, DNBR 95/95, is estimated. The advantage of this methodology over the conventional statistical method using the Response Surface and Monte Carlo simulation techniques lies in its simplicity of analysis procedure, while maintaining the same level of confidence in the limit DNBR result. This methodology is applied to two cases of DNBR margin calculation. The first case is the determination of the limit DNBR, where the DNBR margin is the difference between the nominal DNBR and the limit DNBR. The second case is the determination of the nominal DNBR, where the DNBR margin is the difference between the lower limit value of the nominal DNBR and the CHF correlation limit being used. From this study, it is deduced that the proposed methodology gives good agreement in the DNBR results.
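
    The sampling idea in the abstract, one draw per equal-probability stratum taken at a central value rather than at a random point within the stratum, can be sketched roughly as follows (a minimal illustration using stratum midpoints as the central values; the function name and interface are assumptions, not the paper's code):

```python
import random

def modified_lhs(n, inv_cdfs, seed=0):
    # One sample per equal-probability stratum, taken at the stratum's
    # central (midpoint) value of each input distribution rather than
    # at a random point, so extreme tail values are avoided. Each
    # column is shuffled independently to randomize the pairing of
    # variables, as in ordinary Latin Hypercube Sampling.
    rng = random.Random(seed)
    cols = []
    for inv in inv_cdfs:  # inverse CDF of each input variable
        col = [inv((i + 0.5) / n) for i in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))  # n design points
```

    With n = 4 strata each variable is sampled at probability levels 0.125, 0.375, 0.625 and 0.875, never at the extremes near 0 or 1.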

  14. Acceleration-based methodology to assess the blast mitigation performance of explosive ordnance disposal helmets

    Science.gov (United States)

    Dionne, J. P.; Levine, J.; Makris, A.

    2018-01-01

    To design the next generation of blast mitigation helmets that offer increasing levels of protection against explosive devices, manufacturers must be able to rely on appropriate test methodologies and human surrogates that will differentiate the performance level of various helmet solutions and ensure user safety. Ideally, such test methodologies and associated injury thresholds should be based on widely accepted injury criteria relevant within the context of blast. Unfortunately, even though significant research has taken place over the last decade in the area of blast neurotrauma, there currently exists no agreement on the injury mechanisms for blast-induced traumatic brain injury. In the absence of such widely accepted test methods and injury criteria, the current study presents a specific blast test methodology focusing on explosive ordnance disposal protective equipment, involving the readily available Hybrid III mannequin, initially developed for the automotive industry. The limited applicability of the associated brain injury criteria (based on both linear and rotational head acceleration) is discussed in the context of blast. Test results encompassing a large number of blast configurations and personal protective equipment are presented, emphasizing the possibility of developing useful correlations between blast parameters, such as the scaled distance, and mannequin engineering measurements (head acceleration). Suggestions are put forward for a practical standardized blast testing methodology that takes into account limitations in the applicability of acceleration-based injury criteria as well as the inherent variability of blast testing results.
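
    The "scaled distance" mentioned as a blast parameter is conventionally the Hopkinson-Cranz cube-root scaled distance; its use as a correlate of the mannequin measurements in this particular study is the abstract's claim, the formula itself is standard:

```python
def scaled_distance(standoff_m, charge_kg):
    # Hopkinson-Cranz scaling: Z = R / W**(1/3), with standoff R in
    # metres and charge mass W in kg (TNT equivalent). Blast-wave
    # parameters collapse onto curves in Z rather than in R or W alone,
    # which is what makes Z a natural correlate for measurements such
    # as head acceleration.
    return standoff_m / charge_kg ** (1.0 / 3.0)
```
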

  15. Housing Accessibility Methodology Targeting Older People

    DEFF Research Database (Denmark)

    Helle, Tina

    accessibility problems before the planning of housing intervention strategies. It is also critical that housing standards addressing accessibility intended to accommodate people with functional limitations are valid in the sense that their definitions truly support accessibility. However, there is a paucity...... of valid and reliable assessment instruments targeting housing accessibility, and in-depth analysis of factors potentially impacting on reliability in complex assessment situations is remarkably absent. Moreover, the knowledge base informing the housing standards appears to be vague. We may therefore...... reasonably question the validity of the housing standards addressing accessibility. This thesis addresses housing accessibility methodology in general and the reliability of assessment and the validity of standards targeting older people with functional limitations and a dependence on mobility devices...

  16. Risk based limits for Operational Safety Requirements

    International Nuclear Information System (INIS)

    Cappucci, A.J. Jr.

    1993-01-01

    OSR limits are designed to protect the assumptions made in the facility safety analysis in order to preserve the safety envelope during facility operation. Normally, limits are set based on 'worst case conditions' without regard to the likelihood (frequency) of a credible event occurring. In special cases where the accident analyses are based on 'time at risk' arguments, it may be desirable to control the time during which the facility is at risk. A methodology has been developed to use OSR limits to control the source terms and the times these source terms would be available, thus controlling the acceptable risk to a nuclear process facility. The methodology defines a new term, 'gram-days'. This term represents the area under a source term (inventory) vs. time curve, which represents the risk to the facility. Using the concept of gram-days (normalized to one year) allows the use of an accounting scheme to control the risk under the inventory vs. time curve. The methodology results in at least three OSR limits: (1) control of the maximum inventory or source term, (2) control of the maximum gram-days for the period based on a source-term-weighted average, and (3) control of the maximum gram-days at the individual source term levels. Basing OSR limits on risk-based safety analysis is feasible, and a basis for development of risk-based limits is defensible. However, monitoring inventories and the frequencies required to maintain facility operation within the safety envelope may be complex and time consuming.
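
    The gram-days accounting described above, risk as the area under the inventory-vs-time curve, reduces to a running sum when inventories are recorded daily. A minimal sketch (the function names and the daily-sampling assumption are mine, not from the source):

```python
def gram_days(inventory_by_day):
    # Area under the source-term (inventory, grams) vs. time (days)
    # curve, approximated as a daily sum.
    return sum(inventory_by_day)

def within_osr_limits(inventory_by_day, max_inventory, max_gram_days):
    # Two of the OSR checks from the abstract: (1) a cap on the
    # instantaneous inventory, and (2) a cap on accumulated gram-days
    # over the accounting period.
    return (max(inventory_by_day) <= max_inventory
            and gram_days(inventory_by_day) <= max_gram_days)
```

    A facility holding 100 g for 30 days accumulates 3000 gram-days; a brief excursion to 200 g violates the inventory cap even though the accumulated gram-days stay small.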

  17. Groundwater vulnerability to climate change: A review of the assessment methodology.

    Science.gov (United States)

    Aslam, Rana Ammar; Shrestha, Sangam; Pandey, Vishnu Prasad

    2018-01-15

    Impacts of climate change on water resources, especially groundwater, can no longer be hidden. These impacts are further exacerbated under the integrated influence of climate variability, climate change and anthropogenic activities. The degree of impact varies according to geographical location and other factors leading systems and regions towards different levels of vulnerability. In the recent past, several attempts have been made in various regions across the globe to quantify the impacts and consequences of climate and non-climate factors in terms of vulnerability to groundwater resources. Firstly, this paper provides a structured review of the available literature, aiming to critically analyse and highlight the limitations and knowledge gaps involved in vulnerability (of groundwater to climate change) assessment methodologies. The effects of indicator choice and the importance of including composite indicators are then emphasised. A new integrated approach for the assessment of groundwater vulnerability to climate change is proposed to successfully address those limitations. This review concludes that the choice of indicator has a significant role in defining the reliability of computed results. The effect of an individual indicator is also apparent but the consideration of a combination (variety) of indicators may give more realistic results. Therefore, in future, depending upon the local conditions and scale of the study, indicators from various groups should be chosen. Furthermore, there are various assumptions involved in previous methodologies, which limit their scope by introducing uncertainty in the calculated results. These limitations can be overcome by implementing the proposed approach. Copyright © 2017 Elsevier B.V. All rights reserved.
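
    The point about combining indicators can be made concrete with a standard composite-indicator construction (a generic sketch, not the specific integrated approach proposed in the paper; names and weighting are illustrative): min-max normalise each indicator and take a weighted mean.

```python
def composite_vulnerability(values, bounds, weights):
    # Generic composite indicator: min-max normalise each raw indicator
    # to [0, 1] using its (min, max) bounds, then take the weighted
    # mean. Higher scores denote higher vulnerability; the weights
    # encode the relative importance assigned to each indicator.
    total_w = sum(weights)
    score = 0.0
    for v, (lo, hi), w in zip(values, bounds, weights):
        score += w * (v - lo) / (hi - lo)
    return score / total_w
```

    The sensitivity of the result to the choice and weighting of indicators, the paper's central methodological concern, can be probed simply by recomputing the score under alternative weight vectors.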

  18. Partnerships – Limited partnerships and limited liability limited partnerships

    OpenAIRE

    Henning, Johan J.

    2000-01-01

    Consideration of the Limited Liability Partnership Act 2000 which introduced a new corporate entity, carrying the designations “partnership” and “limited” which allow members to limit their liability whilst organising themselves internally as a partnership. Article by Professor Johan Henning (Director of the Centre for Corporate Law and Practice, IALS and Dean of the Faculty of Law, University of the Free State, South Africa). Published in Amicus Curiae - Journal of the Institute of Advanced ...

  19. Hybrid Risk Management Methodology: A Case Study

    Directory of Open Access Journals (Sweden)

    Jacky Siu-Lun Ting

    2009-10-01

    Full Text Available Risk management is a decision-making process involving considerations of political, social, economic and engineering factors, with relevant risk assessments relating to a potential hazard. In the last decade, a number of risk management tools have been introduced and employed to manage and minimize the uncertainty and threats realization to organizations. However, the focuses of these methodologies differ, and companies need to adopt various risk management principles to visualize a full picture of the organizational risk level. In this regard, this paper presents a new approach to risk management that integrates Hierarchical Holographic Modeling (HHM), Enterprise Risk Management (ERM) and Business Recovery Planning (BCP) for identifying and assessing risks as well as managing the consequences of realized residual risks. To illustrate the procedures of the proposed methodology, a logistics company, ABC Limited, is chosen to serve as a case study. Through applying HHM and ERM to investigate and assess the risk, ABC Limited could better evaluate the potential risks and then take responsive actions (e.g. BCP) to handle the risks and crises in the near future.

  20. Assessing species saturation: conceptual and methodological challenges.

    Science.gov (United States)

    Olivares, Ingrid; Karger, Dirk N; Kessler, Michael

    2018-05-07

    Is there a maximum number of species that can coexist? Intuitively, we assume an upper limit to the number of species in a given assemblage, or that a lineage can produce, but defining and testing this limit has proven problematic. Herein, we first outline seven general challenges of studies on species saturation, most of which are independent of the actual method used to assess saturation. Among these are the challenge of defining saturation conceptually and operationally, the importance of setting an appropriate referential system, and the need to discriminate among patterns, processes and mechanisms. Second, we list and discuss the methodological approaches that have been used to study species saturation. These approaches vary in time and spatial scales, and in the variables and assumptions needed to assess saturation. We argue that assessing species saturation is possible, but that many studies conducted to date have conceptual and methodological flaws that prevent us from currently attaining a good idea of the occurrence of species saturation. © 2018 Cambridge Philosophical Society.

  1. Ecological Significance of Marine Microzooplankton

    Digital Repository Service at National Institute of Oceanography (India)

    Godhantaraman, N.

    ). Studies concerning microzooplankton in marine coastal, estuarine and brackish-water systems along tropical Indian waters are limited. Hence, in the following sections, I provide the results obtained by the research conducted in the tropical Vellar... estuarine systems, southeast coast of India. This was one of the first comprehensive studies on microzooplankton in India. There are also comparative accounts of microzooplankton researches from my studies in Japanese coastal marine waters. Microzooplankton...

  2. The intersections between TRIZ and forecasting methodology

    Directory of Open Access Journals (Sweden)

    Georgeta BARBULESCU

    2010-12-01

    Full Text Available The authors’ intention is to correlate the basic knowledge of using the TRIZ methodology (Theory of Inventive Problem Solving; in Russian: Teoriya Resheniya Izobretatelskikh Zadatch) as a problem-solving tool, meant to help decision makers perform more meaningful forecasting exercises. The idea is to identify the TRIZ features and instruments (e.g. the 40 inventive principles) for bringing out the noise-and-signal problem, for trend identification (qualitative and quantitative tendencies), and as support tools in technological forecasting, so that decision makers are able to refine and increase the level of confidence in the forecasting results. The interest in connecting TRIZ to forecasting methodology nowadays relates to the massive application of TRIZ methods and techniques for engineering system development world-wide and to the growing application of TRIZ’s concepts and paradigms to the improvement of non-engineering systems (including business and economic applications).

  3. Fracture mechanics approach to estimate rail wear limits

    Science.gov (United States)

    2009-10-01

    This paper describes a systematic methodology to estimate allowable limits for rail head wear in terms of vertical head-height loss, gage-face side wear, and/or the combination of the two. This methodology is based on the principles of engineering fracture mechanics...

  4. Evaluation of proliferation resistance using the INPRO methodology

    International Nuclear Information System (INIS)

    Yang, Myung Seung; Park, Joo Hwan; Ko, Won Il; Song, Kee Chan; Choi, Kun Mo; Kim, Jin Kyoung

    2007-01-01

    The IAEA launched the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO) and developed the INPRO Methodology to provide guidelines and to assess the characteristics of a future innovative nuclear energy system in areas such as safety, economics, waste management, and proliferation resistance. The proliferation resistance area of the INPRO Methodology is reviewed here, and modifications for further improvements are proposed. The evaluation metrics including the evaluation parameters, evaluation scales and acceptance limits are developed for a practical application of the methodology to assess the proliferation resistance. The proliferation resistant characteristics of the DUPIC fuel cycle are assessed by applying the modified INPRO Methodology based on the developed evaluation metrics and acceptance criteria. The evaluation procedure and the metrics can be utilized as a reference for an evaluation of the proliferation resistance of a future innovative nuclear energy system

  5. Historical Significant Volcanic Eruption Locations

    Data.gov (United States)

    Department of Homeland Security — A significant eruption is classified as one that meets at least one of the following criteria: caused fatalities, caused moderate damage (approximately $1 million or...

  6. Managerial Methodology in Public Institutions

    Directory of Open Access Journals (Sweden)

    Ion VERBONCU

    2010-10-01

    Full Text Available One of the most important ways of making public institutions more efficient is by applying managerial methodology: promoting management tools and modern, sophisticated methodologies, and designing/redesigning and maintaining the management process and its components. Their implementation bears the imprint of the constructive and functional particularities of public institutions, decentralized and devolved, and, of course, of the expertise of these organizations’ managers. Managerial methodology is addressed through three important instruments: diagnosis, management by objectives and the scoreboard. Its presence in the performance management process should be mandatory, given its favorable influence on managerial and economic performance and on the rigour of the managers’ approach.

  7. Blanket safety by GEMSAFE methodology

    International Nuclear Information System (INIS)

    Sawada, Tetsuo; Saito, Masaki

    2001-01-01

    General Methodology of Safety Analysis and Evaluation for Fusion Energy Systems (GEMSAFE) has been applied to a number of fusion system designs, such as the R-tokamak, the Fusion Experimental Reactor (FER), and the International Thermonuclear Experimental Reactor (ITER) designs in both the Conceptual Design Activities (CDA) and Engineering Design Activities (EDA) stages. Though the major objective of GEMSAFE is the reasonable selection of design basis events (DBEs), it is also useful for elucidating related safety functions as well as the requirements to ensure safety. In this paper, we apply the methodology to fusion systems with future tritium breeding blankets and clarify which points of the system should be of concern from the standpoint of ensuring safety. In this context, we have obtained five DBEs that are related to the blanket system. We have also clarified the safety functions required to prevent accident propagation initiated by those blanket-specific DBEs. The outline of the methodology is also reviewed. (author)

  8. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  9. Methodological Guidelines for Advertising Research

    DEFF Research Database (Denmark)

    Rossiter, John R.; Percy, Larry

    2017-01-01

    In this article, highly experienced advertising academics and advertising research consultants John R. Rossiter and Larry Percy present and discuss what they believe to be the seven most important methodological guidelines that need to be implemented to improve the practice of advertising research....... Their focus is on methodology, defined as first choosing a suitable theoretical framework to guide the research study and then identifying the advertising responses that need to be studied. Measurement of those responses is covered elsewhere in this special issue in the article by Bergkvist and Langner. Most...

  10. Acoustic emission methodology and application

    CERN Document Server

    Nazarchuk, Zinoviy; Serhiyenko, Oleh

    2017-01-01

    This monograph analyses in detail the physical aspects of elastic wave radiation during deformation or fracture of materials. It presents the methodological bases for the practical use of acoustic emission (AE) devices, and describes the results of theoretical and experimental research on evaluating the crack-growth resistance of materials and selecting useful AE signals. The efficiency of this methodology is shown through the diagnostics of industrial objects of various purposes. The authors obtained the results of their experimental research with the help of new methods and facilities.

  11. An LWR design decision Methodology

    International Nuclear Information System (INIS)

    Leahy, T.J.; Rees, D.C.; Young, J.

    1982-01-01

    While all parties involved in nuclear plant regulation endeavor to make decisions which optimize the considerations of plant safety and financial impacts, these decisions are generally made without the benefit of a systematic and rigorous approach to the questions confronting the decision makers. A Design Decision Methodology has been developed which provides such a systematic approach. By employing this methodology, which makes use of currently accepted probabilistic risk assessment techniques and cost estimation, informed decisions may be made against a background of comparisons between the relative levels of safety and costs associated with various design alternatives

  12. Qualitative methodology in developmental psychology

    DEFF Research Database (Denmark)

    Demuth, Carolin; Mey, Günter

    2015-01-01

    Qualitative methodology presently is gaining increasing recognition in developmental psychology. Although the founders of developmental psychology to a large extent already used qualitative procedures, the field was long dominated by a (post) positivistic quantitative paradigm. The increasing recognition of qualitative research offers a promising avenue to advance the field in this direction.

  13. Qualitative Methodology in Unfamiliar Cultures

    DEFF Research Database (Denmark)

    Svensson, Christian Franklin

    2014-01-01

    This case study discusses qualitative fieldwork in Malaysia. The trends in higher education led to investigating how and why young Indians and Chinese in Malaysia are using the university to pursue a life strategy. Given the importance of field context in designing and analysing research based on a qualitative methodology, conscious reflection on research design and objectivity is important when doing fieldwork. This case study discusses such reflections. Emphasis throughout is given to applied qualitative methodology and its contributions to the social sciences, in particular having to do...

  14. Observational methodology in sport sciences

    Directory of Open Access Journals (Sweden)

    M. Teresa Anguera

    2013-11-01

    Full Text Available This paper reviews the conceptual framework, the key literature and the methods (observation tools, such as category systems and field formats, and coding software, etc. that should be followed when conducting research from the perspective of observational methodology. The observational designs used by the authors’ research group over the last twenty years are discussed, and the procedures for analysing data and assessing their quality are described. Mention is also made of the latest methodological trends in this field, such as the use of mixed methods.

  15. Reflections on Design Methodology Research

    DEFF Research Database (Denmark)

    2011-01-01

    We shall reflect on the results of Design Methodology research and their impact on design practice. In the past 50 years the number of researchers in the field has expanded enormously – as has the number of publications. During the same period design practice and its products have changed...... and produced are also now far more complex and distributed, putting designers under ever increasing pressure. We shall address the question: Are the results of Design Methodology research appropriate and are they delivering the expected results in design practice? In our attempt to answer this question we...

  16. Particle Count Limits Recommendation for Aviation Fuel

    Science.gov (United States)

    2015-10-05

    Particle Counter Methodology: particle counts are taken utilizing calibration methodologies and standardized cleanliness code ratings (ISO 11171; ISO ...). Cleanliness limits (Receipt / Vehicle Fuel Tank / Fuel Injector):
    Aviation fuel: DEF (AUST) 5695B 18/16/13; Parker 18/16/13, 14/10/7; Pamas / Parker / Particle Solutions 19/17...12; U.S. DOD 19/17/14/13*
    Diesel fuel: World Wide Fuel Charter 5th 18/16/13; DEF (AUST) 5695B 18/16/13; Caterpillar 18/16/13; Detroit Diesel 18/16/13; MTU
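
    Cleanliness codes of the form 18/16/13 follow the ISO 4406 convention: one scale number per particle-size channel (counts per mL at ≥4 µm, ≥6 µm and ≥14 µm), with each step up the scale doubling the allowed particle count. A rough sketch of the mapping, assuming the standard doubling scale (boundary handling in the actual standard's table may differ slightly):

```python
import math

def iso4406_scale(count_per_ml):
    # ISO 4406 scale number R covers counts in (0.01 * 2**(R-1),
    # 0.01 * 2**R] particles per mL, i.e. each step doubles the count.
    return max(1, math.ceil(math.log2(count_per_ml / 0.01)))

def iso4406_code(c4, c6, c14):
    # Three-part code from counts per mL at >=4, >=6 and >=14 microns.
    return "/".join(str(iso4406_scale(c)) for c in (c4, c6, c14))
```

    For example, roughly 2000, 400 and 70 particles per mL in the three channels map to the 18/16/13 rating quoted in the table above.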

  17. Significance evaluation in factor graphs

    DEFF Research Database (Denmark)

    Madsen, Tobias; Hobolth, Asger; Jensen, Jens Ledet

    2017-01-01

    in genomics and the multiple-testing issues accompanying them, accurate significance evaluation is of great importance. Results: Two novel numerical approximations for the evaluation of statistical significance are presented: first, a method using importance sampling; second, a method based on a saddlepoint approximation. We develop algorithms to efficiently compute the approximations and compare them to naive sampling and the normal approximation. The individual merits of the methods are analysed both from.... Conclusions: The applicability of the saddlepoint approximation and importance sampling is demonstrated on known models in the factor graph framework. Using the two methods we can substantially reduce computational cost without compromising accuracy. This contribution allows analyses of large datasets...
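
    Importance sampling for significance evaluation can be illustrated on a toy model (a sum of independent standard normals standing in for a factor-graph score; the tilting scheme and names are illustrative, not the paper's algorithm): sample from a distribution shifted toward the tail and reweight each hit by the likelihood ratio.

```python
import math
import random

def is_tail_prob(t, n=10, draws=20000, seed=0):
    # Estimate p = P(S >= t) for S = X1 + ... + Xn, Xi ~ N(0, 1) iid.
    # Naive sampling almost never reaches the tail; instead draw each
    # Xi from N(mu, 1) with mu = t / n (so E[S] = t) and reweight each
    # hit by the likelihood ratio exp(-mu * S + n * mu**2 / 2).
    rng = random.Random(seed)
    mu = t / n
    total = 0.0
    for _ in range(draws):
        s = sum(rng.gauss(mu, 1.0) for _ in range(n))
        if s >= t:
            total += math.exp(-mu * s + n * mu * mu / 2.0)
    return total / draws
```

    For t = 12 and n = 10 the exact tail probability is about 7e-5, well below what 20,000 naive draws could resolve, yet the tilted estimator recovers it with modest variance because roughly half the tilted draws land in the tail.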

  18. Significant Lactic Acidosis from Albuterol

    Directory of Open Access Journals (Sweden)

    Deborah Diercks

    2018-03-01

    Full Text Available Lactic acidosis is a clinical entity that demands rapid assessment and treatment to prevent significant morbidity and mortality. With increased lactate use across many clinical scenarios, lactate values themselves cannot be interpreted apart from their appropriate clinical picture. The significance of Type B lactic acidosis is likely understated in the emergency department (ED. Given the mortality that sepsis confers, a serum lactate is an important screening study. That said, it is with extreme caution that we should interpret and react to the resultant elevated value. We report a patient with a significant lactic acidosis. Though he had a high lactate value, he did not require aggressive resuscitation. A different classification scheme for lactic acidosis that focuses on the bifurcation of the “dangerous” and “not dangerous” causes of lactic acidosis may be of benefit. In addition, this case is demonstrative of the potential overuse of lactates in the ED.

  19. The Methodological Imperatives of Feminist Ethnography

    Directory of Open Access Journals (Sweden)

    Richelle D. Schrock

    2013-12-01

    Full Text Available Feminist ethnography does not have a single, coherent definition and is caught between struggles over the definition and goals of feminism and the multiple practices known collectively as ethnography. Towards the end of the 1980s, debates emerged that problematized feminist ethnography as a productive methodology and these debates still haunt feminist ethnographers today. In this article, I provide a concise historiography of feminist ethnography that summarizes both its promises and its vulnerabilities. I address the three major challenges I argue feminist ethnographers currently face, which include responding productively to feminist critiques of representing "others," accounting for feminisms' commitment to social change while grappling with poststructuralist critiques of knowledge production, and confronting the historical and ongoing lack of recognition for significant contributions by feminist ethnographers. Despite these challenges, I argue that feminist ethnography is a productive methodology and I conclude by delineating its methodological imperatives. These imperatives include producing knowledge about women's lives in specific cultural contexts, recognizing the potential detriments and benefits of representation, exploring women's experiences of oppression along with the agency they exercise in their own lives, and feeling an ethical responsibility towards the communities in which the researchers work. I argue that this set of imperatives enables feminist ethnographers to successfully navigate the challenges they face.

  20. (Limiting the greenhouse effect)

    Energy Technology Data Exchange (ETDEWEB)

    Rayner, S.

    1991-01-07

    Traveler attended the Dahlem Research Conference organized by the Freie Universität Berlin. The subject of the conference was Limiting the Greenhouse Effect: Options for Controlling Atmospheric CO{sub 2} Accumulation. Like all Dahlem workshops, this was a meeting of scientific experts, although the disciplines represented were broader than usual, ranging across anthropology, economics, international relations, forestry, engineering, and atmospheric chemistry. Participation by scientists from developing countries was limited. The conference was divided into four multidisciplinary working groups. Traveler acted as moderator for Group 3, which examined the question "What knowledge is required to tackle the principal social and institutional barriers to reducing CO{sub 2} emissions?" The working rapporteur was Jesse Ausubel of Rockefeller University. Other working groups examined the economic costs, benefits, and technical feasibility of options to reduce emissions per unit of energy service; the options for reducing energy use per unit of GNP; and the significance of linkage between strategies to reduce CO{sub 2} emissions and other goals. Draft reports of the working groups are appended. Overall, the conference identified a number of important research needs in all four areas. It may prove particularly important in bringing the social and institutional research needs relevant to climate change closer to the forefront of the scientific and policy communities than hitherto.

  1. Limits on fundamental limits to computation.

    Science.gov (United States)

    Markov, Igor L

    2014-08-14

    An indispensable part of our personal and working lives, computing has also become essential to industries and governments. Steady improvements in computer hardware have been supported by periodic doubling of transistor densities in integrated circuits over the past fifty years. Such Moore scaling now requires ever-increasing efforts, stimulating research in alternative hardware and stirring controversy. To help evaluate emerging technologies and increase our understanding of integrated-circuit scaling, here I review fundamental limits to computation in the areas of manufacturing, energy, physical space, design and verification effort, and algorithms. To outline what is achievable in principle and in practice, I recapitulate how some limits were circumvented, and compare loose and tight limits. Engineering difficulties encountered by emerging technologies may indicate yet unknown limits.

  2. CUEX methodology for assessing radiological impacts in the context of ICRP Recommendations

    International Nuclear Information System (INIS)

    Rohwer, P.S.; Kaye, S.V.; Struxness, E.G.

    1975-01-01

    The Cumulative Exposure Index (CUEX) methodology was developed to estimate and assess, in the context of International Commission on Radiological Protection (ICRP) Recommendations, the total radiation dose to man due to environmental releases of radioactivity from nuclear applications. Each CUEX, a time-integrated radionuclide concentration (e.g. μCi·h·cm⁻³), reflects the selected annual dose limit for the reference organ and the estimated total dose to that organ via all exposure modes for a specific exposure situation. To assess the radiological significance of an environmental release of radioactivity, calculated or measured radionuclide concentrations in a suitable environmental sampling medium are compared with CUEXs determined for that medium under comparable conditions. The models and computer codes used in the CUEX methodology to predict environmental transport and to estimate radiation dose have been thoroughly tested. These models and codes are identified and described briefly. Calculation of a CUEX is shown step by step. An application of the methodology to a hypothetical atmospheric release involving four radionuclides illustrates use of the CUEX computer code to assess the radiological significance of a release, and to determine the relative importance (i.e. percentage of the estimated total dose contributed) of each radionuclide and each mode of exposure. The data requirements of the system are shown to be extensive, but not excessive in view of the assessments and analyses provided by the CUEX code. (author)
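
    The comparison step described above, measured or calculated time-integrated concentrations set against CUEX limits with a percentage contribution per radionuclide, can be sketched as follows. This is not the CUEX computer code; the nuclide names and values are purely illustrative.

```python
def assess_release(measured, cuex):
    """Compare time-integrated concentrations with CUEX limit values.

    `measured` and `cuex` map radionuclide names to time-integrated
    concentrations in the same units (e.g. uCi*h/cm^3).  Returns the
    summed fraction of the annual dose limit (values > 1.0 mean the
    limit would be exceeded) and each nuclide's percentage share.
    """
    fractions = {n: measured[n] / cuex[n] for n in measured}
    total = sum(fractions.values())
    shares = {n: 100.0 * f / total for n, f in fractions.items()}
    return total, shares

# Hypothetical release of two nuclides, each against its own CUEX:
total, shares = assess_release(
    {"I-131": 2e-9, "Cs-137": 1e-9},
    {"I-131": 4e-9, "Cs-137": 4e-9},
)
```

    Here the release sums to 0.75 of the annual limit, with I-131 contributing two thirds of the estimated dose.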

  3. College Research Methodology Courses: Revisiting General Instructional Goals and Objectives

    Science.gov (United States)

    Lei, Simon A.

    2010-01-01

    A number of graduate (masters-level) students from a wide variety of academic disciplines have viewed a required introductory research methodology course negatively. These students often do not retain much of the previously learned material, thus limiting their success of subsequent research and statistics courses. The purpose of this article is…

  4. Scripting possible futures of nanotechnologies: A methodology that enhances reflexivity

    NARCIS (Netherlands)

    den Boer, Duncan; Rip, Arie; Speller, Sylvia

    2009-01-01

    Nanoscience is full of promises. However, these promises often do not take into account the realities of product development and the limited coupling with scientific research. On the basis of literature and earlier projects, we have developed a mapping methodology (“bridging gaps in the innovation

  5. Assessment of Wind Turbine Structural Integrity using Response Surface Methodology

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Svenningsen, Lasse; Moser, Wolfgang

    2016-01-01

    Highlights: • A new approach to assessment of site specific wind turbine loads is proposed. • The approach can be applied in both fatigue and ultimate limit state. • Two different response surface methodologies have been investigated. • The model uncertainty introduced by the response surfaces...

  6. A Big Data Analytics Methodology Program in the Health Sector

    Science.gov (United States)

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  7. Methodological decolonization and interculturality. Reflexions from the ethnographic research

    Directory of Open Access Journals (Sweden)

    Juan Pablo Puentes

    2015-12-01

    Full Text Available In this paper I explain some methodological crosslinks between the decolonial option, subaltern studies and postcolonial studies. In addition, I highlight certain limitations they present in their ethnographic research. Finally, I suggest some guidelines that could help us carry out research comprising an extended intercultural horizon

  8. The historical significance of oak

    Science.gov (United States)

    J. V. Thirgood

    1971-01-01

    A brief history of the importance of oak in Europe, contrasting the methods used in France and Britain to propagate the species and manage the forests for continued productivity. The significance of oak as a strategic resource during the sailing-ship era is stressed, and mention is made of the early development of oak management in North America. The international...

  9. Significance and popularity in music production

    Science.gov (United States)

    Monechi, Bernardo; Gravino, Pietro; Servedio, Vito D. P.; Tria, Francesca; Loreto, Vittorio

    2017-07-01

    Creative industries constantly strive for fame and popularity. Though highly desirable, popularity is not the only achievement artistic creations might ever acquire. Leaving a longstanding mark in the global production and influencing future works is an even more important achievement, usually acknowledged by experts and scholars. `Significant' or `influential' works are not always well known to the public or have sometimes been long forgotten by the vast majority. In this paper, we focus on the duality between what is successful and what is significant in the musical context. To this end, we consider a user-generated set of tags collected through an online music platform, whose evolving co-occurrence network mirrors the growing conceptual space underlying music production. We define a set of general metrics aiming at characterizing music albums throughout history, and their relationships with the overall musical production. We show how these metrics allow to classify albums according to their current popularity or their belonging to expert-made lists of important albums. In this way, we provide the scientific community and the public at large with quantitative tools to tell apart popular albums from culturally or aesthetically relevant artworks. The generality of the methodology presented here lends itself to be used in all those fields where innovation and creativity are in play.

  11. Methodologic frontiers in environmental epidemiology.

    OpenAIRE

    Rothman, K J

    1993-01-01

    Environmental epidemiology comprises the epidemiologic study of those environmental factors that are outside the immediate control of the individual. Exposures of interest to environmental epidemiologists include air pollution, water pollution, occupational exposure to physical and chemical agents, as well as psychosocial elements of environmental concern. The main methodologic problem in environmental epidemiology is exposure assessment, a problem that extends through all of epidemiologic re...

  12. IMSF: Infinite Methodology Set Framework

    Science.gov (United States)

    Ota, Martin; Jelínek, Ivan

    Software development is usually an integration task in enterprise environment - few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is lack of resources, a popular result being outsourcing, ‘body shopping’, and indirectly team and team member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face the problems of quality assurance and enterprise know-how management. The used methodology is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less connected with a set of task types. When the task type is not suitable, it causes problems that usually result in an undocumented ad-hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments its meta-model.

  13. Unattended Monitoring System Design Methodology

    International Nuclear Information System (INIS)

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-01-01

    A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. The data analyses then enables a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to those critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real world situations

  14. Environmental Zoning: Some methodological implications

    NARCIS (Netherlands)

    Ike, Paul; Voogd, Henk

    1991-01-01

    The purpose of this article is to discuss some methodological problems of environmental zoning. The principle of environmental zoning will be elaborated. In addition an overview is given of a number of approaches that have been followed in practice to arrive at an integral judgement. Finally some

  15. A methodology for string resolution

    International Nuclear Information System (INIS)

    Karonis, N.T.

    1992-11-01

    In this paper we present a methodology, not a tool. We present this methodology with the intent that it be adopted, on a case by case basis, by each of the existing tools in EPICS. In presenting this methodology, we describe each of its two components in detail and conclude with an example depicting how the methodology can be used across a pair of tools. The task of any control system is to provide access to the various components of the machine being controlled, for example, the Advanced Photon Source (APS). By access, we mean the ability to monitor the machine's status (reading) as well as the ability to explicitly change its status (writing). The Experimental Physics and Industrial Control System (EPICS) is a set of tools, designed to act in concert, that allows one to construct a control system. EPICS provides the ability to construct a control system that allows reading and writing access to the machine. It does this through the notion of databases. Each of the components of the APS that is accessed by the control system is represented in EPICS by a set of named database records. Once this abstraction is made, from physical device to named database records, the process of monitoring and changing the state of that device becomes the simple process of reading and writing information from and to its associated named records

  16. Counting stem cells : methodological constraints

    NARCIS (Netherlands)

    Bystrykh, Leonid V.; Verovskaya, Evgenia; Zwart, Erik; Broekhuis, Mathilde; de Haan, Gerald

    The number of stem cells contributing to hematopoiesis has been a matter of debate. Many studies use retroviral tagging of stem cells to measure clonal contribution. Here we argue that methodological factors can impact such clonal analyses. Whereas early studies had low resolution, leading to

  17. Test reactor risk assessment methodology

    International Nuclear Information System (INIS)

    Jennings, R.H.; Rawlins, J.K.; Stewart, M.E.

    1976-04-01

    A methodology has been developed for the identification of accident initiating events and the fault modeling of systems, including common mode identification, as these methods are applied in overall test reactor risk assessment. The methods are exemplified by a determination of the risks due to a loss of primary coolant flow in the Engineering Test Reactor

  18. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    Full Text Available The article deals with the investigation of theoretical and methodological principles of situational analysis. The necessity of situational analysis in modern conditions is proved, and the notion of “situational analysis” is defined. We have concluded that situational analysis is a continuous system study whose purpose is to identify signs of a dangerous situation, to evaluate such signs comprehensively as influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the situation on the system now and in the future, and to develop the managerial actions needed to bring the system back to norm. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of diagnostic, evaluative and searching functions in the process of situational analysis is proved. The basic methodological elements of situational analysis are grounded. Their substantiation will enable the analyst to develop adaptive methods that take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system, to diagnose such a situation and subject it to systemic, in-depth analysis, to identify risks and opportunities, and to make timely management decisions as required by a particular period.

  19. 16 Offsetting deficit conceptualisations: methodological ...

    African Journals Online (AJOL)

    uses the concepts of literacy practices and knowledge recontextualisation to ... 1996, 2000) theory of 'knowledge recontextualisation' in the development of curricula .... cognitive, social and cultural abilities needed to fit in and thrive in the HE learning .... this argument, that a methodology and analytic framework that seeks to ...

  20. Methodology for the case studies

    NARCIS (Netherlands)

    Smits, M.J.W.; Woltjer, G.B.

    2017-01-01

    This document is about the methodology and selection of the case studies. It is meant as a guideline for the case studies, and together with the other reports in this work package can be a source of inform ation for policy officers, interest groups and researchers evaluating or performing impact

  1. Safety at Work : Research Methodology

    NARCIS (Netherlands)

    Beurden, van K. (Karin); Boer, de J. (Johannes); Brinks, G. (Ger); Goering-Zaburnenko, T. (Tatiana); Houten, van Y. (Ynze); Teeuw, W. (Wouter)

    2012-01-01

    In this document, we provide the methodological background for the Safety at Work project. This document combines several project deliverables as defined in the overall project plan: validation techniques and methods (D5.1.1), performance indicators for safety at work (D5.1.2), personal protection

  2. Utility of radiotracer methodology in scientific research of industrial relevancy

    International Nuclear Information System (INIS)

    Kolar, Z.I.

    1990-01-01

    Utilization of radiotracer methodology in industrial research provides substantial scientific rather than directly demonstrable economic benefits. These benefits include better understanding of industrial processes and subsequently the development of new ones. Examples are given of the use of radiotracers in technological studies, and the significance of the obtained results is pointed out. Creative application of radiotracer methodology may contribute to the economic development and technological advancement of all countries, including the developing ones. (orig.) [de

  3. Time Limits : Effects on Recall

    OpenAIRE

    Hirano, Kinue

    2000-01-01

    This study investigates the effects of differing time limits and the level of language proficiency on the written recalls of 66 Japanese EFL undergraduates. Results showed that different time limits affected total recall, but not main ideas recalled. Regardless of proficiency level, the 20-minute group (Group 2) recalled a greater number of idea units than the 8-minute group (Group 1). However, no significant difference was found between Groups 1 and 2 regarding the recall of main ideas, alth...

  4. The Global Landscape of Occupational Exposure Limits--Implementation of Harmonization Principles to Guide Limit Selection.

    Science.gov (United States)

    Deveau, M; Chen, C-P; Johanson, G; Krewski, D; Maier, A; Niven, K J; Ripple, S; Schulte, P A; Silk, J; Urbanus, J H; Zalk, D M; Niemeier, R W

    2015-01-01

    Occupational exposure limits (OELs) serve as health-based benchmarks against which measured or estimated workplace exposures can be compared. In the years since the introduction of OELs to public health practice, both developed and developing countries have established processes for deriving, setting, and using OELs to protect workers exposed to hazardous chemicals. These processes vary widely, however, and have thus resulted in a confusing international landscape for identifying and applying such limits in workplaces. The occupational hygienist will encounter significant overlap in coverage among organizations for many chemicals, while other important chemicals have OELs developed by few, if any, organizations. Where multiple organizations have published an OEL, the derived value often varies considerably, reflecting differences in both risk policy and risk assessment methodology as well as access to available pertinent data. This article explores the underlying reasons for variability in OELs, and recommends the harmonization of risk-based methods used by OEL-deriving organizations. A framework is also proposed for the identification and systematic evaluation of OEL resources, which occupational hygienists can use to support risk characterization and risk management decisions in situations where multiple potentially relevant OELs exist.
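
    When several organizations publish different OELs for the same chemical, a common conservative starting point is the most protective (lowest) value; the article's framework then weighs each organization's methodology and data basis before settling on a limit. A minimal sketch of that first step, with hypothetical organizations and values in mg/m³:

```python
def most_protective_oel(oels):
    """Return the organization and value of the lowest published OEL.

    `oels` maps an organization name to its published limit for one
    chemical, all in the same units (here mg/m^3).  Picking the lowest
    value is a conservative default, not a substitute for evaluating
    each organization's derivation methodology.
    """
    org = min(oels, key=oels.get)
    return org, oels[org]

# Hypothetical limits from three OEL-deriving organizations:
org, value = most_protective_oel({"Org A": 5.0, "Org B": 1.0, "Org C": 2.5})
```

    A fuller implementation of the proposed framework would also record each limit's basis (health effect, averaging time, year of derivation) before comparison.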

  5. Clinical significance of anismus in encopresis.

    Science.gov (United States)

    Catto-Smith, A G; Nolan, T M; Coffey, C M

    1998-09-01

    Treatments designed to relieve paradoxical contraction of the anal sphincters during defecation (anismus) have had limited success in children with encopresis. This has raised doubts as to the clinical relevance of this diagnosis in childhood as anorectal dysfunction. Our aim was to determine whether, in patients who had treatment-resistant encopresis, the presence of electromyographic anismus was associated with increased faecal retention. Sixty-eight children with soiling (mean age 8.7+/-2.06 years) were assessed by clinical examination, abdominal radiography and then with anorectal manometry. Patients with electromyographic anismus (n=32; 47%) had significantly increased radiographic rectal faecal retention and were significantly less likely to be able to defecate water-filled balloons. There were no significant differences in response to prior therapy, history of primary encopresis, behavioural adjustment or in sociodemographic data. Our results suggest that electromyographic anismus is associated with obstructed defecation and faecal retention.

  6. Risk-based Regulatory Evaluation Program methodology

    International Nuclear Information System (INIS)

    DuCharme, A.R.; Sanders, G.A.; Carlson, D.D.; Asselin, S.V.

    1987-01-01

    The objectives of this DOE-supported Regulatory Evaluation Program are to analyze and evaluate the safety importance and economic significance of existing regulatory guidance in order to assist in the improvement of the regulatory process for current generation and future design reactors. A risk-based cost-benefit methodology was developed to evaluate the safety benefit and cost of specific regulations or Standard Review Plan sections. Risk-based methods can be used in lieu of or in combination with deterministic methods in developing regulatory requirements and reaching regulatory decisions

  7. METHODOLOGY OF PROFESSIONAL PEDAGOGICAL EDUCATION: THEORY AND PRACTICE (theoretical and methodological foundations of vocational teacher education

    Directory of Open Access Journals (Sweden)

    Evgeny M. Dorozhkin

    2014-01-01

    methodology taking into consideration the target orientation, principles and approaches to the organization and its methods of scientific and educational activities implementation. The qualification structure formation of the teachers’ vocational training and providing advance principles of education are considered to be the most important conditions for the development of vocational teacher education. Scientific novelty. The research demonstrates creating the project of further vocational teacher education development in the post-industrial society. The pedagogical innovations transforming research findings into educational practice are considered to be the main tool of integration methodology means. Practical significance. The research findings highlight the proposed reforms for further development of the teacher training system of vocational institutes, which are in need of drastic restructuring. In the final part of the article the authors recommend some specific issues that can be discussed at the methodological workshop.

  8. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1994-03-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (October - December 1993) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  9. Synthetic definition of biological significance

    International Nuclear Information System (INIS)

    Buffington, J.D.

    1975-01-01

    The central theme of the workshop is recounted and the views of the authors are summarized. Areas of broad agreement or disagreement, unifying principles, and research needs are identified. Authors' views are consolidated into concepts that have practical utility for the scientist making impact assessments. The need for decision-makers and managers to be cognizant of the recommendations made herein is discussed. Finally, bringing together the diverse views of the workshop participants, a conceptual definition of biological significance is synthesized

  10. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1992-11-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (July - September 1992) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  11. Environmental impact statement analysis: dose methodology

    International Nuclear Information System (INIS)

    Mueller, M.A.; Strenge, D.L.; Napier, B.A.

    1981-01-01

    Standardized sections and methodologies are being developed for use in environmental impact statements (EIS) for activities to be conducted on the Hanford Reservation. Five areas for standardization have been identified: routine operations dose methodologies, accident dose methodology, Hanford Site description, health effects methodology, and socioeconomic environment for Hanford waste management activities

  12. A Critique of Methodological Dualism in Education

    Science.gov (United States)

    Yang, Jeong A.; Yoo, Jae-Bong

    2018-01-01

    This paper aims to critically examine the paradigm of methodological dualism and explore whether methodologies in social science currently are appropriate for educational research. There are two primary methodologies for studying education: quantitative and qualitative methods. This is what we mean by "methodological dualism". Is…

  13. Feminist Methodologies and Engineering Education Research

    Science.gov (United States)

    Beddoes, Kacey

    2013-01-01

    This paper introduces feminist methodologies in the context of engineering education research. It builds upon other recent methodology articles in engineering education journals and presents feminist research methodologies as a concrete engineering education setting in which to explore the connections between epistemology, methodology and theory.…

  14. A Methodological Inter-Comparison of Gridded Meteorological Products

    Science.gov (United States)

    Newman, A. J.; Clark, M. P.; Longman, R. J.; Giambelluca, T. W.; Arnold, J.

    2017-12-01

    Here we present a gridded meteorology inter-comparison using the state of Hawai'i as a testbed. This inter-comparison is motivated by two general goals: 1) the broad user community of gridded observation-based meteorological fields should be aware of inter-product differences and the reasons they exist, which allows users to make informed choices on product selection to best meet their specific application(s); 2) we want to demonstrate the utility of inter-comparisons to meet the first goal, yet highlight that they are largely limited to generic statements regarding attribution of differences, which limits our understanding of these complex algorithms and obscures future research directions. Hawai'i is a useful testbed because it is a meteorologically complex region with well-known spatial features that are tied to specific physical processes (e.g. the trade wind inversion). From a practical standpoint, there are now several monthly climatological and daily precipitation and temperature datasets available that are being used for impact modeling. General conclusions that have emerged are: 1) differences in input station data significantly influence product differences; 2) prediction of precipitation occurrence is crucial across multiple metrics; 3) derived temperature statistics (e.g. diurnal temperature range) may have large spatial differences across products; and 4) attribution of differences to methodological choices is difficult and may limit the outcomes of these inter-comparisons, particularly from a development viewpoint. Thus, we want to continue to move the community towards frameworks that allow for multiple options throughout the product generation chain and allow for more systematic testing.
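
    The kind of derived-statistic comparison mentioned above (e.g. diurnal temperature range across products) can be sketched with a toy calculation. All grids, values, and function names below are illustrative and not taken from any actual product:

```python
# Toy illustration of a gridded-product inter-comparison metric:
# given daily Tmin/Tmax grids from two hypothetical products, compute the
# diurnal temperature range (DTR) per cell and the mean absolute
# inter-product DTR difference. All values are synthetic.

def dtr(tmax_grid, tmin_grid):
    # Diurnal temperature range per grid cell: Tmax - Tmin.
    return [[hi - lo for hi, lo in zip(row_hi, row_lo)]
            for row_hi, row_lo in zip(tmax_grid, tmin_grid)]

def mean_abs_diff(grid_a, grid_b):
    # Mean absolute difference between two grids of equal shape.
    diffs = [abs(a - b)
             for row_a, row_b in zip(grid_a, grid_b)
             for a, b in zip(row_a, row_b)]
    return sum(diffs) / len(diffs)

# Product A (e.g. interpolated from a dense station network)
a_tmax = [[30.0, 28.5], [27.0, 25.5]]
a_tmin = [[21.0, 20.0], [19.5, 18.0]]
# Product B (e.g. built from fewer stations; values differ slightly)
b_tmax = [[29.0, 28.0], [26.5, 25.0]]
b_tmin = [[21.5, 20.5], [19.0, 18.5]]

dtr_a = dtr(a_tmax, a_tmin)  # [[9.0, 8.5], [7.5, 7.5]]
dtr_b = dtr(b_tmax, b_tmin)  # [[7.5, 7.5], [7.5, 6.5]]
print(mean_abs_diff(dtr_a, dtr_b))  # → 0.875
```

    Even identical interpolation code yields different DTR fields when the input station sets differ, which is the first of the general conclusions listed in the abstract.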

  15. Social cognition interventions for people with schizophrenia: a systematic review focussing on methodological quality and intervention modality.

    Science.gov (United States)

    Grant, Nina; Lawrence, Megan; Preti, Antonio; Wykes, Til; Cella, Matteo

    2017-08-01

    People with a diagnosis of schizophrenia have significant social and functional difficulties. Social cognition has been found to influence these outcomes, and in recent years interventions targeting this domain have been developed. This paper reviews the existing literature on social cognition interventions for people with a diagnosis of schizophrenia, focussing on: i) comparing focussed (i.e. targeting only one social cognitive domain) and global interventions and ii) the studies' methodological quality. A systematic search was conducted on PubMed and PsycInfo. Studies were included if they were randomised controlled trials, participants had a diagnosis of schizophrenia or schizoaffective disorder, and the intervention targeted at least one of four social cognition domains (i.e. theory of mind, affect recognition, social perception and attribution bias). All papers were assessed for methodological quality. Information on the intervention, control condition, study methodology and the main findings from each study was extracted and critically summarised. Data from 32 studies fulfilled the inclusion criteria, considering a total of 1440 participants. Taking part in social cognition interventions produced significant improvements in theory of mind and affect recognition compared to both passive and active control conditions. Results were less clear for social perception and attributional bias. Focussed and global interventions had similar effects on outcomes. Overall study methodological quality was modest. There was very limited evidence that social cognition interventions result in functional outcome improvement. The evidence considered suggests that social cognition interventions may be a valuable approach for people with a diagnosis of schizophrenia. However, evidence quality is limited by measure heterogeneity, modest study methodology and short follow-up periods. The findings point to a number of recommendations for future research, including measurement standardisation.

  16. Methodologies for Social Life Cycle Assessment

    DEFF Research Database (Denmark)

    Jørgensen, Andreas; Le Bocq, Agathe; Nazakina, Liudmila

    2008-01-01

    Goal, Scope and Background. In recent years several different approaches towards Social Life Cycle Assessment (SLCA) have been developed. The purpose of this review is to compare these approaches in order to highlight methodological differences and general shortcomings. SLCA has several similarities with other social assessment tools, but in order to limit the review, only approaches that claim to address social impacts from an LCA-like framework are considered. Main Features. The review is to a large extent based on conference proceedings and reports, of which some are not easily accessible, since very… stage in the product life cycle. Another very important difference among the proposals is their position towards the use of generic data. Several of the proposals argue that social impacts are connected to the conduct of the company, leading to the conclusion that each individual company in the product…

  17. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1993-09-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (April--June 1993) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  18. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1990-05-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (January--March 1990) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. Also included are a number of enforcement actions that had been previously resolved but not published in this NUREG. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  19. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1991-05-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (January--March 1991) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  20. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1991-02-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (October--December 1990) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  1. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1990-03-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (October--December 1989) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  2. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1990-11-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (July--September 1990) and includes copies of letters, notices, and orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  3. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1992-08-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (April--June 1992) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  4. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1990-09-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (April--June 1990) and includes copies of letters, notices, and orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  5. Clinical significance of neonatal menstruation.

    Science.gov (United States)

    Brosens, Ivo; Benagiano, Giuseppe

    2016-01-01

    Past studies have clearly shown the existence of a spectrum of endometrial progesterone responses in neonatal endometrium, varying from proliferation to full decidualization with menstrual-like shedding. As in adult menstruation, the bleeding represents progesterone withdrawal bleeding. Today, the bleeding is completely neglected and considered an uneventful episode of no clinical significance. Yet clinical studies have linked the risk of bleeding to a series of events indicating fetal distress. The potential link between the progesterone response and major adolescent disorders needs to be investigated in prospective studies. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  6. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1993-06-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (January--March 1993) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  7. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1992-05-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (January--March 1992) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  8. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1993-12-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (July--September 1993) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  9. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1993-03-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (October--December 1992) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  10. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1991-07-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (April-June 1991) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  11. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1991-11-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (July--September 1991) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  12. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1992-03-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (October--December 1991) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  13. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1989-06-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (January--March 1989) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. Also included are a number of enforcement actions that had been previously resolved but not published in this NUREG. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  14. Enforcement actions: Significant actions resolved

    International Nuclear Information System (INIS)

    1989-12-01

    This compilation summarizes significant enforcement actions that have been resolved during one quarterly period (July--September 1989) and includes copies of letters, Notices, and Orders sent by the Nuclear Regulatory Commission to licensees with respect to these enforcement actions. It is anticipated that the information in this publication will be widely disseminated to managers and employees engaged in activities licensed by the NRC, so that actions can be taken to improve safety by avoiding future violations similar to those described in this publication

  15. Economic evaluation in stratified medicine: methodological issues and challenges

    Directory of Open Access Journals (Sweden)

    Hans-Joerg Fugel

    2016-05-01

    Full Text Available Background: Stratified Medicine (SM) is becoming a practical reality with the targeting of medicines by using a biomarker or genetic-based diagnostic to identify the eligible patient sub-population. Like any healthcare intervention, SM interventions have costs and consequences that must be considered by reimbursement authorities with limited resources. Methodological standards and guidelines exist for economic evaluations in clinical pharmacology and are an important component of health technology assessments (HTAs) in many countries. However, these guidelines were initially developed for traditional pharmaceuticals and not for complex interventions with multiple components. This raises the issue of whether these guidelines are adequate for SM interventions or whether new specific guidance and methodology is needed to avoid inconsistencies and contradictory findings when assessing economic value in SM. Objective: This article describes specific methodological challenges when conducting health economic (HE) evaluations for SM interventions and outlines potential modifications necessary to existing evaluation guidelines/principles that would promote consistent economic evaluations for SM. Results/Conclusions: Specific methodological aspects for SM comprise considerations on the choice of comparator, measuring effectiveness and outcomes, appropriate modelling structure and the scope of sensitivity analyses. Although current HE methodology can be applied for SM, greater complexity requires further methodology development and modifications in the guidelines.

  16. A methodology to measure the degree of managerial innovation

    Directory of Open Access Journals (Sweden)

    Mustafa Batuhan Ayhan

    2014-01-01

    Full Text Available Purpose: The main objective of this study is to introduce the concept of managerial innovation and to propose a quantitative methodology to measure the degree of managerial innovation capability by analyzing the evolution of the techniques used for management functions. Design/methodology/approach: The methodology mainly focuses on the different techniques used for each of the management functions, namely: planning, organizing, leading, controlling and coordinating. These functions are studied and the different techniques used for them are listed. Since the techniques used for these management functions evolve in time due to technological and social changes, a methodology is required to measure the degree of managerial innovation capability. This competency is measured through an analysis performed to point out which techniques are used for each of these functions. Findings: To check the validity and applicability of this methodology, it is implemented in a manufacturing company. Depending on the results of the implementation, enhancements are suggested to the company for each function to survive in the changing managerial conditions. Research limitations/implications: The primary limitation of this study is the implementation area. Although the study is implemented in just a single manufacturing company, the same methodology can be applied to measure the managerial innovation capabilities of other manufacturing companies. Moreover, the model is ready to be adapted to different sectors, although it is mainly prepared for the manufacturing sector. Originality/value: Although innovation management is widely studied, managerial innovation is a new concept, introduced to measure the capability to challenge the changes that occur in managerial functions. In brief, this methodology aims to be a pioneer in the field of managerial innovation regarding the evolution of management functions. Therefore it is expected to lead more studies to inspect the progress of…

  17. Statistical significance versus clinical relevance.

    Science.gov (United States)

    van Rijn, Marieke H C; Bech, Anneke; Bouyer, Jean; van den Brand, Jan A J G

    2017-04-01

    In March this year, the American Statistical Association (ASA) posted a statement on the correct use of P-values, in response to a growing concern that the P-value is commonly misused and misinterpreted. We aim to translate these warnings given by the ASA into a language more easily understood by clinicians and researchers without a deep background in statistics. Moreover, we intend to illustrate the limitations of P-values, even when used and interpreted correctly, and bring more attention to the clinical relevance of study findings using two recently reported studies as examples. We argue that P-values are often misinterpreted. A common mistake is saying that P < 0.05 means that the null hypothesis is false, and P ≥ 0.05 means that the null hypothesis is true. The correct interpretation of a P-value of 0.05 is that if the null hypothesis were indeed true, a similar or more extreme result would occur 5% of the time upon repeating the study in a similar sample. In other words, the P-value informs about the likelihood of the data given the null hypothesis and not the other way around. A possible alternative related to the P-value is the confidence interval (CI). It provides more information on the magnitude of an effect and the imprecision with which that effect was estimated. However, there is no magic bullet to replace P-values and stop erroneous interpretation of scientific results. Scientists and readers alike should make themselves familiar with the correct, nuanced interpretation of statistical tests, P-values and CIs. © The Author 2017. Published by Oxford University Press on behalf of ERA-EDTA. All rights reserved.
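
    The interpretation stated in this abstract, that under a true null hypothesis a result crossing the 5% significance threshold occurs about 5% of the time on repetition, can be checked with a small simulation. This is an illustrative sketch, not part of the reviewed paper:

```python
import math
import random

random.seed(42)

def null_rejection_fraction(n=50, reps=20000, crit_z=1.96):
    # Simulate repeated studies under a TRUE null (mean 0, sd 1) and count
    # how often the z statistic exceeds the two-sided 5% critical value.
    hits = 0
    for _ in range(reps):
        xs = [random.gauss(0.0, 1.0) for _ in range(n)]
        sample_mean = sum(xs) / n
        z = sample_mean / (1.0 / math.sqrt(n))  # standard error is 1/sqrt(n)
        if abs(z) >= crit_z:
            hits += 1
    return hits / reps

frac = null_rejection_fraction()
print(round(frac, 3))  # close to 0.05, as the abstract's interpretation states
```

    The point of the abstract survives the simulation: a "significant" result tells you how surprising the data are under the null, not how probable the null is.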

  18. Moral significance of phenomenal consciousness.

    Science.gov (United States)

    Levy, Neil; Savulescu, Julian

    2009-01-01

    Recent work in neuroimaging suggests that some patients diagnosed as being in the persistent vegetative state are actually conscious. In this paper, we critically examine this new evidence. We argue that though it remains open to alternative interpretations, it strongly suggests the presence of consciousness in some patients. However, we argue that its ethical significance is less than many people seem to think. There are several different kinds of consciousness, and though all kinds of consciousness have some ethical significance, different kinds underwrite different kinds of moral value. Demonstrating that patients have phenomenal consciousness--conscious states with some kind of qualitative feel to them--shows that they are moral patients, whose welfare must be taken into consideration. But only if they are subjects of a sophisticated kind of access consciousness--where access consciousness entails global availability of information to cognitive systems--are they persons, in the technical sense of the word employed by philosophers. In this sense, being a person is having the full moral status of ordinary human beings. We call for further research which might settle whether patients who manifest signs of consciousness possess the sophisticated kind of access consciousness required for personhood.

  19. Clinical significance of the fabella

    International Nuclear Information System (INIS)

    Dodevski, A.; Lazarova-Tosovska, D.; Zhivadinovik, J.; Lazareska, M.

    2012-01-01

    Full text: Introduction: There is a variable number of sesamoid bones in the human body; one of them is the fabella, located in the tendon of the gastrocnemius muscle. The aim of this study was to investigate the frequency of occurrence of the fabella in the Macedonian population and to discuss the clinical importance of this bone. Materials and methods: We retrospectively examined radiographs of 53 patients who had knee exams undertaken for a variety of clinical reasons, performed as a part of their medical treatment. Over a time span of six months, 53 patients (38 males and 15 females, age range 19-60 years, mean age of 36.7±12.3 years) were examined. Results: In seven (13.2%) of the 53 patients, a fabella was found in the lateral tendon of the gastrocnemius muscle. We did not find a significant gender or side difference in the appearance of the fabella. Conclusion: Although anatomic studies emphasized a lack of significance of the fabella, this bone has been associated with a spectrum of pathology affecting the knee, such as fabellar syndrome, peroneal nerve injury and fracture. We should think of this sesamoid bone while performing diagnostic and surgical procedures

  20. Shuttle TPS thermal performance and analysis methodology

    Science.gov (United States)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  1. Loss Factor Characterization Methodology for Piezoelectric Ceramics

    International Nuclear Information System (INIS)

    Zhuang Yuan; Ural, Seyit O; Uchino, Kenji

    2011-01-01

    The key factor for the miniaturization of piezoelectric devices is power density, which is limited by the heat generation or loss mechanisms. There are three loss components for piezoelectric vibrators, i.e., dielectric, elastic and piezoelectric losses. The mechanical quality factor, determined by these three factors, is the figure of merit in the sense of loss or heat generation. In this paper, quality factors at resonance and antiresonance for the k31, k33, and k15 vibration modes are derived, and a methodology to determine the loss factors in various directions is provided. For simplicity, we focus on materials with ∞mm (equivalent to 6mm) crystal symmetry for deriving the loss factors of polycrystalline ceramics; 16 of the total 20 loss factors can be obtained from admittance/impedance measurements.

  2. Methodological guidelines for developing accident modification functions

    DEFF Research Database (Denmark)

    Elvik, Rune

    2015-01-01

    This paper proposes methodological guidelines for developing accident modification functions. An accident modification function is a mathematical function describing systematic variation in the effects of road safety measures. The paper describes ten guidelines. An example is given of how to use the guidelines. The importance of exploratory analysis and an iterative approach in developing accident modification functions is stressed. The example shows that strict compliance with all the guidelines may be difficult, but represents a level of stringency that should be strived for. Currently the main limitations in developing accident modification functions are the small number of good evaluation studies and the often huge variation in estimates of effect. It is therefore still not possible to develop accident modification functions for very many road safety measures. © 2015 Elsevier Ltd. All rights reserved.

  3. Development of a Long Term Cooling Analysis Methodology Using RELAP5

    International Nuclear Information System (INIS)

    Lee, S. I.; Jeong, J. H.; Ban, C. H.; Oh, S. J.

    2012-01-01

    Since the revision of 10CFR50.46 in 1988, which allowed the BE (Best-Estimate) method in analyzing the safety performance of a nuclear power plant, safety analysis methodologies have changed continuously from conservative EM (Evaluation Model) approaches to BE ones. In this context, LSC (Long-Term core Cooling) methodologies have been reviewed by the regulatory bodies of the USA and Korea. Some non-conservatisms and improprieties of the old methodology were identified, and as a result, the USNRC suspended the approval of CENPD-254-P-A, the old LSC methodology for CE-designed NPPs. The regulatory bodies requested that the non-conservatisms be removed and that system transient behaviors be reflected in all the LSC methodologies used. In the present study, a new LSC methodology using RELAP5 is developed. RELAP5 and a newly developed code, BACON (Boric Acid Concentration Of Nuclear power plant), are used to calculate the transient behavior of the system and the boric acid concentration, respectively. The full range of the break spectrum is considered and the applicability is confirmed through plant demonstration calculations. The results compare well with those of the old methodology; therefore, the new methodology could be applied with no significant changes to current LSC plans

  4. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principle and the practical difficulties of the methodologies of uncertainty propagation calculation, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: variability (uncertainty due to heterogeneity) and lack of knowledge (uncertainty due to ignorance). It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example which he generalised, treating the variability uncertainty by probability theory and the lack-of-knowledge uncertainty by fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiable and illegitimately precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability and lack of knowledge increased as the problem became more complex in terms of number of parameters or time steps, and that it was necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory
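
    The distinction drawn here can be illustrated with a minimal sketch: variability is propagated by Monte Carlo sampling, while lack of knowledge about a parameter is propagated as an interval, used below as a simple stand-in for a fuzzy-set treatment. The model and all numbers are hypothetical:

```python
import random

random.seed(1)

def model(x, k):
    # Hypothetical model output depending on a variable input x
    # and a poorly known parameter k.
    return k * x * x

# Variability: x fluctuates between realisations -> propagate by
# Monte Carlo sampling from its probability distribution.
xs = [random.gauss(10.0, 1.0) for _ in range(10000)]

# Lack of knowledge: k is only known to lie in [0.8, 1.2] ->
# propagate the whole interval, not a single "best" value.
k_lo, k_hi = 0.8, 1.2

outputs_lo = [model(x, k_lo) for x in xs]
outputs_hi = [model(x, k_hi) for x in xs]

mean_lo = sum(outputs_lo) / len(outputs_lo)
mean_hi = sum(outputs_hi) / len(outputs_hi)

# The result is an interval of mean outputs rather than one number:
# forcing k into a single distribution would hide this ignorance
# behind an unjustifiably precise answer.
print(mean_lo, mean_hi)
```

    The spread between `mean_lo` and `mean_hi` reflects ignorance about `k`; the Monte Carlo scatter within each run reflects the genuine variability of `x`, which is the separation the talk argues for.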

  5. Multicriteria methodology for decision aiding

    CERN Document Server

    Roy, Bernard

    1996-01-01

    This is the first comprehensive book to present, in English, the multicriteria methodology for decision aiding. In the foreword, the distinctive features and main ideas of the European School of MCDA are outlined. The twelve chapters are essentially expository in nature, but scholarly in treatment. Some questions which are too often neglected in the literature on decision theory, such as how a decision is made, who the actors are, what a decision aiding model is, and how to define the set of alternatives, are discussed. Examples are used throughout the book to illustrate the various concepts. Ways to model the consequences of each alternative and to build criteria taking into account the inevitable imprecisions, uncertainties and indeterminations are described and illustrated. The three classical operational approaches of MCDA: synthesis in one criterion (including MAUT), synthesis by outranking relations, and interactive local judgements, are studied. This methodology tries to be a theoretical or intellectual framework dire...

  6. AGR core safety assessment methodologies

    International Nuclear Information System (INIS)

    McLachlan, N.; Reed, J.; Metcalfe, M.P.

    1996-01-01

    To demonstrate the safety of its gas-cooled graphite-moderated AGR reactors, nuclear safety assessments of the cores are based upon a methodology which demonstrates no component failures, geometrical stability of the structure and material properties bounded by a database. All AGRs continue to meet these three criteria. However, predictions of future core behaviour indicate that the safety case methodology will eventually need to be modified to deal with new phenomena. A new approach to the safety assessment of the cores is currently under development, which can take account of these factors while at the same time providing the same level of protection for the cores. This approach will be based on the functionality of the core: unhindered movement of control rods, continued adequate cooling of the fuel and the core, continued ability to charge and discharge fuel. (author). 5 figs

  7. Methodological update in Medicina Intensiva.

    Science.gov (United States)

    García Garmendia, J L

    2018-04-01

    Research in the critically ill is complex, owing to the heterogeneity of patients, the difficulty of achieving representative sample sizes, and the number of variables simultaneously involved. However, the quantity and quality of records are high, as is the relevance of the variables used, such as survival. Methodological tools have evolved to offer new perspectives and analysis models that allow relevant information to be extracted from the data that accompany the critically ill patient. The need for training in methodology and in the interpretation of results is an important challenge for intensivists who wish to stay current with research developments and clinical advances in Intensive Medicine. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.

  8. Methodology for building confidence measures

    Science.gov (United States)

    Bramson, Aaron L.

    2004-04-01

    This paper presents a generalized methodology for propagating known or estimated levels of individual source-document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements in determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and in the variety of documents collected affect the probability of a correct situational detection helps optimize the overall performance of the tool.
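    Steps (i) and (iii) above can be illustrated with a toy calculation. This is my own sketch, not the paper's actual formulas: assuming independent sources, per-element reliabilities are fused with a noisy-OR, and element confidences are then rolled up into a whole-output confidence by importance weighting.

```python
def combined_confidence(reliabilities):
    # Noisy-OR fusion: the probability that at least one of several
    # independent sources correctly supports the element.
    p_all_wrong = 1.0
    for r in reliabilities:
        p_all_wrong *= (1.0 - r)
    return 1.0 - p_all_wrong

def output_confidence(elements):
    # elements: list of (confidence, importance_weight) pairs.
    # An importance-weighted average gives a confidence for the whole output.
    total_w = sum(w for _, w in elements)
    return sum(c * w for c, w in elements) / total_w

# Two sources of the same element with reliabilities 0.6 and 0.5:
c = combined_confidence([0.6, 0.5])          # 1 - 0.4 * 0.5 = 0.8
overall = output_confidence([(c, 2.0), (0.5, 1.0)])
```

The independence assumption is the weak point of noisy-OR fusion; correlated sources (e.g. documents quoting each other) would require a joint model rather than a simple product.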

  9. Mo(ve)ment-methodology

    DEFF Research Database (Denmark)

    Mørck, Line Lerche; Christian Celosse-Andersen, Martin

    2018-01-01

    This paper describes the theoretical basis for and development of a moment-movement research methodology, based on an integration of critical psychological practice research and critical ethnographic social practice theory. Central theoretical conceptualizations, such as human agency, life conditions and identity formation, are discussed in relation to criminological theories of gang desistance. The paper illustrates how the mo(ve)ment methodology was applied in a study of comprehensive processes of identity (re)formation and gang exit; the study was conducted with Martin, a former gang member. ... This is a moment which captures Martin's complex and ambiguous feelings of conflictual concerns, frustration, anger, and a new feeling of insecurity in his masculinity, as well as engagement and a sense of deep meaningfulness as he becomes a more reflective academic. All these conflicting feelings also give...

  10. Design methodology of Dutch banknotes

    Science.gov (United States)

    de Heij, Hans A. M.

    2000-04-01

    Since the introduction of a design methodology for Dutch banknotes, the quality of Dutch paper currency has improved in more than one way. The methodology in question provides for (i) a design policy, which helps fix clear objectives; (ii) design management, to ensure smooth cooperation between the graphic designer, printer, papermaker and central bank; and (iii) a program of requirements, a banknote development guideline for all parties involved. This systematic approach enables an objective selection of design proposals, including security features. Furthermore, the project manager obtains regular feedback from the public by conducting market surveys. Each new design of a Netherlands Guilder banknote issued by the Nederlandsche Bank over the past 50 years has been an improvement on its predecessor in terms of value recognition, security and durability.

  11. Workshops as a Research Methodology

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Levinsen, Karin Tweddell

    2017-01-01

    This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on the latter, this paper presents five studies on upper secondary and higher education teachers' professional development and on teaching and learning through video conferencing. Through analysis and discussion of these studies' findings, we argue that workshops provide a platform that can aid researchers in identifying and exploring relevant factors in a given domain by providing means for understanding complex work and knowledge processes that are supported by technology (for example, e-learning). The approach supports identifying factors...

  12. Methodological challenges and lessons learned

    DEFF Research Database (Denmark)

    Nielsen, Poul Erik; Gustafsson, Jessica

    2017-01-01

    Taking three recently conducted empirical studies as its point of departure, the aim of this article is to theoretically and empirically discuss methodological challenges in studying the interrelations between media and social reality, and to critically reflect on the methodologies used in the studies. By deconstructing the studies, the article draws attention to the fact that different methods are able to grasp different elements of social reality. Moreover, by analysing the power relations at play, the article demonstrates that the interplay between interviewer and interviewee, and how both parties fit into present power structures, greatly influences the narratives that are co-produced during interviews. The article thus concludes that in order to fully understand complex phenomena it is not enough to use a mixture of methods; the makeup of the research team is also imperative, as a diverse team...

  13. A methodological approach to designing sewer system control

    DEFF Research Database (Denmark)

    Mollerup, Ane Loft

The motivation for this thesis was therefore the wish for a methodological approach to sewer system control design. Using a case study, the following research hypothesis was tested in this thesis: using classical and modern control theory, a methodological approach can be derived for designing sewer system control. This can aid... This was not unexpected, since the true potential of optimisation arises when a system has many control loops with limiting constraints and/or changing prioritisation between them. The results showed that for small sewer systems, where complexity is limited, it is not necessarily the best option to implement... generate control systems of the future that are more robust, more structured, have a better performance and are easier to maintain.

  14. The significance of small streams

    Science.gov (United States)

    Wohl, Ellen

    2017-09-01

    Headwaters, defined here as first- and second-order streams, make up 70%‒80% of the total channel length of river networks. These small streams exert a critical influence on downstream portions of the river network by: retaining or transmitting sediment and nutrients; providing habitat and refuge for diverse aquatic and riparian organisms; creating migration corridors; and governing connectivity at the watershed scale. The upstream-most extent of the channel network and the longitudinal continuity and lateral extent of headwaters can be difficult to delineate, however, and people are less likely to recognize the importance of headwaters relative to other portions of a river network. Consequently, headwaters commonly lack the legal protections accorded to other portions of a river network and are more likely to be significantly altered or completely obliterated by land use.

  15. Ritual Significance in Mycenaean Hairstyles

    Directory of Open Access Journals (Sweden)

    Hsu, Florence Sheng-chieh

    2012-04-01

    Although the frescoes excavated from Bronze Age sites on the Greek mainland provide evidence for female figures in Mycenaean society, the hairstyles of these figures have not been studied in detail. As in many other ancient cultures, hairstyles were not only an exhibition of beauty and fashion; they also represented certain age groups or a person’s social status. The Mycenaeans inherited many of their hairstyles from their Minoan predecessors, although differences existed as well. There may also have been a shift in meaning for seemingly similar-looking hairstyles from the Minoan to the Mycenaean periods. Female figures, which compose most of the Mycenaean figures in frescoes known to date, are fine examples for discussing the artistic representation and potential significance of Mycenaean hairstyles. Through comparison with Minoan hairstyles, discussion of Mycenaean examples leads to conclusions about the relationship between hairstyles and ritual activities in Mycenaean society.

  16. ISE System Development Methodology Manual

    Energy Technology Data Exchange (ETDEWEB)

    Hayhoe, G.F.

    1992-02-17

    The Information Systems Engineering (ISE) System Development Methodology Manual (SDM) is a framework of life cycle management guidelines that provide ISE personnel with direction, organization, consistency, and improved communication when developing and maintaining systems. These guidelines were designed to allow ISE to build and deliver Total Quality products, and to meet the goals and requirements of the US Department of Energy (DOE), Westinghouse Savannah River Company, and Westinghouse Electric Corporation.

  17. Environmental Testing Methodology in Biometrics

    OpenAIRE

    Fernández Saavedra, Belén; Sánchez Reíllo, Raúl; Alonso Moreno, Raúl; Miguel Hurtado, Óscar

    2010-01-01

    8 pages document + 5-slide presentation.-- Contributed to: 1st International Biometric Performance Conference (IBPC 2010, NIST, Gaithersburg, MD, US, Mar 1-5, 2010). Recently, biometrics is used in many security systems and these systems can be located in different environments. As many experts claim and previous works have demonstrated, environmental conditions influence biometric performance. Nevertheless, there is not a specific methodology for testing this influence at the moment...

  18. Systems engineering agile design methodologies

    CERN Document Server

    Crowder, James A

    2013-01-01

    This book examines the paradigm of the engineering design process. The authors discuss agile systems and engineering design. The book captures the entire design process (function bases), context, and requirements to effect real reuse. It provides a methodology for an engineering design process foundation for modern and future systems design. This book captures design patterns with context for actual Systems Engineering Design Reuse and contains a new paradigm in Design Knowledge Management.

  19. Developing Foucault's Discourse Analytic Methodology

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2006-01-01

    A methodological position for a FOUCAULTian discourse analysis is presented. A sequence of analytical steps is introduced and an illustrating example is offered. It is emphasized that discourse analysis has to discover the system level of discursive rules and the deeper structure of the discursive formation; otherwise the analysis will be unfinished. Michel FOUCAULT's work is theoretically grounded in French structuralism and (so-called) post-structuralism. In this paper, post-structuralism is not conceived as a means of overcoming structuralism, but as a way of critically continuing the structural perspective. In this way, discursive structures can be related to discursive practices and the concept of structure can be disclosed (e.g. to inter-discourse or DERRIDA's concept of structurality). In this way, the structural methodology is continued and radicalized, but not given up. In this paper, FOUCAULT's theory is combined with the works of Michel PÊCHEUX and (especially for the sociology of knowledge and the sociology of culture) Pierre BOURDIEU. The practice of discourse analysis is theoretically grounded; this practice can be conceived as a reflexive coupling of deconstruction and reconstruction in the material to be analyzed. This methodology can therefore be characterized as a reconstructive qualitative methodology. At the end of the article, forms of discourse analysis are criticized that do not intend to recover the system level of discursive rules and do not intend to discover the deeper structure of the discursive formation (i.e. episteme, socio-episteme). These forms are merely commentaries on discourses (not analyses of them); they remain phenomenological and are therefore pre-structuralist. URN: urn:nbn:de:0114-fqs060168

  20. Preliminary disposal limits, plume interaction factors, and final disposal limits

    Energy Technology Data Exchange (ETDEWEB)

    Flach, G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2018-01-11

    In the 2008 E-Area Performance Assessment (PA), each final disposal limit was constructed as the product of a preliminary disposal limit and a plume interaction factor. The following mathematical development demonstrates that performance objectives are generally expected to be satisfied with high confidence under practical PA scenarios using this method. However, radionuclides that experience significant decay between a disposal unit and the 100-meter boundary, such as H-3 and Sr-90, can challenge performance objectives, depending on the disposed-of waste composition, facility geometry, and the significance of the plume interaction factor. Pros and cons of analyzing single disposal units or multiple disposal units as a group in the preliminary disposal limits analysis are also identified.
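    The construction described above, and the decay issue raised for short-lived nuclides, reduce to simple arithmetic. The sketch below is illustrative only; the function names and the bare exponential-decay treatment are my assumptions, not the PA's actual transport model:

```python
import math

def final_disposal_limit(preliminary_limit, plume_interaction_factor):
    # Final limit constructed as in the 2008 PA description: the product of
    # the preliminary (single-unit) disposal limit and a plume interaction
    # factor that discounts for overlapping plumes from neighbouring units.
    return preliminary_limit * plume_interaction_factor

def decay_factor(half_life_years, travel_time_years):
    # Fraction of activity remaining after transport to the assessment
    # boundary. Short-lived nuclides such as H-3 (half-life ~12.3 y) decay
    # substantially in transit, which is why they can challenge the simple
    # product construction.
    return math.exp(-math.log(2.0) * travel_time_years / half_life_years)

limit = final_disposal_limit(100.0, 0.8)   # hypothetical numbers
remaining = decay_factor(12.3, 12.3)       # one half-life spent in transit
```

With hypothetical inputs, a preliminary limit of 100 units and a plume factor of 0.8 yield a final limit of 80, while a nuclide spending one half-life in transit arrives with half its activity.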

  1. The energetic significance of cooking.

    Science.gov (United States)

    Carmody, Rachel N; Wrangham, Richard W

    2009-10-01

    While cooking has long been argued to improve the diet, the nature of the improvement has not been well defined. As a result, the evolutionary significance of cooking has variously been proposed as being substantial or relatively trivial. In this paper, we evaluate the hypothesis that an important and consistent effect of cooking food is a rise in its net energy value. The pathways by which cooking influences net energy value differ for starch, protein, and lipid, and we therefore consider plant and animal foods separately. Evidence of compromised physiological performance among individuals on raw diets supports the hypothesis that cooked diets tend to provide energy. Mechanisms contributing to energy being gained from cooking include increased digestibility of starch and protein, reduced costs of digestion for cooked versus raw meat, and reduced energetic costs of detoxification and defence against pathogens. If cooking consistently improves the energetic value of foods through such mechanisms, its evolutionary impact depends partly on the relative energetic benefits of non-thermal processing methods used prior to cooking. We suggest that if non-thermal processing methods such as pounding were used by Lower Palaeolithic Homo, they likely provided an important increase in energy gain over unprocessed raw diets. However, cooking has critical effects not easily achievable by non-thermal processing, including the relatively complete gelatinisation of starch, efficient denaturing of proteins, and killing of food borne pathogens. This means that however sophisticated the non-thermal processing methods were, cooking would have conferred incremental energetic benefits. While much remains to be discovered, we conclude that the adoption of cooking would have led to an important rise in energy availability. For this reason, we predict that cooking had substantial evolutionary significance.

  2. Neurophysiological advantages of biorelevant methodology of teaching academic disciplines

    Directory of Open Access Journals (Sweden)

    Natalia A. Davidovskaya

    2017-01-01

    The analysis of the technology of teaching a biorelevant lesson showed that when a traditional (sinistrocerebral, i.e. left-hemisphere) methodology of teaching is used, a contradiction arises between the social and biological aspects of the brain's functional system. This impels teachers to search for new methods of teaching on the basis of human reserves that have not previously been involved. Personal development and education, as processes of forming higher mental functions, are considered in the article as complex forms of conscious activity, regulated by corresponding aims and programs. It is shown that, for comprehensive perception and fixing of received information in memory, it is important that the neural connections of the brain be maximally activated both vertically (subcortex-cortex) and horizontally (left and right hemispheres). In this connection, the brain is considered as a complex metasystem consisting of macro- and microsystems incorporated into a multilevel organization with multiple horizontal and vertical relations. In such a system, the mode of perceiving, processing and retaining information is most fully engaged under conditions of research activity and corresponds most closely to a person's instinct of self-preservation. The author argues that the sinistrocerebral methodology of teaching, by dividing reason and feelings, leads to the "robotization" of the individual and disconnects learning from long-term memory, purposefulness and natural instincts. Further use of the sinistrocerebral methodology of teaching under conditions of the computerization of society threatens the succeeding generations with degradation. The traditional method of teaching violates the genetic sequence of perceiving information and results in a functional disconnection in integrative brain activity, forming a "tunnel of reality" limited by short-term memory on the one hand and by blinkered vision on the other, which worsens the quality of life and psychological...

  3. Energy Efficiency Indicators Methodology Booklet

    Energy Technology Data Exchange (ETDEWEB)

    Sathaye, Jayant; Price, Lynn; McNeil, Michael; de la rue du Can, Stephane

    2010-05-01

    This Methodology Booklet provides a comprehensive review and methodological guiding principles for constructing energy efficiency indicators, with illustrative examples of application to individual countries. It reviews work done by international agencies and national governments in constructing meaningful energy efficiency indicators that help policy makers to assess changes in energy efficiency over time. Building on past OECD experience and best practices, and the knowledge of these countries' institutions, relevant sources of information to construct an energy indicator database are identified. A framework based on levels of hierarchy of indicators -- spanning from aggregate, macro-level to disaggregated end-use-level metrics -- is presented to help shape the understanding of assessing energy efficiency. In each sector of activity (industry, commercial, residential, agriculture and transport), indicators are presented and recommendations to distinguish the different factors affecting energy use are highlighted. The Methodology Booklet specifically addresses issues that are relevant to developing indicators where activity is a major factor driving energy demand. A companion spreadsheet tool is available upon request.

  4. Methodology for ranking restoration options

    International Nuclear Information System (INIS)

    Hedemann Jensen, Per

    1999-04-01

    The work described in this report has been performed as a part of the RESTRAT Project FI4P-CT95-0021a (PL 950128) co-funded by the Nuclear Fission Safety Programme of the European Commission. The RESTRAT project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated with radioactive materials as a result of the operation of these installations. The areas considered for remedial measures include contaminated land areas, rivers and sediments in rivers, lakes, and sea areas. Five contaminated European sites have been studied. Various remedial measures have been envisaged with respect to the optimisation of the protection of the populations being exposed to the radionuclides at the sites. Cost-benefit analysis and multi-attribute utility analysis have been applied for optimisation. Health, economic and social attributes have been included and weighting factors for the different attributes have been determined by the use of scaling constants. (au)

  5. [Progress in methodological characteristics of clinical practice guideline for osteoarthritis].

    Science.gov (United States)

    Xing, D; Wang, B; Lin, J H

    2017-06-01

    At present, several clinical practice guidelines for the treatment of osteoarthritis have been developed by institutes and societies. The ultimate purpose of developing clinical practice guidelines is to standardize the process of treating osteoarthritis effectively. However, the methodologies used in developing clinical practice guidelines may influence their translation and application in treating osteoarthritis. The present study summarizes the methodological features of individual clinical practice guidelines and presents tools for the quality evaluation of clinical practice guidelines. The limitations of the current osteoarthritis guidelines of China are also indicated. This review article may help relevant institutions improve the quality of guideline development and clinical translation.

  6. Topics in expert system design methodologies and tools

    CERN Document Server

    Tasso, C

    1989-01-01

    Expert systems are so far the most promising achievement of artificial intelligence research. Decision making, planning, design, control, supervision and diagnosis are areas where they are showing great potential. However, the establishment of expert system technology and its actual industrial impact are still limited by the lack of a sound, general and reliable design and construction methodology. This book has a dual purpose: to offer concrete guidelines and tools to the designers of expert systems, and to promote basic and applied research on methodologies and tools. It is a coordinated coll...

  7. Significance and popularity in music production

    Science.gov (United States)

    Gravino, Pietro; Servedio, Vito D. P.; Tria, Francesca; Loreto, Vittorio

    2017-01-01

    Creative industries constantly strive for fame and popularity. Though highly desirable, popularity is not the only achievement artistic creations might ever acquire. Leaving a longstanding mark on global production and influencing future works is an even more important achievement, usually acknowledged by experts and scholars. ‘Significant’ or ‘influential’ works are not always well known to the public and have sometimes been long forgotten by the vast majority. In this paper, we focus on the duality between what is successful and what is significant in the musical context. To this end, we consider a user-generated set of tags collected through an online music platform, whose evolving co-occurrence network mirrors the growing conceptual space underlying music production. We define a set of general metrics aiming at characterizing music albums throughout history, and their relationships with the overall musical production. We show how these metrics allow us to classify albums according to their current popularity or their belonging to expert-made lists of important albums. In this way, we provide the scientific community and the public at large with quantitative tools to tell popular albums apart from culturally or aesthetically relevant artworks. The generality of the methodology presented here lends itself to use in all those fields where innovation and creativity are in play. PMID:28791169

  8. Robust test limits

    NARCIS (Netherlands)

    Albers, Willem/Wim; Kallenberg, W.C.M.; Otten, G.D.

    1997-01-01

    Because of inaccuracies of the measurement process inspection of manufactured parts requires test limits which are more strict than the given specification limits. Test limits derived under the assumption of normality for product characteristics turn out to violate the prescribed bound on the
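    A common way to derive such tightened test limits is guard banding, sketched here under an assumed normal measurement-error model. This is a textbook construction for illustration, not necessarily the estimator the authors robustify:

```python
from statistics import NormalDist

def guarded_test_limits(lsl, usl, sigma_m, alpha=0.01):
    # Tighten the specification limits [lsl, usl] by a guard band
    # k * sigma_m, where sigma_m is the measurement-error standard
    # deviation: a part whose true value lies outside specification
    # then has probability <= alpha of being measured inside the
    # (stricter) test limits.
    k = NormalDist().inv_cdf(1.0 - alpha)  # one-sided normal quantile
    return lsl + k * sigma_m, usl - k * sigma_m

# Spec limits [0, 10], measurement noise sigma_m = 0.5, alpha = 1%:
t_lo, t_hi = guarded_test_limits(0.0, 10.0, sigma_m=0.5, alpha=0.01)
```

The abstract's point is that deriving k under a normality assumption that does not hold can leave the prescribed bound violated, which motivates robust alternatives.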

  9. Methodology of external exposure calculation for reuse of conditional released materials from decommissioning - 59138

    International Nuclear Information System (INIS)

    Ondra, Frantisek; Vasko, Marek; Necas, Vladimir

    2012-01-01

    The article presents a methodology of external exposure calculation for the reuse of conditionally released materials from decommissioning, using the VISIPLAN 3D ALARA planning tool. The production of rails is used as an example application of the proposed methodology within the CONRELMAT project. The article presents a methodology for determining the radiological, material, organizational and other conditions for the reuse of conditionally released materials, to ensure that worker and public exposure does not breach the exposure limits during the scenario's life cycle (preparation, construction and operation). The methodology comprises a proposal of the following conditions with respect to worker and public exposure: - limiting radionuclide concentrations of conditionally released materials for specific scenarios and nuclide vectors; - specific deployment of conditionally released materials, and possibly shielding materials, workers and the public during the scenario's life cycle; - organizational measures concerning the time workers or the public spend in the vicinity of conditionally released materials for the individual scenarios and nuclide vectors. The above steps of the proposed methodology have been applied within the CONRELMAT project. The exposure evaluation of workers for rail production is introduced in the article as an example of this application. The exposure calculation using the VISIPLAN 3D ALARA planning tool was performed for several models, and the most exposed profession for the scenario was identified. On the basis of this result, a more than twofold increase of the radionuclide concentration in conditionally released material, to 681 Bq/kg, was proposed with no additional safety or organizational measures applied. After application of the proposed safety and organizational measures (additional shielding, geometry changes and limitation of work duration) it is possible to increase the radionuclide concentration in conditionally released material more than tenfold, to 3092 Bq/kg.

  10. Affordance of Braille Music as a Mediational Means: Significance and Limitations

    Science.gov (United States)

    Park, Hyu-Yong; Kim, Mi-Jung

    2014-01-01

    Affordance refers to the properties or design of a thing that offer its function. This paper discusses the affordance of Braille music in terms of three notions: mediational means, mastery and appropriation, and focuses on answering the following three questions: (i) How do musicians with visual impairments (MVI) perceive Braille…

  11. Significance of the new ICRP dose limits in the Indian context

    International Nuclear Information System (INIS)

    Mehta, S.K.

    1993-01-01

    The ICRP estimates the risk quantities using the primary risk coefficients from the results of Japanese survivor studies (with DDREF of 2) along with the all-causes mortality and survival probabilities of the Swedish population. In the present work, risk quantities have been computed using the ICRP estimates of the attributable conditional cancer death probability rates for different exposure levels along with the survival probabilities of the Indian population from the official Indian life tables. For this purpose the parameters of the latest Indian life tables are extrapolated beyond the highest tabulated age of 70 years by 'logit transformation' using the parameters of the complete Indian life table to age 100 years for the period 1951-60 as standard. The results of the present work show that the Indian and the Swedish-ICRP risk quantity estimates are consistent as a function of the life expectancies of the populations and that the Swedish-ICRP risk quantity estimates contain safety factors of about 2 in the Indian context. (author)

  12. A Limited Structural Modification Results in a Significantly More Efficacious Diazachrysene-Based Filovirus Inhibitor

    Directory of Open Access Journals (Sweden)

    Rekha G. Panchal

    2012-08-01

    Ebola (EBOV) and Marburg (MARV) filoviruses are highly infectious pathogens causing deadly hemorrhagic fever in humans and non-human primates. Promising vaccine candidates providing immunity against filoviruses have been reported. However, the sporadic nature and swift progression of filovirus disease underline the need for the development of small-molecule therapeutics providing immediate antiviral effects. Herein we describe a brief structural exploration of two previously reported diazachrysene (DAAC)-based EBOV inhibitors. Specifically, three analogs were prepared to examine how slight substituent modifications would affect inhibitory efficacy and inhibitor-mediated toxicity during not only EBOV, but also MARV cellular infection. Of the three analogs, one was highly efficacious, providing IC50 values of 0.696 µM ± 0.13 µM and 2.76 µM ± 0.21 µM against EBOV and MARV infection, respectively, with little or no associated cellular toxicity. Overall, the structure-activity and structure-toxicity results from this study provide a framework for the future development of DAAC-based filovirus inhibitors that will be both active and non-toxic in vivo.

  13. The Atterberg limits and their significance in the ceramic and brick industries

    Directory of Open Access Journals (Sweden)

    Sembenelli, P.

    1966-12-01

    Full Text Available Not available. The Atterberg consistency limits provide the elements for a rigorous classification of clays and for the evaluation of many of their properties. They can be usefully employed either to undertake a rigorous study of the materials destined for the ceramic and brick industries, or to design the production plants, complementing or replacing some criteria still in use.

  14. Prioritization methodology for chemical replacement

    Science.gov (United States)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States of America federal legislation has required ozone depleting chemicals (class 1 & 2) to be banned from production, The National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated as a development of a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a GUIDELINE to help direct the research for replacement technology. The approach for prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used in order to determine numerical values which would correspond to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research for chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). 
The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology

  15. Significant biases affecting abundance determinations

    Science.gov (United States)

    Wesson, Roger

    2015-08-01

    I have developed two highly efficient codes to automate analyses of emission line nebulae. The tools place particular emphasis on the propagation of uncertainties. The first tool, ALFA, uses a genetic algorithm to rapidly optimise the parameters of Gaussian fits to line profiles. It can fit emission line spectra of arbitrary resolution, wavelength range and depth, with no user input at all. It is well suited to highly multiplexed spectroscopy such as that now being carried out with instruments such as MUSE at the VLT. The second tool, NEAT, carries out a full analysis of emission line fluxes, robustly propagating uncertainties using a Monte Carlo technique. Using these tools, I have found that considerable biases can be introduced into abundance determinations if the uncertainty distribution of emission lines is not well characterised. For weak lines, normally distributed uncertainties are generally assumed, though it is incorrect to do so, and significant biases can result. I discuss observational evidence of these biases. The two new codes contain routines to correctly characterise the probability distributions, giving more reliable results in analyses of emission line nebulae.
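    The Monte Carlo propagation step described above can be illustrated with a minimal sketch (the flux values, errors and function name below are invented for illustration, not taken from NEAT): each measured flux is resampled from its assumed error distribution, the derived quantity is recomputed for every draw, and the resulting distribution is summarized directly rather than assumed to be Gaussian.

```python
import random
import statistics

def propagate_ratio(flux_a, err_a, flux_b, err_b, n=20000, seed=1):
    """Monte Carlo propagation of flux uncertainties into a line ratio."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        a = rng.gauss(flux_a, err_a)
        b = rng.gauss(flux_b, err_b)
        if b > 0:  # discard unphysical draws of the denominator
            samples.append(a / b)
    return statistics.median(samples), statistics.stdev(samples)

# A weak line (S/N ~ 3) divided by a strong reference line:
ratio, spread = propagate_ratio(3.0, 1.0, 100.0, 2.0)
```

    For weak lines the sampled ratio distribution is visibly asymmetric, which is exactly why summarizing it with a symmetric Gaussian error can bias downstream abundance determinations.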

  16. Astrobiological significance of chemolithoautotrophic acidophiles

    Science.gov (United States)

    Pikuta, Elena V.; Hoover, Richard B.

    2004-02-01

    For more than a century (since Winogradsky discovered lithoautotrophic bacteria) there has been a dilemma in microbiology about the life that first inhabited the Earth. Which types of life forms first appeared in the primordial oceans during the earliest geological period on Earth as the primary ancestors of modern biological diversity? How did the metabolism of these ancestors evolve: from lithoautotrophic to lithoheterotrophic and organoheterotrophic, or from organoheterotrophic to organoautotrophic and lithomixotrophic types? At the present time, it is known that chemolithoheterotrophic and chemolithoautotrophic metabolizing bacteria are widespread in different ecosystems. On Earth, acidic ecosystems are associated with geysers, volcanic fumaroles, hot springs, deep sea hydrothermal vents, caves, acid mine drainage and other technogenic ecosystems. Bioleaching played a significant role on a global geological scale during the Earth's formation. This important feature of bacteria has been successfully applied in industry. The lithoautotrophs include Bacteria and Archaea belonging to diverse genera containing thermophilic and mesophilic species. In this paper we discuss the lithotrophic microbial acidophiles and present some data with a description of a new acidophilic iron- and sulfur-oxidizing bacterium isolated from the Chena Hot Springs in Alaska. We also consider the possible relevance of microbial acidophiles to Venus, Io, and acidic inclusions in glaciers and icy moons.

  17. Determining Semantically Related Significant Genes.

    Science.gov (United States)

    Taha, Kamal

    2014-01-01

    A GO relation embodies some aspects of existence dependency. If GO term x is existence-dependent on GO term y, the presence of y implies the presence of x. Therefore, the genes annotated with the function of the GO term y are usually functionally and semantically related to the genes annotated with the function of the GO term x. A large number of gene set enrichment analysis methods have been developed in recent years for analyzing gene sets enrichment. However, most of these methods overlook the structural dependencies between GO terms in the GO graph by not considering the concept of existence dependency. We propose in this paper a biological search engine called RSGSearch that identifies enriched sets of genes annotated with different functions using the concept of existence dependency. We observe that GO term x cannot be existence-dependent on GO term y if x and y have the same specificity (biological characteristics). After encoding into a numeric format the contributions of GO terms annotating target genes to the semantics of their lowest common ancestors (LCAs), RSGSearch uses the microarray experiment to identify the most significant LCA that annotates the result genes. We evaluated RSGSearch experimentally and compared it with five gene set enrichment systems. Results showed marked improvement.
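    The LCA computation at the heart of this approach can be sketched on a toy GO-like DAG (the terms, edges and depths below are hypothetical, and real GO processing is far more involved than this):

```python
def ancestors(term, parents):
    """All ancestors of a term in a DAG given child -> parents edges."""
    seen, stack = set(), [term]
    while stack:
        for p in parents.get(stack.pop(), []):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def lowest_common_ancestors(a, b, parents, depth):
    """Deepest terms that are ancestors of (or equal to) both a and b."""
    common = (ancestors(a, parents) | {a}) & (ancestors(b, parents) | {b})
    max_depth = max(depth[t] for t in common)
    return {t for t in common if depth[t] == max_depth}

# Toy hierarchy: terms x and y share the parent term m
parents = {"x": ["m"], "y": ["m"], "m": ["root"]}
depth = {"root": 0, "m": 1, "x": 2, "y": 2}
```

    On this toy graph, the LCA of x and y is m, the most specific term whose presence both imply.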

  18. Statistically significant relational data mining :

    Energy Technology Data Exchange (ETDEWEB)

    Berry, Jonathan W.; Leung, Vitus Joseph; Phillips, Cynthia Ann; Pinar, Ali; Robinson, David Gerald; Berger-Wolf, Tanya; Bhowmick, Sanjukta; Casleton, Emily; Kaiser, Mark; Nordman, Daniel J.; Wilson, Alyson G.

    2014-02-01

    This report summarizes the work performed under the project "Statistically significant relational data mining." The goal of the project was to add more statistical rigor to the fairly ad hoc area of data mining on graphs. Our goal was to develop better algorithms and better ways to evaluate algorithm quality. We concentrated on algorithms for community detection, approximate pattern matching, and graph similarity measures. Approximate pattern matching involves finding an instance of a relatively small pattern, expressed with tolerance, in a large graph of data observed with uncertainty. This report gathers the abstracts and references for the eight refereed publications that have appeared as part of this work. We then archive three pieces of research that have not yet been published. The first is theoretical and experimental evidence that a popular statistical measure for comparison of community assignments favors over-resolved communities over approximations to a ground truth. The second are statistically motivated methods for measuring the quality of an approximate match of a small pattern in a large graph. The third is a new probabilistic random graph model. Statisticians favor these models for graph analysis. The new local structure graph model overcomes some of the issues with popular models such as exponential random graph models and latent variable models.

  19. Statistical Significance for Hierarchical Clustering

    Science.gov (United States)

    Kimes, Patrick K.; Liu, Yufeng; Hayes, D. Neil; Marron, J. S.

    2017-01-01

    Summary Cluster analysis has proved to be an invaluable tool for the exploratory and unsupervised analysis of high dimensional datasets. Among methods for clustering, hierarchical approaches have enjoyed substantial popularity in genomics and other fields for their ability to simultaneously uncover multiple layers of clustering structure. A critical and challenging question in cluster analysis is whether the identified clusters represent important underlying structure or are artifacts of natural sampling variation. Few approaches have been proposed for addressing this problem in the context of hierarchical clustering, for which the problem is further complicated by the natural tree structure of the partition, and the multiplicity of tests required to parse the layers of nested clusters. In this paper, we propose a Monte Carlo based approach for testing statistical significance in hierarchical clustering which addresses these issues. The approach is implemented as a sequential testing procedure guaranteeing control of the family-wise error rate. Theoretical justification is provided for our approach, and its power to detect true clustering structure is illustrated through several simulation studies and applications to two cancer gene expression datasets. PMID:28099990
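    The core idea, testing an observed clustering strength against datasets simulated under a null hypothesis of no clustering, can be sketched in a few lines (the 1-D split index and the Gaussian null below are simplifications chosen for illustration, not the paper's exact procedure):

```python
import random
import statistics

def two_means_index(xs):
    """Cluster index for the best split of sorted 1-D data: ratio of
    within-cluster to total sum of squares (lower = stronger clustering)."""
    xs = sorted(xs)
    total = sum((x - statistics.fmean(xs)) ** 2 for x in xs)
    best = total
    for k in range(1, len(xs)):
        left, right = xs[:k], xs[k:]
        wss = sum((x - statistics.fmean(left)) ** 2 for x in left) \
            + sum((x - statistics.fmean(right)) ** 2 for x in right)
        best = min(best, wss)
    return best / total

def monte_carlo_pvalue(xs, n_null=200, seed=7):
    """Compare the observed index against Gaussian null datasets of the same size."""
    rng = random.Random(seed)
    mu, sd = statistics.fmean(xs), statistics.stdev(xs)
    obs = two_means_index(xs)
    null = [two_means_index([rng.gauss(mu, sd) for _ in xs]) for _ in range(n_null)]
    return sum(1 for v in null if v <= obs) / n_null

data = [0.1, 0.2, 0.15, 0.05, 5.0, 5.1, 4.9, 5.2]  # two obvious groups
p = monte_carlo_pvalue(data)
```

    The paper's sequential testing procedure applies this kind of comparison at each node of the dendrogram while controlling the family-wise error rate, which the sketch above does not attempt.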

  20. Adaptability of laser diffraction measurement technique in soil physics methodology

    Science.gov (United States)

    Barna, Gyöngyi; Szabó, József; Rajkai, Kálmán; Bakacsi, Zsófia; Koós, Sándor; László, Péter; Hauk, Gabriella; Makó, András

    2016-04-01

    There are intentions all around the world to harmonize soils' particle size distribution (PSD) data obtained by laser diffractometer measurements (LDM) with those of the sedimentation techniques (pipette or hydrometer methods). Unfortunately, depending on the applied methodology (e.g. type of pre-treatment, kind of dispersant, etc.), PSDs from the sedimentation methods (due to different standards) are dissimilar and can hardly be harmonized with each other, either. A need therefore arose to build up a database containing PSD values measured by the pipette method according to the Hungarian standard (MSZ-08. 0205: 1978) and by LDM according to a widespread and widely used procedure. In this publication the first results of a statistical analysis of the new and growing PSD database are presented: 204 soil samples measured with the pipette method and LDM (Malvern Mastersizer 2000, HydroG dispersion unit) were compared. Applying the usual size limits in LDM, the clay fraction was highly underestimated and the silt fraction overestimated compared to the pipette method. Consequently, soil texture classes determined from the LDM measurements differ significantly from the results of the pipette method. Based on previous surveys, and to optimize the agreement between the two datasets, the clay/silt boundary used in LDM was changed. Comparing the PSD results of the pipette method with those of LDM, the modified size limits gave higher similarities for the clay and silt fractions. Extending the upper size limit of the clay fraction from 0.002 to 0.0066 mm, and thus changing the lower size limit of the silt fraction, makes the pipette method and LDM more easily comparable. With the modified limit, higher correlations were also found between clay content and water vapor adsorption and specific surface area. Texture classes were also found to be less dissimilar.
The difference between the results of the two kinds of PSD measurement methods could be further reduced by taking into account other routinely analyzed soil parameters
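    The effect of moving the clay/silt boundary can be reproduced with a small sketch (the cumulative PSD values below are invented for illustration): the fraction finer than a given size is read off the cumulative curve by linear interpolation, once with the conventional 0.002 mm limit and once with the modified 0.0066 mm limit.

```python
def fraction_below(cum_psd, limit):
    """Percent of particles finer than `limit` (mm), linearly interpolating
    the cumulative PSD between tabulated sizes."""
    pts = sorted(cum_psd.items())
    for (d0, p0), (d1, p1) in zip(pts, pts[1:]):
        if d0 <= limit <= d1:
            return p0 + (p1 - p0) * (limit - d0) / (d1 - d0)
    raise ValueError("limit outside tabulated range")

# Illustrative cumulative PSD from an LDM run: size (mm) -> % finer
cum_psd = {0.0005: 4.0, 0.002: 12.0, 0.0066: 24.0,
           0.02: 45.0, 0.063: 70.0, 2.0: 100.0}

clay_standard = fraction_below(cum_psd, 0.002)   # conventional 2 um limit
clay_modified = fraction_below(cum_psd, 0.0066)  # modified 6.6 um limit
```

    Raising the boundary moves part of the fine silt into the clay fraction, which is how the modified limit compensates for LDM's underestimation of clay relative to the pipette method.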

  1. LANSCE beam current limiter

    International Nuclear Information System (INIS)

    Gallegos, F.R.

    1996-01-01

    The Radiation Security System (RSS) at the Los Alamos Neutron Science Center (LANSCE) provides personnel protection from prompt radiation due to accelerated beam. Active instrumentation, such as the Beam Current Limiter, is a component of the RSS. The current limiter is designed to limit the average current in a beam line below a specific level, thus minimizing the maximum current available for a beam spill accident. The beam current limiter is a self-contained, electrically isolated toroidal beam transformer which continuously monitors beam current. It is designed as fail-safe instrumentation. The design philosophy, hardware design, operation, and limitations of the device are described
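    The limiter's role, tripping when the average beam current over a monitoring window exceeds a set level, and failing safe when the signal is lost, can be sketched as follows (the threshold, window size and class name are illustrative, not LANSCE values):

```python
from collections import deque

class CurrentLimiter:
    """Fail-safe style average-current monitor (illustrative only):
    trips when the rolling average exceeds the limit, and treats a
    missing reading as a fault (fail-safe default)."""
    def __init__(self, limit_ua, window):
        self.limit = limit_ua
        self.readings = deque(maxlen=window)
        self.tripped = False

    def update(self, reading_ua):
        if reading_ua is None:          # lost signal -> fail safe
            self.tripped = True
        else:
            self.readings.append(reading_ua)
            avg = sum(self.readings) / len(self.readings)
            if avg > self.limit:        # average current over the window
                self.tripped = True
        return self.tripped
```

    Latching the trip (it never clears itself) mirrors the fail-safe design philosophy described in the abstract: once a limit violation or instrument fault is seen, beam permission stays withdrawn until deliberately reset.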

  2. The fractional scaling methodology (FSM) Part 1. methodology development

    International Nuclear Information System (INIS)

    Novak Zuber; Ivan Catton; Upendra S Rohatgi; Wolfgang Wulff

    2005-01-01

    Full text of publication follows: A quantitative methodology is developed, based on the concepts of hierarchy and synthesis, to integrate and organize information and data. The methodology uses scaling to synthesize experimental data and analytical results, and to provide quantitative criteria for evaluating the effects of various design and operating parameters that influence processes in a complex system such as a nuclear power plant or a related test facility. Synthesis and scaling are performed on three hierarchical levels: the process, component and system levels. Scaling on the process level determines the effect of a selected process on a particular state variable during a selected scenario. At the component level this scaling determines the effects various processes have on a state variable, and it ranks the processes according to their importance by the magnitude of the fractional change they cause on that state variable. At the system level the scaling determines the governing processes and corresponding components, ranking these in the order of importance according to their effect on the fractional change of system-wide state variables. The scaling methodology reveals on all levels the fractional change of state variables and is therefore called the Fractional Scaling Methodology (FSM). FSM synthesizes process parameters and assigns to each thermohydraulic process a dimensionless effect metric Ω = ωt, the product of the specific rate of fractional change ω and the characteristic time t. The rate of fractional change ω is the ratio of the process transport rate over the content of a preserved quantity in a component. The effect metric Ω quantifies the contribution of the process to the fractional change of a state variable in a given component. Ordering the component effect metrics provides the hierarchy of processes in a component, then across all components and the system. FSM separates quantitatively dominant from minor processes and components and
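    The ranking step can be made concrete with a short sketch (the process names, transport rates and contents below are hypothetical): each process gets an effect metric Ω = ωt with ω = transport rate / content, and sorting by Ω separates dominant from minor processes.

```python
def effect_metric(transport_rate, content, char_time):
    """FSM effect metric: Omega = omega * t, with omega = rate / content."""
    omega = transport_rate / content
    return omega * char_time

# Hypothetical processes in one component: (name, transport rate, content)
processes = [("break flow", 50.0, 1000.0),
             ("wall heat", 5.0, 1000.0),
             ("pump leak", 0.5, 1000.0)]
t_char = 20.0  # characteristic time of the scenario (illustrative)

ranked = sorted(((name, effect_metric(rate, content, t_char))
                 for name, rate, content in processes),
                key=lambda x: x[1], reverse=True)
```

    Here the break flow dominates (Ω = 1.0), the wall heat is an order of magnitude weaker, and the pump leak is negligible, which is exactly the kind of quantitative hierarchy FSM is meant to expose.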

  3. Application of extended statistical combination of uncertainties methodology for digital nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    In, Wang Ki; Uh, Keun Sun; Chul, Kim Heui [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-02-01

    A technically more direct statistical combination of uncertainties methodology, extended SCU (XSCU), was applied to statistically combine the uncertainties associated with the DNBR alarm setpoint and the DNBR trip setpoint of digital nuclear power plants. The modified SCU (MSCU) methodology is currently used as the USNRC-approved design methodology to perform the same function. In this report, the MSCU and XSCU methodologies were compared in terms of the total uncertainties and the net margins to the DNBR alarm and trip setpoints. The MSCU methodology resulted in small total penalties due to significantly negative biases, which are quite large. The XSCU methodology, however, gave virtually unbiased total uncertainties. The net margins to the DNBR alarm and trip setpoints by the MSCU methodology agree with those by the XSCU methodology within statistical variations. (Author) 12 refs., 17 figs., 5 tabs.
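    The basic contrast between deterministic stacking and statistical combination of independent uncertainties can be sketched as follows (the component values are illustrative, and this root-sum-square form is only the generic idea behind SCU-type methods, not the licensed MSCU/XSCU algorithms):

```python
import math

def deterministic_sum(uncertainties):
    """Worst-case stacking: add all penalties linearly."""
    return sum(abs(u) for u in uncertainties)

def statistical_combination(uncertainties, bias=0.0):
    """Generic statistical combination (a sketch, not the licensed method):
    independent random uncertainties combine in quadrature, and any
    known bias is added back deterministically."""
    return math.sqrt(sum(u * u for u in uncertainties)) + bias

unc = [0.03, 0.04, 0.12]  # illustrative DNBR uncertainty components
```

    Statistically combined uncertainties are always smaller than the linear stack for more than one component, which is why the choice of combination method (and of how biases are treated) directly affects the net margin to the setpoint.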

  4. Global conservation significance of Ecuador's Yasuní National Park.

    Directory of Open Access Journals (Sweden)

    Margot S Bass

    Full Text Available BACKGROUND: The threats facing Ecuador's Yasuní National Park are emblematic of those confronting the greater western Amazon, one of the world's last high-biodiversity wilderness areas. Notably, the country's second largest untapped oil reserves--called "ITT"--lie beneath an intact, remote section of the park. The conservation significance of Yasuní may weigh heavily in upcoming state-level and international decisions, including whether to develop the oil or invest in alternatives. METHODOLOGY/PRINCIPAL FINDINGS: We conducted the first comprehensive synthesis of biodiversity data for Yasuní. Mapping amphibian, bird, mammal, and plant distributions, we found eastern Ecuador and northern Peru to be the only regions in South America where species richness centers for all four taxonomic groups overlap. This quadruple richness center has only one viable strict protected area (IUCN levels I-IV): Yasuní. The park covers just 14% of the quadruple richness center's area, whereas active or proposed oil concessions cover 79%. Using field inventory data, we compared Yasuní's local (alpha) and landscape (gamma) diversity to other sites, in the western Amazon and globally. These analyses further suggest that Yasuní is among the most biodiverse places on Earth, with apparent world richness records for amphibians, reptiles, bats, and trees. Yasuní also protects a considerable number of threatened species and regional endemics. CONCLUSIONS/SIGNIFICANCE: Yasuní has outstanding global conservation significance due to its extraordinary biodiversity and potential to sustain this biodiversity in the long term because of its (1) large size and wilderness character, (2) intact large-vertebrate assemblage, (3) IUCN level-II protection status in a region lacking other strict protected areas, and (4) likelihood of maintaining wet, rainforest conditions while anticipated climate change-induced drought intensifies in the eastern Amazon. However, further oil development in

  5. Conducting Research with LGB People of Color: Methodological Challenges and Strategies

    Science.gov (United States)

    DeBlaere, Cirleen; Brewster, Melanie E.; Sarkees, Anthony; Moradi, Bonnie

    2010-01-01

    Methodological barriers have been highlighted as a primary reason for the limited research with lesbian, gay, and bisexual (LGB) people of color. Thus, strategies for anticipating and addressing potential methodological barriers are needed. To address this need, this article discusses potential challenges associated with conducting research with…

  6. Risk control and the minimum significant risk

    International Nuclear Information System (INIS)

    Seiler, F.A.; Alvarez, J.L.

    1996-01-01

    Risk management implies that the risk manager can, by his actions, exercise at least a modicum of control over the risk in question. In the terminology of control theory, a management action is a control signal imposed as feedback on the system to bring about a desired change in the state of the system. In the terminology of risk management, an action is taken to bring a predicted risk to lower values. Even if it is assumed that the management action taken is 100% effective and that the projected risk reduction is infinitely well known, there is a lower limit to the desired effects that can be achieved. It is based on the fact that all risks, such as the incidence of cancer, exhibit a degree of variability due to a number of extraneous factors such as age at exposure, sex, location, and some lifestyle parameters such as smoking or the consumption of alcohol. If the control signal is much smaller than the variability of the risk, the signal is lost in the noise and control is lost. This defines a minimum controllable risk based on the variability of the risk over the population considered. This quantity is the counterpart of the minimum significant risk which is defined by the uncertainties of the risk model. Both the minimum controllable risk and the minimum significant risk are evaluated for radiation carcinogenesis and are shown to be of the same order of magnitude. For a realistic management action, the assumptions of perfectly effective action and perfect model prediction made above have to be dropped, resulting in an effective minimum controllable risk which is determined by both risk limits. Any action below that effective limit is futile, but it is also unethical due to the ethical requirement of doing more good than harm. Finally, some implications of the effective minimum controllable risk on the use of the ALARA principle and on the evaluation of remedial action goals are presented

  7. What value, detection limits

    International Nuclear Information System (INIS)

    Currie, L.A.

    1986-01-01

    Specific approaches and applications of LLD's to nuclear and ''nuclear-related'' measurements are presented in connection with work undertaken for the U.S. Nuclear Regulatory Commission and the International Atomic Energy Agency. In this work, special attention was given to assumptions and potential error sources, as well as to different types of analysis. For the former, the authors considered random and systematic error associated with the blank and the calibration and sample preparation processes, as well as issues relating to the nature of the random error distributions. Analysis types considered included continuous monitoring, ''simple counting'' involving scalar quantities, and spectrum fitting involving data vectors. The investigation of data matrices and multivariate analysis is also described. The most important conclusions derived from this study are: that there is a significant lack of communication and compatibility resulting from diverse terminology and conceptual bases - including no-basis ''ad hoc'' definitions; that the distinction between detection decisions and detection limits is frequently lost sight of; and that quite erroneous LOD estimates follow from inadequate consideration of the actual variability of the blank, and systematic error associated with the blank, the calibration-recovery factor, matrix effects, and ''black box'' data reduction models
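    For the simplest Gaussian case, Currie's distinction between the detection decision and the detection limit reduces to two well-known expressions, sketched here (paired blank subtraction, 5% error rates; as the text stresses, real applications must also account for the blank's actual variability and for systematic errors, which this sketch ignores):

```python
import math

K_ALPHA = 1.645  # one-sided 5% false-positive rate

def critical_level(sigma_blank):
    """Currie's decision threshold L_C for a net signal obtained by
    paired blank subtraction: the null net signal has standard
    deviation sqrt(2) * sigma_blank, so L_C ~ 2.33 * sigma_blank."""
    return K_ALPHA * math.sqrt(2.0) * sigma_blank

def detection_limit(sigma_blank):
    """A-priori detection limit L_D = 2 * L_C for the symmetric case
    (5% false positives and 5% false negatives, constant sigma)."""
    return 2.0 * critical_level(sigma_blank)
```

    The point the abstract makes is precisely that L_C (used to decide "detected or not") and L_D (used to state what the procedure could detect) are frequently conflated, and that plugging in an unrealistically small blank variability makes both quantities erroneously optimistic.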

  8. A dynamic systems engineering methodology research study. Phase 2: Evaluating methodologies, tools, and techniques for applicability to NASA's systems projects

    Science.gov (United States)

    Paul, Arthur S.; Gill, Tepper L.; Maclin, Arlene P.

    1989-01-01

    A study of NASA's Systems Management Policy (SMP) concluded that the primary methodology being used by the Mission Operations and Data Systems Directorate and its subordinate, the Networks Division, is very effective. Still some unmet needs were identified. This study involved evaluating methodologies, tools, and techniques with the potential for resolving the previously identified deficiencies. Six preselected methodologies being used by other organizations with similar development problems were studied. The study revealed a wide range of significant differences in structure. Each system had some strengths but none will satisfy all of the needs of the Networks Division. Areas for improvement of the methodology being used by the Networks Division are listed with recommendations for specific action.

  9. 7 CFR 1400.204 - Limited partnerships, limited liability partnerships, limited liability companies, corporations...

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Limited partnerships, limited liability partnerships..., limited liability partnerships, limited liability companies, corporations, and other similar legal entities. (a) A limited partnership, limited liability partnership, limited liability company, corporation...

  10. Environmental impact assessment for energy pathways: an integrated methodology

    International Nuclear Information System (INIS)

    Sommereux-Blanc, Isabelle

    2010-01-01

    This document presents the synthesis of my research work contributing to the development of an integrated methodology of environmental impact assessment for energy pathways. In the context of world globalization, environmental impact assessments issues are highly linked with the following questioning: Which environmental impacts? for which demand? at which location? at which temporal scale? My work is built upon the definition of a conceptual framework able to handle these issues and upon its progressive implementation. The integration of the spatial and temporal issues within the methodology are key elements. Fundamental cornerstones of this framework are presented along the DPSIR concept (Driving forces, Pressures, State, Impacts, Responses). They cover a comprehensive analysis of the limits and the relevance of life cycle analysis and the development of a geo-spatialized environmental performance approach for an electrical production pathway. Perspectives linked with the development of this integrated methodology are detailed for energy pathways. (author)

  11. Core design methodology and software for Temelin NPP

    International Nuclear Information System (INIS)

    Havluj, F; Hejzlar, J.; Klouzal, J.; Stary, V.; Vocka, R.

    2011-01-01

    In the frame of the process of fuel vendor change at Temelin NPP in the Czech Republic, where, starting in 2010, TVEL TVSA-T fuel is loaded instead of Westinghouse VVANTAGE-6 fuel, new methodologies for core design and core reload safety evaluation have been developed. These documents are based on the methodologies delivered by TVEL within the fuel contract, and they were further adapted according to Temelin NPP operational needs and current practice at the NPP. Along with the methodology development, the 3D core analysis code ANDREA, licensed for core reload safety evaluation in 2010, has been upgraded in order to optimize the safety evaluation process. New sequences of calculations were implemented in order to simplify the evaluation of different limiting parameters, and output visualization tools were developed to make the verification process user-friendly. Interfaces to the fuel performance code TRANSURANUS and the sub-channel analysis code SUBCAL were developed as well. (authors)

  12. New quickest transient detection methodology. Nuclear engineering applications

    International Nuclear Information System (INIS)

    Wang, Xin; Jevremovic, Tatjana; Tsoukalas, Lefteri H.

    2003-01-01

    A new intelligent systems methodology for quickest online transient detection is presented. Based on information that includes, but is not limited to, statistical features, energy of frequency components and wavelet coefficients, the new methodology decides whether a transient has emerged. A fuzzy system makes the final decision, the membership functions of which are obtained by artificial neural networks and adjusted in an online manner. Comparisons are performed with conventional methods for transient detection using simulated and plant data. The proposed methodology could be useful in power plant operations, diagnostic and maintenance activities. It is also considered as a design tool for quick design modifications in a virtual design environment aimed at next generation University Research and Training Reactors (URTRs). (The virtual design environment is pursued as part of the Big-10 Consortium sponsored by the new Innovations in Nuclear Infrastructure and Education (INIE) program sponsored by the US Department of Energy.) (author)
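    A deliberately crude stand-in for the decision logic (a fixed sigma threshold on a rolling baseline, rather than the paper's fuzzy system with neural-network-adjusted membership functions; all numbers below are invented) shows the basic shape of online transient detection:

```python
import statistics

def detect_transient(signal, window=10, k=4.0):
    """Flag the first index where a reading departs from the rolling
    baseline by more than k standard deviations (a crude stand-in for
    the paper's fuzzy/wavelet decision logic)."""
    for i in range(window, len(signal)):
        base = signal[i - window:i]
        mu, sd = statistics.fmean(base), statistics.pstdev(base)
        if sd > 0 and abs(signal[i] - mu) > k * sd:
            return i
    return None

steady = [100.0, 100.2, 99.8, 100.1, 99.9] * 4   # steady-state readings
trace = steady + [108.0, 115.0]                   # step transient at index 20
```

    The "quickest detection" problem is then about triggering as early as possible on the real transient while keeping the false-alarm rate on steady-state noise acceptably low, which is what richer features (frequency-band energies, wavelet coefficients) and an adaptive fuzzy decision stage buy over a fixed threshold.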

  13. Revised INPRO Methodology in the Area of Proliferation Resistance

    International Nuclear Information System (INIS)

    Park, J.H.; Lee, Y.D.; Yang, M.S.; Kim, J.K.; Haas, E.; Depisch, F.

    2008-01-01

    The official INPRO User Manual in the area of proliferation resistance is being processed for the evaluation of innovative nuclear energy systems. Proliferation resistance is one of the goals to be satisfied for future nuclear energy systems in INPRO. The features of currently updated and released INPRO methodology were introduced on basic principles, user requirements and indicators. The criteria for an acceptance limit were specified. The DUPIC fuel cycle was evaluated based on the updated INPRO methodology for the applicability of the INPRO User Manual. However, the INPRO methodology has some difficulty in quantifying the multiplicity and robustness as well as the total cost to improve proliferation resistance. Moreover, the integration method for the evaluation results still needs to be improved.

  14. Experience feedback from incidents: methodological and cultural aspects

    International Nuclear Information System (INIS)

    Perinet, R.

    2007-01-01

    EdF has designed some provisions to improve the reliability of human interventions: an increased number of training simulators, management of the quality of interventions, implementation of human factor consultants on each site, improvement in user documentation, development of communication practices, etc. However, despite efforts made in the right direction, the complexity of human behaviour and organisations makes it obligatory to follow up the efficacy of these provisions over time in order to ensure that they produce the expected results on work practices. The in-depth analysis by IRSN of events that are significant for safety shows that experience feedback from incidents constitutes a real opportunity to ensure this follow-up. It also highlights the difficulty licensees have in defining the temporal scope of the investigations to be carried out, in analysing the errors committed in greater depth, and in identifying the ensuing problems. This article shows that these difficulties result from inappropriate methodologies and a lack of skills and availability to carry out the analysis. Finally, it shows that an incident leads to defensive behaviour among those participating in the system, which blocks the compilation of information and limits the relevance of analyses. (author)

  15. School resources and student achievement: worldwide findings and methodological issues

    Directory of Open Access Journals (Sweden)

    Paulo A. Meyer. M. Nascimento

    2008-03-01

    Full Text Available The issues raised in the Education Production Function literature since the US 1966 Coleman Report have fuelled high controversy over the role of school resources in student performance. In several literature reviews and some of his own estimates, Erik Hanushek (1986, 1997, 2006) systematically affirms that these two factors are not associated with one another, neither in the US nor abroad. In recent cross-country analyses, Ludger Woessmann (2003; 2005a; 2005b) links international differences in attainment to institutional differences across educational systems, not to resourcing levels. In the opposite direction, Stephen Heyneman and William Loxley (1982, 1983) tried to demonstrate in the 1980s that, at least for low-income countries, school factors seemed to outweigh family characteristics in the determination of students' outcomes, although other authors show evidence that such a phenomenon may have existed only during a limited period of the 20th century. In the 1990s, meta-analyses raised the argument that school resources were sufficiently significant to be regarded as pedagogically important. The turn of the century witnessed a new movement: the recognition that endogenous determination of resource allocation is a substantial methodological issue. Efforts have therefore been made to incorporate into economic models the decision-making processes that involve families, schools and policy-makers. This implies changes in research designs that may affect the direction of future policy advice patronised by international development and educational organisations.

  16. Radioisotope methodology course radioprotection aspects

    International Nuclear Information System (INIS)

    Bergoc, R.M.; Caro, R.A.; Menossi, C.A.

    1996-01-01

    The advancement of knowledge in molecular and cell biology, biochemistry, medicine and pharmacology during the last 50 years, since the end of World War II, is really outstanding. It can safely be said that this is principally due to the application of radioisotope techniques. Research on metabolism, biodistribution of pharmaceuticals, pharmacodynamics, etc., is mostly carried out by means of techniques employing radioactive materials. Radioisotopes and radiation are frequently used in medicine, both as diagnostic and as therapeutic tools. Radioimmunoassay is today a routine method in endocrinology and in general clinical medicine. Receptor determination and characterization is a steadily growing methodology used in clinical biochemistry, pharmacology and medicine. The use of radiopharmaceuticals and of radiation of different origins for therapeutic purposes should not be overlooked. For these reasons, the importance of teaching radioisotope methodology is steadily growing. This is principally the case at the post-graduate level, but it is also worthwhile to include some elementary theoretical and practical notions of the subject in the pre-graduate curriculum. These observations are justified by more than 30 years of teaching experience at both levels at the School of Pharmacy and Biochemistry of the University of Buenos Aires, Argentina. In 1960 we began to teach Physics III, an obligatory pre-graduate course for biochemistry students, in which some elementary notions of radioactivity and measurement techniques were given. Successive modifications of the biochemistry pre-graduate curriculum incorporated radiochemistry as an elective subject and, since 1978, radioisotope methodology as an obligatory subject for biochemistry students. 
This subject is given at the radioisotope laboratory during the first semester of each year and its objective is to provide theoretical and practical knowledge to the biochemistry students, even

  17. Flammability Assessment Methodology Program Phase I: Final Report

    Energy Technology Data Exchange (ETDEWEB)

    C. A. Loehr; S. M. Djordjevic; K. J. Liekhus; M. J. Connolly

    1997-09-01

    The Flammability Assessment Methodology Program (FAMP) was established to investigate the flammability of gas mixtures found in transuranic (TRU) waste containers. The FAMP results provide a basis for increasing the permissible concentrations of flammable volatile organic compounds (VOCs) in TRU waste containers. The FAMP results will be used to modify the ''Safety Analysis Report for the TRUPACT-II Shipping Package'' (TRUPACT-II SARP) upon acceptance of the methodology by the Nuclear Regulatory Commission. Implementation of the methodology would substantially increase the number of drums that can be shipped to the Waste Isolation Pilot Plant (WIPP) without repackaging or treatment. Central to the program was experimental testing and modeling to predict the gas mixture lower explosive limit (MLEL) of gases observed in TRU waste containers. The experimental data supported selection of an MLEL model that was used in constructing screening limits for flammable VOC and flammable gas concentrations. The MLEL values predicted by the model for individual drums will be utilized to assess flammability for drums that do not meet the screening criteria. Finally, the predicted MLEL values will be used to derive acceptable gas generation rates, decay heat limits, and aspiration time requirements for drums that do not pass the screening limits. The results of the program demonstrate that an increased number of waste containers can be shipped to WIPP within the flammability safety envelope established in the TRUPACT-II SARP.
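The screening approach described above rests on predicting a mixture lower explosive limit (MLEL) from the gas composition in a drum. The abstract does not specify which MLEL model FAMP selected, so the sketch below uses Le Chatelier's mixing rule, a common first approximation for mixtures of flammable gases; the function name and the example gas composition are illustrative assumptions, not taken from the report.

```python
# Hypothetical sketch of an MLEL estimate via Le Chatelier's mixing rule:
# LEL_mix = 1 / sum(y_i / LEL_i), where y_i are the mole fractions of the
# flammable components (normalised to sum to 1) and LEL_i their pure-gas
# lower explosive limits in vol %.

def mixture_lel(components):
    """components: list of (mole_fraction, pure_LEL_vol_pct) pairs.

    Mole fractions are taken relative to the flammable portion only and
    must sum to 1. Returns the estimated mixture LEL in vol %.
    """
    total = sum(y for y, _ in components)
    if abs(total - 1.0) > 1e-9:
        raise ValueError("flammable-component mole fractions must sum to 1")
    return 1.0 / sum(y / lel for y, lel in components)

# Example: an equimolar flammable fraction of hydrogen (LEL ~4.0 vol %)
# and methane (LEL ~5.0 vol %).
mlel = mixture_lel([(0.5, 4.0), (0.5, 5.0)])  # ~4.44 vol %
```

A drum whose flammable headspace concentration stays below this value would pass such a screening limit; drums above it would need the per-drum assessment the report describes.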

  18. Methodologies for tracking learning paths

    DEFF Research Database (Denmark)

    Frølunde, Lisbeth; Gilje, Øystein; Lindstrand, Fredrik

    2009-01-01

    filmmakers: what furthers their interest and/or hinders it, and what learning patterns emerge. The aim of this article is to present and discuss issues regarding the methodology and methods of the study, such as developing a relationship with interviewees when conducting interviews online (using MSN). We...... suggest two considerations about using online interviews: how the interviewees value the given subject of conversation and their familiarity with being online. The benefit of getting online communication with the young filmmakers offers ease, because it is both practical and appropriates a meeting...

  19. Continuous culture apparatus and methodology

    International Nuclear Information System (INIS)

    Conway, H.L.

    1975-01-01

    At present, we are investigating the sorption of potentially toxic trace elements by phytoplankton under controlled laboratory conditions. Continuous culture techniques were used to study the mechanism of the sorption of the trace elements by unialgal diatom populations and the factors influencing this sorption. Continuous culture methodology has been used extensively to study bacterial kinetics. It is an excellent technique for obtaining a known physiological state of phytoplankton populations. An automated method for the synthesis of continuous culture medium for use in these experiments is described

  20. JET pump limiter

    International Nuclear Information System (INIS)

    Sonnenberg, K.; Deksnis, E.; Shaw, R.; Reiter, D.

    1988-01-01

    JET plans to install two pump limiter modules which can be used for belt-limiter, inner-wall and X-point discharges and also, for 1-2 s, as the main limiter. A design is presented which is compatible with two diagnostic systems, and which allows partial removal of the pump limiter to provide access for remote-handling operations. The high heat-flux components are inertially cooled during a pulse. Heat is removed between discharges by radiation and by pressure contacts to a water-cooled support structure. The pumping edge will be made of annealed pyrolytic graphite. Exhaust efficiency has been estimated, for a 1-d edge model, using a Monte-Carlo calculation of neutral gas transport. When the pump limiter is operated together with other wall components we expect an efficiency of ≅ 5% (2.5 x 10²¹ particles/s). As a main limiter the efficiency increases to about 10%. (author)

  1. Reactor limit control system

    International Nuclear Information System (INIS)

    Rubbel, F.E.

    1982-01-01

    The very extensive use of limitations in the operational field between the protection system and the closed-loop controls is an important feature of the German understanding of operational safety. The design of limitations is based on very extensive computational activities, but mostly on the plant-wide commissioning experience of a turnkey contractor. Limitations combine the intelligence features of closed-loop controls with the high availability of protection systems. (orig.)

  2. Detector limitations, STAR

    Energy Technology Data Exchange (ETDEWEB)

    Underwood, D. G.

    1998-07-13

    Every detector has limitations in terms of solid angle, particular technologies chosen, cracks due to mechanical structure, etc. If all of the presently planned parts of STAR [Solenoidal Tracker At RHIC] were in place, these factors would not seriously limit our ability to exploit the spin physics possible in RHIC. What is of greater concern at the moment is the construction schedule for components such as the Electromagnetic Calorimeters, and the limited funding for various levels of triggers.

  3. Performance limitations at ISABELLE

    International Nuclear Information System (INIS)

    Keil, E.

    1975-01-01

    The transverse stability of coasting beams in the planned ISABELLE storage rings was studied. The beam-beam tune shift limitation at 0.005 can be avoided, and a computer simulation seems to show that 0.005 is a pessimistic limit. For beams of reasonable smoothness at the edge, the actual limit should be somewhat higher. Some coupling effects due to the beam-beam interaction are also examined

  4. Limit loads in nozzles

    International Nuclear Information System (INIS)

    Zouain, N.

    1983-01-01

    The static method for the evaluation of the limit loads of a perfectly elasto-plastic structure is presented. Using the static theorem of Limit Analysis and the Finite Element Method, a lower bound for the collapse load can be obtained by solving a linear programming problem. This formulation is then applied to symmetrically loaded shells of revolution, and some numerical results for limit loads in nozzles are presented. (Author) [pt

  5. Limit analysis via creep

    International Nuclear Information System (INIS)

    Taroco, E.; Feijoo, R.A.

    1981-07-01

    This paper presents a variational method for the limit analysis of an ideal plastic solid. The method, denominated Modified Secondary Creep, finds the collapse loads through the minimization of a functional and a limit process. Given an ideal plastic material, it is shown how to determine the associated secondary creep constitutive equation. Finally, as an application, the limit load of a pressurized von Mises rigid-plastic sphere is found. (Author) [pt
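The sphere application mentioned above has a classical closed-form benchmark: for a thick-walled rigid-perfectly-plastic sphere of inner radius a, outer radius b and yield stress σ_y under internal pressure, the limit pressure is p_L = 2 σ_y ln(b/a) (under spherical symmetry the von Mises and Tresca results coincide). The sketch below evaluates this formula; the numerical values are assumed for illustration and are not taken from the paper.

```python
from math import log

def sphere_limit_pressure(sigma_y, a, b):
    """Limit internal pressure of a thick-walled rigid-perfectly-plastic
    sphere (von Mises yield): the classical result p_L = 2*sigma_y*ln(b/a).
    sigma_y, a, b in consistent units (e.g. MPa and metres)."""
    if not (0 < a < b):
        raise ValueError("require 0 < a < b")
    return 2.0 * sigma_y * log(b / a)

# Illustrative numbers (assumed): sigma_y = 250 MPa, inner radius 100 mm,
# outer radius 150 mm.
p_limit = sphere_limit_pressure(250.0, 0.100, 0.150)  # ~202.7 MPa
```

Such a closed-form value is the natural check case for any numerical limit-analysis scheme like the one the paper proposes.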

  6. Adaptation of EVIAVE methodology for monitoring and follow-up when evaluating the environmental impact of landfills

    International Nuclear Information System (INIS)

    Arrieta, Gabriela; Requena, Ignacio; Toro, Javier; Zamorano, Montserrat

    2016-01-01

    Treatment and final disposal of Municipal Solid Waste can play a significant role in the generation of negative environmental impacts. As a prevention strategy, such activities are subjected to the process of Environmental Impact Assessment (EIA). Still, the follow-up of Environmental Management Plans or mitigation measures is limited, partly due to a lack of methodological approaches. In searching for possibilities, the University of Granada (Spain) developed a diagnostic methodology named EVIAVE, which allows one to quantify, by means of indexes, the environmental impact of landfills in view of their location and the conditions of exploitation. EVIAVE is applicable within the legal framework of the European Union and can be adapted to the environmental and legal conditions of other countries. This study entails its adaptation in Colombia, for the follow-up and control of the EIA process for landfills. Modifications involved the inclusion of the environmental elements flora and fauna, and the evaluation of the environmental descriptors in agreement with the concept of vulnerability. The application of the modified EVIAVE in Colombian landfills allowed us to identify the elements affected by the operating and maintenance conditions. It may be concluded that this methodology is viable and effective for the follow-up and environmental control of EIA processes for landfills, and for analyzing the associated risks, as it takes into account related environmental threats and vulnerabilities. - Highlights: • A modified methodology is used to monitor and follow up environmental impacts in landfills. • The improved methodology includes the vulnerability of flora and fauna to evaluate the environmental impact of landfills. • The methodology serves to identify and evaluate the sources of risk generated in the construction and siting of landfills. • Environmental vulnerability indicators improve the effectiveness of the control and follow-up phases of landfill management. • The

  7. Adaptation of EVIAVE methodology for monitoring and follow-up when evaluating the environmental impact of landfills

    Energy Technology Data Exchange (ETDEWEB)

    Arrieta, Gabriela, E-mail: tonina1903@hotmail.com [Department of Civil Engineering, University of Granada (Spain); Requena, Ignacio, E-mail: requena@decsai.ugr.es [Department of Computer Science and Artificial Intelligence, University of Granada (Spain); Toro, Javier, E-mail: jjtoroca@unal.edu.co [Universidad Nacional de Colombia — Sede Bogotá, Instituto de Estudios Ambientales (Colombia); Zamorano, Montserrat, E-mail: zamorano@ugr.es [Department of Civil Engineering, University of Granada (Spain)

    2016-01-15

    Treatment and final disposal of Municipal Solid Waste can play a significant role in the generation of negative environmental impacts. As a prevention strategy, such activities are subjected to the process of Environmental Impact Assessment (EIA). Still, the follow-up of Environmental Management Plans or mitigation measures is limited, partly due to a lack of methodological approaches. In searching for possibilities, the University of Granada (Spain) developed a diagnostic methodology named EVIAVE, which allows one to quantify, by means of indexes, the environmental impact of landfills in view of their location and the conditions of exploitation. EVIAVE is applicable within the legal framework of the European Union and can be adapted to the environmental and legal conditions of other countries. This study entails its adaptation in Colombia, for the follow-up and control of the EIA process for landfills. Modifications involved the inclusion of the environmental elements flora and fauna, and the evaluation of the environmental descriptors in agreement with the concept of vulnerability. The application of the modified EVIAVE in Colombian landfills allowed us to identify the elements affected by the operating and maintenance conditions. It may be concluded that this methodology is viable and effective for the follow-up and environmental control of EIA processes for landfills, and for analyzing the associated risks, as it takes into account related environmental threats and vulnerabilities. - Highlights: • A modified methodology is used to monitor and follow up environmental impacts in landfills. • The improved methodology includes the vulnerability of flora and fauna to evaluate the environmental impact of landfills. • The methodology serves to identify and evaluate the sources of risk generated in the construction and siting of landfills. • Environmental vulnerability indicators improve the effectiveness of the control and follow-up phases of landfill management. • The

  8. Chapter three: methodology of exposure modeling

    CSIR Research Space (South Africa)

    Moschandreas, DJ

    2002-12-01

    Full Text Available methodologies and models are reviewed. Three exposure/measurement methodologies are assessed. Estimation methods focus on source evaluation and attribution; sources include those outdoors and indoors, as well as in occupational and in-transit environments. Fate...

  9. Methodological Issues and Practices in Qualitative Research.

    Science.gov (United States)

    Bradley, Jana

    1993-01-01

    Discusses methodological issues concerning qualitative research and describes research practices that qualitative researchers use to address these methodological issues. Topics discussed include the researcher as interpreter, the emergent nature of qualitative research, understanding the experience of others, trustworthiness in qualitative…

  10. Investigating surety methodologies for cognitive systems.

    Energy Technology Data Exchange (ETDEWEB)

    Caudell, Thomas P. (University of New Mexico, Albuquerque, NM); Peercy, David Eugene; Mills, Kristy (University of New Mexico, Albuquerque, NM); Caldera, Eva (University of New Mexico, Albuquerque, NM)

    2006-11-01

    Advances in cognitive science provide a foundation for new tools that promise to advance human capabilities with significant positive impacts. As with any new technology breakthrough, associated technical and non-technical risks are involved. Sandia has mitigated both technical and non-technical risks by applying advanced surety methodologies in such areas as nuclear weapons, nuclear reactor safety, nuclear materials transport, and energy systems. In order to apply surety to the development of cognitive systems, we must understand the concepts and principles that characterize the certainty of a system's operation as well as the risk areas of cognitive sciences. This SAND report documents a preliminary spectrum of risks involved with cognitive sciences, and identifies some surety methodologies that can be applied to potentially mitigate such risks. Some potential areas for further study are recommended. In particular, a recommendation is made to develop a cognitive systems epistemology framework for more detailed study of these risk areas and applications of surety methods and techniques.

  11. Selection of skin dose calculation methodologies

    International Nuclear Information System (INIS)

    Farrell, W.E.

    1987-01-01

    This paper reports that good health physics practice dictates that a dose assessment be performed for any significant skin contamination incident. There are, however, several methodologies that could be used, and while there is probably no single methodology that is proper for all cases of skin contamination, some are clearly more appropriate than others. This can be demonstrated by examining two of the more distinctly different options available for estimating skin dose: the calculational methods. The methods compiled by Healy require separate beta and gamma calculations. The beta calculational method is that derived by Loevinger, while the gamma dose is calculated from the equation for the dose rate from an infinite plane source with an absorber between the source and the detector. Healy has provided these formulas in graphical form to facilitate rapid dose-rate determinations at density thicknesses of 7 and 20 mg/cm². These density thicknesses equate to the regulatory definition of the sensitive layer of the skin and to a more arbitrary value to account for beta absorption in contaminated clothing

  12. Audit Methodology for IT Governance

    Directory of Open Access Journals (Sweden)

    Mirela GHEORGHE

    2010-01-01

    Full Text Available The continuous development of new IT technologies has been followed by their rapid integration at the organization level. The management of organizations faces a new challenge: structural redefinition of the IT component in order to create added value and to minimize IT risks through efficient management of all the IT resources of the organization. These changes have had a great impact on the governance of the IT component. The paper proposes an audit methodology for IT Governance at the organization level. From this point of view, the audit strategy developed is a risk-based strategy that enables the IT auditor to study, from the best angle, the efficiency and effectiveness of the IT Governance structure. The evaluation of the risks associated with IT Governance is a key process in planning the audit mission, as it allows the identification of the segments with increased risks. With no ambition for completeness, the proposed methodology provides the auditor a useful tool for the accomplishment of his mission.

  13. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of the research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application to lending operations in the Republic of Serbia. With the developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction presents the bank's analysis of the loan applicant, carried out in order to minimize and manage credit risk, and describes the processing of the credit application and the procedure for analyzing financial statements to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework is presented as applied in a concrete company. In the third part, models are presented which banks should use to protect themselves against exposure to risks, i.e. to reduce losses on loan operations in our country and to adjust to market conditions in an optimal way.

  14. Methodology for combining dynamic responses

    International Nuclear Information System (INIS)

    Cudlin, R.; Hosford, S.; Mattu, R.; Wichman, K.

    1978-09-01

    The NRC has historically required that the structural/mechanical responses due to various accident loads and loads caused by natural phenomena, (such as earthquakes) be combined when analyzing structures, systems, and components important to safety. Several approaches to account for the potential interaction of loads resulting from accidents and natural phenomena have been used. One approach, the so-called absolute or linear summation (ABS) method, linearly adds the peak structural responses due to the individual dynamic loads. In general, the ABS method has also reflected the staff's conservative preference for the combination of dynamic load responses. A second approach, referred to as SRSS, yields a combined response equal to the square root of the sum of the squares of the peak responses due to the individual dynamic loads. The lack of a physical relationship between some of the loads has raised questions as to the proper methodology to be used in the design of nuclear power plants. An NRR Working Group was constituted to examine load combination methodologies and to develop a recommendation concerning criteria or conditions for their application. Evaluations of and recommendations on the use of the ABS and SRSS methods are provided in the report
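The two combination rules examined in the report can be stated compactly: ABS linearly adds the absolute peak responses, while SRSS takes the square root of the sum of their squares, so ABS always bounds SRSS from above. A minimal sketch, with illustrative peak values rather than data from the report:

```python
# Sketch of the two dynamic-response combination rules discussed above,
# applied to peak structural responses from individual dynamic loads.

def abs_combination(peaks):
    """Absolute (linear) summation: the conservative upper bound."""
    return sum(abs(r) for r in peaks)

def srss_combination(peaks):
    """Square root of the sum of the squares of the peak responses."""
    return sum(r * r for r in peaks) ** 0.5

# Peak responses to three independent dynamic loads (assumed units).
peaks = [10.0, 6.0, 3.0]
abs_total = abs_combination(peaks)    # 19.0
srss_total = srss_combination(peaks)  # ~12.04
```

The gap between the two totals illustrates why the choice of rule, and the physical relationship (or lack of one) between the loads, matters for design margins.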

  15. Applications of mixed-methods methodology in clinical pharmacy research.

    Science.gov (United States)

    Hadi, Muhammad Abdul; Closs, S José

    2016-06-01

    Introduction Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology A number of mixed-methods designs are available in the literature; the four most commonly used in healthcare research are the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own unique advantages, challenges and procedures, and the selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, and it is advisable to adhere to it to ensure methodological rigour. When to use It is best suited when the research questions require: triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another method; developing a scale/questionnaire; or answering different research questions within a single study. Two case studies are presented to illustrate possible applications of mixed-methods methodology. Limitations Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time-consuming, being in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect the qualitative and quantitative data at the same time.

  16. Information technology security system engineering methodology

    Science.gov (United States)

    Childs, D.

    2003-01-01

    A methodology is described for system engineering security into large information technology systems under development. The methodology is an integration of a risk management process and a generic system development life cycle process. The methodology is to be used by Security System Engineers to effectively engineer and integrate information technology security into a target system as it progresses through the development life cycle. The methodology can also be used to re-engineer security into a legacy system.

  17. Methodology identification in mass disasters

    OpenAIRE

    Ampudia García, Omar

    2014-01-01

    Major disasters in Peru lack a treatment plan adapted to the current reality. They were once rare and limited to natural disasters such as major earthquakes, floods, torrential rains, erupting volcanoes, and so on. At first these disasters were limited to certain geographic areas in general, but with the advancement of science and technology such events have soared alarmingly: rail crashes, plane crashes, high-speed car crashes, and, if we add the attacks by fundamentalist groups with car...

  18. Methodology for Measuring the Complexity of Enterprise Information Systems

    Directory of Open Access Journals (Sweden)

    Ilja Holub

    2016-07-01

    Full Text Available The complexity of enterprise information systems is currently a challenge faced not only by IT professionals and project managers, but also by the users of such systems. Current methodologies and frameworks used to design and implement information systems do not specifically deal with the issue of their complexity and, apart from few exceptions, do not at all attempt to simplify the complexity. This article presents the author's own methodology for managing complexity, which can be used to complement any other methodology and which helps limit the growth of complexity. It introduces its own definition and metric of complexity, which it defines as the sum of entities of the individual UML models of the given system, which are selected according to the MMDIS methodology so as to consistently describe all relevant content dimensions of the system. The main objective is to propose a methodology to manage information system complexity and to verify it in practice on a real-life SAP implementation project.
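The metric defined above, complexity as the sum of entities over the UML models selected to describe the system, can be sketched directly. The model names and entity counts below are invented for illustration; they are not taken from the article or from the SAP project it verifies the methodology on.

```python
# Minimal sketch of the article's complexity metric: total complexity is
# the sum of entity counts over the UML models chosen (per the MMDIS
# methodology) to describe all relevant content dimensions of the system.

def system_complexity(models):
    """models: mapping of UML model name -> number of entities it contains."""
    return sum(models.values())

# Illustrative (assumed) entity counts for three UML models of a system.
uml_models = {
    "use-case model": 42,
    "class model": 118,
    "activity model": 63,
}
complexity = system_complexity(uml_models)  # 223
```

Tracking this single number over successive design iterations is what lets the methodology flag, and then limit, growth in complexity.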

  19. Development of Proliferation Resistance Assessment Methodology Based on International Standards

    International Nuclear Information System (INIS)

    Lee, Yong Deok; Lee, Jung Won; Lee, Kwang Seok

    2009-03-01

    Proliferation resistance is one of the requirements to be met in GEN IV and INPRO for next-generation nuclear energy systems. Internationally, evaluation methodologies for PR were initiated as early as 1980, but systematic development only started in the 2000s. In Korea, for the export of nuclear energy systems and to increase the international credibility and transparency of domestic nuclear system and fuel cycle development, the independent development of a PR evaluation methodology was started in 2007 as a long-term nuclear R and D project, and a model for the PR evaluation methodology is being developed. In the first year, a comparative study of GEN-IV/INPRO, development of PR indicators, quantification of the indicators, development of an evaluation model, and analysis of the technology system and of international technology development trends were performed. In the second year, a feasibility study of the indicators, allowable limits for the indicators, and a review of the technical requirements of the indicators were carried out. The results of the PR evaluation must be applied at the beginning of the conceptual design of a nuclear system. Through the development of the PR evaluation methodology, the methodology will be applied in the regulatory requirements for authorization and permission to be developed

  20. Methodological Considerations in Designing and Evaluating Animal-Assisted Interventions.

    Science.gov (United States)

    Stern, Cindy; Chur-Hansen, Anna

    2013-02-27

    This paper presents a discussion of the literature on animal-assisted interventions and describes limitations surrounding current methodological quality. Benefits to human physical, psychological and social health cannot be empirically confirmed due to the methodological limitations of the existing body of research, and comparisons cannot validly be made across different studies. Without a solid research base animal-assisted interventions will not receive recognition and acceptance as a credible alternative health care treatment. The paper draws on the work of four systematic reviews conducted over April-May 2009, with no date restrictions, focusing exclusively on the use of canine-assisted interventions for older people residing in long-term care. The reviews revealed a lack of good quality studies. Although the literature base has grown in volume since its inception, it predominantly consists of anecdotal accounts and reports. Experimental studies undertaken are often flawed in aspects of design, conduct and reporting. There are few qualitative studies available leading to the inability to draw definitive conclusions. It is clear that due to the complexities associated with these interventions not all weaknesses can be eliminated. However, there are basic methodological weaknesses that can be addressed in future studies in the area. Checklists for quantitative and qualitative research designs to guide future research are offered to help address methodological rigour.

  1. Can citizen science produce good science? Testing the OPAL Air Survey methodology, using lichens as indicators of nitrogenous pollution

    International Nuclear Information System (INIS)

    Tregidgo, Daniel J.; West, Sarah E.; Ashmore, Mike R.

    2013-01-01

    Citizen science is having increasing influence on environmental monitoring as its advantages are becoming recognised. However methodologies are often simplified to make them accessible to citizen scientists. We tested whether a recent citizen science survey (the OPAL Air Survey) could detect trends in lichen community composition over transects away from roads. We hypothesised that the abundance of nitrophilic lichens would decrease with distance from the road, while that of nitrophobic lichens would increase. The hypothesised changes were detected along strong pollution gradients, but not where the road source was relatively weak, or background pollution relatively high. We conclude that the simplified OPAL methodology can detect large contrasts in nitrogenous pollution, but it may not be able to detect more subtle changes in pollution exposure. Similar studies are needed in conjunction with the ever-growing body of citizen science work to ensure that the limitations of these methods are fully understood. -- Highlights: •We investigated the validity of a simplified citizen science methodology. •Lichen abundance data were used to indicate nitrogenous air pollution. •Significant changes were detected beside busy roads with low background pollution. •The methodology detected major, but not subtle, contrasts in pollution. •Sensitivity of citizen science methods to environmental change must be evaluated. -- A simplified lichen biomonitoring method used for citizen science can detect the impact of nitrogenous air pollution from local roads

  2. Reconciling Anti-essentialism and Quantitative Methodology

    DEFF Research Database (Denmark)

    Jensen, Mathias Fjællegaard

    2017-01-01

    Quantitative methodology has a contested role in feminist scholarship which remains almost exclusively qualitative. Considering Irigaray’s notion of mimicry, Spivak’s strategic essentialism, and Butler’s contingent foundations, the essentialising implications of quantitative methodology may prove...... the potential to reconcile anti-essentialism and quantitative methodology, and thus, to make peace in the quantitative/qualitative Paradigm Wars....

  3. 42 CFR 441.472 - Budget methodology.

    Science.gov (United States)

    2010-10-01

    42 CFR 441.472, Budget methodology (Title 42, Public Health; Self-Directed Personal Assistance Services Program): (a) The State shall set forth a budget methodology that ensures service authorization resides with the State and meets the...

  4. Application of agile methodologies in software development

    Directory of Open Access Journals (Sweden)

    Jovanović Aca D.

    2016-01-01

    Full Text Available The paper presents the potential for developing software using agile methodologies. Special consideration is devoted to the potential and advantages of using the Scrum methodology in software development, and to the relationship between the implementation of agile methodologies and software development projects.

  5. The micro-habitat methodology. Application protocols

    Energy Technology Data Exchange (ETDEWEB)

    Sabaton, C; Valentin, S; Souchon, Y

    1995-06-01

    A strong need has been felt for guidelines to help various entities in applying the micro-habitat methodology, particularly in impact studies on hydroelectric installations. CEMAGREF and Electricite de France have separately developed two protocols with five major steps: reconnaissance of the river, selection of representative units to be studied in greater depth, morpho-dynamic measurements at one or more rates of discharge and hydraulic modeling, coupling of hydraulic and biological models, calculation of habitat-quality scores for fish, analysis of results. The two approaches give very comparable results and are essentially differentiated by the hydraulic model used. CEMAGREF uses a one-dimensional model requiring measurements at only one discharge rate. Electricite de France uses a simplified model based on measurements at several rates of discharge. This approach is possible when discharge can be controlled in the study area during data acquisition, as is generally the case downstream of hydroelectric installations. The micro-habitat methodology is now a fully operational tool with which to study changes in fish habitat quality in relation to varying discharge. It provides an element of assessment pertinent to the choice of the instream flow to be maintained downstream of a hydroelectric installation; this information is essential when the flow characteristics (velocity, depth) and the nature of the river bed are the preponderant factors governing habitat suitability for trout or salmon. The ultimate decision must nonetheless take into account any other potentially limiting factors for the biocenoses on the one hand, and the target water use objectives on the other. In many cases, compromises must be found among different uses, different species and different stages in the fish development cycle. (Abstract Truncated)
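
    The habitat-quality scoring step can be sketched numerically. The code below is an illustrative weighted-usable-area calculation in the spirit of micro-habitat (PHABSIM-type) methods, not the CEMAGREF or Electricite de France implementation; the suitability curves, cell data and the `suitability`/`weighted_usable_area` helpers are all hypothetical.

```python
def suitability(value, curve):
    """Piecewise-linear interpolation on a (value, suitability) curve."""
    pts = sorted(curve)
    if value <= pts[0][0]:
        return pts[0][1]
    if value >= pts[-1][0]:
        return pts[-1][1]
    for (x0, s0), (x1, s1) in zip(pts, pts[1:]):
        if x0 <= value <= x1:
            return s0 + (s1 - s0) * (value - x0) / (x1 - x0)

# Hypothetical suitability curves for one species/life stage:
# velocity in m/s, depth in m, suitability in [0, 1].
VEL_CURVE = [(0.0, 0.2), (0.3, 1.0), (0.8, 0.5), (1.5, 0.0)]
DEPTH_CURVE = [(0.0, 0.0), (0.3, 0.8), (0.6, 1.0), (2.0, 0.6)]

def weighted_usable_area(cells):
    """cells: (area_m2, velocity, depth) tuples from the hydraulic model."""
    return sum(area * suitability(v, VEL_CURVE) * suitability(d, DEPTH_CURVE)
               for area, v, d in cells)

cells = [(5.0, 0.3, 0.6), (4.0, 1.0, 0.2), (6.0, 0.5, 0.4)]
print(round(weighted_usable_area(cells), 2))  # habitat score in m2
```

    With real data, the cell areas, velocities and depths would come from the hydraulic model at each simulated discharge, and the curves from biological sampling for the target species and life stage.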

  6. Impact significance determination-Designing an approach

    International Nuclear Information System (INIS)

    Lawrence, David P.

    2007-01-01

    The question of how best to go about determining the significance of impacts has, to date, only been addressed in a partial and preliminary way. The assumption tends to be made that it is either only necessary to provide explicit, justified reasons for a judgment about significance and/or to explicitly apply a prescribed procedure-a procedure usually involving the staged application of thresholds and/or criteria. The detailed attributes, strengths and limitations of such approaches and possible alternative approaches have yet to be explored systematically. This article addresses these deficiencies by analyzing the characteristics, specific methods and positive and negative tendencies of three general impact significance determination approaches-the technical approach, the collaborative approach and the reasoned argumentation approach. A range of potential composite approaches are also described. With an enhanced understanding of these approaches, together with potential combinations, EIA practitioners and other EIA participants can be in a better position to select an approach appropriate to their needs, to reinforce the positive tendencies and offset the negative tendencies of the selected approach and to combine the best qualities of more than one approach

  7. Numerical Limit Analysis:

    DEFF Research Database (Denmark)

    Damkilde, Lars

    2007-01-01

    Limit State analysis has a long history and many prominent researchers have contributed. The theoretical foundation is based on the upper- and lower-bound theorems which give a very comprehensive and elegant formulation on complicated physical problems. In the pre-computer age Limit State analysis...... also enabled engineers to solve practical problems within reinforced concrete, steel structures and geotechnics....

  8. Advanced limiters for ISX

    International Nuclear Information System (INIS)

    Mioduszewski, P.K.; Edmonds, P.H.; Sheffield, J.

    1982-01-01

    Continuous removal of heat and particles becomes a vital necessity in future steady-state fusion devices. The pump limiter seems to be an attractive concept to combine these two tasks. On ISX, various schemes of pump limiters are being explored with the final goal of furnishing the ISX-C device with a pump limiter to handle heat removal and particle control in steady state. The emphasis of the present paper is on pump limiters based on ballistic particle collection. If this concept turns out to be successful in supplying sufficient pumping efficiency, it may be possible to design pump limiters without a leading edge. Analytical calculations of the particle collection efficiency are given for various limiter configurations. Pumping efficiencies of approximately 4-10%, depending on the specific configuration, seem to be feasible and should be sufficient for steady-state operation. Initial experimental results on pump limiter studies in ISX-B confirm the calculated collection efficiencies. By measuring the ion saturation current to the limiter blade and the pressure buildup simultaneously, we found a correlation between the incident particle flux and the pressure rise that agrees well with a simple model

  9. Limits to Inclusion

    Science.gov (United States)

    Hansen, Janne Hedegaard

    2012-01-01

    In this article, I will argue that a theoretical identification of the limit to inclusion is needed in the conceptual identification of inclusion. On the one hand, inclusion is formulated as a vision that is, in principle, limitless. On the other hand, there seems to be an agreement that inclusion has a limit in the pedagogical practice. However,…

  10. A methodology for producing small scale rural land use maps in semi-arid developing countries using orbital imagery

    Science.gov (United States)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. Results have shown that it is feasible to design a methodology that can provide suitable guidelines for operational production of small scale rural land use maps of semiarid developing regions from LANDSAT MSS imagery, using inexpensive and unsophisticated visual techniques. The suggested methodology provides immediate practical benefits to map makers attempting to produce land use maps in countries with limited budgets and equipment. Many preprocessing and interpretation techniques were considered, but rejected on the grounds that they were inappropriate mainly due to the high cost of imagery and/or equipment, or due to their inadequacy for use in operational projects in the developing countries. Suggested imagery and interpretation techniques, consisting of color composites and monocular magnification proved to be the simplest, fastest, and most versatile methods.

  11. Methodology for flood risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.; Casada, M.L.; Fussell, J.B.

    1984-01-01

    The methodology for flood risk analysis described here addresses the effects of a flood on nuclear power plant safety systems. Combining the results of this method with the probability of a flood allows the effects of flooding to be included in a probabilistic risk assessment. The five-step methodology includes accident sequence screening to focus the detailed analysis efforts on the accident sequences that are significantly affected by a flood event. The quantitative results include the flood's contribution to system failure probability, accident sequence occurrence frequency and consequence category occurrence frequency. The analysis can be added to existing risk assessments without a significant loss in efficiency. The results of two example applications show the usefulness of the methodology. Both examples rely on the Reactor Safety Study for the required risk assessment inputs and present changes in the Reactor Safety Study results as a function of flood probability
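
    The quantitative combination step can be illustrated with a toy calculation (the probabilities below are hypothetical, not Reactor Safety Study values): the flood's contribution to an accident sequence's occurrence frequency is the annual flood probability times the conditional probability of the sequence given the flood, added to the flood-free baseline.

```python
def sequence_frequency(base_freq, flood_prob, cond_seq_prob):
    """Occurrence frequency (/yr) of an accident sequence, flood included.

    base_freq     -- frequency of the sequence from non-flood causes (/yr)
    flood_prob    -- annual probability of the flood event
    cond_seq_prob -- probability of the sequence given that the flood occurs
    """
    return base_freq + flood_prob * cond_seq_prob

# Hypothetical numbers: baseline 1e-6/yr, flood 1e-3/yr, conditional 1e-2.
total = sequence_frequency(1e-6, 1e-3, 1e-2)
flood_share = (1e-3 * 1e-2) / total
print(f"total {total:.2e}/yr, flood contribution {flood_share:.0%}")
```

    Even a modest conditional probability can let the flood term dominate the baseline, which is why the screening step focuses the detailed analysis on the sequences a flood significantly affects.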

  12. Mitigating greenhouse: Limited time, limited options

    International Nuclear Information System (INIS)

    Moriarty, Patrick; Honnery, Damon

    2008-01-01

    Most human-caused climate change comes from fossil fuel combustion emissions. To avoid the risk of serious climate change, very recent research suggests that emission reductions will need to be both large and rapidly implemented. We argue that technical solutions-improving energy efficiency, use of renewable and nuclear energy, and carbon capture and sequestration-can only be of minor importance, mainly given the limited time available to take effective climate action. Only curbing energy use, perhaps through 'social efficiency' gains, particularly in the high-energy consumption countries, can provide the rapid emissions reductions needed. The social efficiency approach requires a basic rethinking in how we can satisfy our human needs with low environmental impacts. Large cuts in emissions could then occur rapidly, but only if resistance to such changes can be overcome. Particularly in transport, there are also serious potential conflicts between the technical and the social efficiency approaches, requiring a choice to be made

  13. Ecodesign of cosmetic formulae: methodology and application.

    Science.gov (United States)

    L'Haridon, J; Martz, P; Chenéble, J-C; Campion, J-F; Colombe, L

    2018-04-01

    This article describes an easy-to-use ecodesign methodology developed and applied since 2014 by the L'Oréal Group to improve the sustainable performance of its new products without any compromise on their cosmetic efficacy. Cosmetic products, after being used, are often discharged into the sewers and the aquatic compartment. This discharge is considered as dispersive and continuous. A consistent progress in reducing the environmental impact of cosmetic products can be achieved through focusing upon three strategic indicators: biodegradability, grey water footprint adapted for ecodesign (GWFE) and a global indicator, complementary to these two endpoints. Biodegradability represents the key process in the removal of organic ingredients from the environment. GWFE is defined herein as the theoretical volume of natural freshwater required to dilute a cosmetic formula after being used by the consumer, down to a concentration without any foreseeable toxic effects upon aquatic species. Finally, the complementary indicator highlights a possible alert on formula ingredients due to an unfavourable environmental profile based on hazard properties: for example Global Harmonization System/Classification, Labelling and Packaging (GHS/CLP) H410 classification or potential very persistent and very bioaccumulative (vPvB) classification. The ecodesign of a new cosmetic product can be a challenge as the cosmetic properties and quality of this new product should at least match the benchmark reference. As shown in the case studies described herein, new methodologies have been developed to maximize the biodegradability of cosmetic formulae, to minimize their GWFE and to limit the use of ingredients that present an unfavourable environmental profile, while reaching the highest standards in terms of cosmetic efficacy. By applying these methodologies, highly biodegradable products (≥ 95% based on ingredient composition) have been developed and marketed, with a low GWFE. This new
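
    As a rough numerical illustration of the GWFE indicator described above (the aggregation rule, ingredient names, dose, mass fractions and PNEC values are all hypothetical assumptions, not L'Oréal's actual method):

```python
def gwfe_litres(dose_g, ingredients):
    """Volume of freshwater (L) to dilute one dose below every PNEC.

    ingredients: {name: (mass_fraction_of_formula, pnec_mg_per_L)}
    One common convention is assumed here: the limiting (worst)
    ingredient drives the footprint.
    """
    volumes = []
    for _name, (fraction, pnec) in ingredients.items():
        mass_mg = dose_g * fraction * 1000.0  # grams -> milligrams
        volumes.append(mass_mg / pnec)        # litres needed for this ingredient
    return max(volumes)

formula = {
    "surfactant A": (0.10, 0.5),      # 10 % of the formula, PNEC 0.5 mg/L
    "preservative B": (0.005, 0.01),  # 0.5 % of the formula, PNEC 0.01 mg/L
}
print(round(gwfe_litres(10.0, formula)), "L per 10 g dose")
```

    The example shows why ecodesign can focus on a few ingredients: here the minor preservative, with its much lower PNEC, sets the footprint, not the bulk surfactant.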

  14. Modeling Complex Time Limits

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2013-01-01

    Full Text Available In this paper we analyze the complexity of time limits found especially in regulated processes of public administration. First we review the most popular process modeling languages. An example scenario based on current Czech legislation is defined and then captured in the process modeling languages discussed. The analysis shows that contemporary process modeling languages support the capturing of time limits only partially. This causes trouble for analysts and unnecessary complexity in the models. Given these unsatisfactory results, we analyze the complexity of time limits in greater detail and outline the lifecycles of a time limit using the multiple dynamic generalizations pattern. As an alternative to the popular process modeling languages, we present the PSD process modeling language, which supports the defined lifecycles of a time limit natively and therefore allows keeping the models simple and easy to understand.
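
    A minimal sketch of the kind of time-limit lifecycle at issue (the states and the suspension rule here are illustrative assumptions, not the PSD language's actual semantics):

```python
from datetime import datetime, timedelta

class TimeLimit:
    """Toy lifecycle: running -> suspended -> running -> met | expired."""

    def __init__(self, start: datetime, duration: timedelta):
        self.deadline = start + duration
        self.state = "running"

    def suspend(self, at: datetime):
        if self.state == "running":
            self.state, self._paused_at = "suspended", at

    def resume(self, at: datetime):
        if self.state == "suspended":
            self.deadline += at - self._paused_at  # pause extends the deadline
            self.state = "running"

    def complete(self, at: datetime):
        if self.state == "running":
            self.state = "met" if at <= self.deadline else "expired"

t0 = datetime(2013, 1, 1)
limit = TimeLimit(t0, timedelta(days=30))
limit.suspend(t0 + timedelta(days=5))   # e.g. proceedings stayed
limit.resume(t0 + timedelta(days=10))   # 5-day pause moves the deadline to day 35
limit.complete(t0 + timedelta(days=33))
print(limit.state)
```

    Even this toy version carries state (a movable deadline, a pause) that a single timer event in mainstream notations does not capture directly, which is the complexity the paper points at.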

  15. Moving toroidal limiter

    International Nuclear Information System (INIS)

    Ikuta, Kazunari; Miyahara, Akira.

    1983-06-01

    The concept of the limiter-divertor proposed by Mirnov is extended to a toroidal limiter-divertor (which we call a moving toroidal limiter) using a stream of ferromagnetic balls coated with low-Z materials such as plastics, graphite and ceramics. An important advantage of using ferromagnetic materials would be the possibility of soft landing of the balls on a catcher, provided that the temperature of the balls is below the Curie point. Moreover, the moving toroidal limiter would work as a protector of the first wall, not only against the vertical movement of the plasma ring but also against the violent inward motion driven by a major disruption, because the orbits of the balls are distributed over the small-major-radius side of the toroidal plasma. (author)

  16. An integrated methodology to evaluate a spent nuclear fuel storage system

    International Nuclear Information System (INIS)

    Yoon, Jeong Hyoun

    2008-02-01

    This study introduced a methodology that can be applied to the development of a dry storage system for spent nuclear fuel. It comprised several design activities, including a simplified program to analyze the amount of spent nuclear fuel arising under practical fuel-management conditions, and a simplified program to evaluate the cost of four representative storage system types so that the most economically competitive option can be chosen. To verify that the reference module serves its practical purpose, a simplified thermal analysis code was proposed that can confirm compliance with long-term storage temperature limits and support oxidation analysis. The thermal results show that the reference module can accommodate the full range of PHWR spent nuclear fuel and a significant portion of PWR fuel as well. It can therefore be concluded that the reference storage system fulfils the important requirements for long-term integrity and radiological safety. In addition, to address scattered radiation and deep-penetration problems in the cooling storage system, a small but efficient design alteration was suggested that can reduce scattered radiation by 1/3 compared with the original design. Alongside this shielding countermeasure, a simplified criticality analysis methodology retaining conservativeness was proposed for PWR spent nuclear fuel. The results show that the reference module is suitable for low-enrichment PWR spent nuclear fuel, and even for relatively high-enrichment fuel if burnup credit is taken. In conclusion, the methodology is simple but efficient for planning a conceptual design of a convectively cooled spent nuclear fuel storage system. It can also be concluded that the methodology derived in this study and the reference module are feasible for practical implementation to mitigate the current complex situation in spent fuel

  17. Limits of detection and decision. Part 4

    International Nuclear Information System (INIS)

    Voigtman, E.

    2008-01-01

    Probability density functions (PDFs) have been derived for a number of commonly used limit of detection definitions, including several variants of the Relative Standard Deviation of the Background-Background Equivalent Concentration (RSDB-BEC) method, for a simple linear chemical measurement system (CMS) having homoscedastic, Gaussian measurement noise and using ordinary least squares (OLS) processing. All of these detection limit definitions serve as both decision and detection limits, thereby implicitly resulting in 50% rates of Type 2 errors. It has been demonstrated that these are closely related to Currie decision limits, if the coverage factor, k, is properly defined, and that all of the PDFs are scaled reciprocals of noncentral t variates. All of the detection limits have well-defined upper and lower limits, thereby resulting in finite moments and confidence limits, and the problem of estimating the noncentrality parameter has been addressed. As in Parts 1-3, extensive Monte Carlo simulations were performed and all the simulation results were found to be in excellent agreement with the derived theoretical expressions. Specific recommendations for harmonization of detection limit methodology have also been made
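
    As a numerical illustration of the relationship between a background standard deviation, a coverage factor k and a Currie-style decision/detection limit (the blank readings are hypothetical, and the normal quantile is used here as a simple stand-in for the noncentral-t treatment the paper derives):

```python
import statistics
from statistics import NormalDist

blank = [0.10, 0.12, 0.09, 0.11, 0.13, 0.10, 0.12]  # hypothetical blank signals
s_blank = statistics.stdev(blank)

k = NormalDist().inv_cdf(0.95)  # coverage factor for a 5 % false-positive rate
L_C = k * s_blank               # decision limit (net signal units)
L_D = 2 * L_C                   # detection limit when alpha = beta = 5 %

print(round(L_C, 4), round(L_D, 4))
```

    Using a single limit as both decision and detection limit, as the RSDB-BEC variants above do, places the decision threshold at the limit itself, which is what produces the implicit 50% type 2 error rate the paper highlights.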

  18. THRESHOLD OF SIGNIFICANCE IN STRESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Elena RUSE

    2015-12-01

    Full Text Available Stress management is the individual's ability to handle any situation or external condition and to match the demands of the external environment. Researchers have revealed several stages in the stress response. A first phase was called the 'alert reaction' or 'immediate reaction to stress', a phase in which physiological modifications occur together with psychological manifestations. The adaptation phase is the phase in which the reactions from the first phase diminish or disappear. The exhaustion phase is related to the diversity of stress factors and to time, and may exceed the human body's resources to adapt. Influencing factors may be limited, cognitive, perceptual, or a priori. But there is a threshold of significance in stress management. Once the reaction to external stimuli occurs, awareness is needed. The capability effect occurs, any side effect goes away, and the 'I AM' effect emerges.

  19. Progress in basic principles of limitation in radiation protection

    International Nuclear Information System (INIS)

    Ramzaev, P.V.; Tarasov, S.I.; Troitskaya, M.N.; Ermolaeva, A.P.

    1977-01-01

    For the purposes of limitation of harmful factors, e.g. radiation, it is proposed to divide the countless biological effects into three groups: 1) socially important effects (ultimate end effects); 2) intermediate effects (different diseases etc.), which are connected with and controlled by the first group; 3) purely biological effects, whose importance is not known. To determine the first group of effects, four indices describing all significant aspects of human life are identified: length of life; the lifetime integral of mental and physical capacity for work; aesthetic satisfaction from the organism itself; and reproduction of descendants. They reflect the main social and individual interests related to the functioning of the organism. On the basis of weighing these indices, a unified general index of health, in the form of the duration of a full life, is suggested. The unified index can be used with different principles of limitation (based on thresholds, acceptable risk, or maximum benefit). To realize the principle of maximum public benefit as the ideal principle in future limitation, all benefit and detriment from the utilization of harmful sources must be expressed in the unified index of health (instead of money), which is the greatest value of the individual and society. The authors suggest standardizing ionizing radiation on the same general methodological approaches that are applicable to non-ionizing factors too

  20. Methodological limitations of counting total leukocytes and thrombocytes in reptiles (Amazon turtle, Podocnemis expansa: an analysis and discussion Limitações metodológicas de contagens de leucócitos e trombócitos totais em répteis (tartaruga da Amazônia, Podocnemis expansa: uma análise e discussão

    Directory of Open Access Journals (Sweden)

    Marcos Tavares-Dias

    2008-01-01

    Full Text Available The aim of this paper is to compare three different methods for counting white blood cells [WBC] (the Natt and Herrick method, and estimation with 1,000 and 2,000 erythrocytes) and three methods for counting total thrombocytes [TT] (the Wojtaszek method, and estimation with 1,000 and 2,000 erythrocytes) in a South American freshwater turtle species, Podocnemis expansa, Schweigger 1812 (Reptilia, Pelomedusidae). Direct WBC counts using the Natt and Herrick method showed limitations, which are discussed here. WBC and TT counts using 1,000 erythrocytes from blood smears are not recommended for Amazon turtles or other reptilian species, since wide variation in the counts can be observed. Estimation methods for determining WBC and TT based on 2,000 erythrocytes on blood smears were the most acceptable, because they allow differentiation between leukocytes and thrombocytes and also showed smaller variation. The methods investigated here for the Amazon turtle, which have been widely used in other reptile species, provided evidence that the most acceptable method is not the one using diluted stains and a hemocytometer.
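
    The smear-based estimation the authors favour reduces to a simple proportion: count the leukocytes (or thrombocytes) encountered while traversing 2,000 erythrocytes on the smear, then scale by the erythrocyte count obtained with a hemocytometer. A sketch with hypothetical numbers:

```python
def estimate_per_uL(cells_seen, erythrocytes_scanned, rbc_per_uL):
    """Indirect count: cells/uL = cells seen * RBC count / RBC scanned."""
    return cells_seen * rbc_per_uL / erythrocytes_scanned

rbc = 350_000                          # erythrocytes/uL (hemocytometer)
wbc = estimate_per_uL(40, 2000, rbc)   # 40 leukocytes per 2,000 RBC
tt = estimate_per_uL(25, 2000, rbc)    # 25 thrombocytes per 2,000 RBC
print(int(wbc), int(tt))               # cells/uL
```

    Scanning 2,000 rather than 1,000 erythrocytes halves the weight of each cell encountered, which is consistent with the smaller variation the authors report for the 2,000-erythrocyte estimates.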

  1. Cultivating creativity in methodology and research

    DEFF Research Database (Denmark)

    Wegener, Charlotte; Meier, Ninna; Maslo, Elina

    in which ‘accountable’ research methodologies involve adventurousness and an element of uncertainty. Written by scholars from a range of different fields, academic levels and geographic locations, this unique book will offer significant insight to those from a range of academic fields.......This book presents a variety of narratives on key elements of academic work, from data analysis, writing practices and engagement with the field. The authors discuss how elements of academic work and life – usually edited out of traditional research papers – can elicit important analytical insight....... The book reveals how the unplanned, accidental and even obstructive events that often occur in research life, the ‘detours’, can potentially glean important results. The authors introduce the process of ‘writing-sharing-reading-writing’ as a way to expand the playground of research and inspire a culture...

  2. Clinical governance and operations management methodologies.

    Science.gov (United States)

    Davies, C; Walley, P

    2000-01-01

    The clinical governance mechanism, introduced since 1998 in the UK National Health Service (NHS), aims to deliver high quality care with efficient, effective and cost-effective patient services. Scally and Donaldson recognised that new approaches are needed, and operations management techniques comprise potentially powerful methodologies in understanding the process of care, which can be applied both within and across professional boundaries. This paper summarises four studies in hospital Trusts which took approaches to improving process that were different from and less structured than business process re-engineering (BPR). The problems were then amenable to change at a relatively low cost and short timescale, producing significant improvement to patient care. This less structured approach to operations management avoided incurring overhead costs of large scale and costly change such as new information technology (IT) systems. The most successful changes were brought about by formal tools to control quantity, content and timing of changes.

  3. Methodology for seismic PSA of NPPs

    International Nuclear Information System (INIS)

    Jirsa, P.

    1999-09-01

    A general methodology is outlined for seismic PSA (probabilistic safety assessment). The main objectives of seismic PSA include: description of the course of an event; understanding the most probable failure sequences; gaining insight into the overall probability of reactor core damage; identification of the main seismic risk contributors; identification of the range of peak ground accelerations contributing significantly to the plant risk; and comparison of the seismic risk with risks from other events. The results of seismic PSA are typically compared with those of internal PSA and of PSA of other external events. If the results of internal and external PSA are available, sensitivity studies and cost benefit analyses are performed prior to any decision regarding corrective actions. If the seismic PSA involves analysis of the containment, useful information can be gained regarding potential seismic damage of the containment. (P.A.)

  4. Software life cycle methodologies and environments

    Science.gov (United States)

    Fridge, Ernest

    1991-01-01

    Products of this project will significantly improve the quality and productivity of Space Station Freedom Program software processes by improving software reliability and safety, and by broadening the range of problems that can be solved with computational solutions. The project brings in Computer Aided Software Engineering (CASE) technology in the form of environments, such as the Engineering Script Language/Parts Composition System (ESL/PCS) application generator, an intelligent user interface for cost avoidance in setting up operational computer runs, the Framework programmable platform for defining process and software development work flow control, a process for bringing CASE technology into an organization's culture, and the CLIPS/CLIPS Ada language for developing expert systems; and methodologies, such as a method for developing fault-tolerant, distributed systems and a method for developing systems for common-sense reasoning and for solving expert systems problems when only approximate truths are known.

  5. Prognostic significance of erythropoietin in pancreatic adenocarcinoma.

    Directory of Open Access Journals (Sweden)

    Thilo Welsch

    Full Text Available BACKGROUND: Erythropoietin (Epo) administration has been reported to have tumor-promoting effects in anemic cancer patients. We investigated the prognostic impact of endogenous Epo in patients with pancreatic ductal adenocarcinoma (PDAC). METHODOLOGY: The clinico-pathological relevance of hemoglobin (Hb, n = 150), serum Epo (sEpo, n = 87) and tissue expression of Epo/Epo receptor (EpoR, n = 104) was analyzed in patients with PDAC. Epo/EpoR expression, signaling, growth, invasion and chemoresistance were studied in Epo-exposed PDAC cell lines. RESULTS: Compared to donors, median preoperative Hb levels were reduced by 15% in both chronic pancreatitis (CP, p<0.05) and PDAC (p<0.001), reaching anemic grade in one third of patients. While inversely correlating to Hb (r = -0.46), 95% of sEpo values lay within the normal range. The individual levels of compensation were adequate in CP (observed to predicted ratio, O/P = 0.99) but not in PDAC (O/P = 0.85). Strikingly, lower sEpo values yielding inadequate Epo responses were prominent in non-metastatic M0 patients, whereas these parameters were restored in the metastatic M1 group (8 vs. 13 mU/mL; O/P = 0.82 vs. 0.96; p<0.01)--although Hb levels and the prevalence of anemia were comparable. Higher sEpo values (upper quartile ≥ 16 mU/mL) were not significantly different in the M0 (20%) and M1 (30%) groups, but were an independent prognostic factor for shorter survival (HR 2.20, 10 vs. 17 months, p<0.05). The pattern of Epo expression in pancreas and liver suggested ectopic release of Epo by capillaries/vasa vasorum and hepatocytes, regulated by but not emanating from tumor cells. Epo could initiate PI3K/Akt signaling via EpoR in PDAC cells but failed to alter their functions, probably due to co-expression of the soluble EpoR isoform, known to antagonize Epo. CONCLUSION/SIGNIFICANCE: Higher sEpo levels counteract anemia but worsen outcome in PDAC patients. Further trials are required to clarify how overcoming a sEpo threshold

  6. Methodology for flammable gas evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Hopkins, J.D., Westinghouse Hanford

    1996-06-12

    There are 177 radioactive waste storage tanks at the Hanford Site. The waste generates flammable gases. The waste releases gas continuously, but in some tanks the waste has shown a tendency to trap these flammable gases. When enough gas is trapped in a tank's waste matrix, it may be released in a way that renders part or all of the tank atmosphere flammable for a period of time. Tanks must be evaluated against previously defined criteria to determine whether they can present a flammable gas hazard. This document presents the methodology for evaluating tanks in two areas of concern in the tank headspace: steady-state flammable-gas concentration resulting from continuous release, and concentration resulting from an episodic gas release.
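
    For the first area of concern, the steady-state headspace concentration from a continuous release can be bounded with a simple ventilation balance and compared against a fraction of the lower flammability limit (LFL). The numbers below are hypothetical illustrations, not the Hanford evaluation criteria:

```python
def steady_state_fraction_of_lfl(release_m3_per_h, vent_m3_per_h, lfl_vol_frac):
    """Steady-state gas concentration as a fraction of the LFL.

    At steady state, a continuous release into a well-mixed, ventilated
    headspace gives a volume fraction of release rate / ventilation rate.
    """
    c_ss = release_m3_per_h / vent_m3_per_h
    return c_ss / lfl_vol_frac

# Hypothetical: 0.02 m3/h hydrogen release, 100 m3/h ventilation, H2 LFL ~4 %.
frac = steady_state_fraction_of_lfl(0.02, 100.0, 0.04)
print(f"{frac:.1%} of LFL")
```

    Episodic releases need a different treatment, since a rapid release of trapped gas can transiently exceed the steady-state level by a large factor.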

  7. Methodological Reflections: Inter- ethnic Research

    DEFF Research Database (Denmark)

    Singla, Rashmi

    2010-01-01

    with both youth and the parental generation with ethnic minority background in Denmark. These reflections include implications and challenges related to researcher’s national, ethnic background and educational, professional position in encounter with   diverse ‘researched persons’ such as youth......This article reflects on the methodological and epistemological aspects of the ethical issues involved in encounters between researcher and research participants with ethnic minority background in contexts with diversity. Specific challenges involved in longitudinal research (10 - 15 years......) are also considered. The issues related to the social relevance of the research deriving from psycho political validity implying consideration of power dynamics in the personal, relational and collective domains are included. The primary basis for these reflections is a follow-up study concerning young...

  8. Sustainable Innovation and Entrepreneurship Methodology

    DEFF Research Database (Denmark)

    Celik, Sine; Joore, Peter; Christodoulou, Panayiotis

    or regional “co-creation platform for sustainable solutions” to promote structural innovation. In this manual, the Sustainable Innovation and Entrepreneurship Methodology will be described. The organisational guidelines mainly take point of departure in how Aalborg University (AAU) in Denmark has organised......The objective of the InnoLabs project is to facilitate cross-sectoral, multidisciplinary solutions to complex social problems in various European settings. InnoLabs are university-driven physical and/or organizational spaces that function as student innovation laboratories and operate as a local...... this in daily practice. In line with the objectives of the Innolabs project (output 05), partners in the Innolabs project have reflected, evaluated and concluded the project experiences, which are described in this report. The InnoLabs project was developed for the 2014 call of Erasmus+ funds KA2- Cooperation...

  9. Methodologies for 2011 economic reports

    DEFF Research Database (Denmark)

    Nielsen, Rasmus

    STECF’s Expert Working Group 11-03 convened in Athens (28th March – 1st April, 2011) to discuss and seek agreement on the content, indicators, methodologies and format of the 2011 Annual Economic Reports (AER) on the EU fishing fleet, the fish processing and the aquaculture sectors. Proposals for improved contents and the overall structure were discussed. Templates for the national and EU overview chapters for the EU fish processing and aquaculture sectors were produced. Indicators for the EU fishing fleet and fish processing reports were reviewed; new indicators for the fish processing and aquaculture sector reports were proposed, and topics of special interest were proposed for all three reports.

  10. Neural Networks Methodology and Applications

    CERN Document Server

    Dreyfus, Gérard

    2005-01-01

    Neural networks represent a powerful data processing technique that has reached maturity and broad application. When clearly understood and appropriately used, they are a mandatory component in the toolbox of any engineer who wants to make the best use of the available data, in order to build models, make predictions, mine data, recognize shapes or signals, etc. Ranging from theoretical foundations to real-life applications, this book is intended to provide engineers and researchers with clear methodologies for taking advantage of neural networks in industrial, financial or banking applications, many instances of which are presented in the book. For the benefit of readers wishing to gain deeper knowledge of the topics, the book features appendices that provide theoretical details for greater insight, and algorithmic details for efficient programming and implementation. The chapters have been written by experts and seamlessly edited to present a coherent and comprehensive, yet not redundant, practically-oriented...

  11. Stakeholder analysis methodologies resource book

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, W.M.; Farhar, B.C.

    1994-03-01

    Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, "uncooperative" stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.

  12. Inventory differences: An evaluation methodology

    International Nuclear Information System (INIS)

    Heinberg, C.L.; Roberts, N.J.

    1987-01-01

    This paper discusses an evaluation methodology which is used for inventory differences at the Los Alamos National Laboratory. It is recognized that there are various methods which can be, and are being, used to evaluate process inventory differences at DOE facilities. The purpose of this paper is to share our thoughts on the subject and our techniques with those who are responsible for the evaluation of inventory differences at their facility. One of the most dangerous aspects of any evaluation technique, especially one as complex as most inventory difference evaluations tend to be, is to fail to look at the tools being used as indicators. There is a tendency to look at the results of an evaluation by one technique as an absolute. At the Los Alamos National Laboratory, several tools are used and the final evaluation is based on a combination of the observed results of a many-faceted evaluation. The tools used and some examples are presented

  13. Methodology of formal software evaluation

    International Nuclear Information System (INIS)

    Tuszynski, J.

    1998-01-01

    Sydkraft AB, the major Swedish utility, owner of ca 6000 MW el installed in nuclear (NPP Barsebaeck and NPP Oskarshamn), fossil fuel and hydro power plants, is facing modernization of the control systems of the plants. Applicable standards require structured, formal methods for implementation of the control functions in modern, real-time software systems. This presentation introduces implementation methodology as discussed presently at the Sydkraft organisation. The approach suggested is based upon the process of co-operation of three parties taking part in the implementation: owner of the plant, vendor and Quality Assurance (QA) organisation. QA will be based on tools for formal software validation and on systematic gathering by the owner of validated and proved-by-operation control modules for the concern-wide utilisation. (author)

  14. Tokamak pump limiters

    International Nuclear Information System (INIS)

    Conn, R.W.

    1984-05-01

    Recent experiments with a scoop limiter without active internal pumping have been carried out in the PDX tokamak with up to 6MW of auxiliary neutral beam heating. Experiments have also been done with a rotating head pump limiter in the PLT tokamak in conjunction with RF plasma heating. Extensive experiments have been done in the ISX-B tokamak and first experiments have been completed with the ALT-I limiter in TEXTOR. The pump limiter modules in these latter two machines have internal getter pumping. Experiments in ISX-B are with ohmic and auxiliary neutral beam heating. The results in ISX-B and TEXTOR show that active density control and particle removal is achieved with pump limiters. In ISX-B, the boundary layer (or scrape-off layer) plasma partially screens the core plasma from gas injection. In both ISX-B and TEXTOR, the pressure internal to the module scales linearly with plasma density but in ISX-B, with neutral beam injection, a nonlinear increase is observed at the highest densities studied. Plasma plugging is the suspected cause. Results from PDX suggest that a regime may exist in which core plasma energy confinement improves using a pump limiter during neutral beam injection. Asymmetric radial profiles and an increased edge electron temperature are observed in discharges with improved confinement. The injection of small amounts of neon into ISX-B has more clearly shown an improved electron core energy confinement during neutral beam injection. While carried out with a regular limiter, this Z-mode of operation is ideal for use with pump limiters and should be a way to achieve energy confinement times similar to values for H-mode tokamak plasmas. The implication of all these results for the design of a reactor pump limiter is described

  15. Tokamak pump limiters

    International Nuclear Information System (INIS)

    Conn, R.W.; California Univ., Los Angeles

    1984-01-01

    Recent experiments with a scoop limiter without active internal pumping have been carried out in the PDX tokamak with up to 6 MW of auxiliary neutral beam heating. Experiments have also been performed with a rotating head pump limiter in the PLT tokamak in conjunction with RF plasma heating. Extensive experiments have been done in the ISX-B tokamak and first experiments have been completed with the ALT-I limiter in TEXTOR. The pump limiter modules in these latter two machines have internal getter pumping. Experiments in ISX-B are with ohmic and auxiliary neutral beam heating. The results in ISX-B and TEXTOR show that active density control and particle removal is achieved with pump limiters. In ISX-B, the boundary layer (or scrape-off layer) plasma partially screens the core plasma from gas injection. In both ISX-B and TEXTOR, the pressure internal to the module scales linearly with plasma density but in ISX-B, with neutral beam injection, a nonlinear increase is observed at the highest densities studied. Plasma plugging is the suspected cause. Results from PDX suggest that a regime may exist in which core plasma energy confinement improves using a pump limiter during neutral beam injection. Asymmetric radial profiles and an increased edge electron temperature are observed in discharges with improved confinement. The injection of small amounts of neon into ISX-B has more clearly shown an improved electron core energy confinement during neutral beam injection. While carried out with a regular limiter, this 'Z-mode' of operation is ideal for use with pump limiters and should be a way to achieve energy confinement times similar to values for H-mode tokamak plasmas. The implication of all these results for the design of a reactor pump limiter is described. (orig.)

  16. Application of the BEPU methodology to assess fuel performance in dry storage

    International Nuclear Information System (INIS)

    Feria, F.; Herranz, L.E.

    2017-01-01

    Highlights: • Application of the BEPU methodology to estimate the cladding stress in dry storage. • The stress predicted is notably affected by the irradiation history. • Improvements of FGR modelling would significantly enhance the stress estimates. • The prediction uncertainty should not be disregarded when assessing clad integrity. - Abstract: The stress to which fuel cladding is subjected in dry storage is the driving force of the main degrading mechanisms postulated (i.e., embrittlement due to hydrides radial reorientation and creep). Therefore, a sound assessment is mandatory to reliably predict fuel performance under the dry storage prevailing conditions. Through fuel rod thermo-mechanical codes, best estimate calculations can be conducted. Precision of predictions depends on uncertainties affecting the way of calculating the stress, so by using uncertainty analysis an upper bound of stress can be determined and compared to safety limits set. The present work shows the application of the BEPU (Best Estimate Plus Uncertainty) methodology in this field. Concretely, hydrides radial reorientation has been assessed based on stress predictions under challenging thermal conditions (400 °C) and a stress limit of 90 MPa. The computational tools used to do that are FRAPCON-3xt (best estimate) and Dakota (uncertainty analysis). The methodology has been applied to a typical PWR fuel rod highly irradiated (65 GWd/tU) at different power histories. The study performed allows concluding that both the power history and the prediction uncertainty should not be disregarded when fuel rod integrity is evaluated in dry storage. On probabilistic bases, a burnup of 60 GWd/tU is found to be an acceptable threshold even in the most challenging irradiation conditions considered.
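The BEPU workflow described above (a best-estimate calculation wrapped in uncertainty propagation, with an upper bound compared against the stress limit) can be sketched with a minimal Monte Carlo example. The hoop-stress surrogate, the input distributions, and the sample size below are illustrative assumptions, not the FRAPCON-3xt/Dakota setup used in the study.

```python
import random

def clad_stress(internal_pressure_mpa, geometry_factor):
    # Hypothetical thin-wall hoop-stress surrogate: stress scales with
    # rod internal pressure and a radius-to-thickness factor. Illustrative only.
    return internal_pressure_mpa * geometry_factor

def bepu_upper_bound(n_samples=1000, limit_mpa=90.0, seed=42):
    """BEPU sketch: propagate input uncertainty by Monte Carlo and
    compare a one-sided 95th-percentile stress to the stress limit."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        p = rng.gauss(7.0, 0.5)    # internal pressure, MPa (assumed distribution)
        g = rng.gauss(10.0, 0.8)   # geometry factor r/t (assumed distribution)
        samples.append(clad_stress(p, g))
    samples.sort()
    p95 = samples[int(0.95 * n_samples)]
    return p95, p95 < limit_mpa
```

Usage: `p95, within_limit = bepu_upper_bound()` returns the upper-bound stress estimate and whether it stays below the 90 MPa limit.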

  17. Butterfly valve torque prediction methodology

    International Nuclear Information System (INIS)

    Eldiwany, B.H.; Sharma, V.; Kalsi, M.S.; Wolfe, K.

    1994-01-01

    As part of the Motor-Operated Valve (MOV) Performance Prediction Program, the Electric Power Research Institute has sponsored the development of methodologies for predicting thrust and torque requirements of gate, globe, and butterfly MOVs. This paper presents the methodology that will be used by utilities to calculate the dynamic torque requirements for butterfly valves. The total dynamic torque at any disc position is the sum of the hydrodynamic torque, bearing torque (which is induced by the hydrodynamic force), as well as other small torque components (such as packing torque). The hydrodynamic torque on the valve disc, caused by the fluid flow through the valve, depends on the disc angle, flow velocity, upstream flow disturbances, disc shape, and the disc aspect ratio. The butterfly valve model provides sets of nondimensional flow and torque coefficients that can be used to predict flow rate and hydrodynamic torque throughout the disc stroke and to calculate the required actuation torque and the maximum transmitted torque throughout the opening and closing stroke. The scope of the model includes symmetric and nonsymmetric discs of different shapes and aspect ratios in compressible and incompressible fluid applications under both choked and nonchoked flow conditions. The model features were validated against test data from a comprehensive flow-loop and in situ test program. These tests were designed to systematically address the effect of the following parameters on the required torque: valve size, disc shapes and disc aspect ratios, upstream elbow orientation and its proximity, and flow conditions. The applicability of the nondimensional coefficients to valves of different sizes was validated by performing tests on a 42-in. valve and a precisely scaled 6-in. model. The butterfly valve model torque predictions were found to bound test data from the flow-loop and in situ testing, as shown in the examples provided in this paper
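The torque decomposition described above can be sketched as follows. The nondimensional form of the hydrodynamic torque and the bearing-friction model are common textbook assumptions, not the EPRI correlations themselves.

```python
def hydrodynamic_torque(c_t, rho, v, d):
    """Hydrodynamic torque recovered from a nondimensional coefficient:
    T_h = C_t * (rho * v**2 / 2) * d**3 (one common nondimensional form;
    the EPRI model's exact correlation is not reproduced here)."""
    return c_t * 0.5 * rho * v**2 * d**3

def bearing_torque(mu, hydrodynamic_force, shaft_radius):
    # Friction torque at the shaft bearing, induced by the hydrodynamic load.
    return mu * hydrodynamic_force * shaft_radius

def total_dynamic_torque(t_hydro, t_bearing, t_packing):
    # The total dynamic torque at a disc position is the sum of its components.
    return t_hydro + t_bearing + t_packing
```

For example, with assumed values C_t = 0.1, water at 1000 kg/m³ flowing at 3 m/s through a 0.15 m disc, the hydrodynamic component is about 1.52 N·m.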

  18. System Anthropological Psychology: Methodological Foundations

    Directory of Open Access Journals (Sweden)

    Vitaliy Y. Klochko

    2012-01-01

    Full Text Available The article considers methodological foundations of the system anthropological psychology (SAP) as a scientific branch developed by a well-represented group of Siberian scientists. SAP is a theory based on axiomatics of the cultural-historical psychology of L.S. Vygotsky and transspective analysis as a specially developed means to define the tendencies of science developing as a self-organizing system. Transspective analysis has revealed regularities in a constantly growing complexity of professional-psychological thinking along the course of emergence of scientific cognition. It has proved that the field of modern psychology is shaped by theories constructed with ideation of different grades of complexity. The concept “dynamics of the paradigm of science” is introduced; it allows transitions to be acknowledged from ordinary-binary logic characteristics of the classical science to a binary-ternary logic, adequate to non-classical science, and then to a ternary-multidimensional logic, which is now at the stage of emergence. The latter is employed in SAP construction. It involves the following basic methodological principles: the principle of directed (selective) interaction and the principle of generative effect of selective interaction. The concept of “complementary interaction” applied in natural as well as humanitarian sciences is reconsidered in the context of psychology. The conclusion is made that the principle of selectivity and directedness of interaction is relevant to the whole Universe embracing all kinds of systems including the living ones. Different levels of matter organization representing semantic structures of various complexity use one and the same principle of meaning making through which the Universe ensures its sustainability as a self-developing phenomenon. This methodology provides an explanation for nature and stages of emergence of multidimensional life space of an individual, which comes as a foundation for generation of such features of

  19. The Methodological Dynamism of Grounded Theory

    Directory of Open Access Journals (Sweden)

    Nicholas Ralph

    2015-11-01

    Full Text Available Variations in grounded theory (GT interpretation are the subject of ongoing debate. Divergences of opinion, genres, approaches, methodologies, and methods exist, resulting in disagreement on what GT methodology is and how it comes to be. From the postpositivism of Glaser and Strauss, to the symbolic interactionist roots of Strauss and Corbin, through to the constructivism of Charmaz, the field of GT methodology is distinctive in the sense that those using it offer new ontological, epistemological, and methodological perspectives at specific moments in time. We explore the unusual dynamism attached to GT’s underpinnings. Our view is that through a process of symbolic interactionism, in which generations of researchers interact with their context, moments are formed and philosophical perspectives are interpreted in a manner congruent with GT’s essential methods. We call this methodological dynamism, a process characterized by contextual awareness and moment formation, contemporaneous translation, generational methodology, and methodological consumerism.

  20. Modeling methodology for a CMOS-MEMS electrostatic comb

    Science.gov (United States)

    Iyer, Sitaraman V.; Lakdawala, Hasnain; Mukherjee, Tamal; Fedder, Gary K.

    2002-04-01

    A methodology for combined modeling of capacitance and force in a multi-layer electrostatic comb is demonstrated in this paper. Conformal mapping-based analytical methods are limited to 2D symmetric cross-sections and cannot account for charge concentration effects at corners. Vertex capacitance can be more than 30% of the total capacitance in a single-layer 2 micrometers thick comb with 10 micrometers overlap. Furthermore, analytical equations are strictly valid only for perfectly symmetrical finger positions. Fringing and corner effects are likely to be more significant in a multi-layered CMOS-MEMS comb because of the presence of more edges and vertices. Vertical curling of CMOS-MEMS comb fingers may also lead to reduced capacitance and vertical forces. Gyroscopes are particularly sensitive to such undesirable forces, which therefore, need to be well-quantified. In order to address the above issues, a hybrid approach of superposing linear regression models over a set of core analytical models is implemented. Design of experiments is used to obtain data for capacitance and force using a commercial 3D boundary-element solver. Since accurate force values require significantly higher mesh refinement than accurate capacitance, we use numerical derivatives of capacitance values to compute the forces. The model is formulated such that the capacitance and force models use the same regression coefficients. The comb model thus obtained, fits the numerical capacitance data to within +/- 3% and force to within +/- 10%. The model is experimentally verified by measuring capacitance change in a specially designed test structure. The capacitance model matches measurements to within 10%. The comb model is implemented in an Analog Hardware Description Language (AHDL) for use in behavioral simulation of manufacturing variations in a CMOS-MEMS gyroscope.
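The step of differentiating tabulated capacitance to obtain electrostatic force (F = ½ V² dC/dx) can be sketched with central differences. The function below is a minimal illustration of that numerical-derivative idea, not the paper's regression model.

```python
def electrostatic_force(capacitances, positions, voltage):
    """Electrostatic force from the numerical derivative of capacitance:
    F = 0.5 * V**2 * dC/dx, using central differences on tabulated C(x).
    Returns one force value per interior sample point."""
    forces = []
    for i in range(1, len(positions) - 1):
        dc_dx = (capacitances[i + 1] - capacitances[i - 1]) / (
            positions[i + 1] - positions[i - 1]
        )
        forces.append(0.5 * voltage**2 * dc_dx)
    return forces
```

For a capacitance that grows linearly with overlap (e.g. 1 nF per metre of travel, a made-up slope), the force at 10 V is constant at 5e-8 N, as expected for a comb drive in its linear regime.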

  1. 50 Years of coastal erosion analysis: A new methodological approach.

    Science.gov (United States)

    Prieto Campos, Antonio; Diaz Cuevas, Pilar; Ojeda zujar, Jose; Guisado-Pintado, Emilia

    2017-04-01

    Coasts over the world have been subjected to increased anthropogenic pressures which, combined with the impacts of natural hazards (storm events, rising sea levels), have led to severe erosion problems with negative impacts on the economy and the safety of coastal communities. The Andalusian coast (South Spain) is a renowned global tourist destination. In the past decades a deep transformation in the economic model led to significant land use changes: strong regulation of rivers, urbanisation and occupation of dunes, among others. As a result, irreversible transformations of the coastline, from the aggressive urbanisation undertaken, are now to be faced by local authorities and suffered by locals and visitors. Moreover, the expected impacts derived from climate change, aggravated by anthropic activities, emphasise the need for tools that facilitate decision making for sustainable coastal management. In this contribution a homogeneous methodology (a single proxy and a single photointerpreter) is proposed for the calculation of coastal erosion rates of exposed beaches in Andalusia (640 km) through the use of detailed series (1:2500) of open source orthophotographs for the period 1956-1977-2001-2011. The combination of the traditional software DSAS (Digital Shoreline Analysis System) with a spatial database (PostgreSQL) which integrates the resulting erosion rates with related coastal thematic information (geomorphology, presence of engineering infrastructures, dunes and ecosystems) enhances the capacity of analysis and exploitation. Further, the homogeneity of the method used allows the comparison of results among years on a highly diverse coast, with both Mediterranean and Atlantic façades. The novel development and integration of a PostgreSQL/PostGIS database facilitates the exploitation of the results by the user (for instance by relating calculated rates with other thematic information such as the geomorphology of the coast or the presence of a dune field on
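A shoreline-change rate of the kind DSAS computes can be sketched as an End Point Rate: shoreline displacement divided by elapsed time in years. This is a minimal illustration of the statistic, not the DSAS/PostGIS pipeline described above, and the positions in the usage example are made up.

```python
from datetime import date

def end_point_rate(pos_old_m, date_old, pos_new_m, date_new):
    """End Point Rate (EPR): displacement along a transect (metres,
    measured seaward from a baseline) divided by elapsed years.
    Negative values indicate erosion (landward retreat)."""
    years = (date_new - date_old).days / 365.25
    return (pos_new_m - pos_old_m) / years
```

For instance, a transect whose shoreline intercept moved from 100 m to 45 m seaward of the baseline between 1956 and 2011 yields an EPR of about -1.0 m/yr.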

  2. Using Six Sigma and Lean methodologies to improve OR throughput.

    Science.gov (United States)

    Fairbanks, Catharine B

    2007-07-01

    Improving patient flow in the perioperative environment is challenging, but it has positive implications for both staff members and for the facility. One facility in Vermont improved patient throughput by incorporating Six Sigma and Lean methodologies for patients undergoing elective procedures. The results of the project were significantly improved patient flow and increased teamwork and pride among perioperative staff members. (c) AORN, Inc, 2007.

  3. Can Oxygen Set Thermal Limits in an Insect and Drive Gigantism?

    Science.gov (United States)

    Verberk, Wilco C. E. P.; Bilton, David T.

    2011-01-01

    Background Thermal limits may arise through a mismatch between oxygen supply and demand in a range of animal taxa. Whilst this oxygen limitation hypothesis is supported by data from a range of marine fish and invertebrates, its generality remains contentious. In particular, it is unclear whether oxygen limitation determines thermal extremes in tracheated arthropods, where oxygen limitation may be unlikely due to the efficiency and plasticity of tracheal systems in supplying oxygen directly to metabolically active tissues. Although terrestrial taxa with open tracheal systems may not be prone to oxygen limitation, species may be affected during other life-history stages, particularly if these rely on diffusion into closed tracheal systems. Furthermore, a central role for oxygen limitation in insects is envisaged within a parallel line of research focussing on insect gigantism in the late Palaeozoic. Methodology/Principal Findings Here we examine thermal maxima in the aquatic life stages of an insect at normoxia, hypoxia (14 kPa) and hyperoxia (36 kPa). We demonstrate that upper thermal limits do indeed respond to external oxygen supply in the aquatic life stages of the stonefly Dinocras cephalotes, suggesting that the critical thermal limits of such aquatic larvae are set by oxygen limitation. This could result from impeded oxygen delivery, or limited oxygen regulatory capacity, both of which have implications for our understanding of the limits to insect body size and how these are influenced by atmospheric oxygen levels. Conclusions/Significance These findings extend the generality of the hypothesis of oxygen limitation of thermal tolerance, suggest that oxygen constraints on body size may be stronger in aquatic environments, and that oxygen toxicity may have actively selected for gigantism in the aquatic stages of Carboniferous arthropods. PMID:21818347

  4. Can oxygen set thermal limits in an insect and drive gigantism?

    Directory of Open Access Journals (Sweden)

    Wilco C E P Verberk

    Full Text Available BACKGROUND: Thermal limits may arise through a mismatch between oxygen supply and demand in a range of animal taxa. Whilst this oxygen limitation hypothesis is supported by data from a range of marine fish and invertebrates, its generality remains contentious. In particular, it is unclear whether oxygen limitation determines thermal extremes in tracheated arthropods, where oxygen limitation may be unlikely due to the efficiency and plasticity of tracheal systems in supplying oxygen directly to metabolically active tissues. Although terrestrial taxa with open tracheal systems may not be prone to oxygen limitation, species may be affected during other life-history stages, particularly if these rely on diffusion into closed tracheal systems. Furthermore, a central role for oxygen limitation in insects is envisaged within a parallel line of research focussing on insect gigantism in the late Palaeozoic. METHODOLOGY/PRINCIPAL FINDINGS: Here we examine thermal maxima in the aquatic life stages of an insect at normoxia, hypoxia (14 kPa) and hyperoxia (36 kPa). We demonstrate that upper thermal limits do indeed respond to external oxygen supply in the aquatic life stages of the stonefly Dinocras cephalotes, suggesting that the critical thermal limits of such aquatic larvae are set by oxygen limitation. This could result from impeded oxygen delivery, or limited oxygen regulatory capacity, both of which have implications for our understanding of the limits to insect body size and how these are influenced by atmospheric oxygen levels. CONCLUSIONS/SIGNIFICANCE: These findings extend the generality of the hypothesis of oxygen limitation of thermal tolerance, suggest that oxygen constraints on body size may be stronger in aquatic environments, and that oxygen toxicity may have actively selected for gigantism in the aquatic stages of Carboniferous arthropods.

  5. HUD Program Income Limits

    Data.gov (United States)

    Department of Housing and Urban Development — Income limits used to determine the income eligibility of applicants for assistance under three programs authorized by the National Housing Act. These programs are...

  6. Limited Income and Resources

    Data.gov (United States)

    U.S. Department of Health & Human Services — Information for those with limited income and resources (those who may qualify for or already have the Low Income Subsidy to lower their prescription drug coverage...

  7. SIS - Annual Catch Limit

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Annual Catch Limit (ACL) dataset within the Species Information System (SIS) contains information and data related to management reference points and catch data.

  8. Limited Denial of Participation

    Data.gov (United States)

    Department of Housing and Urban Development — A Limited Denial of Participation (LDP) is an action taken by a HUD Field Office or the Deputy Assistant Secretary for Single Family (DASSF) or Multifamily (DASMF)...

  9. Towards Improved Optical Limiters

    National Research Council Canada - National Science Library

    Huffman, Peter

    2002-01-01

    .... The first approach was to synthesize and study soluble thallium phthalocyanines. Thallium, due to its proximity to lead and indium on the periodic table, should exhibit favorable optical limiting properties...

  10. ACA Federal Upper Limits

    Data.gov (United States)

    U.S. Department of Health & Human Services — Affordable Care Act Federal Upper Limits (FUL) based on the weighted average of the most recently reported monthly average manufacturer price (AMP) for...

  11. HOME Rent Limits

    Data.gov (United States)

    Department of Housing and Urban Development — In accordance with 24 CFR Part 92.252, HUD provides maximum HOME rent limits. The maximum HOME rents are the lesser of: The fair market rent for existing housing for...

  12. Limit lines for risk

    International Nuclear Information System (INIS)

    Cox, D.C.; Baybutt, P.

    1982-01-01

    Approaches to the regulation of risk from technological systems, such as nuclear power plants or chemical process plants, in which potential accidents may result in a broad range of adverse consequences must take into account several different aspects of risk. These include overall or average risk, accidents posing high relative risks, the rate at which accident probability decreases with increasing accident consequences, and the impact of high frequency, low consequence accidents. A hypothetical complementary cumulative distribution function (CCDF), with appropriately chosen parametric form, meets all these requirements. The Farmer limit line, by contrast, places limits on the risks due to individual accident sequences, and cannot adequately account for overall risk. This reduces its usefulness as a regulatory tool. In practice, the CCDF is used in the Canadian nuclear licensing process, while the Farmer limit line approach, supplemented by separate qualitative limits on overall risk, is employed in the United Kingdom
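The CCDF mentioned above can be sketched directly from a set of accident consequence/frequency pairs: for each consequence magnitude, it gives the total frequency of accidents at least that severe. This minimal illustration assumes frequencies are additive across independent accident classes; the numbers in the test are made up.

```python
def ccdf(consequences, frequencies):
    """Complementary cumulative distribution function of accident
    consequences: for each magnitude x, the total frequency (per year)
    of accidents with consequence >= x. Returns sorted magnitudes and
    the corresponding exceedance frequencies."""
    pairs = sorted(zip(consequences, frequencies))
    magnitudes = [c for c, _ in pairs]
    exceedance = []
    total = 0.0
    for _, f in reversed(pairs):  # cumulative sum from the worst accident down
        total += f
        exceedance.append(total)
    exceedance.reverse()
    return magnitudes, exceedance
```

A regulatory limit line would then be a curve in the same (consequence, exceedance frequency) plane that the computed CCDF must stay below.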

  13. Generalized Geometric Quantum Speed Limits

    Directory of Open Access Journals (Sweden)

    Diego Paiva Pires

    2016-06-01

    Full Text Available The attempt to gain a theoretical understanding of the concept of time in quantum mechanics has triggered significant progress towards the search for faster and more efficient quantum technologies. One such advance consists in the interpretation of the time-energy uncertainty relations as lower bounds for the minimal evolution time between two distinguishable states of a quantum system, also known as quantum speed limits. We investigate how the nonuniqueness of a bona fide measure of distinguishability defined on the quantum-state space affects the quantum speed limits and can be exploited in order to derive improved bounds. Specifically, we establish an infinite family of quantum speed limits valid for unitary and nonunitary evolutions, based on an elegant information geometric formalism. Our work unifies and generalizes existing results on quantum speed limits and provides instances of novel bounds that are tighter than any established one based on the conventional quantum Fisher information. We illustrate our findings with relevant examples, demonstrating the importance of choosing different information metrics for open system dynamics, as well as clarifying the roles of classical populations versus quantum coherences, in the determination and saturation of the speed limits. Our results can find applications in the optimization and control of quantum technologies such as quantum computation and metrology, and might provide new insights in fundamental investigations of quantum thermodynamics.
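For reference, the two classic members of the family of bounds this abstract generalizes are the Mandelstam-Tamm and Margolus-Levitin limits on the minimal time to evolve between two orthogonal states, expressed via the energy variance ΔE and the mean energy ⟨E⟩ above the ground state:

```latex
% Standard textbook forms (not the improved bounds derived in the paper):
\tau \;\ge\; \max\!\left\{ \frac{\pi\hbar}{2\,\Delta E},\; \frac{\pi\hbar}{2\,\langle E\rangle} \right\}
```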

  14. A systematic review on diagnostic accuracy of CT-based detection of significant coronary artery disease

    International Nuclear Information System (INIS)

    Janne d'Othee, Bertrand; Siebert, Uwe; Cury, Ricardo; Jadvar, Hossein; Dunn, Edward J.; Hoffmann, Udo

    2008-01-01

    Objectives: Systematic review of diagnostic accuracy of contrast enhanced coronary computed tomography (CE-CCT). Background: Noninvasive detection of coronary artery stenosis (CAS) by CE-CCT as an alternative to catheter-based coronary angiography (CCA) may improve patient management. Methods: Forty-one articles published between 1997 and 2006 were included that evaluated native coronary arteries for significant stenosis and used CE-CCT as diagnostic test and CCA as reference standard. Study group characteristics, study methodology and diagnostic outcomes were extracted. Pooled summary sensitivity and specificity of CE-CCT were calculated using a random effects model (1) for all coronary segments, (2) assessable segments, and (3) per patient. Results: The 41 studies totaled 2515 patients (75% males; mean age: 59 years, CAS prevalence: 59%). Analysis of all coronary segments yielded a sensitivity of 95% (80%, 89%, 86%, 98% for electron beam CT, 4/8-slice, 16-slice and 64-slice MDCT, respectively) for a specificity of 85% (77%, 84%, 95%, 91%). Analysis limited to segments deemed assessable by CT showed sensitivity of 96% (86%, 85%, 98%, 97%) for a specificity of 95% (90%, 96%, 96%, 96%). Per patient, sensitivity was 99% (90%, 97%, 99%, 98%) and specificity was 76% (59%, 81%, 83%, 92%). Heterogeneity was quantitatively important but not explainable by patient group characteristics or study methodology. Conclusions: Current diagnostic accuracy of CE-CCT is high. Advances in CT technology have resulted in increases in diagnostic accuracy and proportion of assessable coronary segments. However, per patient, accuracy may be lower and CT may have more limited clinical utility in populations at high risk for CAD
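The per-segment sensitivity and specificity pooled above follow the standard definitions from the 2x2 diagnostic table; the counts in the usage note are illustrative, not data from the review.

```python
def sensitivity(true_positives, false_negatives):
    # Proportion of truly stenotic segments correctly flagged by CT.
    return true_positives / (true_positives + false_negatives)

def specificity(true_negatives, false_positives):
    # Proportion of disease-free segments correctly cleared by CT.
    return true_negatives / (true_negatives + false_positives)
```

For example, 95 detected out of 100 stenotic segments gives a sensitivity of 0.95, and 85 cleared out of 100 disease-free segments gives a specificity of 0.85.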

  15. Adjusting estimative prediction limits

    OpenAIRE

    Masao Ueki; Kaoru Fueda

    2007-01-01

    This note presents a direct adjustment of the estimative prediction limit to reduce the coverage error from a target value to third-order accuracy. The adjustment is asymptotically equivalent to those of Barndorff-Nielsen & Cox (1994, 1996) and Vidoni (1998). It has a simpler form with a plug-in estimator of the coverage probability of the estimative limit at the target value. Copyright 2007, Oxford University Press.
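The "estimative" limit being adjusted is the naive plug-in limit that ignores parameter-estimation uncertainty. A minimal Monte Carlo sketch for a normal model (all settings hypothetical) shows why an adjustment is needed: the plug-in limit falls short of its nominal coverage for small samples.

```python
import random
import statistics
from statistics import NormalDist

def estimative_limit(sample, alpha=0.95):
    """Naive 'estimative' upper prediction limit: plug the sample mean and
    SD into the normal quantile, ignoring estimation uncertainty."""
    mu = statistics.mean(sample)
    sd = statistics.stdev(sample)
    return mu + NormalDist().inv_cdf(alpha) * sd

def coverage(n=10, alpha=0.95, reps=20000, seed=1):
    """Monte Carlo coverage of the estimative limit for N(0, 1) data."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        sample = [rng.gauss(0, 1) for _ in range(n)]
        future = rng.gauss(0, 1)  # the value to be predicted
        if future <= estimative_limit(sample, alpha):
            hits += 1
    return hits / reps

print(coverage())  # noticeably below the 0.95 target for small n
```

The adjustments discussed in the note correct this coverage error to third-order accuracy; the sketch above only exhibits the error the naive limit incurs.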

  16. Smoothness of limit functors

    Indian Academy of Sciences (India)

Abstract. Let S be a scheme. Assume that we are given an action of the one-dimensional split torus Gm,S on a smooth affine S-scheme X. We consider the limit (also called attractor) subfunctor Xλ consisting of points whose orbit under the given action 'admits a limit at 0'. We show that Xλ is representable by a smooth ...

  17. Safety and design limits

    International Nuclear Information System (INIS)

    Shishkov, L. K.; Gorbaev, V. A.; Tsyganov, S. V.

    2007-01-01

The paper touches upon the issues of ensuring NPP safety at the stage of fuel load design and operation by applying special limitations to a series of parameters, that is, design limits. The following two approaches are compared: the one used by Western specialists for the PWR reactor and the Russian approach employed for the WWER reactor. The closeness of the approaches is established, and differences, which are mainly peculiarities of terminology, are noted (Authors)

  18. METHODOLOGY OF PROFESSIONAL PEDAGOGICAL EDUCATION: THEORY AND PRACTICE (THEORETICAL AND METHODOLOGICAL FOUNDATIONS OF VOCATIONAL TEACHER EDUCATION

    Directory of Open Access Journals (Sweden)

    E. M. Dorozhkin

    2014-01-01

analysis of methodology, taking into consideration the target orientation, principles and approaches to the organization and implementation of scientific and educational activities. The formation of a qualification structure for teachers' vocational training and the provision of advance principles of education are considered to be the most important conditions for the development of vocational teacher education. Scientific novelty. The research presents a project for the further development of vocational teacher education in post-industrial society. Pedagogical innovations transforming research findings into educational practice are considered to be the main tool for integrating methodological means. Practical significance. The research findings highlight the proposed reforms for the further development of the teacher training system in vocational institutes, which is in need of drastic restructuring. In the final part of the article the authors recommend some specific issues that can be discussed at a methodological workshop.

  19. Altruism and Reproductive Limitations

    Directory of Open Access Journals (Sweden)

    Carey J. Fitzgerald

    2009-04-01

We examined how different types of reproductive limitation, namely functional (schizoid personality disorder and schizophrenia), physical (malnutrition), and sexual (bisexuality and homosexuality), influenced altruistic intentions toward hypothetical target individuals of differing degrees of relatedness (r = 0, .25, and .50). Participants were 312 undergraduate students who completed a questionnaire on altruism toward hypothetical friends, half-siblings, and siblings with these different types of reproductive limitation. Genetic relatedness and reproductive limitations did not influence altruistic decision-making when the cost of altruism was low, but did as the cost of altruism increased, with participants being more likely to help a sibling over a half-sibling and a half-sibling over a friend. Participants also indicated they were more likely to help a healthy (control) person over people with a reproductive limitation. Of the three types, functional limitations had the strongest effect on altruistic decision-making, indicating that people were less likely to help those who exhibit abnormal social behavior.

  20. Upper limit of peak area

    International Nuclear Information System (INIS)

    Helene, O.A.M.

    1982-08-01

The determination of the upper limit of a peak area in a multi-channel spectrum, at a known significance level, is discussed. This problem is especially important when the peak area is masked by the statistical fluctuations of the background. The problem is solved exactly, and thus the results are valid in experiments with a small number of events. The results are checked with a Monte Carlo test and applied to the 92Nb beta decay. (Author) [pt
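In modern terms, the exact solution referred to here corresponds to the Bayesian upper limit on a Poisson signal over a known background, the form also given in the PDG statistics review. A minimal sketch, assuming that formula and hypothetical counts, solves it by bisection:

```python
import math

def upper_limit(n_obs, background, cl=0.95):
    """Bayesian upper limit on a Poisson signal mean s with known mean
    background b, given n_obs observed counts. Solves (by bisection):
        1 - CL = exp(-s) * sum_{k<=n} (s+b)^k / k!  /  sum_{k<=n} b^k / k!
    """
    def tail(s):
        num = sum((s + background) ** k / math.factorial(k)
                  for k in range(n_obs + 1))
        den = sum(background ** k / math.factorial(k)
                  for k in range(n_obs + 1))
        return math.exp(-s) * num / den  # decreases monotonically in s

    lo, hi = 0.0, 10.0 * (n_obs + background + 10)
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if tail(mid) > 1 - cl:
            lo = mid  # tail still too large: limit lies above mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# e.g. 3 observed counts over an expected background of 2.5
print(round(upper_limit(3, 2.5), 2))
```

Note how a larger expected background tightens the limit for the same observed counts, which is exactly the regime where background fluctuations mask the peak area.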