WorldWideScience

Sample records for methodology include performance

  1. Physical activity and performance at school: A systematic review of the literature including a methodological quality assessment

    NARCIS (Netherlands)

    Singh, A.S.; Uijtdewilligen, L.; Twisk, J.W.; van Mechelen, W.; Chin A Paw, M.J.M.

    2012-01-01

    Objective: To describe the prospective relationship between physical activity and academic performance. Data Sources: Prospective studies were identified from searches in PubMed, PsycINFO, Cochrane Central, and SPORTDiscus from 1990 through 2010. Study Selection: We screened the titles and abstracts

  2. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is carried out by means of analysis methodologies specified for the performance-related design basis events (PRDBE). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequences would be no more severe than the normal service effects on the plant equipment. The performance analysis methodology, which systematizes the methods and procedures for analyzing the PRDBEs, is as follows. Based on the operation modes suited to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the control logic developed for each operation mode, the system thermal hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code must be consistent with the SMART design. This report presents the categories of PRDBEs chosen for each operation mode, the transitions among them, and the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. This report, which specifies the overall details of SMART performance analysis based on the current SMART design, can therefore be used as a guide for the detailed performance analysis

  3. Extending flood damage assessment methodology to include ...

    African Journals Online (AJOL)

    Optimal and sustainable flood plain management, including flood control, can only be achieved when the impacts of flood control measures are considered for both the man-made and natural environments, and the sociological aspects are fully considered. Until now, methods/models developed to determine the influences ...

  4. Energy performance assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Platzer, W.J. [Fraunhofer Inst. for Solar Energy Systems, Freiburg (Germany)]

    2006-01-15

    The energy performance of buildings is intimately connected to the energy performance of building envelopes. The better we understand the relation between the quality of the envelope and the energy consumption of the building, the better we can improve both. We have to consider not only heating but all service energies related to human comfort in the building, such as cooling, ventilation, and lighting. The complexity arising from this embracing approach is not to be underestimated. It is less and less possible to relate simple characteristic performance indicators of building envelopes (such as the U-value) to the overall energy performance. First, many more parameters (e.g. light transmittance) come into the picture, so we have to assess product quality in a multidimensional world. Secondly, buildings increasingly have to operate in a narrow optimum: for an old, badly insulated building all solar gains are useful, whereas for a high-performance building with very good insulation and heat recovery in the ventilation system, overheating becomes more likely. Thus we have to control the solar gains; sometimes we need high gains, sometimes low ones. Thirdly, the technology within the building, as well as the user patterns and interactions, influences the performance of a building envelope. The aim of this project within IEA Task 27 was to improve our knowledge of this complex situation and to give a principled approach to assessing the performance of the building envelope. The participants have contributed to this aim without pretending that we have reached the end. (au)

  5. Methodological challenges when doing research that includes ethnic minorities

    DEFF Research Database (Denmark)

    Morville, Anne-Le; Erlandsson, Lena-Karin

    2016-01-01

    minorities are included. Method: A thorough literature search yielded 21 articles obtained from the scientific databases PubMed, CINAHL, Web of Science and PsycINFO. Analysis followed Arksey and O’Malley’s framework for scoping reviews, applying content analysis. Results: The results showed methodological...

  6. Methodology for performing surveys for fixed contamination

    International Nuclear Information System (INIS)

    Durham, J.S.; Gardner, D.L.

    1994-10-01

    This report describes a methodology for performing instrument surveys for fixed contamination that can be used to support the release of material from radiological areas, including release to controlled areas and release from radiological control. The methodology, which is based on a fast scan survey and a series of statistical, fixed measurements, meets the requirements of the U.S. Department of Energy Radiological Control Manual (RadCon Manual) (DOE 1994) and DOE Order 5400.5 (DOE 1990) for surveys for fixed contamination and requires less time than a conventional scan survey. The confidence interval associated with the new methodology conforms to the draft national standard for surveys. The methodology that is presented applies only to surveys for fixed contamination. Surveys for removable contamination are not discussed, and the new methodology does not affect surveys for removable contamination
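    The fixed-measurement portion of such a survey can be illustrated with a short sketch. This is not the RadCon Manual procedure itself, just a minimal illustration of comparing an upper confidence bound on a series of fixed measurements against a release limit; the counts, units, and one-sided z-value below are assumed for the example.

```python
import math
import statistics

def release_decision(counts, limit, z=1.645):
    """Decide whether a surface can be released, based on a series of
    fixed measurements (hypothetical values in dpm/100 cm^2).

    The surface passes if the upper one-sided confidence bound on the
    mean measurement falls below the release limit.
    """
    mean = statistics.mean(counts)
    # Standard error of the mean from the sample standard deviation.
    sem = statistics.stdev(counts) / math.sqrt(len(counts))
    upper_bound = mean + z * sem
    return upper_bound < limit, upper_bound

# Six hypothetical fixed measurements against an assumed limit.
ok, bound = release_decision([120, 135, 110, 128, 140, 118], limit=1000)
```

    The statistical survey replaces a slow full-area scan with a fast scan plus a small number of fixed measurements, which is where the time saving in the abstract comes from.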

  7. Nuclear data evaluation methodology including estimates of covariances

    Directory of Open Access Journals (Sweden)

    Smith D.L.

    2010-10-01

    Evaluated nuclear data rather than raw experimental and theoretical information are employed in nuclear applications such as the design of nuclear energy systems. Therefore, the process by which such information is produced and ultimately used is of critical interest to the nuclear science community. This paper provides an overview of various contemporary methods employed to generate evaluated cross sections and related physical quantities such as particle emission angular distributions and energy spectra. The emphasis here is on data associated with neutron-induced reaction processes, with consideration of the uncertainties in these data, and on the more recent evaluation methods, e.g., those that are based on stochastic (Monte Carlo) techniques. There is no unique way to perform such evaluations, nor are nuclear data evaluators united in their opinions as to which methods are superior to the others in various circumstances. In some cases it is not critical which approaches are used as long as there is consistency and proper use is made of the available physical information. However, in other instances there are definite advantages to using particular methods as opposed to other options. Some of these distinctions are discussed in this paper and suggestions are offered regarding fruitful areas for future research in the development of evaluation methodology.
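    As a minimal illustration of the stochastic (Monte Carlo) approach mentioned above, the sketch below propagates assumed parameter uncertainties through a toy model and estimates the covariance of two output quantities. The model and numbers are invented for the example, not taken from any real evaluation.

```python
import random

def mc_covariance(model, param_means, param_sds, n_samples=5000, seed=1):
    """Estimate the covariance matrix of a vector-valued evaluated
    quantity by Monte Carlo sampling of the model parameters.
    `model` maps a parameter list to a list of output values
    (e.g. cross sections at several energies)."""
    rng = random.Random(seed)
    samples = [model([rng.gauss(m, s)
                      for m, s in zip(param_means, param_sds)])
               for _ in range(n_samples)]
    k = len(samples[0])
    means = [sum(s[i] for s in samples) / n_samples for i in range(k)]
    # Sample covariance between each pair of outputs.
    return [[sum((s[i] - means[i]) * (s[j] - means[j]) for s in samples)
             / (n_samples - 1) for j in range(k)] for i in range(k)]

# Toy "model": both outputs scale with the same parameter, so they
# come out strongly positively correlated.
cov = mc_covariance(lambda p: [2.0 * p[0], 3.0 * p[0]], [1.0], [0.1])
```

    Real evaluations sample many correlated model parameters; the off-diagonal terms of the resulting matrix are exactly the covariances the abstract refers to.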

  8. Advanced methodology for generation expansion planning including interconnected systems

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, M.; Yokoyama, R.; Yasuda, K. [Tokyo Metropolitan Univ. (Japan)]; Sasaki, H. [Hiroshima Univ. (Japan)]; Ogimoto, K. [Electric Power Development Co. Ltd., Tokyo (Japan)]

    1994-12-31

    This paper reviews an advanced methodology for generation expansion planning including interconnected systems, developed in Japan, with a focus on flexibility and efficiency in practical application. First, criteria for evaluating the flexibility of generation planning under uncertainty are introduced. Secondly, the flexible generation mix problem is formulated as a multi-objective optimization with more than two objective functions. The multi-objective problem is then transformed into a single-objective problem by the weighting method, to obtain a Pareto optimal solution, and solved by a dynamic programming technique. Thirdly, a new approach to the generation expansion planning of interconnected systems is presented, based on the Benders decomposition technique: the large-scale generation expansion problem is decomposed into a master problem, constituted by the general economic load dispatch problem, and several subproblems, each a smaller-scale generation expansion plan for an isolated system. Finally, a generation expansion plan solved by an artificial neural network is presented. In conclusion, the advantages and disadvantages of these methods from the viewpoint of flexibility and applicability to practical generation expansion planning are presented. (author) 29 refs., 10 figs., 4 tabs.
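    The weighting-method step described above can be sketched as follows. The plans, objective values, and weights are hypothetical, and exhaustive comparison stands in for the dynamic-programming solution used in the paper.

```python
def weighted_best_plan(plans, weights):
    """Scalarize a multi-objective generation-expansion choice with the
    weighting method: each plan carries a tuple of objective values
    (all to be minimized, e.g. cost and a flexibility penalty), and the
    weighted sum turns them into a single objective."""
    def score(objs):
        return sum(w * o for w, o in zip(weights, objs))
    # Return the plan with the smallest weighted-sum score.
    return min(((name, score(objs)) for name, objs in plans.items()),
               key=lambda t: t[1])

plans = {  # hypothetical (cost, flexibility penalty) pairs
    "coal-heavy": (100.0, 40.0),
    "mixed":      (110.0, 20.0),
    "gas-heavy":  (125.0, 10.0),
}
best, best_score = weighted_best_plan(plans, weights=(0.6, 0.4))
```

    Sweeping the weight vector traces out different Pareto optimal solutions, which is how the weighting method exposes the trade-off between the objectives.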

  9. An overview of performance assessment methodology

    International Nuclear Information System (INIS)

    Hongnian Jow

    2010-01-01

    The definition of performance assessment (PA) within the context of a geologic repository program is a post-closure safety assessment: a system analysis of the hazards associated with the facility and of the ability of the site and the design of the facility to provide the safety functions. Over the last few decades, PA methodology has been developed and applied to different waste disposal programs around the world. PA has been used in the safety analyses of waste disposal repositories for low-level waste, intermediate-level waste, and high-level waste including spent nuclear fuels. This paper provides an overview of the performance assessment methodology and gives examples of its applications for the Yucca Mountain Project. (authors)

  10. Technology Performance Level Assessment Methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, Jesse D.; Bull, Diana L.; Malins, Robert Joseph; Costello, Ronan Patrick; Babarit, Aurelien; Nielsen, Kim; Bittencourt Ferreira, Claudio; Kennedy, Ben; Dykes, Kathryn; Weber, Jochem

    2017-04-01

    The technology performance level (TPL) assessment can be applied at all technology development stages and associated technology readiness levels (TRLs). Even, and particularly, at low TRLs the TPL assessment is very effective, as it holistically considers the wide range of WEC attributes that determine the techno-economic performance potential of the WEC farm when fully developed for commercial operation. The TPL assessment also highlights potential showstoppers at the earliest possible stage of WEC technology development. Hence, the TPL assessment identifies the technology-independent “performance requirements.” To achieve a successful solution, the entirety of the performance requirements within the TPL must be considered because, in the end, all stakeholder needs must be met. The basis for performing a TPL assessment is the information provided in a dedicated format, the Technical Submission Form (TSF). The TSF requests from the WEC developer the information required to answer the questions posed in the TPL assessment document.

  11. Team Dynamics. Essays in the Sociology and Social Psychology of Sport Including Methodological and Epistemological Issues.

    Science.gov (United States)

    Lenk, Hans

    This document contains nine essays on the sociology and social psychology of team dynamics, including methodological and epistemological issues involved in such study. Essay titles are: (1) Conflict and Achievement in Top Athletic Teams--Sociometric Structures of Racing Eight Oar Crews; (2) Top Performance Despite Internal Conflict--An Antithesis…

  12. Methodology for NDA performance assessment

    International Nuclear Information System (INIS)

    Cuypers, M.; Franklin, M.; Guardini, S.

    1986-01-01

    In the framework of the R&D programme of the Joint Research Centre of the Commission of the European Communities, a considerable effort is being dedicated to the performance assessment of NDA techniques taking account of field conditions. Taking account of field conditions means using measurement samples of the size encountered in practice, and providing training that allows inspectors to design cost-efficient verification plans for the real situations encountered in the field. Special laboratory facilities referred to as PERLA are being constructed for this purpose. These facilities will be used for measurement experiments and for training. In this paper, performance assessment is discussed under the headings of measurement capability and in-field effectiveness. Considerable emphasis is given to the role of method-specific measurement error models. The authors outline the advantages of giving statistical error models a sounder basis in the physical phenomenology of the measurement method

  13. Shuttle TPS thermal performance and analysis methodology

    Science.gov (United States)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta-p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement of the high Delta-p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta-p gap heating, an analytical model for inner mold line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  14. How Six Sigma Methodology Improved Doctors' Performance

    Science.gov (United States)

    Zafiropoulos, George

    2015-01-01

    Six Sigma methodology was used in a District General Hospital to assess the effect of the introduction of an educational programme to limit unnecessary admissions. The performance of the doctors involved in the programme was assessed. Ishikawa Fishbone and 5 S's were initially used and Pareto analysis of their findings was performed. The results…
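    Pareto analysis of the Fishbone findings can be sketched as below; the causes and counts are hypothetical, and the 80% cutoff is the conventional "vital few" threshold used in Six Sigma Pareto charts.

```python
def pareto_analysis(causes, threshold=0.8):
    """Rank causes by frequency and return the 'vital few' that account
    for the first `threshold` share of all occurrences (the 80/20 rule
    behind a Pareto chart)."""
    total = sum(causes.values())
    ranked = sorted(causes.items(), key=lambda kv: kv[1], reverse=True)
    vital, cum = [], 0
    for cause, count in ranked:
        vital.append(cause)
        cum += count
        if cum / total >= threshold:
            break
    return vital

counts = {  # hypothetical causes of unnecessary admissions
    "no senior review": 45, "unclear criteria": 30,
    "missing test results": 15, "bed availability": 7, "other": 3,
}
vital_few = pareto_analysis(counts)
```

    Concentrating the educational programme on the vital few causes is precisely the prioritization step a Pareto chart is meant to support.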

  15. Methodology for quantitative evaluation of diagnostic performance

    International Nuclear Information System (INIS)

    Metz, C.

    1981-01-01

    Of the various approaches that might be taken to the diagnostic performance evaluation problem, Receiver Operating Characteristic (ROC) analysis holds great promise. The methodology is developed further toward a unified, objective, and meaningful approach to evaluating the usefulness of medical imaging procedures by considering statistical significance testing, optimal sequencing of correlated studies, and analysis of observer performance
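    A minimal ROC calculation, for flavor: the sketch below computes the area under the ROC curve from rating data using the rank (Mann-Whitney) formulation. The ratings and labels are invented for the example.

```python
def roc_auc(scores, labels):
    """Area under the ROC curve for a binary diagnostic task: the
    probability that a randomly chosen positive case scores higher
    than a randomly chosen negative case (ties count 1/2)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical reader confidence ratings (higher = more likely diseased)
# with 1 = diseased, 0 = healthy ground truth.
auc = roc_auc([0.9, 0.8, 0.7, 0.4, 0.3, 0.2], [1, 1, 0, 1, 0, 0])
```

    An AUC of 0.5 corresponds to guessing and 1.0 to perfect discrimination, which is why ROC analysis gives a threshold-independent summary of diagnostic performance.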

  16. A performance assessment methodology for low-level waste facilities

    International Nuclear Information System (INIS)

    Kozak, M.W.; Chu, M.S.Y.; Mattingly, P.A.

    1990-07-01

    A performance assessment methodology has been developed for use by the US Nuclear Regulatory Commission in evaluating license applications for low-level waste disposal facilities. This report provides a summary of background reports on the development of the methodology and an overview of the models and codes selected for the methodology. The overview includes discussions of the philosophy and structure of the methodology and a sequential procedure for applying the methodology. Discussions are provided of models and associated assumptions that are appropriate for each phase of the methodology, the goals of each phase, data required to implement the models, significant sources of uncertainty associated with each phase, and the computer codes used to implement the appropriate models. In addition, a sample demonstration of the methodology is presented for a simple conceptual model. 64 refs., 12 figs., 15 tabs

  17. Methodological approach to organizational performance improvement process

    OpenAIRE

    Buble, Marin; Dulčić, Želimir; Pavić, Ivan

    2017-01-01

    Organizational performance improvement is one of the fundamental enterprise tasks. This especially applies to the case when the term “performance improvement” implies efficiency improvement measured by indicators, such as ROI, ROE, ROA, or ROVA/ROI. Such tasks are very complex, requiring implementation by means of project management. In this paper, the authors propose a methodological approach to improving the organizational performance of a large enterprise.

  18. Methodological approach to organizational performance improvement process

    Directory of Open Access Journals (Sweden)

    Marin Buble

    2001-01-01

    Organizational performance improvement is one of the fundamental enterprise tasks. This especially applies to the case when the term “performance improvement” implies efficiency improvement measured by indicators, such as ROI, ROE, ROA, or ROVA/ROI. Such tasks are very complex, requiring implementation by means of project management. In this paper, the authors propose a methodological approach to improving the organizational performance of a large enterprise.

  19. Methodological approach to strategic performance optimization

    OpenAIRE

    Hell, Marko; Vidačić, Stjepan; Garača, Željko

    2009-01-01

    This paper presents a matrix approach to the measuring and optimization of organizational strategic performance. The proposed model is based on the matrix presentation of strategic performance, which follows the theoretical notions of the balanced scorecard (BSC) and strategy map methodologies, initially developed by Kaplan and Norton. Development of a quantitative record of strategic objectives provides an arena for the application of linear programming (LP), which is a mathematical tech...
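    The matrix measurement step (prior to any LP optimization) can be sketched as a weighted aggregation of objective scores across BSC perspectives; the weights and scores below are hypothetical.

```python
def strategic_performance(weights, scores):
    """Matrix form of a balanced-scorecard measurement: scores[i][j] is
    the achievement of objective j in perspective i, weights[i][j] its
    weight; the overall index is the weighted sum with weights
    normalized to 1."""
    total_w = sum(w for row in weights for w in row)
    return sum(w * s
               for wr, sr in zip(weights, scores)
               for w, s in zip(wr, sr)) / total_w

# Hypothetical scorecard: rows = perspectives (e.g. financial,
# customer), columns = strategic objectives, scores on a 0-100 scale.
index = strategic_performance(
    weights=[[2.0, 1.0], [1.0, 1.0]],
    scores=[[80.0, 60.0], [70.0, 90.0]],
)
```

    In the paper's optimization step, linear programming then chooses where to invest effort so that an index like this is maximized subject to resource constraints.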

  20. Improved best estimate plus uncertainty methodology, including advanced validation concepts, to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, C.; Williams, B.; Hemez, F.; Atamturktur, S.H.; McClure, P.

    2011-01-01

    Research highlights:
    → The best estimate plus uncertainty methodology (BEPU) is one option in the licensing of nuclear reactors.
    → The challenges for extending the BEPU method for fuel qualification for an advanced reactor fuel are primarily driven by schedule, the need for data, and the sufficiency of the data.
    → In this paper we develop an extended BEPU methodology that can potentially be used to address these new challenges in the design and licensing of advanced nuclear reactors.
    → The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification.
    → The methodology includes a formalism to quantify an adequate level of validation (predictive maturity) with respect to existing data, so that required new testing can be minimized, saving cost by demonstrating that further testing will not enhance the quality of the predictive tools.

    Abstract: Many evolving nuclear energy technologies use advanced predictive multiscale, multiphysics modeling and simulation (M and S) capabilities to reduce the cost and schedule of design and licensing. Historically, the role of experiments has been as a primary tool for the design and understanding of nuclear system behavior, while M and S played the subordinate role of supporting experiments. In the new era of multiscale, multiphysics computational-based technology development, this role has been reversed. The experiments will still be needed, but they will be performed at different scales to calibrate and validate the models leading to predictive simulations for design and licensing. Minimizing the required number of validation experiments produces cost and time savings. The use of multiscale, multiphysics models introduces challenges in validating these predictive tools - traditional methodologies will have to be modified to address these challenges. This paper gives the basic aspects of a methodology that can potentially be used to address these new challenges in

  1. Ultra wideband antennas design, methodologies, and performance

    CERN Document Server

    Galvan-Tejada, Giselle M; Jardón Aguilar, Hildeberto

    2015-01-01

    Ultra Wideband Antennas: Design, Methodologies, and Performance presents the current state of the art of ultra wideband (UWB) antennas, from theory specific for these radiators to guidelines for the design of omnidirectional and directional UWB antennas. Offering a comprehensive overview of the latest UWB antenna research and development, this book:
    - Discusses the developed theory for UWB antennas in frequency and time domains
    - Delivers a brief exposition of numerical methods for electromagnetics oriented to antennas
    - Describes solid-planar equivalen

  2. Performance improvement using methodology: case study.

    Science.gov (United States)

    Harmelink, Stacy

    2008-01-01

    The department of radiology at St. Luke's Regional Medical Center in Sioux City, IA implemented meaningful workflow changes that reduced patient wait times and, at the same time, improved customer and employee satisfaction scores. Lean methodology and the 7 Deadly Wastes, along with small-group interaction, were used to evaluate and change the process a patient goes through while waiting for an exam in the radiology department. The most important key to the success of a performance improvement project is the involvement of staff.

  3. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.
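    The low-level view built on derived hardware-counter metrics can be illustrated as below; the counter names and totals are hypothetical stand-ins (real PAPI event names differ by platform).

```python
def derived_metrics(counters):
    """Derive the efficiency metrics used in a low-level performance
    view from raw hardware-counter totals. Counter names here are
    illustrative, not actual PAPI event names."""
    return {
        # Instructions per cycle: how well the pipeline is utilized.
        "ipc": counters["instructions"] / counters["cycles"],
        # Fraction of L2 accesses that miss: memory-hierarchy pressure.
        "l2_miss_rate": counters["l2_misses"] / counters["l2_accesses"],
        # Floating-point throughput relative to the clock.
        "flops_per_cycle": counters["fp_ops"] / counters["cycles"],
    }

m = derived_metrics({
    "instructions": 4.0e9, "cycles": 2.0e9,
    "l2_misses": 1.0e7, "l2_accesses": 2.0e8, "fp_ops": 1.0e9,
})
```

    Ratios like these are what let the methodology flag inefficient code regions without rerunning the application once per raw counter.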

  4. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    DOE Order 5637.1, "Classified Computer Security," requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, we have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system. 1 tab

  5. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    This paper reports on DOE Order 5637.1, Classified Computer Security, which requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, the authors have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system

  6. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    International Nuclear Information System (INIS)

    Unal, Cetin; Williams, Brian; McClure, Patrick; Nelson, Ralph A.

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale, multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, the role of experiments was as the primary tool for the design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale, multi-physics computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate the models leading to predictive simulations. The cost-saving goals of these programs will require us to minimize the required number of validation experiments. The use of more multi-scale, multi-physics models introduces complexities in the validation of predictive tools, and traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow results to be extended to areas of the validation domain not directly tested with experiments, which might include extending the modeling and simulation (M and S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for
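    The data-assimilation calibration step can be caricatured with a one-parameter grid posterior; the model, data, and noise level below are invented for the example and are not the methodology's actual implementation.

```python
import math

def calibrate(grid, model, observations, sigma):
    """One-parameter Bayesian calibration on a grid, a minimal stand-in
    for the data-assimilation step: each candidate parameter value is
    weighted by the Gaussian likelihood of the observations under the
    model prediction (flat prior), and the posterior mean is returned
    as the calibrated estimate."""
    weights = []
    for theta in grid:
        pred = model(theta)
        loglik = sum(-0.5 * ((y - pred) / sigma) ** 2
                     for y in observations)
        weights.append(math.exp(loglik))
    total = sum(weights)
    return sum(t * w / total for t, w in zip(grid, weights))

# Toy model: the prediction equals the parameter; the hypothetical
# experimental data cluster around 2.0.
theta_hat = calibrate([i * 0.1 for i in range(41)], lambda t: t,
                      [1.9, 2.0, 2.1], sigma=0.2)
```

    The spread of the posterior, not shown here, is what feeds the uncertainty quantification and predictive-maturity arguments: a narrow posterior over the validation domain is evidence that further testing adds little.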

  7. Improved best estimate plus uncertainty methodology including advanced validation concepts to license evolving nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Unal, Cetin [Los Alamos National Laboratory]; Williams, Brian [Los Alamos National Laboratory]; McClure, Patrick [Los Alamos National Laboratory]; Nelson, Ralph A. [Idaho National Laboratory]

    2010-01-01

    Many evolving nuclear energy programs plan to use advanced predictive multi-scale, multi-physics simulation and modeling capabilities to reduce cost and time from design through licensing. Historically, the role of experiments was as the primary tool for the design and understanding of nuclear system behavior, while modeling and simulation played the subordinate role of supporting experiments. In the new era of multi-scale, multi-physics computation-based technology development, experiments will still be needed, but they will be performed at different scales to calibrate and validate the models leading to predictive simulations. The cost-saving goals of these programs will require us to minimize the required number of validation experiments. The use of more multi-scale, multi-physics models introduces complexities in the validation of predictive tools, and traditional methodologies will have to be modified to address these arising issues. This paper lays out the basic aspects of a methodology that can potentially be used to address these new challenges in the design and licensing of evolving nuclear technology programs. The main components of the proposed methodology are verification, validation, calibration, and uncertainty quantification. An enhanced calibration concept is introduced and is accomplished through data assimilation. The goal is to enable best-estimate prediction of system behaviors in both normal and safety-related environments. Achieving this goal requires the additional steps of estimating the domain of validation and quantifying the uncertainties that allow results to be extended to areas of the validation domain not directly tested with experiments, which might include extending the modeling and simulation (M&S) capabilities for application to full-scale systems. The new methodology suggests a formalism to quantify an adequate level of validation (predictive maturity) with respect to required selective data so that required testing can be minimized for cost

  8. Methodology for performing measurements to release material from radiological control

    International Nuclear Information System (INIS)

    Durham, J.S.; Gardner, D.L.

    1993-09-01

    This report describes the existing and proposed methodologies for performing measurements of contamination prior to releasing material for uncontrolled use at the Hanford Site. The technical basis for the proposed methodology, a modification to the existing contamination survey protocol, is also described. The modified methodology, which includes a large-area swipe followed by a statistical survey, can be used to survey material that is unlikely to be contaminated for release to controlled and uncontrolled areas. The material evaluation procedure that is used to determine the likelihood of contamination is also described

  9. Knowledge management performance methodology regarding manufacturing organizations

    Science.gov (United States)

    Istrate, C.; Herghiligiu, I. V.

    2016-08-01

    The current business environment is extremely complicated. Businesses must adapt to change in order (a) to survive in increasingly dynamic markets and (b) to meet customers' new requests for complex, customized and innovative products. In modern manufacturing organizations a substantial improvement in the management of knowledge can be seen. This occurs because organizations have realized that knowledge, and an efficient management of knowledge, generates the highest value. It could even be said that manufacturing organizations have been, and remain, the biggest beneficiaries of KM science. Knowledge management performance (KMP) evaluation in manufacturing organizations is extremely important because without measuring it they are unable to properly assess (a) what goals, targets and activities must have continuity, (b) what must be improved and (c) what must be completed. A proper KM will therefore generate multiple competitive advantages for organizations. This paper presents a methodological framework for evaluating KMP in manufacturing organizations, developed using bibliographical research and a panel of specialists as research methods. The purpose of this paper is to improve the evaluation process of KMP and to provide a viable tool for the managers of manufacturing organizations.

  10. Methodology for evaluation of diagnostic performance

    International Nuclear Information System (INIS)

    Metz, C.E.

    1992-01-01

    Effort in this project during the past year has focused on the development, refinement, and distribution of computer software that will allow current Receiver Operating Characteristic (ROC) methodology to be used conveniently and reliably by investigators in a variety of evaluation tasks in diagnostic medicine; and on the development of new ROC methodology that will broaden the spectrum of evaluation tasks and/or experimental settings to which the fundamental approach can be applied. Progress has been limited by the amount of financial support made available to the project

  11. EnergiTools. A methodology for performance monitoring and diagnosis

    International Nuclear Information System (INIS)

    Ancion, P.; Bastien, R.; Ringdahl, K.

    2000-01-01

    EnergiTools is a performance monitoring and diagnostic tool that combines the power of on-line process data acquisition with advanced diagnosis methodologies. Analytical models based on thermodynamic principles are combined with neural networks to validate sensor data and to estimate missing or faulty measurements. Advanced diagnostic technologies are then applied to point out potential faults and areas to be investigated further. The diagnosis methodologies are based on Bayesian belief networks. Expert knowledge is captured in the form of fault-symptom relationships and includes historical information such as the likelihood of faults and symptoms. The methodology produces the likelihood of component-failure root causes using the expert knowledge base. EnergiTools is used at the Ringhals nuclear power plants, where it has led to the diagnosis of various performance issues. Three case studies based on data and models from this plant are presented to illustrate the diagnosis support methodologies implemented in EnergiTools. In the first case, the analytical data qualification technique points out several faulty measurements. The application of a neural network to estimate nuclear reactor power by interpreting several plant indicators is then illustrated. The use of the Bayesian belief networks is finally described. (author)
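The Bayesian update at the core of such a diagnosis can be sketched in a few lines. The faults, symptoms, priors, and likelihoods below are purely illustrative assumptions, not values from EnergiTools:

```python
# Minimal sketch of Bayesian fault diagnosis from one observed symptom.
# All priors and likelihoods are illustrative assumptions.

FAULTS = {
    "fouled_condenser": 0.02,   # prior P(fault)
    "sensor_drift": 0.05,
}

# (P(symptom | fault), P(symptom | no fault)) for one observed symptom,
# e.g. "low condenser vacuum".
LIKELIHOOD = {
    "fouled_condenser": (0.90, 0.10),
    "sensor_drift": (0.30, 0.10),
}

def posterior(fault: str) -> float:
    """Bayes' rule: P(fault | symptom) for a single observed symptom."""
    prior = FAULTS[fault]
    p_s_f, p_s_nf = LIKELIHOOD[fault]
    evidence = p_s_f * prior + p_s_nf * (1.0 - prior)
    return p_s_f * prior / evidence

for fault in FAULTS:
    print(fault, round(posterior(fault), 3))
```

Posterior likelihoods computed this way are what rank the candidate root causes for further investigation; a full belief network simply chains many such updates over the fault-symptom graph.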

  12. (Per)Forming Archival Research Methodologies

    Science.gov (United States)

    Gaillet, Lynee Lewis

    2012-01-01

    This article raises multiple issues associated with archival research methodologies and methods. Based on a survey of recent scholarship and interviews with experienced archival researchers, this overview of the current status of archival research both complicates traditional conceptions of archival investigation and encourages scholars to adopt…

  13. Overview of a performance assessment methodology for low-level radioactive waste disposal facilities

    International Nuclear Information System (INIS)

    Kozak, M.W.; Chu, M.S.Y.

    1991-01-01

    A performance assessment methodology has been developed for use by the US Nuclear Regulatory Commission in evaluating license applications for low-level waste disposal facilities. This paper provides a summary and an overview of the modeling approaches selected for the methodology. The overview includes discussions of the philosophy and structure of the methodology. This performance assessment methodology is designed to provide the NRC with a tool for performing confirmatory analyses in support of license reviews related to postclosure performance. The methodology allows analyses of dose to individuals from off-site releases under normal conditions as well as on-site doses to inadvertent intruders. 24 refs., 1 tab

  14. A theoretical and experimental investigation of propeller performance methodologies

    Science.gov (United States)

    Korkan, K. D.; Gregorek, G. M.; Mikkelson, D. C.

    1980-01-01

    This paper briefly covers aspects of propeller performance by means of a review of propeller methodologies; a presentation of wind tunnel propeller performance data taken in the NASA Lewis Research Center 10 x 10 wind tunnel; a discussion of the predominant limitations of existing propeller performance methodologies; and a brief review of airfoil developments appropriate for propeller applications.

  15. Methodology for determining influence of organizational culture to business performance

    Directory of Open Access Journals (Sweden)

    Eva Skoumalová

    2007-01-01

    Full Text Available The aim of this article is to propose a possible methodology for quantitatively measuring organizational culture using a set of statistical methods. To this end, the procedure consists of two major sections. The first covers the classification of organizational culture and the role of quantitative measurement of organizational culture. This part includes definitions and several methods used to classify organizational culture: Hofstede, Peters and Waterman, Deal and Kennedy, Edgar Schein, Kotter and Heskett, Lukášová, as well as arguments for why a measurement perspective is worthwhile. The second major section contains a methodology for measuring organizational culture and its impact on organizational performance. We suggest using structural equation modeling for the quantitative assessment of organizational culture.

  16. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of); Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-10-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for multi-compartment containment performance analysis. The applicability of the developed GOTHIC methodology is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of just three compartments, including the reactor containment and the RWST. In addition, the containment spray system and the containment recirculation system are simulated. Under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC calculation shows very good agreement; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses, considering the inherent conservatism of the CONTEMPT-LT code.

  17. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    Whereas the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces the GOTHIC code as an alternative for multi-compartment containment performance analysis. The applicability of the developed GOTHIC methodology is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is composed of just three compartments, including the reactor containment and the RWST. In addition, the containment spray system and the containment recirculation system are simulated. Under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC calculation shows very good agreement; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses, considering the inherent conservatism of the CONTEMPT-LT code

  18. Amtrak performance tracking (APT) system : methodology summary

    Science.gov (United States)

    2017-09-15

    The Volpe Center collaborated with Amtrak and the Federal Railroad Administration (FRA) to develop a cost accounting system named Amtrak Performance Tracking (APT), used by Amtrak to manage, allocate, and report its costs. APT's initial development ...

  19. Development and testing of the methodology for performance requirements

    International Nuclear Information System (INIS)

    Rivers, J.D.

    1989-01-01

    The U.S. Department of Energy (DOE) is in the process of implementing a set of materials control and accountability (MC&A) performance requirements. These graded requirements set a uniform level of performance for similar materials at various facilities against the threat of an insider adversary stealing special nuclear material (SNM). They are phrased in terms of detecting the theft of a goal quantity of SNM within a specified time period and with a probability greater than or equal to a specified value, and they include defense-in-depth requirements. The DOE has conducted an extensive effort over the last 2 1/2 years to develop a practical methodology for evaluating facility performance against the performance requirements specified in DOE Order 5633.3. The major participants in the development process have been the Office of Safeguards and Security (OSS), Brookhaven National Laboratory, and Los Alamos National Laboratory. The process has included careful reviews of related evaluation systems, a review of the intent of the requirements in the order, and site visits to most of the major facilities in the DOE complex. As a result of this extensive effort to develop guidance for the MC&A performance requirements, OSS was able to provide a practical method that allows facilities to evaluate the performance of their safeguards systems against the performance requirements. In addition, the evaluations can be validated by the cognizant operations offices in a systematic manner.
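For illustration, the defense-in-depth aspect of such performance requirements can be sketched as the combined detection probability of several independent safeguards layers. The layer probabilities and the 0.95 threshold below are hypothetical, not values from DOE Order 5633.3:

```python
# Sketch: combined detection probability across independent safeguards
# layers (defense-in-depth). All numbers are hypothetical.

def combined_detection(p_layers):
    """P(at least one layer detects), assuming independent layers."""
    p_miss = 1.0
    for p in p_layers:
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

# e.g. portal monitoring, inventory accounting, surveillance:
layers = [0.60, 0.50, 0.40]
p_detect = combined_detection(layers)
print(round(p_detect, 3))          # 1 - 0.4*0.5*0.6

requirement = 0.95                  # hypothetical required probability
print("requirement met:", p_detect >= requirement)
```

The same arithmetic shows why graded requirements push facilities toward multiple complementary layers: three mediocre layers together approach, but here still miss, a 0.95 goal.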

  20. Optimization of Gluten-Free Tulumba Dessert Formulation Including Corn Flour: Response Surface Methodology Approach

    Directory of Open Access Journals (Sweden)

    Yildiz Önder

    2017-03-01

    Full Text Available Tulumba dessert is widely preferred in Turkey; however, it cannot be consumed by celiac patients because it includes gluten. The diversity of gluten-free products should be expanded so that celiac patients may meet their daily needs regularly. In this study, the corn flour (CF) / potato starch (PS) blend to be used in the gluten-free tulumba dessert formulation was optimized using Response Surface Methodology (RSM). An increasing ratio of PS in the CF-PS blend led to a decrease in the hardness of the dessert and to an increase in the expansion, viscosity, adhesiveness, and yield of the dessert both with and without syrup (P<0.05); additionally, these desserts had a much higher sensory score compared to the control sample in terms of overall quality and pore structure (P<0.05).

  1. A methodology for including wall roughness effects in k-ε low-Reynolds turbulence models

    International Nuclear Information System (INIS)

    Ambrosini, W.; Pucciarelli, A.; Borroni, I.

    2015-01-01

    Highlights: • A model for taking into account wall roughness in low-Reynolds k-ε models is presented. • The model is subjected to a first validation to show its potential in general applications. • The application of the model in predicting heat transfer to supercritical fluids is also discussed. - Abstract: A model accounting for wall roughness effects in k-ε low-Reynolds turbulence models is described in the present paper. In particular, the introduction into the transport equations of k and ε of additional source terms related to roughness, based on simple assumptions and dimensional relationships, is proposed. An objective of the present paper, in addition to obtaining more realistic predictions of wall friction, is the application of the proposed model to the study of heat transfer to supercritical fluids. A first validation of the model is reported. The model shows the capability of predicting, at least qualitatively, some of the most important trends observed when dealing with rough pipes in very different flow conditions. Qualitative comparisons with some DNS data available in the literature are also performed. Further analyses provided promising results concerning the ability of the model to reproduce the trend of the friction factor when varying the flow conditions, though improvements are necessary for achieving better quantitative accuracy. First applications of the model in simulating heat transfer to supercritical fluids are also described, showing the capability of the model to affect the predictions of these heat transfer phenomena, in particular in the vicinity of the pseudo-critical conditions. A more extended application of the model to relevant deteriorated heat transfer conditions will clarify the usefulness of this modelling methodology in improving predictions of these difficult phenomena. Whatever the possible success in this particular application that motivated its development, this approach suggests a general methodology for accounting
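Schematically, the proposed approach amounts to augmenting the standard low-Reynolds transport equations with roughness-related source terms. The abstract does not give their functional form, so the terms S_k,rough and S_ε,rough below are only placeholders attached to an otherwise standard k-ε skeleton:

```latex
% Generic form of the augmented low-Reynolds k-epsilon equations;
% P_k is turbulence production, D_k and D_eps the diffusion terms, and
% S_{k,rough}, S_{eps,rough} stand for the roughness source terms whose
% exact dimensional form is not given in the abstract.
\frac{\partial k}{\partial t} + U_j \frac{\partial k}{\partial x_j}
  = P_k - \varepsilon + D_k + S_{k,\mathrm{rough}},
\qquad
\frac{\partial \varepsilon}{\partial t} + U_j \frac{\partial \varepsilon}{\partial x_j}
  = C_{\varepsilon 1}\frac{\varepsilon}{k}P_k
  - C_{\varepsilon 2}\frac{\varepsilon^{2}}{k} + D_\varepsilon
  + S_{\varepsilon,\mathrm{rough}}.
```

The paper's contribution lies in proposing dimensionally consistent closures for these two extra terms.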

  2. Interim performance criteria for photovoltaic energy systems. [Glossary included

    Energy Technology Data Exchange (ETDEWEB)

    DeBlasio, R.; Forman, S.; Hogan, S.; Nuss, G.; Post, H.; Ross, R.; Schafft, H.

    1980-12-01

    This document is a response to the Photovoltaic Research, Development, and Demonstration Act of 1978 (P.L. 95-590) which required the generation of performance criteria for photovoltaic energy systems. Since the document is evolutionary and will be updated, the term interim is used. More than 50 experts in the photovoltaic field have contributed to the writing and review of the 179 performance criteria listed in this document. The performance criteria address characteristics of present-day photovoltaic systems that are of interest to manufacturers, government agencies, purchasers, and all others interested in various aspects of photovoltaic system performance and safety. The performance criteria apply to the system as a whole and to its possible subsystems: array, power conditioning, monitor and control, storage, cabling, and power distribution. They are further categorized according to the following performance attributes: electrical, thermal, mechanical/structural, safety, durability/reliability, installation/operation/maintenance, and building/site. Each criterion contains a statement of expected performance (nonprescriptive), a method of evaluation, and a commentary with further information or justification. Over 50 references for background information are also given. A glossary with definitions relevant to photovoltaic systems and a section on test methods are presented in the appendices. Twenty test methods are included to measure performance characteristics of the subsystem elements. These test methods and other parts of the document will be expanded or revised as future experience and needs dictate.

  3. Development of Constraint Force Equation Methodology for Application to Multi-Body Dynamics Including Launch Vehicle Stage Separation

    Science.gov (United States)

    Pamadi, Bandu N.; Toniolo, Matthew D.; Tartabini, Paul V.; Roithmayr, Carlos M.; Albertson, Cindy W.; Karlgaard, Christopher D.

    2016-01-01

    The objective of this report is to develop and implement a physics based method for analysis and simulation of multi-body dynamics including launch vehicle stage separation. The constraint force equation (CFE) methodology discussed in this report provides such a framework for modeling constraint forces and moments acting at joints when the vehicles are still connected. Several stand-alone test cases involving various types of joints were developed to validate the CFE methodology. The results were compared with ADAMS(Registered Trademark) and Autolev, two different industry standard benchmark codes for multi-body dynamic analysis and simulations. However, these two codes are not designed for aerospace flight trajectory simulations. After this validation exercise, the CFE algorithm was implemented in Program to Optimize Simulated Trajectories II (POST2) to provide a capability to simulate end-to-end trajectories of launch vehicles including stage separation. The POST2/CFE methodology was applied to the STS-1 Space Shuttle solid rocket booster (SRB) separation and Hyper-X Research Vehicle (HXRV) separation from the Pegasus booster as a further test and validation for its application to launch vehicle stage separation problems. Finally, to demonstrate end-to-end simulation capability, POST2/CFE was applied to the ascent, orbit insertion, and booster return of a reusable two-stage-to-orbit (TSTO) vehicle concept. With these validation exercises, POST2/CFE software can be used for performing conceptual level end-to-end simulations, including launch vehicle stage separation, for problems similar to those discussed in this report.

  4. Design methodology to enhance high impedance surfaces performances

    Directory of Open Access Journals (Sweden)

    M. Grelier

    2014-04-01

    Full Text Available A methodology is introduced for designing wideband, compact and ultra-thin high impedance surfaces (HIS. A parametric study is carried out to examine the effect of the periodicity on the electromagnetic properties of an HIS. This approach allows designers to reach the best trade-off for HIS performances.

  5. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    Science.gov (United States)

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  6. A Novel Water Supply Network Sectorization Methodology Based on a Complete Economic Analysis, Including Uncertainties

    Directory of Open Access Journals (Sweden)

    Enrique Campbell

    2016-04-01

    Full Text Available The core idea behind sectorization of Water Supply Networks (WSNs is to establish areas partially isolated from the rest of the network to improve operational control. Besides the benefits associated with sectorization, some drawbacks must be taken into consideration by water operators: the economic investment associated with both boundary valves and flowmeters and the reduction of both pressure and system resilience. The target of sectorization is to properly balance these negative and positive aspects. Sectorization methodologies addressing the economic aspects mainly consider costs of valves and flowmeters and of energy, and the benefits in terms of water saving linked to pressure reduction. However, sectorization entails other benefits, such as the reduction of domestic consumption, the reduction of burst frequency and the enhanced capacity to detect and intervene over future leakage events. We implement a development proposed by the International Water Association (IWA to estimate the aforementioned benefits. Such a development is integrated in a novel sectorization methodology based on a social network community detection algorithm, combined with a genetic algorithm optimization method and Monte Carlo simulation. The methodology is implemented over a fraction of the WSN of Managua city, capital of Nicaragua, generating a net benefit of 25,572 $/year.
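The economic core of such an analysis, net benefit as estimated savings minus valve and flowmeter costs, propagated through Monte Carlo sampling, can be sketched as follows. All costs, savings, and uncertainty ranges are illustrative assumptions, not the figures used for the Managua network:

```python
# Sketch: Monte Carlo net-benefit estimate for one candidate
# sectorization layout. All numbers are illustrative assumptions.
import random

def net_benefit_sample(rng):
    valve_cost = 8 * 500.0    # 8 boundary valves, annualized $/yr (assumed)
    meter_cost = 2 * 1200.0   # 2 flowmeters, annualized $/yr (assumed)
    # Uncertain benefits, sampled from assumed ranges ($/yr):
    water_saved = rng.uniform(20000, 40000)   # pressure-driven water savings
    burst_saving = rng.uniform(2000, 6000)    # reduced burst frequency
    leak_saving = rng.uniform(1000, 5000)     # faster leak detection
    return water_saved + burst_saving + leak_saving - valve_cost - meter_cost

rng = random.Random(42)
samples = [net_benefit_sample(rng) for _ in range(10_000)]
mean = sum(samples) / len(samples)
p_positive = sum(s > 0 for s in samples) / len(samples)
print(f"mean net benefit: {mean:,.0f} $/yr, P(benefit > 0) = {p_positive}")
```

In the full methodology this evaluation sits inside the optimization loop: the community-detection step proposes sector boundaries, the genetic algorithm places valves and meters, and the Monte Carlo step scores each layout under uncertainty.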

  7. Evaluation of analytical performance based on partial order methodology.

    Science.gov (United States)

    Carlsen, Lars; Bruggemann, Rainer; Kenessova, Olga; Erzhigitov, Erkin

    2015-01-01

    Classical measurements of performance are typically based on linear scales. However, in analytical chemistry a simple scale may not be sufficient to analyze analytical performance appropriately. Here partial order methodology can be helpful. Within the context described here, partial order analysis can be seen as an ordinal analysis of data matrices, especially to simplify the relative comparisons of objects based on their data profiles (the ordered set of values an object has). Hence, partial order methodology offers a unique possibility to evaluate analytical performance. In the present study, data as provided by laboratories through interlaboratory comparisons or proficiency testing are used as an illustrative example. However, the presented scheme is likewise applicable to the comparison of analytical methods, or simply as a tool for the optimization of an analytical method. The methodology can be applied without presumptions or pretreatment of the analytical data provided, in order to evaluate the analytical performance taking into account all indicators simultaneously and thus elucidating a "distance" from the true value. In the present illustrative example it is assumed that the laboratories analyze a given sample several times and subsequently report the mean value, the standard deviation and the skewness, which are used simultaneously for the evaluation of the analytical performance. The analyses lead to information concerning (1) a partial ordering of the laboratories, subsequently (2) a "distance" to the Reference laboratory and (3) a classification based on the concept of "peculiar points". Copyright © 2014 Elsevier B.V. All rights reserved.
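A minimal sketch of the dominance relation underlying such a partial order, applied to hypothetical laboratory profiles of (|bias|, standard deviation, |skewness|), where smaller is better in every indicator:

```python
# Sketch: partial-order (dominance) comparison of laboratories on three
# indicators where smaller is better: |mean - true|, std dev, |skewness|.
# Laboratory values are hypothetical.

labs = {
    "A": (0.1, 0.5, 0.2),
    "B": (0.2, 0.8, 0.4),
    "C": (0.3, 0.4, 0.1),
}

def dominates(x, y):
    """x is at least as good as y everywhere and strictly better somewhere."""
    return all(a <= b for a, b in zip(x, y)) and any(a < b for a, b in zip(x, y))

for p in labs:
    for q in labs:
        if dominates(labs[p], labs[q]):
            print(f"{p} dominates {q}")
```

Here A dominates B, while A and C remain incomparable, which is exactly the information a single linear score would hide: C has a larger bias but a smaller spread and skewness than A.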

  8. Guidance on the Technology Performance Level (TPL) Assessment Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Weber, Jochem [National Renewable Energy Lab. (NREL), Golden, CO (United States); Roberts, Jesse D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Babarit, Aurelien [Ecole Centrale de Nantes (France). Lab. of Research in Hydrodynamics, Energetics and Atmospheric Environment (LHEEA); Costello, Ronan [Wave Venture, Penstraze (United Kingdom); Bull, Diana L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neilson, Kim [Ramboll, Copenhagen (Denmark); Bittencourt, Claudio [DNV GL, London (United Kingdom); Kennedy, Ben [Wave Venture, Penstraze (United Kingdom); Malins, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dykes, Katherine [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-01

    This document presents the revised Technology Performance Level (TPL) assessment methodology. There are three parts to this revised methodology: 1) the Stakeholder Needs and Assessment Guidance (this document), 2) the Technical Submission form, 3) the TPL scoring spreadsheet. The TPL assessment is designed to give a technology-neutral, or agnostic, assessment of any wave energy converter technology. The focus of the TPL is on the performance of the technology in meeting the customer’s needs. The original TPL is described in [1, 2], and those references also detail the critical differences in the nature of the TPL when compared to the more widely used technology readiness level (TRL). (Wave energy TRL is described in [3].) The revised TPL is particularly intended to be useful to investors and also to assist technology developers to conduct comprehensive assessments in a way that is meaningful and attractive to investors. The revised TPL assessment methodology has been derived through a structured Systems Engineering approach: a formal process of analyzing customer and stakeholder needs through the discipline of Systems Engineering. The results of the process confirmed the high level of completeness of the original methodology presented in [1] (as used in the Wave Energy Prize judging) and now add a significantly increased level of detail in the assessment and an improved, more investment-focused structure. The revised TPL also incorporates the feedback of the Wave Energy Prize judges.

  9. Development of Testing Methodologies to Evaluate Postflight Locomotor Performance

    Science.gov (United States)

    Mulavara, A. P.; Peters, B. T.; Cohen, H. S.; Richards, J. T.; Miller, C. A.; Brady, R.; Warren, L. E.; Bloomberg, J. J.

    2006-01-01

    Crewmembers experience locomotor and postural instabilities during ambulation on Earth following their return from space flight. Gait training programs designed to facilitate recovery of locomotor function following a transition to a gravitational environment need to be accompanied by relevant assessment methodologies to evaluate their efficacy. The goal of this paper is to demonstrate the operational validity of two tests of locomotor function that were used to evaluate performance after long duration space flight missions on the International Space Station (ISS).

  10. Performance Testing Methodology for Safety-Critical Programmable Logic Controller

    International Nuclear Information System (INIS)

    Kim, Chang Ho; Oh, Do Young; Kim, Ji Hyeon; Kim, Sung Ho; Sohn, Se Do

    2009-01-01

    The Programmable Logic Controller (PLC) for use in Nuclear Power Plant safety-related applications is being developed and tested for the first time in Korea. This safety-related PLC is being developed to the requirements of regulatory guidelines and industry standards for safety systems. To confirm that the quality of the developed PLC is sufficient for use in a safety-critical system, document reviews and various product tests were performed on the development documents for S/W, H/W, and V/V. This paper provides the performance testing methodology, and its effectiveness, for the PLC platform as conducted by KOPEC

  11. A tool for standardized collector performance calculations including PVT

    DEFF Research Database (Denmark)

    Perers, Bengt; Kovacs, Peter; Olsson, Marcus

    2012-01-01

    A tool for standardized calculation of solar collector performance has been developed in cooperation between SP Technical Research Institute of Sweden, DTU Denmark and SERC Dalarna University. The tool is designed to calculate the annual performance of solar collectors at representative locations...... can be tested and modeled as a thermal collector, when the PV electric part is active with an MPP tracker in operation. The thermal collector parameters from this operation mode are used for the PVT calculations....
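The thermal side of such a calculation typically rests on the standard steady-state collector efficiency curve η = η0 − a1·ΔT/G − a2·ΔT²/G. The sketch below uses that model with illustrative coefficients, which are assumptions rather than values from the tool:

```python
# Sketch: useful thermal power of a solar collector from the standard
# efficiency-curve model eta = eta0 - a1*dT/G - a2*dT^2/G.
# The coefficients are typical illustrative values, not from the tool.

def collector_power(G, t_mean, t_amb, area, eta0=0.78, a1=3.5, a2=0.015):
    """Useful thermal power [W] of a glazed flat-plate collector.

    G: irradiance [W/m2]; t_mean, t_amb: mean fluid and ambient
    temperature [degC]; area: aperture area [m2]; eta0, a1, a2: standard
    efficiency-curve coefficients (assumed values).
    """
    dT = t_mean - t_amb
    eta = eta0 - a1 * dT / G - a2 * dT**2 / G
    return max(0.0, area * G * eta)

# 1000 W/m2, fluid 50 K above ambient, 2 m2 aperture:
print(round(collector_power(G=1000, t_mean=70, t_amb=20, area=2)), "W")
```

An annual-performance tool evaluates this kind of model hour by hour against climate data for the chosen location; for a PVT collector, the coefficients obtained with the MPP tracker active are used, as described above.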

  12. Candidate genes for performance in horses, including monocarboxylate transporters

    Directory of Open Access Journals (Sweden)

    Inaê Cristina Regatieri

    Full Text Available ABSTRACT: Some horse breeds are highly selected for athletic activities. The athletic potential of each animal can be measured by its performance in sports. High athletic performance depends on the animal capacity to produce energy through aerobic and anaerobic metabolic pathways, among other factors. Transmembrane proteins called monocarboxylate transporters, mainly the isoform 1 (MCT1 and its ancillary protein CD147, can help the organism to adapt to physiological stress caused by physical exercise, transporting lactate and H+ ions. Horse breeds are selected for different purposes so we might expect differences in the amount of those proteins and in the genotypic frequencies for genes that play a significant role in the performance of the animals. The study of MCT1 and CD147 gene polymorphisms, which can affect the formation of the proteins and transport of lactate and H+, can provide enough information to be used for selection of athletic horses increasingly resistant to intense exercise. Two other candidate genes, the PDK4 and DMRT3, have been associated with athletic potential and indicated as possible markers for performance in horses. The oxidation of fatty acids is highly effective in generating ATP and is controlled by the expression of PDK4 (pyruvate dehydrogenase kinase, isozyme 4 in skeletal muscle during and after exercise. The doublesex and mab-3 related transcription factor 3 (DMRT3 gene encodes an important transcription factor in the setting of spinal cord circuits controlling movement in vertebrates and may be associated with gait performance in horses. This review describes how the monocarboxylate transporters work during physical exercise in athletic horses and the influence of polymorphisms in candidate genes for athletic performance in horses.

  13. Socio-cultural determinants of child mortality in southern Peru: including some methodological considerations.

    Science.gov (United States)

    de Meer, K; Bergman, R; Kusner, J S

    1993-02-01

    Among Amerindian children living at high altitude in the Andes in southern Peru, high child mortality rates have been reported in the literature, especially in the perinatal and neonatal period. We compared mortality rates in children calculated from retrospective survey data in 86 rural families from 2 Aymara and 3 Quechua peasant communities living at the same level of altitude (3825 m) in southern Peru. Relations between land tenure, socio-cultural factors and child mortality were studied, and methodological considerations in this field of interest are discussed. Checks on consistency of empirical data showed evidence for underreporting of neonatal female deaths with birth order 3 and more. Perinatal (124 vs 34 per 1000 births) and infant mortality (223 vs 111 per 1000 live births) was significantly higher in Aymara compared with Quechua children, but no difference was found after the first year of life. A short pregnancy interval was associated with an elevated perinatal and infant mortality rate, and a similar albeit insignificant association was found with increased maternal age. Amount of land owned and birth order were not related with child mortality. Although levels of maternal education are generally low in both cultures, a consistent decline in infant and child mortality was found with the amount of years mothers had attended school. However, the results suggest a U-shaped relationship between the amount of years of parental education and perinatal mortality in offspring. Late fetal and early neonatal mortality were particularly high in one Aymara community where mothers were found to have more years of education. Infanticide, a known phenomenon in the highlands of the Andes, is discussed in relation with the findings of the study. Although maternal and child health services are utilized by the majority of families in 4 of 5 study communities, 43 of 51 mothers under the age of 45 years reported that they delivered their last baby in the absence of

  14. Methodology for quantitative evaluation of diagnostic performance. Project III

    International Nuclear Information System (INIS)

    Metz, C.E.

    1985-01-01

    Receiver Operating Characteristic (ROC) methodology is now widely recognized as the most satisfactory approach to the problem of measuring and specifying the performance of a diagnostic procedure. The primary advantage of ROC analysis over alternative methodologies is that it separates differences in diagnostic accuracy that are due to actual differences in discrimination capacity from those that are due to decision threshold effects. Our effort during the past year has been devoted to developing digital computer programs for fitting ROC curves to diagnostic data by maximum likelihood estimation and to developing meaningful and valid statistical tests for assessing the significance of apparent differences between measured ROC curves. FORTRAN programs previously written here for ROC curve fitting and statistical testing have been refined to make them easier to use and to allow them to run on a wide variety of computer systems. We have also attempted to develop two new curve-fitting programs: one for conventional ROC data that assumes a different functional form for the ROC curve, and one that can be used for "free-response" ROC data. Finally, we have cooperated with other investigators to apply our techniques to ROC data generated in clinical studies, and we have sought to familiarize the medical community with the advantages of ROC methodology. 36 ref
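    The separation of discrimination capacity from threshold effects is easiest to see in an empirical ROC computation. The sketch below (a minimal illustration, not the FORTRAN maximum-likelihood programs described above) sweeps the decision threshold over confidence ratings for diseased and normal cases, builds the empirical ROC curve, and computes its area by the trapezoidal rule.

```python
def empirical_roc(scores_pos, scores_neg):
    """Empirical ROC points (FPR, TPR) obtained by sweeping the decision threshold."""
    thresholds = sorted(set(scores_pos) | set(scores_neg), reverse=True)
    points = [(0.0, 0.0)]
    for t in thresholds:
        tpr = sum(s >= t for s in scores_pos) / len(scores_pos)
        fpr = sum(s >= t for s in scores_neg) / len(scores_neg)
        points.append((fpr, tpr))
    return points

def auc(points):
    """Area under the empirical ROC curve (trapezoidal rule)."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area

# Hypothetical 1-5 confidence ratings for diseased and normal cases.
diseased = [5, 4, 4, 3, 2]
normal = [3, 2, 2, 1, 1]
roc = empirical_roc(diseased, normal)
print(round(auc(roc), 3))  # → 0.9
```

    Each choice of threshold yields one operating point; the area summarizes discrimination capacity independently of where any particular observer sets the threshold.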

  15. Design Methodology and Performance Evaluation of New Generation Sounding Rockets

    Directory of Open Access Journals (Sweden)

    Marco Pallone

    2018-01-01

    Full Text Available Sounding rockets are currently deployed for the purpose of providing experimental data on the upper atmosphere, as well as for microgravity experiments. This work provides a methodology to design, model, and evaluate the performance of new sounding rockets. A general configuration composed of a rocket with four canards and four tail wings is sized and optimized, assuming different payload masses and microgravity durations. The aerodynamic forces are modeled with high fidelity using interpolation of available data. Three different guidance algorithms are used for the trajectory integration: constant attitude, near-radial, and sun-pointing. The sun-pointing guidance is used to obtain the best microgravity performance while maintaining a specified attitude with respect to the sun, allowing for experiments that are temperature sensitive. Near-radial guidance instead has the main purpose of reaching high altitudes, thus maximizing the microgravity duration. The results prove that the methodology at hand is straightforward to implement and capable of providing satisfactory performance in terms of microgravity duration.

  16. Calibration Modeling Methodology to Optimize Performance for Low Range Applications

    Science.gov (United States)

    McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.

    2010-01-01

    Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty, expressed as a percent of full scale. However, in some applications enhanced performance is sought at the low end of the range; in such cases, expressing the accuracy as a percent of reading should be considered as a modeling strategy. For example, it is common to desire to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent-of-reading accuracy requirement, which has broad application in all types of transducer applications where low-range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System, which employs seven strain-gage-based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
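    One simple way to realize a percent-of-reading requirement is a weighted least-squares calibration fit in which each point is weighted by the inverse square of its reading, so that relative rather than absolute residuals are minimized. The sketch below is an illustrative example with invented calibration data, not the statistical models actually used for the Mars Entry Atmospheric Data System.

```python
def weighted_linear_fit(x, y, w):
    """Weighted least squares for y ~ a + b*x with per-point weights w."""
    sw = sum(w)
    xm = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ym = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = sum(wi * (xi - xm) * (yi - ym) for wi, xi, yi in zip(w, x, y)) \
        / sum(wi * (xi - xm) ** 2 for wi, xi in zip(w, x))
    a = ym - b * xm
    return a, b

# Hypothetical calibration data: applied load x, transducer output y.
x = [1.0, 2.0, 5.0, 10.0, 50.0, 100.0]
y = [1.02, 1.99, 5.05, 10.1, 50.8, 101.5]

# Percent-of-full-scale: uniform weights.  Percent-of-reading: weights 1/y^2,
# which penalize relative error and tighten the fit at the low range.
a_fs, b_fs = weighted_linear_fit(x, y, [1.0] * len(x))
a_rd, b_rd = weighted_linear_fit(x, y, [1.0 / yi ** 2 for yi in y])
```

    With these data the percent-of-reading weighting yields a noticeably smaller relative residual at the lowest calibration point, at the cost of slightly larger absolute residuals near full scale.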

  17. TOOLS TO INCLUDE BLIND STUDENTS IN SCHOOL BUILDING PERFORMANCE ASSESSMENTS

    Directory of Open Access Journals (Sweden)

    Tania Pietzschke Abate

    2016-05-01

    Full Text Available This article discusses the design of data collection instruments that include the opinions of blind students, in accordance with the principles of Universal Design (UD). The aim of this study is to understand the importance of adapting data collection instruments for the inclusion of disabled persons in field research in Architecture and Design, among other fields. The data collection instruments developed were a play interview with a tactile map and a 3D survey with the use of tactile models. These instruments sought to assess the school environment experienced by blind students. The study involved students from the early years of a school for the blind who had not yet mastered the Braille system, and the participation of these students was evaluated. A multidisciplinary team consisting of architects, designers, educators, and psychologists lent support to the study. The results showed that the data collection instruments adapted to blind students were successful in making the group of authors examine questions regarding UD. An analysis of the participatory phase showed that the limitations resulting from blindness determine the specificities of the adaptation and implementation process of the instruments in schools. Practical recommendations are presented for future studies related to instruments in the UD theme. This approach is in line with the global trend of including disabled persons in society based on these users’ opinions concerning what was designed by architects and designers.

  18. A performance assessment methodology for high-level radioactive waste disposal in unsaturated, fractured tuff

    International Nuclear Information System (INIS)

    Gallegos, D.P.

    1991-07-01

    Sandia National Laboratories has developed a methodology for performance assessment of deep geologic disposal of high-level nuclear waste. The applicability of this performance assessment methodology has been demonstrated for disposal in bedded salt and basalt; it has since been modified for assessment of repositories in unsaturated, fractured tuff. Changes to the methodology are primarily in the form of new or modified groundwater flow and radionuclide transport codes. A new computer code, DCM3D, has been developed to model three-dimensional groundwater flow in unsaturated, fractured rock using a dual-continuum approach. The NEFTRAN 2 code has been developed to efficiently model radionuclide transport in time-dependent velocity fields; it has the ability to use externally calculated pore velocities and saturations and includes the effect of saturation-dependent retardation factors. In order to use these codes together in performance-assessment-type analyses, code-coupler programs were developed to translate DCM3D output into NEFTRAN 2 input. Other portions of the performance assessment methodology were evaluated as part of modifying the methodology for tuff. The scenario methodology developed under the bedded salt program has been applied to tuff. An investigation of the applicability of uncertainty and sensitivity analysis techniques to non-linear models indicates that Monte Carlo simulation remains the most robust technique for these analyses. No changes have been recommended for the dose and health effects models, nor for the biosphere transport models. 52 refs., 1 fig

  19. Quality control of CT units - methodology of performance I

    International Nuclear Information System (INIS)

    Prlic, I.; Radalj, Z.

    1996-01-01

    The increasing use of X-ray computed tomography systems (CT scanners) in diagnostics requires an efficient means of evaluating their performance. This paper therefore presents a way to measure (a Quality Control procedure, Q/C) and define CT scanner performance using a special phantom based on the recommendations of the American Association of Physicists in Medicine (AAPM). The performance parameters measurable with the phantom represent the scanner's capability, so periodic evaluation of these parameters enables users to monitor the stability of the CT scanner regardless of the manufacturer, model, or software options of the scanner. Five important performance parameters are to be measured: noise, contrast scale, nominal tomographic section thickness, and high- and low-contrast resolution (MTF). A sixth parameter is, of course, the dose per scan and slice, which gives the patient dose for a given diagnostic procedure. The last but not least parameter is the final image quality, which is given by the image processing device connected to the scanner; this is the final medical information needed for good medical practice according to Quality Assurance (Q/A) procedures in diagnostic radiology. The results of the performance evaluation must be assured to be free of environmental influences (the measurements are to be made under defined conditions according to Q/A). This paper does not give a detailed methodology recipe but shows, with one example (the system noise and linearity measurements), the need for and relevant results of the measurements. The rest of the methodology is to be published. (author)

  20. Design Methodology And Performance Studies Of A Flexible Electrotextile Surface

    Directory of Open Access Journals (Sweden)

    Kayacan Ozan

    2015-09-01

    Full Text Available The ‘smart textiles’ concept must develop products based not only on design, fashion, and comfort but also on function. The novel electro-textiles on the market open up new trends in smart and interactive gadgets, and easy care and durability are among the most important features of these products. On the other hand, wearable electronic knitwear has been gaining the attention of both researchers and industrial sectors. Combining knitting technology with electronics may become a dominant trend in the future because of the wide application possibilities. This research is concerned primarily with the design methodology of knitted fabrics containing electrically conductive textiles, and especially with in-use performance studies. The structural characteristics of the fabrics have been evaluated to enhance the performance properties.

  1. Performance evaluation methodology for historical document image binarization.

    Science.gov (United States)

    Ntirogiannis, Konstantinos; Gatos, Basilis; Pratikakis, Ioannis

    2013-02-01

    Document image binarization is of great importance in the document image analysis and recognition pipeline since it affects further stages of the recognition process. The evaluation of a binarization method aids in studying its algorithmic behavior, as well as verifying its effectiveness, by providing qualitative and quantitative indication of its performance. This paper addresses a pixel-based binarization evaluation methodology for historical handwritten/machine-printed document images. In the proposed evaluation scheme, the recall and precision evaluation measures are properly modified using a weighting scheme that diminishes any potential evaluation bias. Additional performance metrics of the proposed evaluation scheme consist of the percentage rates of broken and missed text, false alarms, background noise, character enlargement, and merging. Several experiments conducted in comparison with other pixel-based evaluation measures demonstrate the validity of the proposed evaluation scheme.
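    A minimal pixel-based evaluation in the spirit described above can be sketched as plain recall, precision, and F-measure over foreground pixels; this illustration omits the paper's bias-diminishing weighting scheme and its additional metrics (broken/missed text, false alarms, and so on).

```python
def pixel_metrics(gt, binarized):
    """Pixel-based recall, precision and F-measure for binary images,
    where 1 marks text (foreground) and 0 marks background."""
    tp = sum(g == 1 and b == 1 for g, b in zip(gt, binarized))
    fp = sum(g == 0 and b == 1 for g, b in zip(gt, binarized))
    fn = sum(g == 1 and b == 0 for g, b in zip(gt, binarized))
    recall = tp / (tp + fn) if tp + fn else 0.0
    precision = tp / (tp + fp) if tp + fp else 0.0
    f_measure = (2 * recall * precision / (recall + precision)
                 if recall + precision else 0.0)
    return recall, precision, f_measure

# Flattened 1-D masks of a tiny ground-truth image and a binarization result.
gt        = [1, 1, 1, 0, 0, 0, 1, 1]
binarized = [1, 1, 0, 0, 1, 0, 1, 1]
r, p, f = pixel_metrics(gt, binarized)
```

    Here one missed text pixel lowers recall and one noise pixel lowers precision; the weighting scheme of the proposed methodology would additionally discount pixels (e.g. near stroke boundaries) that would otherwise bias these counts.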

  2. Application of transient analysis methodology to heat exchanger performance monitoring

    International Nuclear Information System (INIS)

    Rampall, I.; Soler, A.I.; Singh, K.P.; Scott, B.H.

    1994-01-01

    A transient testing technique is developed to evaluate the thermal performance of industrial-scale heat exchangers. A Galerkin-based numerical method with a choice of spectral basis elements to account for spatial temperature variations in heat exchangers is developed to solve the transient heat exchanger model equations. Testing a heat exchanger in the transient state may be the only viable alternative where conventional steady-state testing procedures are impossible or infeasible. For example, this methodology is particularly suited to the determination of fouling levels in component cooling water system heat exchangers in nuclear power plants. The heat load on these so-called component coolers under steady-state conditions is too small to permit meaningful testing. An adequate heat load develops immediately after a reactor shutdown, when the exchanger inlet temperatures are highly time-dependent. The application of the analysis methodology is illustrated herein with reference to in-situ transient testing carried out at a nuclear power plant. The method, however, is applicable to any transient testing application
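    As a toy illustration of transient thermal testing, the outlet temperature of an exchanger can be marched in time toward the steady state implied by its UA product. This lumped-parameter sketch with assumed numbers is far simpler than the Galerkin spectral method of the paper, but it shows the kind of time-dependent response that transient test data would be fitted against.

```python
import math

def effectiveness(ua, m_dot_cp):
    """Effectiveness for an exchanger with one near-isothermal stream:
    eps = 1 - exp(-NTU), NTU = UA / (m_dot * cp)."""
    return 1.0 - math.exp(-ua / m_dot_cp)

def transient_outlet(t_in, t_service, ua, m_dot_cp, tau, dt, steps):
    """Explicit-Euler march of a first-order lumped model toward the
    steady-state outlet temperature, with thermal time constant tau."""
    eps = effectiveness(ua, m_dot_cp)
    t_ss = t_in + eps * (t_service - t_in)   # steady-state outlet temperature
    t_out = t_in                             # start from the inlet temperature
    history = []
    for _ in range(steps):
        t_out += dt * (t_ss - t_out) / tau
        history.append(t_out)
    return history

# Assumed values: cooling water at 20 C entering an exchanger whose other
# side sits near 80 C; UA in W/K, m_dot*cp in W/K, times in seconds.
hist = transient_outlet(t_in=20.0, t_service=80.0, ua=5e4,
                        m_dot_cp=2.5e4, tau=30.0, dt=1.0, steps=300)
```

    In an actual transient test the logic runs in reverse: the measured outlet history is fitted to such a model, and the inferred UA (hence fouling level) is the unknown.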

  3. Methodology for assessing performance of waste management systems

    International Nuclear Information System (INIS)

    Meshkov, N.K.; Herzenberg, C.L.; Camasta, S.F.

    1988-01-01

    The newly revised draft DOE Order 5820.2, Chapter 3, requires that DOE low-level waste be managed on a systematic basis using the most appropriate combination of waste generation reduction, segregation, treatment, and disposal practices so that the radioactive components are contained and the overall cost effectiveness is minimized. This order expects each site to prepare and maintain an overall waste management systems performance assessment supporting the combination of waste management practices used in generation reduction, segregation, treatment, packaging, storage, and disposal. A document prepared by EG and G Idaho, Inc. for the Department of Energy, called Guidance for Conduct of Waste Management Systems Performance Assessment, is specifically intended to provide the approach necessary to meet the systems performance assessment requirement of DOE Order 5820.2, Chapter 3, and other applicable state regulations dealing with LLW (low-level radioactive waste). Methods and procedures are needed for assessing the performance of a waste management system; this report addresses that need. The purpose of the methodology provided in this report is to select the optimal way to manage particular sets of waste streams from generation to disposal in a safe and cost-effective manner, and thereby assist DOE LLW managers in complying with DOE Order 5820.2, Chapter 3, and the associated guidance document

  4. An integrated methodology for the dynamic performance and reliability evaluation of fault-tolerant systems

    International Nuclear Information System (INIS)

    Dominguez-Garcia, Alejandro D.; Kassakian, John G.; Schindall, Joel E.; Zinchuk, Jeffrey J.

    2008-01-01

    We propose an integrated methodology for the reliability and dynamic performance analysis of fault-tolerant systems. This methodology uses a behavioral model of the system dynamics, similar to the ones used by control engineers to design the control system, but also incorporates artifacts to model the failure behavior of each component. These artifacts include component failure modes (and associated failure rates) and how those failure modes affect the dynamic behavior of the component. The methodology bases the system evaluation on the analysis of the dynamics of the different configurations the system can reach after component failures occur. For each of the possible system configurations, a performance evaluation of its dynamic behavior is carried out to check whether its properties, e.g., accuracy, overshoot, or settling time, which are called performance metrics, meet system requirements. Markov chains are used to model the stochastic process associated with the different configurations that a system can adopt when failures occur. This methodology not only enables an integrated framework for evaluating dynamic performance and reliability of fault-tolerant systems, but also enables a method for guiding the system design process and further optimization. To illustrate the methodology, we present a case study of a lateral-directional flight control system for a fighter aircraft
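    The Markov-chain bookkeeping over system configurations can be sketched as follows. The states, failure rates, and the judgment that the degraded configuration still meets the performance metrics are illustrative assumptions, not the flight-control case study of the paper.

```python
def step(probs, rates, dt):
    """One explicit-Euler step of the Markov chain dp/dt = p*Q, with the
    generator built from per-state rates {from_state: {to_state: rate}}."""
    out = dict(probs)
    for src, targets in rates.items():
        for dst, lam in targets.items():
            flow = probs[src] * lam * dt
            out[src] -= flow
            out[dst] += flow
    return out

# Configurations: nominal -> degraded (one redundant component lost) -> failed.
# Assume the dynamic analysis showed 'degraded' still meets the performance
# metrics (accuracy, overshoot, settling time), so it counts as operational.
rates = {"nominal": {"degraded": 1e-3}, "degraded": {"failed": 5e-3}}
p = {"nominal": 1.0, "degraded": 0.0, "failed": 0.0}
for _ in range(1000):          # integrate 1000 hours in 1-hour steps
    p = step(p, rates, dt=1.0)
reliability = p["nominal"] + p["degraded"]
```

    The key coupling to the dynamic analysis is in which states are summed into `reliability`: a configuration that fails its performance metrics would be counted with `failed` even though the hardware is partially alive.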

  5. Development of Human Performance Analysis and Advanced HRA Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-15

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error, constructing the technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the event diagnosis phase. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of software, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze EOC (errors of commission) and EOO (errors of omission). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  6. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-01

    The purpose of this project is to build a systematic framework that can evaluate the effect of human-factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error, constructing the technical basis for human reliability analysis. There are three main results of this study. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the event diagnosis phase. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of software, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze EOC (errors of commission) and EOO (errors of omission). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants

  7. Caffeine and cognitive performance: persistent methodological challenges in caffeine research.

    Science.gov (United States)

    James, Jack E

    2014-09-01

    Human cognitive performance is widely perceived to be enhanced by caffeine at usual dietary doses. However, the evidence for and against this belief continues to be vigorously contested. Controversy has centred on caffeine withdrawal and withdrawal reversal as potential sources of experimental confounding. In response, some researchers have enlisted "caffeine-naïve" experimental participants (persons alleged to consume little or no caffeine) assuming that they are not subject to withdrawal. This mini-review examines relevant research to illustrate general methodological challenges that have been the cause of enduring confusion in caffeine research. At issue are the processes of caffeine withdrawal and withdrawal reversal, the definition of caffeine-naïve, the population representativeness of participants deemed to be caffeine-naïve, and confounding due to caffeine tolerance. Attention to these processes is necessary if premature conclusions are to be avoided, and if caffeine's complex effects and the mechanisms responsible for those effects are to be illuminated. Strategies are described for future caffeine research aimed at minimising confounding from withdrawal and withdrawal reversal. Copyright © 2014 Elsevier Inc. All rights reserved.

  8. Integrated cost estimation methodology to support high-performance building design

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, Prasad; Greden, Lara; Eijadi, David; McDougall, Tom [The Weidt Group, Minnetonka (United States); Cole, Ray [Axiom Engineers, Monterey (United States)

    2007-07-01

    Design teams evaluating the performance of energy conservation measures (ECMs) calculate energy savings rigorously with established modelling protocols, accounting for the interaction between various measures. However, incremental cost calculations do not have a similar rigor. Often there is no recognition of cost reductions with integrated design, nor is there assessment of cost interactions amongst measures. This lack of rigor feeds the notion that high-performance buildings cost more, creating a barrier for design teams pursuing aggressive high-performance outcomes. This study proposes an alternative integrated methodology to arrive at a lower perceived incremental cost for improved energy performance. The methodology is based on the use of energy simulations as a means toward integrated design and cost estimation. Various points along the spectrum of integration are identified and characterized by the amount of design effort invested, the scheduling of effort, and the relative energy performance of the resultant design. The study includes an analysis of the interactions between building system parameters as they relate to capital costs, and several cost interactions amongst energy measures are found to be significant. The value of this approach is demonstrated with alternatives in a case study that shows the differences between perceived costs for energy measures along various points on the integration spectrum. These alternatives show design tradeoffs and identify how decisions would have been different with a standard costing approach. Areas of further research to make the methodology more robust are identified, and policy measures to encourage the integrated approach and reduce the barriers to improved energy performance are discussed.

  9. Lean methodology for performance improvement in the trauma discharge process.

    Science.gov (United States)

    O'Mara, Michael Shaymus; Ramaniuk, Aliaksandr; Graymire, Vickie; Rozzell, Monica; Martin, Stacey

    2014-07-01

    High-volume, complex services such as trauma and acute care surgery are at risk for inefficiency. Lean process improvement can reduce health care waste, and lean allows a structured look at processes not easily amenable to analysis. We applied lean methodology to the current state of communication and discharge planning on an urban trauma service, citing areas for improvement. A lean process mapping event was held. The process map was used to identify areas for immediate analysis and intervention, defining metrics for the stakeholders. After intervention, new performance was assessed by direct data evaluation. The process was completed with an analysis of effect, and plans were made for addressing future focus areas. The primary area of concern identified was interservice communication. Changes centering on a standardized morning report structure reduced the number of unanswered consult questions from 67% to 34% (p = 0.0021). Physical therapy rework was reduced from 35% to 19% (p = 0.016). Patients admitted to units not designated to the trauma service had 1.6 times longer stays, and miscommunication exists around patient education at discharge. Lean process improvement is a viable means of health care analysis. When applied to a trauma service with 4,000 admissions annually, lean identifies areas ripe for improvement. Our inefficiencies surrounded communication and patient localization. Strategies arising from the input of all stakeholders led to real solutions for communication through a face-to-face morning report and identified areas for ongoing improvement. This focuses resource use and identifies areas for improvement of throughput in care delivery.

  10. A Proposal for a Methodology to Develop a Cyber-Attack Penetration Test Scenario Including NPPs Safety

    Energy Technology Data Exchange (ETDEWEB)

    Lee, In Hyo [KAIST, Daejeon (Korea, Republic of); Son, Han Seong [Joongbu Univ., Geumsan (Korea, Republic of); Kim, Si Won [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of); Kang, Hyun Gook [Rensselaer Polytechnic Institute, Troy (United States)

    2016-10-15

    Penetration testing is a method for evaluating the cyber security of NPPs, and this approach has been performed in some studies. Because those studies focused on vulnerability finding or test bed construction, a scenario-based approach was not taken. However, to test the cyber security of NPPs, a proper test scenario is needed. Ahn et al. developed cyber-attack scenarios, but those scenarios could not be applied in penetration testing because they were developed from past incidents of NPPs induced by cyber-attack. That is, those scenarios only covered events that had happened before, so they could neither cover other possible scenarios nor be reflected in a penetration test. In this study, a method to develop a cyber-attack penetration test scenario for NPPs, focused especially on the safety point of view, is suggested. To evaluate the cyber security of NPPs, penetration testing is a possible approach, and a method to develop a penetration test scenario is explained. In particular, the goal of the hacker is taken to be deterioration of nuclear fuel integrity, so in the methodology Level 1 PSA results are utilized to reflect plant safety in the security analysis. From the PSA results, basic events were post-processed and possible cyber-attacks were reviewed against vulnerabilities of the digital control system.

  11. A Proposal for a Methodology to Develop a Cyber-Attack Penetration Test Scenario Including NPPs Safety

    International Nuclear Information System (INIS)

    Lee, In Hyo; Son, Han Seong; Kim, Si Won; Kang, Hyun Gook

    2016-01-01

    Penetration testing is a method for evaluating the cyber security of NPPs, and this approach has been performed in some studies. Because those studies focused on vulnerability finding or test bed construction, a scenario-based approach was not taken. However, to test the cyber security of NPPs, a proper test scenario is needed. Ahn et al. developed cyber-attack scenarios, but those scenarios could not be applied in penetration testing because they were developed from past incidents of NPPs induced by cyber-attack. That is, those scenarios only covered events that had happened before, so they could neither cover other possible scenarios nor be reflected in a penetration test. In this study, a method to develop a cyber-attack penetration test scenario for NPPs, focused especially on the safety point of view, is suggested. To evaluate the cyber security of NPPs, penetration testing is a possible approach, and a method to develop a penetration test scenario is explained. In particular, the goal of the hacker is taken to be deterioration of nuclear fuel integrity, so in the methodology Level 1 PSA results are utilized to reflect plant safety in the security analysis. From the PSA results, basic events were post-processed and possible cyber-attacks were reviewed against vulnerabilities of the digital control system

  12. Measuring Instruments Control Methodology Performance for Analog Electronics Remote Labs

    Directory of Open Access Journals (Sweden)

    Unai Hernandez-Jayo

    2012-12-01

    Full Text Available This paper presents work that has been developed in parallel to the VISIR project. The objective of this paper is to present the results of the validation processes that have been carried out to check the control methodology. This method has been developed with the aim of being independent of the instruments of the labs.

  13. A global fouling factor methodology for analyzing steam generator thermal performance degradation

    International Nuclear Information System (INIS)

    Kreider, M.A.; White, G.A.; Varrin, R.D. Jr.

    1998-06-01

    Over the past few years, steam generator (SG) thermal performance degradation has led to decreased plant efficiency and power output at numerous PWR nuclear power plants with recirculating-type SGs. The authors have developed and implemented methodologies for quantitatively evaluating the various sources of SG performance degradation, both internal and external to the SG pressure boundary. These methodologies include computation of the global fouling factor history, evaluation of secondary deposit thermal resistance using deposit characterization data, and consideration of pressure-loss causes unrelated to the tube bundle, such as hot-leg temperature streaming and SG moisture separator fouling. In order to evaluate the utility of the global fouling factor methodology, the authors performed case studies for a number of PWR SG designs; key results from two of these studies are presented here. In tandem with the fouling factor analyses, the potential causes of pressure loss were evaluated for each plant. The combined results of the global fouling factor calculations and the pressure-loss evaluations demonstrated two key points: (1) the available thermal margin against fouling, which can vary substantially from plant to plant, has an important bearing on whether a given plant exhibits losses in electrical generating capacity, and (2) a wide variety of causes can result in SG thermal performance degradation
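    A global fouling factor is essentially the added heat-transfer resistance inferred by comparing the fouled overall heat-transfer coefficient, deduced from plant thermal data, with its clean-condition value. The sketch below is a simplified LMTD-based illustration with assumed numbers, not the authors' methodology.

```python
import math

def lmtd(t_hot_in, t_hot_out, t_cold_sat):
    """Log-mean temperature difference for a steam generator, treating the
    secondary side as boiling at an approximately constant saturation temperature."""
    dt1 = t_hot_in - t_cold_sat
    dt2 = t_hot_out - t_cold_sat
    return (dt1 - dt2) / math.log(dt1 / dt2)

def overall_u(q, area, lmtd_value):
    """Overall heat-transfer coefficient from duty Q [W], area [m^2], LMTD [K]."""
    return q / (area * lmtd_value)

def global_fouling_factor(u_fouled, u_clean):
    """Added thermal resistance relative to clean conditions [m^2*K/W]."""
    return 1.0 / u_fouled - 1.0 / u_clean

# Assumed illustrative values for one operating point.
q = 900e6        # thermal duty, W
area = 5000.0    # heat-transfer area, m^2
dT = lmtd(327.0, 293.0, 285.0)   # hot-leg in/out and steam saturation temps, deg C
u_now = overall_u(q, area, dT)
rf = global_fouling_factor(u_now, u_clean=12000.0)
```

    Tracking `rf` over successive operating cycles gives the fouling factor history; a rising trend signals deposit buildup before it consumes the plant's thermal margin.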

  14. Development of a fluidized bed agglomeration modeling methodology to include particle-level heterogeneities in ash chemistry and granular physics

    Science.gov (United States)

    Khadilkar, Aditi B.

    The utility of fluidized bed reactors for combustion and gasification can be enhanced if operational issues such as agglomeration are mitigated. The monetary and efficiency losses could be avoided through a mechanistic understanding of the agglomeration process and prediction of operational conditions that promote agglomeration. Pilot-scale experimentation prior to operation for each specific condition can be cumbersome and expensive, so the development of a mathematical model would aid predictions. With this motivation, the study comprised the following model development stages: 1) development of an agglomeration modeling methodology based on binary particle collisions, 2) study of heterogeneities in ash chemical composition and gaseous atmosphere, 3) computation of a distribution of particle collision frequencies based on granular physics for a polydisperse particle size distribution, 4) combination of the ash chemistry and granular physics inputs to obtain agglomerate growth probabilities, and 5) validation of the modeling methodology. The modeling methodology consisted of testing every binary particle collision in the system for sticking, based on the extent of dissipation of the particles' kinetic energy through viscous dissipation by the slag-liquid (molten ash) covering the particles. In the modeling methodology developed in this study, thermodynamic equilibrium calculations are used to estimate the amount of slag-liquid in the system, and the changes in particle collision frequencies are accounted for by continuously tracking the number density of the various particle sizes. In this study, the heterogeneities in the chemical composition of fuel ash were studied by separating the bulk fuel into particle classes that are rich in specific minerals. FactSage simulations were performed on two bituminous coals and an anthracite to understand the effect of particle-level heterogeneities on agglomeration. 
The mineral matter behavior of these constituent classes was studied
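
    The sticking test described above, dissipation of collision kinetic energy in the viscous slag layer, is commonly expressed as a viscous Stokes number criterion (collisions stick when St falls below a critical value). The sketch below is a generic version of that criterion, not the study's actual code, and all function names and parameter values are illustrative.

    ```python
    import math

    def stokes_number(rho_p, d_p, v_rel, mu_slag):
        """Viscous Stokes number of a binary collision: ratio of the particles'
        kinetic energy to the viscous dissipation in the slag-liquid layer."""
        return rho_p * d_p * v_rel / (9.0 * mu_slag)

    def critical_stokes(e_rest, layer_thickness, asperity_height):
        """Critical Stokes number; collisions with St below this value stick."""
        return (1.0 + 1.0 / e_rest) * math.log(layer_thickness / asperity_height)

    def sticks(rho_p, d_p, v_rel, mu_slag, e_rest, layer_thickness, asperity_height):
        """Sticking test for one binary particle collision."""
        st = stokes_number(rho_p, d_p, v_rel, mu_slag)
        return st < critical_stokes(e_rest, layer_thickness, asperity_height)
    ```

    Under these assumptions a slow collision between particles coated with a viscous melt sticks, while the same pair colliding fast through a thin, low-viscosity layer rebounds.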

  15. Performance specification methodology: introduction and application to displays

    Science.gov (United States)

    Hopper, Darrel G.

    1998-09-01

    Acquisition reform is based on the notion that DoD must rely on the commercial marketplace insofar as possible rather than solely looking inward to a military marketplace to meet its needs. This reform forces a fundamental change in the way DoD conducts business, including a heavy reliance on private sector models of change. The key to more reliance on the commercial marketplace is the performance specification (PS). This paper introduces some PS concepts and a PS classification principle to help bring some structure to the analysis of risk (cost, schedule, capability) in weapons system development and the management of opportunities for affordable ownership (maintain/increase capability via technology insertion, reduce cost) in this new paradigm. The DoD shift toward commercial components is nowhere better exemplified than in displays. Displays are the quintessential dual-use technology and are used herein to exemplify these PS concepts and principle. The advent of flat panel displays as a successful technology is setting off an epochal shift in cockpits and other military applications. Displays are installed in every DoD weapon system, and are, thus, representative of a range of technologies where issues and concerns throughout industry and government have been raised regarding the increased DoD reliance on the commercial marketplace. Performance specifications require metrics: the overall metrics of 'information-thrust' with units of Mb/s and 'specific info-thrust' with units of Mb/s/kg are introduced to analyze value of a display to the warfighter and affordability to the taxpayer.
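
    The abstract defines its metrics only by their units, so the formulas below are an assumption, not the paper's definitions: information rate taken as pixels x bits per pixel x refresh rate, and 'specific info-thrust' as that rate per unit of display mass.

    ```python
    def information_thrust_mbps(h_px, v_px, bits_per_px, refresh_hz):
        """Display information rate in Mb/s (10**6 bits per second)."""
        return h_px * v_px * bits_per_px * refresh_hz / 1e6

    def specific_info_thrust(h_px, v_px, bits_per_px, refresh_hz, mass_kg):
        """'Specific info-thrust' in Mb/s/kg: information rate per unit of
        installed display mass."""
        return information_thrust_mbps(h_px, v_px, bits_per_px, refresh_hz) / mass_kg
    ```

    On this reading, a 640x480 panel at 24 bits and 60 Hz delivers about 442 Mb/s, and about 221 Mb/s/kg if it weighs 2 kg.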

  16. A global fouling factor methodology for analyzing steam generator thermal performance degradation

    International Nuclear Information System (INIS)

    Kreider, M.A.; White, G.A.; Varrin, R.D.

    1998-01-01

    Over the past few years, steam generator (SG) thermal performance degradation has led to decreased plant efficiency and power output at numerous PWR nuclear power plants with recirculating-type SGs. The authors have developed and implemented methodologies for quantitatively evaluating the various sources of SG performance degradation, both internal and external to the SG pressure boundary. These methodologies include computation of the global fouling factor history, evaluation of secondary deposit thermal resistance using deposit characterization data, and consideration of pressure loss causes unrelated to the tube bundle, such as hot-leg temperature streaming and SG moisture separator performance. In order to evaluate the utility of the global fouling factor methodology, the authors performed case studies for a number of PWR SG designs. Key results from two of these studies are presented here. Uncertainty analyses were performed to determine whether the calculated fouling factor for each plant represented significant fouling or whether uncertainty in key variables (e.g., steam pressure or feedwater flow rate) could be responsible for calculated fouling. The methodology was validated in two ways: by predicting the SG pressure following chemical cleaning at San Onofre 2, and by performing a sensitivity study with the industry-standard thermal-hydraulics code ATHOS to investigate the effects of spatially varying tube scale distributions. This study indicated that the average scale thickness has a greater impact on fouling than the spatial distribution, showing that the assumption of uniform resistance inherent to the global fouling factor is reasonable. In tandem with the fouling-factor analyses, the potential causes of pressure loss were evaluated for each plant. The combined results of the global fouling factor calculations and the pressure loss evaluations demonstrated two key points: 1) that the available thermal margin against fouling, which can
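
    A global fouling factor of the kind described is conventionally the added thermal resistance inferred from plant data relative to a clean reference: Rf = 1/U_current - 1/U_clean, with the current overall heat transfer coefficient backed out from heat duty, tube area, and log-mean temperature difference. The sketch below assumes that conventional formulation; it is not the authors' code, and the numbers in the test are invented.

    ```python
    import math

    def lmtd(t_primary_in, t_primary_out, t_sat):
        """Log-mean temperature difference between the primary coolant and the
        secondary side boiling at saturation temperature t_sat (all in K or C)."""
        dt1 = t_primary_in - t_sat
        dt2 = t_primary_out - t_sat
        return (dt1 - dt2) / math.log(dt1 / dt2)

    def global_fouling_factor(q_watts, area_m2, t_primary_in, t_primary_out,
                              t_sat, u_clean):
        """Added thermal resistance (m^2*K/W) relative to the clean reference
        overall heat transfer coefficient u_clean (W/m^2/K)."""
        u_current = q_watts / (area_m2 * lmtd(t_primary_in, t_primary_out, t_sat))
        return 1.0 / u_current - 1.0 / u_clean
    ```

    A falling measured steam pressure lowers the inferred LMTD driving force for the same duty, which shows up here as a growing fouling factor.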

  17. Demonstration of a performance assessment methodology for nuclear waste isolation in basalt formations

    International Nuclear Information System (INIS)

    Bonano, E.J.; Davis, P.A.

    1988-01-01

    This paper summarizes the results of the demonstration of a performance assessment methodology developed by Sandia National Laboratories, Albuquerque for the US Nuclear Regulatory Commission for use in the analysis of high-level radioactive waste disposal in deep basalts. Seven scenarios that could affect the performance of a repository in basalts were analyzed. One of these scenarios, normal ground-water flow, was called the base-case scenario. This was used to demonstrate the modeling capabilities in the methodology necessary to assess compliance with the ground-water travel time criterion. The scenario analysis consisted of both scenario screening and consequence modeling. Preliminary analyses of scenarios considering heat released from the waste and the alteration of the hydraulic properties of the rock mass due to loads created by a glacier suggested that these effects would not be significant. The analysis of other scenarios indicated that those changing the flow field in the vicinity of the repository would have an impact on radionuclide discharges, while changes far from the repository may not be significant. The analysis of the base-case scenario was used to show the importance of matrix diffusion as a radionuclide retardation mechanism in fractured media. The demonstration of the methodology also included an overall sensitivity analysis to identify important parameters and/or processes. 15 refs., 13 figs., 2 tabs

  18. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    International Nuclear Information System (INIS)

    Darcel, C.; Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O.

    2009-11-01

    the other, addresses the issue of the nature of the transition. We develop a new 'mechanistic' model that could help in modeling why and where this transition can occur. The transition between both regimes would occur for a fracture length of 1-10 m, and even at a smaller scale for the few outcrops that follow the self-similar density model. A consequence for the disposal issue is that the model that is likely to apply in the 'blind' scale window between 10-100 m is the self-similar model as it is defined for large-scale lineaments. The self-similar model, as it is measured for some outcrops and most lineament maps, is definitely worth being investigated as a reference for scales above 1-10 m. In the rest of the report, we develop a methodology for incorporating uncertainty and variability into the DFN modeling. Fracturing properties arise from complex processes which produce an intrinsic variability; characterizing this variability, either as an admissible variation of model parameters or as the division of the site into subdomains with distinct DFN models, is a critical point of the modeling effort. Moreover, the DFN model encompasses a part of uncertainty, due to inherent data uncertainties and sampling limits. Both effects must be quantified and incorporated into the DFN site model definition process. In that context, all available borehole data, including recordings of fracture intercept positions, pole orientations and relative uncertainties, are used as the basis for the methodological development and further site model assessment. An elementary dataset contains a set of discrete fracture intercepts from which a parent orientation/density distribution can be computed. The elementary bricks of the site, from which these initial parent density distributions are computed, rely on the former Single Hole Interpretation division of the boreholes into sections whose local boundaries are expected to reflect, locally, the main characteristics of the geology and fracturing properties.
    From that
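
    The self-similar density model discussed above is a power-law fracture length distribution. As a generic illustration (not the report's code, with a hypothetical exponent and truncation bounds), fracture lengths can be drawn from a truncated power-law by inverse-CDF sampling:

    ```python
    import random

    def sample_powerlaw_lengths(n, a, l_min, l_max, seed=0):
        """Draw n fracture lengths from a truncated power-law density
        p(l) ~ l**-a on [l_min, l_max], with a > 1, by inverse-CDF sampling."""
        rng = random.Random(seed)
        b = 1.0 - a  # exponent of the integrated (cumulative) form
        lengths = []
        for _ in range(n):
            u = rng.random()
            # invert the CDF of the truncated power law
            lengths.append((u * (l_max ** b - l_min ** b) + l_min ** b) ** (1.0 / b))
        return lengths
    ```

    With a typical exponent near 2.7, almost all sampled fractures sit near the lower cutoff, which is exactly why the 10-100 m 'blind' window is hard to constrain from small-scale data alone.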

  19. Application of the Biosphere Assessment Methodology to the ENRESA, 1997 Performance and Safety Assessment

    International Nuclear Information System (INIS)

    Pinedo, P.; Simon, I.; Aguero, A.

    1998-01-01

    For several years CIEMAT has been developing for ENRESA knowledge and tools to support the modelling of the migration and accumulation of radionuclides within the biosphere once those radionuclides are released or reach one or more parts of the biosphere (atmosphere, water bodies or soils). The model development also includes evaluation of radiological impacts arising from the resulting distribution of radionuclides in the biosphere. In 1996, a Methodology to analyse the biosphere in this context was proposed to ENRESA. The level of development of the different aspects proposed within the Methodology was quite heterogeneous and, while aspects of radionuclide transport modelling were already well developed in theoretical and practical terms, other aspects, like the procedure for conceptual model development and the description of biosphere systems representative of the long term, needed further development. At present, the International Atomic Energy Agency (IAEA) Programme on Biosphere Modelling and Assessment (BIOMASS), in collaboration with several national organizations, ENRESA and CIEMAT among them, is working to complete and augment the Reference Biosphere Methodology and to produce some practical descriptions of Reference Systems. The overall purpose of this document is to apply the Methodology, taking account of on-going developments in biosphere modelling, to the last performance assessment (PA) exercise made by ENRESA (ENRESA, 1997), using from it the general and particular information about the assessment context, radionuclide information, geosphere and geobiosphere interface data. There are three particular objectives to this work: (a) to determine the practicability of the Methodology in an application to a realistic assessment situation, (b) to compare and contrast previous biosphere modelling in HLW PA and, (c) to test software development related to data management and modelling. (Author) 42 refs

  20. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    Energy Technology Data Exchange (ETDEWEB)

    Darcel, C. (Itasca Consultants SAS (France)); Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O. (Geosciences Rennes, UMR 6118 CNRS, Univ. de Rennes, Rennes (France))

    2009-11-15

    the lineament scale (k_t = 2) on the other, addresses the issue of the nature of the transition. We develop a new 'mechanistic' model that could help in modeling why and where this transition can occur. The transition between both regimes would occur for a fracture length of 1-10 m, and even at a smaller scale for the few outcrops that follow the self-similar density model. A consequence for the disposal issue is that the model that is likely to apply in the 'blind' scale window between 10-100 m is the self-similar model as it is defined for large-scale lineaments. The self-similar model, as it is measured for some outcrops and most lineament maps, is definitely worth being investigated as a reference for scales above 1-10 m. In the rest of the report, we develop a methodology for incorporating uncertainty and variability into the DFN modeling. Fracturing properties arise from complex processes which produce an intrinsic variability; characterizing this variability, either as an admissible variation of model parameters or as the division of the site into subdomains with distinct DFN models, is a critical point of the modeling effort. Moreover, the DFN model encompasses a part of uncertainty, due to inherent data uncertainties and sampling limits. Both effects must be quantified and incorporated into the DFN site model definition process. In that context, all available borehole data, including recordings of fracture intercept positions, pole orientations and relative uncertainties, are used as the basis for the methodological development and further site model assessment. An elementary dataset contains a set of discrete fracture intercepts from which a parent orientation/density distribution can be computed. The elementary bricks of the site, from which these initial parent density distributions are computed, rely on the former Single Hole Interpretation division of the boreholes into sections whose local boundaries are expected to reflect, locally, the geology

  1. Synthesizing Soft Systems Methodology and Human Performance Technology

    Science.gov (United States)

    Scott, Glen; Winiecki, Donald J.

    2012-01-01

    Human performance technology (HPT), like other concepts, models, and frameworks that we use to describe the world in which we live and the way we organize ourselves to accomplish valuable activities, is built from paradigms that were fresh and relevant at the time it was conceived and from the fields of study from which it grew. However, when the…

  2. A Methodology for Making Early Comparative Architecture Performance Evaluations

    Science.gov (United States)

    Doyle, Gerald S.

    2010-01-01

    Complex and expensive systems' development suffers from a lack of method for making good system-architecture-selection decisions early in the development process. Failure to make a good system-architecture-selection decision increases the risk that a development effort will not meet cost, performance and schedule goals. This research provides a…

  3. Mutual fund performance: A synthesis of taxonomic and methodological issues

    Directory of Open Access Journals (Sweden)

    S.G. Badrinath

    2010-12-01

    This paper provides a comprehensive taxonomy of mutual funds and discusses the relative importance of these fund types. While most academic research focuses on US equity funds, we provide results for many more asset classes with this taxonomy: fixed income, balanced, global, international, sector, market-neutral and long-short funds. For each, we start by reporting statistics on the number of funds and their total net asset values at different intervals over the last four decades. We then identify short and long-term patterns in annual returns to mutual funds. We study the cross-sectional and time-series properties of the distribution of investor flows into different types of mutual funds, describe the relationship between flows and performance and discuss its implications for the strategic behaviour of managers and investors. We estimate and interpret fund performance alphas using both the single-factor and four-factor Fama-French models for each taxonomy type. Finally we describe the state of academic research on portfolio performance evaluation tilted towards an applied audience.

  4. Methodologies for Measuring Judicial Performance: The Problem of Bias

    Directory of Open Access Journals (Sweden)

    Jennifer Elek

    2014-12-01

    Concerns about gender and racial bias in the survey-based evaluations of judicial performance common in the United States have persisted for decades. Consistent with a large body of basic research in the psychological sciences, recent studies confirm that the results from these JPE surveys are systematically biased against women and minority judges. In this paper, we explain the insidious manner in which performance evaluations may be biased, describe some techniques that may help to reduce expressions of bias in judicial performance evaluation surveys, and discuss the potential problem such biases may pose in other common methods of performance evaluation used in the United States and elsewhere. We conclude by highlighting the potential adverse consequences of judicial performance evaluation programs that rely on biased measurements.

  5. A performance-oriented power transformer design methodology using multi-objective evolutionary optimization.

    Science.gov (United States)

    Adly, Amr A; Abd-El-Hafiz, Salwa K

    2015-05-01

    Transformers are regarded as crucial components in power systems. Due to market globalization, power transformer manufacturers are facing an increasingly competitive environment that mandates the adoption of design strategies yielding better performance at lower costs. In this paper, a power transformer design methodology using multi-objective evolutionary optimization is proposed. Using this methodology, which is oriented toward target-performance design, a quick rough estimate of transformer design specifics may be obtained. Testing of the suggested approach revealed significant qualitative and quantitative agreement with measured design and performance values. Details of the proposed methodology as well as sample design results are reported in the paper.
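
    The paper's evolutionary optimizer is not specified in this abstract, so the sketch below shows only the core ingredient any multi-objective method shares: non-dominated (Pareto) filtering of candidate designs scored on objectives to be minimized, here hypothetically cost and load loss. The objective tuples are invented.

    ```python
    def dominates(a, b):
        """True if design a is at least as good as b in every objective and
        strictly better in at least one (all objectives are minimized)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(designs):
        """Non-dominated subset of a list of objective tuples, e.g. (cost, loss)."""
        return [d for d in designs if not any(dominates(o, d) for o in designs)]
    ```

    An evolutionary loop would repeatedly mutate designs and keep this front; the front itself is what a designer inspects to trade cost against loss.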

  6. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    Science.gov (United States)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

    The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for synthesizing the most efficient fuel system alternative using mathematical models, and the set of performance criteria have been developed for the main stages of the study. Results from the introduction of specific engineering solutions for developing on-site energy supply sources at RH processing facilities are provided.

  7. Development of CANDU ECCS performance evaluation methodology and guides

    Energy Technology Data Exchange (ETDEWEB)

    Bang, Kwang Hyun; Park, Kyung Soo; Chu, Won Ho [Korea Maritime Univ., Jinhae (Korea, Republic of)

    2003-03-15

    The objectives of the present work are to carry out a technical evaluation and review of CANDU safety analysis methods in order to assist the development of performance evaluation methods and review guides for the CANDU ECCS. The applicability of PWR ECCS analysis models is examined, and the review suggests that unique data or models for CANDU are required for the following phenomena: break characteristics and flow, frictional pressure drop, post-CHF heat transfer correlations, core flow distribution during blowdown, containment pressure, and reflux rate. For safety analysis of CANDU, either conservative analysis or best estimate (BE) analysis can be used. The main advantage of BE analysis is a more realistic prediction of margins to acceptance criteria. The expectation is that margins demonstrated with BE methods would be larger than when a conservative approach is applied. Some outstanding safety analysis issues can be resolved by demonstrating that accident consequences are more benign than previously predicted. Success criteria for the analysis and review of Large LOCA can be developed by a top-down approach: the highest-level success criteria can be extracted from C-6, and from them the lower-level criteria can be developed step by step, in a logical fashion. The overall objective of analysis and review is to verify that the radiological consequence and frequency criteria are met.

  8. A Performance-Based Technology Assessment Methodology to Support DoD Acquisition

    National Research Council Canada - National Science Library

    Mahafza, Sherry; Componation, Paul; Tippett, Donald

    2005-01-01

    .... This methodology is referred to as Technology Performance Risk Index (TPRI). The TPRI can track technology readiness through a life cycle, or it can be used at a specific time to support a particular system milestone decision...

  9. Postal auditing methodology used to find out the performance of high rate brachytherapy equipment

    International Nuclear Information System (INIS)

    Morales, J.A.; Campa, R.

    1998-01-01

    This work describes the results of a methodology implemented at the Secondary Laboratory for Dosimetric Calibration at CPHR to check the performance of high dose rate brachytherapy equipment using cesium-137 or cobalt-60 sources

  10. Methodology for evaluating gloves in relation to the effects on hand performance capabilities: a literature review.

    Science.gov (United States)

    Dianat, Iman; Haslegrave, Christine M; Stedmon, Alex W

    2012-01-01

    The present study was conducted to review the literature on the methods that have been considered appropriate for evaluation of the effects of gloves on different aspects of hand performance, to make recommendations for the testing and assessment of gloves, and to identify where further research is needed to improve the evaluation protocols. Eighty-five papers meeting the criteria for inclusion were reviewed. Many studies show that gloves may have negative effects on manual dexterity, tactile sensitivity, handgrip strength, muscle activity and fatigue and comfort, while further research is needed to determine glove effects on pinch strength, forearm torque strength and range of finger and wrist movements. The review also highlights several methodological issues (including consideration of both task type and duration of glove use by workers, guidance on the selection and allocation of suitable glove(s) for particular tasks/jobs, and glove design features) that need to be considered in future research. Practitioner Summary: The relevant literature on the effects of protective gloves on different aspects of hand performance was reviewed to make recommendations for the testing and assessment of gloves, and to improve evaluation protocols. The review highlights research areas and methodological issues that need to be considered in future research.

  11. Methodology of the Integrated Analysis of Company's Financial Status and Its Performance Results

    OpenAIRE

    Mackevičius, Jonas; Valkauskas, Romualdas

    2010-01-01

    Information about a company's financial status and its performance results is very important for the objective evaluation of the company's position in the market and its competitive possibilities in the future. Such information is provided in the financial statements. It is important to apply and investigate this information properly. A methodology for the integrated analysis of a company's financial status and performance results is recommended in this article. This methodology consists of these three elements...

  12. A performance assessment methodology for low-level radioactive waste disposal

    International Nuclear Information System (INIS)

    Derring, L.R.

    1990-01-01

    To demonstrate compliance with the performance objectives governing protection of the general population in 10 CFR 61.41, applicants for land disposal of low-level radioactive waste are required to conduct a pathways analysis, or quantitative evaluation of radionuclide release, transport through environmental media, and dose to man. The Nuclear Regulatory Commission staff defined a strategy and initiated a project at Sandia National Laboratories to develop a methodology for independently evaluating an applicant's analysis of postclosure performance. This performance assessment methodology was developed in five stages: identification of environmental pathways, ranking the significance of the pathways, identification and integration of models for pathway analyses, identification and selection of computer codes and techniques for the methodology, and implementation of the codes and documentation of the methodology. This paper summarizes the NRC approach for conducting evaluations of license applications for low-level radioactive waste facilities. 23 refs
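
    For each exposure pathway, the pathways analysis described above ultimately reduces to concentration x annual intake x dose coefficient, summed over pathways. The sketch below is that reduction in its simplest form; the pathway values in the example are invented, not figures from the NRC methodology.

    ```python
    def annual_dose_sv(pathways):
        """Sum of pathway doses in Sv/y; each pathway is a tuple of
        (environmental concentration in Bq per unit of intake medium,
         annual intake of that medium, dose coefficient in Sv/Bq)."""
        return sum(conc * intake * dcf for conc, intake, dcf in pathways)
    ```

    For instance, a hypothetical drinking-water pathway of 10 Bq/L x 730 L/y x 2.8e-8 Sv/Bq contributes about 2e-4 Sv/y; compliance checking compares such sums against the 10 CFR 61.41 performance objectives.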

  13. Introduction on performance analysis and profiling methodologies for KVM on ARM virtualization

    Science.gov (United States)

    Motakis, Antonios; Spyridakis, Alexander; Raho, Daniel

    2013-05-01

    The introduction of hardware virtualization extensions on ARM Cortex-A15 processors has enabled the implementation of full virtualization solutions for this architecture, such as KVM on ARM. This trend motivates the need to quantify and understand the performance impact that emerges from the application of this technology. In this work we start looking into some interesting performance metrics on KVM for ARM processors, which can provide useful insight and may lead to potential improvements in the future. This includes measurements such as interrupt latency and guest exit cost, performed on the ARM Versatile Express and Samsung Exynos 5250 hardware platforms. Furthermore, we discuss additional methodologies that can provide a deeper understanding of the performance footprint of KVM in the future. We identify some of the most interesting approaches in this field and perform a tentative analysis of how these may be implemented in the KVM on ARM port. These take into consideration hardware- and software-based counters for profiling, and issues related to the limitations of the simulators that are often used, such as the ARM Fast Models platform.

  14. A new methodology for assessment of the performance of heartbeat classification systems

    Directory of Open Access Journals (Sweden)

    Hool Livia C

    2008-01-01

    Background: The literature presents many different algorithms for classifying heartbeats from ECG signals. The performance of a classifier is normally presented in terms of sensitivity, specificity or other metrics describing the proportion of correct versus incorrect beat classifications. From the clinician's point of view, however, such metrics are insufficient to rate the performance of a classifier. Methods: We propose a new methodology for the presentation of classifier performance, based on Bayesian classification theory. Our proposition lets the investigators report their findings in terms of beat-by-beat comparisons, and defers the role of assessing the utility of the classifier to the statistician. Evaluation of the classifier's utility must be undertaken in conjunction with the set of relative costs applicable to the clinicians' application. Such evaluation produces a metric more tuned to the specific application, whilst preserving the information in the results. Results: By way of demonstration, we propose a set of costs, based on clinical data from the literature, and examine the results of two published classifiers using our method. We make recommendations for reporting classifier performance, such that this method can be used for subsequent evaluation. Conclusion: The proportion of misclassified beats contains insufficient information to fully evaluate a classifier. Performance reports should include a table of beat-by-beat comparisons, showing not only the number of misclassifications, but also the identity of the classes involved in each inaccurate classification.
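
    Given the beat-by-beat comparison table the authors recommend, reported as a confusion matrix, a reader can fold in their own relative costs with a simple weighted sum. The sketch below illustrates that step; the matrices in the example are invented, not the paper's clinical cost set.

    ```python
    def expected_cost(confusion, cost):
        """Mean per-beat cost: confusion[i][j] counts beats of true class i
        labelled as class j; cost[i][j] is the relative cost of that outcome
        (typically 0 on the diagonal)."""
        n_classes = len(confusion)
        total_beats = sum(sum(row) for row in confusion)
        weighted = sum(confusion[i][j] * cost[i][j]
                       for i in range(n_classes) for j in range(n_classes))
        return weighted / total_beats
    ```

    With, say, a missed pathological beat costed ten times a false alarm, two classifiers with identical accuracy can receive very different expected costs, which is the abstract's central point.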

  15. How Is Working Memory Training Likely to Influence Academic Performance? Current Evidence and Methodological Considerations.

    Science.gov (United States)

    Bergman Nutley, Sissela; Söderqvist, Stina

    2017-01-01

    Working memory (WM) is one of our core cognitive functions, allowing us to keep information in mind for shorter periods of time and then work with this information. It is the gateway that information has to pass in order to be processed consciously. A well-functioning WM is therefore crucial for a number of everyday activities including learning and academic performance (Gathercole et al., 2003; Bull et al., 2008), which is the focus of this review. Specifically, we will review the research investigating whether improving WM capacity using Cogmed WM training can lead to improvements on academic performance. Emphasis is given to reviewing the theoretical principles upon which such investigations rely, in particular the complex relation between WM and mathematical and reading abilities during development and how these are likely to be influenced by training. We suggest two possible routes in which training can influence academic performance, one through an effect on learning capacity which would thus be evident with time and education, and one through an immediate effect on performance on reading and mathematical tasks. Based on the theoretical complexity described we highlight some methodological issues that are important to take into consideration when designing and interpreting research on WM training and academic performance, but that are nonetheless often overlooked in the current research literature. Finally, we will provide some suggestions for future research for advancing the understanding of WM training and its potential role in supporting academic attainment.

  16. BER-3.2 report: Methodology for justification and optimization of protective measures including a case study

    International Nuclear Information System (INIS)

    Hedemann Jensen, P.; Sinkko, K.; Walmod-Larsen, O.; Gjoerup, H.L.; Salo, A.

    1992-07-01

    This report is part of the Nordic BER-3 project's work to propose and harmonize Nordic intervention levels for countermeasures in case of nuclear accidents. This report focuses on the methodology for justification and optimization of protective measures in a reactor accident situation with a large release of fission products to the environment. The down-wind situation is very complicated, and the dose to the exposed society is almost unpredictable. The task of the radiation protection experts, giving advice to the decision makers on the doses averted by the different actions at hand, is complicated. That of the decision makers is certainly more so: on behalf of the society they represent, they must decide whether to follow the advice of their radiation protection experts or to add further arguments, economical or political (or personal), into their considerations before their decisions are taken. Two analysis methods available for handling such situations, cost-benefit analysis and multi-attribute utility analysis, are described in principle and are utilized in a case study: the impacts of a Chernobyl-like accident on the Swedish island of Gotland in the Baltic Sea are analyzed with regard to the acute consequences. The use of the intervention principles found in international guidance (IAEA 91, ICRP 91), which can be summarized as the principles of justification, optimization and avoidance of unacceptable doses, is described. How to handle more intangible factors of a psychological or political character is indicated. (au) (6 tabs., 3 ills., 17 refs.)
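
    In the cost-benefit framing used for justification, a protective action does more good than harm when the monetary value assigned to the averted collective dose exceeds the cost of the action. A minimal sketch of that test, with the monetary valuation alpha (currency units per man-sievert) and all figures purely illustrative:

    ```python
    def net_benefit(averted_dose_man_sv, alpha_per_man_sv, action_cost):
        """Monetary net benefit of a countermeasure: value of the averted
        collective dose minus the cost of taking the action."""
        return averted_dose_man_sv * alpha_per_man_sv - action_cost

    def justified(averted_dose_man_sv, alpha_per_man_sv, action_cost):
        """Justification test: positive net benefit means the measure
        does more good than harm in purely monetary terms."""
        return net_benefit(averted_dose_man_sv, alpha_per_man_sv, action_cost) > 0.0
    ```

    Multi-attribute utility analysis generalizes this by scoring the intangible factors (anxiety, disruption) alongside the monetary terms rather than forcing them into one currency.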

  17. M&A Performance and Economic Impact: Integration and Critical Assessment of Methodological Approach

    Directory of Open Access Journals (Sweden)

    Karolis Andriuskevicius

    2017-11-01

    Purpose of the article: Existing methodologies employed within the M&A performance framework are investigated and critically discussed. Methodology/methods: The research has been carried out as a structured assessment of past literature. The findings from scientific articles and studies by various scholars have been categorized, grouped and summarized to discern a meta-analytic view of the work carried out to date. Scientific aim: The research seeks to identify and theoretically evaluate the existing methodologies used in empirical studies so as to allow a proper and critical understanding of the various findings in the holistic and global M&A area. Findings: The research elaborates on several key developments in M&A methodology and performance studies carried out in empirical works during the last two decades. The findings help to independently and objectively assess the performance of M&A from a holistic perspective. Conclusions: Each methodology measuring either M&A performance at the corporate level or the effects of M&A at the economy level should be interpreted and relied on with caution, as each has its limitations, and the application of these methodologies is subject to data availability and is case specific.

  18. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    Science.gov (United States)

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of the assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes.
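    As a concrete illustration of the statistical machinery such a meta-analysis builds on, the sketch below pools hypothetical per-study repeatability estimates with the standard DerSimonian-Laird random-effects model. Note the hedge: the review's point is precisely that this standard technique can break down when studies are small, so the method and numbers here are illustrative only, not the authors' proposed alternative.

```python
import math

def dersimonian_laird(estimates, variances):
    """Pool per-study estimates (e.g. repeatability coefficients)
    with the DerSimonian-Laird random-effects model."""
    w = [1.0 / v for v in variances]                     # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance tau^2
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights incorporate the between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# hypothetical test-retest repeatability estimates from three small studies
pooled, se, tau2 = dersimonian_laird([0.18, 0.25, 0.10], [0.004, 0.010, 0.006])
print(pooled, se, tau2)
```

    With so few, small studies the tau-squared estimate is unstable (here it truncates to zero), which is exactly the regime in which the review recommends the alternative approaches it evaluates.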

  19. 77 FR 33492 - Cequent Performance Products, Inc. a Subsidiary of Trimas Corporation Including Workers Whose...

    Science.gov (United States)

    2012-06-06

    ... Products, Inc. Including On-Site Leased Workers From Manpower Tekonsha, MI; Amended Certification Regarding... Cequent Performance Products, Inc. Accordingly, the Department is amending this certification to properly... Products, Inc. a Subsidiary of Trimas Corporation Including Workers Whose Wages Were Reported Under...

  20. A Methodology for Equitable Performance Assessment and Presentation of Wave Energy Converters Based on Sea Trials

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Pecher, Arthur; Margheritini, Lucia

    2013-01-01

    This paper provides a methodology for the analysis and presentation of data obtained from sea trials of wave energy converters (WEC). The equitable aspect of this methodology lies in its wide application, as any WEC at any scale or stage of development can be considered as long as the tests are … … parameters influence the performance of the WEC can also be investigated using this methodology. … leads to testing campaigns that are not as extensive as desired. Therefore, the performance analysis should be robust enough to allow for not fully complete sea trials and suboptimal performance data. In other words, this methodology is focused on retrieving the maximum amount of useful information out …

  1. The Effect of Soft Skills and Training Methodology on Employee Performance

    Science.gov (United States)

    Ibrahim, Rosli; Boerhannoeddin, Ali; Bakare, Kazeem Kayode

    2017-01-01

    Purpose: The purpose of this paper is to investigate the effect of soft skill acquisition and the training methodology adopted on employee work performance. In this study, the authors study the trends of research in training and work performance in organisations that focus on the acquisition of technical or "hard skills" for employee…

  2. Methodology for the preliminary design of high performance schools in hot and humid climates

    Science.gov (United States)

    Im, Piljae

    A methodology to develop an easy-to-use toolkit for the preliminary design of high performance schools in hot and humid climates is presented. The toolkit proposed in this research will allow decision makers without simulation knowledge to easily and accurately evaluate energy-efficient measures for K-5 schools, which would contribute to the accelerated dissemination of energy-efficient design. For the development of the toolkit, a survey was first performed to identify high performance measures available today that are being implemented in new K-5 school buildings. An existing case-study school building in a hot and humid climate was then selected and analyzed to understand the energy use pattern in a school building and to be used in developing a calibrated simulation. Based on the information from the previous step, an as-built and calibrated simulation was developed. To accomplish this, five calibration steps were performed to match the simulation results with the measured energy use: (1) using an actual 2006 weather file with measured solar radiation, (2) modifying the lighting and equipment schedules using ASHRAE's RP-1093 methods, (3) using actual equipment performance curves (i.e., scroll chiller), (4) using Winkelmann's method for underground floor heat transfer, and (5) modifying the HVAC and room setpoint temperatures based on the measured field data. Next, the calibrated simulation of the case-study K-5 school was compared to an ASHRAE Standard 90.1-1999 code-compliant school. In the next step, the energy savings potential from the application of several high performance measures to an equivalent ASHRAE Standard 90.1-1999 code-compliant school was analyzed. The high performance measures applied included the recommendations from the ASHRAE Advanced Energy Design Guides (AEDG) for K-12 and other high performance measures from the literature review, as well as a daylighting strategy and solar PV and thermal systems. The results show that the net …

  3. A performance assessment methodology for low-level radioactive waste disposal

    International Nuclear Information System (INIS)

    Deering, L.R.; Kozak, M.W.

    1990-01-01

    To demonstrate compliance with the performance objectives governing protection of the general population in 10 CFR 61.41, applicants for land disposal of low-level radioactive waste are required to conduct a pathways analysis, or quantitative evaluation of radionuclide release, transport through environmental media, and dose to man. The Nuclear Regulatory Commission staff defined a strategy and initiated a project at Sandia National Laboratories to develop a methodology for independently evaluating an applicant's analysis of postclosure performance. This performance assessment methodology was developed in five stages: (1) identification of environmental pathways, (2) ranking the significance of the pathways, (3) identification and integration of models for pathway analyses, (4) identification and selection of computer codes and techniques for the methodology, and (5) implementation of the codes and documentation of the methodology. The final methodology implements analytical and simple numerical solutions for source term, ground-water flow and transport, surface water transport, air transport, food chain, and dosimetry analyses, as well as more complex numerical solutions for multidimensional or transient analyses when more detailed assessments are needed. The capability to perform both simple and complex analyses is accomplished through modular modeling, which permits substitution of various models and codes to analyze system components.

  4. Nonrandomized studies are not always found even when selection criteria for health systems intervention reviews include them: a methodological study.

    Science.gov (United States)

    Glenton, Claire; Lewin, Simon; Mayhew, Alain; Scheel, Inger; Odgaard-Jensen, Jan

    2013-04-01

    Systematic reviews within the Cochrane Effective Practice and Organisation of Care Group (EPOC) can include both randomized and nonrandomized study designs. We explored how many EPOC reviews consider and identify nonrandomized studies, and whether the proportion of nonrandomized studies identified is linked to the review topic. We recorded the study designs considered in 65 EPOC reviews. For reviews that considered nonrandomized studies, we calculated the proportion of identified studies that were nonrandomized and explored whether there were differences in the proportion of nonrandomized studies according to the review topic. Fifty-one (78.5%) reviews considered nonrandomized studies. Forty-six of these reviews found nonrandomized studies, but the proportion varied a great deal (median, 33%; interquartile range, 25-50%). Reviews of health care delivery interventions had lower proportions of nonrandomized studies than those of financial and governance interventions. Most EPOC reviews consider nonrandomized studies, but the degree to which they find them varies. As nonrandomized studies are believed to be at higher risk of bias and their inclusion entails a considerable effort, review authors should consider whether the benefits justify the inclusion of these designs. Research should explore whether it is more useful to consider nonrandomized studies in reviews of some intervention types than others.
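    The summary statistics described above (a per-review proportion of nonrandomized studies, summarized across reviews by a median and interquartile range) can be computed mechanically. The sketch below uses invented per-review counts, not the EPOC data.

```python
import statistics

def nonrandomized_share(n_nonrandomized, n_total):
    """Percentage of a review's included studies that are nonrandomized."""
    return 100.0 * n_nonrandomized / n_total

# hypothetical (nonrandomized, total) counts for five reviews
shares = [nonrandomized_share(k, n)
          for k, n in [(3, 12), (5, 10), (8, 16), (2, 8), (9, 12)]]
q1, median, q3 = statistics.quantiles(shares, n=4)  # quartiles across reviews
print(median, (q1, q3))
```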

  5. A methodology to quantify the aerobic and anaerobic sludge digestion performance for nutrient recycling in aquaponics

    Directory of Open Access Journals (Sweden)

    Delaide, B.

    2018-01-01

    Description of the subject. This research note presents a methodology to quantify the tilapia sludge digestion performance in aerobic and anaerobic reactors for aquaponic purposes. Both organic reduction and macro- and microelement mineralization performances were addressed. Objectives. To set up an appropriate methodology to quantify sludge digestion performance in aquaponics; to describe the methodology and illustrate it with some results as an example. Method. Equations were adapted to quantify (1) the organic reduction performance in terms of chemical oxygen demand (COD) and total suspended solids (TSS) reduction, and (2) the nutrient recycling performance in terms of macro- and microelement mineralization. Results. The equations were applied, as an example, to data obtained from experimental aerobic and anaerobic reactors. The reactors were able to remove at least 50% of the TSS and COD input. The nutrient mineralization was consistent, within a 10-60% range for all macro- and micronutrients. Conclusions. The methodology provides explicit indicators of the sludge treatment performance for aquaponics. Treating aquaponic sludge onsite is promising to avoid sludge spillage, improve nutrient recycling and save water.
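    The note's exact equations are not reproduced in this abstract, but reduction and mineralization indicators of this kind are typically simple mass-balance percentages. The following sketch, with hypothetical reactor values, shows the general form such indicators take.

```python
def removal_efficiency(mass_in, mass_out):
    """Percent reduction of a sludge parameter (e.g. COD or TSS)
    across a digestion reactor: (in - out) / in * 100."""
    return (mass_in - mass_out) / mass_in * 100.0

def mineralization(dissolved_released, total_in_sludge):
    """Percent of an element's input that is released in dissolved,
    plant-available form during digestion."""
    return dissolved_released / total_in_sludge * 100.0

# hypothetical reactor balance: 120 g COD in, 48 g COD out -> 60% reduction
print(removal_efficiency(120.0, 48.0))
# hypothetical: 3.2 g of 8.0 g phosphorus input mineralized -> 40%
print(mineralization(3.2, 8.0))
```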

  6. An Intervention Including an Online Game to Improve Grade 6 Students' Performance in Early Algebra

    Science.gov (United States)

    Kolovou, Angeliki; van den Heuvel-Panhuizen, Marja; Koller, Olaf

    2013-01-01

    This study investigated whether an intervention including an online game contributed to 236 Grade 6 students' performance in early algebra, that is, solving problems with covarying quantities. An exploratory quasi-experimental study was conducted with a pretest-posttest-control-group design. Students in the experimental group were asked to solve…

  7. Test Methodologies for Hydrogen Sensor Performance Assessment: Chamber vs. Flow Through Test Apparatus: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Buttner, William J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hartmann, Kevin S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Schmidt, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Cebolla, Rafeal O [Joint Research Centre, Petten, the Netherlands; Weidner, Eveline [Joint Research Centre, Petten, the Netherlands; Bonato, Christian [Joint Research Centre, Petten, the Netherlands

    2017-11-06

    Certification of hydrogen sensors to standards often prescribes using large-volume test chambers [1, 2]. However, feedback from stakeholders such as sensor manufacturers and end-users indicates that chamber test methods are often viewed as too slow and expensive for routine assessment. Flow through test methods are potentially an efficient, cost-effective alternative for sensor performance assessment. A large number of sensors can be simultaneously tested, in series or in parallel, with an appropriate flow through test fixture. The recent development of sensors with response times of less than 1 s mandates improvements in equipment and methodology to properly capture the performance of this new generation of fast sensors; flow methods are a viable approach for accurate response and recovery time determinations, but there are potential drawbacks. According to ISO 26142 [1], flow through test methods may not properly simulate ambient applications. In chamber test methods, gas transport to the sensor can be dominated by diffusion, which is viewed by some users as mimicking deployment in rooms and other confined spaces. Alternatively, in flow through methods, forced flow transports the gas to the sensing element. The advective flow dynamics may induce changes in the sensor behaviour relative to the quasi-quiescent condition that may prevail in chamber test methods. One goal of the current activity in the JRC and NREL sensor laboratories [3, 4] is to develop a validated flow through apparatus and methods for hydrogen sensor performance testing. In addition to minimizing the impact on sensor behaviour induced by differences in flow dynamics, challenges associated with flow through methods include the ability to control environmental parameters (humidity, pressure and temperature) during the test and changes in the test gas composition induced by chemical reactions with upstream sensors. Guidelines on flow through test apparatus design and protocols for the evaluation of …

  8. OvidSP Medline-to-PubMed search filter translation: a methodology for extending search filter range to include PubMed's unique content.

    Science.gov (United States)

    Damarell, Raechel A; Tieman, Jennifer J; Sladek, Ruth M

    2013-07-02

    PubMed translations of OvidSP Medline search filters offer searchers improved ease of access. They may also facilitate access to PubMed's unique content, including citations for the most recently published biomedical evidence. Retrieving this content requires a search strategy comprising natural language terms ('textwords'), rather than Medical Subject Headings (MeSH). We describe a reproducible methodology that uses a validated PubMed search filter translation to create a textword-only strategy to extend retrieval to PubMed's unique heart failure literature. We translated an OvidSP Medline heart failure search filter for PubMed and established version equivalence in terms of indexed literature retrieval. The PubMed version was then run within PubMed to identify citations retrieved by the filter's MeSH terms (Heart failure, Left ventricular dysfunction, and Cardiomyopathy). It was then rerun with the same MeSH terms restricted to searching on title and abstract fields (i.e. as 'textwords'). Citations retrieved by the MeSH search but not the textword search were isolated. Frequency analysis of their titles/abstracts identified natural language alternatives for those MeSH terms that performed less effectively as textwords. These terms were tested in combination to determine the best performing search string for reclaiming this 'lost set'. This string, restricted to searching on PubMed's unique content, was then combined with the validated PubMed translation to extend the filter's performance in this database. The PubMed heart failure filter retrieved 6829 citations. Of these, 834 (12%) failed to be retrieved when MeSH terms were converted to textwords. Frequency analysis of the 834 citations identified five high frequency natural language alternatives that could improve retrieval of this set (cardiac failure, cardiac resynchronization, left ventricular systolic dysfunction, left ventricular diastolic dysfunction, and LV dysfunction). Together these terms reclaimed
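    The frequency-analysis step described above (counting candidate natural-language phrases in the titles and abstracts of the "lost set" of citations) can be sketched as follows; the documents and candidate phrases below are invented for illustration, not the study's actual data.

```python
from collections import Counter
import re

def term_frequencies(documents, phrases):
    """Count how often each candidate phrase occurs across a set of
    titles/abstracts (case-insensitive substring matching)."""
    counts = Counter()
    for text in documents:
        lowered = text.lower()
        for phrase in phrases:
            counts[phrase] += len(re.findall(re.escape(phrase), lowered))
    return counts

# hypothetical titles from citations missed by the textword-only search
lost_set = [
    "Outcomes of cardiac resynchronization in left ventricular systolic dysfunction",
    "Biomarkers of cardiac failure and LV dysfunction after infarction",
]
candidates = ["cardiac failure", "cardiac resynchronization",
              "left ventricular systolic dysfunction", "lv dysfunction"]
print(term_frequencies(lost_set, candidates).most_common())
```

    High-frequency phrases identified this way become the extra textwords appended to the filter to reclaim the lost citations.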

  9. Performance-based methodology for assessing seismic vulnerability and capacity of buildings

    Science.gov (United States)

    Shibin, Lin; Lili, Xie; Maosheng, Gong; Ming, Li

    2010-06-01

    This paper presents a performance-based methodology for the assessment of seismic vulnerability and capacity of buildings. The vulnerability assessment methodology is based on the HAZUS methodology and the improved capacity-demand-diagram method. The spectral displacement (Sd) of performance points on a capacity curve is used to estimate the damage level of a building. The relationship between Sd and peak ground acceleration (PGA) is established, and then a new vulnerability function is expressed in terms of PGA. Furthermore, the expected value of the seismic capacity index (SCev) is provided to estimate the seismic capacity of buildings based on the probability distribution of damage levels and the corresponding seismic capacity index. The results indicate that the proposed vulnerability methodology is able to assess seismic damage of a large number of building stock directly and quickly following an earthquake. The SCev provides an effective index to measure the seismic capacity of buildings and illustrate the relationship between the seismic capacity of buildings and seismic action. The estimated result is compared with damage surveys of the cities of Dujiangyan and Jiangyou in the M8.0 Wenchuan earthquake, revealing that the methodology is acceptable for seismic risk assessment and decision making. The primary reasons for discrepancies between the estimated results and the damage surveys are discussed.

  10. Performance assessment methodology (PAM) for low level radioactive waste (LLRW) disposal facilities

    International Nuclear Information System (INIS)

    Selander, W.N.

    1992-01-01

    An overview is given for Performance Assessment Methodology (PAM) for Low Level Radioactive Waste (LLRW) disposal technologies, as required for licensing and safety studies. This is a multi-disciplinary activity, emphasizing applied mathematics, mass transfer, geohydrology and radiotoxicity effects on humans. (author). 2 refs

  11. Evaluating electronic performance support systems: A methodology focused on future use-in-practice

    NARCIS (Netherlands)

    Collis, Betty; Verwijs, C.A.

    1995-01-01

    Electronic performance support systems, as an emerging type of software environment, present many new challenges in relation to effective evaluation. In this paper, a global approach to a 'usage-orientated' evaluation methodology for software product is presented, followed by a specific example of

  12. Rethinking Fragile Landscapes during the Greek Crisis: Precarious Aesthetics and Methodologies in Athenian Dance Performances

    Science.gov (United States)

    Zervou, Natalie

    2017-01-01

    The financial crisis in Greece brought about significant changes in the sociopolitical and financial landscape of the country. Severe budget cuts imposed on the arts and performing practices have given rise to a new aesthetic which has impacted the themes and methodologies of contemporary productions. To unpack this aesthetic, I explore the ways…

  13. Using a False Biofeedback Methodology to Explore Relationships between Learners' Affect, Metacognition, and Performance

    Science.gov (United States)

    Strain, Amber Chauncey; Azevedo, Roger; D'Mello, Sidney K.

    2013-01-01

    We used a false-biofeedback methodology to manipulate physiological arousal in order to induce affective states that would influence learners' metacognitive judgments and learning performance. False-biofeedback is a method used to induce physiological arousal (and resultant affective states) by presenting learners with audio stimuli of false heart…

  14. Design methodology for flexible energy conversion systems accounting for dynamic performance

    DEFF Research Database (Denmark)

    Pierobon, Leonardo; Casati, Emiliano; Casella, Francesco

    2014-01-01

    This article presents a methodology to help in the definition of the optimal design of power generation systems. The innovative element is the integration of requirements on dynamic performance into the system design procedure. Operational flexibility is an increasingly important specification...

  15. A new scaling methodology for NO(x) emissions performance of gas burners and furnaces

    Science.gov (United States)

    Hsieh, Tse-Chih

    1997-11-01

    A general burner and furnace scaling methodology is presented, together with the resulting scaling model for NOx emissions performance of a broad class of swirl-stabilized industrial gas burners. The model is based on results from a set of novel burner scaling experiments on a generic gas burner and furnace design at five different scales having near-uniform geometric, aerodynamic, and thermal similarity and uniform measurement protocols. These provide the first NOx scaling data over the range of thermal scales from 30 kW to 12 MW, including input-output measurements as well as detailed in-flame measurements of NO, NOx, CO, O2, unburned hydrocarbons, temperature, and velocities at each scale. The in-flame measurements allow identification of key sources of NOx production. The underlying physics of these NOx sources lead to scaling laws for their respective contributions to the overall NOx emissions performance. It is found that the relative importance of each source depends on the burner scale and operating conditions. Simple furnace residence time scaling is shown to be largely irrelevant, with NOx emissions instead being largely controlled by scaling of the near-burner region. The scalings for these NOx sources are combined in a comprehensive scaling model for NOx emissions performance. Results from the scaling model show good agreement with experimental data at all burner scales and over the entire range of turndown, staging, preheat, and excess air dilution, with correlations generally exceeding 90%. The scaling model permits design trade-off assessments for a broad class of burners and furnaces, and allows performance of full industrial scale burners and furnaces of this type to be inferred from results of small scale tests.

  16. Exergoeconomic performance optimization for a steady-flow endoreversible refrigeration model including six typical cycles

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lingen; Kan, Xuxian; Sun, Fengrui; Wu, Feng [College of Naval Architecture and Power, Naval University of Engineering, Wuhan 430033 (China)

    2013-07-01

    The operation of a universal steady-flow endoreversible refrigeration cycle model consisting of a constant thermal-capacity heating branch, two constant thermal-capacity cooling branches and two adiabatic branches is viewed as a production process with exergy as its output. The finite-time exergoeconomic performance optimization of the refrigeration cycle is investigated by taking the profit rate as the optimization objective. The relations between the profit rate and the temperature ratio of the working fluid, between the COP (coefficient of performance) and the temperature ratio of the working fluid, and the optimal relation between the profit rate and the COP of the cycle are derived. The focus of this paper is to find the compromise optimization between economics (profit rate) and the utilization factor (COP) for endoreversible refrigeration cycles by determining the optimum COP at maximum profit, which is termed the finite-time exergoeconomic performance bound. Moreover, performance analysis and optimization of the model are carried out in order to investigate the effect of the cycle process on the performance of the cycles using a numerical example. The results obtained herein include the performance characteristics of endoreversible Carnot, Diesel, Otto, Atkinson, Dual and Brayton refrigeration cycles.

  17. "Found Performance": Towards a Musical Methodology for Exploring the Aesthetics of Care.

    Science.gov (United States)

    Wood, Stuart

    2017-09-18

    Concepts of performance in fine art reflect key processes in music therapy. Music therapy enables practitioners to reframe patients as performers, producing new meanings around the clinical knowledge attached to medical histories and constructs. In this paper, music therapy practices are considered in the wider context of art history, with reference to allied theories from social research. Tracing a century in art that has revised the performativity of found objects (starting with Duchamp's "Fountain") and of found sound (crystallised by Cage's 4′33″), this paper proposes that music therapy might be a pioneer methodology of "found performance". Examples from music therapy and contemporary socially engaged art practices are brought as potential links between artistic methodologies and medical humanities research, with specific reference to notions of the Aesthetics of Care.

  18. Computational Methodologies for Developing Structure–Morphology–Performance Relationships in Organic Solar Cells: A Protocol Review

    KAUST Repository

    Do, Khanh

    2016-09-08

    We outline a step-by-step protocol that incorporates a number of theoretical and computational methodologies to evaluate the structural and electronic properties of pi-conjugated semiconducting materials in the condensed phase. Our focus is on methodologies appropriate for the characterization, at the molecular level, of the morphology in blend systems consisting of an electron donor and an electron acceptor, of importance for understanding the performance properties of bulk-heterojunction organic solar cells. The protocol is formulated as an introductory manual for investigators who aim to study the bulk-heterojunction morphology in molecular detail, thereby facilitating the development of structure-morphology-property relationships when used in tandem with experimental results.

  19. Assessment of an isolation condenser performance in case of a LOHS using the RMPS+ Methodology

    International Nuclear Information System (INIS)

    Giménez, M; Mezio, F.; Zanocco, P.; Lorenzo, G.

    2011-01-01

    Conclusions: • It has been observed that in the original RMPS proposal the response surface may be poorly built, and therefore so may the estimated system reliability. • The methodology was improved by means of an iterative process that rebuilds the response surface with new performance indicator values at the failure domain boundary, obtained through the plant model. • The proposed methodology was useful: – to identify "odd events" efficiently; – to rank the influence of the parameters with uncertainties; – to estimate the CAREM-like PRHRS "functional reliability" and verify a design criterion

  20. Methodological quality of diagnostic accuracy studies on non-invasive coronary CT angiography: influence of QUADAS (Quality Assessment of Diagnostic Accuracy Studies included in systematic reviews) items on sensitivity and specificity

    Energy Technology Data Exchange (ETDEWEB)

    Schueler, Sabine; Walther, Stefan; Schuetz, Georg M. [Humboldt-Universitaet zu Berlin, Freie Universitaet Berlin, Charite Medical School, Department of Radiology, Berlin (Germany); Schlattmann, Peter [University Hospital of Friedrich Schiller University Jena, Department of Medical Statistics, Informatics, and Documentation, Jena (Germany); Dewey, Marc [Humboldt-Universitaet zu Berlin, Freie Universitaet Berlin, Charite Medical School, Department of Radiology, Berlin (Germany); Charite, Institut fuer Radiologie, Berlin (Germany)

    2013-06-15

    To evaluate the methodological quality of diagnostic accuracy studies on coronary computed tomography (CT) angiography using the QUADAS (Quality Assessment of Diagnostic Accuracy Studies included in systematic reviews) tool. Each QUADAS item was individually defined to adapt it to the special requirements of studies on coronary CT angiography. Two independent investigators analysed 118 studies using 12 QUADAS items. Meta-regression and pooled analyses were performed to identify possible effects of methodological quality items on estimates of diagnostic accuracy. The overall methodological quality of coronary CT studies was merely moderate. They fulfilled a median of 7.5 out of 12 items. Only 9 of the 118 studies fulfilled more than 75 % of possible QUADAS items. One QUADAS item ("Uninterpretable Results") showed a significant influence (P = 0.02) on estimates of diagnostic accuracy, with "no fulfilment" increasing specificity from 86 to 90 %. Furthermore, pooled analysis revealed that each QUADAS item that is not fulfilled has the potential to change estimates of diagnostic accuracy. The methodological quality of studies investigating the diagnostic accuracy of non-invasive coronary CT is only moderate and was found to affect the sensitivity and specificity. An improvement is highly desirable because good methodology is crucial for adequately assessing imaging technologies. (orig.)

  1. Methodological quality of diagnostic accuracy studies on non-invasive coronary CT angiography: influence of QUADAS (Quality Assessment of Diagnostic Accuracy Studies included in systematic reviews) items on sensitivity and specificity

    International Nuclear Information System (INIS)

    Schueler, Sabine; Walther, Stefan; Schuetz, Georg M.; Schlattmann, Peter; Dewey, Marc

    2013-01-01

    To evaluate the methodological quality of diagnostic accuracy studies on coronary computed tomography (CT) angiography using the QUADAS (Quality Assessment of Diagnostic Accuracy Studies included in systematic reviews) tool. Each QUADAS item was individually defined to adapt it to the special requirements of studies on coronary CT angiography. Two independent investigators analysed 118 studies using 12 QUADAS items. Meta-regression and pooled analyses were performed to identify possible effects of methodological quality items on estimates of diagnostic accuracy. The overall methodological quality of coronary CT studies was merely moderate. They fulfilled a median of 7.5 out of 12 items. Only 9 of the 118 studies fulfilled more than 75 % of possible QUADAS items. One QUADAS item ("Uninterpretable Results") showed a significant influence (P = 0.02) on estimates of diagnostic accuracy, with "no fulfilment" increasing specificity from 86 to 90 %. Furthermore, pooled analysis revealed that each QUADAS item that is not fulfilled has the potential to change estimates of diagnostic accuracy. The methodological quality of studies investigating the diagnostic accuracy of non-invasive coronary CT is only moderate and was found to affect the sensitivity and specificity. An improvement is highly desirable because good methodology is crucial for adequately assessing imaging technologies. (orig.)

  2. Low-level waste disposal site performance assessment with the RQ/PQ methodology. Final report

    International Nuclear Information System (INIS)

    Rogers, V.C.; Grant, M.W.; Sutherland, A.A.

    1982-12-01

A methodology called RQ/PQ (retention quotient/performance quotient) has been developed for relating the potential hazard of radioactive waste to the natural and man-made barriers provided by a disposal facility. The methodology utilizes a systems approach to quantify the safety of low-level waste disposed in a near-surface facility. The main advantages of the RQ/PQ methodology are its simplicity of analysis and clarity of presentation while still allowing a comprehensive set of nuclides and pathways to be treated. Site performance and facility designs for low-level waste disposal can be easily investigated with relatively few parameters needed to define the problem. Application of the methodology has revealed that the key factor affecting the safety of low-level waste disposal in near-surface facilities is the potential for intrusion events. Food, inhalation and well water pathways dominate in the analysis of such events. While the food and inhalation pathways are not strongly site-dependent, the well water pathway is. Finally, burial at depths of 5 m or more was shown to reduce the impacts from intrusion events.

  3. On self-propagating methodological flaws in performance normalization for strength and power sports.

    Science.gov (United States)

    Arandjelović, Ognjen

    2013-06-01

Performance in strength and power sports is greatly affected by a variety of anthropometric factors. The goal of performance normalization is to factor out the effects of confounding factors and compute a canonical (normalized) performance measure from the observed absolute performance. Performance normalization is applied in the ranking of elite athletes, as well as in the early stages of youth talent selection. Consequently, it is crucial that the process is principled and fair. The substantial corpus of previous work on this topic is uniform in the methodology adopted. Performance normalization is universally reduced to a regression task: the collected performance data are used to fit a regression function that is then used to scale future performances. The present article demonstrates that this approach is fundamentally flawed. It inherently creates a bias that unfairly penalizes athletes with certain allometric characteristics, and, by virtue of its adoption in the ranking and selection of elite athletes, propagates and strengthens this bias over time. The main flaws are shown to originate in the criteria for selecting the data used for regression, as well as in the manner in which the regression model is applied in normalization. This analysis brings to light the aforesaid methodological flaws and motivates further work on the development of principled methods, the foundations of which are also laid out in this work.

  4. Performance Assessment of the Wave Dragon Wave Energy Converter Based on the EquiMar Methodology

    DEFF Research Database (Denmark)

    Parmeggiani, Stefano; Chozas, Julia Fernandez; Pecher, Arthur

    2011-01-01

    At the present pre-commercial phase of the wave energy sector, device developers are called to provide reliable estimates on power performance and production at possible deployment locations. The EU EquiMar project has proposed a novel approach, where the performance assessment is based mainly...... on experimental data deriving from sea trials rather than solely on numerical predictions. The study applies this methodology to evaluate the performance of Wave Dragon at two locations in the North Sea, based on the data acquired during the sea trials of a 1:4.5 scale prototype. Indications about power...

  5. Model development and optimization of operating conditions to maximize PEMFC performance by response surface methodology

    International Nuclear Information System (INIS)

    Kanani, Homayoon; Shams, Mehrzad; Hasheminasab, Mohammadreza; Bozorgnezhad, Ali

    2015-01-01

Highlights: • The optimization of the operating parameters in a serpentine PEMFC is done using RSM. • The RSM model can predict the cell power over a wide range of operating conditions. • St-An, St-Ca and RH-Ca have an optimum value to obtain the best performance. • The interactions of the operating conditions affect the output power significantly. • The cathode and anode stoichiometry are the most influential parameters on the power. - Abstract: Optimization of operating conditions to obtain maximum power in PEMFCs could play a significant role in reducing the costs of this emerging technology. In the present experimental study, a single serpentine PEMFC is used to investigate the effects of operating conditions on the electrical power production of the cell. Four significant parameters including cathode stoichiometry, anode stoichiometry, gas inlet temperature, and cathode relative humidity are studied using Design of Experiment (DOE) to obtain optimal power. Central composite second-order Response Surface Methodology (RSM) is used to model the relationship between the goal function (power) and the considered input parameters (operating conditions). This statistical–mathematical method yields a second-order equation for the cell power. The model considers interactions and quadratic effects of different operating conditions and predicts the maximum or minimum power production over the entire working range of the parameters. In this range, high cathode stoichiometry combined with low anode stoichiometry results in the minimum cell power; conversely, the medium range of fuel and oxidant stoichiometry leads to the maximum power. Results show that there is an optimum value for the anode stoichiometry, cathode stoichiometry and relative humidity to reach the best performance. The predictions of the model are evaluated by experimental tests and are in good agreement for different ranges of the parameters.
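
The second-order response surface at the core of the methodology above can be sketched as an ordinary least-squares fit of a full quadratic model. The two coded factors and the toy response below are assumptions for illustration, not the paper's data.

```python
import numpy as np

# Minimal RSM sketch: fit a full quadratic (second-order) response surface
# to synthetic two-factor data. Factor meanings and the toy response are
# invented; only the model structure mirrors the methodology.
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)   # e.g. coded cathode stoichiometry (assumed)
x2 = rng.uniform(-1, 1, 30)   # e.g. coded relative humidity (assumed)
y = 5.0 + 1.2*x1 - 0.8*x2 - 1.5*x1**2 - 0.9*x2**2 + 0.4*x1*x2  # toy "power"

# Design matrix for:
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(coef, 2))
```

Because the toy response is noise-free, the fit recovers the generating coefficients exactly; with real measurements the same design matrix would be fitted to the observed power.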

  6. Assessment of the performance of containment and surveillance equipment part 1: methodology

    International Nuclear Information System (INIS)

    Rezniczek, A.; Richter, B.

    2009-01-01

Equipment performance aims at the creation of relevant data. As Containment and Surveillance (C/S) is playing an ever-increasing role in safeguards systems, the issue of how to assess the performance of C/S equipment is being addressed by the ESARDA Working Group on C/S. The issue is important not only for the development of appropriate safeguards approaches but also for the review of existing approaches with regard to the implementation of the Additional Protocol (AP) and Integrated Safeguards. It is expected that the selection process of appropriate equipment, especially for unattended operation, is facilitated by the availability of methods to determine the performance of such equipment. Apart from EURATOM, the users of assessment methodologies would be the International Atomic Energy Agency (IAEA), plant operators, and instrument developers. The paper describes a non-quantitative performance assessment methodology. A structured procedure is outlined that allows assessing the suitability of different C/S instrumentation to comply with the objectives of its application. The principle for determining the performance of C/S equipment is to define, based on safeguards requirements, a task profile and to check the performance profile against the task profile. The performance profile of C/S equipment can be derived from the functional specifications and design-basis tolerances provided by the equipment manufacturers.

  7. Methodologies for predicting the part-load performance of aero-derivative gas turbines

    DEFF Research Database (Denmark)

    Haglind, Fredrik; Elmegaard, Brian

    2009-01-01

    Prediction of the part-load performance of gas turbines is advantageous in various applications. Sometimes reasonable part-load performance is sufficient, while in other cases complete agreement with the performance of an existing machine is desirable. This paper is aimed at providing some guidance...... on methodologies for predicting part-load performance of aero-derivative gas turbines. Two different design models – one simple and one more complex – are created. Subsequently, for each of these models, the part-load performance is predicted using component maps and turbine constants, respectively. Comparisons...... with manufacturer data are made. With respect to the design models, the simple model, featuring a compressor, combustor and turbines, results in equally good performance prediction in terms of thermal efficiency and exhaust temperature as does a more complex model. As for part-load predictions, the results suggest...

  8. Thermal performance of a concrete cask: Methodology to model helium leakage from the steel canister

    International Nuclear Information System (INIS)

    Penalva, J.; Feria, F.; Herranz, L.E.

    2017-01-01

Highlights: • A thermal analysis of the canister during a loss of leaktightness has been performed. • Methodologies that predict fuel temperatures and heat-up rates have been developed. • Casks with heat loads below 20 kW would never exceed the thermal threshold. - Abstract: Concrete cask storage systems used in dry storage accommodate spent fuel within containers that are usually filled with helium at a certain pressure. Potential leaks from the container would result in a cooling degradation of the fuel that might jeopardize fuel integrity if the temperature exceeded a threshold value. According to ISG-11, temperatures below 673 K ensure fuel integrity preservation. Therefore, the container thermal response to a loss of leaktightness is of utmost importance in terms of safety. In this work, a thermo-fluid dynamic analysis of the canister during a loss of leaktightness has been performed. To do so, steady-state and transient Computational Fluid Dynamics (CFD) simulations have been carried out. In addition, two methodologies have been developed that estimate peak fuel temperatures and heat-up rates resulting from a postulated depressurization in a dry storage cask. One methodology is based on control theory and transfer functions, and the other is based on a linear relationship between the inner pressure and the maximum temperature. Both methodologies have been verified through comparisons with CFD calculations. The period of time to reach the temperature threshold (673 K) is a function of the pressure loss rate and the decay heat of the fuel stored in the container; for a fuel canister with 30 kW, reaching the thermal limit takes between half a day (fast pressure loss) and one week (slow pressure loss). In case of a 15% reduction of the decay heat, the period of time to reach the thermal limit increases to a few weeks. The results highlight that casks with heat loads below 20 kW would never exceed the thermal threshold (673 K).
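
Of the two methodologies described above, the linear pressure-temperature relationship lends itself to a compact sketch. Every coefficient below is an invented placeholder, not a value from the paper; the sketch shows only how such a linear model maps a constant leak rate to a time-to-threshold.

```python
# Hedged sketch of a linear map from canister helium pressure to peak fuel
# temperature, used to estimate when a postulated leak reaches the 673 K
# limit. All numeric values are assumed placeholders.
P0 = 500.0           # initial helium pressure, kPa (assumed)
T_at_P0 = 600.0      # peak fuel temperature at P0, K (assumed)
T_at_vacuum = 700.0  # peak temperature after full depressurization, K (assumed)
LIMIT = 673.0        # ISG-11 temperature threshold, K

def peak_temperature(p):
    """Linear interpolation between pressurized and depressurized states."""
    return T_at_vacuum - (T_at_vacuum - T_at_P0) * (p / P0)

def hours_to_limit(leak_rate_kpa_per_h):
    """Time until peak temperature crosses LIMIT for a constant leak rate."""
    # Invert the linear model for the pressure at which T == LIMIT.
    p_limit = P0 * (T_at_vacuum - LIMIT) / (T_at_vacuum - T_at_P0)
    return (P0 - p_limit) / leak_rate_kpa_per_h

print(f"peak T at {P0:.0f} kPa: {peak_temperature(P0):.0f} K")
print(f"fast leak (40 kPa/h): {hours_to_limit(40):.1f} h")
print(f"slow leak (2 kPa/h):  {hours_to_limit(2):.1f} h")
```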

  9. Enhancing the Benefit of the Chemical Mixture Methodology: A Report on Methodology Testing and Potential Approaches for Improving Performance

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Xiao-Ying; Yao, Juan; He, Hua; Glantz, Clifford S.; Booth, Alexander E.

    2012-01-01

Extensive testing shows that the current version of the Chemical Mixture Methodology (CMM) is meeting its intended mission to provide conservative estimates of the health effects from exposure to airborne chemical mixtures. However, the current version of the CMM could benefit from several enhancements that are designed to improve its application of Health Code Numbers (HCNs) and employ weighting factors to reduce over-conservatism.

  10. Acceleration-based methodology to assess the blast mitigation performance of explosive ordnance disposal helmets

    Science.gov (United States)

    Dionne, J. P.; Levine, J.; Makris, A.

    2018-01-01

    To design the next generation of blast mitigation helmets that offer increasing levels of protection against explosive devices, manufacturers must be able to rely on appropriate test methodologies and human surrogates that will differentiate the performance level of various helmet solutions and ensure user safety. Ideally, such test methodologies and associated injury thresholds should be based on widely accepted injury criteria relevant within the context of blast. Unfortunately, even though significant research has taken place over the last decade in the area of blast neurotrauma, there currently exists no agreement in terms of injury mechanisms for blast-induced traumatic brain injury. In absence of such widely accepted test methods and injury criteria, the current study presents a specific blast test methodology focusing on explosive ordnance disposal protective equipment, involving the readily available Hybrid III mannequin, initially developed for the automotive industry. The unlikely applicability of the associated brain injury criteria (based on both linear and rotational head acceleration) is discussed in the context of blast. Test results encompassing a large number of blast configurations and personal protective equipment are presented, emphasizing the possibility to develop useful correlations between blast parameters, such as the scaled distance, and mannequin engineering measurements (head acceleration). Suggestions are put forward for a practical standardized blast testing methodology taking into account limitations in the applicability of acceleration-based injury criteria as well as the inherent variability in blast testing results.
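
The article notes that acceleration-based criteria carry over awkwardly from the automotive world. The standard linear-acceleration metric of that kind for the Hybrid III is the Head Injury Criterion (HIC); the discrete implementation below is a generic sketch over a synthetic acceleration trace, not the authors' code.

```python
# Minimal discrete Head Injury Criterion (HIC) computation:
# HIC = max over windows [t1, t2] of ((1/(t2-t1)) * integral(a dt))^2.5 * (t2-t1)
# with acceleration in g, time in seconds, trapezoidal integration.
def hic(times, accel, max_window=0.015):
    """HIC over all sample-aligned windows no longer than max_window (HIC15)."""
    best = 0.0
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            dt = times[j] - times[i]
            if dt > max_window:
                break
            # trapezoidal integral of a(t) from t_i to t_j
            area = sum((accel[k] + accel[k + 1]) / 2 * (times[k + 1] - times[k])
                       for k in range(i, j))
            best = max(best, (area / dt) ** 2.5 * dt)
    return best

t = [k * 0.001 for k in range(11)]                # 0..10 ms, 1 ms steps
a = [0, 20, 60, 120, 160, 120, 60, 20, 0, 0, 0]  # synthetic pulse, g (assumed)
print(f"HIC15 = {hic(t, a):.0f}")
```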

  11. Demonstration of a performance assessment methodology for high-level radioactive waste disposal in basalt formations

    International Nuclear Information System (INIS)

    Bonano, E.J.; Davis, P.A.; Shipers, L.R.; Brinster, K.F.; Beyler, W.E.; Updegraff, C.D.; Shepherd, E.R.; Tilton, L.M.; Wahi, K.K.

    1989-06-01

This document describes a performance assessment methodology developed for a high-level radioactive waste repository mined in deep basalt formations. This methodology is an extension of an earlier one applicable to bedded salt. The differences between the two methodologies arise primarily in the modeling of ground-water flow and radionuclide transport. Bedded salt was assumed to be a porous medium, whereas basalt formations contain fractured zones. Therefore, mathematical models and associated computer codes were developed to simulate the aforementioned phenomena in fractured media. The use of the methodology is demonstrated at a hypothetical basalt site by analyzing seven scenarios: (1) thermohydrological effects caused by heat released from the repository, (2) mechanohydrological effects caused by an advancing and receding glacier, (3) normal ground-water flow, (4) pumping of ground water from a confined aquifer, (5) rerouting of a river near the repository, (6) drilling of a borehole through the repository, and (7) formation of a new fault intersecting the repository. The normal ground-water flow was considered the base-case scenario. This scenario was used to perform uncertainty and sensitivity analyses and to demonstrate the existing capabilities for assessing compliance with the ground-water travel time criterion and the containment requirements. Most of the other scenarios were considered perturbations of the base case, and a few were studied in terms of changes with respect to initial conditions. The potential impact of these scenarios on the long-term performance of the disposal system was ascertained through comparison with the base-case scenario or the undisturbed initial conditions. 66 refs., 106 figs., 27 tabs
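
The ground-water travel time criterion cited above reduces, for a single homogeneous flow leg, to Darcy's law. The sketch below uses invented parameter values merely to show the arithmetic; it is not a result for the hypothetical basalt site.

```python
# Ground-water travel time from Darcy's law for one homogeneous leg.
# All parameter values are assumed placeholders, not site data.
K = 1e-7     # hydraulic conductivity, m/s (assumed, fractured rock)
i = 0.01     # hydraulic gradient (assumed)
n_e = 0.005  # effective (fracture) porosity (assumed)
L = 5000.0   # flow-path length to the accessible environment, m (assumed)

v = K * i / n_e                                  # average linear velocity, m/s
travel_time_years = L / v / (3600 * 24 * 365)
print(f"ground-water travel time ≈ {travel_time_years:.0f} years")
```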

  12. A proposed methodology for performing risk analysis of state radiation control programs

    International Nuclear Information System (INIS)

    Dornsife, W.P.

    1996-01-01

This paper comprises viewgraphs from a conference presentation. Topics discussed include barriers to effective risk assessment and management, and real versus perceived risk for various radiation programs in the state of Pennsylvania. Calculation results for Pennsylvania are provided for low-level radioactive waste transportation risks, indoor radon risk, and cancer morbidity risk from x-rays. A methodology for prioritizing radiation regulatory programs based on risk is presented, with calculations for various Pennsylvania programs.

  13. Assessing the Impact of Clothing and Individual Equipment (CIE) on Soldier Physical, Biomechanical, and Cognitive Performance Part 1: Test Methodology

    Science.gov (United States)

    2018-02-01

  14. Quick Green Scan: A Methodology for Improving Green Performance in Terms of Manufacturing Processes

    Directory of Open Access Journals (Sweden)

    Aldona Kluczek

    2017-01-01

Full Text Available The heating sector has begun implementing technologies and practices to tackle the environmental and socio-economic problems caused by its production processes. The purpose of this paper is to develop a methodology, “the Quick-Green-Scan”, that addresses decision-makers' need for a quick assessment to improve green manufacturing performance in companies that produce heating devices. The study uses a structured approach that integrates Life Cycle Assessment-based indicators, a framework and linguistic scales (fuzzy numbers) to evaluate the extent of greening of the enterprise. The evaluation criteria and indicators are closely related to the current state of technology, which can be improved. The proposed methodology has been created to answer the question whether a company acts on the opportunity to be green and whether these actions are contributing towards greening, maintaining the status quo or moving away from a green outcome. Results show that applying the proposed improvements in processes helps move the facility towards being a green enterprise. Moreover, the methodology, being particularly quick and simple, is a practical tool for benchmarking, not only in the heating industry, but also proves useful in providing comparisons for facility performance in other manufacturing sectors.

  15. Methodology to extract of humic substances of lombricompost and evaluation of their performance

    International Nuclear Information System (INIS)

    Torrente Trujillo, Armando; Gomez Zambrano, Jairo

    1995-01-01

The present work was developed at the Facultad de Ciencias Agropecuarias of the Universidad Nacional de Colombia, located in Palmira, Valle del Cauca. The research consisted of developing an appropriate methodology to extract the humic substances contained in lombricompost and, on the other hand, of evaluating the organic-carbon performance of the fulvic and humic acids. The lombricompost sources consisted of organic matter such as cow dung, filter press cake, coffee pulp and Paspalum notatum, with and without application of lime. The proposed methodology comprises sixteen steps, which are completely described in the work. The method also showed that the humic acids in the lombricompost are richer than the fulvic ones; besides, among the four sources used in the experiment, the filter press cake was distinctly higher in carbon yield than coffee pulp and Paspalum notatum.

  16. Reference Performance Test Methodology for Degradation Assessment of Lithium-Sulfur Batteries

    DEFF Research Database (Denmark)

    Knap, Vaclav; Stroe, Daniel-Ioan; Purkayastha, Rajlakshmi

    2018-01-01

Lithium-Sulfur (Li-S) is an emerging battery technology receiving a growing amount of attention due to its potentially high gravimetric energy density, safety, and low production cost. However, there are still some obstacles preventing its swift commercialization. Li-S batteries are driven...... by different electrochemical processes than commonly used Lithium-ion batteries, which often results in very different behavior. Therefore, the testing and modeling of these systems have to be adjusted to reflect their unique behavior and to prevent possible bias. A methodology for a Reference Performance Test...... (RPT) for Li-S batteries is proposed in this study to point out Li-S battery features and to provide guidance to users on how to deal with them, possibly leading to standardization. The proposed test methodology is demonstrated for 3.4 Ah Li-S cells aged under different conditions....

  17. Development of a methodology for the evaluation of radiation protection performance and management in nuclear power plants

    International Nuclear Information System (INIS)

    Schieber, Caroline; Bataille, Celine; Cordier, Gerard; Delabre, Herve; Jeannin, Bernard

    2008-01-01

    This paper describes a specific methodology adopted by Electricite de France to perform the evaluation of radiation protection performance and management within its 19 nuclear power plants. The results obtained in 2007 are summed up. (author)

  18. Generalized Characterization Methodology for Performance Modelling of Lithium-Ion Batteries

    DEFF Research Database (Denmark)

    Stroe, Daniel Loan; Swierczynski, Maciej Jozef; Stroe, Ana-Irina

    2016-01-01

    Lithium-ion (Li-ion) batteries are complex energy storage devices with their performance behavior highly dependent on the operating conditions (i.e., temperature, load current, and state-of-charge (SOC)). Thus, in order to evaluate their techno-economic viability for a certain application, detailed...... information about Li-ion battery performance behavior becomes necessary. This paper proposes a comprehensive seven-step methodology for laboratory characterization of Li-ion batteries, in which the battery’s performance parameters (i.e., capacity, open-circuit voltage (OCV), and impedance) are determined...... and their dependence on the operating conditions are obtained. Furthermore, this paper proposes a novel hybrid procedure for parameterizing the batteries’ equivalent electrical circuit (EEC), which is used to emulate the batteries’ dynamic behavior. Based on this novel parameterization procedure, the performance model...
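
A common choice for the equivalent electrical circuit mentioned above is a first-order Thevenin model: a series resistance plus one RC pair. The sketch below uses assumed parameter values and is not the paper's parameterization procedure.

```python
import math

# First-order Thevenin EEC: terminal voltage response to a constant
# discharge current step at t = 0. Parameter values are illustrative.
OCV = 3.7             # open-circuit voltage at the test SOC, V (assumed)
R0 = 0.05             # series (ohmic) resistance, ohm (assumed)
R1, C1 = 0.03, 2000.0 # polarization resistance (ohm) / capacitance (F) (assumed)

def terminal_voltage(t, current):
    """Terminal voltage at time t for a constant discharge current."""
    tau = R1 * C1                              # RC time constant, s
    return OCV - current * R0 - current * R1 * (1.0 - math.exp(-t / tau))

for t in (0.0, 60.0, 600.0):
    print(f"t = {t:5.0f} s: V = {terminal_voltage(t, 2.0):.3f} V")
```

At t = 0 only the ohmic drop is seen; as the RC branch charges, the voltage relaxes toward the full steady-state drop, which is the dynamic behavior such an EEC is fitted to emulate.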

  19. A methodology for energy performance classification of residential building stock of Hamirpur

    Directory of Open Access Journals (Sweden)

    Aniket Sharma

    2017-12-01

Full Text Available In India, there are various codes, standards, guidelines and rating systems launched to make energy-intensive and large-sized buildings energy efficient, whereas independent residential buildings are not covered even though they account for the largest share of the total housing stock. This paper presents a case study methodology for energy performance assessment of the existing residential stock of Hamirpur that can be used to develop suitable energy efficiency regulations. The paper discusses the trend of residential development in Hamirpur, followed by classification based on usage, condition, predominant material use, ownership, size and number of rooms, source of lighting, assets available, number of storeys and plot sizes, using primary and secondary data. This results in identification of the predominant materials used and other characteristics in each of the urban and rural areas. Further, a cradle-to-site embodied energy index of the various dominant building materials and their market-available alternatives is calculated from secondary literature and by calculating transportation energy. One representative existing building is selected in each of the urban and rural areas, and their energy performance is evaluated for material embodied energy and operational energy using simulation. Further alternatives are developed based on other dominant materials in each area and evaluated for the change in embodied and operational energy. This paper identifies the energy performance of representative houses for both areas and in no way advocates the preference of one type over another. The paper demonstrates a methodology by which energy performance assessment of houses can be done and also highlights directions for further research.
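
The cradle-to-site embodied energy index described above is, in essence, cradle-to-gate embodied energy plus transport energy to the site. A minimal sketch with assumed coefficients (not Hamirpur data):

```python
# Cradle-to-site embodied energy = mass * embodied-energy coefficient
#                                 + transport energy (tonne-km basis).
# All masses, coefficients and distances are assumed placeholders.
materials = {  # name: (mass kg, embodied energy MJ/kg, haul distance km)
    "fired brick":   (12000, 3.0, 40),
    "cement":        (3000, 4.5, 120),
    "steel (rebar)": (500, 24.0, 200),
}
TRUCK_MJ_PER_TKM = 1.2  # transport energy intensity, MJ per tonne-km (assumed)

total = 0.0
for name, (mass, ee, dist) in materials.items():
    cradle_to_gate = mass * ee
    transport = (mass / 1000.0) * dist * TRUCK_MJ_PER_TKM
    total += cradle_to_gate + transport
    print(f"{name:14s}: {cradle_to_gate + transport:10.0f} MJ")
print(f"cradle-to-site total: {total:.0f} MJ")
```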

  20. Wind Energy Development in India and a Methodology for Evaluating Performance of Wind Farm Clusters

    Directory of Open Access Journals (Sweden)

    Sanjeev H. Kulkarni

    2016-01-01

Full Text Available With the maturity of advanced technologies and the urgent requirement to maintain a healthy environment at a reasonable price, India is moving towards a trend of generating electricity from renewable resources. Wind energy production, with its relatively safer and positive environmental characteristics, has evolved from a marginal activity into a multibillion dollar industry today. Wind energy power plants, also known as wind farms, comprise multiple wind turbines. Though there are several wind-mill clusters producing energy in different geographical locations across the world, evaluating their performance is a complex task and is an important focus for stakeholders. In this work an attempt is made to estimate the performance of wind clusters employing a multicriteria approach. Multiple factors that affect wind farm operations are analyzed by taking experts' opinions, and a performance ranking of the wind farms is generated. The weights of the selection criteria are determined by pairwise comparison matrices of the Analytic Hierarchy Process (AHP). The proposed methodology evaluates wind farm performance based on technical, economic, environmental, and sociological indicators. Both qualitative and quantitative parameters were considered. Empirical data were collected through a questionnaire from the selected wind farms of Belagavi district in the Indian State of Karnataka. This proposed methodology is a useful tool for cluster analysis.
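
The pairwise-comparison step of the AHP can be sketched as follows: criterion weights are the normalized principal eigenvector of the reciprocal comparison matrix, with a consistency-ratio check. The 3x3 matrix below (e.g. technical vs economic vs environmental) is an invented example, not the study's data.

```python
import numpy as np

# AHP weight derivation from a reciprocal pairwise-comparison matrix.
# The matrix entries are invented example judgments.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue (lambda_max)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalized criterion weights

# Consistency check: CI = (lambda_max - n) / (n - 1); random index RI = 0.58
# for n = 3; CR < 0.1 is the conventional acceptance threshold.
CI = (eigvals[k].real - len(A)) / (len(A) - 1)
CR = CI / 0.58
print("weights:", np.round(w, 3), f"CR = {CR:.3f}")
```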

  1. Performance analysis for disposal of mixed low-level waste. 1: Methodology

    International Nuclear Information System (INIS)

    Waters, R.D.; Gruebel, M.M.

    1999-01-01

    A simple methodology has been developed for evaluating the technical capabilities of potential sites for disposal of mixed low-level radioactive waste. The results of the evaluation are expressed as permissible radionuclide concentrations in disposed waste. The methodology includes an analysis of three separate pathways: (1) releases of radionuclides to groundwater; (2) releases of potentially volatile radionuclides to the atmosphere; and (3) the consequences of inadvertent intrusion into a disposal facility. For each radionuclide, its limiting permissible concentration in disposed waste is the lowest of the permissible concentrations determined from each of the three pathways. These permissible concentrations in waste at an evaluated site can be used to assess the capability of the site to dispose of waste streams containing multiple radionuclides
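
The limiting-concentration rule described above is simply a minimum over the three pathway-specific limits for each radionuclide. A sketch with invented placeholder values (not site results):

```python
# For each radionuclide, the limiting permissible waste concentration is the
# minimum of the pathway-derived limits. All values are invented placeholders.
pathway_limits = {  # Bq/m3, illustrative only
    "H-3":   {"groundwater": 4e9, "atmosphere": 2e10, "intrusion": 7e11},
    "C-14":  {"groundwater": 3e7, "atmosphere": 9e8,  "intrusion": 5e9},
    "Tc-99": {"groundwater": 1e6, "atmosphere": 8e12, "intrusion": 2e8},
}

limits = {nuc: min(p.values()) for nuc, p in pathway_limits.items()}
for nuc, lim in limits.items():
    print(f"{nuc}: limiting permissible concentration = {lim:.1e} Bq/m3")
```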

  2. Methodology for electrical studies in industrial networks including the study of electric arc; Metodologia para los estudios electricos en redes industriales incluyendo el estudio de arco electrico

    Energy Technology Data Exchange (ETDEWEB)

    Rasgado Casique, Jose Pepe; Silva Farias, Jose Luis [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)]. E-mail: jrasgado@iie.org.mx; jlsilva@iie.org.mx

    2010-11-15

This article presents a methodology for carrying out electrical studies in industrial networks. The methodology includes the arc-flash study as a very important addition to the basic electrical studies currently performed, such as power flow, short circuit, and protection coordination. The aim of the arc-flash study is to determine the appropriate Personal Protective Equipment (PPE) and the flash protection boundary for personnel working on or near energized equipment, based on IEEE Std 1584-2004 and NFPA 70E-2004. Criteria and recommendations to reduce the incident energy level (cal/cm²) are also included. An industrial-type test distribution network was used, and the studies were carried out with a commercial program for the analysis of electrical networks.
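
A hedged sketch of the distance scaling used in arc-flash studies of this kind: incident energy falls off with distance as E(D) = E_ref·(D_ref/D)^x, following the working-distance treatment in IEEE Std 1584, and the flash protection boundary is the distance at which E equals 1.2 cal/cm² (onset of a second-degree burn). All numeric values below are assumed examples, not results from the study.

```python
# Flash-protection boundary from an inverse-distance-exponent scaling of
# incident energy. E_ref, D_ref and the exponent x are assumed examples.
E_ref = 8.0    # incident energy at the working distance, cal/cm^2 (assumed)
D_ref = 455.0  # working distance, mm (assumed)
x = 1.5        # distance exponent, typical of enclosed equipment (assumed)
E_limit = 1.2  # boundary criterion, cal/cm^2

# Solve E_limit = E_ref * (D_ref / D)^x for D:
boundary_mm = D_ref * (E_ref / E_limit) ** (1.0 / x)
print(f"flash-protection boundary ≈ {boundary_mm:.0f} mm")
```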

  3. Development of evaluation methodology to assess the sodium fire suppression performance of leak collection tray

    International Nuclear Information System (INIS)

    Parida, F.C.; Rao, P.M.; Ramesh, S.S.; Somayajulu, P.A.; Malarvizhi, B.; Kannan, S.E.

    2005-01-01

Full text of publication follows: Leakage of hot liquid sodium and its subsequent combustion in the form of a pool cannot be completely ruled out in a Fast Breeder Reactor (FBR) plant in spite of provision for adequate safety measures. To protect the plant system from the hazardous effects of flame, heat and smoke, one of the passive protection devices used in FBR plants is the Leak Collection Tray (LCT). The design of the LCT is based on immediate channeling of burning liquid sodium on the funnel-shaped sloping cover tray (SCT) to the bottom sodium hold-up vessel (SHV), in which self-extinction of the fire occurs due to oxygen starvation. The SCT has one or three drain pipes and air vent pipes depending on the type of design. In each experiment, a known amount ranging from 30 to 40 kg of hot liquid sodium at 550 deg. C was discharged on the LCT in the open air. Continuous on-line monitoring of temperature at strategic locations (∼ 28 points) was carried out. Colour videography was employed for taking motion pictures of various time-dependent events like sodium dumping, appearance of flame and release of smoke through vent pipes. After self-extinction of the sodium fire, the LCT was allowed to cool overnight in an argon atmosphere. Solid samples of sodium debris in the SCT and SHV were collected by a manual core drilling machine. The samples were subjected to chemical analysis for determination of unburnt and burnt sodium. The sodium debris removed from the SCT and SHV were separately weighed. To assess the performance of the LCT, two different geometrical configurations of SCT, one made of stainless steel and the other of carbon steel, were used. Three broad phenomena are identified as the basis of the evaluation methodology. These are (a) thermal transients, i.e. heating and cooling of the bulk sodium in the SCT and SHV respectively, (b) post-test sodium debris distribution between the SCT and SHV as well as (c) sodium combustion and smoke release behaviour. Under each category

  5. FPGA hardware acceleration for high performance neutron transport computation based on agent methodology - 318

    International Nuclear Information System (INIS)

    Shanjie, Xiao; Tatjana, Jevremovic

    2010-01-01

    The accurate, detailed and 3D neutron transport analysis for Gen-IV reactors is still time-consuming regardless of the advanced computational hardware available in developed countries. This paper introduces a new concept for addressing the computational time while preserving detailed and accurate modeling: a specifically designed FPGA co-processor accelerates the robust AGENT methodology for complex reactor geometries. For the first time this approach is applied to accelerate neutronics analysis. The AGENT methodology solves the neutron transport equation using the method of characteristics. The performance of the AGENT methodology was carefully analyzed before the hardware design based on the FPGA co-processor was adopted. The most time-consuming kernel part is then transplanted into the FPGA co-processor. The FPGA co-processor is designed with a data-flow-driven, non-von-Neumann architecture and has much higher efficiency than the conventional computer architecture. Details of the FPGA co-processor design are introduced and the design is benchmarked using two different examples. The advanced chip architecture enables the FPGA co-processor to obtain a more than 20-times speed-up with a working frequency much lower than the CPU frequency. (authors)

  6. Electrical performance verification methodology for large reflector antennas: based on the P-band SAR payload of the ESA BIOMASS candidate mission

    DEFF Research Database (Denmark)

    Pivnenko, Sergey; Kim, Oleksiy S.; Nielsen, Jeppe Majlund

    2013-01-01

    In this paper, an electrical performance verification methodology for large reflector antennas is proposed. The verification methodology was developed for the BIOMASS P-band (435 MHz) synthetic aperture radar (SAR), but can be applied to other large deployable or fixed reflector antennas for which the verification of the entire antenna or payload is impossible. The two-step methodology is based on accurate measurement of the feed structure characteristics, such as complex radiation pattern and radiation efficiency, with an appropriate measurement technique, and then accurate calculation of the radiation pattern and gain of the entire antenna including support and satellite structure with appropriate computational software. A preliminary investigation of the proposed methodology was carried out by performing extensive simulations of different verification approaches. The experimental validation...

  7. Repository environmental parameters and models/methodologies relevant to assessing the performance of high-level waste packages in basalt, tuff, and salt

    Energy Technology Data Exchange (ETDEWEB)

    Claiborne, H.C.; Croff, A.G.; Griess, J.C.; Smith, F.J.

    1987-09-01

    This document provides specifications for models/methodologies that could be employed in determining postclosure repository environmental parameters relevant to the performance of high-level waste packages for the Basalt Waste Isolation Project (BWIP) at Richland, Washington, the tuff at Yucca Mountain by the Nevada Test Site, and the bedded salt in Deaf Smith County, Texas. Guidance is provided on the identity of the relevant repository environmental parameters; the models/methodologies employed to determine the parameters; and the input data base for the models/methodologies. Supporting studies included are an analysis of potential waste package failure modes leading to identification of the relevant repository environmental parameters, an evaluation of the credible range of the repository environmental parameters, and a summary of the review of existing models/methodologies currently employed in determining repository environmental parameters relevant to waste package performance. 327 refs., 26 figs., 19 tabs.

  8. Repository environmental parameters and models/methodologies relevant to assessing the performance of high-level waste packages in basalt, tuff, and salt

    International Nuclear Information System (INIS)

    Claiborne, H.C.; Croff, A.G.; Griess, J.C.; Smith, F.J.

    1987-09-01

    This document provides specifications for models/methodologies that could be employed in determining postclosure repository environmental parameters relevant to the performance of high-level waste packages for the Basalt Waste Isolation Project (BWIP) at Richland, Washington, the tuff at Yucca Mountain by the Nevada Test Site, and the bedded salt in Deaf Smith County, Texas. Guidance is provided on the identity of the relevant repository environmental parameters; the models/methodologies employed to determine the parameters; and the input data base for the models/methodologies. Supporting studies included are an analysis of potential waste package failure modes leading to identification of the relevant repository environmental parameters, an evaluation of the credible range of the repository environmental parameters, and a summary of the review of existing models/methodologies currently employed in determining repository environmental parameters relevant to waste package performance. 327 refs., 26 figs., 19 tabs

  9. Methodology for thermal hydraulic conceptual design and performance analysis of KALIMER core

    International Nuclear Information System (INIS)

    Young-Gyun Kim; Won-Seok Kim; Young-Jin Kim; Chang-Kue Park

    2000-01-01

    This paper summarizes the methodology for thermal hydraulic conceptual design and performance analysis used for the KALIMER core, in particular the preliminary methodology for flow grouping and peak pin temperature calculation. The major technical results of the conceptual design for the KALIMER 98.03 core are shown and compared with those of the KALIMER 97.07 design core. The KALIMER 98.03 design core proves to be better optimized than the 97.07 design core: the number of flow groups is reduced from 16 to 11, and the equalized peak cladding midwall temperature from 654 deg. C to 628 deg. C. This was achieved through nuclear and thermal hydraulic design optimization, i.e. core power flattening and an increase of the radial blanket power fraction. Coolant flow distribution to the assemblies and core coolant/component temperatures are determined in the core thermal hydraulic analysis. Sodium flow is distributed to core assemblies with the overall goal of equalizing the peak cladding midwall temperatures for the peak-temperature pin of each bundle, and thus pin cladding damage accumulation and pin reliability. The flow grouping and the peak pin temperature calculation for the preliminary conceptual design are performed with the modules ORFCE-F60 and ORFCE-T60, respectively. The basic subchannel analysis will be performed with the SLTHEN code, and the detailed subchannel analysis with the MATRA-LMR code, which is under development for the K-Core system. This methodology proved practical for KALIMER core thermal hydraulic design in the related benchmark calculation studies, and it is used for the KALIMER core thermal hydraulic conceptual design. (author)

  10. Providing hierarchical approach for measuring supply chain performance using AHP and DEMATEL methodologies

    Directory of Open Access Journals (Sweden)

    Ali Najmi

    2010-06-01

    Full Text Available Measuring the performance of a supply chain is normally a function of various parameters. Such a problem often involves a multiple criteria decision making (MCDM) problem where different criteria need to be defined and calculated properly. During the past two decades, the Analytic Hierarchy Process (AHP) and DEMATEL have been among the most popular MCDM approaches for prioritizing various attributes. This paper uses a new methodology which combines AHP and DEMATEL to rank various parameters affecting the performance of the supply chain. DEMATEL is used for understanding the relationships between the comparison metrics, and AHP is used to integrate them into a value for the overall performance.
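
    The combination described above, DEMATEL's total-relation matrix to map influence among metrics and AHP's eigenvector weights to aggregate an overall score, can be sketched as follows. All numbers (the influence matrix, the pairwise comparisons, and the metric scores) are invented for illustration and are not taken from the paper:

```python
import numpy as np

# Hypothetical direct-influence matrix among three supply-chain metrics
# (cost, delivery, quality); the 0-4 expert ratings are invented.
A = np.array([[0.0, 3.0, 2.0],
              [1.0, 0.0, 3.0],
              [2.0, 1.0, 0.0]])

# DEMATEL: normalise, then total-relation matrix T = D (I - D)^-1
D = A / A.sum(axis=1).max()
T = D @ np.linalg.inv(np.eye(3) - D)
prominence = T.sum(axis=1) + T.sum(axis=0)  # overall involvement of each metric
relation   = T.sum(axis=1) - T.sum(axis=0)  # net cause (+) vs. net effect (-)

# AHP: priority weights from the principal eigenvector of a pairwise matrix
P = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])
vals, vecs = np.linalg.eig(P)
w = np.real(vecs[:, np.argmax(np.real(vals))])
weights = w / w.sum()

# Integration: weighted sum of (assumed) normalised metric scores
overall = weights @ np.array([0.7, 0.8, 0.9])
```

    The DEMATEL step here only characterises the causal structure (prominence/relation); the AHP weights carry the aggregation, mirroring the division of labour the abstract describes.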

  11. High performance computation of landscape genomic models including local indicators of spatial association.

    Science.gov (United States)

    Stucki, S; Orozco-terWengel, P; Forester, B R; Duruz, S; Colli, L; Masembe, C; Negrini, R; Landguth, E; Jones, M R; Bruford, M W; Taberlet, P; Joost, S

    2017-09-01

    With the increasing availability of both molecular and topo-climatic data, the main challenges facing landscape genomics - that is the combination of landscape ecology with population genomics - include processing large numbers of models and distinguishing between selection and demographic processes (e.g. population structure). Several methods address the latter, either by estimating a null model of population history or by simultaneously inferring environmental and demographic effects. Here we present samβada, an approach designed to study signatures of local adaptation, with special emphasis on high performance computing of large-scale genetic and environmental data sets. samβada identifies candidate loci using genotype-environment associations while also incorporating multivariate analyses to assess the effect of many environmental predictor variables. This enables the inclusion of explanatory variables representing population structure into the models to lower the occurrences of spurious genotype-environment associations. In addition, samβada calculates local indicators of spatial association for candidate loci to provide information on whether similar genotypes tend to cluster in space, which constitutes a useful indication of the possible kinship between individuals. To test the usefulness of this approach, we carried out a simulation study and analysed a data set from Ugandan cattle to detect signatures of local adaptation with samβada, bayenv, lfmm and an FST outlier method (FDIST approach in Arlequin) and compare their results. samβada - an open source software for Windows, Linux and Mac OS X available at http://lasig.epfl.ch/sambada - outperforms other approaches and better suits whole-genome sequence data processing. © 2016 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.

  12. A methodology to investigate the contribution of conduction and radiation heat transfer to the effective thermal conductivity of packed graphite pebble beds, including the wall effect

    Energy Technology Data Exchange (ETDEWEB)

    De Beer, M., E-mail: maritz.db@gmail.com [School of Mechanical and Nuclear Engineering, North-West University, Private Bag X6001, Potchefstroom 2520 (South Africa); Du Toit, C.G., E-mail: Jat.DuToit@nwu.ac.za [School of Mechanical and Nuclear Engineering, North-West University, Private Bag X6001, Potchefstroom 2520 (South Africa); Rousseau, P.G., E-mail: pieter.rousseau@uct.ac.za [Department of Mechanical Engineering, University of Cape Town, Private Bag X3, Rondebosch 7701 (South Africa)

    2017-04-01

    Highlights: • The radiation and conduction components of the effective thermal conductivity are separated. • Near-wall effects have a notable influence on the effective thermal conductivity. • Effective thermal conductivity is a function of the macro temperature gradient. • The effective thermal conductivity profile shows a characteristic trend. • The trend is a result of the interplay between conduction and radiation. - Abstract: The effective thermal conductivity represents the overall heat transfer characteristics of a packed bed of spheres and must be considered in the analysis and design of pebble bed gas-cooled reactors. During depressurized loss of forced cooling conditions the dominant heat transfer mechanisms for the passive removal of decay heat are radiation and conduction. Predicting the value of the effective thermal conductivity is complex since it inter alia depends on the temperature level and temperature gradient through the bed, as well as the pebble packing structure. The effect of the altered packing structure in the wall region must therefore also be considered. Being able to separate the contributions of radiation and conduction allows a better understanding of the underlying phenomena and the characteristics of the resultant effective thermal conductivity. This paper introduces a purpose-designed test facility and accompanying methodology that combines physical measurements with Computational Fluid Dynamics (CFD) simulations to separate the contributions of radiation and conduction heat transfer, including the wall effects. Preliminary results obtained with the methodology offer important insights into the trends observed in the experimental results and provide a better understanding of the interplay between the underlying heat transfer phenomena.

  13. Including Performance Assessments in Accountability Systems: A Review of Scale-Up Efforts

    Science.gov (United States)

    Tung, Rosann

    2010-01-01

    The purpose of this literature and field review is to understand previous efforts at scaling up performance assessments for use across districts and states. Performance assessments benefit students and teachers by providing more opportunities for students to demonstrate their knowledge and complex skills, by providing teachers with better…

  14. Proposed methodology for completion of scenario analysis for the Basalt Waste Isolation Project. [Assessment of post-closure performance for a proposed repository for high-level nuclear waste

    Energy Technology Data Exchange (ETDEWEB)

    Roberds, W.J.; Plum, R.J.; Visca, P.J.

    1984-11-01

    This report presents the methodology to complete an assessment of postclosure performance, considering all credible scenarios, including the nominal case, for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening scenarios, and for then assessing the risks associated with each. The results of the scenario analysis are used to comprehensively determine system performance and/or risk for evaluation of compliance with postclosure performance criteria (10 CFR 60 and 40 CFR 191). In addition to describing the proposed methodology, this report reviews available methodologies for scenario analysis, discusses pertinent performance assessment and uncertainty concepts, advises how to implement the methodology (including the organizational requirements and a description of tasks) and recommends how to use the methodology in guiding future site characterization, analysis, and engineered subsystem design work. 36 refs., 24 figs., 1 tab.

  15. Application of the BEPU methodology to assess fuel performance in dry storage

    International Nuclear Information System (INIS)

    Feria, F.; Herranz, L.E.

    2017-01-01

    Highlights: • Application of the BEPU methodology to estimate the cladding stress in dry storage. • The stress predicted is notably affected by the irradiation history. • Improvements of FGR modelling would significantly enhance the stress estimates. • The prediction uncertainty should not be disregarded when assessing clad integrity. - Abstract: The stress to which fuel cladding is subjected in dry storage is the driving force of the main postulated degradation mechanisms (i.e., embrittlement due to radial hydride reorientation, and creep). A sound assessment is therefore mandatory to reliably predict fuel performance under the conditions prevailing in dry storage. Best estimate calculations can be conducted with fuel rod thermo-mechanical codes. The precision of the predictions depends on the uncertainties affecting the calculation of the stress, so by using uncertainty analysis an upper bound of the stress can be determined and compared to the safety limits set. The present work shows the application of the BEPU (Best Estimate Plus Uncertainty) methodology in this field. Concretely, radial hydride reorientation has been assessed based on stress predictions under challenging thermal conditions (400 °C) and a stress limit of 90 MPa. The computational tools used are FRAPCON-3xt (best estimate) and Dakota (uncertainty analysis). The methodology has been applied to a typical PWR fuel rod highly irradiated (65 GWd/tU) under different power histories. The study shows that neither the power history nor the prediction uncertainty should be disregarded when fuel rod integrity is evaluated in dry storage. On probabilistic bases, a burnup of 60 GWd/tU is found to be an acceptable threshold even under the most challenging irradiation conditions considered.
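
    The BEPU workflow outlined above, a best-estimate stress calculation plus an uncertainty analysis yielding an upper bound to compare against the 90 MPa limit, can be illustrated with a toy model. Here a thin-wall hoop-stress formula stands in for FRAPCON-3xt, the input distributions are invented, and a 59-sample first-order Wilks rule replaces Dakota's sampling machinery:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 59  # first-order Wilks sample size for a one-sided 95%/95% bound

# Invented input distributions (illustrative only, not FRAPCON/Dakota values)
p = rng.normal(9.5e6, 0.5e6, N)     # rod internal pressure at 400 C [Pa]
r = rng.normal(4.7e-3, 0.05e-3, N)  # cladding mean radius [m]
t = rng.normal(0.6e-3, 0.02e-3, N)  # cladding thickness [m]

# Thin-wall hoop stress stands in for the code's thermo-mechanical model
sigma = p * r / t / 1e6  # [MPa]

# By Wilks' formula, the maximum of 59 runs is a 95/95 upper tolerance bound
upper_bound = sigma.max()
ok = upper_bound < 90.0  # compare against the 90 MPa reorientation limit
```

    The point of the sketch is the structure: sample the uncertain inputs, propagate each sample through the best-estimate model, and take an order statistic as the bound, rather than comparing only the nominal prediction to the limit.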

  16. Development of performance assessment methodology for establishment of quantitative acceptance criteria of near-surface radioactive waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Kim, C. R.; Lee, E. Y.; Park, J. W.; Chang, G. M.; Park, H. Y.; Yeom, Y. S. [Korea Hydro and Nuclear Power Co., Ltd., Seoul (Korea, Republic of)

    2002-03-15

    The contents and the scope of this study are as follows : review of state-of-the-art on the establishment of waste acceptance criteria in foreign near-surface radioactive waste disposal facilities, investigation of radiological assessment methodologies and scenarios, investigation of existing models and computer codes used in performance/safety assessment, development of a performance assessment methodology(draft) to derive quantitatively radionuclide acceptance criteria of domestic near-surface disposal facility, preliminary performance/safety assessment in accordance with the developed methodology.

  17. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, this causality has been rigorously criticized by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using econometric analysis, namely the Granger causality test, on 45 data points. However, the insufficiency of well-established causality models was found, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested for situations with insufficient historical data. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts over 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. The existence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme-condition tests were conducted on the developed SD model to ensure its capability to mimic reality, and its robustness and validity as a causality analysis platform. This study applied a theoretical service management model within the BSC domain to a practical situation using the SD methodology, where very limited work has been done.
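
    The Granger causality test mentioned above reduces, in its simplest bivariate form, to an F-test comparing a restricted autoregression of y with one augmented by lags of x. The sketch below uses invented synthetic series (x driving y with a one-step lag), not the study's 45 data points:

```python
import numpy as np

def granger_f(y, x, lags=2):
    """F-statistic testing whether lags of x improve an autoregression of y."""
    n = len(y)
    Y = y[lags:]
    # Restricted model: intercept + lags of y only
    Xr = np.column_stack([np.ones(n - lags)] +
                         [y[lags - k: n - k] for k in range(1, lags + 1)])
    # Unrestricted model: additionally the lags of x
    Xu = np.column_stack([Xr] +
                         [x[lags - k: n - k] for k in range(1, lags + 1)])
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    df = (n - lags) - Xu.shape[1]
    return ((rss(Xr) - rss(Xu)) / lags) / (rss(Xu) / df)

# Synthetic series in which x drives y with a one-step lag
rng = np.random.default_rng(1)
x = rng.normal(size=300)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.normal()

f_xy = granger_f(y, x)  # large: x Granger-causes y
f_yx = granger_f(x, y)  # small: y does not Granger-cause x
```

    A large F for one direction and a small F for the other is the unidirectional pattern the Strategy Map assumes; comparable F values in both directions would indicate the bidirectional causality the study reports.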

  18. Preliminary Evaluation Methodology of ECCS Performance for Design Basis LOCA Redefinition

    International Nuclear Information System (INIS)

    Kang, Dong Gu; Ahn, Seung Hoon; Seul, Kwang Won

    2010-01-01

    To improve their existing regulations, the USNRC has made efforts to develop the risk-informed and performance-based regulation (RIPBR) approaches. As a part of these efforts, the rule revision of 10CFR50.46 (ECCS Acceptance Criteria) is underway, considering some options for 4 categories of spectrum of break sizes, ECCS functional reliability, ECCS evaluation model, and ECCS acceptance criteria. Since the potential for safety benefits and unnecessary burden reduction from design basis LOCA redefinition is high relative to other options, the USNRC is proceeding with the rulemaking for design basis LOCA redefinition. An instantaneous break with a flow rate equivalent to a double ended guillotine break (DEGB) of the largest primary piping system in the plant is widely recognized as an extremely unlikely event, while redefinition of design basis LOCA can affect the existing regulatory practices and approaches. In this study, the status of the design basis LOCA redefinition and OECD/NEA SMAP (Safety Margin Action Plan) methodology are introduced. Preliminary evaluation methodology of ECCS performance for LOCA is developed and discussed for design basis LOCA redefinition

  19. Use of Balanced Scorecard Methodology for Performance Measurement of the Health Extension Program in Ethiopia.

    Science.gov (United States)

    Teklehaimanot, Hailay D; Teklehaimanot, Awash; Tedella, Aregawi A; Abdella, Mustofa

    2016-05-04

    In 2004, Ethiopia introduced a community-based Health Extension Program to deliver basic and essential health services. We developed a comprehensive performance scoring methodology to assess the performance of the program. A balanced scorecard with six domains and 32 indicators was developed. Data collected from 1,014 service providers, 433 health facilities, and 10,068 community members sampled from 298 villages were used to generate weighted national, regional, and agroecological zone scores for each indicator. The national median indicator scores ranged from 37% to 98% with poor performance in commodity availability, workforce motivation, referral linkage, infection prevention, and quality of care. Indicator scores showed significant difference by region (P < 0.001). Regional performance varied across indicators suggesting that each region had specific areas of strength and deficiency, with Tigray and the Southern Nations, Nationalities and Peoples Region being the best performers while the mainly pastoral regions of Gambela, Afar, and Benishangul-Gumuz were the worst. The findings of this study suggest the need for strategies aimed at improving specific elements of the program and its performance in specific regions to achieve quality and equitable health services. © The American Society of Tropical Medicine and Hygiene.

  20. Development of performance assessment methodology for nuclear waste isolation in geologic media

    International Nuclear Information System (INIS)

    Bonano, E.J.; Chu, M.S.Y.; Cranwell, R.M.; Davis, P.A.

    1986-01-01

    The analysis of the processes involved in the burial of nuclear wastes can be performed only with reliable mathematical models and computer codes as opposed to conducting experiments because the time scales associated are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium. These are ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the US Nuclear Regulatory Commission

  1. Catalytic Reforming: Methodology and Process Development for a Constant Optimisation and Performance Enhancement

    Directory of Open Access Journals (Sweden)

    Avenier Priscilla

    2016-05-01

    Full Text Available Catalytic reforming has been used to produce high-octane gasoline since the 1940s. It would appear to be an old, well-established process for which nothing new could be done. This is however not the case, and constant improvements are proposed at IFP Energies nouvelles. With a global R&D approach using new concepts and forefront methodology, IFPEN is able to: propose a patented new reactor concept, increasing capacity; ensure the efficiency and safety of the mechanical design of the reactor using modeling of the structure; develop new catalysts to increase process performance, thanks to a deep comprehension of the catalytic mechanism obtained using an experimental and innovative analytical approach (119Sn Mössbauer and X-ray absorption spectroscopies) as well as Density Functional Theory (DFT) calculations; and maintain efficient, reliable and adapted pilots to validate catalyst performance.

  2. RANS Based Methodology for Predicting the Influence of Leading Edge Erosion on Airfoil Performance

    Energy Technology Data Exchange (ETDEWEB)

    Langel, Christopher M. [Univ. of California, Davis, CA (United States). Dept. of Mechanical and Aerospace Engineering; Chow, Raymond C. [Univ. of California, Davis, CA (United States). Dept. of Mechanical and Aerospace Engineering; van Dam, C. P. [Univ. of California, Davis, CA (United States). Dept. of Mechanical and Aerospace Engineering; Maniaci, David Charles [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Wind Energy Technologies Dept.

    2017-10-01

    The impact of surface roughness on flows over aerodynamically designed surfaces is of interest in a number of different fields. It has long been known that surface roughness will likely accelerate the laminar-turbulent transition process by creating additional disturbances in the boundary layer. However, there are very few tools available to predict the effects surface roughness will have on boundary layer flow. There are numerous implications of the premature appearance of a turbulent boundary layer. Increases in local skin friction, boundary layer thickness, and turbulent mixing can impact global flow properties, compounding the effects of surface roughness. With this motivation, an investigation into the effects of surface roughness on boundary layer transition has been conducted. The effort involved both an extensive experimental campaign and the development of a high-fidelity roughness model implemented in a RANS solver. Vast amounts of experimental data were generated at the Texas A&M Oran W. Nicks Low Speed Wind Tunnel for the calibration and validation of the roughness model described in this work, as well as future efforts. The present work focuses on the development of the computational model, including a description of the calibration process. The primary methodology introduces a scalar field variable and associated transport equation that interacts with a correlation-based transition model. The additional equation allows non-local effects of surface roughness to be accounted for downstream of rough wall sections while maintaining a "local" formulation. The scalar field is determined through a boundary condition function that has been calibrated to flat plate cases with sand grain roughness. The model was initially tested on a NACA 0012 airfoil with roughness strips applied to the leading edge. Further calibration of the roughness model was performed using results from the companion experimental study on a NACA 63(3)-418 airfoil

  3. Performance of muon reconstruction including Alignment Position Errors for 2016 Collision Data

    CERN Document Server

    CMS Collaboration

    2016-01-01

    Since the 2016 Run, muon reconstruction has used non-zero Alignment Position Errors to account for the residual uncertainties in the positions of the muon chambers. Significant improvements are obtained, in particular for the startup phase after opening/closing the muon detector. Performance results are presented for real data and MC simulations, for both the offline reconstruction and the High-Level Trigger.

  4. Transient performances analysis of wind turbine system with induction generator including flux saturation and skin effect

    DEFF Research Database (Denmark)

    Li, H.; Zhao, B.; Han, L.

    2010-01-01

    In order to analyze correctly the effect of different models for induction generators on the transient performances of large wind power generation, Wind turbine driven squirrel cage induction generator (SCIG) models taking into account both main and leakage flux saturation and skin effect were...

  5. Sandia National Laboratories performance assessment methodology for long-term environmental programs : the history of nuclear waste management.

    Energy Technology Data Exchange (ETDEWEB)

    Marietta, Melvin Gary; Anderson, D. Richard; Bonano, Evaristo J.; Meacham, Paul Gregory (Raytheon Ktech, Albuquerque, NM)

    2011-11-01

    Sandia National Laboratories (SNL) is the world leader in the development of the detailed science underpinning the application of a probabilistic risk assessment methodology, referred to in this report as performance assessment (PA), for (1) understanding and forecasting the long-term behavior of a radioactive waste disposal system, (2) estimating the ability of the disposal system and its various components to isolate the waste, (3) developing regulations, (4) implementing programs to estimate the safety that the system can afford to individuals and to the environment, and (5) demonstrating compliance with the attendant regulatory requirements. This report documents the evolution of the SNL PA methodology from its inception in the mid-1970s, summarizing major SNL PA applications including: the Subseabed Disposal Project PAs for high-level radioactive waste; the Waste Isolation Pilot Plant PAs for disposal of defense transuranic waste; the Yucca Mountain Project total system PAs for deep geologic disposal of spent nuclear fuel and high-level radioactive waste; PAs for the Greater Confinement Borehole Disposal boreholes at the Nevada National Security Site; and PA evaluations for disposal of high-level wastes and Department of Energy spent nuclear fuels stored at Idaho National Laboratory. In addition, the report summarizes smaller PA programs for long-term cover systems implemented for the Monticello, Utah, mill-tailings repository; a PA for the SNL Mixed Waste Landfill in support of environmental restoration; PA support for radioactive waste management efforts in Egypt, Iraq, and Taiwan; and, most recently, PAs for analysis of alternative high-level radioactive waste disposal strategies, including deep borehole disposal and geologic repositories in shale and granite. Finally, this report summarizes the extension of the PA methodology for radioactive waste disposal toward development of an enhanced PA system for carbon sequestration and storage systems

  6. A methodology to determine the power performance of wave energy converters at a particular coastal location

    International Nuclear Information System (INIS)

    Carballo, R.; Iglesias, G.

    2012-01-01

    Highlights: ► We develop a method to accurately compute the power output of a WEC at a site. ► The analysis of the wave resource is integrated seamlessly with the WEC efficiency. ► The intra-annual variability of the resource is considered. ► The method is illustrated with a case study: a WEC projected to be built in Spain. - Abstract: The assessment of the power performance of a wave energy converter (WEC) at a given site involves two tasks: (i) the characterisation of the wave resource at the site in question, and (ii) the computation of the WEC's power performance. These tasks are generally seen as disconnected and tackled as such; they are, however, deeply interrelated – so much so that they should be treated as two phases of the same procedure. Indeed, beyond the characterisation of the wave resource of a certain area lies a crucial question: how much power would a WEC installed in that area deliver to the network? This work has two main objectives. First, to develop a methodology that integrates both tasks seamlessly and guarantees the accurate computation of the power performance of a WEC installed at a site of interest; it involves a large dataset of deepwater records and the implementation of a high-resolution, nested spectral model, which is used to propagate 95% of the total offshore wave energy to the WEC site. The second objective is to illustrate this methodology with a case study: an Oscillating Water Column (OWC) projected to be constructed at the breakwater of A Guarda (NW Spain). It is found that the presented approach accurately determines the power that the WEC will deliver to the network, and that this power exhibits significant monthly variability, so an estimate of the energy production based on mean annual values may be misleading.
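
    The integration the abstract argues for ultimately reduces, for a given deployment, to weighting the WEC's power matrix by the sea-state occurrence matrix obtained from the propagated nearshore resource. The matrices below are invented placeholders, not data for A Guarda or the OWC described:

```python
import numpy as np

# Invented sea-state occurrence matrix (rows: Hs bins, cols: Te bins) and an
# invented WEC power matrix [kW]; neither is taken from the case study.
occurrence = np.array([[0.20, 0.15, 0.05],
                       [0.15, 0.20, 0.10],
                       [0.05, 0.05, 0.05]])  # fractions of the year (sum = 1)
power_kw   = np.array([[ 10.0,  20.0,  15.0],
                       [ 40.0,  80.0,  60.0],
                       [ 90.0, 150.0, 120.0]])

# Weighting the power matrix by the occurrences gives the mean output power,
# and hence the annual energy delivered to the network
mean_power_kw = float(np.sum(occurrence * power_kw))
annual_energy_mwh = mean_power_kw * 8760.0 / 1000.0
```

    The monthly variability the authors highlight would appear here as twelve monthly occurrence matrices instead of one annual matrix, each weighted the same way.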

  7. Influence of superoleophobic layer on the lubrication performance of partially textured bearing including cavitation

    Science.gov (United States)

    Tauviqirrahman, M.; Bayuseno, A. P.; Muchammad, Jamari, J.

    2016-04-01

    Surfaces with high superoleophobicity have attracted considerable attention because of their potential applications in scientific and industrial fields. Classical metal bearings in particular face a lubrication problem, because metal surfaces are typically oleophilic. The development of superoleophobic metal surfaces, which repel oil droplets, has significant applications in preventing stiction. In addition, for classical textured bearings, the occurrence of cavitation is often considered the main cause of deteriorated lubrication performance, which shortens the lifetime of the bearing. In the present study, the influence of adding a superoleophobic layer on the performance of a partially textured bearing, and on the prevention of cavitation, was explored. The Navier slip model was used to describe the behavior of the superoleophobic layer. A modified Reynolds equation with mass-conserving boundary conditions was derived, with the pressure distribution of particular interest. The lubrication equations were discretized using a finite volume method and solved with a tri-diagonal matrix algorithm. The calculations showed that introducing the superoleophobic layer at the leading edge of the contact prevents the occurrence of cavitation and thus increases the hydrodynamic pressure. For deeper textures, however, a deterioration of the load support was noted. These findings may have useful implications for extending the lifetime of textured bearings.
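
    The tri-diagonal matrix algorithm named in the abstract (the Thomas algorithm) is the standard direct solver for the banded systems produced by a finite-volume discretisation of the Reynolds equation. A generic sketch, not the authors' code:

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system A x = d, where a is the sub-diagonal
    (a[0] unused), b the main diagonal, and c the super-diagonal
    (c[-1] unused). Classic forward-elimination / back-substitution."""
    n = len(d)
    cp = [0.0] * n  # modified super-diagonal
    dp = [0.0] * n  # modified right-hand side
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Toy system: tridiag(-1, 2, -1) x = [1, 0, 1] has solution x = [1, 1, 1].
x = thomas_solve([0.0, -1.0, -1.0], [2.0, 2.0, 2.0], [-1.0, -1.0, 0.0],
                 [1.0, 0.0, 1.0])
print(x)
```

    In a lubrication code this solve would sit inside an outer iteration that enforces the mass-conserving cavitation condition at each sweep.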

  8. Visual methodologies and participatory action research: Performing women's community-based health promotion in post-Katrina New Orleans.

    Science.gov (United States)

    Lykes, M Brinton; Scheib, Holly

    2016-01-01

    Recovery from disaster and displacement involves multiple challenges including accompanying survivors, documenting effects, and rethreading community. This paper demonstrates how African-American and Latina community health promoters and white university-based researchers engaged visual methodologies and participatory action research (photoPAR) as resources in cross-community praxis in the wake of Hurricane Katrina and the flooding of New Orleans. Visual techniques, including but not limited to photonarratives, facilitated the health promoters': (1) care for themselves and each other as survivors of and responders to the post-disaster context; (2) critical interrogation of New Orleans' entrenched pre- and post-Katrina structural racism as contributing to the racialised effects of and responses to Katrina; and (3) meaning-making and performances of women's community-based, cross-community health promotion within this post-disaster context. This feminist antiracist participatory action research project demonstrates how visual methodologies contributed to the co-researchers' cross-community self- and other caring, critical bifocality, and collaborative construction of a contextually and culturally responsive model for women's community-based health promotion post 'unnatural disaster'. Selected limitations as well as the potential for future cross-community antiracist feminist photoPAR in post-disaster contexts are discussed.

  9. Environmental performance of gasified willow from different lands including land-use changes

    DEFF Research Database (Denmark)

    Saez de Bikuna Salinas, Koldo; Hauschild, Michael Zwicky; Pilegaard, Kim

    2017-01-01

    A life-cycle assessment (LCA) of a low-input, short rotation coppice (SRC) willow grown on different Danish lands was performed. Woodchips are gasified, producer gas is used for co-generation of heat and power (CHP) and the ash-char output is applied as soil amendment in the field. A hybrid model...... for abandoned farmland, as a relative C stock loss compared to natural regeneration. ILUC results show that area related GHG emissions are dominant (93% of iLUCfood and 80% of iLUCfeed), transformation being more important (82% of iLUCfood) than occupation (11%) impacts. LCA results show that CHP from willow...

  10. Evaluation of frother performance in coal flotation: A critical review of existing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharya, S.; Dey, S. [Indian School of Mines, Dhanbad (India). Dept. of Fuel & Mineral Engineering

    2008-07-01

    Separation efficiency in flotation depends, to a considerable extent, on the efficiency of the frother used. A successful frother must achieve a delicate balance between froth stability and non-persistency. Ideally, the frother is not supposed to influence the state of the surface of the coal and minerals. In practice, however, interaction does occur between the frother, other reagents, and solid surfaces. Various commercially available frothers can differ slightly or significantly in their influence on the flotation results. Therefore, a plant operator is in a dilemma when it comes to selecting a frother to be used in his plant. This article attempts to critically review the different methodologies, which are available to compare the performance of two or more frothers in order to decide which would best serve the purpose of the plant operator.

  11. A methodology for performing virtual measurements in a nuclear reactor system

    International Nuclear Information System (INIS)

    Ikonomopoulos, A.; Uhrig, R.E.; Tsoukalas, L.H.

    1992-01-01

    A novel methodology is presented for monitoring nonphysically measurable variables in an experimental nuclear reactor. It is based on the employment of artificial neural networks to generate fuzzy values. Neural networks map spatiotemporal information (in the form of time series) to algebraically defined membership functions. The entire process can be thought of as a virtual measurement. Through such virtual measurements the values of nondirectly monitored parameters with operational significance, e.g., transient-type, valve-position, or performance, can be determined. Generating membership functions is a crucial step in the development and practical utilization of fuzzy reasoning, a computational approach that offers the advantage of describing the state of the system in a condensed, linguistic form, convenient for monitoring, diagnostics, and control algorithms
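
    The mapping the abstract describes, from a window of sensor readings to a fuzzy membership function over a linguistic variable, can be caricatured as follows. The triangular membership form, the linguistic labels, and the simple averaging "feature" are illustrative stand-ins for the trained neural networks used in the paper:

```python
def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# A window of recent sensor readings (the spatiotemporal input).
window = [0.42, 0.45, 0.47, 0.44, 0.46]
feature = sum(window) / len(window)  # stand-in for the network's output

# Hypothetical linguistic variable: valve position.
labels = {
    "closed":    (0.0, 0.0, 0.4),
    "half-open": (0.2, 0.5, 0.8),
    "open":      (0.6, 1.0, 1.0),
}
memberships = {name: round(triangular(feature, *p), 3)
               for name, p in labels.items()}
print(memberships)
```

    The resulting membership vector is the "virtual measurement": a condensed, linguistic description of the plant state suitable for monitoring and diagnostic logic.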

  12. Methodology for Selection of Economic Performance Factors in the Area of Information and Communication Activities

    Directory of Open Access Journals (Sweden)

    Jana Hornungová

    2015-01-01

    The article presents one part of the author's research on business performance. The aim of this paper is to identify and introduce the economic factors of a corporate performance system; these are an important part of performance because they help an organization define and measure progress toward its goals. The aim also included the determination of Key Performance Indicators (KPIs). The first step in evaluating performance is the projective approach, which holds that the future development of the company can be inferred from current and ongoing activities. In this view, economic indicators are fundamental to the performance scale. To identify these factors, theoretical information from the area of KPIs and data from primary research were used. These data were tested through mathematical-statistical analysis, in this case factor analysis.
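
    The factor-selection step can be sketched with a principal-factor style analysis: eigen-decomposition of the indicator correlation matrix with the Kaiser criterion (retain factors whose eigenvalue exceeds 1). The synthetic indicator data and the retention rule are illustrative assumptions, not the study's actual dataset or criterion:

```python
import numpy as np

# Synthetic firm-level indicators: two driven by a common latent factor
# (think ROA and ROE), one unrelated.
rng = np.random.default_rng(0)
base = rng.normal(size=(200, 1))
indicators = np.hstack([
    base + 0.3 * rng.normal(size=(200, 1)),  # e.g. ROA (hypothetical)
    base + 0.3 * rng.normal(size=(200, 1)),  # e.g. ROE (hypothetical)
    rng.normal(size=(200, 1)),               # unrelated indicator
])

corr = np.corrcoef(indicators, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]     # eigenvalues, descending
n_factors = int((eigvals > 1.0).sum())       # Kaiser criterion
print("eigenvalues:", np.round(eigvals, 2), "-> retain", n_factors, "factor(s)")
```

    The dominant eigenvalue flags the shared performance factor behind the correlated indicators; a full analysis would then rotate the loadings to interpret it.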

  13. Mechanical–biological treatment: Performance and potentials. An LCA of 8 MBT plants including waste characterization

    DEFF Research Database (Denmark)

    Montejo, Cristina; Tonini, Davide; Márquez, María del Carmen

    2013-01-01

    recovery through increased automation of the selection and to prioritize biogas-electricity production from the organic fraction over direct composting. The optimal strategy for refuse derived fuel (RDF) management depends upon the environmental compartment to be prioritized and the type of marginal...... of the MBT plants. These widely differed in type of biological treatment and recovery efficiencies. The results indicated that the performance is strongly connected with energy and materials recovery efficiency. The recommendation for upgrading and/or commissioning of future plants is to optimize materials...... electricity source in the system. It was estimated that, overall, up to ca. 180—190 kt CO2-eq. y−1 may be saved by optimizing the MBT plants under assessment....

  14. Thermal enhanced vapor extraction systems: Design, application and performance prediction including contaminant behavior

    International Nuclear Information System (INIS)

    Phelan, J.M.; Webb, S.W.

    1994-01-01

    Soil heating technologies have been proposed as a method to accelerate contaminant removal from subsurface soils. These methods include the use of hot air, steam, conductive heaters, in-situ resistive heating, and in-situ radiofrequency heating (Buettner et al., EPA, Dev et al., Heath et al.). The criteria for selecting a particular soil heating technology are a complex function of contaminant and soil properties and of efficiency in energy delivery and contaminant removal. The work presented here seeks to expand the understanding of the interactions of subsurface water, contaminant, heat, and vacuum extraction through model predictions and field data collection. The field demonstration will combine two soil heating technologies (resistive and dielectric) with a vacuum vapor extraction system and will occur during the summer of 1994

  15. Study on dynamic team performance evaluation methodology based on team situation awareness model

    International Nuclear Information System (INIS)

    Kim, Suk Chul

    2005-02-01

    The purpose of this thesis is to provide a theoretical framework, and an associated evaluation methodology, for the dynamic task performance of an operating team at a nuclear power plant under dynamic and tactical conditions such as a radiological accident. The thesis proposes a team dynamic task performance evaluation model, the so-called team crystallization model, which stems from Endsley's situation awareness model and comprises four elements: state, information, organization, and orientation. Its quantification methods use a system dynamics approach and a communication process model based on receding horizon control. The team crystallization model is a holistic approach to evaluating team dynamic task performance in conjunction with team situation awareness, considering both physical system dynamics and team behavioral dynamics for a tactical, dynamic task at a nuclear power plant. The model provides a systematic measure of time-dependent team effectiveness or performance as affected by multiple factors: plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope simulator of a 1,000 MWe pressurized water reactor, with four on-the-job operating groups and one expert group familiar with accident sequences. Simulated team dynamic task performance, together with the behavior of key plant parameters and the team-specific organizational center of gravity and cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for new plants in a cost-effective manner. The model can also be utilized as a systematic analysis tool for

  16. Study on dynamic team performance evaluation methodology based on team situation awareness model

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Suk Chul

    2005-02-15

    The purpose of this thesis is to provide a theoretical framework, and an associated evaluation methodology, for the dynamic task performance of an operating team at a nuclear power plant under dynamic and tactical conditions such as a radiological accident. The thesis proposes a team dynamic task performance evaluation model, the so-called team crystallization model, which stems from Endsley's situation awareness model and comprises four elements: state, information, organization, and orientation. Its quantification methods use a system dynamics approach and a communication process model based on receding horizon control. The team crystallization model is a holistic approach to evaluating team dynamic task performance in conjunction with team situation awareness, considering both physical system dynamics and team behavioral dynamics for a tactical, dynamic task at a nuclear power plant. The model provides a systematic measure of time-dependent team effectiveness or performance as affected by multiple factors: plant states, communication quality in terms of transferring situation-specific information and strategies for achieving the team task goal at a given time, and organizational factors. To demonstrate the applicability of the proposed model and its quantification method, a case study was carried out using data obtained from a full-scope simulator of a 1,000 MWe pressurized water reactor, with four on-the-job operating groups and one expert group familiar with accident sequences. Simulated team dynamic task performance, together with the behavior of key plant parameters and the team-specific organizational center of gravity and cue-and-response matrix, showed good agreement with observed values. The team crystallization model will be a useful and effective tool for evaluating team effectiveness, for example when recruiting new operating teams for new plants in a cost-effective manner. The model can also be utilized as a systematic analysis tool for

  17. Performance of cement solidification with barium for high activity liquid waste including sulphate

    International Nuclear Information System (INIS)

    Waki, Toshikazu; Yamada, Motoyuki; Horikawa, Yoshihiko; Kaneko, Masaaki; Saso, Michitaka; Haruguchi, Yoshiko; Yamashita, Yu; Sakai, Hitoshi

    2009-01-01

    The target liquid waste to be solidified is generated from the treatment of PWR primary-loop spent resin with sulphuric acid; its main constituent is therefore sodium sulphate, and its activity is relatively high. The waste form of this liquid waste is considered a candidate for subsurface disposal. Disposed waste containing sulphate is anticipated to raise the concentration of sulphate ions in the groundwater around the disposal facility, which may degrade materials such as the cement and bentonite layers and compromise the facility. There are two approaches to avoiding this problem: a robust design of the disposal facility, and minimization of sulphate ion migration from the solidified waste. In this study, the latter approach was examined. To keep the concentration of sulphate ions in the groundwater low, it is effective to form barium sulphate by adding a barium compound to the liquid waste during solidification. However, adding an amount of barium compound equivalent to the sulphate ion content makes mixing difficult, because the production of barium sulphate causes high viscosity. In this study, the mixing conditions before and after adding cement to the liquid waste were evaluated. The mixing conditions were set to keep the anion concentration in the groundwater low while remaining easy enough to mix in practical operation. The long-term leaching behavior of the simulated solidified waste was also analyzed with PHREEQC, and the concentrations of the constituents affecting the disposal facility were estimated to be low enough in the groundwater. (author)

  18. Performance-based, cost- and time-effective PCB analytical methodology

    International Nuclear Information System (INIS)

    Alvarado, J. S.

    1998-01-01

    Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently with low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval

  19. Development of performance assessment methodology for nuclear waste isolation in geologic media

    International Nuclear Information System (INIS)

    Bonano, E.J.; Chu, M.S.Y.; Cranwell, R.M.; Davis, P.A.

    1985-01-01

    The burial of nuclear wastes in deep geologic formations as a means for their disposal is an issue of significant technical and social impact. The analysis of the processes involved can be performed only with reliable mathematical models and computer codes as opposed to conducting experiments because the time scales associated are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium. These are ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the US Nuclear Regulatory Commission. The approach followed consists of a description of the overall system (waste, facility, and site), scenario selection and screening, consequence modeling (source term, ground-water flow, radionuclide transport, biosphere transport, and health effects), and uncertainty and sensitivity analysis
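
    The uncertainty-analysis stage of such a methodology can be illustrated with a toy Monte Carlo propagation. The parameter names, distributions, and one-line travel-time model below are assumptions for illustration only, not values from the report:

```python
import random

# Toy Monte Carlo sketch: sample uncertain site parameters and propagate
# them through a simple ground-water travel-time model.
random.seed(42)
PATH_LENGTH_M = 5000.0  # repository-to-biosphere path length (assumed)
GRADIENT = 0.01         # hydraulic gradient (assumed)
POROSITY = 0.1          # effective porosity (assumed)

times_years = []
for _ in range(10_000):
    k = random.lognormvariate(-14, 1.0)    # hydraulic conductivity (m/s)
    retardation = random.uniform(10, 100)  # sorption retardation factor
    velocity = k * GRADIENT / POROSITY     # Darcy flux -> pore velocity
    travel_time_s = PATH_LENGTH_M * retardation / velocity
    times_years.append(travel_time_s / (3600 * 24 * 365.25))

times_years.sort()
print(f"median travel time ~ {times_years[len(times_years) // 2]:.2e} years")
```

    Sorting the sampled outcomes gives the percentile curves that a performance assessment compares against regulatory time horizons; a sensitivity analysis would then rank which sampled parameter drives the spread.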

  20. Development of performance assessment methodology for nuclear waste isolation in geologic media

    Science.gov (United States)

    Bonano, E. J.; Chu, M. S. Y.; Cranwell, R. M.; Davis, P. A.

    The burial of nuclear wastes in deep geologic formations as a means for their disposal is an issue of significant technical and social impact. The analysis of the processes involved can be performed only with reliable mathematical models and computer codes as opposed to conducting experiments because the time scales associated are on the order of tens of thousands of years. These analyses are concerned primarily with the migration of radioactive contaminants from the repository to the environment accessible to humans. Modeling of this phenomenon depends on a large number of other phenomena taking place in the geologic porous and/or fractured medium. These are ground-water flow, physicochemical interactions of the contaminants with the rock, heat transfer, and mass transport. Once the radionuclides have reached the accessible environment, the pathways to humans and health effects are estimated. A performance assessment methodology for a potential high-level waste repository emplaced in a basalt formation has been developed for the U.S. Nuclear Regulatory Commission.

  1. Methodologies for assessing long-term performance of high-level radioactive waste packages

    International Nuclear Information System (INIS)

    Stephens, K.; Boesch, L.; Crane, B.; Johnson, R.; Moler, R.; Smith, S.; Zaremba, L.

    1986-01-01

    Several methods the Nuclear Regulatory Commission (NRC) can use to independently assess Department of Energy (DOE) waste package performance were identified by The Aerospace Corporation. The report includes an overview of the necessary attributes of performance assessment, followed by discussions of DOE methods, probabilistic methods capable of predicting waste package lifetime and radionuclide releases, process modeling of waste package barriers, sufficiency of the necessary input data, and the applicability of probability density functions. It is recommended that the initial NRC performance assessment (for the basalt conceptual waste package design) should apply modular simulation, using available process models and data, to demonstrate this assessment method

  2. Risk Prediction Models for Incident Heart Failure: A Systematic Review of Methodology and Model Performance.

    Science.gov (United States)

    Sahle, Berhe W; Owen, Alice J; Chin, Ken Lee; Reid, Christopher M

    2017-09-01

    Numerous models predicting the risk of incident heart failure (HF) have been developed; however, evidence of their methodological rigor and reporting remains unclear. This study critically appraises the methods underpinning incident HF risk prediction models. EMBASE and PubMed were searched for articles published between 1990 and June 2016 that reported at least 1 multivariable model for prediction of HF. Model development information, including study design, variable coding, missing data, and predictor selection, was extracted. Nineteen studies reporting 40 risk prediction models were included. Existing models have acceptable discriminative ability (C-statistics > 0.70), although only 6 models were externally validated. Candidate variable selection was based on statistical significance from a univariate screening in 11 models, whereas it was unclear in 12 models. Continuous predictors were retained in 16 models, whereas it was unclear how continuous variables were handled in 16 models. Missing values were excluded in 19 of 23 models that reported missing data, and the number of events per variable fell below recommended levels in several models. Only 2 models presented recommended regression equations. There was significant heterogeneity in the discriminative ability of models with respect to age. This review identified numerous risk prediction models with sufficient discriminative ability, although few are externally validated. Methods not recommended for the conduct and reporting of risk prediction modeling were frequently used, and the resulting algorithms should be applied with caution. Copyright © 2017 Elsevier Inc. All rights reserved.
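
    The C-statistic used above as the benchmark of discriminative ability is simply a concordance probability: the fraction of event/non-event pairs in which the event case received the higher predicted risk, with ties counted as one half. A minimal sketch with invented data:

```python
def c_statistic(risks, outcomes):
    """Concordance probability (C-statistic / AUC) by pairwise comparison.
    risks: predicted risk per subject; outcomes: 1 = event, 0 = no event."""
    pairs = concordant = 0.0
    for ri, oi in zip(risks, outcomes):
        for rj, oj in zip(risks, outcomes):
            if oi == 1 and oj == 0:       # one event/non-event pair
                pairs += 1
                if ri > rj:
                    concordant += 1
                elif ri == rj:
                    concordant += 0.5     # ties count one half
    return concordant / pairs

# Invented example: 2 events, 3 non-events.
risks    = [0.9, 0.8, 0.7, 0.3, 0.2]
outcomes = [1,   0,   1,   0,   0]
print(c_statistic(risks, outcomes))
```

    The O(n²) pairwise form shown here is the definition; production code typically uses a rank-based computation for large cohorts.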

  3. Methodological guide for the follow-up and elaboration of performance assessments of a methanization plant

    International Nuclear Information System (INIS)

    Bastide, Guillaume

    2014-06-01

    This guide aims at giving the indications required for a good implementation of an operational follow-up of a methanization plant. More precisely, it aims at specifying the equipment necessary for follow-up during installation construction, at preparing the operator for the follow-up and control of the installation, at elaborating operation assessments and performance interpretations, and at proposing solutions and/or improvements. The described follow-up process can be applied to all the process stages (from receipt to by-product valorization), and addresses technical as well as economic aspects. Thus, four types of assessments are made: technical, energetic, environmental, and socio-economic. This guide comprises five parts: a presentation of follow-up objectives (information to be looked for, benefits and drawbacks, follow-up level to be implemented), the follow-up methodology, follow-up assessments (what they are and how to elaborate them), practical sheets (practical presentation of techniques, typical Excel spreadsheets), and a glossary which explains the main technical terms

  4. A High-Performance Embedded Hybrid Methodology for Uncertainty Quantification With Applications

    Energy Technology Data Exchange (ETDEWEB)

    Iaccarino, Gianluca

    2014-04-01

    Multiphysics processes modeled by a system of unsteady differential equations are naturally suited for partitioned (modular) solution strategies. We consider such a model where probabilistic uncertainties are present in each module of the system and represented as a set of random input parameters. A straightforward approach to quantifying uncertainties in the predicted solution would be to sample all the input parameters into a single set, and treat the full system as a black box. Although this method is easily parallelizable and requires minimal modifications to a deterministic solver, it is blind to the modular structure of the underlying multiphysical model. On the other hand, spectral representations using polynomial chaos expansions (PCE) can provide richer structural information regarding the dynamics of these uncertainties as they propagate from the inputs to the predicted output, but can be prohibitively expensive to implement in the high-dimensional global space of uncertain parameters. Therefore, we investigated hybrid methodologies wherein each module has the flexibility of using sampling- or PCE-based methods for capturing local uncertainties while maintaining accuracy in the global uncertainty analysis. For the latter case, we use a conditional PCE model which mitigates the curse of dimension associated with intrusive Galerkin or semi-intrusive pseudospectral methods. After formalizing the theoretical framework, we demonstrate our proposed method using a numerical viscous flow simulation and benchmark the performance against a solely Monte Carlo method and a solely spectral method.
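
    The contrast the abstract draws, black-box Monte Carlo sampling versus a spectral PCE representation, can be sketched on a one-dimensional toy model. The test function f(x) = exp(x) with x ~ N(0,1) and the 5-point quadrature are illustrative choices, not the paper's flow problem:

```python
import math
import random

f = math.exp           # toy "module" response; exact mean is exp(1/2)
random.seed(1)

# --- Route 1: Monte Carlo, treating the model as a black box ---
n = 100_000
mc_mean = sum(f(random.gauss(0, 1)) for _ in range(n)) / n

# --- Route 2: non-intrusive PCE via Gauss-Hermite quadrature ---
# 5-point Gauss-Hermite nodes/weights (weight exp(-t^2)), rescaled for N(0,1).
nodes = [-2.0201828705, -0.9585724646, 0.0, 0.9585724646, 2.0201828705]
weights = [0.0199532421, 0.3936193232, 0.9453087205, 0.3936193232, 0.0199532421]
sqrt2, sqrtpi = math.sqrt(2), math.sqrt(math.pi)

def he(k, x):  # probabilists' Hermite polynomials He_0..He_3
    return [1.0, x, x * x - 1.0, x ** 3 - 3.0 * x][k]

coeffs = []
for k in range(4):
    norm = math.factorial(k)  # E[He_k^2] under N(0,1)
    integral = sum(w * f(sqrt2 * t) * he(k, sqrt2 * t)
                   for t, w in zip(nodes, weights))
    coeffs.append(integral / (sqrtpi * norm))

pce_mean = coeffs[0]  # mean of the expansion is the He_0 coefficient
print(f"MC mean ~ {mc_mean:.4f}, PCE mean ~ {pce_mean:.4f}, "
      f"exact {math.exp(0.5):.4f}")
```

    The spectral route reaches quadrature-level accuracy with 5 model evaluations against 100,000 samples, which is the efficiency argument for PCE; its cost explodes with input dimension, which is the motivation for the hybrid approach.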

  5. Experience and benefits from using the EPRI MOV Performance Prediction Methodology in nuclear power plants

    International Nuclear Information System (INIS)

    Walker, T.; Damerell, P.S.

    1999-01-01

    The EPRI MOV Performance Prediction Methodology (PPM) is an effective tool for evaluating design basis thrust and torque requirements for MOVs. Use of the PPM has become more widespread in US nuclear power plants as they close out their Generic Letter (GL) 89-10 programs and address MOV periodic verification per GL 96-05. The PPM has also been used at plants outside the US, many of which are implementing programs similar to US plants' GL 89-10 programs. The USNRC Safety Evaluation of the PPM and the USNRC's discussion of the PPM in GL 96-05 make the PPM an attractive alternative to differential pressure (DP) testing, which can be costly and time-consuming. Significant experience and benefits, which are summarized in this paper, have been gained using the PPM. Although use of PPM requires a commitment of resources, the benefits of a solidly justified approach and a reduced need for DP testing provide a substantial safety and economic benefit. (author)

  6. Towards better environmental performance of wastewater sludge treatment using endpoint approach in LCA methodology

    Directory of Open Access Journals (Sweden)

    Isam Alyaseri

    2017-03-01

    The aim of this study is to use the life cycle assessment (LCA) method to measure the environmental performance of the sludge incineration process in a wastewater treatment plant and to propose an alternative that can reduce the environmental impact. To show the damages caused by the treatment processes, the study used an endpoint approach to evaluate impacts on human health, ecosystem quality, and resources. A case study was conducted at the Bissell Point Wastewater Treatment Plant in Saint Louis, Missouri, U.S. Plant-specific data along with literature data from technical publications were used to build an inventory and analyze the environmental burdens of the sludge handling unit in the year 2011. The impact assessment method chosen was ReCiPe 2008. The existing scenario (dewatering, multiple hearth incineration, ash to landfill) was evaluated, and three alternative scenarios (fluid bed incineration, and anaerobic digestion with and without land application, with energy recovery from heat or biogas) were proposed and analyzed to find the one with the least environmental impact. For the existing scenario, the most significant impacts relate to depletion of resources and damage to human health. These impacts come mainly from the operation phase (electricity and fuel consumption) and emissions related to combustion. The alternatives showed better performance than the existing scenario. Using the ReCiPe endpoint methodology, and among the three alternatives tested, anaerobic digestion had the best overall environmental performance. It is recommended to convert to fluid bed incineration if the concerns are mainly about human health, or to anaerobic digestion if the concerns are mainly about depletion of resources. The endpoint approach may simplify the outcomes of this study as follows: if the plant is converted to fluid bed incineration, it could prevent an average of 43.2 DALYs in human life, save 0.059 species in the area

  7. Discovering the Effects-Endstate Linkage: Using Soft Systems Methodology to Perform EBO Mission Analysis

    National Research Council Canada - National Science Library

    Young, Jr, William E

    2005-01-01

    .... EBO mission analysis is shown to be more problem structuring than problem solving. A new mission analysis process is proposed using a modified version of Soft Systems Methodology to meet these challenges...

  8. Computational Methodologies for Developing Structure–Morphology–Performance Relationships in Organic Solar Cells: A Protocol Review

    KAUST Repository

    Do, Khanh; Ravva, Mahesh Kumar; Wang, Tonghui; Bredas, Jean-Luc

    2016-01-01

    We outline a step-by-step protocol that incorporates a number of theoretical and computational methodologies to evaluate the structural and electronic properties of pi-conjugated semiconducting materials in the condensed phase. Our focus

  9. Measuring performance in off-patent drug markets: a methodological framework and empirical evidence from twelve EU Member States.

    Science.gov (United States)

    Kanavos, Panos

    2014-11-01

    This paper develops a methodological framework to help evaluate the performance of generic pharmaceutical policies post-patent expiry or after loss of exclusivity in non-tendering settings, comprising five indicators (generic availability, time delay to and speed of generic entry, number of generic competitors, price developments, and generic volume share evolution) and proposes a series of metrics to evaluate performance. The paper subsequently tests this framework across twelve EU Member States (MS) by using IMS data on 101 patent expired molecules over the 1998-2010 period. Results indicate that significant variation exists in generic market entry, price competition and generic penetration across the study countries. Size of a geographical market is not a predictor of generic market entry intensity or price decline. Regardless of geographic or product market size, many off patent molecules lack generic competitors two years after loss of exclusivity. The ranges in each of the five proposed indicators suggest, first, that there are numerous factors--including institutional ones--contributing to the success of generic entry, price decline and market penetration and, second, MS should seek a combination of supply and demand-side policies in order to maximise cost-savings from generics. Overall, there seems to be considerable potential for faster generic entry, uptake and greater generic competition, particularly for molecules at the lower end of the market. Copyright © 2014. Published by Elsevier Ireland Ltd.
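
    Two of the five indicators in the framework, price development and generic volume share, reduce to simple ratios over time since loss of exclusivity. A sketch with invented molecule-level data (not from the IMS dataset):

```python
# Hypothetical post-expiry trajectory for a single molecule.
months = [0, 6, 12, 18, 24]                  # months since loss of exclusivity
avg_price = [1.00, 0.85, 0.62, 0.55, 0.48]   # volume-weighted price, indexed
generic_units = [0, 120, 340, 450, 520]      # generic units dispensed
total_units = [1000, 1000, 1050, 1100, 1150] # all units for the molecule

# Generic volume share per period, in percent.
shares = [100.0 * g / t for g, t in zip(generic_units, total_units)]

for m, p, s in zip(months, avg_price, shares):
    print(f"month {m:>2}: price index {p:.2f}, generic volume share {s:5.1f}%")
```

    Cross-country comparison in the paper's spirit would tabulate these series per molecule and Member State and then compare medians and ranges at fixed horizons (e.g. 12 and 24 months after exclusivity loss).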

  10. Association between functional performance and executive cognitive functions in an elderly population including patients with low ankle–brachial index

    Science.gov (United States)

    Ferreira, Naomi Vidal; Cunha, Paulo Jannuzzi; da Costa, Danielle Irigoyen; dos Santos, Fernando; Costa, Fernando Oliveira; Consolim-Colombo, Fernanda; Irigoyen, Maria Cláudia

    2015-01-01

    Introduction Peripheral arterial disease, as measured by the ankle–brachial index (ABI), is prevalent among the elderly, and is associated with functional performance, assessed by the 6-minute walk test (6MWT). Executive cognitive function (ECF) impairments are also prevalent in this population, but no existing study has investigated the association between ECF and functional performance in an elderly population including individuals with low ABI. Aim To investigate the association between functional performance, as measured by the 6MWT, and loss in ECF, in an elderly sample including individuals with low ABI. Method The ABI group was formed by 26 elderly individuals with low ABI (mean ABI: 0.63±0.19), and the control group was formed by 40 elderly individuals with normal ABI (mean ABI: 1.08±0.07). We analyzed functional performance using the 6MWT, global cognition using the Mini-Mental State Examination (MMSE), and ECF using the Digit Span for assessing attention span and working memory, the Stroop Color Word Test (SCWT) for assessing information processing speed and inhibitory control/impulsivity, and the Controlled Oral Word Association Test (COWAT) for assessing semantic verbal fluency and phonemic verbal fluency. We also used a factor analysis on all of the ECF tests (global ECF). Results Before adjustment, the ABI group performed worse on global cognition, attention span, working memory, inhibitory control/impulsivity, semantic verbal fluency, and phonemic verbal fluency. After adjustment, the ABI group performance remained worse for working memory and semantic verbal fluency. In a simple correlation analysis including all of the subjects, the 6MWT was associated with global cognition, attention span, working memory, information processing speed, inhibitory control/impulsivity, semantic verbal fluency, and global ECF. After adjustment, all the associations remained statistically significant. Conclusion This study found an independent association between

  11. Bridging the gap between neurocognitive processing theory and performance validity assessment among the cognitively impaired: a review and methodological approach.

    Science.gov (United States)

    Leighton, Angela; Weinborn, Michael; Maybery, Murray

    2014-10-01

    Bigler (2012) and Larrabee (2012) recently addressed the state of the science surrounding performance validity tests (PVTs) in a dialogue highlighting evidence for the valid and increased use of PVTs, but also for unresolved problems. Specifically, Bigler criticized the lack of guidance from neurocognitive processing theory in the PVT literature. For example, individual PVTs have applied the simultaneous forced-choice methodology using a variety of test characteristics (e.g., word vs. picture stimuli) with known neurocognitive processing implications (e.g., the "picture superiority effect"). However, the influence of such variations on classification accuracy has been inadequately evaluated, particularly among cognitively impaired individuals. The current review places the PVT literature in the context of neurocognitive processing theory, and identifies potential methodological factors to account for the significant variability we identified in classification accuracy across current PVTs. We subsequently evaluated the utility of a well-known cognitive manipulation to provide a Clinical Analogue Methodology (CAM), that is, to alter the PVT performance of healthy individuals to be similar to that of a cognitively impaired group. Initial support was found, suggesting the CAM may be useful alongside other approaches (analogue malingering methodology) for the systematic evaluation of PVTs, particularly the influence of specific neurocognitive processing components on performance.

  12. Combining soft system methodology and pareto analysis in safety management performance assessment : an aviation case

    NARCIS (Netherlands)

    Karanikas, Nektarios

    2016-01-01

    Although reengineering is strategically advantageous for organisations in order to remain functional and sustainable, safety must remain a priority and the respective efforts need to be maintained. This paper suggests the combination of soft system methodology (SSM) and Pareto analysis on the scope of
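As a hedged sketch of the Pareto-analysis half of the proposed combination, the snippet below ranks safety-finding categories by frequency and isolates the "vital few" that account for roughly 80% of all findings; the category names and counts are invented for illustration, not data from the paper.

```python
# Pareto analysis sketch: rank hypothetical safety-finding categories and
# keep those accounting for ~80% of the total (the "vital few").

def pareto(counts, threshold=0.8):
    total = sum(counts.values())
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    vital, cumulative = [], 0.0
    for category, n in ranked:
        cumulative += n / total
        vital.append((category, n, round(cumulative, 3)))
        if cumulative >= threshold:
            break
    return vital

# Illustrative finding counts only.
findings = {"procedures": 41, "training": 28, "communication": 17,
            "equipment": 9, "documentation": 3, "other": 2}
for row in pareto(findings):
    print(row)
```

With these invented counts, the first three categories already cross the 80% line, which is the kind of prioritisation signal a safety manager would act on.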

  13. A multimedia exposure assessment methodology for evaluating the performance of the design of structures containing chemical and radioactive wastes

    International Nuclear Information System (INIS)

    Stephanatos, B.N.; Molholt, B.; Walter, K.P.; MacGregor, A.

    1991-01-01

    The objectives of this work are to develop a multimedia exposure assessment methodology for the evaluation of existing and future design of structures containing chemical and radioactive wastes and to identify critical parameters for design optimization. The designs are evaluated in terms of their compliance with various federal and state regulatory requirements. Evaluation of the performance of a particular design is presented within the scope of a given exposure pathway. An exposure pathway has four key components: (1) a source and mechanism of chemical release, (2) a transport medium; (3) a point of exposure; and (4) a route of exposure. The first step in the analysis is the characterization of the waste source behavior. The rate and concentration of releases from the source are evaluated using appropriate mathematical models. The migration of radionuclides and chemicals is simulated through each environmental medium to the exposure point. The total exposure to the potential receptor is calculated, and an estimate of the health effects of the exposure is made. Simulation of the movement of radionuclides and chemical wastes from the source to the receptor point includes several processes. If the predicted human exposure to contaminants meets the performance criteria, the design has been validated. Otherwise the structure design is improved to meet the performance criteria. A phased modeling approach is recommended at a particular mixed waste site. A relatively simple model is initially used to pinpoint critical fate and transport processes and design parameters. The second phase of the modeling effort involves the use of more complex and resource intensive fate and transport models. This final step in the modeling process provides more accurate estimates of contaminant concentrations at the point of exposure. Thus the human dose is more accurately predicted, providing better design validation
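A minimal sketch of the pathway arithmetic described above, using the standard chronic-daily-intake (CDI) form common in exposure assessment; all parameter values below are illustrative placeholders, not values from this assessment.

```python
# Chronic daily intake and a simple pass/fail check against a reference dose.
# CDI (mg/kg-day) = (C * IR * EF * ED) / (BW * AT).

def chronic_daily_intake(conc, intake_rate, exp_freq, exp_dur, body_weight, avg_time):
    """conc mg/L, intake_rate L/day, exp_freq days/yr, exp_dur yr, body_weight kg, avg_time days."""
    return conc * intake_rate * exp_freq * exp_dur / (body_weight * avg_time)

def meets_criterion(cdi, reference_dose):
    """Design passes this pathway when the hazard quotient CDI/RfD <= 1."""
    return cdi / reference_dose <= 1.0

# Hypothetical drinking-water pathway: 0.005 mg/L, 2 L/day, 350 days/yr,
# 30 yr exposure, 70 kg adult, averaged over 30 yr (10950 days).
cdi = chronic_daily_intake(0.005, 2.0, 350, 30, 70.0, 30 * 365)
print(round(cdi, 6), meets_criterion(cdi, reference_dose=0.003))
```

In the phased approach the abstract describes, a screening calculation of this kind would precede the more resource-intensive fate and transport models.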

  14. The association between school-based physical activity, including physical education, and academic performance: a systematic review of the literature.

    Science.gov (United States)

    Rasberry, Catherine N; Lee, Sarah M; Robin, Leah; Laris, B A; Russell, Lisa A; Coyle, Karin K; Nihiser, Allison J

    2011-06-01

    The purpose of this review is to synthesize the scientific literature that has examined the association between school-based physical activity (including physical education) and academic performance (including indicators of cognitive skills and attitudes, academic behaviors, and academic achievement). Relevant research was identified through a search of nine electronic databases using both physical activity and academic-related search terms. Forty-three articles (reporting a total of 50 unique studies) met the inclusion criteria and were read, abstracted, and coded for this synthesis. Findings of the 50 studies were then summarized. Across all the studies, there were a total of 251 associations between physical activity and academic performance, representing measures of academic achievement, academic behavior, and cognitive skills and attitudes. Slightly more than half (50.5%) of all associations examined were positive, 48% were not significant, and 1.5% were negative. Examination of the findings by each physical activity context provides insights regarding specific relationships. Results suggest physical activity is either positively related to academic performance or that there is not a demonstrated relationship between physical activity and academic performance. Results have important implications for both policy and schools. Copyright © 2010 Elsevier Inc. All rights reserved.

  15. Time, Non-representational Theory and the "Performative Turn"—Towards a New Methodology in Qualitative Social Research

    Directory of Open Access Journals (Sweden)

    Peter Dirksmeier

    2008-05-01

    Because of their constitution, performative techniques in qualitative social research must deal with a paradox: acting as performance takes place in the present, and it takes place just once. As a result, every representation of a performance, be it as text, discussion or film, refers to the past. Performative social research resolves this paradox by conceptualising performance as a kind of liminal phase of a ritual. Our thesis is that by simply outsourcing the problem of the present to the theory of ritual, performative techniques commit the logical mistake of genetic fallacy, i.e., the mistake of forgetting that the primary value or meaning of an event has no necessary connection with its genesis in history. Therefore, a new methodology for qualitative social research after the performative turn requires a theoretical position that does not fall back on causality as the temporal sequence of cause and effect, as maintained by ritual theory. In this essay we suggest a "non-representational theory" for this venture, and point out how a methodology for qualitative research could be constituted "after" the performative turn. URN: urn:nbn:de:0114-fqs0802558

  16. Squaring the Project Management Circle: Updating the Cost, Schedule, and Performance Methodology

    Science.gov (United States)

    2016-04-30

    …critical variable that must be addressed by project managers. The research methodology consists of a system-focused approach based on an extensive review. References cited include: Baccarini, D. (1996), "The concept of project complexity – a review"; doi.org/10.1016/j.ijproman.2007.01.004; and MIT Sloan Management Review, http://sloanreview.mit.edu/article/the-new-industrial-engineering-information-technology-and-business-process-redesign/

  17. Performance Evaluation and Measurement of the Organization in Strategic Analysis and Control: Methodological Aspects

    OpenAIRE

    Živan Ristić; Neđo Balaban

    2006-01-01

    Information acquired by measurement and evaluation is a necessary condition for good decision-making in strategic management. This work deals with: (a) methodological aspects of evaluation (kinds of evaluation, metaevaluation) and of measurement (the supposition of isomorphism in measurement, kinds and levels of measurement, errors in measurement, and the basic characteristics of measurement); (b) evaluation and measurement of the potential and accomplishments of the organization in Kaplan-Norton perspect...

  18. Clinical performance of an objective methodology to categorize tear film lipid layer patterns

    Science.gov (United States)

    Garcia-Resua, Carlos; Pena-Verdeal, Hugo; Giraldez, Maria J.; Yebra-Pimentel, Eva

    2017-08-01

    Purpose: To validate the performance of a new objective application designated iDEAS (Dry Eye Assessment System) to categorize different zones of lipid layer patterns (LLPs) in one image. Material and methods: Using the Tearscopeplus and a digital camera attached to a slit-lamp, 50 images were captured and analyzed by 4 experienced optometrists. In each image the observers outlined the tear film zones that they clearly identified as a specific LLP. Further, the categorization made by the 4 optometrists (called observers 1, 2, 3 and 4) was compared with the automatic system included in iDEAS (5th observer). Results: In general, observer 3 classified worse than all other observers (observers 1, 2, 4 and the automatic application; Wilcoxon test, 0.05). Furthermore, we obtained a set of photographs per LLP category for which all optometrists showed agreement by using the new tool. After examining them, we identified the most characteristic features of each LLP, enhancing the description of the patterns proposed by Guillon. Conclusions: The automatic application included in the iDEAS framework is able to provide zones similar to the annotations made by experienced optometrists. Thus, the manual process done by experts can be automated, with the benefit of being unaffected by subjective factors.

  19. The effects of overtime work and task complexity on the performance of nuclear plant operators: A proposed methodology

    International Nuclear Information System (INIS)

    Banks, W.W.; Potash, L.

    1985-01-01

    This document presents a very general methodology for determining the effect of overtime work and task complexity on operator performance in response to simulated out-of-limit nuclear plant conditions. The independent variables consist of three levels of overtime work and three levels of task complexity. Multiple dependent performance measures are proposed for use and discussion. Overtime work is operationally defined in terms of the number of hours worked by nuclear plant operators beyond the traditional 8 hours per shift. Task complexity is operationalized in terms of the number of operator tasks required to remedy a given anomalous plant condition and bring the plant back to a "within limits" or "normal" steady-state condition. The proposed methodology would employ a 2-factor repeated-measures design along with the analysis of variance (linear) model

  20. “How many sums can I do”? : Performative strategies and diffractive thinking as methodological tools for rethinking mathematical subjectivity

    OpenAIRE

    Palmer, Anna

    2011-01-01

    The aim of this article is to illustrate how the understanding of mathematical subjectivity changes when transiting theoretically and methodologically from a discursive and performative thinking, as suggested by Judith Butler (1990, 1993, 1997), to an agential realist and diffractive thinking, inspired by Karen Barad’s theories (2007, 2008). To show this I have examined narrative memory stories about mathematics written by students participating in Teacher Education maths courses. I pro...

  1. Using integrated control methodology to optimize energy performance for the guest rooms in UAE hospitality sector

    International Nuclear Information System (INIS)

    AlFaris, Fadi; Abu-Hijleh, Bassam; Abdul-Ameer, Alaa

    2016-01-01

    Highlights: • Energy efficiency in 4 and 5 star luxury hotels in the United Arab Emirates. • The normalized energy use index (EUI) ranges between 241.5 and 348.4 kWh/m²/year for post-2003 hotels. • The normalized energy use index (EUI) ranges between 348.4 and 511.1 kWh/m²/year for pre-2003 hotels. • Integrated HVAC and lighting control strategies can reduce total energy consumption by up to 31.5%. - Abstract: The hospitality sector is growing rapidly in the UAE, especially in Dubai, and as a result it contributes substantially to the UAE's carbon footprint. This research was conducted to measure, evaluate and increase energy efficiency in 4 and 5 star luxury hotels in the UAE. Energy benchmarking analysis was applied to the energy data of 19 hotel buildings to differentiate between usual and best practice of energy performance, and the normalized energy use index (EUI), in kWh/m²/year, was identified for the best, usual and poor practice hotels. For hotels constructed after 2003, the normalized EUI ranges from 241.5 kWh/m²/year or less (best practice) to more than 361.3 kWh/m²/year (poor practice), whereas hotels constructed before 2003 showed higher values, with a normalized EUI varying from 348.4 kWh/m²/year (best practice) to more than 511.1 kWh/m²/year. An integrated control strategy was employed to improve energy performance and assess the energy savings for the guestroom. This technique showed an overall energy performance improvement of 31.5% of the hotel's entire energy consumption, including electricity and gas. This reduction comprised 43.2% savings from the cooling system and 13.2% from the lighting system, due to the integrated control system installed in the guestrooms.
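The benchmarking arithmetic behind the abstract can be sketched as follows, using the EUI thresholds it quotes; the hotel floor area and annual consumption below are hypothetical.

```python
# EUI benchmarking sketch: normalize annual energy use by floor area and
# classify against the pre/post-2003 thresholds quoted in the abstract.

POST_2003 = (241.5, 361.3)  # (best-practice ceiling, poor-practice floor), kWh/m2/year
PRE_2003 = (348.4, 511.1)

def eui(annual_kwh, floor_area_m2):
    """Energy use index in kWh/m2/year."""
    return annual_kwh / floor_area_m2

def classify(eui_value, built_after_2003):
    best, poor = POST_2003 if built_after_2003 else PRE_2003
    if eui_value <= best:
        return "best practice"
    if eui_value > poor:
        return "poor practice"
    return "usual practice"

# Hypothetical hotel: 9 GWh/year over 30,000 m2 of floor area.
value = eui(9_000_000, 30_000)
print(round(value, 1), classify(value, built_after_2003=True))
```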

  2. Performance evaluation of CT measurements made on step gauges using statistical methodologies

    DEFF Research Database (Denmark)

    Angel, J.; De Chiffre, L.; Kruth, J.P.

    2015-01-01

    In this paper, a study is presented in which statistical methodologies were applied to evaluate the measurement of step gauges on an X-ray computed tomography (CT) system. In particular, the effects of step gauge material density and orientation were investigated. The step gauges consist of uni- and bidirectional lengths. By confirming the repeatability of measurements made on the test system, the number of required scans in the design of experiments (DOE) was reduced. The statistical model was checked using model adequacy principles; model adequacy checking is an important step in validating...

  3. Performance of neutron activation analysis in the evaluation of bismuth iodide purification methodology

    International Nuclear Information System (INIS)

    Armelin, Maria Jose A.; Ferraz, Caue de Mello; Hamada, Margarida M.

    2015-01-01

    Bismuth tri-iodide (BiI3) is an attractive material for use as a semiconductor. In this paper, BiI3 crystals have been grown by the vertical Bridgman technique using commercially available powder. The impurities were evaluated by instrumental neutron activation analysis (INAA). The results show that INAA is an analytical method appropriate for monitoring the impurities Ag, As, Br, Cr, K, Mo, Na and Sb in the various stages of the BiI3 purification methodology. (author)

  4. Performance Assessment of the Pico OWC Power Plant Following the Equimar Methodology

    DEFF Research Database (Denmark)

    Pecher, Arthur; Crom, I. Le; Kofoed, Jens Peter

    2011-01-01

    This paper presents the power performance of the Oscillating Water Column (OWC) wave energy converter installed on the Island of Pico. The performance assessment of the device is based on real performance data gathered over the last years during normal operation. In addition to the estimation...

  5. Analytical and Experimental Performance Evaluation of BLE Neighbor Discovery Process Including Non-Idealities of Real Chipsets

    Directory of Open Access Journals (Sweden)

    David Perez-Diaz de Cerio

    2017-03-01

    The purpose of this paper is to evaluate, from a real-world perspective, the performance of Bluetooth Low Energy (BLE) as a technology that enables fast and reliable discovery of a large number of users/devices in a short period of time. The BLE standard specifies a wide range of configurable parameter values that determine the discovery process and need to be set according to the particular application requirements. Many previous works have investigated the discovery process through analytical and simulation models based on the ideal specification of the standard. However, measurements show that additional scanning gaps appear in the scanning process, which reduce the discovery capabilities. These gaps have been identified in all of the analyzed devices and correspond to both regular patterns and variable events associated with the decoding process. We have demonstrated that these non-idealities, which are not taken into account in other studies, have a severe impact on the discovery process performance. Extensive performance evaluation for a varying number of devices and feasible parameter combinations has been done by comparing simulations and experimental measurements. This work also includes a simple mathematical model that closely matches both the standard implementation and the different chipset peculiarities for any possible parameter value specified in the standard and for any number of simultaneous advertising devices under scanner coverage.
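As a hedged illustration of the kind of idealized simulation model the abstract contrasts with real measurements, the sketch below Monte-Carlo-simulates discovery latency for one advertiser and one scanner, ignoring the chipset scanning gaps the paper reports; the parameter names follow the BLE specification, but the values are arbitrary.

```python
import random

# Idealized BLE neighbor discovery: an advertiser transmits every advInterval
# plus a random 0-10 ms advDelay, and is discovered the first time an
# advertising event lands inside an active scan window. All times in ms.

def discovery_latency(adv_interval, scan_interval, scan_window, rng):
    t = rng.uniform(0, adv_interval)  # random phase offset at start
    while True:
        # scanner listens during [k*scan_interval, k*scan_interval + scan_window)
        if t % scan_interval < scan_window:
            return t
        t += adv_interval + rng.uniform(0, 10)  # advDelay per the BLE spec

rng = random.Random(42)
samples = [discovery_latency(100, 1000, 500, rng) for _ in range(2000)]
mean_latency = sum(samples) / len(samples)
print(round(mean_latency, 1))  # mean discovery latency, ms
```

A model with the paper's measured scanning gaps would shrink the effective scan window and lengthen these latencies, which is exactly the discrepancy the authors quantify.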

  6. The professional methodological teaching performance of the professor of Physical education. Set of parameters for its measurement

    Directory of Open Access Journals (Sweden)

    Orlando Pedro Suárez Pérez

    2017-07-01

    This work was developed in response to the difficulties observed among the Physical Education teachers of the municipality of San Juan y Martínez during the teaching-learning process of basketball, difficulties which threaten the quality of classes, sports results and the preparation of students for life. The objective is to propose parameters for measuring the professional methodological teaching performance of these teachers. The personalized approach of the research made possible the diagnosis of the 26 teachers taken as a sample, identifying the traits that distinguish their efficiency and determining their potentialities and deficiencies. During the research process, theoretical, empirical and statistical methods were used, which made it possible to corroborate the existence of the problem and to evaluate its impact, revealing a positive transformation in pedagogical practice. The results provide a concrete and viable basis for improving the evaluation of the teaching-methodological component of the Physical Education teacher's work, and constitute useful guidance for methodologists and managers concerned with cognitive, procedural and attitudinal performance, in order to build new knowledge on preceding knowledge and lead a formative process with a contemporary vision, offering methodological resources to control the quality of Physical Education lessons.

  7. The definitive analysis of the Bendandi's methodology performed with a specific software

    Science.gov (United States)

    Ballabene, Adriano; Pescerelli Lagorio, Paola; Georgiadis, Teodoro

    2015-04-01

    The presentation aims to clarify the "Bendandi method", supposed in the past to be able to forecast earthquakes, which the geophysicist from Faenza never explicitly explained to posterity. The geoethics implications of Bendandi's forecasts, and of the speculation about possible earthquakes inferred from supposed "Bendandian" methodologies, arose in previous years from social alarms over predicted earthquakes that never happened but were widely spread by the media, following some 'well informed' non-conventional scientists. The analysis was conducted through an extensive search of the 'Raffaele Bendandi' archive at the Geophysical Observatory of Faenza, and the forecasts were analyzed using specially developed software, called "Bendandiano Dashboard", that can reproduce the planetary configurations reported in the graphs made by the Italian geophysicist. This analysis should serve to clarify 'definitively' the basis of Bendandi's calculations, as well as to prevent future unwarranted warnings issued on the basis of supposed prophecies and illusory legacy documents.

  8. Work in support of biosphere assessments for solid radioactive waste disposal. 1. performance assessments, requirements and methodology; criteria for radiological environmental protection

    International Nuclear Information System (INIS)

    Egan, M.J.; Loose, M.; Smith, G.M.; Watkins, B.M.

    2001-10-01

    The first part of this report is intended to assess how the recent Swedish regulatory developments and resulting criteria impose requirements on what should be included in a performance assessment (PA) for the SFR low and medium level waste repository and for a potential deep repository for high level waste. The second part of the report has been prepared by QuantiSci as an input to the development of SSI's PA review methodology. The aim of the third part is to provide research input to the development of radiological protection framework for the environment, for use in Sweden. This is achieved through a review of various approaches used in other fields

  9. A Probabilistic Design Methodology for a Turboshaft Engine Overall Performance Analysis

    Directory of Open Access Journals (Sweden)

    Min Chen

    2014-05-01

    In reality, the cumulative effect of the many uncertainties in engine component performance may stack up to affect the engine overall performance. This paper aims to quantify the impact of uncertainty in engine component performance on the overall performance of a turboshaft engine based on a Monte-Carlo probabilistic design method. A novel probabilistic model of the turboshaft engine, consisting of a Monte-Carlo simulation generator, a traditional nonlinear turboshaft engine model, and a probability statistical model, was implemented to predict this impact. One of the fundamental results shown herein is that uncertainty in component performance has a significant impact on the engine overall performance prediction. This paper also shows that, taking into consideration the uncertainties in component performance, the turbine entry temperature and overall pressure ratio based on the probabilistic design method should increase by 0.76% and 8.33%, respectively, compared with those of the deterministic design method. The comparison shows that the probabilistic approach provides a more credible and reliable way to assign the design space for a target engine overall performance.
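The probabilistic idea can be sketched as follows: draw component efficiencies from assumed distributions, push each draw through an engine relation, and examine the spread of the overall output. The "engine" relation here is a deliberately simplified stand-in, not the paper's nonlinear turboshaft model, and all distributions and numbers are invented for illustration.

```python
import random
import statistics

def shaft_power(eta_compressor, eta_turbine, fuel_energy=1000.0):
    # Stand-in overall-performance relation (NOT the paper's turboshaft model):
    # output scales with both component efficiencies.
    return fuel_energy * eta_compressor * eta_turbine * 0.35

# Monte-Carlo generator: assumed normal uncertainty on each efficiency.
rng = random.Random(7)
draws = [shaft_power(rng.gauss(0.86, 0.01), rng.gauss(0.90, 0.01))
         for _ in range(5000)]

# Probability-statistical step: summarize the propagated uncertainty.
mean, sd = statistics.mean(draws), statistics.stdev(draws)
print(round(mean, 1), round(sd, 2))
```

Even this toy version shows the paper's qualitative point: modest component-level scatter propagates into a visible spread in the overall performance figure, which a deterministic single-point design would miss.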

  10. Stepwise-refinement for performance: a methodology for many-core programming

    NARCIS (Netherlands)

    Hijma, P.; van Nieuwpoort, R.V.; Jacobs, C.J.H.; Bal, H.E.

    2015-01-01

    Many-core hardware is targeted specifically at obtaining high performance, but reaching high performance is often challenging because hardware-specific details have to be taken into account. Although there are many programming systems that try to alleviate many-core programming, some providing a

  11. Towards a performance assessment methodology using computational simulation for air distribution system designs in operating rooms

    NARCIS (Netherlands)

    Melhado, M.D.A.

    2012-01-01

    One of the important performance requirements for an air distribution system for an operating room (OR) is to provide good indoor environmental conditions in which to perform operations. Important conditions in this respect relate to the air quality and to the thermal conditions for the surgical

  12. Expert Performance Transfer - Making Knowledge Transfer Count with ExPerT Methodology

    International Nuclear Information System (INIS)

    Turner, C.L.; Braudt, T.E.

    2011-01-01

    'Knowledge Transfer' is a high-priority imperative as the nuclear industry faces the combined effects of an aging workforce and economic pressures to do more with less. Knowledge Transfer is only part of the solution to these challenges, however. The more compelling and immediate need faced by the industry is Accomplishment Transfer: the transfer of the applied knowledge necessary to assure optimal performance, from experienced, high-performing staff to inexperienced staff. A great deal of industry knowledge and required performance information has been documented in the form of procedures. Often under-appreciated both as knowledge stores and as drivers of human performance, procedures, coupled with tightly-focused and effective training, are arguably the most effective influences on human and plant performance. (author)

  13. Performance Analysis of a Six-Port Receiver in a WCDMA Communication System including a Multipath Fading Channel

    Directory of Open Access Journals (Sweden)

    A. O. Olopade

    2014-01-01

    Third-generation communication systems require receivers with a wide bandwidth of operation to support high transmission rates, and which are also reconfigurable to support various communication standards with different frequency bands. An ideal software defined radio (SDR) would be the absolute answer to this requirement, but it is not achievable with the current level of technology. This paper proposes the use of a six-port receiver (SPR) front-end (FE) in a WCDMA communication system. A WCDMA end-to-end physical layer MATLAB demo, which includes a multipath channel distortion block, is used to determine the viability of the six-port based receiver. The WCDMA signal, after passing through a multipath channel, is received using a constructed SPR FE. The baseband signal is then calibrated and corrected in MATLAB. The six-port receiver performance is measured in terms of bit error rate (BER). The signal-to-noise ratio (SNR) of the transmitted IQ data is varied and the BER profile of the communication system is plotted. The effect of the multipath fading on the receiver performance and the accuracy of the calibration algorithm are obtained by comparing two different measured BER curves, for different calibration techniques, to the simulated BER curve of an ideal receiver.

  14. Applying distance-to-target weighing methodology to evaluate the environmental performance of bio-based energy, fuels, and materials

    International Nuclear Information System (INIS)

    Weiss, Martin; Patel, Martin; Heilmeier, Hermann; Bringezu, Stefan

    2007-01-01

    The enhanced use of biomass for the production of energy, fuels, and materials is one of the key strategies towards sustainable production and consumption. Various life cycle assessment (LCA) studies demonstrate the great potential of bio-based products to reduce both the consumption of non-renewable energy resources and greenhouse gas emissions. However, the production of biomass requires agricultural land and is often associated with adverse environmental effects such as eutrophication of surface and ground water. Decision making in favor of or against bio-based and conventional fossil product alternatives therefore often requires weighing of environmental impacts. In this article, we apply distance-to-target weighing methodology to aggregate LCA results obtained in four different environmental impact categories (i.e., non-renewable energy consumption, global warming potential, eutrophication potential, and acidification potential) to one environmental index. We include 45 bio- and fossil-based product pairs in our analysis, which we conduct for Germany. The resulting environmental indices for all product pairs analyzed range from -19.7 to +0.2 with negative values indicating overall environmental benefits of bio-based products. Except for three options of packaging materials made from wheat and cornstarch, all bio-based products (including energy, fuels, and materials) score better than their fossil counterparts. Comparing the median values for the three options of biomass utilization reveals that bio-energy (-1.2) and bio-materials (-1.0) offer significantly higher environmental benefits than bio-fuels (-0.3). The results of this study reflect, however, subjective value judgments due to the weighing methodology applied. Given the uncertainties and controversies associated not only with distance-to-target methodologies in particular but also with weighing approaches in general, the authors strongly recommend using weighing for decision finding only as a
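A hedged sketch of how distance-to-target weighing can collapse several impact-category results into one index: each category result is normalized and then weighted by how far the current level is from a policy target (weight = current level / target level). The weighting form and all numbers below are illustrative assumptions, not the study's actual norms, targets, or product data.

```python
# Distance-to-target aggregation sketch. A negative index means the bio-based
# product scores better overall than its fossil counterpart, matching the
# sign convention in the abstract.

def environmental_index(results, norms, targets, currents):
    index = 0.0
    for cat, delta in results.items():  # delta = bio-based minus fossil result
        weight = currents[cat] / targets[cat]  # farther from target -> larger weight
        index += weight * delta / norms[cat]   # normalize, weight, and sum
    return index

# Invented placeholder values for one bio/fossil product pair.
results = {"energy": -12.0, "gwp": -0.8, "eutrophication": 0.05, "acidification": 0.01}
norms = {"energy": 100.0, "gwp": 10.0, "eutrophication": 0.5, "acidification": 0.6}
targets = {"energy": 80.0, "gwp": 5.0, "eutrophication": 0.3, "acidification": 0.4}
currents = {"energy": 100.0, "gwp": 10.0, "eutrophication": 0.5, "acidification": 0.6}

print(round(environmental_index(results, norms, targets, currents), 3))
```

The example also illustrates the abstract's caveat: the sign and size of the index depend directly on the chosen targets, i.e., on subjective value judgments embedded in the weights.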

  15. Performing Ecosystem Services at Mud Flats in Seocheon, Korea: Using Q Methodology for Cooperative Decision Making

    Directory of Open Access Journals (Sweden)

    Jae-Hyuck Lee

    2017-05-01

    The concept of ecosystem services, which are the direct and indirect benefits of nature to humans, has been established as a supporting tool to increase the efficiency in decision-making regarding environmental planning. However, preceding studies on decision-making in relation to ecosystem services have been limited to identifying differences in perception, whereas few studies have reported cooperative alternatives. Therefore, this study aimed to present a method for cooperative decision-making among ecosystem service stakeholders using Q methodology. The results showed three perspectives on ecosystem services of small mud flat areas: ecological function, ecotourism, and human activity. The perspectives on cultural services and regulating services were diverse, whereas those on supporting services were similar. Thus, supporting services were considered crucial for the cooperative assessment and management of small mud flat ecosystems as well as for the scientific evaluation of regulating services. Furthermore, this study identified practical implementation measures to increase production through land management, to manufacture related souvenirs, and to link them to ecotourism. Overall, our results demonstrated the ideal process of cooperative decision-making to improve ecosystem services.

  16. Enhancing plant performance in newer CANDU plants utilizing PLiM methodologies

    International Nuclear Information System (INIS)

    Azeez, S.; Krishnan, V.S.; Nickerson, J.H.; Kakaria, B.

    2002-01-01

    Over the past 5 years, Atomic Energy of Canada Ltd. (AECL) has been working with CANDU utilities on comprehensive and integrated CANDU PLiM programs for successful and reliable operation through design life and beyond. Considerable progress has been made in the development of CANDU PLiM methodologies and implementation of the outcomes at the plants. The basis of CANDU PLiM programs is to understand the ageing degradation mechanisms, prevent or minimize the effects of these phenomena in the Critical Structures, Systems and Components (CSSCs), and maintain the CSSCs as close as possible to their best operating condition. Effective plant practices in surveillance, maintenance, and operations are the primary means of managing ageing. From the experience to date, the CANDU PLiM program will modify and enhance, but not likely replace, existing plant programs that address ageing. However, a successful PLiM program will provide assurance that these existing plant programs are effective, and can be shown to be effective, in managing ageing. This requires a structured and managed approach to both the assessment and implementation processes

  17. Response surface methodology for sensitivity and uncertainty analysis: performance and perspectives

    International Nuclear Information System (INIS)

    Olivi, L.; Brunelli, F.; Cacciabue, P.C.; Parisi, P.

    1985-01-01

    Two main aspects have to be taken into account in studying a nuclear accident scenario when using nuclear safety codes as an information source. The first one concerns the behavior of the code response and the set of assumptions to be introduced for its modelling. The second one is connected with the uncertainty features of the code input, often modelled as a probability density function (pdf). The analyst can apply two well-defined approaches depending on whether he wants major emphasis put on either of the aspects. Response Surface Methodology uses polynomial and inverse polynomial models together with the theory of experimental design, expressly developed for the identification procedure. It constitutes a well-established body of techniques able to cover a wide spectrum of requirements, when the first aspect plays the crucial role in the definition of the objectives. Other techniques such as Latin hypercube sampling, stratified sampling or even random sampling can fit better, when the second aspect affects the reliability of the analysis. The ultimate goal for both approaches is the selection of the variable, i.e. the identification of the code input variables most effective on the output and the uncertainty propagation, i.e. the assessment of the pdf to be attributed to the code response. The main aim of this work is to present a sensitivity analysis method, already tested on a real case, sufficiently flexible to be applied in both approaches mentioned
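Latin hypercube sampling, mentioned above as one of the sampling alternatives to response-surface designs, can be sketched in a few lines: each input variable's range is split into n equal-probability strata, each stratum is sampled exactly once, and the strata are paired randomly across variables. The dimensions below are illustrative.

```python
import numpy as np

# Minimal Latin hypercube sampling sketch: one point per stratum for
# every variable, with the stratum pairing randomised across variables.
rng = np.random.default_rng(6)
n, k = 10, 2                       # 10 samples, 2 input variables

# One uniform draw inside each of the n strata [i/n, (i+1)/n)
u = (np.arange(n)[:, None] + rng.random((n, k))) / n

# Decouple the variables: independently permute each column
for j in range(k):
    u[:, j] = u[rng.permutation(n), j]

# Each column now contains exactly one value in each stratum
print(np.sort((u * n).astype(int), axis=0).T)
```

The `u` matrix would then be mapped through each input's inverse cumulative distribution function to obtain the actual code-input samples.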

  18. Evaluating the methodology and performance of jetting and flooding of granular backfill materials.

    Science.gov (United States)

    2014-11-01

    Compaction of backfill in confined spaces on highway projects is often performed with small vibratory plates, based solely on the experience of the contractor, leading to inadequate compaction. As a result, the backfill is prone to erosion and of...

  19. A New Methodology for the Integration of Performance Materials into the Clothing Curriculum

    OpenAIRE

    Power, Jess

    2014-01-01

    This paper presents a model for integrating the study of performance materials into the clothing curriculum. In recent years there has been an increase in demand for stylish, functional and versatile sports apparel. Analysts predict this will reach US$126.30 billion by 2015. This growth is attributed to dramatic lifestyle changes and increasing participation in sports/leisure pursuits, particularly by women. The desire to own performance clothing for specific outdoor pursuits is increasing a...

  20. A Compact Methodology to Understand, Evaluate, and Predict the Performance of Automatic Target Recognition

    Science.gov (United States)

    Li, Yanpeng; Li, Xiang; Wang, Hongqiang; Chen, Yiping; Zhuang, Zhaowen; Cheng, Yongqiang; Deng, Bin; Wang, Liandong; Zeng, Yonghu; Gao, Lei

    2014-01-01

    This paper offers a compact mechanism for carrying out performance evaluation of an automatic target recognition (ATR) system: (a) a standard description of the ATR system's output is suggested, a quantity to indicate the operating condition is presented based on the principle of feature extraction in pattern recognition, and a series of indexes to assess the output in different aspects is developed with the application of statistics; (b) performance of the ATR system is interpreted by a quality factor based on knowledge of engineering mathematics; (c) through a novel utility called “context-probability” estimation, proposed based on probability, performance prediction for an ATR system is realized. The simulation result shows that the performance of an ATR system can be accounted for and forecast by the above-mentioned measures. Compared to existing technologies, the novel method can offer more objective performance conclusions for an ATR system. These conclusions may be helpful in knowing the practical capability of the tested ATR system. At the same time, the generalization performance of the proposed method is good. PMID:24967605

  1. Cost optimal building performance requirements. Calculation methodology for reporting on national energy performance requirements on the basis of cost optimality within the framework of the EPBD

    Energy Technology Data Exchange (ETDEWEB)

    Boermans, T.; Bettgenhaeuser, K.; Hermelink, A.; Schimschar, S. [Ecofys, Utrecht (Netherlands)

    2011-05-15

    On the European level, the principles for the requirements for the energy performance of buildings are set by the Energy Performance of Buildings Directive (EPBD). Dating from December 2002, the EPBD has set a common framework from which the individual Member States in the EU developed or adapted their individual national regulations. The EPBD in 2008 and 2009 underwent a recast procedure, with final political agreement having been reached in November 2009. The new Directive was then formally adopted on May 19, 2010. Among other clarifications and new provisions, the EPBD recast introduces a benchmarking mechanism for national energy performance requirements for the purpose of determining cost-optimal levels to be used by Member States for comparing and setting these requirements. The previous EPBD set out a general framework to assess the energy performance of buildings and required Member States to define maximum values for energy delivered to meet the energy demand associated with the standardised use of the building. However it did not contain requirements or guidance related to the ambition level of such requirements. As a consequence, building regulations in the various Member States have been developed by the use of different approaches (influenced by different building traditions, political processes and individual market conditions) and resulted in different ambition levels where in many cases cost optimality principles could justify higher ambitions. The EPBD recast now requests that Member States shall ensure that minimum energy performance requirements for buildings are set 'with a view to achieving cost-optimal levels'. The cost optimum level shall be calculated in accordance with a comparative methodology. The objective of this report is to contribute to the ongoing discussion in Europe around the details of such a methodology by describing possible details on how to calculate cost optimal levels and pointing towards important factors and

  2. Going beyond a First Reader: A Machine Learning Methodology for Optimizing Cost and Performance in Breast Ultrasound Diagnosis.

    Science.gov (United States)

    Venkatesh, Santosh S; Levenback, Benjamin J; Sultan, Laith R; Bouzghar, Ghizlane; Sehgal, Chandra M

    2015-12-01

    The goal of this study was to devise a machine learning methodology as a viable low-cost alternative to a second reader to help augment physicians' interpretations of breast ultrasound images in differentiating benign and malignant masses. Two independent feature sets consisting of visual features based on a radiologist's interpretation of images and computer-extracted features when used as first and second readers and combined by adaptive boosting (AdaBoost) and a pruning classifier resulted in a very high level of diagnostic performance (area under the receiver operating characteristic curve = 0.98) at a cost of pruning a fraction (20%) of the cases for further evaluation by independent methods. AdaBoost also improved the diagnostic performance of the individual human observers and increased the agreement between their analyses. Pairing AdaBoost with selective pruning is a principled methodology for achieving high diagnostic performance without the added cost of an additional reader for differentiating solid breast masses by ultrasound. Copyright © 2015 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
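The reader-combination-plus-pruning idea can be illustrated with a toy sketch: two noisy "reader" scores are averaged, the least confident 20% of cases are deferred for independent evaluation, and the remainder are classified. The data and the simple averaging rule are placeholders; the study combined visual and computer-extracted features with AdaBoost.

```python
import numpy as np

# Toy sketch of "classifier + selective pruning" (illustrative data).
rng = np.random.default_rng(0)
n = 500
labels = rng.integers(0, 2, n)                     # 0 = benign, 1 = malignant
# Two noisy "readers" whose scores correlate with the true label:
reader1 = labels + rng.normal(0, 0.6, n)
reader2 = labels + rng.normal(0, 0.6, n)
score = 0.5 * (reader1 + reader2)                  # combined score

confidence = np.abs(score - 0.5)                   # distance from decision point
cutoff = np.quantile(confidence, 0.20)             # prune least confident 20%
keep = confidence > cutoff                         # cases decided automatically

pred = (score[keep] > 0.5).astype(int)
accuracy = float((pred == labels[keep]).mean())
print(f"kept {keep.mean():.0%} of cases, accuracy {accuracy:.3f}")
```

The pruned fraction is exactly the trade-off described in the abstract: accuracy on the retained cases rises because the deferred cases are those nearest the decision boundary.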

  3. Application of Response Surface Methodology (RSM for Optimization of Operating Parameters and Performance Evaluation of Cooling Tower Cold Water Temperature

    Directory of Open Access Journals (Sweden)

    Ramkumar RAMAKRISHNAN

    2012-01-01

    The performance of a cooling tower was analyzed with various operating parameters to find the minimum cold water temperature. In this study, optimization of operating parameters was investigated. An experimental design was carried out based on central composite design (CCD) with response surface methodology (RSM). This paper presents the optimum operating parameters and the minimum cold water temperature obtained using the RSM method. The RSM was used to evaluate the effects of operating variables and their interaction towards the attainment of their optimum conditions. Based on the analysis, air flow, hot water temperature and packing height had a highly significant effect on cold water temperature. The optimum operating parameters were predicted using the RSM method and confirmed through experiment.
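A minimal sketch of the response-surface step: fit a second-order model to (synthetic) experimental data by least squares and locate the stationary point of the fitted surface. A real CCD/RSM study fits a full quadratic in several coded factors; the single factor and all numbers here are illustrative.

```python
import numpy as np

# Response-surface sketch: quadratic fit and optimum location.
rng = np.random.default_rng(1)
airflow = rng.uniform(0.5, 2.0, 30)                     # coded factor (illustrative)
# Synthetic response: cold water temperature with a minimum near 1.4
temp = 30.0 - 8.0 * airflow + 2.9 * airflow**2 + rng.normal(0, 0.1, 30)

# Least-squares fit of temp = b0 + b1*x + b2*x^2
X = np.column_stack([np.ones_like(airflow), airflow, airflow**2])
b0, b1, b2 = np.linalg.lstsq(X, temp, rcond=None)[0]

x_opt = -b1 / (2 * b2)        # stationary point of the fitted surface
print(f"optimum air flow ~ {x_opt:.2f} (coded units)")
```

With several factors, the same least-squares fit includes cross terms and the stationary point comes from solving the gradient system; a confirmation run, as in the abstract, then validates the predicted optimum.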

  4. Methodology for performing RF reliability experiments on a generic test structure

    NARCIS (Netherlands)

    Sasse, G.T.; de Vries, Rein J.; Schmitz, Jurriaan

    2007-01-01

    This paper discusses a new technique developed for generating well defined RF large voltage swing signals for on wafer experiments. This technique can be employed for performing a broad range of different RF reliability experiments on one generic test structure. The frequency dependence of a

  5. Performance characterization of night vision equipment based on Triangle Orientation Discrimination (TOD) methodology

    NARCIS (Netherlands)

    Laurent, N.; Lejard, C.; Deltel, G.; Bijl, P.

    2013-01-01

    Night vision equipment is crucial in order to accomplish supremacy and safety of the troops on the battlefield. Evidently, system integrators, MODs and end-users need access to reliable quantitative characterization of the expected field performance when using night vision equipment. The Image

  6. Bioclim Deliverable D10 - 12: development and application of a methodology for taking climate-driven environmental change into account in performance assessments

    International Nuclear Information System (INIS)

    2004-01-01

    The BIOCLIM project on modelling sequential Biosphere systems under Climate change for radioactive waste disposal is part of the EURATOM fifth European framework programme. The project was launched in October 2000 for a three-year period. The project aims at providing a scientific basis and practical methodology for assessing the possible long term impacts on the safety of radioactive waste repositories in deep formations due to climate and environmental change. Five work packages have been identified to fulfill the project objectives: - Work package 1 will consolidate the needs of the European agencies of the consortium and summarize how environmental change has been treated to date in performance assessments. - Work packages 2 and 3 will develop two innovative and complementary strategies for representing time series of long term climate change using different methods to analyse extreme climate conditions (the hierarchical strategy) and a continuous climate simulation over more than the next glacial-interglacial cycle (the integrated strategy). - Work package 4 will explore and evaluate the potential effects of climate change on the nature of the biosphere systems. - Work package 5 will disseminate information on the results obtained from the three year project among the international community for further use. The output from the climate models developed and applied in WP2 and WP3 has been interpreted in WP4 ('Biosphere system description') in terms of model requirements for the post-closure radiological performance assessment of deep geological repositories for radioactive wastes, in order to develop a methodology to demonstrate how biosphere systems can be represented in the long-term. The work undertaken in WP4 is described in this report. This report describes the methodology used for identification and characterisation of specific climate states and transitions between those climate states. 
It also covers the application of those methods in the context of

  7. Evaluating long-term performance of in situ vitrified waste forms: Methodology and results

    International Nuclear Information System (INIS)

    McGrail, B.P.; Olson, K.M.

    1992-11-01

    In situ vitrification (ISV) is an emerging technology for the remediation of hazardous and radioactive waste sites. The concept relies on the principle of Joule heating to raise the temperature of a soil between an array of electrodes above the melting temperature. After cooling, the melt solidifies into a massive glass and crystalline block similar to naturally occurring obsidian. Determining the long-term performance of ISV products in a changing regulatory environment requires a fundamental understanding of the mechanisms controlling the dissolution behavior of the material. A series of experiments was performed to determine the dissolution behavior of samples produced from the ISV processing of typical soils from the Idaho National Engineering Laboratory subsurface disposal area. Dissolution rate constant measurements were completed at 90 degrees C over the pH range 2 to 11 for one sample obtained from a field test of the ISV process

  8. A Performance Measurement and Implementation Methodology in a Department of Defense CIM (Computer Integrated Manufacturing) Environment

    Science.gov (United States)

    1988-01-24

    ...vanes. The new facility is currently being called the Engine Blade/Vane Facility (EB/VF). There are three primary goals in automating this process. ... earlier, the search led primarily into the areas of CIM Justification, Automation Strategies, Performance Measurement, and Integration issues. ... of living, has been steadily eroding. One dangerous trend that has developed in keenly competitive world markets, says Rohan [33], has been for U.S

  9. The Plumbing of Land Surface Models: Is Poor Performance a Result of Methodology or Data Quality?

    Science.gov (United States)

    Haughton, Ned; Abramowitz, Gab; Pitman, Andy J.; Or, Dani; Best, Martin J.; Johnson, Helen R.; Balsamo, Gianpaolo; Boone, Aaron; Cuntz, Matthias; Decharme, Bertrand; et al.

    2016-01-01

    The PALS Land sUrface Model Benchmarking Evaluation pRoject (PLUMBER) illustrated the value of prescribing a priori performance targets in model intercomparisons. It showed that the performance of turbulent energy flux predictions from different land surface models (LSMs), at a broad range of flux tower sites using common evaluation metrics, was on average worse than relatively simple empirical models. For sensible heat fluxes, all land surface models were outperformed by a linear regression against downward shortwave radiation. For latent heat flux, all land surface models were outperformed by a regression against downward shortwave, surface air temperature and relative humidity. These results are explored here in greater detail and possible causes are investigated. We examine whether particular metrics or sites unduly influence the collated results, whether results change according to time-scale aggregation and whether a lack of energy conservation in flux tower data gives the empirical models an unfair advantage in the intercomparison. We demonstrate that energy conservation in the observational data is not responsible for these results. We also show that the partitioning between sensible and latent heat fluxes in LSMs, rather than the calculation of available energy, is the cause of the original findings. Finally, we present evidence suggesting that the nature of this partitioning problem is likely shared among all contributing LSMs. While we do not find a single candidate explanation for why land surface models perform poorly relative to empirical benchmarks in PLUMBER, we do exclude multiple possible explanations and provide guidance on where future research should focus.
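The empirical benchmark referred to above is simple to reproduce in outline: an ordinary least-squares regression of sensible heat flux against downward shortwave radiation. The synthetic "flux tower" data below are illustrative; PLUMBER fitted such regressions to real site observations.

```python
import numpy as np

# Sketch of the empirical benchmark: linear regression of sensible heat
# flux (Qh) against downward shortwave radiation (SWdown), on synthetic data.
rng = np.random.default_rng(2)
swdown = rng.uniform(0, 1000, 2000)                  # W m-2 (illustrative)
qh = 0.15 * swdown - 10.0 + rng.normal(0, 20, 2000)  # synthetic "observations"

A = np.column_stack([swdown, np.ones_like(swdown)])
slope, intercept = np.linalg.lstsq(A, qh, rcond=None)[0]
pred = slope * swdown + intercept

rmse = float(np.sqrt(np.mean((pred - qh) ** 2)))
print(f"benchmark RMSE ~ {rmse:.1f} W m-2")
```

The point of such a benchmark is not that the regression is a good physical model, but that it sets an a priori performance floor that a process-based LSM should beat.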

  10. Development of a High Performance PES Ultrafiltration Hollow Fiber Membrane for Oily Wastewater Treatment Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Noor Adila Aluwi Shakir

    2015-12-01

    This study attempts to optimize the spinning process used for fabricating hollow fiber membranes using the response surface methodology (RSM). The spinning factors considered for the experimental design are the dope extrusion rate (DER), air gap length (AGL), coagulation bath temperature (CBT), bore fluid ratio (BFR), and post-treatment time (PT), whilst the response investigated is rejection. The optimal spinning conditions promising high rejection performance of polyethersulfone (PES) ultrafiltration hollow fiber membranes for oily wastewater treatment are a dope extrusion rate of 2.13 cm3/min, an air gap length of 0 cm, a coagulation bath temperature of 30 °C, and a bore fluid ratio (NMP/H2O) of 0.01/99.99 wt %. This study will ultimately enable membrane fabricators to produce high-performance membranes that contribute towards the availability of a more sustainable water supply system.

  11. A Methodology to Reduce the Computational Effort in the Evaluation of the Lightning Performance of Distribution Networks

    Directory of Open Access Journals (Sweden)

    Ilaria Bendato

    2016-11-01

    The estimation of the lightning performance of a power distribution network is of great importance to design its protection system against lightning. An accurate evaluation of the number of lightning events that can create dangerous overvoltages requires a huge computational effort, as it implies the adoption of a Monte Carlo procedure. Such a procedure consists of generating many different random lightning events and calculating the corresponding overvoltages. The paper proposes a methodology to deal with the problem in two computationally efficient ways: (i) finding out the minimum number of Monte Carlo runs that lead to reliable results; and (ii) setting up a procedure that bypasses the lightning field-to-line coupling problem for each Monte Carlo run. The proposed approach is shown to provide results consistent with existing approaches while exhibiting superior Central Processing Unit (CPU) time performance.
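Point (i), finding the minimum number of Monte Carlo runs, can be sketched as a sequential stopping rule: keep simulating random lightning events until the relative standard error of the estimated probability of a dangerous overvoltage drops below a tolerance. The Bernoulli event model below is a placeholder for the full field-to-line coupling computation.

```python
import numpy as np

# Sequential Monte Carlo stopping-rule sketch (illustrative event model).
rng = np.random.default_rng(3)
p_true = 0.1          # "unknown" probability of a dangerous overvoltage
tol = 0.05            # target relative standard error (5%)

hits, runs = 0, 0
while True:
    runs += 1
    hits += rng.random() < p_true          # one random lightning event
    if runs >= 100 and hits > 0:
        p_hat = hits / runs
        rel_se = np.sqrt(p_hat * (1 - p_hat) / runs) / p_hat
        if rel_se < tol:                   # estimate is precise enough
            break

print(f"stopped after {runs} runs, p ~ {hits / runs:.3f}")
```

The required run count scales roughly as (1 - p)/(p * tol^2), which is why rare dangerous events make the brute-force procedure so expensive.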

  12. Methodology to determine the technical performance and value proposition for grid-scale energy storage systems :

    Energy Technology Data Exchange (ETDEWEB)

    Byrne, Raymond Harry; Loose, Verne William; Donnelly, Matthew K.; Trudnowski, Daniel J.

    2012-12-01

    As the amount of renewable generation increases, the inherent variability of wind and photovoltaic systems must be addressed in order to ensure the continued safe and reliable operation of the nation's electricity grid. Grid-scale energy storage systems are uniquely suited to address the variability of renewable generation and to provide other valuable grid services. The goal of this report is to quantify the technical performance required to provide different grid benefits and to specify the proper techniques for estimating the value of grid-scale energy storage systems.

  13. Methodological progress in the development of scenarios for ENRESA-2000 Performance assessment exercise

    International Nuclear Information System (INIS)

    Cortes Martin, A.

    2000-01-01

    ENRESA is carrying out a new safety assessment exercise for a deep geological spent fuel disposal facility located in granite, known as ENRESA-2000. One of the main objectives of this safety analysis is the integration and implementation of all R and D studies performed to date by ENRESA, as well as the identification of those aspects of the assessment which require further investigation. One of the main activities of this exercise is the selection and development of the scenarios to be quantitatively analysed during the assessment, where a scenario is defined as a sufficient number of FEPs (i.e., relevant features, events and processes), together with their influence relationships, which explain the behaviour of the disposal system. As a result of these three methods, a definitive list of FEPs will be obtained for the ENRESA-2000 exercise. Once grouped into scenarios, these FEPs will be used to model and calculate consequences. This process of generation and development of scenarios for the ENRESA-2000 performance assessment exercise is presented in this paper. (Author)

  14. High performance shape annealing matrix (HPSAM) methodology for core protection calculators

    International Nuclear Information System (INIS)

    Cha, K. H.; Kim, Y. H.; Lee, K. H.

    1999-01-01

    In the CPC (Core Protection Calculator) of CE-type nuclear power plants, the core axial power distribution is calculated to evaluate the safety-related parameters. The accuracy of the CPC axial power distribution highly depends on the quality of the so-called shape annealing matrix (SAM). Currently, SAM is determined by using data measured during the startup test and is used throughout the entire cycle. An issue with SAM is that it is fairly sensitive to the measurements, and thus the fidelity of SAM is not guaranteed for all cycles. In this paper, a novel method to determine a high-performance SAM (HPSAM) is proposed, where both measured and simulated data are used in determining SAM
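The role of the SAM can be sketched as a linear-algebra problem: if the axial power shape p is related to the ex-core detector signals d by p ≈ S d, then S can be estimated by least squares from many snapshots of (d, p) pairs, whether measured or simulated. Dimensions and data below are illustrative placeholders, not taken from an actual CPC.

```python
import numpy as np

# Least-squares estimation of a shape-annealing-matrix-like mapping
# p ~ S d from many snapshots (illustrative dimensions and data).
rng = np.random.default_rng(5)
n_nodes, n_det, n_snap = 20, 3, 200
S_true = rng.uniform(0, 1, (n_nodes, n_det))      # "true" mapping (synthetic)

D = rng.uniform(0.5, 1.5, (n_det, n_snap))        # detector signal snapshots
P = S_true @ D + rng.normal(0, 0.01, (n_nodes, n_snap))  # noisy power shapes

# Solve P ~ S D for S: transpose into standard lstsq form D^T S^T ~ P^T
S_est = np.linalg.lstsq(D.T, P.T, rcond=None)[0].T
err = float(np.abs(S_est - S_true).max())
print(f"max entry error = {err:.3f}")
```

Mixing simulated snapshots into the fit, as the abstract proposes, amounts to enlarging the snapshot set so the estimate is less sensitive to noise in any single startup measurement.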

  15. Assessing Confidence in Performance Assessments Using an Evidence Support Logic Methodology: An Application of Tesla

    International Nuclear Information System (INIS)

    Egan, M.; Paulley, A.; Lehman, L.; Lowe, J.; Rochette, E.; Baker, St.

    2009-01-01

    The assessment of uncertainties and their implications is a key requirement when undertaking performance assessment (PA) of radioactive waste facilities. Decisions based on the outcome of such assessments become translated into judgments about confidence in the information they provide. This confidence, in turn, depends on uncertainties in the underlying evidence. Even if there is a large amount of information supporting an assessment, it may be only partially relevant, incomplete or less than completely reliable. In order to develop a measure of confidence in the outcome, sources of uncertainty need to be identified and adequately addressed in the development of the PA, or in any overarching strategic decision-making processes. This paper describes a trial application of the technique of Evidence Support Logic (ESL), which has been designed for application in support of 'high stakes' decisions, where important aspects of system performance are subject to uncertainty. The aims of ESL are to identify the amount of uncertainty or conflict associated with evidence relating to a particular decision, and to guide understanding of how evidence combines to support confidence in judgments. Elicitation techniques are used to enable participants in the process to develop a logical hypothesis model that best represents the relationships between different sources of evidence to the proposition under examination. The aim is to identify key areas of subjectivity and other sources of potential bias in the use of evidence (whether for or against the proposition) to support judgments of confidence. Propagation algorithms are used to investigate the overall implications of the logic according to the strength of the underlying evidence and associated uncertainties. (authors)

  16. Progress in Methodologies for the Assessment of Passive Safety System Reliability in Advanced Reactors. Results from the Coordinated Research Project on Development of Advanced Methodologies for the Assessment of Passive Safety Systems Performance in Advanced Reactors

    International Nuclear Information System (INIS)

    2014-09-01

    Strong reliance on inherent and passive design features has become a hallmark of many advanced reactor designs, including several evolutionary designs and nearly all advanced small and medium sized reactor (SMR) designs. Advanced nuclear reactor designs incorporate several passive systems in addition to active ones — not only to enhance the operational safety of the reactors but also to eliminate the possibility of serious accidents. Accordingly, the assessment of the reliability of passive safety systems is a crucial issue to be resolved before their extensive use in future nuclear power plants. Several physical parameters affect the performance of a passive safety system, and their values at the time of operation are unknown a priori. The functions of passive systems are based on basic physical laws and thermodynamic principles, and they may not experience the same kinds of failures as active systems. Hence, consistent efforts are required to qualify the reliability of passive systems. To support the development of advanced nuclear reactor designs with passive systems, investigations into their reliability using various methodologies are being conducted in several Member States with advanced reactor development programmes. These efforts include reliability methods for passive systems by the French Atomic Energy and Alternative Energies Commission, reliability evaluation of passive safety systems by the University of Pisa, Italy, and assessment of passive system reliability by the Bhabha Atomic Research Centre, India. These different approaches seem to demonstrate a consensus on some aspects. However, the developers of the approaches have been unable to agree on the definition of reliability in a passive system. Based on these developments and in order to foster collaboration, the IAEA initiated the Coordinated Research Project (CRP) on Development of Advanced Methodologies for the Assessment of Passive Safety Systems Performance in Advanced Reactors in 2008. The

  17. MCDA-C Methodology Based Performance Evaluation of Small and Medium-Sized Businesses at the City of Lages

    Directory of Open Access Journals (Sweden)

    Marcelo Nascimento

    2013-12-01

    When employed in a focused manner, corporate performance evaluation has proven to be instrumental for entrepreneurs as an important tool that contributes to performance improvements at their organizations. This descriptive study, prepared from a questionnaire comprising 46 queries, aims to analyse the performance of micro and small companies (MSEs) by employing the multicriteria methodology for constructive decision aiding (MCDA-C). From the respondents' replies, MCDA-C descriptors were formed, shaping six prime groups so as to identify relevant factors that drive or hinder MSE success. The questionnaire was applied to the managers in charge of administering 25 small and medium-sized companies of Lages, a city within the Brazilian state of Santa Catarina. Study findings provide evidence that (i) 24% of surveyed companies tend to go bankrupt; (ii) managerial functions at the MSEs are the prime source of influence on negative outcomes; (iii) from a financial control standpoint, surveyed companies fall far short of the minimum level deemed necessary to qualify as satisfactory; (iv) those that present the best results operate both within the domestic and international markets; (v) the study placed under the spotlight the group “Evolution Stage”, evidencing the trend of ever-increasing MSE expansion. This study revealed that the factors contributing to corporate failure are intensely interconnected and largely depend on the entrepreneur's own performance, the prime contribution of the findings residing in demonstrating that MCDA-C can be employed to analyse the performance of micro and small businesses.

  18. Methodologic Guide for Evaluating Clinical Performance and Effect of Artificial Intelligence Technology for Medical Diagnosis and Prediction.

    Science.gov (United States)

    Park, Seong Ho; Han, Kyunghwa

    2018-03-01

    The use of artificial intelligence in medicine is currently an issue of great interest, especially with regard to the diagnostic or predictive analysis of medical images. Adoption of an artificial intelligence tool in clinical practice requires careful confirmation of its clinical utility. Herein, the authors explain key methodology points involved in a clinical evaluation of artificial intelligence technology for use in medicine, especially high-dimensional or overparameterized diagnostic or predictive models in which artificial deep neural networks are used, mainly from the standpoints of clinical epidemiology and biostatistics. First, statistical methods for assessing the discrimination and calibration performances of a diagnostic or predictive model are summarized. Next, the effects of disease manifestation spectrum and disease prevalence on the performance results are explained, followed by a discussion of the difference between evaluating the performance with use of internal and external datasets, the importance of using an adequate external dataset obtained from a well-defined clinical cohort to avoid overestimating the clinical performance as a result of overfitting in high-dimensional or overparameterized classification model and spectrum bias, and the essentials for achieving a more robust clinical evaluation. Finally, the authors review the role of clinical trials and observational outcome studies for ultimate clinical verification of diagnostic or predictive artificial intelligence tools through patient outcomes, beyond performance metrics, and how to design such studies. © RSNA, 2018.
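The two performance axes summarized first in the abstract, discrimination and calibration, can be computed from scratch on synthetic data: AUC via the rank (Mann-Whitney) formula, and calibration as observed event rates within predicted-probability bins. All data below are simulated placeholders.

```python
import numpy as np

# Discrimination (AUC) and calibration sketch on synthetic predictions.
rng = np.random.default_rng(4)
n = 2000
y = rng.integers(0, 2, n)                          # true outcomes
# Synthetic predicted probabilities, better than chance:
p = np.clip(0.5 + 0.3 * (y - 0.5) + rng.normal(0, 0.15, n), 0.01, 0.99)

# Discrimination: AUC via the rank (Mann-Whitney) formula
order = np.argsort(p)
ranks = np.empty(n)
ranks[order] = np.arange(1, n + 1)
n1 = y.sum()
n0 = n - n1
auc = (ranks[y == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

# Calibration: observed event rate per predicted-probability bin
bins = np.digitize(p, [0.25, 0.5, 0.75])
for b in range(4):
    m = bins == b
    if m.any():
        print(f"bin {b}: mean p = {p[m].mean():.2f}, observed = {y[m].mean():.2f}")
print(f"AUC = {auc:.3f}")
```

A well-calibrated model shows observed rates close to the mean predicted probability in each bin; high AUC alone, as the abstract stresses, does not guarantee this.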

  19. Development and validation of methodology for technetium-99m radiopharmaceuticals using high performance liquid chromatography (HPLC)

    International Nuclear Information System (INIS)

    Almeida, Erika Vieira de

    2009-01-01

    Radiopharmaceuticals are compounds with no pharmacological action that contain a radioisotope in their composition and are used in Nuclear Medicine for the diagnosis and therapy of several diseases. In this work, an analytical method was developed and validated by high performance liquid chromatography (HPLC) for the radiopharmaceuticals 99mTc-HSA, 99mTc-EC, 99mTc-ECD and 99mTc-Sestamibi and for some raw materials. The analyses were performed on a Shimadzu HPLC system, model LC-20AT Prominence. Some impurities were identified by the addition of a reference standard substance. Validation of the method was carried out according to the criteria defined in RE n. 899/2003 of the Brazilian Health Surveillance Agency (ANVISA). The robustness results showed that it is necessary to control the flow rate, sample volume, pH of the mobile phase and oven temperature. The analytical curves were linear over the concentration ranges studied, with correlation coefficients (r2) above 0.9995. The results for precision, accuracy and recovery were in the ranges of 0.07-4.78%, 95.38-106.50% and 94.40-100.95%, respectively. The detection limits and quantification limits varied from 0.27 to 5.77 μg mL-1 and from 0.90 to 19.23 μg mL-1, respectively. The contents of HSA, EC, ECD and MIBI in the lyophilized reagents were 8.95, 0.485, 0.986 and 0.974 mg L-1, respectively. The mean radiochemical purity for 99mTc-HSA, 99mTc-EC, 99mTc-ECD and 99mTc-Sestamibi was (97.28 ± 0.09)%, (98.96 ± 0.03)%, (98.96 ± 0.03)% and (98.07 ± 0.01)%, respectively. All the parameters recommended by ANVISA were evaluated and the results are below the established limits. (author)
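
    The linearity, detection-limit and quantification-limit figures reported above follow the standard validation formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the residual standard deviation and S the slope of the calibration line (the ICH Q2 convention also adopted by ANVISA). A hedged sketch of that computation, illustrative only and not the authors' code:

```python
def calibration(conc, area):
    # Least-squares calibration line (area vs. concentration),
    # residual standard deviation, r2, and ICH-style LOD/LOQ.
    n = len(conc)
    mx, my = sum(conc) / n, sum(area) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sum((x - mx) * (y - my) for x, y in zip(conc, area)) / sxx
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(conc, area)]
    sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5  # residual SD
    r2 = 1 - sum(r * r for r in resid) / sum((y - my) ** 2 for y in area)
    lod = 3.3 * sigma / slope
    loq = 10 * sigma / slope
    return slope, intercept, r2, lod, loq
```

    For a noise-free line the residual SD, and hence LOD and LOQ, collapse to zero; real chromatographic data yield finite limits such as the μg mL-1 values quoted above.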

  20. Performance of uncertainty quantification methodologies and linear solvers in cardiovascular simulations

    Science.gov (United States)

    Seo, Jongmin; Schiavazzi, Daniele; Marsden, Alison

    2017-11-01

    Cardiovascular simulations are increasingly used in clinical decision making, surgical planning, and disease diagnostics. Patient-specific modeling and simulation typically proceeds through a pipeline from anatomic model construction using medical image data to blood flow simulation and analysis. To provide confidence intervals on simulation predictions, we use an uncertainty quantification (UQ) framework to analyze the effects of numerous uncertainties that stem from clinical data acquisition, modeling, material properties, and boundary condition selection. However, UQ poses a computational challenge requiring multiple evaluations of the Navier-Stokes equations in complex 3-D models. To achieve efficiency in UQ problems with many function evaluations, we implement and compare a range of iterative linear solver and preconditioning techniques in our flow solver. We then discuss applications to patient-specific cardiovascular simulation and how the problem/boundary condition formulation in the solver affects the selection of the most efficient linear solver. Finally, we discuss performance improvements in the context of uncertainty propagation. Support from the National Institutes of Health (R01 EB018302) is gratefully acknowledged.
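
    The solver comparison described above centers on preconditioned Krylov methods. As a toy illustration of the idea, here is a dense, stdlib-only conjugate gradient with a Jacobi (diagonal) preconditioner; this is a sketch of the technique, not the flow solver's actual implementation:

```python
def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=200):
    # Jacobi-preconditioned conjugate gradient for a symmetric
    # positive-definite system A x = b; M_inv_diag holds 1/A[i][i].
    n = len(b)
    x = [0.0] * n
    matvec = lambda v: [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    r = [bi - ci for bi, ci in zip(b, matvec(x))]       # residual
    z = [mi * ri for mi, ri in zip(M_inv_diag, r)]      # preconditioned residual
    p = z[:]
    rz = sum(ri * zi for ri, zi in zip(r, z))
    for it in range(max_iter):
        Ap = matvec(p)
        alpha = rz / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if sum(ri * ri for ri in r) ** 0.5 < tol:
            break
        z = [mi * ri for mi, ri in zip(M_inv_diag, r)]
        rz_new = sum(ri * zi for ri, zi in zip(r, z))
        p = [zi + (rz_new / rz) * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x, it + 1
```

    In the actual application the operator is a large sparse Navier-Stokes system, and because UQ requires many such solves, the preconditioner choice dominates overall wall-clock cost.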

  1. Evaluation of the performance of diagnosis-related groups and similar casemix systems: methodological issues.

    Science.gov (United States)

    Palmer, G; Reid, B

    2001-05-01

    With the increasing recognition and application of casemix for managing and financing healthcare resources, the evaluation of alternative versions of systems such as diagnosis-related groups (DRGs) has been afforded high priority by governments and researchers in many countries. Outside the United States, an important issue has been the perceived need to produce local versions, and to establish whether or not these perform more effectively than the US-based classifications. A discussion of casemix evaluation criteria highlights the large number of measures that may be used, the rationale and assumptions underlying each measure, and the problems in interpreting the results. A review of recent evaluation studies from a number of countries indicates that considerable emphasis has been placed on the predictive validity criterion, as measured by the R2 statistic. However, the interpretation of the findings has been affected greatly by the methods used, especially the treatment and definition of outlier cases. Furthermore, the extent to which other evaluation criteria have been addressed has varied widely. In the absence of minimum evaluation standards, it is not possible to draw clear-cut conclusions about the superiority of one version of a casemix system over another, the need for a local adaptation, or the further development of an existing version. Without the evidence provided by properly designed studies, policy-makers and managers may place undue reliance on subjective judgments and the views of the most influential, but not necessarily best informed, healthcare interest groups.
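
    The predictive validity criterion discussed above, the R2 statistic, is the share of cost (or length-of-stay) variance explained by grouping cases into DRGs; because the outlier definition changes which cases enter the computation, reported values are hard to compare across studies. A minimal sketch with illustrative variable names, not taken from any cited study:

```python
def r_squared(costs, groups):
    # Proportion of cost variance explained by the DRG group means:
    # 1 - (within-group sum of squares) / (total sum of squares).
    mean = sum(costs) / len(costs)
    sst = sum((c - mean) ** 2 for c in costs)
    gmean = {}
    for g in set(groups):
        vals = [c for c, gg in zip(costs, groups) if gg == g]
        gmean[g] = sum(vals) / len(vals)
    sse = sum((c - gmean[g]) ** 2 for c, g in zip(costs, groups))
    return 1 - sse / sst

# Trimming or capping high-cost outliers before computing R2 (one common
# treatment) typically raises the statistic, which is why the outlier
# policy must be reported alongside the result.
```

    If group means explain the data perfectly the statistic is 1; if grouping explains nothing it is 0, so two casemix versions can only be compared fairly under the same outlier treatment.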

  2. A methodology of uncertainty/sensitivity analysis for PA of HLW repository learned from 1996 WIPP performance assessment

    International Nuclear Information System (INIS)

    Lee, Y. M.; Kim, S. K.; Hwang, Y. S.; Kang, C. H.

    2002-01-01

    The WIPP (Waste Isolation Pilot Plant) is a mined repository constructed by the US DOE for the permanent disposal of transuranic (TRU) wastes generated by US defence-related activities since 1970. Disposal operations began in March 1999, following receipt of a final permit from the State of New Mexico after a positive certification decision for the WIPP was issued by the EPA in 1998, making it the first licensed facility in the US for the deep geologic disposal of radioactive wastes. The CCA (Compliance Certification Application) for the WIPP that the DOE submitted to the EPA in 1996 was supported by an extensive Performance Assessment (PA) carried out by Sandia National Laboratories (SNL), the so-called 1996 PA. Even though such PA methodologies may differ greatly from the approach being considered for HLW disposal in Korea, largely because of the quite different geologic formations in which a repository is likely to be located, a review of the extensive work done in the WIPP PA studies offers important lessons in view of the current situation in Korea, where an initial phase of conceptual studies on HLW disposal has just started. The objective of this work is an overview of the methodology used in the recent WIPP PA to support the US DOE WIPP CCA and a proposal for the Korean case.

  3. Novel experimental methodology for the characterization of thermodynamic performance of advanced working pairs for adsorptive heat transformers

    International Nuclear Information System (INIS)

    Frazzica, Andrea; Sapienza, Alessio; Freni, Angelo

    2014-01-01

    This paper presents a novel experimental protocol for the evaluation of the thermodynamic performance of working pairs for application in adsorption heat pumps and chillers. The proposed approach is based on experimental measurements of the main thermo-physical parameters of adsorbent pairs by means of a DSC/TG apparatus modified to work under saturated vapour conditions, able to measure the ads-/desorption isobars and heat flux as well as the adsorbent specific heat under realistic boundary conditions. This characterization allows the thermodynamic performance of an adsorbent pair to be determined and the thermal Coefficient Of Performance (COP) for both heating and cooling applications to be estimated, relying only on experimental values. The experimental uncertainty of the method has been estimated to be around 2% for the COP evaluation. To validate the proposed procedure, a first test campaign was carried out on the commercial adsorbent material AQSOA-Z02, produced by MPI (Mitsubishi Plastics Inc.), with water as refrigerant. The proposed experimental methodology will be applied to several other adsorbent materials, either already on the market or still under investigation, in order to provide an easy and reliable method for comparing the thermodynamic performance of adsorptive working pairs

  4. Methodological considerations in a pilot study on the effects of a berry enriched smoothie on children's performance in school.

    Science.gov (United States)

    Rosander, Ulla; Rumpunen, Kimmo; Olsson, Viktoria; Åström, Mikael; Rosander, Pia; Wendin, Karin

    2017-01-01

    Berries contain bioactive compounds that may affect children's cognitive function positively, while hunger and thirst during lessons before lunch affect academic performance negatively. This pilot study addresses methodological challenges in studying whether a berry smoothie, offered to schoolchildren as a mid-morning beverage, affects academic performance. The objective was to investigate whether a cross-over design can be used to study these effects in a school setting. Therefore, in order to investigate assay sensitivity, 236 Swedish children aged 10-12 years were administered either a berry smoothie (active) or a fruit-based control beverage after their mid-morning break. Both beverages provided 5% of a child's daily energy intake. In total, 91% of participants completed the study. Academic performance was assessed using the d2 test of attention. Statistical analyses were performed using the Wilcoxon signed rank test in StatXact v 10.3. The results showed that the children consumed less of the active berry smoothie than of the control (154 g vs. 246 g). Both beverages increased attention span and concentration significantly (p = 0.000). However, as there was no significant difference (p = 0.938) in the magnitude of this effect between the active and control beverages, the assay sensitivity of the study design was not proven. The effect of the beverages on academic performance was attributed to the supplementation of water and energy. Despite careful design, the active smoothie was less accepted than the control. This could be explained by unfamiliar sensory characteristics and peer influence, stressing the importance of sensory similarity and the challenges of performing a study in school settings. The employed cross-over design did not reveal any effects of bioactive compound consumption on academic performance. In future studies, the experimental set-up should be modified or replaced by e.g. a parallel study design in order to provide conclusive results.
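
    The paired analysis described above (each child receives both beverages, and within-child differences are tested) can be illustrated with an exact sign-flip test on the paired differences, the permutation analogue of the Wilcoxon signed rank test that the study ran in StatXact. This is a sketch of the statistical idea only, not the study's code:

```python
from itertools import product

def sign_flip_pvalue(diffs):
    # Exact two-sided sign-flip test on paired differences: under H0
    # each subject's difference is symmetric about 0, so every sign
    # pattern is equally likely.  Exhaustive enumeration -- only
    # feasible for small n.
    obs = abs(sum(diffs))
    hits = sum(
        abs(sum(s * d for s, d in zip(signs, diffs))) >= obs
        for signs in product((1, -1), repeat=len(diffs))
    )
    return hits / 2 ** len(diffs)
```

    With consistently positive differences (e.g. every child's attention score rises after the beverage) the p-value shrinks toward the enumeration floor of 2/2^n, mirroring the strongly significant within-condition effects the study reports.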

  5. Assessing the performance of methodological search filters to improve the efficiency of evidence information retrieval: five literature reviews and a qualitative study.

    Science.gov (United States)

    Lefebvre, Carol; Glanville, Julie; Beale, Sophie; Boachie, Charles; Duffy, Steven; Fraser, Cynthia; Harbour, Jenny; McCool, Rachael; Smith, Lynne

    2017-11-01

    Effective study identification is essential for conducting health research, developing clinical guidance and health policy and supporting health-care decision-making. Methodological search filters (combinations of search terms to capture a specific study design) can assist in searching to achieve this. This project investigated the methods used to assess the performance of methodological search filters, the information that searchers require when choosing search filters and how that information could be better provided. Five literature reviews were undertaken in 2010/11: search filter development and testing; comparison of search filters; decision-making in choosing search filters; diagnostic test accuracy (DTA) study methods; and decision-making in choosing diagnostic tests. We conducted interviews and a questionnaire with experienced searchers to learn what information assists in the choice of search filters and how filters are used. These investigations informed the development of various approaches to gathering and reporting search filter performance data. We acknowledge that there has been a regrettable delay between carrying out the project, including the searches, and the publication of this report, because of serious illness of the principal investigator. The development of filters most frequently involved using a reference standard derived from hand-searching journals. Most filters were validated internally only. Reporting of methods was generally poor. Sensitivity, precision and specificity were the most commonly reported performance measures and were presented in tables. Aspects of DTA study methods are applicable to search filters, particularly in the development of the reference standard. There is limited evidence on how clinicians choose between diagnostic tests. No published literature was found on how searchers select filters. The interviews and questionnaire revealed that filters were not appropriate for all tasks but were
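
    The performance measures most often reported for search filters, sensitivity, precision and specificity, all derive from a simple 2x2 retrieval table (relevant vs. irrelevant records, retrieved vs. not retrieved). A minimal illustrative sketch, not taken from the report:

```python
def filter_metrics(tp, fp, fn, tn):
    # tp: relevant records retrieved by the filter
    # fp: irrelevant records retrieved
    # fn: relevant records the filter missed
    # tn: irrelevant records correctly excluded
    sensitivity = tp / (tp + fn)   # share of relevant records found
    precision = tp / (tp + fp)     # share of retrieved records that are relevant
    specificity = tn / (tn + fp)   # share of irrelevant records excluded
    return sensitivity, precision, specificity
```

    A filter validated only internally, against the same hand-searched reference standard used to build it, tends to overstate all three figures, which is why the report emphasizes external validation and the quality of the reference standard.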

  6. Implementation of a methodology to perform the uncertainty and sensitivity analysis of the control rod drop in a BWR

    Energy Technology Data Exchange (ETDEWEB)

    Reyes F, M. del C.

    2015-07-01

    A methodology to perform uncertainty and sensitivity analysis of the cross sections used in a Trace/PARCS coupled model for a control rod drop transient of a BWR-5 reactor was implemented with the neutronics code PARCS. A model of the nuclear reactor detailing all assemblies located in the core was developed. The thermohydraulic model designed in Trace, however, was simple: one channel representing all the assembly types in the core was placed inside a simple vessel model, and boundary conditions were established. The thermohydraulic model was coupled with the neutronics model, first for the steady state, and then a Control Rod Drop (CRD) transient was performed in order to carry out the uncertainty and sensitivity analysis. To analyze the cross sections used in the Trace/PARCS coupled model during the transient, Probability Density Functions (PDFs) were generated for the 22 cross-section parameters selected from the neutronics parameters that PARCS requires, thus obtaining 100 different cases for the Trace/PARCS coupled model, each with a database of different cross sections. All these cases were executed with the coupled model, thereby obtaining 100 different outputs for the CRD transient, with special emphasis on 4 responses per output: 1) the reactivity, 2) the percentage of rated power, 3) the average fuel temperature and 4) the average coolant density. For each response during the transient an uncertainty analysis was performed in which the corresponding uncertainty bands were generated. This analysis makes it possible to observe the ranges of the responses obtained by varying the selected uncertain parameters, which is very useful and important for maintaining safety in nuclear power plants and for verifying whether the uncertainty band lies within the safety margins. The sensitivity analysis complements the uncertainty analysis, identifying the parameter or parameters with the most influence on the
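
    The procedure above (sample the uncertain cross-section parameters from their PDFs, run 100 coupled cases, and extract percentile bands per response) can be sketched generically. The normal distributions and the 5-95% band below are assumptions for illustration, not taken from the report:

```python
import random

def uncertainty_bands(model, dists, n=100, lo=5, hi=95):
    # Monte Carlo propagation: sample each uncertain parameter from its
    # PDF (here normal, as an illustration), evaluate the model once per
    # sample, and report percentile bands over the ensemble of outputs.
    outputs = sorted(
        model({k: random.gauss(mu, sd) for k, (mu, sd) in dists.items()})
        for _ in range(n)
    )
    pct = lambda q: outputs[min(int(q / 100 * n), n - 1)]
    return pct(lo), pct(hi)
```

    With the band in hand for each response (reactivity, power, fuel temperature, coolant density), one checks whether it stays inside the safety margins over the sampled parameter space; in the real study each "model evaluation" is a full Trace/PARCS transient run.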

  7. Implementation of a methodology to perform the uncertainty and sensitivity analysis of the control rod drop in a BWR

    International Nuclear Information System (INIS)

    Reyes F, M. del C.

    2015-01-01

    A methodology to perform uncertainty and sensitivity analysis of the cross sections used in a Trace/PARCS coupled model for a control rod drop transient of a BWR-5 reactor was implemented with the neutronics code PARCS. A model of the nuclear reactor detailing all assemblies located in the core was developed. The thermohydraulic model designed in Trace, however, was simple: one channel representing all the assembly types in the core was placed inside a simple vessel model, and boundary conditions were established. The thermohydraulic model was coupled with the neutronics model, first for the steady state, and then a Control Rod Drop (CRD) transient was performed in order to carry out the uncertainty and sensitivity analysis. To analyze the cross sections used in the Trace/PARCS coupled model during the transient, Probability Density Functions (PDFs) were generated for the 22 cross-section parameters selected from the neutronics parameters that PARCS requires, thus obtaining 100 different cases for the Trace/PARCS coupled model, each with a database of different cross sections. All these cases were executed with the coupled model, thereby obtaining 100 different outputs for the CRD transient, with special emphasis on 4 responses per output: 1) the reactivity, 2) the percentage of rated power, 3) the average fuel temperature and 4) the average coolant density. For each response during the transient an uncertainty analysis was performed in which the corresponding uncertainty bands were generated. This analysis makes it possible to observe the ranges of the responses obtained by varying the selected uncertain parameters, which is very useful and important for maintaining safety in nuclear power plants and for verifying whether the uncertainty band lies within the safety margins. The sensitivity analysis complements the uncertainty analysis, identifying the parameter or parameters with the most influence on the

  8. Performance of Transuranic-Loaded Fully Ceramic Micro-Encapsulated Fuel in LWRs Final Report, Including Void Reactivity Evaluation

    International Nuclear Information System (INIS)

    Pope, Michael A.; Sen, R. Sonat; Boer, Brian; Ougouag, Abderrafi M.; Youinou, Gilles

    2011-01-01

    The current focus of the Deep Burn Project is on once-through burning of transuranics (TRU) in light-water reactors (LWRs). The fuel form is called Fully-Ceramic Micro-encapsulated (FCM) fuel, a concept that borrows the tristructural-isotropic (TRISO) fuel particle design from high-temperature reactor technology. In the Deep Burn LWR (DB-LWR) concept, these fuel particles are pressed into compacts using SiC matrix material and loaded into fuel pins for use in conventional LWRs. The TRU loading comes from the spent fuel of a conventional LWR after 5 years of cooling. Unit cell and assembly calculations have been performed using the DRAGON-4 code to assess the physics attributes of TRU-only FCM fuel in an LWR lattice. Depletion calculations assuming an infinite lattice condition were performed, with various reactivity coefficients calculated at each step. Unit cells and assemblies containing typical UO2 and mixed oxide (MOX) fuel were analyzed in the same way to provide a baseline against which to compare the TRU-only FCM fuel. Assembly calculations were then performed evaluating the performance of heterogeneous arrangements of TRU-only FCM fuel pins along with UO2 pins.

  9. Prenatal Exposure to Organohalogens, Including Brominated Flame Retardants, Influences Motor, Cognitive, and Behavioral Performance at School Age

    NARCIS (Netherlands)

    Roze, Elise; Meijer, Lisethe; Bakker, Attie; Van Braeckel, Koenraad N. J. A.; Sauer, Pieter J. J.; Bos, Arend F.

    2009-01-01

    BACKGROUND: Organohalogen compounds (OHCs) are known to have neurotoxic effects on the developing brain. OBJECTIVE: We investigated the influence of prenatal exposure to OHCs, including brominated flame retardants, on motor, cognitive, and behavioral outcome in healthy children of school age.

  10. Methodology for the analysis of external flooding in CN Asco-II and CN Vandellos during the performance of stress tests

    International Nuclear Information System (INIS)

    Aleman, A.; Cobas, I.; Sabater, J.; Canadell, F.; Garces, L.; Otero, M.

    2012-01-01

    The work carried out in relation to external flooding has been synthesized into a single methodology covering the entire process of obtaining safety margins against external flooding, including the identification of the external events that could cause flooding.

  11. Aluminum nitride coatings using response surface methodology to optimize the thermal dissipated performance of light-emitting diode modules

    Science.gov (United States)

    Jean, Ming-Der; Lei, Peng-Da; Kong, Ling-Hua; Liu, Cheng-Wu

    2018-05-01

    This study optimizes the thermal dissipation ability of aluminum nitride (AlN) ceramics to increase the thermal performance of light-emitting diode (LED) modules. AlN powders are deposited on a heat sink as a thermal interface material, using an electrostatic spraying process. The junction temperature of the heat sink is modeled by response surface methodology (RSM) based on Taguchi methods. In addition, the structure and properties of the AlN coating are examined using X-ray photoelectron spectroscopy (XPS). In the XPS analysis, the AlN sub-peaks are observed at 72.79 eV for Al2p and 398.88 eV for N1s, and the N1s sub-peaks are assigned to N-O bonding at 398.60 eV and Al-N bonding at 395.95 eV, which is consistent with good thermal properties. The results show that the use of AlN ceramic material on a heat sink can enhance the thermal performance of LED modules. In addition, the percentage error between the predicted and experimental results for the quadratic model, as compared with the linear and interaction models, was found to be within 7.89%, indicating that it is a good predictor. Accordingly, RSM can effectively enhance the thermal performance of an LED, and the beneficial heat dissipation of AlN is improved by electrostatic spraying.
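
    Response surface methodology fits a low-order polynomial to designed-experiment data; the quadratic model favored in the abstract can be fit by ordinary least squares. A one-factor, stdlib-only sketch for illustration (the actual study fits multiple factors from a Taguchi array):

```python
def fit_quadratic(x, y):
    # Least-squares fit of y = b0 + b1*x + b2*x^2 via the normal
    # equations X'X b = X'y, solved by Gaussian elimination
    # (no pivoting; adequate for this well-conditioned demo).
    X = [[1.0, xi, xi * xi] for xi in x]
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    for i in range(3):                      # forward elimination
        for j in range(i + 1, 3):
            f = xtx[j][i] / xtx[i][i]
            xtx[j] = [a - f * b for a, b in zip(xtx[j], xtx[i])]
            xty[j] -= f * xty[i]
    b = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                     # back substitution
        b[i] = (xty[i] - sum(xtx[i][j] * b[j] for j in range(i + 1, 3))) / xtx[i][i]
    return b
```

    The fitted coefficients then predict the response (here, junction temperature) at untried factor settings; comparing predicted and measured values yields the kind of percentage error quoted above for the quadratic versus the linear and interaction models.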

  12. Using performance assessment for radioactive waste disposal decision making -- implementation of the methodology into the third performance assessment iteration of the Greater Confinement Disposal site

    International Nuclear Information System (INIS)

    Gallegos, D.P.; Conrad, S.H.; Baer, T.A.

    1993-01-01

    The US Department of Energy is responsible for the disposal of a variety of radioactive wastes. Some of these wastes are prohibited from shallow land burial and also do not meet the waste acceptance criteria for the proposed waste repositories at the Waste Isolation Pilot Plant (WIPP) and Yucca Mountain. These have been termed "special-case" waste and require an alternative disposal method. From 1984 to 1989, the Department of Energy disposed of a small quantity of special-case transuranic wastes at the Greater Confinement Disposal (GCD) site at the Nevada Test Site. In this paper, an iterative performance assessment is demonstrated as a useful decision-making tool in the overall compliance assessment process for waste disposal. The GCD site has been used as the real-site implementation and test of the performance assessment approach. Through the first two performance assessment iterations for the GCD site, and the transition into the third, we demonstrate how the performance assessment methodology uses probabilistic risk concepts to guide effective decisions about site characterization activities and how it can be used as a powerful tool in bringing compliance decisions to closure.

  13. Work in support of biosphere assessments for solid radioactive waste disposal. 1. performance assessments, requirements and methodology; criteria for radiological environmental protection

    Energy Technology Data Exchange (ETDEWEB)

    Egan, M.J.; Loose, M.; Smith, G.M.; Watkins, B.M. [QuantiSci Ltd., Henley-on-Thames (United Kingdom)

    2001-10-01

    The first part of this report is intended to assess how the recent Swedish regulatory developments and resulting criteria impose requirements on what should be included in a performance assessment (PA) for the SFR low and medium level waste repository and for a potential deep repository for high level waste. The second part of the report has been prepared by QuantiSci as an input to the development of SSI's PA review methodology. The aim of the third part is to provide research input to the development of radiological protection framework for the environment, for use in Sweden. This is achieved through a review of various approaches used in other fields.

  14. Analysis of microdialysate monoamines, including noradrenaline, dopamine and serotonin, using capillary ultra-high performance liquid chromatography and electrochemical detection.

    Science.gov (United States)

    Ferry, Barbara; Gifu, Elena-Patricia; Sandu, Ioana; Denoroy, Luc; Parrot, Sandrine

    2014-03-01

    Electrochemical methods are very often used to detect catecholamine and indolamine neurotransmitters separated by conventional reverse-phase high performance liquid chromatography (HPLC). The present paper presents the development of a chromatographic method to detect monoamines present in low-volume brain dialysis samples using a capillary column filled with sub-2 μm particles. Several parameters (repeatability, linearity, accuracy, limit of detection) for this new ultra-high performance liquid chromatography (UHPLC) method with electrochemical detection were examined after optimization of the analytical conditions. Noradrenaline, adrenaline, serotonin, dopamine and its metabolite 3-methoxytyramine were separated in 1 μL of injected sample volume; they were detected above concentrations of 0.5-1 nmol/L, with 2.1-9.5% accuracy and intra-assay repeatability equal to or less than 6%. The final method was applied to very-low-volume dialysates from rat brain containing monoamine traces. The study demonstrates that capillary UHPLC with electrochemical detection is suitable for monitoring dialysate monoamines collected at a high sampling rate. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. Reserve, thin form-factor, hypochlorite-based cells for powering portable systems: Manufacture (including MEMS processes), performance and characterization

    Energy Technology Data Exchange (ETDEWEB)

    Cardenas-Valencia, Andres M.; Langebrake, Larry [Center for Ocean Technology, University of South Florida, 140 Seventh Ave. S., St. Petersburg, FL (United States); Biver, Carl J. [Center for Ocean Technology, University of South Florida, 140 Seventh Ave. S., St. Petersburg, FL (United States); Department of Chemical Engineering, University of South Florida, 4202 E. Fowler Ave. Tampa, FL (United States)

    2007-03-30

    This work focuses on fabrication routes and performance evaluation of thin form-factor, reserve cells as a powering alternative for expendable and/or remotely operated systems. The catalytic decomposition of sodium hypochlorite solutions is revisited herein with two cost-effective anodes: zinc and aluminum. Aluminum, although the more expensive of the two anodes, yielded cells with double the energy content (up to 55 Wh kg-1) of those fabricated with zinc. Even though the hypochlorite concentration in the solution limits the cells' operational life, attractive performance (1.0 V at a current of 10 mA) is obtained from the manufactured cells. It is shown that microfabrication processes, allowing close electrode interspacing, provided high faradaic and coulombic efficiencies of up to 70 and 100%, respectively. The specific energies obtained (50-120 Wh kg-1) are of the same order of magnitude as those of batteries currently used for powering deployable systems. Experimental results show that a simple model that linearly relates overpotentials and the electrical load adequately describes all the cell designs. A mathematical model based on a kinetic-mechanistic scheme that relates the current output as a function of time agrees fairly well with results obtained by activating cells with various concentrations of NaOCl solutions. (author)

  16. Methodology for predicting the characteristics and performance of different PET camera designs: The choice of figures of merit

    International Nuclear Information System (INIS)

    Deconinck, F.; Defrise, M.; Kuyk, S.; Bossuyt, A.

    1985-01-01

    In order to compare different PET camera designs (ring and planar geometry), this paper proposes "figures of merit" which allow questions such as "Is it better, given a particular design, to achieve a coincidence rate of 10 kHz with 5% randoms, or a 20 kHz rate with 10% randoms?" to be answered. The authors propose a methodology based on information theory. The image is the three-dimensional distribution which conveys the information to the human or electronic observer. The image is assumed to consist of discrete image elements (voxels) of uniform size, each characterized by a coincidence density. The performance of a non-ideal imager can be studied by evaluating its effect on the visibility surface: the number of events per image element will be affected by the limited efficiency, geometrical acceptance angle, temporal resolution and energy resolution. The authors show that the effect of these degradations can be introduced into the formula by replacing the number of true coincidences by an effective number of coincidences. Because the non-ideal imager distorts the visibility surface, the authors compare the performance of different PET cameras by comparing the distortions that each induces, and hence their ability to detect the information present in the objects studied

  17. CRISPRCasFinder, an update of CRISRFinder, includes a portable version, enhanced performance and integrates search for Cas proteins.

    Science.gov (United States)

    Couvin, David; Bernheim, Aude; Toffano-Nioche, Claire; Touchon, Marie; Michalik, Juraj; Néron, Bertrand; C Rocha, Eduardo P; Vergnaud, Gilles; Gautheret, Daniel; Pourcel, Christine

    2018-05-22

    CRISPR (clustered regularly interspaced short palindromic repeats) arrays and their associated (Cas) proteins confer adaptive immunity on bacteria and archaea against exogenous mobile genetic elements, such as phages or plasmids. CRISPRCasFinder allows the identification of both CRISPR arrays and Cas proteins. The program includes: (i) an improved CRISPR array detection tool facilitating expert validation based on a rating system, (ii) prediction of CRISPR orientation and (iii) a Cas protein detection and typing tool updated to match the latest classification scheme of these systems. CRISPRCasFinder can either be used online or as a standalone tool compatible with the Linux operating system. All third-party software packages employed by the program are freely available. CRISPRCasFinder is available at https://crisprcas.i2bc.paris-saclay.fr.

  18. Performance of portland limestone cements: Cements designed to be more sustainable that include up to 15% limestone addition

    Science.gov (United States)

    Barrett, Timothy J.

    In 2009, ASTM and AASHTO permitted the use of up to 5% interground limestone in ordinary portland cement (OPC) as part of a change to ASTM C150/AASHTO M85. When this work was initiated, a new proposal was being discussed that would enable cement with up to 15% interground limestone to be considered in ASTM C595/AASHTO M234. This work served to provide rapid feedback to the state department of transportation and the concrete industry for use in discussions regarding these specifications. Since this work was initiated, ASTM C595/AASHTO M234 was passed (2012) and PLCs can now be specified; however, they are still not widely used. The proposal for increasing the volume of limestone permitted to be interground in cement is designed to enable more sustainable construction, which may significantly reduce the CO2 embodied in the built infrastructure while also extending the life of cement quarries. Research regarding the performance of cements with interground limestone has been conducted by the cement industry since these cements became widely used in Europe over three decades ago; this work, however, focuses on North American Portland Limestone Cements (PLCs), which are specifically designed to achieve performance similar to the OPCs they replace. This thesis presents a two-phase study in which the potential for application of cements containing limestone was assessed. The first phase utilized a fundamental approach to determine whether cement with up to 15% interground or blended limestone can be used as a direct substitute for ordinary portland cement. The second phase assessed the concern of early-age shrinkage and cracking potential when using PLCs, as these cements are typically ground finer than their OPC counterparts. For the first phase, three commercially produced PLCs were obtained and compared to three commercially produced OPCs made from the same clinker. An additional cement was tested

  19. Methodology developed at the CEA/IPSN for long term performance assessment of nuclear waste repositories in geological formations

    International Nuclear Information System (INIS)

    Raimbault, P.; Lewi, J.

    1985-05-01

    The CEA/IPSN is currently developing a methodology for the safety evaluation of disposal site projects in granite, clay and bedded-salt host rock formations. Within the Institute of Protection and Nuclear Safety, the Department of Safety Analysis (DAS) is responsible for coordinating the modelling effort, which is carried out in several specialized groups. The models are commissioned and used at the IPSN for specific safety evaluations. They are improved as needed and validated through international exercises (INTRACOIN, HYDROCOIN, ATKINS) and experimental programs. The DAS is also developing a global performance assessment code named MELODIE, whose structure allows the individual models to be coupled. This code participates in international joint studies such as PAGIS in order to test its ability to model specific sites. This should help verify that the individual models are adequate for risk assessment, ensure the availability of specific data, and identify the most sensitive parameters. The approach is intended to coordinate experimentation, code development and the determination of safety rules, so that safety assessments can be performed on chosen sites. The current status of the different aspects of this work is presented. Model development concerns mainly transport, hydrogeology, the source term, dose calculation and sensitivity studies. Its connection with data collection and model validation is stressed in the fields of source modelling, hydrogeology, geochemistry and geoprospective studies. The first version of MELODIE is described, and some results of the interactive evaluation of the source term, groundwater flow and radionuclide transport at a granite site are presented as well

  20. Using a model of the performance measures in Soft Systems Methodology (SSM) to take action: a case study in health care

    NARCIS (Netherlands)

    Kotiadis, K.; Tako, A.; Rouwette, E.A.J.A.; Vasilakis, C.; Brennan, J.; Gandhi, P.; Wegstapel, H.; Sagias, F.; Webb, P.

    2013-01-01

    This paper uses a case study of a multidisciplinary colorectal cancer team in health care to explain how a model of performance measures can lead to debate and action in Soft System Methodology (SSM). This study gives a greater emphasis and role to the performance measures than currently given in

  1. Performance evaluation of alternative fuel/engine concepts 1990- 1995. Final report including addendum of diesel vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Nylund, N.O.; Ikonen, M.; Kytoe, M.; Lappi, M.; Westerholm, M.; Laurikko, J. [VTT Energy, Espoo (Finland). Energy Use

    1996-12-31

    Annex V within the IEA Agreement on Alternative Motor Fuels is the first subtask to generate new experimental data. The objective of the task is to generate information on the emission potential of alternative fuels in severe operating conditions and to evaluate new emission measurement methods. The work was carried out in three phases: Engine Tests, Vehicle Tests and an Addendum of Diesel Vehicles. The work was carried out at VTT (Technical Research Centre of Finland) as a cost-shared operation. Participants were Belgium (Parts Two and Three), Canada (Parts One and Two), Finland, Italy (Part One), Japan, the Netherlands, Sweden and the USA. The United Kingdom also joined at the end of the Annex. The work included 143 different vehicle/fuel/temperature combinations. FTP-type emission tests were run on 14 vehicles powered with different gasoline compositions, methanol (M50 and M85), ethanol (E85), LPG, CNG and diesel. Both regulated and unregulated emission components were measured using the most up-to-date emissions measurement technology. The results indicated that today's advanced gasoline vehicles must be considered rather clean. Diesel is comparable with gasoline in the case of CO and HC. M85 gives low emissions in warm conditions, but unburned methanol must be controlled. Natural gas and LPG are inherently clean fuels which, using up-to-date engine technology, give low emissions in all conditions. (orig.) (29 refs.)

  2. Proposal for evaluation methodology on impact resistant performance and construction method of tornado missile protection net structure

    International Nuclear Information System (INIS)

    Namba, Kosuke; Shirai, Koji

    2014-01-01

    In nuclear power plants, the need for tornado missile protection structures is becoming a key technical issue. Net structures appear to be one of the more practical countermeasures from the point of view of mitigating wind and seismic loads. However, methodologies for selecting suitable net materials, designing for energy absorption, and construction are not yet well established. In this report, three candidate materials (high-strength metal mesh, super-strong polyethylene fiber net and steel grating) were selected and subjected to material screening tests, energy absorption tests by free-drop tests using a heavy weight, and impact tests with a small-diameter missile. As a result, high-strength metal mesh was selected as a suitable material for a tornado missile protection net structure. Moreover, a construction method that obtains good energy absorption performance from the material, and a practical design method to estimate the energy absorption of the high-strength metal mesh under tornado missile impact loads, were proposed. (author)

  3. Attention-deficit/hyperactivity disorder and phonological working memory: Methodological variability affects clinical and experimental performance metrics.

    Science.gov (United States)

    Tarle, Stephanie J; Alderson, R Matt; Patros, Connor H G; Lea, Sarah E; Hudec, Kristen L; Arrington, Elaine F

    2017-05-01

    Despite promising findings in extant research that suggest impaired working memory (WM) serves as a central neurocognitive deficit or candidate endophenotype of attention-deficit/hyperactivity disorder (ADHD), findings from translational research have been relatively underwhelming. This study aimed to explicate previous equivocal findings by systematically examining the effect of methodological variability on WM performance estimates across experimental and clinical WM measures. Age-matched boys (ages 8-12 years) with (n = 20) and without (n = 20) ADHD completed 1 experimental (phonological) and 2 clinical (digit span, letter-number sequencing) WM measures. The use of partial scoring procedures, administration of greater trial numbers, and high central executive demands yielded moderate-to-large between-groups effect sizes. Moreover, the combination of these best-case procedures, compared to worst-case procedures (i.e., absolute scoring, administration of few trials, use of discontinue rules, and low central executive demands), resulted in a 12.5% increase in correct group classification. Collectively, these findings explain inconsistent ADHD-related WM deficits in previous reports, and highlight the need for revised clinical measures that utilize best-case procedures. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
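The scoring contrast this abstract turns on (partial versus absolute scoring, and effect sizes between groups) can be sketched in a few lines. This is a hypothetical illustration, not the study's actual scoring code; the function names and trial data are invented:

```python
import math
import statistics

def score_trials(responses, targets, partial=True):
    """Score serial-recall trials.

    partial=True  -> credit every correctly recalled item (partial scoring)
    partial=False -> credit only perfectly recalled trials (absolute scoring)
    """
    if partial:
        correct = sum(r == t for resp, targ in zip(responses, targets)
                      for r, t in zip(resp, targ))
        total = sum(len(targ) for targ in targets)
    else:
        correct = sum(resp == targ for resp, targ in zip(responses, targets))
        total = len(targets)
    return correct / total

def cohens_d(group_a, group_b):
    """Between-groups effect size using the pooled standard deviation."""
    na, nb = len(group_a), len(group_b)
    ma, mb = statistics.mean(group_a), statistics.mean(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Two 4-item trials: one recalled perfectly, one with 2 of 4 items correct
targets = [["A", "B", "C", "D"], ["E", "F", "G", "H"]]
responses = [["A", "B", "C", "D"], ["E", "X", "G", "Y"]]
print(score_trials(responses, targets, partial=True))   # -> 0.75
print(score_trials(responses, targets, partial=False))  # -> 0.5
```

Under absolute scoring a single slip erases credit for the whole trial, which illustrates how few-trial clinical measures can mask between-group differences that partial scoring preserves.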

  4. Methodology to include a correction for offset in the calibration of a Diode-based 2D verification device; Metodologia para incluir una correccion por offset en la calibracion de un dispositivo de verificacion 2D basado en diodos

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez Ros, J. C.; Pamos Urena, M.; Jerez Sainz, M.; Lobato Munoz, M.; Jodar Lopez, C. A.; Ruiz Lopez, M. a.; Carrasco Rodriguez, J. L.

    2013-07-01

    We propose a methodology to correct the dose planes measured by the 2D verification device MapChek2 for offset. The methodology provides an offset-correction matrix, applied at the dose calibration, that accounts for the offset of the central diode as well as the offset of each diode in each acquisition. (Author)
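A per-diode offset correction of the kind described can be sketched as follows. This is a minimal illustration under assumed names and an assumed subtractive correction form; the abstract does not specify the actual formula the authors apply:

```python
def correct_plane(measured, offset_matrix, central_offset):
    """Apply a per-diode offset correction to one measured 2D dose plane.

    measured       : 2D list of raw diode readings for one acquisition
    offset_matrix  : per-diode offsets determined at calibration
    central_offset : offset of the central diode, applied globally
    (The names and the subtractive form are illustrative assumptions.)
    """
    return [[m - off - central_offset
             for m, off in zip(m_row, o_row)]
            for m_row, o_row in zip(measured, offset_matrix)]

# Toy 2x2 plane: after correction every diode reads the same dose
measured = [[12.0, 11.0], [9.0, 10.0]]
offsets = [[2.0, 1.0], [-1.0, 0.0]]
print(correct_plane(measured, offsets, central_offset=1.0))  # -> [[9.0, 9.0], [9.0, 9.0]]
```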

  5. The LBB methodology application results performed on the safety related piping of NPP V-1 in Jaslovske Bohunice

    Energy Technology Data Exchange (ETDEWEB)

    Kupca, L.; Beno, P. [Nuclear Power Plants Research Institute, Trnava (Slovakia)

    1997-04-01

    A broad overview of the leak before break (LBB) application to the Slovakian V-1 nuclear power plant is presented in the paper. LBB was applied to the primary cooling circuit and surge lines of both WWER 440 type units, and also used to assess the integrity of safety related piping in the feed water and main steam systems. Experiments and calculations performed included analyses of stresses, material mechanical properties, corrosion, fatigue damage, stability of heavy component supports, water hammer, and leak rates. A list of analysis results and recommendations are included in the paper.

  6. A Novel Performance Framework and Methodology to Analyze the Impact of 4D Trajectory Based Operations in the Future Air Traffic Management System

    OpenAIRE

    Ruiz, Sergio; Lopez Leones, Javier; Ranieri, Andrea

    2018-01-01

    The introduction of new Air Traffic Management (ATM) concepts such as Trajectory Based Operations (TBO) may produce a significant impact in all performance areas, that is, safety, capacity, flight efficiency, and others. The performance framework in use today has been tailored to the operational needs of the current ATM system and must evolve to fulfill the new needs and challenges brought by the TBO content. This paper presents a novel performance assessment framework and methodology adapted...

  7. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objective of the project has been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  8. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

    The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objective of the project has been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  9. A methodological study of environmental simulation in architecture and engineering. Integrating daylight and thermal performance across the urban and building scales

    DEFF Research Database (Denmark)

    Sattrup, Peter Andreas; Strømann-Andersen, Jakob Bjørn

    2011-01-01

    This study presents a methodological and conceptual framework that allows for the integration and creation of knowledge across professional borders in the field of environmental simulation. The framework has been developed on the basis of interviews with leading international practitioners, key...... in pointing out the need for improving metrics, software and not least the performance of the built environment itself....

  10. Studies of national research performance: A case of ‘methodological nationalism’ and ‘zombie science’?

    DEFF Research Database (Denmark)

    Sørensen, Mads P.; Schneider, Jesper Wiborg

    2017-01-01

    The analytical point of departure in this paper is the ongoing debate, initiated by Ulrich Beck, on methodological nationalism within the social sciences. Based on a comprehensive study of research collaboration and mobility of researchers this paper discusses possible traces of methodological...... with researchers in other countries. The national research institutions are increasingly transnationalised due to the growing mobility of researchers. Based on an examination of all the papers registered in the Thompson Reuter’s Web of Science database we follow the development in research collaboration...

  11. Changes in Methodology for Assessing Performance of Research Organisations and Influence of Such Changes on Researchers' Behaviour

    Directory of Open Access Journals (Sweden)

    Luboš Marek

    2017-12-01

    Assessing the quality of research results on an international scale is a basis for evaluating the level of scientific activities pursued in research organisations. In the past 15 years, significant changes have occurred in the Czech Republic in research management and, in particular, in the methodology of assessing research results. The methodology of assessment and its modifications should always be focused on increasing the quality of research results; the rules of assessment affect researchers' behaviour. This paper studies whether the changes applied to the methodology of assessing research results in the Czech Republic have supported higher-quality research results, i.e., results published in high-quality international journals. The authors have developed their own statistical test to measure the significance of such changes, as well as other statistical tests of hypotheses. The main source is the results of assessing public universities in the Czech Republic according to the "Methodology for assessing results of research organisations" in 2010 and 2013. Our tests have not proven any statistically significant differences in the numbers of papers published in the journals monitored in the Web of Science and Scopus databases.
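The paper's own test statistic is custom, but the standard tool for the question it asks (did the share of indexed publications change significantly between two assessment rounds?) is a two-proportion z-test. The sketch below is a generic illustration with invented counts, not the authors' test:

```python
import math

def two_proportion_z(count_a, n_a, count_b, n_b):
    """z statistic for H0: the two underlying proportions are equal."""
    p_pool = (count_a + count_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (count_a / n_a - count_b / n_b) / se

# Invented example: Web of Science papers among all outputs, 2010 vs 2013
z = two_proportion_z(300, 1000, 330, 1000)
print(abs(z) > 1.96)  # significant at the 5% level? -> False
```

With these made-up counts the 3-percentage-point rise is not significant at the 5% level, mirroring the paper's null finding.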

  12. Exercise-Based Performance Enhancement and Injury Prevention for Firefighters: Contrasting the Fitness- and Movement-Related Adaptations to Two Training Methodologies.

    Science.gov (United States)

    Frost, David M; Beach, Tyson A C; Callaghan, Jack P; McGill, Stuart M

    2015-09-01

    Using exercise to enhance physical fitness may have little impact on performers' movement patterns beyond the gym environment. This study examined the fitness and movement adaptations exhibited by firefighters in response to 2 training methodologies. Fifty-two firefighters were assigned to a movement-guided fitness (MOV), conventional fitness (FIT), or control (CON) group. Before and after 12 weeks of training, participants performed a fitness evaluation and laboratory-based test. Three-dimensional lumbar spine and frontal plane knee kinematics were quantified. Five whole-body tasks not included in the interventions were used to evaluate the transfer of training. FIT and MOV groups exhibited significant improvements in all aspects of fitness; however, only MOV exhibited improvements in spine and frontal plane knee motion control when performing each transfer task (effect sizes [ESs] of 0.2-1.5). FIT exhibited less controlled spine and frontal plane knee motions while squatting, lunging, pushing, and pulling (ES: 0.2-0.7). More MOV participants (43%) exhibited only positive posttraining changes (i.e., improved control), in comparison with FIT (30%) and CON (23%). Fewer negative posttraining changes were also noted (19, 25, and 36% for MOV, FIT, and CON). These findings suggest that placing an emphasis on how participants move while exercising may be an effective training strategy to elicit behavioral changes beyond the gym environment. For occupational athletes such as firefighters, soldiers, and police officers, this implies that exercise programs designed with a movement-oriented approach to periodization could have a direct impact on their safety and effectiveness by engraining desirable movement patterns that transfer to occupational tasks.

  13. High heterogeneity in methods used for the laboratory confirmation of pertussis diagnosis among European countries, 2010: integration of epidemiological and laboratory surveillance must include standardisation of methodologies and quality assurance.

    Science.gov (United States)

    He, Q; Barkoff, A M; Mertsola, J; Glismann, S; Bacci, S

    2012-08-09

    Despite extensive childhood immunisation, pertussis remains one of the world's leading causes of vaccine-preventable deaths. The current methods used for laboratory diagnosis of pertussis include bacterial culture, polymerase chain reaction (PCR) and enzyme-linked immunosorbent assay (ELISA) serology. We conducted a questionnaire survey to identify variations in the laboratory methods and protocols used among the countries participating in the European surveillance network for vaccine-preventable diseases (EUVAC.NET). In February 2010, we performed the survey using a web-based questionnaire sent to the country experts of 25 European Union countries and two European Economic Area (EEA) countries, Norway and Iceland. The questionnaire consisted of 37 questions covering both general information on surveillance methods and the detailed laboratory methods used. A descriptive analysis was performed. Questionnaires were answered by all 27 contacted countries. Nineteen countries had pertussis reference laboratories at the national level; their functions varied from performing diagnosis to providing technical advice for routine microbiology laboratories. Culture, PCR and serology were used in 17, 18 and 20 countries, respectively. For PCR, nine laboratories used insertion sequence IS481 as the target gene, which is present in multiple copies in the Bordetella pertussis genome and thus has greater sensitivity than single-copy targets, but has been proved not to be specific for B. pertussis. Antibodies directed against pertussis toxin (PT) are specific for B. pertussis infections. For ELISA serology, only 13 countries' laboratories used purified PT as coating antigen and 10 included World Health Organization (WHO) or Food and Drug Administration (FDA) reference sera in their tests. This present survey shows that methods used for laboratory confirmation of pertussis differ widely among European countries and that there is a great heterogeneity of the reference

  14. Performance and Perception in the Flipped Learning Model: An Initial Approach to Evaluate the Effectiveness of a New Teaching Methodology in a General Science Classroom

    Science.gov (United States)

    González-Gómez, David; Jeong, Jin Su; Airado Rodríguez, Diego; Cañada-Cañada, Florentina

    2016-06-01

    "Flipped classroom" teaching methodology is a type of blended learning in which the traditional class setting is inverted. The lecture is shifted outside of class, while classroom time is used to solve problems or carry out practical work through discussion and peer collaboration among students and instructors. This relatively new instructional methodology claims that flipping the classroom engages students more effectively with the learning process, achieving better teaching results. This research therefore aimed to evaluate the effects of the flipped classroom on students' performance and on their perception of the new methodology. The study was conducted in a general science course in the sophomore year of the Primary Education bachelor degree at the Teacher Training School of the University of Extremadura (Spain) during the 2014/2015 course. In order to assess the suitability of the proposed methodology, the class was divided into two groups. For the first group, a traditional methodology was followed, and it served as the control. In the second group the "flipped classroom" methodology was used: the students were given diverse materials, such as video lessons and reading materials, before class to review at home. Online questionnaires were also provided to assess the students' progress before class. Finally, the results were compared in terms of students' achievements, and a post-task survey was conducted to gauge the students' perceptions. A statistically significant difference was found on all assessments, with the flipped-class students performing higher on average. In addition, most students had a favorable perception of the flipped classroom, noting the ability to pause, rewind and review lectures, as well as increased individualized learning and increased teacher availability.
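A between-group comparison of mean assessment scores like the one reported above is commonly done with a two-sample t-test. The sketch below uses Welch's t statistic (which does not assume equal variances) on invented scores, not the study's data:

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent groups (unequal variances)."""
    va, vb = statistics.variance(a), statistics.variance(b)
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(
        va / len(a) + vb / len(b))

# Invented exam scores (0-10): flipped group vs. traditional control group
flipped = [7.5, 8.0, 6.5, 9.0, 7.0, 8.5]
control = [6.0, 6.5, 5.5, 7.0, 6.0, 7.5]
print(round(welch_t(flipped, control), 2))  # -> 2.74
```

The resulting t would then be compared against the t distribution with Welch-Satterthwaite degrees of freedom to obtain a p-value.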

  15. Methodology used for total system performance assessment of the potential nuclear waste repository at yucca mountain (USA)

    International Nuclear Information System (INIS)

    Devonec, E.; Sevougian, S.D.; Mattie, P.D.; Mcneish, J.A.; Mishra, S.

    2001-01-01

    The U.S. Department of Energy and its contractors are currently evaluating a site in Nevada (Yucca Mountain) for disposal of high-level radioactive waste from U.S. commercial nuclear plants and U.S. government-owned facilities. The suitability of the potential geologic repository is assessed based on its performance in isolating the nuclear waste from the environment. Experimental data and models representing the natural and engineered barriers are combined into a Total System Performance Assessment (TSPA) model. Because of the uncertainty in the current data and in the future evolution of the total system, simulations follow a probabilistic approach. Multiple-realization simulations using Monte Carlo analysis are conducted over time periods of up to one million years, estimating a range of possible behaviors of the repository. In addition to the nominal scenario, other exposure scenarios include the possibility of disruptive events such as volcanic eruption or intrusion, or accidental human intrusion. Sensitivity to key uncertain processes is analyzed. The influence of stochastic variables on the TSPA model output is assessed by "uncertainty importance analysis", e.g., regression analysis and classification tree analysis. The impact of parameters and assumptions is investigated further through "one-off analysis", which consists of fixing a parameter at a particular value, using an alternative conceptual model, or making a different assumption. Finally, robustness analysis evaluates the performance of the repository when various natural or engineered barriers are assumed to be degraded. The objective of these analyses is to evaluate the performance of the potential repository system under conditions ranging from expected to highly unlikely, though physically possible. (author)
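The multiple-realization Monte Carlo sampling and correlation/regression-style "uncertainty importance analysis" described above can be sketched on a toy surrogate. Everything here (the dose model, the input distributions, the parameter names) is invented for illustration and bears no relation to the actual TSPA models:

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

def dose_model(perm, solub, infil):
    # Toy surrogate for the system model: dose grows with infiltration and
    # waste-form solubility, and shrinks with the barrier parameter "perm".
    # Purely illustrative; not a real TSPA process model.
    return 0.1 * infil * solub / perm

random.seed(1)
realizations = []
for _ in range(1000):  # multiple-realization Monte Carlo sampling
    perm = random.lognormvariate(0.0, 0.5)
    solub = random.uniform(0.1, 1.0)
    infil = random.uniform(1.0, 10.0)
    realizations.append(((perm, solub, infil), dose_model(perm, solub, infil)))

# "Uncertainty importance": rank inputs by |correlation| with the dose output
doses = [d for _, d in realizations]
for i, name in enumerate(["perm", "solub", "infil"]):
    xs = [inputs[i] for inputs, _ in realizations]
    print(name, round(pearson(xs, doses), 2))
```

A rank-regression on the sampled inputs would play the same role as the simple correlations here, and classification trees offer a nonparametric alternative.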

  16. Methodology used for total system performance assessment of the potential nuclear waste repository at yucca mountain (USA)

    Energy Technology Data Exchange (ETDEWEB)

    Devonec, E.; Sevougian, S.D.; Mattie, P.D.; Mcneish, J.A. [Duke Engineering and Services, Town Center Drive, Las Vegas (United States); Mishra, S. [Duke Engineering and Services, Austin, TX (United States)

    2001-07-01

    The U.S. Department of Energy and its contractors are currently evaluating a site in Nevada (Yucca Mountain) for disposal of high-level radioactive waste from U.S. commercial nuclear plants and U.S. government-owned facilities. The suitability of the potential geologic repository is assessed based on its performance in isolating the nuclear waste from the environment. Experimental data and models representing the natural and engineered barriers are combined into a Total System Performance Assessment (TSPA) model. Because of the uncertainty in the current data and in the future evolution of the total system, simulations follow a probabilistic approach. Multiple-realization simulations using Monte Carlo analysis are conducted over time periods of up to one million years, estimating a range of possible behaviors of the repository. In addition to the nominal scenario, other exposure scenarios include the possibility of disruptive events such as volcanic eruption or intrusion, or accidental human intrusion. Sensitivity to key uncertain processes is analyzed. The influence of stochastic variables on the TSPA model output is assessed by "uncertainty importance analysis", e.g., regression analysis and classification tree analysis. The impact of parameters and assumptions is investigated further through "one-off analysis", which consists of fixing a parameter at a particular value, using an alternative conceptual model, or making a different assumption. Finally, robustness analysis evaluates the performance of the repository when various natural or engineered barriers are assumed to be degraded. The objective of these analyses is to evaluate the performance of the potential repository system under conditions ranging from expected to highly unlikely, though physically possible. (author)

  17. Methodology Used for Total System Performance Assessment of the Potential Nuclear Waste Repository at Yucca Mountain (USA)

    International Nuclear Information System (INIS)

    E. Devonec; S.D. Sevougian; P.D. Mattie; J.A. McNeish; S. Mishra

    2001-01-01

    The U.S. Department of Energy and its contractors are currently evaluating a site in Nevada (Yucca Mountain) for disposal of high-level radioactive waste from U.S. commercial nuclear plants and U.S. government-owned facilities. The suitability of the potential geologic repository is assessed based on its performance in isolating the nuclear waste from the environment. Experimental data and models representing the natural and engineered barriers are combined into a Total System Performance Assessment (TSPA) model [1]. Process models included in the TSPA model are unsaturated zone flow and transport, thermal hydrology, in-drift geochemistry, waste package degradation, waste form degradation, engineered barrier system transport, saturated zone flow and transport, and biosphere transport. Because of the uncertainty in the current data and in the future evolution of the total system, simulations follow a probabilistic approach. Multiple-realization simulations using Monte Carlo analysis are conducted over time periods of up to one million years, estimating a range of possible behaviors of the repository. The environmental impact is measured primarily by the annual dose received by an average member of a critical population group residing 20 km down-gradient of the potential repository. In addition to the nominal scenario, other exposure scenarios include the possibility of disruptive events such as volcanic eruption or intrusion, or accidental human intrusion. Sensitivity to key uncertain processes is analyzed. The influence of stochastic variables on the TSPA model output is assessed by "uncertainty importance analysis", e.g., regression analysis and classification tree analysis. The impact of parameters and assumptions is investigated further through "one-off analysis", which consists of fixing a parameter at a particular value, using an alternative conceptual model, or making a different assumption. Finally, robustness analysis evaluates

  18. A Methodology for Quality Problems Diagnosis in SMEs

    OpenAIRE

    Humberto N. Teixeira; Isabel S. Lopes; Sérgio D. Sousa

    2012-01-01

    This article proposes a new methodology to be used by SMEs (Small and Medium enterprises) to characterize their performance in quality, highlighting weaknesses and area for improvement. The methodology aims to identify the principal causes of quality problems and help to prioritize improvement initiatives. This is a self-assessment methodology that intends to be easy to implement by companies with low maturity level in quality. The methodology is organized in six different steps which include...

  19. Methodology for the identification of the factors that can influence the performance of operators of nuclear power plants control room under emergency situations

    International Nuclear Information System (INIS)

    Paiva, Bernardo Spitz; Santos, Isaac J.A. Luquetti

    2009-01-01

    In order to minimize human errors by operators in a nuclear power plant control room during emergency situations, the factors that affect human performance have to be considered. Work situations should be adequately designed, compatible with human needs, capacities and limitations, and should take into account the factors that affect operator performance. This paper aims to develop a methodology for identifying the factors that affect operator performance under emergency situations, using aspects defined by human reliability analysis and focusing on judgments made by specialists

  20. FY1995 study of design methodology and environment of high-performance processor architectures; 1995 nendo koseino processor architecture sekkeiho to sekkei kankyo no kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-03-01

    The aim of our project is to develop high-performance processor architectures for both general-purpose and application-specific use. We also plan to develop basic software, such as compilers, and various design aid tools for those architectures. We are particularly interested in performance evaluation at the architecture design phase, design optimization, automatic generation of compilers from processor designs, and architecture design methodologies combined with circuit layout. We have investigated both microprocessor architectures and design methodologies and environments for the processors. Our goal is to establish design technologies for high-performance, low-power, low-cost and highly reliable systems in the system-on-silicon era. We have proposed the PPRAM architecture for high-performance systems using DRAM and logic mixture technology, the Softcore processor architecture for special-purpose processors in embedded systems, and the Power-Pro architecture for low-power systems. We also developed design methodologies and design environments for the above architectures, as well as a new method for design verification of microprocessors. (NEDO)

  1. Development of a cost efficient methodology to perform allocation of flammable and toxic gas detectors applying CFD tools

    Energy Technology Data Exchange (ETDEWEB)

    Storch, Rafael Brod; Rocha, Gean Felipe Almeida [Det Norske Veritas (DNV), Rio de Janeiro, RJ (Brazil); Nalvarte, Gladys Augusta Zevallos [Det Norske Veritas (DNV), Novik (Norway)

    2012-07-01

    This paper presents a computational procedure, developed by DNV, for the allocation and quantification of flammable and toxic gas detectors. The proposed methodology applies Computational Fluid Dynamics (CFD) simulations, together with the operational and safety characteristics of the analyzed region, to assess the optimal number of toxic and flammable gas detectors and their optimal locations. A probabilistic approach is also used when the flammable gas detectors are assessed, applying the DNV software ThorEXPRESSLite following NORSOK Z013 Annex G, as presented in HUSER et al. 2000 and HUSER et al. 2001. A DNV-developed program, DetLoc, runs the above procedure iteratively, leading to automatic calculation of the location and number of gas detectors. The main advantage of the presented methodology is its independence from human interaction, leading to an allocation that is more precise and free of human judgment. Thus, a reproducible allocation is generated when comparing several different analyses, and the application of global criteria is guaranteed across different regions in the same project. A case study applying the proposed methodology is presented. (author)
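The iterative, judgment-free placement that a tool like DetLoc automates can be illustrated with a greedy set-cover selection over CFD-derived detection footprints. This is a generic sketch with invented data and function names, not DNV's actual algorithm:

```python
def place_detectors(coverage, n_detectors):
    """Greedy selection of detector locations.

    coverage: dict mapping a candidate location to the set of leak scenarios
              a detector there would see (in practice derived from CFD
              dispersion simulations).
    Returns (chosen locations, scenarios covered).
    """
    chosen, covered = [], set()
    for _ in range(n_detectors):
        # pick the location that detects the most still-uncovered scenarios
        best = max(coverage, key=lambda loc: len(coverage[loc] - covered))
        if not coverage[best] - covered:
            break  # no remaining detector adds new coverage
        chosen.append(best)
        covered |= coverage[best]
    return chosen, covered

# Invented footprints: location -> leak scenarios detected there
footprints = {"A": {1, 2}, "B": {2, 3}, "C": {3, 4, 5}}
print(place_detectors(footprints, 2))  # -> (['C', 'A'], {1, 2, 3, 4, 5})
```

Greedy cover is a common heuristic for this class of problem; a probabilistic variant would weight each scenario by its leak frequency rather than treating all scenarios equally.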

  2. Evaluation of Radiology Teachers' Performance and Identification of the "Best Teachers" in a Residency Program: Mixed Methodology and Pilot Study of the MEDUC-RX32 Questionnaire.

    Science.gov (United States)

    Huete, Álvaro; Julio, Rodrigo; Rojas, Viviana; Herrera, Cristián; Padilla, Oslando; Solís, Nancy; Pizarro, Margarita; Etcheberry, Lorena; Sarfatis, Alberto; Pérez, Gonzalo; Díaz, Luis A; Delfino, Alejandro; Muñoz, Estrella; Rivera, Horacio; Parra, Dimitri A; Bitran, Marcela; Riquelme, Arnoldo

    2016-07-01

    Radiology teachers are well trained in their specialty; however, when working in academic institutions, faculty development and promotion through the education pathway tend to be based on their teaching knowledge and skills. The aim of this study is to assess the psychometric properties of the Medicina Universidad Católica-Radiology 32-item (MEDUC-RX32) questionnaire, an instrument designed to evaluate the performance of postgraduate radiology teachers and to identify the best teachers. A mixed methodology was used, including qualitative and quantitative phases. The psychometric properties of the MEDUC-RX32 survey were assessed by factor analysis (validity), Cronbach's alpha coefficient, and the G coefficient (reliability). The residents assessed their teachers and simultaneously voted for the "best teacher"; these votes were used as a gold standard for constructing receiver operating characteristic (ROC) curves comparing the votes with the global score. A total of 28 residents answered 164 surveys. The global score was 6.23 ± 0.8 (scale from 1 to 7). The factor analysis showed six domains of the residents' perception: (1) tutorial teaching, feedback, and independent learning; (2) communication and teamwork; (3) learning objectives; (4) respectful behavior; (5) radiological report; and (6) teaching and care support. The tutors' strengths were related to respectful behavior and teamwork. The instrument is highly reliable, with a Cronbach's alpha of 0.937 and a G coefficient of 0.831 (with a minimum of 8 residents). The MEDUC-RX32 instrument has a sensitivity of 91.7% and a specificity of 83.3% for identifying tutors with at least one "best teacher" vote, with an area under the ROC curve of 0.931 at a cutoff of 5.94. The MEDUC-RX32 instrument is a multidimensional, valid, and highly reliable method to evaluate radiology teachers, identifying teachers with excellence in tutorial teaching in a postgraduate radiology program.
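The cutoff-based validation reported in this record reduces to a simple confusion-matrix computation: classify teachers whose global score reaches the cutoff as "best" and compare against the vote-based gold standard. The sketch below uses invented scores and votes, not the study's data.

```python
# Hedged illustration with made-up data: computing sensitivity and specificity
# of a global-score cutoff (e.g. 5.94) against a binary gold standard such as
# "received at least one best-teacher vote".

def sensitivity_specificity(scores, is_best, cutoff):
    """Classify score >= cutoff as 'best teacher' and compare with the gold standard."""
    tp = sum(1 for s, b in zip(scores, is_best) if s >= cutoff and b)
    fn = sum(1 for s, b in zip(scores, is_best) if s < cutoff and b)
    tn = sum(1 for s, b in zip(scores, is_best) if s < cutoff and not b)
    fp = sum(1 for s, b in zip(scores, is_best) if s >= cutoff and not b)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical global scores (1-7 scale) and vote-based gold standard:
scores  = [6.5, 6.1, 5.2, 6.8, 5.7, 4.9, 6.0]
is_best = [True, False, False, True, True, False, True]
sens, spec = sensitivity_specificity(scores, is_best, cutoff=5.94)
```

Sweeping the cutoff over the observed score range and plotting sensitivity against 1 − specificity is what yields the ROC curve described above.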

  3. Cation-exchange high-performance liquid chromatography for variant hemoglobins and HbF/A2: What must hematopathologists know about methodology?

    OpenAIRE

    Sharma, Prashant; Das, Reena

    2016-01-01

    Cation-exchange high-performance liquid chromatography (CE-HPLC) is a widely used laboratory test to detect variant hemoglobins as well as quantify hemoglobins F and A2 for the diagnosis of thalassemia syndromes. Its versatility, speed, reproducibility and convenience have made CE-HPLC the method of choice to initially screen for hemoglobin disorders. Despite its popularity, several methodological aspects of the technology remain obscure to pathologists and this may have consequences in spec...

  4. The bounds on tracking performance utilising a laser-based linear and angular sensing and measurement methodology for micro/nano manipulation

    International Nuclear Information System (INIS)

    Clark, Leon; Shirinzadeh, Bijan; Tian, Yanling; Zhong, Yongmin

    2014-01-01

    This paper presents an analysis of the tracking performance of a planar three degrees of freedom (DOF) flexure-based mechanism for micro/nano manipulation, utilising a tracking methodology for the measurement of coupled linear and angular motions. The methodology permits trajectories over a workspace with large angular range through the reduction of geometric errors. However, when combining this methodology with feedback control systems, the accuracy of performed manipulations can only be stated within the bounds of the uncertainties in measurement. The dominant sources of error and uncertainty within each sensing subsystem are therefore identified, which leads to a formulation of the measurement uncertainty in the final system outputs, in addition to methods of reducing their magnitude. Specific attention is paid to the analysis of the vision-based subsystem utilised for the measurement of angular displacement. Furthermore, a feedback control scheme is employed to minimise tracking errors, and the coupling of certain measurement errors is shown to have a detrimental effect on the controller operation. The combination of controller tracking errors and measurement uncertainty provides the bounds on the final tracking performance. (paper)

  5. Response surface methodology approach for structural reliability analysis: An outline of typical applications performed at CEC-JRC, Ispra

    International Nuclear Information System (INIS)

    Lucia, A.C.

    1982-01-01

    The paper presents the main results of the work carried out at JRC-Ispra on the specific problems posed by applying the response surface methodology to the exploration of structural and nuclear reactor safety codes. Several relevant studies have been completed: assessment of the behaviour of structures under seismic events; determination of the probability of coherent blockage in LWR fuel elements due to a LOCA; analysis of ATWS consequences in PWR reactors by means of the ALMOD code; and analysis of the first wall of an experimental fusion reactor by means of the Bersafe code. (orig.)

  6. Measuring the accomplishments of public participation programs: Overview of a methodological study performed for DOE's Office of Environmental Management

    International Nuclear Information System (INIS)

    Schweitzer, M.; Carnes, S.A.; Peelle, E.B.; Wolfe, A.K.

    1997-01-01

    Recently, staff at Oak Ridge National Laboratory performed a study for the Office of Intergovernmental and Public Accountability within the U.S. Department of Energy's (DOE) Office of Environmental Management (EM), examining how to measure the success of public participation programs. While the study began with a thorough literature review, the primary emphasis of this research effort was on getting key stakeholders to help identify attributes of successful public participation in EM activities and to suggest how those attributes might be measured. Interviews were conducted at nine DOE sites that provided substantial variety in terms of geographic location, types of environmental management activities undertaken, the current life-cycle stage of those EM efforts, and the public participation mechanisms utilized. Approximately 12 to 15 oral interviews were conducted at each site, and each respondent also was asked to complete a written survey. Those interviewed included: non-regulatory state and local government officials; project managers and public participation staff for DOE and its management and operations contractors; non-government groups concerned with environmental protection, public safety, and health issues; federal and state environmental regulators; business organizations; civic groups; and other interested parties. While this study examined only those public participation programs sponsored by DOE, the resulting findings also have applicability to the public involvement efforts sponsored by many other public and private sector organizations

  7. Association between functional performance and executive cognitive functions in an elderly population including patients with low ankle–brachial index

    Directory of Open Access Journals (Sweden)

    Ferreira NV

    2015-05-01

    Naomi Vidal Ferreira,1 Paulo Jannuzzi Cunha,2 Danielle Irigoyen da Costa,3 Fernando dos Santos,1 Fernando Oliveira Costa,1 Fernanda Consolim-Colombo,4 Maria Cláudia Irigoyen1 1Heart Institute, Medical School, Universidade de São Paulo, São Paulo, SP, Brazil; 2Neuroimaging in Psychiatry Laboratory, Department of Psychiatry, Medical School, Universidade de São Paulo, São Paulo, SP, Brazil; 3Rio Grande do Sul Cardiology Institute, Fundação Universitária de Cardiologia, Porto Alegre, RS, Brazil; 4Medical School, Universidade Nove de Julho, São Paulo, SP, Brazil Introduction: Peripheral arterial disease, as measured by the ankle–brachial index (ABI), is prevalent among the elderly and is associated with functional performance, as assessed by the 6-minute walk test (6MWT). Executive cognitive function (ECF) impairments are also prevalent in this population, but no existing study has investigated the association between ECF and functional performance in an elderly population including individuals with low ABI. Aim: To investigate the association between functional performance, as measured by the 6MWT, and loss in ECF, in an elderly sample including individuals with low ABI. Method: The ABI group comprised 26 elderly individuals with low ABI (mean ABI: 0.63±0.19), and the control group comprised 40 elderly individuals with normal ABI (mean ABI: 1.08±0.07). We analyzed functional performance using the 6MWT, global cognition using the Mini-Mental State Examination (MMSE), and ECF using the Digit Span for assessing attention span and working memory, the Stroop Color Word Test (SCWT) for assessing information processing speed and inhibitory control/impulsivity, and the Controlled Oral Word Association Test (COWAT) for assessing semantic and phonemic verbal fluency. We also used a factor analysis on all of the ECF tests (global ECF). Results: Before adjustment, the ABI group performed worse on global cognition, attention span, working

  8. Development and application of the Safe Performance Index as a risk-based methodology for identifying major hazard-related safety issues in underground coal mines

    Science.gov (United States)

    Kinilakodi, Harisha

    The underground coal mining industry has been under constant watch due to the high risk involved in its activities, and scrutiny increased because of the disasters that occurred in 2006-07. In the aftermath of these incidents, the U.S. Congress passed the Mine Improvement and New Emergency Response Act of 2006 (MINER Act), which strengthened the existing regulations and mandated new laws to address various issues related to a safe working environment in the mines. Risk analysis in any form should be done on a regular basis to tackle the possibility of unwanted major hazard-related events such as explosions, outbursts, airbursts, inundations, spontaneous combustion, and roof fall instabilities. One of the responses by the Mine Safety and Health Administration (MSHA) in 2007 was a new pattern of violations (POV) process to target mines with poor safety performance, specifically to improve their safety. However, the 2010 disaster (the worst in 40 years) gave the impression that the collective effort of the industry, federal/state agencies, and researchers to achieve the goal of zero fatalities and serious injuries had gone awry. The Safe Performance Index (SPI) methodology developed in this research is a straightforward, effective, transparent, and reproducible approach that can help identify and address some of the existing issues while targeting mines with poor safety performance that need help. It combines three injury measures and three citation measures, scaled to have an equal mean (5.0), in a balanced way with proportionate weighting factors (0.05, 0.15, 0.30) and an overall normalizing factor (15), into a mine safety performance evaluation tool. It can be used to assess the relative safety-related risk of mines, including by mine-size category. Using 2008 and 2009 data, comparisons were made of SPI-associated, normalized safety performance measures across mine-size categories, with emphasis on small-mine safety performance as compared to large- and
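An SPI-style score along the lines described above can be sketched in a few lines. Only the target mean (5.0), the weights (0.05, 0.15, 0.30) and the overall factor (15) come from the abstract; the weighted-sum form, applying the factor as a multiplier, and all data are assumptions for illustration.

```python
# Hedged sketch of an SPI-style mine safety score. The combination rule and
# the sample data are assumptions; only the target mean, the weights and the
# overall factor are taken from the abstract.

def scale_to_mean(values, target_mean=5.0):
    """Rescale one measure across all mines so its mean equals target_mean."""
    mean = sum(values) / len(values)
    return [v * target_mean / mean for v in values]

def spi(injury_measures, citation_measures,
        weights=(0.05, 0.15, 0.30), factor=15.0):
    """Each argument is a list of three per-mine measure lists."""
    scaled = [scale_to_mean(m) for m in injury_measures + citation_measures]
    n_mines = len(scaled[0])
    # Pair each injury measure with its citation counterpart under one weight.
    return [factor * sum(w * (scaled[j][i] + scaled[j + 3][i])
                         for j, w in enumerate(weights))
            for i in range(n_mines)]

# Three hypothetical injury measures and three citation measures for 3 mines:
injury   = [[1.0, 2.0, 3.0], [2.0, 2.0, 2.0], [1.0, 1.0, 2.0]]
citation = [[3.0, 4.0, 5.0], [1.0, 1.0, 4.0], [2.0, 3.0, 1.0]]
scores = spi(injury, citation)
```

Because every measure is rescaled to the same mean before weighting, the resulting scores are comparable across mines regardless of each raw measure's units.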

  9. Effects of disease severity distribution on the performance of quantitative diagnostic methods and proposal of a novel 'V-plot' methodology to display accuracy values.

    Science.gov (United States)

    Petraco, Ricardo; Dehbi, Hakim-Moulay; Howard, James P; Shun-Shin, Matthew J; Sen, Sayan; Nijjer, Sukhjinder S; Mayet, Jamil; Davies, Justin E; Francis, Darrel P

    2018-01-01

    Diagnostic accuracy is widely accepted by researchers and clinicians as an optimal expression of a test's performance. The aim of this study was to evaluate the effects of disease severity distribution on values of diagnostic accuracy, as well as to propose a sample-independent methodology to calculate and display the accuracy of diagnostic tests. We evaluated the diagnostic relationship between two hypothetical methods to measure serum cholesterol (Chol_rapid and Chol_gold) by generating samples with statistical software and (1) keeping the numerical relationship between methods unchanged and (2) changing the distribution of cholesterol values. Metrics of categorical agreement were calculated (accuracy, sensitivity and specificity). Finally, a novel methodology to display and calculate accuracy values was presented (the V-plot of accuracies). No single value of diagnostic accuracy can be used to describe the relationship between tests, as accuracy is a metric heavily affected by the underlying sample distribution. Our novel proposed methodology, the V-plot of accuracies, can be used as a sample-independent measure of a test's performance against a reference gold standard.
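The central claim can be reproduced in a few lines. The sketch below (hypothetical cholesterol values, cutoff and noise model, not the authors' data) holds the rapid-vs-gold relationship fixed while changing the sample distribution, which alone shifts the measured accuracy.

```python
# Illustrative demo (not the authors' code): with the numerical relationship
# between two tests held fixed, categorical "accuracy" still changes when the
# underlying sample distribution changes.
import random

random.seed(0)

THRESHOLD = 200.0  # hypothetical decision cutoff for "high cholesterol"

def accuracy(gold_values, noise_sd=5.0):
    """Rapid test = gold value + fixed measurement noise; the relationship is
    identical for every sample. Compare categorical calls at the threshold."""
    agree = 0
    for g in gold_values:
        rapid = g + random.gauss(0.0, noise_sd)
        agree += (g >= THRESHOLD) == (rapid >= THRESHOLD)
    return agree / len(gold_values)

# Sample A: values spread far from the cutoff -> near-perfect accuracy.
far = [random.gauss(150, 10) for _ in range(2000)] + \
      [random.gauss(250, 10) for _ in range(2000)]
# Sample B: values clustered around the cutoff -> accuracy drops, although
# the rapid-vs-gold relationship is unchanged.
near = [random.gauss(200, 5) for _ in range(4000)]

print(accuracy(far), accuracy(near))
```

The same effect underlies the paper's V-plot: reporting accuracy as a function of distance from the cutoff, rather than as one sample-dependent number.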

  10. Effects of disease severity distribution on the performance of quantitative diagnostic methods and proposal of a novel ‘V-plot’ methodology to display accuracy values

    Science.gov (United States)

    Dehbi, Hakim-Moulay; Howard, James P; Shun-Shin, Matthew J; Sen, Sayan; Nijjer, Sukhjinder S; Mayet, Jamil; Davies, Justin E; Francis, Darrel P

    2018-01-01

    Background Diagnostic accuracy is widely accepted by researchers and clinicians as an optimal expression of a test’s performance. The aim of this study was to evaluate the effects of disease severity distribution on values of diagnostic accuracy as well as propose a sample-independent methodology to calculate and display accuracy of diagnostic tests. Methods and findings We evaluated the diagnostic relationship between two hypothetical methods to measure serum cholesterol (Chol_rapid and Chol_gold) by generating samples with statistical software and (1) keeping the numerical relationship between methods unchanged and (2) changing the distribution of cholesterol values. Metrics of categorical agreement were calculated (accuracy, sensitivity and specificity). Finally, a novel methodology to display and calculate accuracy values was presented (the V-plot of accuracies). Conclusion No single value of diagnostic accuracy can be used to describe the relationship between tests, as accuracy is a metric heavily affected by the underlying sample distribution. Our novel proposed methodology, the V-plot of accuracies, can be used as a sample-independent measure of a test performance against a reference gold standard. PMID:29387424

  11. Validation Methodology to Allow Simulated Peak Reduction and Energy Performance Analysis of Residential Building Envelope with Phase Change Materials: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Tabares-Velasco, P. C.; Christensen, C.; Bianchi, M.

    2012-08-01

    Phase change materials (PCM) represent a potential technology to reduce peak loads and HVAC energy consumption in residential buildings. This paper summarizes NREL efforts to obtain accurate energy simulations when PCMs are modeled in residential buildings: the overall methodology to verify and validate Conduction Finite Difference (CondFD) and PCM algorithms in EnergyPlus is presented in this study. It also shows preliminary results of three residential building enclosure technologies containing PCM: PCM-enhanced insulation, PCM impregnated drywall and thin PCM layers. The results are compared based on predicted peak reduction and energy savings using two algorithms in EnergyPlus: the PCM and Conduction Finite Difference (CondFD) algorithms.

  12. [The methodological assessment and qualitative evaluation of psychometric performance tests based on the example of modern tests that assess reading and spelling skills].

    Science.gov (United States)

    Galuschka, Katharina; Rothe, Josefine; Schulte-Körne, Gerd

    2015-09-01

    This article looks at a means of objectively evaluating the quality of psychometric tests. The approach enables users to evaluate psychometric tests based on their methodological characteristics in order to decide which instrument should be used. Reading and spelling assessment tools serve as examples. The paper also provides a review of German psychometric tests for the assessment of reading and spelling skills. This method facilitates the identification of psychometric tests of high methodological quality that can be used for the assessment of reading and spelling skills. Reading performance should ideally be assessed with the following instruments: ELFE 1-6, LGVT 6-12, LESEN 6-7, LESEN 8-9, or WLLP-R. The tests to be used for the evaluation of spelling skills are DERET 1-2+, DERET 3-4+, WRT 1+, WRT 2+, WRT 3+, WRT 4+ or HSP 1-10.

  13. High-performance control of a three-phase voltage-source converter including feedforward compensation of the estimated load current

    International Nuclear Information System (INIS)

    Leon, Andres E.; Solsona, Jorge A.; Busada, Claudio; Chiacchiarini, Hector; Valla, Maria Ines

    2009-01-01

    In this paper a new control strategy for voltage-source converters (VSC) is introduced. The proposed strategy consists of a nonlinear feedback controller based on feedback linearization plus feedforward compensation of the estimated load current. In our proposal, an energy function and the direct-axis current are chosen as outputs in order to avoid internal dynamics; in this way, full linearization is obtained via nonlinear transformation and feedback. An estimate of the load current is fed forward to improve the performance of the whole system and to reduce the required capacitor size, leading to a more rugged and cheaper implementation. The estimate is calculated using a nonlinear reduced-order observer. The proposal is validated through different tests, including performance in the presence of switching-frequency effects, measurement-filter delays, parameter uncertainties and disturbances in the input voltage.

  14. A study of methodology and its applications for the evaluation of total system performance which considered the site-information and the design-information

    International Nuclear Information System (INIS)

    Inagaki, Manabu; Ebina, Takanori

    2008-01-01

    A step-by-step approach, in which detailed investigation area(s) are selected from the preliminary investigation area(s) within volunteer sites, has been adopted for high-level radioactive waste disposal in Japan. At the stage of selecting a detailed investigation area, it must be shown that there is little concern that fault zones or groundwater flow would adversely affect the disposal facility or the stability of the geological environment. In order to evaluate the influence of groundwater flow, it is necessary to evaluate the total performance of the disposal system using the limited information available from preliminary investigations and the several design options that have been assumed for the generic geological environment. For a rational total system evaluation, it is essential to organize the known information from previous performance evaluations so that, when some part of that information changes, the scenarios, models, and parameters that may change as a result can be identified by tracing this information. In this study, as a first step, we developed a methodology for structuring the flow of information from detailed geological survey results through to performance evaluation. We attempted to structure the known information based on this methodology, and we discussed the realistic performance of the overpack by tracing this structured information. (author)

  15. Evaluation of a performance assessment methodology for low-level radioactive waste disposal facilities: Validation needs. Volume 2

    International Nuclear Information System (INIS)

    Kozak, M.W.; Olague, N.E.

    1995-02-01

    In this report, concepts on how validation fits into the scheme of developing confidence in performance assessments are introduced. A general framework for validation and confidence building in regulatory decision making is provided. It is found that traditional validation studies have a very limited role in developing site-specific confidence in performance assessments. Indeed, validation studies are shown to have a role only in the context that their results can narrow the scope of initial investigations that should be considered in a performance assessment. In addition, validation needs for performance assessment of low-level waste disposal facilities are discussed, and potential approaches to address those needs are suggested. These areas of topical research are ranked in order of importance based on relevance to a performance assessment and likelihood of success

  16. Evaluation of a performance assessment methodology for low-level radioactive waste disposal facilities: Validation needs. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Kozak, M.W.; Olague, N.E. [Sandia National Labs., Albuquerque, NM (United States)

    1995-02-01

    In this report, concepts on how validation fits into the scheme of developing confidence in performance assessments are introduced. A general framework for validation and confidence building in regulatory decision making is provided. It is found that traditional validation studies have a very limited role in developing site-specific confidence in performance assessments. Indeed, validation studies are shown to have a role only in the context that their results can narrow the scope of initial investigations that should be considered in a performance assessment. In addition, validation needs for performance assessment of low-level waste disposal facilities are discussed, and potential approaches to address those needs are suggested. These areas of topical research are ranked in order of importance based on relevance to a performance assessment and likelihood of success.

  17. Computational model and performance optimization methodology of a compact design heat exchanger used as an IHX in HTGR

    International Nuclear Information System (INIS)

    De la Torre V, R.; Francois L, J. L.

    2017-09-01

    The intermediate heat exchangers (IHX) used in high-temperature gas-cooled reactors (HTGR) face complex operating conditions, characterized by temperatures above 1073 K. Conventional shell-and-tube designs have shown disadvantages compared with compact designs. In this work, computational models of a compact heat exchanger design, the printed circuit heat exchanger, were built under IHX conditions in an HTGR installation. These models considered a detailed three-dimensional geometry corresponding to one transfer unit of the heat exchanger. Computational fluid dynamics techniques and finite element methods were used to study the thermo-hydraulic and mechanical behavior of the equipment, respectively. Material properties were defined as functions of temperature. The thermo-hydraulic results were then imposed as operating conditions in the structural calculations. A methodology was developed based on the analysis of capital and operating costs, which combines heat transfer, pressure drop and the mechanical behavior of the structure into a single optimization variable. By analyzing the experimental results of other authors, a relationship was obtained between the operating time of the equipment and the maximum stress in the structure, which was used in the model. The results show that the design with the highest thermal efficiency differs from the one with the lowest total annual cost. (Author)

  18. Advanced self-healing asphalt composites in the pavement performance field: mechanisms at the nano level and new repairing methodologies.

    Science.gov (United States)

    Agzenai, Yahya; Pozuelo, Javier; Sanz, Javier; Perez, Ignacio; Baselga, Juan

    2015-01-01

    In an effort to give a global view of this field of research, in this mini-review we highlight the most recent publications and patents focusing on modified asphalt pavements that contain certain reinforcing nanoparticles which impart desirable thermal, electrical and mechanical properties. In response to the increasing cost of asphalt binder and road maintenance, there is a need to look for alternative technologies and new asphalt composites, able to self-repair, for preserving and renewing the existing pavements. First, we will focus on the self-healing property of asphalt, the evidences that support that healing takes place immediately after the contact between the faces of a crack, and how the amount of healing can be measured in both the laboratory and the field. Next we review the hypothetical mechanisms of healing to understand the material behaviour and establish models to quantify the damage-healing process. Thereafter, we outline different technologies, nanotechnologies and methodologies used for self-healing paying particular attention to embedded micro-capsules, new nano-materials like carbon nanotubes and nano-fibres, ionomers, and microwave and induction heating processes.

  19. The Methodology of Magpies

    Science.gov (United States)

    Carter, Susan

    2014-01-01

    Arts/Humanities researchers frequently do not explain methodology overtly; instead, they "perform" it through their use of language, textual and historic cross-reference, and theory. Here, methodologies from literary studies are shown to add to Higher Education (HE) an exegetical and critically pluralist approach. This includes…

  20. Audiovisual synchrony enhances BOLD responses in a brain network including multisensory STS while also enhancing target-detection performance for both modalities

    Science.gov (United States)

    Marchant, Jennifer L; Ruff, Christian C; Driver, Jon

    2012-01-01

    The brain seeks to combine related inputs from different senses (e.g., hearing and vision), via multisensory integration. Temporal information can indicate whether stimuli in different senses are related or not. A recent human fMRI study (Noesselt et al. [2007]: J Neurosci 27:11431–11441) used auditory and visual trains of beeps and flashes with erratic timing, manipulating whether auditory and visual trains were synchronous or unrelated in temporal pattern. A region of superior temporal sulcus (STS) showed higher BOLD signal for the synchronous condition. But this could not be related to performance, and it remained unclear if the erratic, unpredictable nature of the stimulus trains was important. Here we compared synchronous audiovisual trains to asynchronous trains, while using a behavioral task requiring detection of higher-intensity target events in either modality. We further varied whether the stimulus trains had predictable temporal pattern or not. Synchrony (versus lag) between auditory and visual trains enhanced behavioral sensitivity (d') to intensity targets in either modality, regardless of predictable versus unpredictable patterning. The analogous contrast in fMRI revealed BOLD increases in several brain areas, including the left STS region reported by Noesselt et al. [2007: J Neurosci 27:11431–11441]. The synchrony effect on BOLD here correlated with the subject-by-subject impact on performance. Predictability of temporal pattern did not affect target detection performance or STS activity, but did lead to an interaction with audiovisual synchrony for BOLD in inferior parietal cortex. PMID:21953980

  1. High thermal performance lithium-ion battery pack including hybrid active–passive thermal management system for using in hybrid/electric vehicles

    International Nuclear Information System (INIS)

    Fathabadi, Hassan

    2014-01-01

    In this study, a novel Li-ion battery pack design including a hybrid active–passive thermal management system is presented. The battery pack is suitable for use in hybrid/electric vehicles. The active part of the hybrid thermal management system uses distributed thin ducts, air flow and natural convection as cooling media, while the passive part utilizes a phase change material/expanded graphite composite (PCM/EG) as the cooling/heating component to optimize the thermal performance of the proposed battery pack. The high melting enthalpy of the PCM/EG composite, together with its melting temperature of 58.9 °C, keeps the temperature distribution of the battery units within the desired range (below 60 °C). The temperature and voltage distributions in the proposed battery pack design, consisting of battery units, distributed thin ducts and PCM/EG composite, are calculated by numerically solving the related partial differential equations. Simulation results, obtained with MATLAB M-file code and plotted from the numerical data, are presented to validate the theoretical results. A comparison between the thermal and physical characteristics of the proposed battery pack and other recent works explicitly demonstrates the battery pack's performance. - Highlights: • Novel Li-ion battery pack including active and passive thermal management systems. • The battery pack has high thermal performance for ambient temperatures up to 55 °C. • Uniform temperature and voltage distributions. • The maximum observed temperature in each battery unit is lower than in other works. • The maximum temperature dispersion in each battery is lower than in other works

  2. Guidance for the application of an assessment methodology for innovative nuclear energy systems. INPRO manual - Overview of the methodology. Vol. 1 of 9 of the final report of phase 1 of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO) including a CD-ROM comprising all volumes

    International Nuclear Information System (INIS)

    2008-11-01

    The International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO) was initiated in the year 2000, based on a resolution of the IAEA General Conference (GC(44)/RES/21). The main objectives of INPRO are (1) to help to ensure that nuclear energy is available to contribute in fulfilling energy needs in the 21st century in a sustainable manner, (2) to bring together both technology holders and technology users to consider jointly the international and national actions required to achieve desired innovations in nuclear reactors and fuel cycles; and (3) to create a forum to involve all relevant stakeholders that will have an impact on, draw from, and complement the activities of existing institutions, as well as ongoing initiatives at the national and international level. This document follows the guidelines of the INPRO report 'Methodology for the assessment of innovative nuclear reactors and fuel cycles, Report of Phase 1B (first part) of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO)', IAEA-TECDOC-1434 (2004), together with its previous report Guidance for the evaluation for innovative nuclear reactors and fuel cycles, Report of Phase 1A of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO), IAEA-TECDOC-1362 (2003). This INPRO manual is comprised of an overview volume (laid out in this report), and eight additional volumes (available on a CD-ROM attached to the inside back cover of this report) covering the areas of economics (Volume 2), infrastructure (Volume 3), waste management (Volume 4), proliferation resistance (Volume 5), physical protection (Volume 6), environment (Volume 7), safety of reactors (Volume 8), and safety of nuclear fuel cycle facilities (Volume 9). The overview volume sets out the philosophy of INPRO and a general discussion of the INPRO methodology. This overview volume discusses the relationship of INPRO with the UN concept of sustainability to demonstrate how the

  3. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs
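
    The event tree analysis mentioned above can be sketched in a few lines: each event contributes conditional branch probabilities, and the probability of an end state is the product along its path. The following Python sketch uses hypothetical branch names and probabilities, not values from the SKB report.

```python
def end_state_probabilities(events):
    """Enumerate every path through a sequence of branching events and
    return its probability (the product of conditional branch probabilities)."""
    paths = {(): 1.0}
    for event in events:
        paths = {path + (outcome,): p * q
                 for path, p in paths.items()
                 for outcome, q in event.items()}
    return paths

# Hypothetical two-event tree for a repository scenario
tree = [
    {"barrier intact": 0.95, "barrier degraded": 0.05},
    {"no transport": 0.8, "transport to biosphere": 0.2},
]
paths = end_state_probabilities(tree)
```

    Because each event's branch probabilities sum to one, the probabilities over all end states also sum to one, which is a useful sanity check on any tree.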

  4. Evaluating the Effects of Clothing and Individual Equipment on Marksmanship Performance Using a Novel Five Target Methodology

    Science.gov (United States)

    2016-11-01

    operationally relevant and address the key factors for Warfighter performance. Not subject to U.S. copyright restrictions. D...11B) from the 75th Ranger Regiment. Two TPs were Aberdeen Test Center (ATC) Contractors as Representative Soldiers (CARS). One of the CARS is

  5. Effectiveness of maximal safe resection for glioblastoma including elderly and low karnofsky performance status patients. Retrospective review at a single institute

    International Nuclear Information System (INIS)

    Uzuka, Takeo; Takahashi, Hideaki; Aoki, Hiroshi; Natsumeda, Manabu; Fujii, Yukihiko

    2012-01-01

    Elderly and low Karnofsky performance status (KPS) patients have been excluded from most prospective trials. This retrospective study investigated glioblastoma treatment outcomes, including those of elderly and low KPS patients, and analyzed the prognostic factors using the medical records of 107 consecutive patients, 59 men and 48 women aged from 21 to 85 years (median 65 years), with newly diagnosed glioblastoma treated at our institute. There were 71 high-risk patients with age >70 years and/or low KPS. O6-methylguanine-deoxyribonucleic acid methyltransferase-negative status (p=0.027) and more than subtotal removal (p=0.003) were significant prognostic factors. The median postoperative KPS score tended to be better than the preoperative score, even in the high-risk group. We recommend maximal safe resection for glioblastoma patients, even those with advanced age and/or with low KPS scores. (author)

  6. Transferable and flexible label-like macromolecular memory on arbitrary substrates with high performance and a facile methodology.

    Science.gov (United States)

    Lai, Ying-Chih; Hsu, Fang-Chi; Chen, Jian-Yu; He, Jr-Hau; Chang, Ting-Chang; Hsieh, Ya-Ping; Lin, Tai-Yuan; Yang, Ying-Jay; Chen, Yang-Fang

    2013-05-21

    A newly designed transferable and flexible label-like organic memory based on a graphene electrode behaves like a sticker, and can be readily placed on desired substrates or devices for diversified purposes. The memory label reveals excellent performance despite its physical presentation. This may greatly extend the memory applications in various advanced electronics and provide a simple scheme to integrate with other electronics. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  9. Problem-based learning versus a traditional educational methodology: a comparison of preclinical and clinical periodontics performance.

    Science.gov (United States)

    Rich, Sandra K; Keim, Robert G; Shuler, Charles F

    2005-06-01

    To evaluate efficacy of a problem-based learning (PBL) pedagogy in preclinical and clinical teaching, test scores of 234 undergraduate dental students from the conventionally taught classes of 2003 and 2004 were compared with scores of 274 dental students from the PBL classes of 2005 and 2006. Although the groups' means were close together, t-test analysis of scores revealed that PBL students performed significantly better than traditional (TRAD) students on midterm (p=.0001) and final (p=.015) examinations taken on student partner/mock patients. ANOVA comparing the classes with each other showed significant differences for the midterm and final, but not for the clinical examination. Further multiple comparison tests (Tukey HSD) for the midterm and final revealed that differences specifically reflected superior performance of PBL classes against one of the TRAD classes (2004). There was no difference in performance between PBL (n=134) and TRAD (n=233) students on examinations taken with actual clinical patients who were undergoing nonsurgical periodontal treatment. Over a two-year period, PBL students rated their program instructors at a mean of 4.41 on a Likert-type scale of 1 (not helpful) to 5 (outstanding). The program provides a PBL model for teaching preclinical and clinical skills supported by a four-year evaluation of manual skills outcomes.

  10. Investigation of the performance of fermentation processes using a mathematical model including effects of metabolic bottleneck and toxic product on cells.

    Science.gov (United States)

    Sriyudthsak, Kansuporn; Shiraishi, Fumihide

    2010-11-01

    A number of recent research studies have focused on theoretical and experimental investigation of a bottleneck in a metabolic reaction network. However, there is no study on how the bottleneck affects the performance of a fermentation process when a product is highly toxic and remarkably influences the growth and death of cells. The present work therefore studies the effect of bottleneck on product concentrations under different product toxicity conditions. A generalized bottleneck model in a fed-batch fermentation is constructed including both the bottleneck and the product influences on cell growth and death. The simulation result reveals that when the toxic product strongly influences the cell growth and death, the final product concentration is hardly changed even if the bottleneck is removed, whereas it is markedly changed by the degree of product toxicity. The performance of an ethanol fermentation process is also discussed as a case example to validate this result. In conclusion, when the product is highly toxic, one cannot expect a significant increase in the final product concentration even if removing the bottleneck; rather, it may be more effective to somehow protect the cells so that they can continuously produce the product. Copyright © 2010 Elsevier Inc. All rights reserved.
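
    The qualitative argument above (when the product is highly toxic, removing the bottleneck barely raises the final titer) can be explored with a toy model. The sketch below is a minimal Euler integration of a batch fermentation with product-inhibited growth and product-dependent death; all rate constants are hypothetical, and the model is far simpler than the generalized fed-batch bottleneck model of the paper.

```python
def simulate_batch(qp, mu=0.3, kd=0.02, p_max=80.0, x0=0.1, dt=0.01, t_end=100.0):
    """Euler integration of a toy batch fermentation. Growth slows as the
    product P approaches p_max, cell death accelerates with P (toxicity),
    and the specific production rate qp is capped by a metabolic bottleneck."""
    x, p, t = x0, 0.0, 0.0
    while t < t_end:
        growth = mu * max(0.0, 1.0 - p / p_max) * x  # product-inhibited growth
        death = kd * p * x                           # product-driven death
        x += (growth - death) * dt
        p += qp * x * dt                             # bottleneck-limited production
        t += dt
    return p

# Compare a toxic product (kd > 0) with a benign one (kd = 0)
p_toxic = simulate_batch(0.1, kd=0.05)
p_benign = simulate_batch(0.1, kd=0.0)
```

    With death switched off the culture keeps producing for the whole run, so the final concentration is far higher; with strong toxicity the biomass collapses and the titer saturates regardless of the production rate cap.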

  11. Update on the methodology for Amtrak cost accounting Amtrak performance tracking (APT) : volume 2, appendices A-F.

    Science.gov (United States)

    2016-04-22

    Each table below represents the list of Cost Centers associated with each APT Subfamily at the time the data for this report was gathered from APT in April 2016. The #701 Capital Family is not included as it does not have Cost Centers in a traditiona...

  12. Optimization of Tape Winding Process Parameters to Enhance the Performance of Solid Rocket Nozzle Throat Back Up Liners using Taguchi's Robust Design Methodology

    Science.gov (United States)

    Nath, Nayani Kishore

    2017-08-01

    Throat back up liners are used to protect the nozzle structural members from the severe thermal environment in solid rocket nozzles. The throat back up liners are made with E-glass phenolic prepregs by the tape winding process. The objective of this work is to demonstrate the optimization of process parameters of the tape winding process to achieve better insulative resistance using Taguchi's robust design methodology. In this method, four control factors (machine speed, roller pressure, tape tension, and tape temperature) were investigated for the tape winding process. The presented work studied the cogency and acceptability of Taguchi's methodology in the manufacturing of throat back up liners. The quality characteristic identified was back wall temperature. Experiments were carried out using an L9 (3⁴) orthogonal array with three levels of the four control factors. The test results were analyzed using the smaller-the-better criterion for the signal-to-noise ratio in order to optimize the process. The experimental results were analyzed, confirmed, and successfully used to achieve the minimum back wall temperature of the throat back up liners. The enhancement in performance of the throat back up liners was observed by carrying out oxy-acetylene tests. The influence of back wall temperature on the performance of the throat back up liners was verified by ground firing test.
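
    The smaller-the-better signal-to-noise criterion used in Taguchi analysis is S/N = -10·log10(mean(y²)); the factor setting with the higher S/N is preferred. A minimal sketch, with hypothetical back-wall temperature readings (not data from the paper):

```python
import math

def sn_smaller_is_better(responses):
    """Taguchi signal-to-noise ratio for a 'smaller the better' quality
    characteristic: S/N = -10 * log10(mean(y^2))."""
    mean_sq = sum(y * y for y in responses) / len(responses)
    return -10.0 * math.log10(mean_sq)

# Hypothetical back-wall temperatures (deg C) for two trial settings
trial_a = [61.0, 63.0, 62.0]
trial_b = [75.0, 78.0, 76.0]
# The trial with the lower temperatures yields the higher (better) S/N ratio
```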

  13. Quantifying Averted Disability-Adjusted Life Years as a Performance Indicator for Water Quality Interventions: A Review of Current Methodologies and Challenges

    Directory of Open Access Journals (Sweden)

    Darcy M. Anderson

    2018-06-01

    Sustainable access to safe drinking water protects against infectious disease and promotes overall health. Despite considerable progress toward increasing water access, safe water quality and reliable service delivery remain a challenge. Traditional financing strategies pay implementers based on inputs and activities, with minimal incentives for water quality monitoring and sustained service operation. Pay-for-performance offers an alternative financing strategy that delivers all or a portion of payment based on performance indicators of desired outputs or outcomes. A pay-for-performance approach in the water sector could quantify and incentivize health impact. Averted disability-adjusted life years (ADALYs have been used as a performance indicator to measure the burden of disease averted due to environmental health interventions. Water-related disease burden can be measured for application as an ADALYs performance indicator following either comparative risk assessment or quantitative microbial risk assessment. Comparative risk assessment models disease burden using water source type as a proxy indicator of microbial water quality, while quantitative microbial risk assessment models disease burden using concentrations of indicator pathogens. This paper compares these risk assessment methodologies, and summarizes the limitations of applying these approaches toward quantifying ADALYs as a performance indicator for water quality interventions.
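
    As a rough illustration of how ADALYs might be computed from a quantitative microbial risk assessment, the sketch below combines an exponential dose-response model with a hypothetical DALY weight per case; every number in it is illustrative, not taken from the paper.

```python
import math

def infection_risk_exponential(dose, r=0.5):
    """Exponential dose-response model commonly used in QMRA:
    P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def averted_dalys(population, risk_before, risk_after, daly_per_case):
    """Averted DALYs = (cases before - cases after) * DALYs per case."""
    cases_before = population * risk_before
    cases_after = population * risk_after
    return (cases_before - cases_after) * daly_per_case

# Hypothetical intervention lowering the ingested pathogen dose tenfold
risk_before = infection_risk_exponential(0.02)
risk_after = infection_risk_exponential(0.002)
averted = averted_dalys(10_000, risk_before, risk_after, daly_per_case=0.05)
```

    In a pay-for-performance scheme, a quantity like `averted` would be the performance indicator against which payment is released.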

  14. Methodology and Toolset for Model Verification, Hardware/Software co-simulation, Performance Optimisation and Customisable Source-code generation

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Yu, Hao

    2013-01-01

    The MODUS project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. The MODUS tool will provide a model verification and Hardware/Software co-simulation tool (TRIAL) and a performance optimisation and customisable source-code generation tool (TUNE). The concept is depicted in automated modelling and optimisation of embedded-systems development. The tool will enable model verification by guiding the selection of existing open-source model verification engines, based on the automated analysis...

  15. Contribution to the integration methodology of environment in the small and medium enterprises or industries: evaluation of environmental performances

    International Nuclear Information System (INIS)

    Personne, M.

    1998-01-01

    The integration of environmental criteria into the operation of industrial plants is nowadays an obligation for companies. Implementation of an Environmental Management System (EMS) is a means of integrating these criteria, and registration of the system (under the ISO 14001 or EMAS standards) enables companies to demonstrate the validity of their environmental behaviour to interested parties. Our experience with Small and Medium Enterprises (SMEs) has shown a mismatch between their level of environmental integration and EMS requirements. In addition, we have observed that environmental assessment methods, which could enable SMEs to make up for lost time, were not adapted to their specificities. However, two recent approaches are innovative: the first is based on a progressive process, the second on an environmental information system based on the construction of indicators. On the basis of a study of existing methods, improved with our SME experience, our approach consists of developing an environmental integration method joining the progressive aspect (construction of a 'multi-phase' method) and information treatment (exploitation of environmental data through the construction of indicators). We propose a four-phase method (environmental performance evaluation, internal and external exploitation of results, process perpetuation), setting up an information treatment system by means of compliance, progress and monitoring indicators. Leading to the implementation of a continuous improvement cycle for environmental performance, this process enables companies to move toward EMS implementation. (author)

  16. Effects of including NaOH-treated corn straw as a substitute for wheat hay in the ration of lactating cows on performance, digestibility, and rumen microbial profile.

    Science.gov (United States)

    Jami, E; Shterzer, N; Yosef, E; Nikbachat, M; Miron, J; Mizrahi, I

    2014-03-01

    This study measured the effects of including 5% NaOH-treated corn straw (T-CS) as a substitute for 15% wheat hay in the control total mixed ration (TMR) of lactating cows on performance, digestibility, and rumen microbial profile. Two groups of 21 cows each, similar in initial performance, were fed individually 1 of the 2 TMR examined. Voluntary dry matter intake of cows fed the control TMR was 4.3% higher than that of the T-CS cows, but in vivo dry matter and organic matter digestibilities of both groups were similar. Crude protein digestibility was higher in the control cows, but digestibility of neutral detergent fiber polysaccharides (cellulose and hemicelluloses) was higher in the T-CS TMR. This was accompanied by a 4.6% reduction in rumination time of the T-CS group. A slightly higher milk yield was observed in the control cows compared with the T-CS group; however, milk fat and milk protein content were higher in cows fed the T-CS TMR. This was reflected in a 1.3% increase in energy-corrected milk yield and a 5.34% increase in production efficiency (energy-corrected milk yield/intake) of the T-CS cows compared with the control. Welfare of the cows, as assessed by length of daily recumbency time, was improved by feeding the T-CS TMR relative to the control group. As a whole, the rumen bacterial community was significantly modulated in the T-CS group in the experimental period compared with the preexperimental period, whereas the bacterial community of the control group remained unchanged during this period. Of the 8 bacterial species that were quantified using real-time PCR, a notable decrease in cellulolytic bacteria was observed in the T-CS group, as well as an increase in lactic acid-utilizing bacteria. These results illustrate the effect of T-CS on the composition of rumen microbiota, which may play a role in improving the performance of the lactating cow. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  17. Simultaneous analysis of perfluoroalkyl and polyfluoroalkyl substances including ultrashort-chain C2 and C3 compounds in rain and river water samples by ultra performance convergence chromatography.

    Science.gov (United States)

    Yeung, Leo W Y; Stadey, Christopher; Mabury, Scott A

    2017-11-03

    An analytical method using ultra performance convergence chromatography (UPC²) coupled to a tandem mass spectrometer operated in negative electrospray mode was developed to measure perfluoroalkyl and polyfluoroalkyl substances (PFASs), including the ultrashort-chain PFASs (C2-C3). Compared to the existing liquid chromatography tandem mass spectrometry method using an ion exchange column, the new method has a lower detection limit (0.4 pg trifluoroacetate (TFA) on-column), narrower peak width (3-6 s), and a shorter run time (8 min). Using the same method, different classes of PFASs (e.g., perfluoroalkyl sulfonates (PFSAs) and perfluorinated carboxylates (PFCAs), perfluorinated phosphonates (PFPAs) and phosphinates (PFPiAs), polyfluoroalkyl phosphate diesters (diPAPs)) can be measured in a single analysis. Rain (n=2) and river water (n=2) samples collected in Toronto, ON, were used for method validation and application. Results showed that short-chain PFASs (C2-C7 PFCAs and C4 PFSA) contributed over 80% of the detectable PFASs in rain samples, and the C2-C3 PFASs alone accounted for over 40% of the total. Reports on environmental levels of these ultrashort-chain PFASs are relatively scarce. The relatively large contribution of these ultrashort-chain PFASs to the total PFASs indicates the need to include the measurement of short-chain PFASs, especially C2 and C3 PFASs, in environmental monitoring. The sources of TFA and other short-chain PFASs in the environment are not entirely clear. The newly developed analytical method may help further investigation of the sources and the environmental levels of these ultrashort-chain PFASs. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Application of response surface methodology in optimization of performance and exhaust emissions of secondary butyl alcohol-gasoline blends in SI engine

    International Nuclear Information System (INIS)

    Yusri, I.M.; Mamat, R.; Azmi, W.H.; Omar, A.I.; Obed, M.A.; Shaiful, A.I.M.

    2017-01-01

    Highlights: • Adding 2-butanol in gasoline fuel can improve engine performance. • 2-Butanol addition reduced NOx, CO, and HC but produced higher CO2. • RSM was applied to optimize the engine performance and exhaust emissions. - Abstract: Producing an optimal balance between engine performance and exhaust emissions has always been one of the main challenges in automotive technology. This paper examines the use of RSM (response surface methodology) to optimize the engine performance and exhaust emissions of a spark-ignition (SI) engine which operates with 2-butanol–gasoline blends of 5%, 10%, and 15%, called GBu5, GBu10, and GBu15. In the experiments, the engine ran at various speeds for each test fuel and 13 different conditions were constructed. The optimization of the independent variables was performed by means of a statistical tool known as DoE (design of experiments). The desirability approach by RSM was employed with the aim of minimizing emissions and maximizing performance parameters. Based on the RSM model, performance characteristics revealed that increments of 2-butanol in the blended fuels lead to increasing trends of brake power, brake mean effective pressure, and brake thermal efficiency. Nonetheless, marginally higher brake specific fuel consumption was observed. Furthermore, the RSM model suggests that the presence of 2-butanol exhibits a decreasing trend of nitrogen oxides, carbon monoxide, and unburnt hydrocarbons; however, an increasing trend was observed for carbon dioxide exhaust emissions. It was established from the study that the GBu15 blend at an engine speed of 3205 rpm was found to be optimal to provide the best performance and emissions characteristics as compared to the other tested blends.
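
    The desirability approach referred to above scores each response on a 0-1 scale and combines the scores by a geometric mean. A minimal sketch of Derringer-style desirability functions, with hypothetical response values and limits (not the study's data):

```python
def desirability_maximize(y, lo, hi):
    """Desirability for a response to be maximized: 0 below lo,
    1 above hi, linear in between."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return (y - lo) / (hi - lo)

def desirability_minimize(y, lo, hi):
    """Desirability for a response to be minimized: 1 below lo,
    0 above hi, linear in between."""
    if y <= lo:
        return 1.0
    if y >= hi:
        return 0.0
    return (hi - y) / (hi - lo)

def composite(desirabilities):
    """Overall desirability = geometric mean of the individual values."""
    prod = 1.0
    for d in desirabilities:
        prod *= d
    return prod ** (1.0 / len(desirabilities))

# Hypothetical operating point: brake power (maximize), NOx (minimize)
d_power = desirability_maximize(52.0, lo=40.0, hi=60.0)
d_nox = desirability_minimize(120.0, lo=100.0, hi=200.0)
overall = composite([d_power, d_nox])
```

    An optimizer then searches the factor space (here, blend ratio and engine speed) for the settings that maximize the overall desirability.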

  19. Process performance assessment of advanced anaerobic digestion of sewage sludge including sequential ultrasound-thermal (55 °C) pre-treatment.

    Science.gov (United States)

    Neumann, Patricio; Barriga, Felipe; Álvarez, Claudia; González, Zenón; Vidal, Gladys

    2018-03-15

    The aim of this study was to evaluate the performance and digestate quality of advanced anaerobic digestion of sewage sludge including sequential ultrasound-thermal (55 °C) pre-treatment. Both stages of pre-treatment contributed to chemical oxygen demand (COD) solubilization, with an overall factor of 11.4 ± 2.2%. Pre-treatment led to 19.1, 24.0 and 29.9% increased methane yields at 30, 15 and 7.5 days solid retention times (SRT), respectively, without affecting process stability or accumulation of intermediates. Pre-treatment decreased water recovery from the digestate by up to 4.2%, but SRT was a more relevant factor controlling dewatering. Advanced digestion showed 2.4-3.1 and 1.5 logarithmic removals of coliforms and coliphages, respectively, and up to a 58% increase in the concentration of inorganics in the digestate solids compared to conventional digestion. The COD balance of the process showed that the observed increase in methane production was proportional to the pre-treatment solubilization efficiency. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Development of a Novel, Sensitive, Selective, and Fast Methodology to Determine Malondialdehyde in Leaves of Melon Plants by Ultra-High-Performance Liquid Chromatography-Tandem Mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Melisa E. Yonny

    2017-01-01

    Early production of melon plants (Cucumis melo) is carried out using tunnel structures, where extreme temperatures lead to high reactive oxygen species production and, hence, oxidative stress. Malondialdehyde (MDA) is a recognized biomarker of the advanced oxidative status in a biological system. Thus a reliable, sensitive, simple, selective, and rapid separative strategy based on ultra-high-performance liquid chromatography coupled to positive electrospray-tandem mass spectrometry (UPLC-(+)ESI-MS/MS) was developed for the first time to measure MDA, without derivatization, in leaves of melon plants exposed to stress conditions. The detection and quantitation limits were 0.02 μg·L−1 and 0.08 μg·L−1, respectively, which was demonstrated to be better than the methodologies currently reported in the literature. The accuracy values were between 96% and 104%. The intraday and interday precision values were 2.7% and 3.8%, respectively. The optimized methodology was applied to monitoring changes in MDA levels between control melon leaf samples and samples exposed to thermal stress conditions. Important preliminary conclusions were obtained. In addition, a comparison between MDA levels in melon leaves quantified by the proposed method and the traditional thiobarbituric acid reactive species (TBARS) approach was undertaken. MDA determination by TBARS could lead to unrealistic conclusions regarding the oxidative stress status in plants.

  1. Cation-exchange high-performance liquid chromatography for variant hemoglobins and HbF/A2: What must hematopathologists know about methodology?

    Science.gov (United States)

    Sharma, Prashant; Das, Reena

    2016-03-26

    Cation-exchange high-performance liquid chromatography (CE-HPLC) is a widely used laboratory test to detect variant hemoglobins as well as quantify hemoglobins F and A2 for the diagnosis of thalassemia syndromes. Its versatility, speed, reproducibility and convenience have made CE-HPLC the method of choice to initially screen for hemoglobin disorders. Despite its popularity, several methodological aspects of the technology remain obscure to pathologists, and this may have consequences in specific situations. This paper discusses the basic principles of the technique, the initial quality control steps, and the interpretation of various controls and variables that are available on the instrument output. Subsequent sections are devoted to methodological considerations that arise during reporting of cases. For instance, common problems of misidentified peaks, totals crossing 100%, causes of total area being above or below acceptable limits, and the importance of pre-integration region peaks are dealt with. Ultimately, CE-HPLC remains an investigation, the reporting of which combines in-depth knowledge of the biological basics with more than a working knowledge of the technological aspects of the technique.

  2. Criticality accident studies and methodology implemented at the CEA

    International Nuclear Information System (INIS)

    Barbry, Francis; Fouillaud, Patrick; Reverdy, Ludovic; Mijuin, Dominique

    2003-01-01

    Based on the studies and results of experimental programs performed since 1967 in the CRAC, then SILENE facilities, the CEA has devised a methodology for criticality accident studies. This methodology integrates all the main focuses of its approach, from criticality accident phenomenology to emergency planning and response, and thus includes aspects such as criticality alarm detector triggering, airborne releases, and irradiation risk assessment. (author)

  3. Methodological modifications on quantification of phosphatidylethanol in blood from humans abusing alcohol, using high-performance liquid chromatography and evaporative light scattering detection

    Directory of Open Access Journals (Sweden)

    Aradottir Steina

    2005-09-01

    Background: Phosphatidylethanol (PEth) is an abnormal phospholipid formed slowly in cell membranes by a transphosphatidylation reaction from phosphatidylcholine in the presence of ethanol, catalyzed by the enzyme phospholipase D. PEth in blood is a promising new marker of ethanol abuse owing to its high specificity and sensitivity. None of the biological markers currently used in clinical routine are sensitive and specific enough for the diagnosis of alcohol abuse. The method for PEth analysis includes lipid extraction of whole blood, a one-hour HPLC separation of lipids, and ELSD (evaporative light scattering detection) of PEth. Results: Methodological improvements are presented which comprise a simpler extraction procedure, the use of phosphatidylbutanol as internal standard, and a new algorithm for evaluation of unknown samples. It is further demonstrated that equal test results are obtained with blood collected in standard test tubes with EDTA as with the previously used heparinized test tubes. The PEth content in blood samples is stable for three weeks in the refrigerator. Conclusion: Methodological changes make the method more suitable for routine laboratory use, lower the limit of quantification (LOQ), and improve precision.
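
    Quantitation against an internal standard such as phosphatidylbutanol typically follows the area-ratio formula conc = (A_analyte / A_IS) · C_IS / RRF. The sketch below illustrates the arithmetic with hypothetical peak areas and a hypothetical relative response factor; it is not the paper's evaluation algorithm.

```python
def quantify_with_internal_standard(area_analyte, area_is, conc_is=1.0, rrf=1.0):
    """Analyte concentration from peak areas with an internal standard:
    conc = (A_analyte / A_IS) * C_IS / RRF, where RRF is the relative
    response factor determined from calibration."""
    return (area_analyte / area_is) * conc_is / rrf

# Hypothetical chromatogram: analyte area 4500, internal standard area 9000
conc = quantify_with_internal_standard(4500.0, 9000.0, conc_is=2.0, rrf=0.8)
```

    The internal standard compensates for losses during extraction and for detector drift, since both analyte and standard are affected proportionally.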

  4. On methodology

    DEFF Research Database (Denmark)

    Cheesman, Robin; Faraone, Roque

    2002-01-01

    This is an English version of the methodology chapter in the authors' book "El caso Berríos: Estudio sobre información errónea, desinformación y manipulación de la opinión pública".

  5. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Examination of tapered plastic multimode fiber-based sensor performance with silver coating for different concentrations of calcium hypochlorite by soft computing methodologies--a comparative study.

    Science.gov (United States)

    Zakaria, Rozalina; Sheng, Ong Yong; Wern, Kam; Shamshirband, Shahaboddin; Wahab, Ainuddin Wahid Abdul; Petković, Dalibor; Saboohi, Hadi

    2014-05-01

    A soft computing methodology study has been applied to tapered plastic multimode sensors. This study used tapered plastic multimode fiber [polymethyl methacrylate (PMMA)] optics as a sensor. The tapered PMMA fiber was fabricated using an etching method involving deionized water and acetone to achieve a waist diameter and length of 0.45 and 10 mm, respectively. In addition, a tapered PMMA probe coated with silver film was fabricated and demonstrated using a calcium hypochlorite (G70) solution. The working mechanism of such a device is based on the observed increment in the transmission of the sensor when immersed in solutions at high concentrations. As the concentration was varied from 0 to 6 ppm, the output voltage of the sensor increased linearly. The silver film coating increased the sensitivity of the proposed sensor because of the effective cladding refractive index, which increases with the coating and thus allows more light to be transmitted from the tapered fiber. In this study, the polynomial and radial basis function (RBF) were applied as the kernel function of the support vector regression (SVR) to estimate and predict the output voltage response of the sensors with and without silver film according to experimental tests. Instead of minimizing the observed training error, SVR_poly and SVR_rbf were used in an attempt to minimize the generalization error bound so as to achieve generalized performance. An adaptive neuro-fuzzy inference system (ANFIS) approach was also investigated for comparison. The experimental results showed that improvements in the predictive accuracy and capacity for generalization can be achieved by the SVR_poly approach in comparison to the SVR_rbf methodology. The same testing errors were found for the SVR_poly approach and the ANFIS approach.
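
    The two SVR variants compared in the study differ only in how similarity between inputs is computed. A minimal sketch of the polynomial and RBF kernel functions themselves (the hyperparameter values are illustrative, not those of the study):

```python
import math

def poly_kernel(x, z, degree=3, coef0=1.0):
    """Polynomial kernel K(x, z) = (x . z + coef0) ** degree."""
    dot = sum(a * b for a, b in zip(x, z))
    return (dot + coef0) ** degree

def rbf_kernel(x, z, gamma=0.5):
    """Gaussian RBF kernel K(x, z) = exp(-gamma * ||x - z||^2)."""
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, z))
    return math.exp(-gamma * sq_dist)

x = [1.0, 0.0]
z = [0.0, 1.0]
same = rbf_kernel(x, x)      # identical inputs give maximal RBF similarity
cross = rbf_kernel(x, z)
```

    The SVR model is then a weighted sum of kernel evaluations between a new input and the support vectors; swapping `poly_kernel` for `rbf_kernel` is what distinguishes SVR_poly from SVR_rbf.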

  7. Development of methodology to construct a generic conceptual model of river-valley evolution for performance assessment of HLW geological disposal

    International Nuclear Information System (INIS)

    Kawamura, Makoto; Tanikawa, Shin-ichi; Yasue, Ken-ichi; Niizato, Tadafumi

    2011-01-01

    In order to assess the long-term safety of a geological disposal system for high-level radioactive waste (HLW), it is important to consider the impact of uplift and erosion, which cannot be precluded on a timescale in the order of several hundred thousand years for many locations in Japan. Geomorphic evolution, caused by uplift and erosion and coupled to climatic and sea-level changes, will impact the geological disposal system due to resulting spatial and temporal changes in the disposal environment. Degradation of HLW barrier performance will be particularly significant when the remnant repository structures near, and are eventually exposed at, the ground surface. In previous studies, fluvial erosion was identified as the key concern in most settings in Japan. Interpretation of the impact of the phenomena at relevant locations in Japan has led to development of a generic conceptual model which contains the features typical of the middle reaches of rivers. Here, therefore, we present a methodology for development of a generic conceptual model based on best current understanding of fluvial erosion in Japan, which identifies the simplifications and uncertainties involved and assesses their consequences in the context of repository performance. (author)

  8. Methodology for ranking restoration options

    International Nuclear Information System (INIS)

    Hedemann Jensen, Per

    1999-04-01

    The work described in this report has been performed as a part of the RESTRAT Project FI4P-CT95-0021a (PL 950128) co-funded by the Nuclear Fission Safety Programme of the European Commission. The RESTRAT project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated with radioactive materials as a result of the operation of these installations. The areas considered for remedial measures include contaminated land areas, rivers and sediments in rivers, lakes, and sea areas. Five contaminated European sites have been studied. Various remedial measures have been envisaged with respect to the optimisation of the protection of the populations being exposed to the radionuclides at the sites. Cost-benefit analysis and multi-attribute utility analysis have been applied for optimisation. Health, economic and social attributes have been included and weighting factors for the different attributes have been determined by the use of scaling constants. (au)
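The multi-attribute utility analysis described above, in which health, economic and social attributes are weighted and combined to rank restoration options, can be sketched as a simple additive utility model. The option names, attribute scores and weighting factors below are invented placeholders, not values from the RESTRAT project.

```python
# Hypothetical restoration options with normalized attribute scores in [0, 1]
# (higher = better outcome on that attribute).
options = {
    "soil removal":        {"health": 0.9, "economic": 0.3, "social": 0.5},
    "natural attenuation": {"health": 0.4, "economic": 0.9, "social": 0.7},
}

# Illustrative weighting factors (scaling constants), summing to 1.
weights = {"health": 0.5, "economic": 0.3, "social": 0.2}

def utility(attrs):
    """Weighted additive utility of one option."""
    return sum(weights[k] * attrs[k] for k in weights)

# Rank options from highest to lowest utility.
ranked = sorted(options, key=lambda o: utility(options[o]), reverse=True)
```

In a real application the scores would come from the radiological impact assessment and the weights from elicitation of the scaling constants, as the report describes.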

  9. [Lung Abscess with Acute Empyema Which Improved after Performing Video-Assisted Thoracic Surgery (Including Pneumonotomy and Lung Abscess Drainage); Report of a Case].

    Science.gov (United States)

    Gabe, Atsushi; Nagamine, Naoji

    2017-05-01

    We herein report the case of a patient demonstrating a lung abscess with acute empyema which improved after performing pneumonotomy and lung abscess drainage. A 60-year-old male was referred to our hospital to receive treatment for a lung abscess with acute empyema. At surgery, the lung parenchyma was slightly torn with pus leakage. After drainage of the lung abscess by enlarging the injured part, curettage in the thoracic cavity and decortication were performed. The postoperative course was uneventful. Direct drainage of an abscess into the thoracic cavity is thought to be a choice for the treatment of lung abscesses.

  10. Analysis of 6-mercaptopurine in human plasma with a high-performance liquid chromatographic method including post-column derivatization and fluorimetric detection

    NARCIS (Netherlands)

    Jonkers, R. E.; Oosterhuis, B.; ten Berge, R. J.; van Boxtel, C. J.

    1982-01-01

    A relatively simple assay with improved reliability and sensitivity for measuring levels of 6-mercaptopurine in human plasma is presented. After extraction of the compound and the added internal standard with phenyl mercury acetate, samples were separated by ion-pair reversed-phase high-performance liquid chromatography.

  11. Boring of full scale deposition holes at the Aespoe Hard Rock Laboratory. Operational experiences including boring performance and a work time analysis

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Christer [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Johansson, Aasa [SWECO, Stockholm (Sweden)

    2002-12-01

    Thirteen experimental deposition holes similar to those in the present KBS-3 design have been bored at the Aespoe Hard Rock Laboratory, Oskarshamn, Sweden. The objective of the boring program was to test and demonstrate the current technique for boring of large vertical holes in granitic rock. Conclusions and results from this project are used in the planning process for the deposition holes that will be bored in the real repository for spent nuclear fuel. The boreholes are also important for three major projects. The Prototype Repository, the Canister Retrieval Test and the Demonstration project will all need full-scale deposition holes for their commissioning. The holes are bored in full scale and have a diameter of 1.75 m and a depth of 8.5 m. To bore the holes an existing TBM design was modified to produce a novel type of Shaft Boring Machine (SBM) suitable for boring 1.75 m diameter holes from a relatively small tunnel. The cutter head was equipped with two types of roller cutters: two-row carbide button cutters and disc cutters. Removal of the cuttings was made with a vacuum suction system. The boring was monitored and boring parameters recorded by a computerised system for the evaluation of the boring performance. During boring of four of the holes temperature, stress and strain measurements were performed. Acoustic emission measurements were also performed during boring of these four holes. The results of these activities will not be discussed in this report since they are reported separately. Criteria regarding nominal borehole diameter, deviation of start and end centre point, surface roughness and performance of the machine were set up according to the KBS-3 design and were fulfilled with a fair margin. The average total time for boring one deposition hole during this project was 105 hours.

  12. Boring of full scale deposition holes at the Aespoe Hard Rock Laboratory. Operational experiences including boring performance and a work time analysis

    International Nuclear Information System (INIS)

    Andersson, Christer; Johansson, Aasa

    2002-12-01

    Thirteen experimental deposition holes similar to those in the present KBS-3 design have been bored at the Aespoe Hard Rock Laboratory, Oskarshamn, Sweden. The objective of the boring program was to test and demonstrate the current technique for boring of large vertical holes in granitic rock. Conclusions and results from this project are used in the planning process for the deposition holes that will be bored in the real repository for spent nuclear fuel. The boreholes are also important for three major projects. The Prototype Repository, the Canister Retrieval Test and the Demonstration project will all need full-scale deposition holes for their commissioning. The holes are bored in full scale and have a diameter of 1.75 m and a depth of 8.5 m. To bore the holes an existing TBM design was modified to produce a novel type of Shaft Boring Machine (SBM) suitable for boring 1.75 m diameter holes from a relatively small tunnel. The cutter head was equipped with two types of roller cutters: two-row carbide button cutters and disc cutters. Removal of the cuttings was made with a vacuum suction system. The boring was monitored and boring parameters recorded by a computerised system for the evaluation of the boring performance. During boring of four of the holes temperature, stress and strain measurements were performed. Acoustic emission measurements were also performed during boring of these four holes. The results of these activities will not be discussed in this report since they are reported separately. Criteria regarding nominal borehole diameter, deviation of start and end centre point, surface roughness and performance of the machine were set up according to the KBS-3 design and were fulfilled with a fair margin. The average total time for boring one deposition hole during this project was 105 hours.

  13. A methodology for social experimentation

    DEFF Research Database (Denmark)

    Ravn, Ib

    A methodology is outlined whereby one may improve the performance of a social system to the satisfaction of its stakeholders, that is, facilitate desirable social and organizational transformations.

  14. MIRD methodology

    International Nuclear Information System (INIS)

    Rojo, Ana M.; Gomez Parada, Ines

    2004-01-01

    The MIRD (Medical Internal Radiation Dose) system was established by the Society of Nuclear Medicine of USA in 1960 to assist the medical community in the estimation of the dose in organs and tissues due to the incorporation of radioactive materials. Since then, 'MIRD Dose Estimate Report' (from the 1 to 12) and 'Pamphlets', of great utility for the dose calculations, were published. The MIRD system was planned essentially for the calculation of doses received by the patients during nuclear medicine diagnostic procedures. The MIRD methodology for the absorbed doses calculations in different tissues is explained

  15. PSA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Magne, L

    1997-12-31

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs.

  16. PSA methodology

    International Nuclear Information System (INIS)

    Magne, L.

    1996-01-01

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs

  17. Analysis of 6-mercaptopurine in human plasma with a high-performance liquid chromatographic method including post-column derivatization and fluorimetric detection.

    Science.gov (United States)

    Jonkers, R E; Oosterhuis, B; ten Berge, R J; van Boxtel, C J

    1982-12-10

    A relatively simple assay with improved reliability and sensitivity for measuring levels of 6-mercaptopurine in human plasma is presented. After extraction of the compound and the added internal standard with phenyl mercury acetate, samples were separated by ion-pair reversed-phase high-performance liquid chromatography. On-line the analytes were oxidized to fluorescent products and detected in a flow-fluorimeter. The within-day coefficient of variation was 3.8% at a concentration of 25 ng/ml. The lower detection limit was 2 ng/ml when 1.0 ml of plasma was used. Mercaptopurine concentration versus time curves of two subjects after a single oral dose of azathioprine are shown.

  18. Separation of three anthraquinone glycosides including two isomers by preparative high-performance liquid chromatography and high-speed countercurrent chromatography from Rheum tanguticum Maxim. ex Balf.

    Science.gov (United States)

    Chen, Tao; Li, Hongmei; Zou, Denglang; Liu, Yongling; Chen, Chen; Zhou, Guoying; Li, Yulin

    2016-08-01

    Anthraquinone glycosides, such as chrysophanol 1-O-β-d-glucoside, chrysophanol 8-O-β-d-glucoside, and physcion 8-O-β-d-glucoside, are accepted as important active components of Rheum tanguticum Maxim. ex Balf. due to their pharmacological properties: antifungal, antimicrobial, cytotoxic, and antioxidant activities. However, an effective method for the separation of the above-mentioned anthraquinone glycosides from this herb is not currently available. In particular, separating the two isomers chrysophanol 1-O-β-d-glucoside and chrysophanol 8-O-β-d-glucoside proved especially difficult. This study demonstrated an efficient strategy based on preparative high-performance liquid chromatography and high-speed countercurrent chromatography for the separation of the above-mentioned anthraquinone glycosides from Rheum tanguticum Maxim. ex Balf. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Effect of including sweet potato (Ipomoea batatas Lam) meal in finishing pig diets on growth performance, carcass traits and pork quality.

    Science.gov (United States)

    Pietrosemoli, Silvana; Moron-Fuenmayor, Oneida Elizabeth; Paez, Angel; Villamide, Maria Jesús

    2016-10-01

    The partial replacement of a commercial concentrate at 10-20% and 15-30% (the first percentage of each dietary treatment corresponded to weeks 1-3 and the second to weeks 4-7 of the experiment, respectively) by sweet potato meal (SPM; 70% foliage: 30% roots) was evaluated for growth performance, carcass yield, instrumental and sensory pork quality using 36 commercial crossbred pigs (56.8 ± 1.3 kg initial body weight). Three dietary treatments were compared in a randomized complete block design. Most growth, carcass traits and pork quality variables were not affected by the SPM inclusion. Growth performance averaged 868 g/day and feed efficiency 0.24 kg/kg. However, feed intake increased 2.2% (P = 0.04) in pigs fed the 10-20% SPM diets, of a similar order of magnitude to the decrease in dietary energy. Despite an increase in gastrointestinal tract as a percent of hot carcass weight (+14.7%) (P = 0.03) with SPM inclusion, carcass yield averaged 69.4%. Conversely, decreases in loin yield (-4.2%) (P = 0.05), backfat thickness (-6.0%) (P < 0.01) and pork tenderness (-13%) (P = 0.02) were observed with 15-30% SPM inclusion. Results suggest that up to 20% SPM inclusion is a viable feed strategy for finishing pigs, easily replicable in small farm settings. © 2016 Japanese Society of Animal Science.

  20. NUSAM Methodology for Assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Leach, Janice [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Snell, Mark K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-07-01

    This document provides a methodology for the performance-based assessment of security systems designed for the protection of nuclear and radiological materials and the processes that produce and/or involve them. It is intended for use with both relatively simple installations and with highly regulated complex sites with demanding security requirements.

  1. The effects of performance criteria including accounting, market, and economy on the quality of financial reporting: A case study on Tehran Stock Exchange

    Directory of Open Access Journals (Sweden)

    Mohammad Mahdi Hosseini

    2013-01-01

    This research studies the effects of performance criteria (accounting, market and economy) on the quality of financial reporting in Iran. To evaluate the financial reporting quality variable, the scores given to each company are applied based on the checklist introduced by the Iranian Association of Certified Public Accountants and used for the disclosure of the information of the annual financial statements of companies. The statistical population of this research consists of the companies listed on Tehran Stock Exchange over the period 2006-2011. This research, which is classified as applied research, uses multivariate regression tests. The data and hypotheses of this research are analyzed and tested using correlation tests and mean difference tests. The results of the tests conducted on 99 companies indicate that there is a significant and positive relation between the rate of return on equity and the quality of financial reporting. There is also a significant and positive relation between earnings per share and the quality of financial reporting. However, there is no relationship between QTOBIN and the quality of financial reporting. Finally, our results indicate there is a significant and positive relation between market value-added and the quality of financial reporting.

  2. Sensitive determination of iodine species, including organo-iodine, for freshwater and seawater samples using high performance liquid chromatography and spectrophotometric detection

    International Nuclear Information System (INIS)

    Schwehr, Kathleen A.; Santschi, Peter H.

    2003-01-01

    In order to more effectively use iodine isotope ratios, 129 I/ 127 I, as hydrological and geochemical tracers in aquatic systems, a new high performance liquid chromatography (HPLC) method was developed for the determination of iodine speciation. The dissolved iodine species that dominate natural water systems are iodide, iodate, and organic iodine. Using this new method, iodide was determined directly by combining anion exchange chromatography and spectrophotometry. Iodate and the total of organic iodine species are determined as iodide, with minimal sample preparation, compared to existing methods. The method has been applied to quantitatively determine iodide, iodate as the difference of total inorganic iodide and iodide after reduction of the sample by NaHSO 3 , and organic iodine as the difference of total iodide (after organic decomposition by dehydrohalogenation and reduction by NaHSO 3 ) and total inorganic iodide. Analytical accuracy was tested: (1) against certified reference material, SRM 1549, powdered milk (NIST); (2) through the method of standard additions; and (3) by comparison to values of environmental waters measured independently by inductively coupled plasma mass spectrometry (ICP-MS). The method has been successfully applied to measure the concentrations of iodide species in rain, surface and ground water, estuarine and seawater samples. The detection limit was ∼1 nM (0.2 ppb), with less than 3% relative standard deviation (R.S.D.) for samples determined by standard additions to an iodide solution of 20 nM in 0.1 M NaCl. This technique is one of the few methods sensitive enough to accurately quantify stable iodine species at nanomolar concentrations in aquatic systems across a range of matrices, and to quantitatively measure organic iodine. Additionally, this method makes use of a very dilute mobile phase, and may be applied to small sample volumes without pre-column concentration or post-column reactions
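The species-by-difference bookkeeping described above (iodide measured directly; iodate obtained as the difference between total inorganic iodide and directly measured iodide; organic iodine as the difference between total iodide and total inorganic iodide) reduces to simple arithmetic. The helper function and the nanomolar values below are hypothetical illustrations, not data from the study.

```python
def speciate(iodide, total_inorganic, total_iodine):
    """Derive iodate and organic iodine by difference, all in the same units.

    iodide          -- iodide measured directly
    total_inorganic -- total inorganic iodide after NaHSO3 reduction of the sample
    total_iodine    -- total iodide after organic decomposition plus reduction
    """
    iodate = total_inorganic - iodide
    organic = total_iodine - total_inorganic
    return iodate, organic

# Hypothetical concentrations in nM for one water sample:
iodate_nM, organic_nM = speciate(iodide=20.0, total_inorganic=55.0, total_iodine=80.0)
```

Because each species is a difference of two measurements, its uncertainty combines the uncertainties of both measurements, which is why the low per-measurement R.S.D. reported above matters.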

  3. Proposed methodology for completion of scenario analysis for the Basalt Waste Isolation Project

    International Nuclear Information System (INIS)

    Roberds, W.J.; Plum, R.J.; Visca, P.J.

    1984-11-01

    This report presents the methodology to complete an assessment of postclosure performance, considering all credible scenarios, including the nominal case, for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening scenarios, and for then assessing the risks associated with each. The results of the scenario analysis are used to comprehensively determine system performance and/or risk for evaluation of compliance with postclosure performance criteria (10 CFR 60 and 40 CFR 191). In addition to describing the proposed methodology, this report reviews available methodologies for scenario analysis, discusses pertinent performance assessment and uncertainty concepts, advises how to implement the methodology (including the organizational requirements and a description of tasks) and recommends how to use the methodology in guiding future site characterization, analysis, and engineered subsystem design work. 36 refs., 24 figs., 1 tab

  4. How can activity-based costing methodology be performed as a powerful tool to calculate costs and secure appropriate patient care?

    Science.gov (United States)

    Lin, Blossom Yen-Ju; Chao, Te-Hsin; Yao, Yuh; Tu, Shu-Min; Wu, Chun-Ching; Chern, Jin-Yuan; Chao, Shiu-Hsiung; Shaw, Keh-Yuong

    2007-04-01

    Previous studies have shown the advantages of using activity-based costing (ABC) methodology in the health care industry. The potential values of ABC methodology in health care are derived from the more accurate cost calculation compared to traditional step-down costing, and the potential to evaluate quality or effectiveness of health care based on health care activities. This project used ABC methodology to profile the cost structure of inpatients with surgical procedures at the Department of Colorectal Surgery in a public teaching hospital, and to identify the missing or inappropriate clinical procedures. We found that ABC methodology was able to accurately calculate costs and to identify several missing pre- and post-surgical nursing education activities in the course of treatment.
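A minimal sketch of the activity-based costing idea: pooled costs are first converted to per-driver rates, and an episode of care is then costed by the activities it consumes rather than by a blanket overhead allocation. All activity names, rates and quantities below are hypothetical, not data from the cited study.

```python
# Hypothetical cost per driver unit for each activity
# (e.g. dollars per session, per surgery minute, per ward day).
activity_rates = {
    "nursing_education_session": 40.0,
    "surgery_minute": 25.0,
    "ward_day": 300.0,
}

# One inpatient episode, expressed as quantities of each activity consumed.
episode = {
    "nursing_education_session": 2,
    "surgery_minute": 90,
    "ward_day": 5,
}

# ABC assigns cost activity by activity, so missing activities
# (e.g. an omitted education session) show up as a visible gap.
cost = sum(activity_rates[activity] * qty for activity, qty in episode.items())
```

The per-activity breakdown is what lets ABC flag missing or inappropriate procedures: a zero quantity for an expected activity is directly visible, which step-down costing would hide.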

  5. Mirror-mark tests performed on jackdaws reveal potential methodological problems in the use of stickers in avian mark-test studies.

    Directory of Open Access Journals (Sweden)

    Manuel Soler

    Some animals are capable of recognizing themselves in a mirror, which is considered to be demonstrated by passing the mark test. Mirror self-recognition capacity has been found in just a few mammals having very large brains and only in one bird, the magpie (Pica pica). The results obtained in magpies have enormous biological and cognitive implications because the fact that magpies were able to pass the mark test meant that this species is at the same cognitive level as great apes, that mirror self-recognition has evolved independently in the magpie and great apes (which diverged 300 million years ago), and that the neocortex (which is not present in avian brains) is not a prerequisite for mirror self-recognition as previously believed. Here, we have replicated the experimental design used on magpies to determine whether jackdaws (Corvus monedula) are also capable of mirror self-recognition by passing the mark test. We found that our nine jackdaws showed a very high interest towards the mirror and exhibited self-contingent behavior as soon as mirrors were introduced. However, jackdaws were not able to pass the mark test: both sticker-directed actions and sticker removal were performed with a similar frequency in both the cardboard (control) and the mirror conditions. We conclude that our jackdaws' behaviour raises non-trivial questions about the methodology used in the avian mark test. Our study suggests that the use of self-adhesive stickers on sensitive throat feathers may open the way to artefactual results because birds might perceive the stickers tactilely.

  6. Methodological approaches to perform a site specific PSA on the effects of comprehensive events; Methodische Ansaetze zur Durchfuehrung einer standortspezifischen PSA zu den Auswirkungen uebergreifender Einwirkungen

    Energy Technology Data Exchange (ETDEWEB)

    Tuerschmann, Michael; Sperbeck, Silvio; Frey, Walter

    2016-12-15

    The main objective of project 3612R01550, performed on behalf of the Federal Ministry for the Environment, Nature Conservation, Building and Nuclear Safety (BMUB), is the development of an approach for systematic consideration of dependencies in case of internal and external hazards and their combinations in the probabilistic plant model for nuclear power plants. One of the major aspects of a site specific Level 1 PSA carried out for a nuclear power plant outlined in this report is to take comprehensively into account the entire risk resulting from internal and external hazards. In a first step, all the hazards which may occur at the site under investigation have to be identified. This requires a compilation of the potential hazards and their possible combinations. Based on this compilation of generic hazards, a site specific list of hazards to be considered in the analysis can be derived through a screening process taking into account regulatory requirements and insights from site and plant walk-downs. In a second step, the hazards to be considered for the specific site have to be classified with respect to the depth of the probabilistic analyses to be carried out. This classification covers three categories: hazards with a negligible contribution to the overall risk, hazards with such a low risk contribution that a rough quantitative assessment is sufficient, and hazards which need in-depth probabilistic analysis. Based on the available Level 1 PSA model for internal events, a systematic approach for in-depth probabilistic analyses of hazards and their combinations is proposed. In this context, lists of those structures, systems and components, which can be impaired in their required function resulting in a risk increase, are provided. One of these lists contains the equipment, the other one the dependencies to be considered for the corresponding hazard.
In addition to the general approach for performing site specific PSA, a procedure for modelling dependencies in

  7. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical ''signal to noise'' problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  8. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical ''signal to noise'' problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs

  9. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
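The core combination step of a Regional Shelter Analysis, weighting the protection that buildings provide by where the population actually is, can be sketched as a population-weighted mean dose. The protection factors, population fractions and outdoor dose below are illustrative assumptions only, not values from the LLNL methodology.

```python
# Hypothetical population posture: (fraction of population, protection factor).
# A protection factor PF means the inside dose is outside_dose / PF.
settings = [
    (0.25, 2.0),   # light residential construction
    (0.55, 10.0),  # multi-storey masonry buildings
    (0.20, 40.0),  # basements / interior shielded locations
]

outdoor_dose = 100.0  # unshielded fallout gamma dose, arbitrary units

# Population-weighted mean dose actually received across all settings.
mean_dose = sum(frac * outdoor_dose / pf for frac, pf in settings)

# Effective regional protection factor implied by the mixture.
effective_pf = outdoor_dose / mean_dose
```

Re-running the same sum with a different posture (e.g. a "warned, moved to basements" scenario shifting fractions toward higher protection factors) is how mitigation strategies can be compared under this kind of analysis.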

  10. A review of the current state-of-the-art methodology for handling bias and uncertainty in performing criticality safety evaluations. Final report

    International Nuclear Information System (INIS)

    Disney, R.K.

    1994-10-01

    The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSE's) is a rapidly evolving technology. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSE's within the overview of the US Department of Energy (DOE) is a shift in the overview function from a ''site'' perception to a more uniform or ''national'' perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation

  11. Performance and Perception in the Flipped Learning Model: An Initial Approach to Evaluate the Effectiveness of a New Teaching Methodology in a General Science Classroom

    Science.gov (United States)

    González-Gómez, David; Jeong, Jin Su; Airado Rodríguez, Diego; Cañada-Cañada, Florentina

    2016-01-01

    "Flipped classroom" teaching methodology is a type of blended learning in which the traditional class setting is inverted. Lecture is shifted outside of class, while the classroom time is employed to solve problems or doing practical works through the discussion/peer collaboration of students and instructors. This relatively new…

  12. The analysis of RWAP(Rod Withdrawal at Power) using the KEPRI methodology

    International Nuclear Information System (INIS)

    Yang, C. K.; Kim, Y. H.

    2001-01-01

    KEPRI developed a new methodology based on RASP (Reactor Analysis Support Package). In this paper, the analysis of the RWAP (Rod Withdrawal at Power) accident, which can result in reactivity and power distribution anomalies, was performed using the KEPRI methodology. The calculation describes the RWAP transient and documents the analysis, including the computer code modeling assumptions and input parameters used. To validate the new methodology, the results were compared with the FSAR. The results obtained with the KEPRI methodology are similar to those of the FSAR, and the sensitivity results for the postulated parameters were similar to those of the existing methodology

  13. Building ASIPs: The Mescal Methodology

    CERN Document Server

    Gries, Matthias

    2006-01-01

    A number of system designers use ASIPs rather than ASICs to implement their system solutions. This book gives a comprehensive methodology for the design of these application-specific instruction processors (ASIPs). It includes demonstrations of applications of the methodologies using the Tipi research framework.

  14. Enhancement of the methodology of repository design and post-closure performance assessment for preliminary investigation stage (3). Progress report on NUMO-JAEA collaborative research in FY2013 (Joint research)

    International Nuclear Information System (INIS)

    Shibata, Masahiro; Sawada, Atsushi; Tachi, Yukio; Makino, Hitoshi; Wakasugi, Keiichiro; Mitsui, Seiichiro; Kitamura, Akira; Oda, Chie; Ishidera, Takamitsu; Suyama, Tadahiro; Hatanaka, Koichiro; Kamei, Gento; Yoshikawa, Hideki; Senba, Takeshi; Seo, Toshihiro; Kurosawa, Susumu; Goto, Junichi; Shibutani, Sanae; Goto, Takahiro; Kubota, Shigeru; Inagaki, Manabu; Moriya, Toshifumi; Suzuki, Satoru; Ishida, Keisuke; Nishio, Hikaru; Makiuchi, Akie; Fujihara, Hiroshi

    2015-03-01

    JAEA and NUMO have conducted a collaborative research project designed to enhance the methodology of repository design and post-closure performance assessment in the preliminary investigation stage. With regard to (1) study on host rock suitability in terms of hydrology, based on examples of developing hydro-geological structure models, the acquired knowledge was organized using tree diagrams, and model uncertainty and its influence on the evaluation items were discussed. With regard to (2) study on scenario development, the developed approach for “defining conditions” was re-evaluated and improved from practical viewpoints. In addition, the uncertainty evaluation for the effect of the use of cementitious material, as well as the glass dissolution model, was conducted with analytical evaluation. With regard to (3) study on setting radionuclide migration parameters, based on a survey of precedent procedures, a multiple approach for the distribution coefficient of rocks was established, and its adequacy was confirmed through application to sedimentary and granitic rock. In addition, an approach for solubility setting was developed, including the procedure for selecting the solubility-limiting solid phase; its adequacy was confirmed through application to key radionuclides. (author)
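The distribution coefficients (Kd) whose setting procedure is described above typically enter post-closure transport calculations through the retardation factor of the standard linear-sorption model. The sketch below uses generic illustrative parameter values, not data from the NUMO-JAEA study.

```python
def retardation_factor(kd_m3_per_kg, bulk_density_kg_m3, porosity):
    """Retardation factor for linear sorption: R = 1 + (rho_b / theta) * Kd.

    kd_m3_per_kg: distribution coefficient (1 mL/g = 0.001 m^3/kg)
    bulk_density_kg_m3: dry bulk density of the rock
    porosity: transport porosity (dimensionless)
    """
    return 1.0 + (bulk_density_kg_m3 / porosity) * kd_m3_per_kg

# Illustrative values for a generic sedimentary rock (assumptions only):
r = retardation_factor(kd_m3_per_kg=0.001, bulk_density_kg_m3=2000.0, porosity=0.3)
print(round(r, 2))  # ≈ 7.67
```

A retardation factor of ~8 means the sorbing nuclide migrates roughly eight times slower than the groundwater itself, which is why the Kd-setting approach matters so much for the safety case.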

  15. Measuring the Performance of Attention Networks with the Dalhousie Computerized Attention Battery (DalCAB): Methodology and Reliability in Healthy Adults.

    Science.gov (United States)

    Jones, Stephanie A H; Butler, Beverly C; Kintzel, Franziska; Johnson, Anne; Klein, Raymond M; Eskes, Gail A

    2016-01-01

    Attention is an important, multifaceted cognitive domain that has been linked to three distinct, yet interacting, networks: alerting, orienting, and executive control. The measurement of attention and deficits of attention within these networks is critical to the assessment of many neurological and psychiatric conditions in both research and clinical settings. The Dalhousie Computerized Attention Battery (DalCAB) was created to assess attentional functions related to the three attention networks using a range of tasks including: simple reaction time, go/no-go, choice reaction time, dual task, flanker, item and location working memory, and visual search. The current study provides preliminary normative data, test-retest reliability (intraclass correlations), and practice effects in DalCAB performance 24 h after baseline for healthy young adults (n = 96, 18-31 years). Performance on the DalCAB tasks demonstrated Good to Very Good test-retest reliability for mean reaction time, while accuracy and difference measures (e.g., switch costs, interference effects, and working memory load effects) were most reliable for tasks that require more extensive cognitive processing (e.g., choice reaction time, flanker, dual task, and conjunction search). Practice effects were common and pronounced at the 24-h interval. In addition, performance related to specific within-task parameters of the DalCAB sub-tests provides preliminary support for future formal assessment of the convergent validity of our interpretation of the DalCAB as a potential clinical and research assessment tool for measuring aspects of attention related to the alerting, orienting, and executive control networks.
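Test-retest reliability of the kind reported above is usually quantified with an intraclass correlation. The abstract does not state which ICC form was used, so the pure-Python sketch below shows one common choice, the Shrout-Fleiss ICC(2,1) (two-way random effects, absolute agreement, single measurement); the data layout (one row per participant, one column per session) is an assumption for illustration.

```python
def icc_2_1(data):
    """Shrout-Fleiss ICC(2,1): two-way random effects, absolute
    agreement, single measurement.
    data: list of rows, one per subject, one column per session."""
    n = len(data)      # subjects
    k = len(data[0])   # sessions (e.g. baseline and 24-h retest)
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # between-subjects mean square
    msc = ss_cols / (k - 1)                 # between-sessions mean square
    mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

print(round(icc_2_1([[1, 2], [2, 2], [3, 4], [4, 3]]), 2))  # ≈ 0.69
```

Values near 1.0 indicate that between-subject differences dominate session-to-session noise, i.e. good test-retest reliability.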

  16. Measuring the performance of attention networks with the Dalhousie Computerized Attention Battery (DalCAB): Methodology and reliability in healthy adults

    Directory of Open Access Journals (Sweden)

    Stephanie Anne Holland Jones

    2016-06-01

    Full Text Available Attention is an important, multifaceted cognitive domain that has been linked to three distinct, yet interacting, networks: alerting, orienting, and executive control. The measurement of attention and deficits of attention within these networks is critical to the assessment of many neurological and psychiatric conditions in both research and clinical settings. The Dalhousie Computerized Attention Battery (DalCAB) was created to assess attentional functions related to the three attention networks using a range of tasks including: simple reaction time, go/no-go, choice reaction time, dual task, flanker, item and location working memory and visual search. The current study provides preliminary normative data, test-retest reliability (intraclass correlations) and practice effects in DalCAB performance 24 hours after baseline for healthy young adults (n = 96, 18-31 years). Performance on the DalCAB tasks demonstrated Good to Excellent test-retest reliability for mean reaction time, while accuracy and difference measures (e.g., switch costs, interference effects and working memory load effects) were most reliable for tasks that require more extensive cognitive processing (e.g., choice reaction time, flanker, dual task, and conjunction search). Practice effects were common and pronounced at the 24-hour interval. In addition, performance related to specific within-task parameters of the DalCAB sub-tests provides preliminary support for future formal assessment of the convergent validity of our interpretation of the DalCAB as a potential clinical and research assessment tool for measuring aspects of attention related to the alerting, orienting and executive control networks. Keywords: computerized assessment; attention; orienting; alerting; executive function

  17. Exploring Health System Responsiveness in Ambulatory Care and Disease Management and its Relation to Other Dimensions of Health System Performance (RAC) - Study Design and Methodology.

    Science.gov (United States)

    Röttger, Julia; Blümel, Miriam; Engel, Susanne; Grenz-Farenholtz, Brigitte; Fuchs, Sabine; Linder, Roland; Verheyen, Frank; Busse, Reinhard

    2015-05-20

    The responsiveness of a health system is considered to be an intrinsic goal of health systems and an essential aspect in performance assessment. Numerous studies have analysed health system responsiveness and related concepts, especially across different countries and health systems. However, fewer studies have applied the concept for the evaluation of specific healthcare delivery structures and thoroughly analysed its determinants within one country. The aims of this study are to assess the level of perceived health system responsiveness to patients with chronic diseases in ambulatory care in Germany and to analyse the determinants of health system responsiveness as well as its distribution across different population groups. The target population consists of chronically ill people in Germany, with a focus on patients suffering from type 2 diabetes and/or from coronary heart disease (CHD). Data comes from two different sources: (i) cross-sectional survey data from a postal survey and (ii) claims data from a German sickness fund. Data from both sources will be linked at an individual-level. The postal survey has the purpose of measuring perceived health system responsiveness, health related quality of life, experiences with disease management programmes (DMPs) and (subjective) socioeconomic background. The claims data consists of information on (co)morbidities, service utilization, enrolment within a DMP and sociodemographic characteristics, including the type of residential area. RAC is one of the first projects linking survey data on health system responsiveness at individual level with claims data. With this unique database, it will be possible to comprehensively analyse determinants of health system responsiveness and its relation to other aspects of health system performance assessment. 
The results of the project will allow German health system decision-makers to assess the performance of nonclinical aspects of healthcare delivery and their determinants in two

  18. Transparent Guideline Methodology Needed

    DEFF Research Database (Denmark)

    Lidal, Ingeborg; Norén, Camilla; Mäkelä, Marjukka

    2013-01-01

    As part of learning at the Nordic Workshop of Evidence-based Medicine, we have read with interest the practice guidelines for central venous access, published in your Journal in 2012.1 We appraised the quality of this guideline using the checklist developed by The Evidence-Based Medicine Working Group.2 Similar criteria for guideline quality have been suggested elsewhere.3 Our conclusion was that this much needed guideline is currently unclear about several aspects of the methodology used in developing the recommendations. This means potential users cannot be certain that the recommendations are based on best currently available evidence. Our concerns are in two main categories: the rigor of development, including methodology of searching, evaluating, and combining the evidence; and editorial independence, including funding and possible conflicts of interest.

  19. Performance assessment of the direct disposal in unsaturated tuff of spent nuclear fuel and high-level waste owned by USDOE: Volume 2, Methodology and results

    Energy Technology Data Exchange (ETDEWEB)

    Rechard, R.P. [ed.

    1995-03-01

    This assessment studied the performance of high-level radioactive waste and spent nuclear fuel in a hypothetical repository in unsaturated tuff. The results of this 10-month study are intended to help guide the Office of Environment Management of the US Department of Energy (DOE) on how to prepare its wastes for eventual permanent disposal. The waste forms comprised spent fuel and high-level waste currently stored at the Idaho National Engineering Laboratory (INEL) and the Hanford reservations. About 700 metric tons heavy metal (MTHM) of the waste under study is stored at INEL, including graphite spent nuclear fuel, highly enriched uranium spent fuel, low enriched uranium spent fuel, and calcined high-level waste. About 2100 MTHM of weapons production fuel, currently stored on the Hanford reservation, was also included. The behavior of the waste was analyzed by waste form and also as a group of waste forms in the hypothetical tuff repository. When the waste forms were studied together, the repository was assumed also to contain about 9200 MTHM high-level waste in borosilicate glass from three DOE sites. The addition of the borosilicate glass, which has already been proposed as a final waste form, brought the total to about 12,000 MTHM.
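The inventory figures quoted in the abstract can be cross-checked with trivial arithmetic (values as stated there, in metric tons heavy metal):

```python
inel_mthm = 700       # INEL spent fuel and calcined high-level waste
hanford_mthm = 2100   # Hanford weapons production fuel
glass_mthm = 9200     # borosilicate HLW glass from three DOE sites
total = inel_mthm + hanford_mthm + glass_mthm
print(total)  # 12000, matching the "about 12,000 MTHM" in the abstract
```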

  20. Performance assessment of the direct disposal in unsaturated tuff of spent nuclear fuel and high-level waste owned by USDOE: Volume 2, Methodology and results

    International Nuclear Information System (INIS)

    Rechard, R.P.

    1995-03-01

    This assessment studied the performance of high-level radioactive waste and spent nuclear fuel in a hypothetical repository in unsaturated tuff. The results of this 10-month study are intended to help guide the Office of Environment Management of the US Department of Energy (DOE) on how to prepare its wastes for eventual permanent disposal. The waste forms comprised spent fuel and high-level waste currently stored at the Idaho National Engineering Laboratory (INEL) and the Hanford reservations. About 700 metric tons heavy metal (MTHM) of the waste under study is stored at INEL, including graphite spent nuclear fuel, highly enriched uranium spent fuel, low enriched uranium spent fuel, and calcined high-level waste. About 2100 MTHM of weapons production fuel, currently stored on the Hanford reservation, was also included. The behavior of the waste was analyzed by waste form and also as a group of waste forms in the hypothetical tuff repository. When the waste forms were studied together, the repository was assumed also to contain about 9200 MTHM high-level waste in borosilicate glass from three DOE sites. The addition of the borosilicate glass, which has already been proposed as a final waste form, brought the total to about 12,000 MTHM

  1. Including social impacts in LCIA

    DEFF Research Database (Denmark)

    Dreyer, Louise Camilla; Hauschild, Michael Zwicky; Schierbeck, Jens

    2004-01-01

    Sustainability management in industry is often defined by measuring performance against the triple bottom line of People, Planet and Profit in business decisions. The product chain perspective inherent in LCA is very suitable for sustainability management, but LCA methodology only considers...... activities in the product life cycle. Workers' fundamental rights, as defined by the ILO, are used as the baseline in the method, and as a consequence, some of the issues addressed by the method are: child labour, discrimination, right to organise, and forced labour....

  2. Radiotracer methodology

    International Nuclear Information System (INIS)

    Eng, R.R.

    1988-01-01

    In 1923, George Hevesy demonstrated the distribution of radioactive lead in the horsebean plant. This early demonstration of the potential use of radiotracers in biology was reinforced when J.G. Hamilton and colleagues used iodine-131 for diagnostic purposes in patients. Then in 1950 Cassen et al. designed the first scintillation counter for measuring radioiodine in the body, using calcium tungstate crystals coupled to a photomultiplier tube. This was followed by the development of the Anger camera, which permitted visualization of radiotracer distribution in biological systems. From these significant early discoveries to the present, many advances have been made. They include the discovery and production of many useful radioisotopes; the formulation of these radioisotopes into useful radiotracers; the advent of first-, second-, and third-generation instrumentation for monitoring in vitro and in vivo distributions of new radiotracers; and the application of this knowledge to allow us to better understand physiological processes and treat disease states. Radiotracer techniques are integral to numerous techniques described in this volume. Autoradiography, nuclear scintigraphy, positron emission tomography, and single-photon emission computed tomography (SPECT) are all dependent on an understanding of radiotracer techniques to properly utilize these probe devices

  3. Chapter three: methodology of exposure modeling

    CSIR Research Space (South Africa)

    Moschandreas, DJ

    2002-12-01

    Full Text Available methodologies and models are reviewed. Three exposure/measurement methodologies are assessed. Estimation methods focus on source evaluation and attribution; sources include those outdoors and indoors as well as in occupational and in-transit environments. Fate...

  4. Methodological Issues and Practices in Qualitative Research.

    Science.gov (United States)

    Bradley, Jana

    1993-01-01

    Discusses methodological issues concerning qualitative research and describes research practices that qualitative researchers use to address these methodological issues. Topics discussed include the researcher as interpreter, the emergent nature of qualitative research, understanding the experience of others, trustworthiness in qualitative…

  5. Economic evaluation studies in reproductive medicine: a systematic review of methodologic quality

    NARCIS (Netherlands)

    Moolenaar, Lobke M.; Vijgen, Sylvia M. C.; Hompes, Peter; van der Veen, Fulco; Mol, Ben Willem J.; Opmeer, Brent C.

    2013-01-01

    Objective: To evaluate the methodologic quality of economic analyses published in the field of reproductive medicine. Design: Systematic review. Setting: Centers for reproductive care. Patient(s): Infertility patients. We performed a Medline search to identify economic evaluation studies in reproductive medicine. We included studies that

  6. The Behavior of Procurement Process as Described by Using System Dynamics Methodology

    OpenAIRE

    Mohd Yusoff, Mohd Izhan

    2018-01-01

    System dynamics methodology has been used in many fields of study, including supply chain, project management and performance, and procurement. The methodology enables researchers to identify and study the impact of variables or factors on the outcome of the model they develop. In this paper, we show a use of system dynamics methodology in studying the behavior of the procurement process that differs from those in previous studies. By using a t...

  7. Site-condition map for Portugal, Western Iberia: methodology and constraints on the performance of Vs30 proxies for stable continental regions in Europe.

    Science.gov (United States)

    Vilanova, S. P.; Narciso, J.; Carvalho, J. P.; Cancela, C.; Lopes, I.; Nemser, E. S.; Borges, J.

    2014-12-01

    Information on the amplification characteristics of near-surface formations in a regional sense is essential to adequately represent both seismic hazard maps and ground shaking maps. Given the scarcity of shear-wave velocity data in most regions, several methods have been proposed to obtain first-order representations of Vs30. These include the surface-geology method and the topographic-slope method. The latter has become the standard way of incorporating site effects into regional studies worldwide, given the convenience of the global Vs30 Internet server. In the framework of project SCENE we developed a shear-wave velocity database for Portugal. The database consists of 87 shear-wave velocity depth profiles from a variety of lithological and geological formations. We used an iterative three-step procedure to develop the Vs30-based site-condition map: 1) define a preliminary set of geologically defined units based on the literature; 2) calculate the distribution of Vs30 for each unit; and 3) perform statistical tests to estimate the significance of the difference in the Vs30 distribution characteristics between the units. The units were merged according to the results of the statistical tests and the procedure was repeated. We started by classifying the sites into six generalized geological units. The final set consists of three units only: F1 (igneous, metamorphic and old sedimentary rocks); F2 (Neogene and Pleistocene formations); and F3 (Holocene deposits). We used the database to evaluate the performance of Vs30 proxies. The use of proxies based either on geological units or on correlations with the topographic slope shows relatively unbiased total residual distributions of the logarithm of Vs30. However, the performance of the methods varies significantly with the generalized geological unit analyzed. Both methods are biased towards lower values of Vs30 for rock formations. The topographic-slope method is
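Step 3 of the iterative procedure (testing whether two candidate units have statistically distinguishable Vs30 distributions) can be sketched as below. The abstract does not name the test used, so the two-sample Kolmogorov-Smirnov statistic on log10(Vs30), with its asymptotic 5% critical distance, is an assumption for illustration, as are the synthetic Vs30 samples.

```python
import math

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic (max ECDF distance)."""
    a, b = sorted(a), sorted(b)
    xs = sorted(set(a) | set(b))
    d = 0.0
    for x in xs:
        fa = sum(1 for v in a if v <= x) / len(a)
        fb = sum(1 for v in b if v <= x) / len(b)
        d = max(d, abs(fa - fb))
    return d

def should_merge(vs30_a, vs30_b, coeff=1.358):
    """Merge two preliminary units when the KS statistic on log10(Vs30)
    stays below the asymptotic 5% critical distance c(alpha)*sqrt((n+m)/nm)."""
    la = [math.log10(v) for v in vs30_a]
    lb = [math.log10(v) for v in vs30_b]
    n, m = len(la), len(lb)
    d_crit = coeff * math.sqrt((n + m) / (n * m))
    return ks_statistic(la, lb) < d_crit

# Synthetic Vs30 samples (m/s), purely illustrative:
unit_a = [700 + i * 200 / 29 for i in range(30)]  # rock-like sites
unit_b = [v + 10 for v in unit_a]                 # nearly identical unit
unit_c = [150 + i * 100 / 29 for i in range(30)]  # soft-soil sites
print(should_merge(unit_a, unit_b), should_merge(unit_a, unit_c))  # True False
```

Units that the test cannot distinguish are merged and the loop repeated, which is how six preliminary units could collapse to the three final classes F1-F3.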

  8. Three-dimensional RAMA fluence methodology benchmarking

    International Nuclear Information System (INIS)

    Baker, S. P.; Carter, R. G.; Watkins, K. E.; Jones, D. B.

    2004-01-01

    This paper describes the benchmarking of the RAMA Fluence Methodology software, which was performed in accordance with U.S. Nuclear Regulatory Commission Regulatory Guide 1.190. The RAMA Fluence Methodology has been developed by TransWare Enterprises Inc. through funding provided by the Electric Power Research Institute, Inc. (EPRI) and the Boiling Water Reactor Vessel and Internals Project (BWRVIP). The purpose of the software is to provide an accurate method for calculating neutron fluence in BWR pressure vessels and internal components. The methodology incorporates a three-dimensional deterministic transport solution with flexible arbitrary-geometry representation of reactor system components, previously available only with Monte Carlo solution techniques. Benchmarking was performed on measurements obtained from three standard benchmark problems, the Pool Criticality Assembly (PCA), VENUS-3, and H. B. Robinson Unit 2, and on flux wire measurements obtained from two BWR nuclear plants. The calculated-to-measured (C/M) ratios range from 0.93 to 1.04, demonstrating the accuracy of the RAMA Fluence Methodology in predicting neutron flux, fluence, and dosimetry activation. (authors)
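The calculated-to-measured (C/M) comparison underlying this kind of Regulatory Guide 1.190 benchmarking reduces to simple ratio checks. The activation values below are made up for illustration, and the 0.93-1.04 band merely echoes the range reported in the abstract rather than a formal acceptance criterion.

```python
def c_over_m(calculated, measured):
    """Calculated-to-measured ratios for benchmark dosimetry activations."""
    return [c / m for c, m in zip(calculated, measured)]

def within_band(ratios, low=0.93, high=1.04):
    """Check every C/M ratio against a tolerance band."""
    return all(low <= r <= high for r in ratios)

# Hypothetical specific activities (calculated vs measured):
ratios = c_over_m([1.88, 2.05, 0.97], [2.00, 2.00, 1.00])
print([round(r, 3) for r in ratios], within_band(ratios))
```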

  9. Status of IAEA CRPI31018 “Development of Methodologies for the Assessment of Passive Safety System Performance in Advanced Reactors”

    International Nuclear Information System (INIS)

    Subki, Hadid M.

    2011-01-01

    Purpose of research coordination meeting: • To review progress and milestones on all research activities; • To discuss the preliminary experimental data obtained from the Natural Circulation Loop Facility L2 in Italy, constructed for the assessment of different methodologies for evaluating the reliability of passive safety systems; • To discuss lessons to be learned from the Fukushima Daiichi accident in Japan and its implications for near-future R&D needs in thermal-hydraulics and reactor safety; • To develop an outline of the integrated annual technical report and a future collaboration plan

  10. Third (3rd) Research Coordination Meeting of the CRP on Development of Methodologies for the Assessment of Passive Safety System Performance in Advanced Reactors. Presentations

    International Nuclear Information System (INIS)

    2011-01-01

    Purpose of the meeting: • To review progress and milestones on all research activities; • To discuss the preliminary experimental data obtained from the Natural Circulation Loop Facility L2 in Italy, constructed for the assessment of different methodologies for evaluating the reliability of passive safety systems; • To discuss lessons to be learned from the Fukushima Daiichi accident in Japan and its implications for near-future R&D needs in thermal-hydraulics and reactor safety; • To develop an outline of the integrated annual technical report and a future collaboration plan

  11. AEGIS methodology and a perspective from AEGIS methodology demonstrations

    International Nuclear Information System (INIS)

    Dove, F.H.

    1981-03-01

    Objectives of AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) are to develop the capabilities needed to assess the post-closure safety of waste isolation in geologic formations; demonstrate these capabilities on reference sites; apply the assessment methodology to assist the NWTS program in site selection, waste package and repository design; and perform repository site analyses for the licensing needs of NWTS. This paper summarizes the AEGIS methodology, the experience gained from methodology demonstrations, and provides an overview in the following areas: estimation of the response of a repository to perturbing geologic and hydrologic events; estimation of the transport of radionuclides from a repository to man; and assessment of uncertainties

  12. Case Study Research Methodology

    Directory of Open Access Journals (Sweden)

    Mark Widdowson

    2011-01-01

    Full Text Available Commenting on the lack of case studies published in modern psychotherapy publications, the author reviews the strengths of case study methodology and responds to common criticisms, before providing a summary of types of case studies including clinical, experimental and naturalistic. Suggestions are included for developing systematic case studies and brief descriptions are given of a range of research resources relating to outcome and process measures. Examples of a pragmatic case study design and a hermeneutic single-case efficacy design are given and the paper concludes with some ethical considerations and an exhortation to the TA community to engage more widely in case study research.

  13. Soft Systems Methodology

    Science.gov (United States)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology, and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques, and core tenets, described through a wide range of settings.

  14. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  15. Novel methodology to perform sulfur hexafluoride (SF6)-based multiple-breath wash-in and washout in infants using current commercially available equipment.

    Science.gov (United States)

    Gustafsson, P M; Robinson, P D; Lindblad, A; Oberli, D

    2016-11-01

    Multiple-breath inert gas washout (MBW) is ideally suited for early detection and monitoring of serious lung disease, such as cystic fibrosis, in infants and young children. Validated commercial options for the MBW technique are limited, and suitability of nitrogen (N2)-based MBW is of concern given the detrimental effect of exposure to pure O2 on infant breathing pattern. We propose novel methodology using commercially available N2 MBW equipment to facilitate 4% sulfur hexafluoride (SF6) multiple-breath inert gas wash-in and washout suitable for the infant age range. CO2, O2, and sidestream molar mass sensor signals were used to accurately calculate SF6 concentrations. An improved dynamic method for synchronization of gas and respiratory flow was developed to take into account variations in sidestream sample flow during MBW measurement. In vitro validation of triplicate functional residual capacity (FRC) assessments was undertaken under dry ambient conditions using lung models ranging from 90 to 267 ml, with tidal volumes of 28-79 ml, and respiratory rates of 20-60 per minute. The relative mean (SD, 95% confidence interval) error of triplicate FRC determinations by washout was -0.26 (1.84, -3.86 to +3.35)% and by wash-in was 0.57 (2.66, -4.66 to +5.79)%. The standard deviations [mean (SD)] of percentage error among FRC triplicates were 1.40 (1.14) and 1.38 (1.32) for washout and wash-in, respectively. The novel methodology presented achieved FRC accuracy as outlined by current MBW consensus recommendations (95% of measurements within 5% accuracy). Further clinical evaluation is required, but this new technique, using existing commercially available equipment, has exciting potential for research and clinical use. Copyright © 2016 the American Physiological Society.
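The FRC accuracy figures above rest on the standard MBW relation: FRC equals the net volume of tracer gas washed out divided by the drop in end-tidal tracer concentration. A minimal sketch follows, deliberately simplified (no BTPS or equipment dead-space corrections, which real implementations apply), with lung-model numbers chosen for illustration only.

```python
def frc_from_washout(net_expired_tracer_ml, cet_start, cet_end):
    """FRC (mL) = net expired inert-gas volume / fall in end-tidal tracer
    concentration (concentrations as fractions, e.g. 0.04 for 4% SF6)."""
    return net_expired_tracer_ml / (cet_start - cet_end)

def triplicate_error_pct(frc_triplicate, reference_frc):
    """Relative error (%) of the mean of triplicate FRC determinations,
    as used in in vitro lung-model validation."""
    mean_frc = sum(frc_triplicate) / len(frc_triplicate)
    return 100.0 * (mean_frc - reference_frc) / reference_frc

# A 100 mL lung model washed out from 4% SF6 to the 1/40th end-of-test
# criterion (0.1%) releases 100 * (0.04 - 0.001) = 3.9 mL of SF6:
print(round(frc_from_washout(3.9, 0.04, 0.001), 1))  # 100.0
```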

  16. Methodological Considerations on the Relationship Between the 1,500-M Rowing Ergometer Performance and Vertical Jump in National-Level Adolescent Rowers.

    Science.gov (United States)

    Maciejewski, Hugo; Rahmani, Abderrahmane; Chorin, Frédéric; Lardy, Julien; Samozino, Pierre; Ratel, Sébastien

    2018-03-12

    The purpose of the present study was to investigate whether three different approaches for evaluating squat jump performance were correlated to rowing ergometer performance in elite adolescent rowers. Fourteen young male competitive rowers (15.3 ± 0.6 years), who took part in the French rowing national championships, performed a 1,500-m all-out rowing ergometer test (P1500) and a squat jump (SJ) test. The performance in SJ was determined by calculating the jump height (HSJ in cm), a jump index (ISJ = HSJ · body mass · gravity, in J) and the mean power output (PSJ in W) from the Samozino et al. method. Furthermore, allometric modelling procedures were used to consider the importance of body mass (BM) in the assessment of HSJ, ISJ and PSJ, and the relationships between P1500 and the jump scores. P1500 was significantly correlated to HSJ (r2 = 0.29, P jump and rowing ergometer performances at the same rate, and that PSJ could be the best correlate of P1500. Therefore, the calculation of power seems to be more relevant than HSJ and ISJ to (i) evaluate jump performance, and (ii) infer the capacity of adolescent rowers to perform a 1,500-m all-out rowing ergometer test, irrespective of their body mass. This could help coaches improve their training programs and potentially identify talented young rowers.
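The jump scores compared above can be reproduced in a few lines. The ISJ formula comes straight from the abstract, while the mean-power expressions follow the simple method of Samozino and colleagues as commonly stated (mean force F = m·g·(h/hpo + 1), mean velocity v = sqrt(g·h/2)). The athlete values, including the push-off distance hpo which the abstract does not report, are illustrative assumptions.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def sj_index(h_m, body_mass_kg):
    """Jump index from the abstract: ISJ = HSJ * body mass * gravity (J)."""
    return h_m * body_mass_kg * G

def sj_mean_power(h_m, body_mass_kg, push_off_m):
    """Mean push-off power per the simple method of Samozino et al.:
    F = m*g*(h/hpo + 1), v = sqrt(g*h/2), P = F*v."""
    force = body_mass_kg * G * (h_m / push_off_m + 1.0)
    velocity = math.sqrt(G * h_m / 2.0)
    return force * velocity

# Illustrative rower: 70 kg, 0.30 m jump, 0.35 m push-off distance (assumed):
print(round(sj_index(0.30, 70.0), 1))        # ≈ 206.0 J
print(round(sj_mean_power(0.30, 70.0, 0.35)))  # ≈ 1547 W
```

Unlike raw jump height, the power estimate scales with body mass and push-off geometry, which is consistent with the abstract's conclusion that power is the more relevant correlate of rowing performance.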

  17. Supplement to the Disposal Criticality Analysis Methodology

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The methodology for evaluating criticality potential for high-level radioactive waste and spent nuclear fuel after the repository is sealed and permanently closed is described in the Disposal Criticality Analysis Methodology Topical Report (DOE 1998b). The topical report provides a process for validating various models that are contained in the methodology and states that validation will be performed to support License Application. The Supplement to the Disposal Criticality Analysis Methodology provides a summary of data and analyses that will be used for validating these models and will be included in the model validation reports. The supplement also summarizes the process that will be followed in developing the model validation reports. These reports will satisfy commitments made in the topical report, and thus support the use of the methodology for Site Recommendation and License Application. It is concluded that this report meets the objective of presenting additional information, along with references, that supports the methodology presented in the topical report and can be used both in validation reports and in answering requests for additional information received from the Nuclear Regulatory Commission concerning the topical report. The data and analyses summarized in this report and presented in the references are not sufficient to complete a validation report. However, this information will provide a basis for several of the validation reports. Data from several references in this report have been identified with TBV-1349. Release of the TBV governing this data is required prior to its use in quality-affecting activities and for use in analyses affecting procurement, construction, or fabrication. Subsequent to the initiation of TBV-1349, DOE issued a concurrence letter (Mellington 1999) approving the request to identify information taken from the references specified in Section 1.4 as accepted data

  18. A novel methodology for energy performance benchmarking of buildings by means of Linear Mixed Effect Model: The case of space and DHW heating of out-patient Healthcare Centres

    International Nuclear Information System (INIS)

    Capozzoli, Alfonso; Piscitelli, Marco Savino; Neri, Francesco; Grassi, Daniele; Serale, Gianluca

    2016-01-01

Highlights: • 100 Healthcare Centres were analyzed to assess energy consumption reference values. • A novel robust methodology for the energy benchmarking process was proposed. • A Linear Mixed Effect estimation Model was used to treat heterogeneous datasets. • A nondeterministic approach was adopted to consider the uncertainty in the process. • The methodology was developed to be upgradable and generalizable to other datasets. - Abstract: The current EU energy efficiency directive 2012/27/EU identifies the existing building stock as one of the most promising sectors for achieving energy savings. Robust methodologies aimed at quantifying the potential reduction of energy consumption for large building stocks need to be developed. To this purpose, a benchmarking analysis is necessary in order to support public planners in determining how well a building is performing, in setting credible targets for improving performance, and in detecting abnormal energy consumption. In the present work, a novel methodology is proposed to perform a benchmarking analysis particularly suitable for heterogeneous samples of buildings. The methodology is based on the estimation of a statistical model for energy consumption, the Linear Mixed Effects Model, so as to account for both the fixed effects shared by all individuals within a dataset and the random effects related to particular groups/classes of individuals in the population. The groups of individuals within the population have been classified by resorting to a supervised learning technique. Against this backdrop, a Monte Carlo simulation is carried out to compute the frequency distribution of annual energy consumption and identify a reference value for each group/class of buildings. The benchmarking analysis was tested for a case study of 100 out-patient Healthcare Centres in Northern Italy, finally resulting in 12 different frequency distributions for space and Domestic Hot Water heating energy consumption, one for
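The benchmarking idea in the abstract can be sketched in a few lines: combine fixed effects and a class-level random intercept on a log scale, then Monte Carlo the residual to obtain a frequency distribution per building class whose median serves as the reference value. All parameter values below are invented for illustration, not the study's fitted coefficients.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical model parameters (not the paper's fitted values): log annual
# consumption = fixed part (intercept + slope * heated area) + random
# intercept of the building class + residual noise.
FIXED_INTERCEPT = 4.0
FIXED_SLOPE = 0.0008      # effect per m2 of heated area
RESIDUAL_SIGMA = 0.10     # standard deviation of the residual term

def simulate_class_distribution(area_m2, class_effect, n=10_000):
    """Monte Carlo frequency distribution of annual energy use for one class."""
    noise = rng.normal(0.0, RESIDUAL_SIGMA, n)
    log_use = FIXED_INTERCEPT + FIXED_SLOPE * area_m2 + class_effect + noise
    return np.exp(log_use)  # illustrative kWh/(m2 year) scale

# Reference (benchmark) value for a class = median of its simulated distribution
samples = simulate_class_distribution(area_m2=1500.0, class_effect=0.1)
benchmark = float(np.median(samples))
print(round(benchmark, 1))
```

Buildings whose measured consumption falls far above the class benchmark would then be flagged as abnormal consumers.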

  19. Methodology for ranking restoration options

    DEFF Research Database (Denmark)

    Jensen, Per Hedemann

    1999-01-01

techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated...

  20. Sample size methodology

    CERN Document Server

    Desu, M M

    2012-01-01

    One of the most important problems in designing an experiment or a survey is sample size determination and this book presents the currently available methodology. It includes both random sampling from standard probability distributions and from finite populations. Also discussed is sample size determination for estimating parameters in a Bayesian setting by considering the posterior distribution of the parameter and specifying the necessary requirements. The determination of the sample size is considered for ranking and selection problems as well as for the design of clinical trials. Appropria
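The simplest instance of the sample size problem the book treats, estimating a normal mean with known sigma to within a given margin of error, follows directly from the textbook formula n = (z·σ/E)². A minimal sketch:

```python
import math
from statistics import NormalDist

def sample_size_for_mean(sigma, margin, confidence=0.95):
    """Smallest n such that a two-sided confidence interval for the mean
    has half-width <= margin, assuming a normal population with known sigma."""
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)  # critical z-value
    return math.ceil((z * sigma / margin) ** 2)

print(sample_size_for_mean(sigma=15, margin=3))  # 97 at 95% confidence
```

Tighter margins, higher confidence, or larger sigma all inflate n, which is the basic trade-off the design chapters formalize.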

  1. Performance and Feasibility Analysis of a Grid Interactive Large Scale Wind/PV Hybrid System based on Smart Grid Methodology Case Study South Part – Jordan

    Directory of Open Access Journals (Sweden)

    Qais H. Alsafasfeh

    2015-02-01

Full Text Available Most recent research on renewable energy resources in Jordan shares one main goal: to make the country less dependent on imported energy through locally developed and produced solar and wind power. This paper discusses an efficient wind/PV hybrid system intended to serve as the main power source for the southern part of Jordan. The proposed hybrid system design is based on a smart grid methodology: the solar arrays would be installed on the rooftops of electricity subscribers across the Governorates of Maan, Tafila, Karak and Aqaba, while the wind turbines would be concentrated at a single site, thereby reducing the project's capital cost. The simulation results show that the system is competitive and feasible in cost. An economic analysis of the proposed renewable energy system was carried out using HOMER simulation, and the evaluation was completed against the per-kilowatt cost of the EDCO company: the net present cost is $2,551,676,416, the cost of energy is $0.07/kWh, and the renewable fraction is 86.6%.
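HOMER-style economics reduce to two formulas: the capital recovery factor (CRF) annualizes the net present cost, and the cost of energy divides that annualized cost by the annual energy served. A minimal sketch with illustrative inputs only (these are not the paper's figures, which cannot be reproduced from the abstract alone):

```python
def crf(i, n):
    """Capital recovery factor for real interest rate i over n years."""
    return i * (1 + i) ** n / ((1 + i) ** n - 1)

def cost_of_energy(npc, i, n_years, annual_energy_kwh):
    """Levelized cost of energy: annualize the net present cost with the
    capital recovery factor, then divide by the energy served each year."""
    return npc * crf(i, n_years) / annual_energy_kwh

# Illustrative numbers: $2.5M net present cost, 6% rate, 25-year project
# lifetime, 2.5 GWh served per year.
print(round(cost_of_energy(npc=2.5e6, i=0.06, n_years=25,
                           annual_energy_kwh=2.5e6), 3))  # 0.078 $/kWh
```

The renewable fraction reported by HOMER is a separate output: renewable production divided by total load served.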

  2. Butterfly valve torque prediction methodology

    International Nuclear Information System (INIS)

    Eldiwany, B.H.; Sharma, V.; Kalsi, M.S.; Wolfe, K.

    1994-01-01

As part of the Motor-Operated Valve (MOV) Performance Prediction Program, the Electric Power Research Institute has sponsored the development of methodologies for predicting thrust and torque requirements of gate, globe, and butterfly MOVs. This paper presents the methodology that will be used by utilities to calculate the dynamic torque requirements for butterfly valves. The total dynamic torque at any disc position is the sum of the hydrodynamic torque, the bearing torque (which is induced by the hydrodynamic force), and other small torque components (such as packing torque). The hydrodynamic torque on the valve disc, caused by the fluid flow through the valve, depends on the disc angle, flow velocity, upstream flow disturbances, disc shape, and the disc aspect ratio. The butterfly valve model provides sets of nondimensional flow and torque coefficients that can be used to predict flow rate and hydrodynamic torque throughout the disc stroke and to calculate the required actuation torque and the maximum transmitted torque throughout the opening and closing stroke. The scope of the model includes symmetric and nonsymmetric discs of different shapes and aspect ratios in compressible and incompressible fluid applications under both choked and nonchoked flow conditions. The model features were validated against test data from a comprehensive flow-loop and in situ test program. These tests were designed to systematically address the effect of the following parameters on the required torque: valve size, disc shape and aspect ratio, upstream elbow orientation and proximity, and flow conditions. The applicability of the nondimensional coefficients to valves of different sizes was validated by performing tests on a 42-in. valve and a precisely scaled 6-in. model. The butterfly valve model torque predictions were found to bound test data from the flow-loop and in situ testing, as shown in the examples provided in this paper
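The torque decomposition described in the abstract can be expressed as a small function. The nondimensionalization T_hydro = Ct·ΔP·D³ is one common convention for valve torque coefficients; the numbers in the example call are placeholders, not data from the EPRI model.

```python
def dynamic_torque(ct_hydro, dp, diameter, cb, f_hydro, r_shaft, t_packing):
    """Total dynamic torque on a butterfly valve disc at one disc angle.

    ct_hydro  : nondimensional hydrodynamic torque coefficient at this angle
    dp        : pressure drop across the valve [Pa]
    diameter  : disc diameter [m]
    cb        : bearing friction coefficient
    f_hydro   : hydrodynamic force on the disc [N]
    r_shaft   : shaft (bearing) radius [m]
    t_packing : packing friction torque [N*m]
    """
    t_hydro = ct_hydro * dp * diameter ** 3    # hydrodynamic component
    t_bearing = cb * f_hydro * r_shaft         # induced by the hydrodynamic force
    return t_hydro + t_bearing + t_packing

# Placeholder inputs for a single disc position:
print(dynamic_torque(0.05, 1e5, 0.15, 0.2, 2e4, 0.02, 10.0))  # 106.875 N*m
```

Repeating the evaluation over the disc stroke, with angle-dependent coefficients, yields the actuation torque profile the methodology is after.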

  3. Managerial Methodology in Public Institutions

    Directory of Open Access Journals (Sweden)

    Ion VERBONCU

    2010-10-01

Full Text Available One of the most important ways of making public institutions more efficient is applying managerial methodology: promoting management tools and modern, sophisticated methodologies, and designing/redesigning and maintaining the management process and its components. Their implementation bears the imprint of the constructive and functional particularities of public institutions, decentralized and devolved, and, of course, of the managerial expertise within these organizations. Managerial methodology is addressed through three important instruments: diagnosis, management by objectives and the scoreboard. Its presence in the performance management process should be mandatory, given its favorable influence on managerial and economic performance and on the rigor of the managers' approach.

  4. An Innovative Fuzzy-Logic-Based Methodology for Trend Identification

    International Nuclear Information System (INIS)

    Wang Xin; Tsoukalas, Lefteri H.; Wei, Thomas Y.C.; Reifman, Jaques

    2001-01-01

A new fuzzy-logic-based methodology for on-line signal trend identification is introduced. The methodology may be used for detecting the onset of nuclear power plant (NPP) transients at the earliest possible time and could be of great benefit to diagnostic, maintenance, and performance-monitoring programs. Although signal trend identification is complicated by the presence of noise, fuzzy methods can help capture important features of on-line signals, integrate the information included in these features, and classify incoming NPP signals into increasing, decreasing, and steady-state trend categories. A computer program named PROTREN was developed and tested to verify this methodology using NPP and simulation data. The results indicate that the new fuzzy-logic-based methodology detects transients accurately, identifies trends reliably, and does not misinterpret a steady-state signal as a transient one
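A toy version of such a trend classifier: fuzzify the least-squares slope of the recent window with triangular membership functions and pick the category with the highest membership. The membership breakpoints here are arbitrary illustrative choices, not PROTREN's actual rule base.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular fuzzy membership function with support [a, c] and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_trend(signal, window=10):
    """Fuzzify the recent slope, then pick the highest-membership category."""
    y = np.asarray(signal[-window:], dtype=float)
    t = np.arange(len(y))
    slope = np.polyfit(t, y, 1)[0]          # least-squares slope of the window
    memberships = {
        "decreasing":   triangular(slope, -1.00, -0.50, -0.02),
        "steady-state": triangular(slope, -0.05,  0.00,  0.05),
        "increasing":   triangular(slope,  0.02,  0.50,  1.00),
    }
    return max(memberships, key=memberships.get)

print(classify_trend([0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9]))
```

A real implementation would add noise-robust feature extraction and overlapping categories resolved by a rule base rather than a bare argmax.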

  5. Response surface methodology for the determination of the design space of enantiomeric separations on cinchona-based zwitterionic chiral stationary phases by high performance liquid chromatography.

    Science.gov (United States)

    Hanafi, Rasha Sayed; Lämmerhofer, Michael

    2018-01-26

The Quality-by-Design approach to enantioselective HPLC method development surpasses Quality-by-Testing in offering the optimal separation conditions with the least number of experiments and in its ability to describe the method's Design Space visually, which helps determine enantiorecognition to a significant extent. Although some schemes exist for enantiomeric separations on Cinchona-based zwitterionic stationary phases, the exact design space and the weights by which each of the chromatographic parameters influences the separation have not yet been statistically studied. In the current work, a screening design followed by a Response Surface Methodology optimization design was adopted for enantioseparation optimization of 3 model drugs, namely the acidic Fmoc-leucine, the amphoteric tryptophan and the basic salbutamol. The screening design proved that the acid/base additives are of utmost importance for the 3 chiral drugs, and that among 3 different pairs of acids and bases, acetic acid and diethylamine is the pair able to provide acceptable resolution under variable conditions. Visualization of the response surfaces of the retention factor, separation factor and resolution helped describe accurately the magnitude by which each chromatographic factor (% MeOH, concentration and ratio of acid/base modifiers) affects the separation while interacting with the other parameters. The global optima combining the highest enantioresolution with the shortest run time for the 3 chiral model drugs varied widely: it was best to set a low % methanol with an equal ratio of acid and base modifiers for the acidic drug, a very high % methanol and a 10-fold higher concentration of the acid for the amphoteric drug, while a 20-fold excess of the base modifier with moderate % methanol was needed for the basic drug. Considering the selected drugs as models for many series of structurally related compounds, the design space defined and the optimum conditions computed are the key for method development on
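The "design space" in the abstract, i.e. the factor region where a predicted response meets its acceptance criterion, can be mapped by evaluating a fitted response surface on a grid. The quadratic model and its coefficients below are invented for illustration; only the procedure (grid evaluation plus thresholding) reflects the methodology.

```python
import numpy as np

def predicted_rs(x1, x2):
    """Hypothetical fitted response surface for enantioresolution Rs as a
    function of % methanol (x1) and acid:base modifier ratio (x2).
    Coefficients are made up for illustration."""
    return 3.0 - 0.0005 * (x1 - 50) ** 2 - 2.0 * (x2 - 1.0) ** 2

meoh = np.linspace(20, 98, 40)       # % methanol axis
ratio = np.linspace(0.5, 2.0, 40)    # acid:base ratio axis
X1, X2 = np.meshgrid(meoh, ratio)
RS = predicted_rs(X1, X2)

# Design space = region meeting the acceptance criterion Rs >= 1.5
design_space = RS >= 1.5
print(int(design_space.sum()), "of", design_space.size, "grid points qualify")
```

Plotting the boolean mask as a contour gives exactly the kind of visual design-space map the Quality-by-Design workflow relies on.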

  6. The intersections between TRIZ and forecasting methodology

    Directory of Open Access Journals (Sweden)

    Georgeta BARBULESCU

    2010-12-01

Full Text Available The authors' intention is to apply the TRIZ methodology (Theory of Inventive Problem Solving; in Russian: Teoriya Resheniya Izobretatelskikh Zadatch) as a problem-solving tool meant to help decision makers perform more meaningful forecasting exercises. The idea is to identify the TRIZ features and instruments (e.g., the 40 inventive principles) for highlighting the signal-versus-noise problem, for trend identification (qualitative and quantitative tendencies) and as support tools in technological forecasting, so as to enable decision makers to refine their forecasts and increase the level of confidence in the forecasting results. The current interest in connecting TRIZ to forecasting methodology relates to the massive worldwide application of TRIZ methods and techniques for engineering system development and to the growing application of TRIZ's concepts and paradigms to the improvement of non-engineering systems (including business and economic applications).

  7. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study
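The integrated simulation can be caricatured as a two-stage Monte Carlo: sample whether a tornado affects the site in a given year, then sample whether any lofted missile impacts the target structure. All probabilities below are made up for illustration and are not TORMIS inputs or outputs.

```python
import random

random.seed(12345)

# Illustrative (invented) parameters:
P_TORNADO = 1e-3      # annual probability a tornado affects the site
N_MISSILES = 20       # potential missiles lofted per tornado
P_IMPACT = 0.005      # chance a given missile impacts the structure

def one_year():
    """Simulate one year of the hazard: tornado occurrence, then impacts."""
    if random.random() >= P_TORNADO:
        return False                       # no tornado this year
    return any(random.random() < P_IMPACT for _ in range(N_MISSILES))

trials = 200_000
p_damage = sum(one_year() for _ in range(trials)) / trials
print(p_damage)  # expected value ~ P_TORNADO * (1 - (1 - P_IMPACT)**N_MISSILES)
```

The real methodology replaces these constants with data-based models (tornado climatology, missile injection, transport, and impact sub-models) sampled in sequence, which is what the TORMIS code automates.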

  8. Improvement of Safety Assessment Methodologies for Near Surface Disposal Facilities

    International Nuclear Information System (INIS)

    Batandjieva, B.; Torres-Vidal, C.

    2002-01-01

The International Atomic Energy Agency (IAEA) Coordinated Research Program "Improvement of Safety Assessment Methodologies for Near Surface Disposal Facilities" (ISAM) has developed an improved safety assessment methodology for near surface disposal facilities. The program has been underway for three years and has included around 75 active participants from 40 countries. It has also provided examples of application to three safety cases: vault, Radon-type and borehole radioactive waste disposal facilities. The program has served as an excellent forum for exchange of information and good practices on safety assessment approaches and methodologies used worldwide. It also provided an opportunity for reaching broad consensus on the safety assessment methodologies to be applied to near surface low and intermediate level waste repositories. The methodology has found widespread acceptance, and the need for its application to real waste disposal facilities has been clearly identified. The ISAM was finalized by the end of 2000; working material documents are available, and an IAEA report summarizing the work performed during the three years of the program will be published in 2002. The outcome of the ISAM program provides a sound basis for moving forward to a new IAEA program, which will focus on practical application of the safety assessment methodologies for different purposes, such as licensing radioactive waste repositories, developing design concepts, upgrading existing facilities, and reassessing operating repositories. The new program will also provide an opportunity for developing guidance on application of the methodology that will be of assistance to both safety assessors and regulators

  9. Initial performance assessment of the disposal of spent nuclear fuel and high-level waste stored at Idaho National Engineering Laboratory. Volume 1, Methodology and results

    Energy Technology Data Exchange (ETDEWEB)

Rechard, R.P. [ed.]

    1993-12-01

    This performance assessment characterized plausible treatment options conceived by the Idaho National Engineering Laboratory (INEL) for its spent fuel and high-level radioactive waste and then modeled the performance of the resulting waste forms in two hypothetical, deep, geologic repositories: one in bedded salt and the other in granite. The results of the performance assessment are intended to help guide INEL in its study of how to prepare wastes and spent fuel for eventual permanent disposal. This assessment was part of the Waste Management Technology Development Program designed to help the US Department of Energy develop and demonstrate the capability to dispose of its nuclear waste. Although numerous caveats must be placed on the results, the general findings were as follows: Though the waste form behavior depended upon the repository type, all current and proposed waste forms provided acceptable behavior in the salt and granite repositories.

  10. Training and qualification of the auxiliaries of operation using the methodology On the Job Training (OJT) and Task Performance Evaluation (TPE)

    International Nuclear Information System (INIS)

    Martinez Casado, J.

    2015-01-01

On the Job Training (OJT) and Task Performance Evaluation (TPE). This plan has been developed and put into practice entirely by a group of experienced operation auxiliaries who have distinguished themselves by their professionalism, knowledge of the work, technical expertise and commitment to nuclear safety. (Author)

  11. Response surface methodology based optimization of diesel–n-butanol –cotton oil ternary blend ratios to improve engine performance and exhaust emission characteristics

    International Nuclear Information System (INIS)

    Atmanlı, Alpaslan; Yüksel, Bedri; İleri, Erol; Deniz Karaoglan, A.

    2015-01-01

Highlights: • RSM based optimization for the optimum blend ratio of diesel fuel, n-butanol and cotton oil was performed. • 65.5 vol.% diesel fuel, 23.1 vol.% n-butanol and 11.4 vol.% cotton oil (DnBC) was determined. • DnBC decreased brake torque, brake power, BTE and BMEP, while it increased BSFC. • DnBC decreased NOx, CO and HC emissions. - Abstract: Many studies report that 20% biodiesel is the optimum concentration for biodiesel–diesel fuel blends to improve performance. The present work focuses on finding the optimum blend ratios of diesel fuel, n-butanol and cotton oil for diesel engine applications by using the response surface method (RSM). Experimental test fuels were prepared by choosing 7 different concentrations for which phase decomposition did not occur in the phase diagram at −10 °C. Experiments were carried out at full load and at the constant speed of maximum brake torque (2200 rpm) to determine engine performance and emission parameters. Based on the engine test results, optimization was performed using RSM, considering engine performance and exhaust emission parameters, to identify the component concentrations of the optimum ternary blend. Confirmation tests were employed to verify the concentrations identified by the optimization. The real experiment results and the optimization outputs were in high accordance, as shown by the R² actual values. The optimum component concentration was determined as 65.5 vol.% diesel, 23.1 vol.% n-butanol and 11.4 vol.% cotton oil (DnBC). According to the engine performance tests, brake torque, brake power, BTE and BMEP of DnBC decreased while BSFC increased compared to those of diesel fuel. NOx, CO and HC emissions of DnBC decreased drastically, by 11.33%, 45.17% and 81.45%, respectively
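The core RSM step, fitting a second-order polynomial to the experimental responses and locating its stationary point, looks like this in a single factor (the study optimized several factors and responses jointly; the data here are synthetic).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical single-factor response: brake-specific fuel consumption versus
# n-butanol volume fraction, a noisy quadratic with a minimum near 0.23.
x = np.linspace(0.0, 0.4, 9)                                   # blend fraction
y = 250 + 400 * (x - 0.23) ** 2 + rng.normal(0, 1.0, x.size)   # synthetic BSFC

# Second-order polynomial fit, the backbone of response surface methodology
c2, c1, c0 = np.polyfit(x, y, 2)
x_opt = -c1 / (2 * c2)          # stationary point of the fitted surface
print(round(float(x_opt), 3))   # recovered optimum blend fraction
```

With several factors, the same idea generalizes to a full quadratic model with interaction terms, and the optimum is found by solving the gradient system or by numerical search with desirability weighting across responses.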

  12. Enhancement of the methodology of repository design and post-closure performance assessment for preliminary investigation stage. Progress report on NUMO-JAEA collaborative research in FY2011 (Joint research)

    International Nuclear Information System (INIS)

    Shibata, Masahiro; Sawada, Atsushi; Tachi, Yukio; Makino, Hitoshi; Hayano, Akira; Mitsui, Seiichiro; Taniguchi, Naoki; Oda, Chie; Kitamura, Akira; Osawa, Hideaki; Semba, Takeshi; Hioki, Kazumasa; Kamei, Gento; Ebashi, Takeshi; Kubota, Shigeru; Kurosawa, Susumu; Goto, Junichi; Goto, Takahiro; Ishii, Eiichi; Inagaki, Manabu; Moriya, Toshifumi; Suzuki, Satoru; Ohi, Takao; Ichihara, Takayuki; Ishida, Keisuke; Ishiguro, Katsuhiko; Tsuchi, Hiroyuki

    2012-09-01

    JAEA and NUMO have conducted a collaborative research work which is designed to enhance the methodology of repository design and performance assessment in preliminary investigation stage. The topics of such joint research are (1) study on selection of host rock, (2) study on development of scenario, (3) study on setting nuclide migration parameters, (4) study on ensuring quality of knowledge. With regard to (1), in terms of hydraulic properties, items for assessing rock property, and assessment methodology of groundwater travel time has been organized with interaction from site investigation. With regard to (2), the existing approach has been embodied, in addition, the phenomenological understanding regarding dissolution of and nuclide release from vitrified waste, corrosion of the overpack, long-term performance of the buffer are summarized. With regard to (3), the approach for parameter setting has been improved for sorption and diffusion coefficient of buffer/rock, and applied and tested for parameter setting of key radionuclides. With regard to (4), framework for ensuring quality of knowledge has been studied and examined aimed at the likely disposal facility condition. (author)

  13. New seismograph includes filters

    Energy Technology Data Exchange (ETDEWEB)

    1979-11-02

The new Nimbus ES-1210 multichannel signal enhancement seismograph from EG&G Geometrics has recently been redesigned to include multimode signal filters on each amplifier. The ES-1210F is a shallow exploration seismograph for near-subsurface exploration such as depth-to-bedrock, geological hazard location, mineral exploration, and landslide investigations.

  14. Waste Package Design Methodology Report

    Energy Technology Data Exchange (ETDEWEB)

    D.A. Brownson

    2001-09-28

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report.

  16. Analysis of eleven phenolic compounds including novel p-coumaroyl derivatives in lettuce (Lactuca sativa L.) by ultra-high-performance liquid chromatography with photodiode array and mass spectrometry detection.

    Science.gov (United States)

    Ribas-Agustí, Albert; Gratacós-Cubarsí, Marta; Sárraga, Carmen; García-Regueiro, José-Antonio; Castellari, Massimo

    2011-01-01

    Lettuce is a widely consumed vegetable and a good source of phenolic compounds. Several factors (genetic, agronomical and environmental) can influence the lettuce composition; their effects are not completely defined and more studies are needed on this topic. To develop an improved ultra-high-performance liquid chromatography (UHPLC) method to quantify the main target intact phenolic compounds in lettuce. UHPLC identification of the compounds was supported by PAD spectra and MS(n) analyses. Quantification was carried out by PAD, by creating matrix-matched calibration curves at the specific wavelength for each compound. Sample pretreatment was simplified, with neither purification nor hydrolysis steps. Chromatographic conditions were chosen to minimise matrix interferences and to give a suitable separation of the major phenolic compounds within 27 min. The method allowed the quantification of 11 intact phenolic compounds in Romaine lettuces, including phenolic acids (caffeoyl and p-coumaroyl esters) and flavonoids (quercetin glycosides). Four p-coumaroyl esters were tentatively identified and quantified for the first time in lettuce. The main intact phenolic compounds, including four novel p-coumaroyl esters, were simultaneously quantified in lettuce with optimal performances and a reduced total time of analysis. These findings make headway in the understanding of the lettuce phytochemicals with potential nutritional relevance. Copyright © 2011 John Wiley & Sons, Ltd.
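Matrix-matched calibration as described, i.e. standards spiked into blank extract so that the calibration slope already absorbs any matrix effect, reduces to a linear fit and its inversion. The concentrations and peak areas below are hypothetical.

```python
import numpy as np

# Hypothetical matrix-matched calibration data for one phenolic compound:
# standards spiked into blank lettuce extract, PAD peak areas recorded.
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])          # spiked level, mg/kg
area = np.array([52.0, 101.0, 205.0, 498.0, 1003.0])  # measured peak areas

slope, intercept = np.polyfit(conc, area, 1)  # linear calibration curve

def quantify(peak_area):
    """Concentration of the compound in a sample from its peak area."""
    return (peak_area - intercept) / slope

print(round(quantify(250.0), 2))  # mg/kg for a sample with peak area 250
```

One such curve per compound, at that compound's specific wavelength, is what allows the eleven analytes to be quantified simultaneously without purification or hydrolysis steps.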

  17. Analytic device including nanostructures

    KAUST Repository

    Di Fabrizio, Enzo M.; Fratalocchi, Andrea; Totero Gongora, Juan Sebastian; Coluccio, Maria Laura; Candeloro, Patrizio; Cuda, Gianni

    2015-01-01

    A device for detecting an analyte in a sample comprising: an array including a plurality of pixels, each pixel including a nanochain comprising: a first nanostructure, a second nanostructure, and a third nanostructure, wherein size of the first nanostructure is larger than that of the second nanostructure, and size of the second nanostructure is larger than that of the third nanostructure, and wherein the first nanostructure, the second nanostructure, and the third nanostructure are positioned on a substrate such that when the nanochain is excited by an energy, an optical field between the second nanostructure and the third nanostructure is stronger than an optical field between the first nanostructure and the second nanostructure, wherein the array is configured to receive a sample; and a detector arranged to collect spectral data from a plurality of pixels of the array.

  18. Saskatchewan resources [including uranium]

    Energy Technology Data Exchange (ETDEWEB)

    1979-09-01

    The production of chemicals and minerals for the chemical industry in Saskatchewan are featured, with some discussion of resource taxation. The commodities mentioned include potash, fatty amines, uranium, heavy oil, sodium sulfate, chlorine, sodium hydroxide, sodium chlorate and bentonite. Following the successful outcome of the Cluff Lake inquiry, the uranium industry is booming. Some developments and production figures for Gulf Minerals, Amok, Cenex and Eldorado are mentioned.

  19. RHIC Data Correlation Methodology

    International Nuclear Information System (INIS)

    Michnoff, R.; D'Ottavio, T.; Hoff, L.; MacKay, W.; Satogata, T.

    1999-01-01

    A requirement for RHIC data plotting software and physics analysis is the correlation of data from all accelerator data gathering systems. Data correlation provides the capability for a user to request a plot of multiple data channels vs. time, and to make meaningful time-correlated data comparisons. The task of data correlation for RHIC requires careful consideration because data acquisition triggers are generated from various asynchronous sources including events from the RHIC Event Link, events from the two Beam Sync Links, and other unrelated clocks. In order to correlate data from asynchronous acquisition systems a common time reference is required. The RHIC data correlation methodology will allow all RHIC data to be converted to a common wall clock time, while still preserving native acquisition trigger information. A data correlation task force team, composed of the authors of this paper, has been formed to develop data correlation design details and provide guidelines for software developers. The overall data correlation methodology will be presented in this paper
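The correlation scheme reduces to converting each system's native trigger index to a common wall-clock time and then matching samples by nearest timestamp. The epochs and tick periods below are illustrative placeholders, not RHIC's actual link parameters.

```python
from bisect import bisect_left

def to_wall_clock(trigger_count, t0, tick_seconds):
    """Convert a native acquisition trigger count to wall-clock time, given
    that system's epoch t0 and tick period (simplified linear model)."""
    return t0 + trigger_count * tick_seconds

def nearest_sample(times, values, t):
    """Value whose (sorted) wall-clock timestamp is closest to time t."""
    i = bisect_left(times, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
    return values[min(candidates, key=lambda j: abs(times[j] - t))]

# Two acquisition systems with different epochs and tick rates, correlated
# on the shared wall-clock axis (all numbers invented):
times_a = [to_wall_clock(n, t0=1000.0, tick_seconds=0.5) for n in range(5)]
times_b = [to_wall_clock(n, t0=1000.2, tick_seconds=0.3) for n in range(5)]
values_b = [10, 11, 12, 13, 14]
print(nearest_sample(times_b, values_b, times_a[2]))  # B's sample nearest t=1001.0
```

Because the conversion is per-system, native trigger information is preserved alongside the derived wall-clock time, which is the property the methodology requires.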

  20. Methodology of a systematic review.

    Science.gov (United States)

    Linares-Espinós, E; Hernández, V; Domínguez-Escrig, J L; Fernández-Pello, S; Hevia, V; Mayor, J; Padilla-Fernández, B; Ribal, M J

    2018-05-03

The objective of evidence-based medicine is to employ the best scientific information available and apply it to clinical practice. Understanding and interpreting the scientific evidence involves understanding the available levels of evidence, where systematic reviews and meta-analyses of clinical trials are at the top of the levels-of-evidence pyramid. The review process should be well developed and planned to reduce biases and eliminate irrelevant and low-quality studies. The steps for implementing a systematic review include (i) correctly formulating the clinical question to answer (PICO), (ii) developing a protocol (inclusion and exclusion criteria), (iii) performing a detailed and broad literature search and (iv) screening the abstracts of the studies identified in the search and subsequently the selected complete texts (PRISMA). Once the studies have been selected, we need to (v) extract the necessary data into a form designed in the protocol to summarise the included studies, (vi) assess the biases of each study, identifying the quality of the available evidence, and (vii) develop tables and text that synthesise the evidence. A systematic review involves a critical and reproducible summary of the results of the available publications on a particular topic or clinical question. To improve scientific writing, the methodology for implementing a systematic review is presented in a structured manner. Copyright © 2018 AEU. Published by Elsevier España, S.L.U. All rights reserved.

  1. Methodology for the case studies

    NARCIS (Netherlands)

    Smits, M.J.W.; Woltjer, G.B.

    2017-01-01

This document describes the methodology and selection of the case studies. It is meant as a guideline for the case studies and, together with the other reports in this work package, can be a source of information for policy officers, interest groups and researchers evaluating or performing impact

  2. Development of measures to assess the safety of existing NPPs and the effectiveness of regulations and regulatory actions (including 'prescriptive' and 'performance based' approaches). Peer discussions on regulatory practices

    International Nuclear Information System (INIS)

    1996-09-01

    This report arises from the fourth series of peer discussions on regulatory practices, entitled 'Development of measures to assess the safety of existing nuclear power plants and the effectiveness of regulations and regulatory actions (including prescriptive and performance based approaches)'. Senior regulators from 23 Member States participated in four peer group discussions during 1995-1996. This report presents the outcome of these meetings and recommendations of good practices identified by these senior regulators. The purpose of this report is to disseminate the views which the senior regulators presented at the meetings relating to measures used for assessing the safety of existing nuclear power plants and evaluating the effectiveness of regulators and regulatory actions. The intention in doing this is to assist Member States in the enhancement of their regulatory practices by identifying commonly accepted good practices. This report is structured so that it covers the subject matter under the following main headings: 'Prescriptive and Performance Based' Approaches to Regulation; Common Features of Regulatory Approaches; Effectiveness of the Regulator and Regulatory Actions; Recommendations of Good Practice. It is important to note that recommendations of good practice are included if they have been identified by at least one of the groups. It does not follow that all of the groups or individual Member States would necessarily endorse all of the recommendations. However, it is considered that if a single group of senior regulators judges that a particular practice is worthy of recommendation, then it should be included for serious consideration. In some cases the same recommendations arise from all of the groups.

  3. Implementing DBS methodology for the determination of Compound A in monkey blood: GLP method validation and investigation of the impact of blood spreading on performance.

    Science.gov (United States)

    Fan, Leimin; Lee, Jacob; Hall, Jeffrey; Tolentino, Edward J; Wu, Huaiqin; El-Shourbagy, Tawakol

    2011-06-01

    This article describes validation work for analysis of an Abbott investigational drug (Compound A) in monkey whole blood with dried blood spots (DBS). The impact of DBS spotting volume on analyte concentration was investigated. The quantitation range was between 30.5 and 10,200 ng/ml. Accuracy and precision of quality controls, linearity of calibration curves, matrix effect, selectivity, dilution, recovery and multiple stabilities were evaluated in the validation, and all demonstrated acceptable results. Incurred sample reanalysis was performed with 57 out of 58 samples having a percentage difference (versus the mean value) less than 20%. A linear relationship between the spotting volume and the spot area was drawn. The influence of spotting volume on concentration was discussed. All validation results met good laboratory practice acceptance requirements. Radial spreading of blood on DBS cards can be a factor in DBS concentrations at smaller spotting volumes.
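
    The incurred sample reanalysis criterion described above (percentage difference versus the mean within 20%) is simple to compute; the sketch below is a generic illustration, not the validation software used in the study.

```python
def isr_percent_difference(original, reanalysis):
    """Percentage difference of two measurements relative to their mean."""
    mean = (original + reanalysis) / 2.0
    return abs(original - reanalysis) / mean * 100.0

def isr_passes(original, reanalysis, limit=20.0):
    """True if the incurred-sample pair meets the acceptance limit."""
    return isr_percent_difference(original, reanalysis) <= limit
```

    For example, a pair of 100 and 110 ng/ml differs by about 9.5% from its mean and passes, while 100 and 150 ng/ml differ by 40% and fail.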

  4. Environmental Testing Methodology in Biometrics

    OpenAIRE

    Fernández Saavedra, Belén; Sánchez Reíllo, Raúl; Alonso Moreno, Raúl; Miguel Hurtado, Óscar

    2010-01-01

    8 pages document + 5-slide presentation.-- Contributed to: 1st International Biometric Performance Conference (IBPC 2010, NIST, Gaithersburg, MD, US, Mar 1-5, 2010). Recently, biometrics is used in many security systems and these systems can be located in different environments. As many experts claim and previous works have demonstrated, environmental conditions influence biometric performance. Nevertheless, there is not a specific methodology for testing this influence at the moment...

  5. Being Included and Excluded

    DEFF Research Database (Denmark)

    Korzenevica, Marina

    2016-01-01

    Following the civil war of 1996–2006, there was a dramatic increase in the labor mobility of young men and the inclusion of young women in formal education, which led to the transformation of the political landscape of rural Nepal. Mobility and schooling represent a level of prestige that rural… politics. It analyzes how formal education and mobility either challenge or reinforce traditional gendered norms which dictate a lowly position for young married women in the household and their absence from community politics. The article concludes that women are simultaneously excluded from and included in… community politics. On the one hand, their mobility and decision-making powers decrease with the increase in the labor mobility of men, and their newly gained education is politically devalued when compared to the informal education that men gain through mobility; on the other hand, schooling strengthens…

  6. Tourism Methodologies - New Perspectives, Practices and Procedures

    DEFF Research Database (Denmark)

    This volume offers methodological discussions within the multidisciplinary field of tourism and shows how tourism researchers develop and apply new tourism methodologies. The book is presented as an anthology, giving voice to many diverse researchers who reflect on tourism methodology in different… codings and analysis, and tapping into the global network of social media… in interview and field work situations, and how do we engage with the performative aspects of tourism as a field of study? The book acknowledges that research is also performance and that it constitutes an aspect of intervention in the situations and contexts it is trying to explore. This is an issue dealt…

  7. Determination of reversed-phase high performance liquid chromatography based octanol-water partition coefficients for neutral and ionizable compounds: Methodology evaluation.

    Science.gov (United States)

    Liang, Chao; Qiao, Jun-Qin; Lian, Hong-Zhen

    2017-12-15

    Reversed-phase liquid chromatography (RPLC) based octanol-water partition coefficient (logP) or distribution coefficient (logD) determination methods were revisited and assessed comprehensively. Classic isocratic and some gradient RPLC methods were conducted and evaluated for neutral, weak acid and basic compounds. Different lipophilicity indexes in logP or logD determination were discussed in detail, including the retention factor log k_w corresponding to neat water as mobile phase, extrapolated via the linear solvent strength (LSS) model from isocratic runs and calculated with software from gradient runs, the chromatographic hydrophobicity index (CHI), the apparent gradient capacity factor (k_g') and the gradient retention time (t_g). Among the lipophilicity indexes discussed, log k_w from either isocratic or gradient elution methods correlated best with logP or logD. Therefore log k_w is recommended as the preferred lipophilicity index for logP or logD determination. log k_w easily calculated from methanol gradient runs might be the main candidate to replace log k_w calculated from the classic isocratic run as the ideal lipophilicity index. These revisited RPLC methods were not applicable to strongly ionized compounds that are hardly ion-suppressed. A previously reported imperfect ion-pair RPLC (IP-RPLC) method was further explored for studying distribution coefficients (logD) of sulfonic acids that are totally ionized in the mobile phase. Notably, experimental logD values of sulfonic acids were given for the first time. The IP-RPLC method provided a distinct way to explore logD values of ionized compounds. Copyright © 2017 Elsevier B.V. All rights reserved.
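
    The LSS extrapolation mentioned above models isocratic retention as log k = log k_w - S*phi, where phi is the organic modifier fraction, so log k_w is recovered as the intercept of a linear fit at phi = 0. A minimal sketch with synthetic, exactly linear retention data:

```python
def fit_lss(phis, logks):
    """Least-squares fit of the LSS model log k = log k_w - S * phi.

    Returns (log k_w, S): the intercept at phi = 0 estimates retention in
    neat water; S is the solvent-strength slope.
    """
    n = len(phis)
    mx = sum(phis) / n
    my = sum(logks) / n
    sxx = sum((x - mx) ** 2 for x in phis)
    sxy = sum((x - mx) * (y - my) for x, y in zip(phis, logks))
    slope = sxy / sxx
    return my - slope * mx, -slope

# Synthetic isocratic runs following log k = 3.0 - 4.0 * phi exactly.
log_kw, S = fit_lss([0.4, 0.5, 0.6], [1.4, 1.0, 0.6])
```

    With real retention data the fit would carry residual scatter; here the points are exactly collinear, so the intercept and slope are recovered exactly.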

  8. Methodologic frontiers in environmental epidemiology.

    OpenAIRE

    Rothman, K J

    1993-01-01

    Environmental epidemiology comprises the epidemiologic study of those environmental factors that are outside the immediate control of the individual. Exposures of interest to environmental epidemiologists include air pollution, water pollution, occupational exposure to physical and chemical agents, as well as psychosocial elements of environmental concern. The main methodologic problem in environmental epidemiology is exposure assessment, a problem that extends through all of epidemiologic re...

  9. Test reactor risk assessment methodology

    International Nuclear Information System (INIS)

    Jennings, R.H.; Rawlins, J.K.; Stewart, M.E.

    1976-04-01

    A methodology has been developed for the identification of accident initiating events and the fault modeling of systems, including common mode identification, as these methods are applied in overall test reactor risk assessment. The methods are exemplified by a determination of risks to a loss of primary coolant flow in the Engineering Test Reactor

  10. Methodology for analyzing risk at nuclear facilities

    International Nuclear Information System (INIS)

    Yoo, Hosik; Lee, Nayoung; Ham, Taekyu; Seo, Janghoon

    2015-01-01

    economic damages are difficult to evaluate. Therefore, radiation levels and theft of nuclear materials that could be quantified are adopted as attributes for analyzing the consequences. Awareness of the nuclear security culture and physical protection resources such as staffing, capabilities, and cost required to provide PP should be considered when evaluating risks. In this study, these attributes are included in the measure of human resources. Human resources include such factors as trustworthiness, degree of nuclear security culture awareness, and frequency of psychiatric testing of employees. A case study performed on hypothetical facilities demonstrates that the developed methodology could be used to analyze innovative nuclear systems as well as existing facilities

  11. 'Reference Biospheres' for solid radioactive waste disposal: the BIOMASS Methodology

    International Nuclear Information System (INIS)

    Crossland, I.G.; Pinedo, P.; Kessler, J.H.; Torres-Vidal, C.; Walters, B.

    2005-01-01

    The BIOMASS Theme 1 project has developed a methodology for the logical and defensible construction of 'assessment biospheres': mathematical representations of biospheres used in the total system performance assessment of radioactive waste disposal. The BIOMASS Methodology provides a systematic approach to decision making, including decisions on how to address biosphere change. The BIOMASS Methodology was developed through consultation and collaboration with many relevant organisations, including regulators, operators and a variety of independent experts. It has been developed to be practical and to be consistent with recommendations from ICRP and IAEA on radiation protection in the context of the disposal of long-lived solid radioactive wastes. The five main steps in the methodology are described in this paper. The importance of a clear assessment context, to clarify intentions and to support a coherent biosphere assessment process within an overall repository performance assessment, is strongly emphasised. A well described assessment context is an important tool for ensuring consistency across the performance assessment as a whole. The use of interaction matrices has been found to be helpful in clarifying the interactions between different habitats within the biosphere system and the significant radionuclide transfer pathways within the biosphere system. Matrices also provide a useful means of checking for consistency

  12. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  13. Photovoltaic module energy rating methodology development

    Energy Technology Data Exchange (ETDEWEB)

    Kroposki, B.; Myers, D.; Emery, K.; Mrig, L. [National Renewable Energy Lab., Golden, CO (United States); Whitaker, C.; Newmiller, J. [Endecon Engineering, San Ramon, CA (United States)

    1996-05-01

    A consensus-based methodology to calculate the energy output of a PV module will be described in this paper. The methodology develops a simple measure of PV module performance that provides for a realistic estimate of how a module will perform in specific applications. The approach makes use of the weather data profiles that describe conditions throughout the United States and emphasizes performance differences between various module types. An industry-representative Technical Review Committee has been assembled to provide feedback and guidance on the strawman and final approach used in developing the methodology.
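
    The core of such an energy rating, integrating module output over an hourly weather profile, can be sketched as below. The module area, STC efficiency, and temperature coefficient are illustrative placeholders rather than values from the methodology.

```python
def module_energy_kwh(hours, area_m2=1.6, stc_eff=0.17,
                      temp_coeff_per_C=-0.004):
    """Estimate PV module energy over a list of hourly weather records.

    hours: iterable of (plane-of-array irradiance in W/m^2, cell temp in C).
    Efficiency is derated linearly from its 25 C standard-test value.
    """
    energy_wh = 0.0
    for g_poa, t_cell in hours:
        eff = stc_eff * (1.0 + temp_coeff_per_C * (t_cell - 25.0))
        energy_wh += g_poa * area_m2 * eff
    return energy_wh / 1000.0
```

    Running the same module through different regional weather profiles then yields the application-specific energy estimates the methodology aims for.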

  14. GO methodology. Volume 1. Overview manual

    International Nuclear Information System (INIS)

    1983-06-01

    The GO methodology is a success-oriented probabilistic system performance analysis technique. The methodology can be used to quantify system reliability and availability, identify and rank critical components and the contributors to system failure, construct event trees, and perform statistical uncertainty analysis. Additional capabilities of the method currently under development will enhance its use in evaluating the effects of external events and common cause failures on system performance. This Overview Manual provides a description of the GO Methodology, how it can be used, and benefits of using it in the analysis of complex systems
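
    As a flavor of success-oriented quantification (generic reliability identities, not the GO code itself), a system of two redundant trains feeding one common component succeeds when either train works and the common component works:

```python
def series(*ps):
    """Success probability of components that must all work."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def parallel(*ps):
    """Success probability when any one redundant component suffices."""
    fail = 1.0
    for p in ps:
        fail *= (1.0 - p)
    return 1.0 - fail

# Two 0.9-reliable redundant trains feeding a 0.95-reliable component.
p_system = series(parallel(0.9, 0.9), 0.95)
```

    Here the redundant pair fails only when both trains fail (probability 0.01), so the system succeeds with probability 0.99 x 0.95 = 0.9405.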

  15. Methodology for the analysis of external flooding in CN Asco-II and CN Vandellos during the performance of stress tests; Metodologia para el analisis de inundaciones externas en CN Asco y CN Vandellos II duante la realizacion de las pruebas de resistencoa o stress-tests

    Energy Technology Data Exchange (ETDEWEB)

    Aleman, A.; Cobas, I.; Sabater, J.; Canadell, F.; Garces, L.; Otero, M.

    2012-07-01

    The work carried out in relation to external flooding has been synthesized into a single methodology covering the entire process of obtaining margins against external flooding, including identification of the external events that could cause flooding.

  16. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej; Pereší ni, Peter; Kostić, Dejan; Canini, Marco

    2018-01-01

    and performance characteristics is essential for ensuring successful and safe deployments.We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a

  17. Status of the Gen-IV Proliferation Resistance and Physical Protection (PRPP) Evaluation Methodology

    International Nuclear Information System (INIS)

    Whitlock, J.; Bari, R.; Peterson, P.; Padoani, F.; Cojazzi, G.G.M.; Renda, G.; ); Cazalet, J.; Haas, E.; Hori, K.; Kawakubo, Y.; Chang, S.; Kim, H.; Kwon, E.-H.; Yoo, H.; Chebeskov, A.; Pshakin, G.; Pilat, J.F.; Therios, I.; Bertel, E.

    2015-01-01

    Methodologies have been developed within the Generation IV International Forum (GIF) to support the assessment and improvement of system performance in the areas of safeguards, security, economics and safety. Of these four areas, safeguards and security are the subjects of the GIF working group on Proliferation Resistance and Physical Protection (PRPP). Since the PRPP methodology (now at Revision 6) represents a mature, generic, and comprehensive evaluation approach, and is freely available on the GIF public website, several non-GIF technical groups have chosen to utilize the PRPP methodology for their own goals. Indeed, the results of the evaluations performed with the methodology are intended for three types of generic users: system designers, programme policy makers, and external stakeholders. The PRPP Working Group developed the methodology through a series of demonstration and case studies. In addition, over the past few years various national and international groups have applied the methodology to inform nuclear energy system designs, as well as to support the development of approaches to advanced safeguards. A number of international workshops have also been held which have introduced the methodology to design groups and other stakeholders. In this paper we summarize the technical progress and accomplishments of the PRPP evaluation methodology, including applications outside GIF, and we outline the PRPP methodology's relationship with the IAEA's INPRO methodology. Current challenges with the efficient implementation of the methodology are outlined, along with our path forward for increasing its accessibility to a broader stakeholder audience, including supporting the next generation of skilled professionals in the nuclear non-proliferation field. (author)

  18. Probabilistic methodology for turbine missile risk analysis

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.; Frank, R.A.

    1984-01-01

    A methodology has been developed for estimation of the probabilities of turbine-generated missile damage to nuclear power plant structures and systems. Mathematical models of the missile generation, transport, and impact events have been developed and sequenced to form an integrated turbine missile simulation methodology. Probabilistic Monte Carlo techniques are used to estimate the plant impact and damage probabilities. The methodology has been coded in the TURMIS computer code to facilitate numerical analysis and plant-specific turbine missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and probabilities have been estimated for a hypothetical nuclear power plant case study. (orig.)
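
    The generation, transport, and impact sequence can be mimicked with a toy Monte Carlo chain of independent events. The event probabilities below are illustrative placeholders, not TURMIS inputs.

```python
import random

def damage_probability(p_generate, p_strike, p_damage_given_strike,
                       n_trials=200_000, seed=42):
    """Toy Monte Carlo estimate of end-to-end missile damage probability.

    Each trial samples the event sequence: missile generation, transport
    and impact on a target, and damage given a strike.
    """
    rng = random.Random(seed)
    damaged = 0
    for _ in range(n_trials):
        if (rng.random() < p_generate
                and rng.random() < p_strike
                and rng.random() < p_damage_given_strike):
            damaged += 1
    return damaged / n_trials
```

    With independent events the estimate converges to the product of the three probabilities; real analyses replace the constant probabilities with models of turbine failure, missile trajectory, and structural response.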

  19. Socially Responsible Investments : Methodology, Risk and Performance

    NARCIS (Netherlands)

    Renneboog, L.D.R.; Ter Horst, J.R.; Zhang, C.

    2007-01-01

    This paper surveys the literature on socially responsible investments (SRI). Over the past decade, SRI has experienced an explosive growth around the world. Particular to the SRI funds is that both financial goals and social objectives are pursued. While corporate social responsibility (CSR) -

  20. Methodology for technical risk assessment

    International Nuclear Information System (INIS)

    Waganer, L.M.; Zuckerman, D.S.

    1983-01-01

    A methodology has been developed for and applied to the assessment of the technical risks associated with an evolving technology. This methodology, originally developed for fusion by K. W. Billman and F. R. Scott at EPRI, has been applied to assess the technical risk of a fuel system for a fusion reactor. Technical risk is defined as the risk that a particular technology or component which is currently under development will not achieve a set of required technical specifications (i.e. probability of failure). The individual steps in the technical risk assessment are summarized. The first step in this methodology is to clearly and completely quantify the technical requirements for the particular system being examined. The next step is to identify and define subsystems and various options which appear capable of achieving the required technical performance. The subsystem options are then characterized regarding subsystem functions, interface requirements with the subsystems and systems, important components, developmental obstacles and technical limitations. Key technical subsystem performance parameters are identified which directly or indirectly relate to the system technical specifications. Past, existing and future technical performance data from subsystem experts are obtained by using a Bayesian Interrogation technique. The input data is solicited in the form of probability functions. Thus the output performance of the system is expressed as probability functions
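
    Under the simplest distributional assumption (not stated in the abstract) that a key performance parameter is normally distributed, technical risk in the probability-of-failure sense reduces to a normal tail probability:

```python
import math

def prob_misses_spec(mean, std, required_minimum):
    """P(performance < required_minimum) for a normal performance parameter.

    Uses the normal CDF built from math.erf; this is the probability-of-
    failure sense of technical risk for a larger-is-better specification.
    """
    z = (required_minimum - mean) / (std * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z))
```

    For example, a parameter expected at 10 units with a standard deviation of 1 misses an 8-unit requirement only about 2.3% of the time.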

  1. New quickest transient detection methodology. Nuclear engineering applications

    International Nuclear Information System (INIS)

    Wang, Xin; Jevremovic, Tatjana; Tsoukalas, Lefteri H.

    2003-01-01

    A new intelligent systems methodology for quickest online transient detection is presented. Based on information that includes, but is not limited to, statistical features, energy of frequency components and wavelet coefficients, the new methodology decides whether a transient has emerged. A fuzzy system makes the final decision, the membership functions of which are obtained by artificial neural networks and adjusted in an online manner. Comparisons are performed with conventional methods for transient detection using simulated and plant data. The proposed methodology could be useful in power plant operations, diagnostic and maintenance activities. It is also considered as a design tool for quick design modifications in a virtual design environment aimed at next generation University Research and Training Reactors (URTRs). (The virtual design environment is pursued as part of the Big-10 Consortium sponsored by the new Innovations in Nuclear Infrastructure and Education (INIE) program sponsored by the US Department of Energy.) (author)
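
    For flavor, a generic moving-window z-score detector illustrates online transient detection from statistical features; it is a simplified stand-in, not the paper's fuzzy/neural system.

```python
import math
from collections import deque

def detect_transients(signal, window=20, threshold=4.0):
    """Flag sample indices that deviate from the trailing-window mean
    by more than `threshold` trailing standard deviations."""
    buf = deque(maxlen=window)
    flagged = []
    for i, x in enumerate(signal):
        if len(buf) == buf.maxlen:
            m = sum(buf) / len(buf)
            sd = math.sqrt(sum((v - m) ** 2 for v in buf) / len(buf))
            if sd > 0 and abs(x - m) / sd > threshold:
                flagged.append(i)
        buf.append(x)
    return flagged
```

    A steady low-amplitude signal followed by a sudden jump is flagged only at the jump; the paper's approach additionally fuses frequency-energy and wavelet features through an adaptive fuzzy decision stage.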

  2. Introducing an ILS methodology into research reactors

    International Nuclear Information System (INIS)

    Lorenzo, N. de; Borsani, R.C.

    2003-01-01

    Integrated Logistics Support (ILS) is the managerial organisation that co-ordinates the activities of many disciplines to develop the supporting resources (training, staffing, design aids, equipment removal routes, etc.) required by technologically complex systems. The application of an ILS methodology in defence projects is described in many places, but it is rarely illustrated for other areas; the present paper therefore deals with applying this approach to research reactors under design or already in operation. Although better results are obtained when it is applied from the very beginning of a project, it can be applied successfully to facilities already in operation to improve their capability in a cost-effective way. In applying this methodology, the key objectives shall be identified first in order to tailor the whole approach. Generally, in high-power multipurpose reactors, obtaining maximum profit at the lowest possible cost without reducing safety levels is the key issue, while in others the goal is to minimise drawbacks such as spurious shutdowns and low-quality experimental results, or to reduce staff dose to ALARA values. These items need to be quantified to establish a system status baseline from which to trace the evolution of the process. Thereafter, specific logistics analyses should be performed in the different areas composing the system. RAMS (Reliability, Availability, Maintainability and Supportability), manning needs, training needs and supply needs are some examples of these specialised logistic assessments. The following paragraphs summarise the different areas encompassed by this ILS methodology. Plant design is influenced by focussing the designers' attention on the objectives already identified. Careful design reviews are performed only at an early design stage, as later application is of little use. This paper presents a methodology, including appropriate tools, for ensuring that designers abide by ILS issues and key objectives through the

  3. Introduction to LCA Methodology

    DEFF Research Database (Denmark)

    Hauschild, Michael Z.

    2018-01-01

    In order to offer the reader an overview of the LCA methodology in the preparation of the more detailed description of its different phases, a brief introduction is given to the methodological framework according to the ISO 14040 standard and the main elements of each of its phases. Emphasis...

  4. Methodologies, languages and tools

    International Nuclear Information System (INIS)

    Amako, Katsuya

    1994-01-01

    This is a summary of the 'Methodologies, Languages and Tools' session at the CHEP'94 conference. All the contributions on methodologies and languages are relevant to the object-oriented approach. Other topics presented relate to various software tools in the down-sized computing environment

  5. Menopause and Methodological Doubt

    Science.gov (United States)

    Spence, Sheila

    2005-01-01

    Menopause and methodological doubt begins by making a tongue-in-cheek comparison between Descartes' methodological doubt and the self-doubt that can arise around menopause. A hermeneutic approach is taken in which Cartesian dualism and its implications for the way women are viewed in society are examined, both through the experiences of women…

  6. VEM: Virtual Enterprise Methodology

    DEFF Research Database (Denmark)

    Tølle, Martin; Vesterager, Johan

    2003-01-01

    This chapter presents a virtual enterprise methodology (VEM) that outlines activities to consider when setting up and managing virtual enterprises (VEs). As a methodology the VEM helps companies to ask the right questions when preparing for and setting up an enterprise network, which works...

  7. Data Centric Development Methodology

    Science.gov (United States)

    Khoury, Fadi E.

    2012-01-01

    Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for their development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…

  8. Methodology for evaluation of alternative technologies applied to nuclear fuel reprocessing

    International Nuclear Information System (INIS)

    Selvaduray, G.S.; Goldstein, M.K.; Anderson, R.N.

    1977-07-01

    An analytic methodology has been developed to compare the performance of various nuclear fuel reprocessing techniques for advanced fuel cycle applications, including low-proliferation-risk systems. The need to identify and compare those processes which have the versatility to handle the variety of fuel types expected to be in use in the next century is becoming increasingly imperative. This methodology allows processes at any stage of development to be compared, and allows the effect of changing external conditions on a process to be assessed

  9. Performance optimization methodology in a Francis turbine vacuum cleaner based on the modification of the boundary conditions; Metodologia de optimizacion del rendimiento en un aspirador de turbina Francis en base a la modificacion de las condiciones de frontera

    Energy Technology Data Exchange (ETDEWEB)

    Galvan Gonzalez, S.; Rubio Maya, C.; Mendoza Covarrubias, C. [Universidad Michoacana de San Nicolas Hidalgo, Morelia, Michoacan (Mexico)]. E-mail: srgalvan@umich.mx; crmaya@umich.mx; cmendoza@umich.mx; Pacheco Ibarra, J.; Martinez Patino, J. [Universidad de Guanajuato, Salamanca, Guanajuato (Mexico)

    2010-11-15

    This paper focuses on establishing an optimization methodology for maximizing draft tube performance as a function of the inlet velocity profile. The overall work comprises four steps: parametrization of the inlet velocity profile, set-up of the numerical optimization, the numerical computational fluid dynamics (CFD) draft tube model, and definition of the objective function. Because the inlet velocity profile is a new challenge, each step of the parametrization must be properly validated. In the first step, a suitable parametrization with an appropriate number of variables is chosen to approximate an experimental draft tube inlet velocity profile. The second step considers the reduction of the actual numerical model to develop the CFD calculations. In the third step, the optimization algorithm set-up is specified. Finally, the objective function is evaluated. As each step is properly validated, it is considered that this proven methodology will help to find an inlet velocity profile shape able to suppress or mitigate the undesirable draft tube flow characteristics.

  10. Thermodynamic and thermoeconomic analyses of a trigeneration (TRIGEN) system with a gas-diesel engine: Part I - Methodology

    International Nuclear Information System (INIS)

    Balli, Ozgur; Aras, Haydar; Hepbasli, Arif

    2010-01-01

    This paper consists of two parts. Part 1 deals with the thermodynamic and thermoeconomic methodology of a trigeneration (TRIGEN) system with a rated output of 6.5 MW gas-diesel engine while the application of the methodology is presented in Part 2. The system has been installed in the Eskisehir Industry Estate Zone in Turkey. Thermodynamic methodology includes the relations and performance parameters for energy and exergy analysis, while thermoeconomic methodology covers the cost balance relations, cost of products and thermodynamic inefficiencies, relative cost difference and exergoeconomic factor.
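
    Two performance parameters such an analysis typically defines, a first-law energy utilization factor and a second-law exergy efficiency, can be sketched with generic textbook definitions; the numbers are made up for illustration, not results from Part 2.

```python
def energy_utilization_factor(w_net, q_heating, q_cooling, fuel_energy_rate):
    """First-law figure: useful power + heating + cooling per fuel input."""
    return (w_net + q_heating + q_cooling) / fuel_energy_rate

def exergy_efficiency(exergy_of_products, exergy_of_fuel):
    """Second-law figure: exergy delivered in products per fuel exergy."""
    return exergy_of_products / exergy_of_fuel

# Illustrative rates in MW for an engine-based TRIGEN plant.
euf = energy_utilization_factor(6.5, 3.0, 1.5, 16.0)
```

    The exergy efficiency is typically well below the energy utilization factor, since low-temperature heating and cooling streams carry far less exergy than energy.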

  11. THE ASSESSMENT METHODOLOGY PDCA/PDSA – A METHODOLOGY FOR COORDINATING THE EFFORTS TO IMPROVE THE ORGANIZATIONAL PROCESSES TO ACHIEVE EXCELLENCE

    Directory of Open Access Journals (Sweden)

    Cristina Raluca POPESCU

    2015-07-01

    In the paper “The Assessment Methodology PDCA/PDSA – A Methodology for Coordinating the Efforts to Improve the Organizational Processes to Achieve Excellence” the authors present the basic features of the PDCA/PDSA assessment methodology, which is designed to coordinate efforts to improve organizational processes in order to achieve excellence. In the first part of the paper (the introduction), the authors present the general background concerning the performance of management business processes and the importance of achieving excellence and of correctly assessing it. In the second part of the paper, the authors describe the characteristics of the PDCA/PDSA assessment methodology from a theoretical point of view. In the current state of the global economy, global performance includes economic, social and environmental issues, while effectiveness and efficiency acquire new dimensions, both quantitative and qualitative. Performance needs to adopt a more holistic view of the interdependence of internal and external parameters (quantitative and qualitative, technical and human, physical and financial), thus leading to what we call today overall performance.

  12. 24 CFR 904.205 - Training methodology.

    Science.gov (United States)

    2010-04-01

    ... Training methodology. Equal in importance to the content of the pre- and post-occupancy training is the training methodology. Because groups vary, there should be adaptability in the communication and learning experience. Methods to be utilized may include group presentations, small discussion groups, special classes...

  13. A methodology for developing distributed programs

    NARCIS (Netherlands)

    Ramesh, S.; Mehndiratta, S.L.

    1987-01-01

    A methodology, different from the existing ones, for constructing distributed programs is presented. It is based on the well-known idea of developing distributed programs via synchronous and centralized programs. The distinguishing features of the methodology are: 1) specification include process

  14. Methodological issues of postoperative cognitive dysfunction research

    DEFF Research Database (Denmark)

    Funder, Kamilia S; Steinmetz, Jacob; Rasmussen, Lars S

    2010-01-01

    to reveal postoperative cognitive decline, and questionnaires are not useful for this purpose. There is a profound lack of consensus regarding the research methodology for detection of cognitive deterioration, especially the diagnostic criteria. Issues, such as baseline performance, learning effects...

  15. Contribution to the integration methodology of environment in the small and medium enterprises or industries: evaluation of environmental performances; Contribution a la methodologie d'integration de l'environnement dans les PME-PMI: evaluation des performances environnementales

    Energy Technology Data Exchange (ETDEWEB)

    Personne, M

    1998-01-16

    The integration of environmental criteria into the operation of industrial plants is nowadays an obligation for companies. Implementation of an Environmental Management System (EMS) is a means to integrate these criteria, and registration of the system (under the ISO 14001 or EMAS standards) enables companies to demonstrate the validity of their environmental behaviour to interested parties. Our experience with Small and Medium Enterprises (SMEs) has allowed us to note the gap between their level of environmental integration and EMS requirements. In addition, we have observed that the environmental assessment methods which could enable SMEs to make up for lost time were not adapted to their specificities. However, two recent approaches are innovative: the first is based on a progressive process, the second on an environmental information system based on the construction of indicators. On the basis of a study of existing methods, enriched by our experience with SMEs, our approach consists of developing an environmental integration method that combines the progressive aspect (construction of a 'multi-phase' method) with information treatment (exploitation of environmental data through the construction of indicators). We propose a four-phase method (environmental performance evaluation, internal and external exploitation of results, and perpetuation of the process), setting up an information treatment system by means of compliance, progress and monitoring indicators. Leading to the implementation of a continuous improvement cycle for environmental performance, this process enables companies to move towards EMS implementation. (author)

  16. Methodological Problems of Nanotechnoscience

    Science.gov (United States)

    Gorokhov, V. G.

    Recently, we have reported on the definitions of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline; it also evolves in a “nonclassical” way. Nanoontology, or the nano scientific world view, has the function of methodological orientation for the choice of theoretical means and methods toward a solution to scientific and engineering problems. This makes it possible to change from one explanation and scientific world view to another without any problems. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to Systems Engineering as the analysis and design of large-scale, complex, man/machine systems, but for micro- and nanosystems. Nano systems engineering, like macro systems engineering, includes not only systems design but also complex research. The design orientation influences the change of priorities in complex research and the relation to knowledge, not only "knowledge about something" but also knowledge as a means of activity: from the beginning, control and restructuring of matter at the nanoscale is a necessary element of nanoscience.

  17. Nuclear power plant simulation facility evaluation methodology

    International Nuclear Information System (INIS)

    Haas, P.M.; Carter, R.J.; Laughery, K.R. Jr.

    1985-01-01

    A methodology for the evaluation of nuclear power plant simulation facilities with regard to their acceptability for use in the US Nuclear Regulatory Commission (NRC) operator licensing exam is described. The evaluation is based primarily on simulator fidelity, but incorporates some aspects of direct operator/trainee performance measurement. The panel presentation and paper discuss data requirements, data collection, data analysis and criteria for conclusions regarding the fidelity evaluation, and summarize the proposed use of direct performance measurement. While field testing and refinement of the methodology are recommended, this initial effort provides a firm basis for NRC to fully develop the necessary methodology.

  18. Disposal criticality analysis methodology for fissile waste forms

    International Nuclear Information System (INIS)

    Davis, J.W.; Gottlieb, P.

    1998-03-01

    A general methodology has been developed to evaluate the criticality potential of the wide range of waste forms planned for geologic disposal. The range of waste forms includes commercial spent fuel, high level waste, DOE spent fuel (including highly enriched), MOX using weapons grade plutonium, and immobilized plutonium. The disposal of these waste forms will be in a container with sufficiently thick corrosion-resistant barriers to prevent water penetration for up to 10,000 years. Criticality control for DOE spent fuel is primarily provided by neutron absorber material incorporated into the basket holding the individual assemblies. For the immobilized plutonium, the neutron absorber material is incorporated into the waste form itself. The disposal criticality analysis methodology includes the analysis of geochemical and physical processes that can breach the waste package and affect the waste forms within. The basic purpose of the methodology is to guide the criticality control features of the waste package design, and to demonstrate that the final design meets the criticality control licensing requirements. The methodology can also be extended to the analysis of criticality consequences (primarily increased radionuclide inventory), which will support the total performance assessment for the repository.

  19. Soft systems methodology: other voices

    OpenAIRE

    Holwell, Sue

    2000-01-01

    This issue of Systemic Practice and Action Research, celebrating the work of Peter Checkland, in the particular nature and development of soft systems methodology (SSM), would not have happened unless the work was seen by others as being important. No significant contribution to thinking happens without a secondary literature developing. Not surprisingly, many commentaries have accompanied the ongoing development of SSM. Some of these are insightful, some full of errors, and some include both...

  20. Methodological remarks on contraction theory

    DEFF Research Database (Denmark)

    Jouffroy, Jerome; Slotine, Jean-Jacques E.

    Because contraction analysis stems from a differential and incremental framework, the nature and methodology of contraction-based proofs are significantly different from those of their Lyapunov-based counterparts. This paper specifically studies this issue, and illustrates it by revisiting some classical examples traditionally addressed using Lyapunov theory. Even in these cases, contraction tools can often yield significantly simplified analysis. The examples include adaptive control, robotics, and a proof of convergence of the deterministic Extended Kalman Filter.

  1. PERFORMANCE

    Directory of Open Access Journals (Sweden)

    M Cilli

    2014-10-01

    Full Text Available This study aimed to investigate the kinematic and kinetic changes when resistance is applied in horizontal and vertical directions, produced by using different percentages of body weight, during jumping movements in a dynamic warm-up. The group of subjects consisted of 35 voluntary male athletes (19 basketball and 16 volleyball players; age: 23.4 ± 1.4 years, training experience: 9.6 ± 2.7 years; height: 177.2 ± 5.7 cm, body weight: 69.9 ± 6.9 kg) studying Physical Education, who had a jump training background and who were training for 2 hours, 4 days a week. A dynamic warm-up protocol containing seven specific resistance movements, with resistance corresponding to different percentages of body weight (2%, 4%, 6%, 8%, 10%), was applied randomly on nonconsecutive days. Effects of the different warm-up protocols were assessed by pre-/post-exercise changes in jump height in the countermovement jump (CMJ) and the squat jump (SJ), measured using a force platform, and by changes in hip and knee joint angles at the end of the eccentric phase, measured using a video camera. A significant increase in jump height was observed after the dynamic resistance warm-up conducted with different percentages of body weight (p<0.05). In jump movements before and after the warm-up, no significant difference was observed between the vertical ground reaction forces applied by the athletes (p>0.05), while in some resistance conditions a significant reduction was observed in hip and knee joint angles (p<0.05). The dynamic resistance warm-up method was found to cause changes in the kinematics of jumping movements, as well as an increase in jump height values. As a result, dynamic warm-up exercises could be applicable with resistance corresponding to 6-10% of body weight applied in horizontal and vertical directions in order to acutely increase jump performance.

  2. Methodological quality of systematic reviews on influenza vaccination.

    Science.gov (United States)

    Remschmidt, Cornelius; Wichmann, Ole; Harder, Thomas

    2014-03-26

    There is a growing body of evidence on the risks and benefits of influenza vaccination in various target groups. Systematic reviews are of particular importance for policy decisions. However, their methodological quality can vary considerably. To investigate the methodological quality of systematic reviews on influenza vaccination (efficacy, effectiveness, safety) and to identify influencing factors. A systematic literature search on systematic reviews on influenza vaccination was performed, using MEDLINE, EMBASE and three additional databases (1990-2013). Review characteristics were extracted and the methodological quality of the reviews was evaluated using the assessment of multiple systematic reviews (AMSTAR) tool. U-test, Kruskal-Wallis test, chi-square test, and multivariable linear regression analysis were used to assess the influence of review characteristics on AMSTAR score. Forty-six systematic reviews fulfilled the inclusion criteria. Average methodological quality was high (median AMSTAR score: 8), but variability was large (AMSTAR range: 0-11). Quality did not differ significantly according to vaccination target group. Cochrane reviews had higher methodological quality than non-Cochrane reviews (p=0.001). Detailed analysis showed that this was due to better study selection and data extraction, inclusion of unpublished studies, and better reporting of study characteristics (all p<0.05). In the adjusted analysis, no other factor, including industry sponsorship or journal impact factor, had an influence on AMSTAR score. Systematic reviews on influenza vaccination showed large differences regarding their methodological quality. Reviews conducted by the Cochrane collaboration were of higher quality than others. When using systematic reviews to guide the development of vaccination recommendations, the methodological quality of a review in addition to its content should be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
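    The AMSTAR assessment described above amounts to rating each review on an 11-item checklist and summing the "yes" answers. The following sketch (our construction, with made-up example ratings, not data from the study) illustrates how such scores and the Cochrane vs. non-Cochrane comparison of medians could be computed:

    ```python
    # Hedged illustration of AMSTAR scoring: each review gets one 0/1 rating
    # per checklist item; the score is the count of items rated "yes" (0-11).
    # The ratings below are invented for demonstration only.
    from statistics import median

    reviews = [
        # (is_cochrane, ratings for the 11 AMSTAR items)
        (True,  [1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 1]),
        (True,  [1, 1, 1, 1, 0, 1, 1, 1, 1, 1, 1]),
        (False, [1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0]),
        (False, [1, 1, 0, 1, 0, 1, 0, 1, 1, 0, 0]),
    ]

    def amstar_score(items):
        """AMSTAR score = number of the 11 checklist items rated 'yes'."""
        assert len(items) == 11
        return sum(items)

    cochrane = [amstar_score(items) for is_c, items in reviews if is_c]
    others = [amstar_score(items) for is_c, items in reviews if not is_c]
    print(median(cochrane), median(others))  # -> 10 5.5
    ```

    A real analysis would then test the difference between the two groups with a U-test, as the authors did.
    
    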

  3. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly forced by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers. Design Methodology shall be seen as our understanding of how to design; it is an early (emerging in the late 60ies) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...

  4. GPS system simulation methodology

    Science.gov (United States)

    Ewing, Thomas F.

    1993-01-01

    The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.

  5. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility.

  6. Nonlinear Image Denoising Methodologies

    National Research Council Canada - National Science Library

    Yufang, Bao

    2002-01-01

    In this thesis, we propose a theoretical as well as practical framework to combine geometric prior information to a statistical/probabilistic methodology in the investigation of a denoising problem...

  7. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    "Now viewed as its own scientific discipline, clinical trial methodology encompasses the methods required for the protection of participants in a clinical trial and the methods necessary to provide...

  8. PSA methodology including new design, operational and safety factors, 'Level of recognition of phenomena with a presumed dominant influence upon operational safety' (failures of conventional as well as non-conventional passive components, dependent failures, influence of operator, fires and external threats, digital control, organizational factors)

    International Nuclear Information System (INIS)

    Jirsa, P.

    2001-10-01

    The document represents a specific type of discussion of existing methodologies for the creation and application of probabilistic safety assessment (PSA) in light of the EUR document summarizing the requirements placed by Western European NPP operators on the future design of nuclear power plants. A partial goal of this discussion consists in mapping, from the PSA point of view, those selected design, operational and/or safety factors of future NPPs that may be entirely new or at least newly addressed. Therefore, the terms of reference for this stage were formulated as follows: Assess the current level of knowledge and procedures in the analysis of factors and phenomena with a dominant influence upon the operational safety of new generation reactors, especially in the following areas: (1) Phenomenology of failure types and mechanisms and reliability of conventional passive safety system components; (2) Phenomenology of failure types and mechanisms and reliability of non-conventional passive components of newly designed safety systems; (3) Phenomenology of types and mechanisms of dependent failures; (4) The role of the human factor in new generation reactors and its effect upon safety; (5) Fire safety and other external threats to new nuclear installations; (6) Reliability of digital instrumentation and control (I&C) systems and their effect upon safety; and (7) Organizational factors in new nuclear installations. (P.A.)

  9. Hanford Site baseline risk assessment methodology

    International Nuclear Information System (INIS)

    1992-03-01

    This report describes risk assessment methodology associated with the remedial action programs at the Hanford Reservation. Topics addressed include human health evaluation, pollutant and radionuclide transport through the environment, and environmental transport pathways

  10. Methodology of sustainability accounting

    Directory of Open Access Journals (Sweden)

    O.H. Sokil

    2017-03-01

    Full Text Available Modern challenges of the theory and methodology of accounting are addressed through the formation and implementation of new concepts, the purpose of which is to meet the needs of users for standard and unique information. The development of a methodology for sustainability accounting is a key aspect of the management of an economic entity. The purpose of the article is to form the methodological bases of accounting for sustainable development and to determine its goals, objectives, object, subject, methods, functions and key aspects. The author analyzes the theoretical bases of the definition and considers the components of the traditional accounting methodology. A generalized structural diagram of the methodology for accounting for sustainable development is offered in the article. The complex of methods and principles of sustainable development accounting has been systematized for both standard and non-standard provisions. The new system of theoretical and methodological provisions of accounting for sustainable development is justified in the context of determining its purpose, objective, subject, object, methods, functions and key aspects.

  11. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and the software development lifecycle. The ISDF methodology was built especially for innovative software development projects, and was developed empirically by trial and error in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user's, project owner's and project manager's points of view. The main components of a software development methodology are identified: a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation-oriented software development methodology is emphasized by highlighting the shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component, a dedicated indicator is built, and a template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  12. Country report: a methodology

    International Nuclear Information System (INIS)

    Colin, A.

    2013-01-01

    This paper describes a methodology which could be applied to establish a country report. In the framework of nuclear non-proliferation appraisal and IAEA safeguards implementation, it is important to be able to assess the potential existence of undeclared nuclear materials and activities, as well as undeclared facilities, in the country under review. In our view, a country report should aim at providing detailed information on nuclear-related activities for each country examined, taken 'as a whole', such as nuclear development, scientific and technical capabilities, etc. In order to study a specific country, we need to know if there is already an operating civil nuclear programme or not. In the first case, we have to check carefully whether the country could divert nuclear material, misuse declared facilities, or operate undeclared facilities and conduct undeclared activities aimed at manufacturing a nuclear weapon. In the second case, we should pay attention to the development of a civil nuclear project. A country report is based on a wide span of information (most of the time coming from open sources, but sometimes also from confidential or private ones). Therefore, it is important to carefully check the nature and the credibility (reliability) of these sources through cross-check examination. Eventually, it is necessary to merge information from different sources and apply an expertise filter. We have many capable tools at our disposal to help us assess, understand and evaluate the situation (cartography, imagery, bibliometry, etc.). These tools allow us to offer the best conclusions as far as possible. The paper is followed by the slides of the presentation. (author)

  13. Methodology for evaluation of railroad technology research projects

    Science.gov (United States)

    1981-04-01

    This Project memorandum presents a methodology for evaluating railroad research projects. The methodology includes consideration of industry and societal benefits, with special attention given to technical risks, implementation considerations, and po...

  14. Methodology for building confidence measures

    Science.gov (United States)

    Bramson, Aaron L.

    2004-04-01

    This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements for determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
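    Step (i) above, combining the reliability measures of multiple sources, can be sketched as follows. This is our own minimal illustration under an independence assumption, not the paper's actual formulation; the function name and example values are ours:

    ```python
    # Hedged sketch: confidence that an element is true when several
    # independent sources, each with its own reliability, report it.
    # Under independence, the element is false only if every source is wrong.

    def combined_confidence(reliabilities):
        """Return 1 - product of (1 - r_i) over all source reliabilities r_i."""
        p_all_wrong = 1.0
        for r in reliabilities:
            p_all_wrong *= (1.0 - r)
        return 1.0 - p_all_wrong

    # Two sources with reliabilities 0.7 and 0.6 corroborating one element:
    print(round(combined_confidence([0.7, 0.6]), 2))  # -> 0.88
    ```

    Steps (ii) and (iii) would then weight these per-element confidences by their mutual reinforcement and their importance to the overall output.
    
    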

  15. Screening radon risks: A methodology for policymakers

    International Nuclear Information System (INIS)

    Eisinger, D.S.; Simmons, R.A.; Lammering, M.; Sotiros, R.

    1991-01-01

    This paper provides an easy-to-use screening methodology to estimate potential excess lifetime lung cancer risk resulting from indoor radon exposure. The methodology was developed under U.S. EPA Office of Policy, Planning, and Evaluation sponsorship of the agency's Integrated Environmental Management Projects (IEMP) and State/Regional Comparative Risk Projects. These projects help policymakers understand and use scientific data to develop environmental problem-solving strategies. This research presents the risk assessment methodology, discusses its basis, and identifies appropriate applications. The paper also identifies assumptions built into the methodology and qualitatively addresses methodological uncertainties, the direction in which these uncertainties could bias analyses, and their relative importance. The methodology draws from several sources, including risk assessment formulations developed by the U.S. EPA's Office of Radiation Programs, the EPA's Integrated Environmental Management Project (Denver), the International Commission on Radiological Protection, and the National Institute for Occupational Safety and Health. When constructed as a spreadsheet program, the methodology easily facilitates analyses and sensitivity studies (the paper includes several sensitivity study options). The methodology will be most helpful to those who need to make decisions concerning radon testing, public education, and exposure prevention and mitigation programs. 26 references.
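    A screening methodology of this kind typically reduces to a linear calculation: excess lifetime risk is approximately concentration times exposure duration times a unit-risk coefficient. The sketch below is our own illustration of that spreadsheet-style structure; the coefficient value is a placeholder for demonstration, not a number from the paper or from EPA guidance:

    ```python
    # Hedged sketch of a linear no-threshold screening estimate.
    # risk_per_pCiL_year is a HYPOTHETICAL unit-risk coefficient chosen only
    # to make the example run; a real screening would use the coefficients
    # from the sources the paper draws on (EPA ORP, ICRP, NIOSH).

    def excess_lifetime_risk(radon_pCi_per_L, years_exposed,
                             risk_per_pCiL_year=3.5e-4):
        """Screening estimate: concentration x duration x unit risk."""
        return radon_pCi_per_L * years_exposed * risk_per_pCiL_year

    # Spreadsheet-style sensitivity study over indoor concentrations:
    risks = {c: excess_lifetime_risk(c, 70) for c in (2, 4, 10)}
    print(risks)
    ```

    Arranged as a table over concentrations and durations, this is exactly the kind of sensitivity analysis the spreadsheet form of the methodology facilitates.
    
    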

  16. Risk analysis methodologies for the transportation of radioactive materials

    International Nuclear Information System (INIS)

    Geffen, C.A.

    1983-05-01

    Different methodologies have evolved for consideration of each of the many steps required in performing a transportation risk analysis. Although there are techniques that attempt to consider the entire scope of the analysis in depth, most applications of risk assessment to the transportation of nuclear fuel cycle materials develop specific methodologies for only one or two parts of the analysis. The remaining steps are simplified for the analyst by narrowing the scope of the effort (such as evaluating risks for only one material, or a particular set of accident scenarios, or movement over a specific route); performing a qualitative rather than a quantitative analysis (probabilities may be simply ranked as high, medium or low, for instance); or assuming some generic, conservative conditions for potential release fractions and consequences. This paper presents a discussion of the history and present state-of-the-art of transportation risk analysis methodologies. Many reports in this area were reviewed as background for this presentation. The literature review, while not exhaustive, did result in a complete representation of the major methods used today in transportation risk analysis. These methodologies primarily include the use of severity categories based on historical accident data, the analysis of specifically assumed accident sequences for the transportation activity of interest, and the use of fault or event tree analysis. Although the focus of this work has generally been on potential impacts to public groups, some effort has been expended in the estimation of risks to occupational groups in transportation activities
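    The severity-category approach mentioned above can be made concrete with a small sketch. This is our own construction under invented category fractions and consequence values, not figures from the report: route risk is the sum over severity categories of accident frequency, the fraction of accidents falling in the category, the release fraction, and the consequence per unit release.

    ```python
    # Hedged sketch of transportation risk via historical severity categories.
    # All numbers below are illustrative placeholders, not data from the paper.

    severity_categories = [
        # (name, fraction of accidents, release fraction, consequence per unit release)
        ("minor",  0.90, 0.00, 0.0),
        ("medium", 0.09, 0.01, 5.0),
        ("severe", 0.01, 0.10, 50.0),
    ]

    def route_risk(accidents_per_km, km):
        """Expected consequence for one shipment over a route of given length."""
        total = 0.0
        for name, fraction, release, consequence in severity_categories:
            total += accidents_per_km * km * fraction * release * consequence
        return total

    print(route_risk(accidents_per_km=1e-6, km=1000))
    ```

    Fault- or event-tree analysis, the other family of methods the paper reviews, would replace the historical category fractions with probabilities derived from modeled accident sequences.
    
    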

  17. Microbiological Methodology in Astrobiology

    Science.gov (United States)

    Abyzov, S. S.; Gerasimenko, L. M.; Hoover, R. B.; Mitskevich, I. N.; Mulyukin, A. L.; Poglazova, M. N.; Rozanov, A. Y.

    2005-01-01

    Searching for life in astromaterials to be delivered by future missions to extraterrestrial bodies is undoubtedly related to studies of the properties and signatures of living microbial cells and microfossils on Earth. The Antarctic glacier and Earth permafrost habitats, where living microbial cells have preserved viability for millennia by entering the anabiotic state, are often regarded as model terrestrial analogs of Martian polar subsurface layers. For future findings of viable microorganisms in samples from extraterrestrial objects, it is important to use a combined methodology that includes classical microbiological methods, plating onto nutrient media, direct epifluorescence and electron microscopy examinations, detection of the elemental composition of cells, radiolabeling techniques, and PCR and FISH methods. It is of great importance to ensure the authenticity of microorganisms (if any) in studied samples and to standardize the protocols used to minimize the risk of external contamination. Although convincing evidence of extraterrestrial microbial life may come from the discovery of living cells in astromaterials, biomorphs and microfossils must also be regarded as a target in the search for evidence of life, bearing in mind a scenario in which living microorganisms have not been preserved and underwent mineralization. Under laboratory conditions, the processes that accompanied the fossilization of cyanobacteria were reconstructed, and artificially produced cyanobacterial stromatolites resemble by their morphological properties those found in natural Earth habitats. Regarding the vital importance of distinguishing between biogenic and abiogenic signatures and between living and fossil microorganisms in analyzed samples, it is worthwhile to use some previously developed approaches based on electron microscopy examinations and analysis of the elemental composition of biomorphs in situ and comparison with the analogous data obtained for laboratory microbial cultures and

  18. Application and licensing requirements of the Framatome ANP RLBLOCA methodology

    International Nuclear Information System (INIS)

    Martin, R.P.; Dunn, B.M.

    2004-01-01

    The Framatome ANP Realistic Large-Break LOCA methodology (FANP RLBLOCA) is an analysis approach approved by the US NRC for supporting the licensing basis of 3- and 4-loop Westinghouse PWRs and CE 2x4 PWRs. It was developed consistent with the NRC's Code Scaling, Applicability, and Uncertainty (CSAU) methodology for performing best-estimate large-break LOCA analyses. The CSAU methodology consists of three key elements, with the second and third elements addressing uncertainty identification and application. Unique to the CSAU methodology is the use of engineering judgment and the Phenomena Identification and Ranking Table (PIRT), defined in the first element, to lay the groundwork for achieving the ultimate goal of quantifying the total uncertainty in predicted measures of interest associated with the large-break LOCA. It is the PIRT that not only directs the methodology development, but also directs the methodology review. While the FANP RLBLOCA methodology was generically approved, a plant-specific application is customized in two ways, addressing how the unique plant characterization 1) is translated to code input and 2) relates to the unique methodology licensing requirements. Related to the former, plants are required by 10 CFR 50.36 to define a technical specification limiting condition for operation based on the following criteria: 1. Installed instrumentation that is used in the control room to detect, and indicate, a significant abnormal degradation of the reactor coolant pressure boundary. 2. A process variable, design feature, or operating restriction that is an initial condition of a design basis accident or transient analysis that either assumes the failure of or presents a challenge to the integrity of a fission product barrier. 3. A structure, system, or component that is part of the primary success path and which functions or actuates to mitigate a design basis accident or transient that either assumes the failure of or presents a challenge to the integrity of a

  19. Multicriteria methodology for decision aiding

    CERN Document Server

    Roy, Bernard

    1996-01-01

    This is the first comprehensive book to present, in English, the multicriteria methodology for decision aiding. In the foreword, the distinctive features and main ideas of the European School of MCDA are outlined. The twelve chapters are essentially expository in nature, but scholarly in treatment. Some questions which are too often neglected in the literature on decision theory, such as how is a decision made, who are the actors, what is a decision aiding model, and how to define the set of alternatives, are discussed. Examples are used throughout the book to illustrate the various concepts. Ways to model the consequences of each alternative and to build criteria taking into account the inevitable imprecisions, uncertainties and indeterminations are described and illustrated. The three classical operational approaches of MCDA: synthesis in one criterion (including MAUT), synthesis by outranking relations, and interactive local judgements, are studied. This methodology tries to be a theoretical or intellectual framework dire...

  20. Design methodology of Dutch banknotes

    Science.gov (United States)

    de Heij, Hans A. M.

    2000-04-01

    Since the introduction of a design methodology for Dutch banknotes, the quality of Dutch paper currency has improved in more than one way. The methodology in question provides for (i) a design policy, which helps fix clear objectives; (ii) design management, to ensure smooth cooperation between the graphic designer, printer, papermaker and central bank; and (iii) a program of requirements, a banknote development guideline for all parties involved. This systematic approach enables an objective selection of design proposals, including security features. Furthermore, the project manager obtains regular feedback from the public by conducting market surveys. Each new design of a Netherlands Guilder banknote issued by the Nederlandsche Bank over the past 50 years has been an improvement on its predecessor in terms of value recognition, security and durability.

  1. Methodology for seismic PSA of NPPs

    International Nuclear Information System (INIS)

    Jirsa, P.

    1999-09-01

    A general methodology is outlined for seismic PSA (probabilistic safety assessment). The main objectives of seismic PSA include: description of the course of an event; understanding the most probable failure sequences; gaining insight into the overall probability of reactor core damage; identification of the main seismic risk contributors; identification of the range of peak ground accelerations contributing significantly to the plant risk; and comparison of the seismic risk with risks from other events. The results of seismic PSA are typically compared with those of internal PSA and of PSA of other external events. If the results of internal and external PSA are available, sensitivity studies and cost benefit analyses are performed prior to any decision regarding corrective actions. If the seismic PSA involves analysis of the containment, useful information can be gained regarding potential seismic damage of the containment. (P.A.)

  2. Methodology of dose calculation for the SRS SAR

    International Nuclear Information System (INIS)

    Price, J.B.

    1991-07-01

    The Savannah River Site (SRS) Safety Analysis Report (SAR) covering K reactor operation assesses a spectrum of design basis accidents. The assessment includes estimation of the dose consequences from the analyzed accidents. This report discusses the methodology used to perform the dose analysis reported in the SAR and also includes the quantified doses. Doses resulting from postulated design basis reactor accidents in Chapter 15 of the SAR are discussed, as well as an accident in which three percent of the fuel melts. Doses are reported for both atmospheric and aqueous releases. The methodology used to calculate doses from these accidents as reported in the SAR is consistent with NRC guidelines and industry standards. The doses from the design basis accidents for the SRS reactors are below the limits set for commercial reactors by the NRC and also meet industry criteria. A summary of doses for various postulated accidents is provided

  3. USGS Methodology for Assessing Continuous Petroleum Resources

    Science.gov (United States)

    Charpentier, Ronald R.; Cook, Troy A.

    2011-01-01

    The U.S. Geological Survey (USGS) has developed a new quantitative methodology for assessing resources in continuous (unconventional) petroleum deposits. Continuous petroleum resources include shale gas, coalbed gas, and other oil and gas deposits in low-permeability ("tight") reservoirs. The methodology is based on an approach combining geologic understanding with well productivities. The methodology is probabilistic, with both input and output variables as probability distributions, and uses Monte Carlo simulation to calculate the estimates. The new methodology is an improvement of previous USGS methodologies in that it better accommodates the uncertainties in undrilled or minimally drilled deposits that must be assessed using analogs. The publication is a collection of PowerPoint slides with accompanying comments.
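    The abstract above notes that inputs and outputs are probability distributions combined by Monte Carlo simulation. The shape of such a calculation can be sketched as follows; note that the distributions, parameter values, and variable names here are purely hypothetical illustrations, not the USGS's calibrated inputs:

```python
import random
import statistics

def simulate_continuous_resource(n_trials=100_000, seed=42):
    """Toy Monte Carlo estimate of a continuous (unconventional) gas resource.

    Resource per trial = (number of productive untested cells)
                         x (estimated ultimate recovery per cell).
    All input distributions below are hypothetical.
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        n_cells = rng.randint(500, 1500)          # untested cells in the play
        success_ratio = rng.uniform(0.6, 0.9)     # fraction that are productive
        eur_per_cell = rng.lognormvariate(0.0, 0.5)  # EUR per cell, median ~1 BCF
        totals.append(n_cells * success_ratio * eur_per_cell)
    totals.sort()
    # Report the fractiles typically quoted in resource assessments.
    return {
        "F95": totals[int(0.05 * n_trials)],
        "F50": totals[int(0.50 * n_trials)],
        "F5": totals[int(0.95 * n_trials)],
        "mean": statistics.fmean(totals),
    }

print(simulate_continuous_resource())
```

    Because each trial draws every uncertain input afresh, the output is itself a distribution, from which low (F95), median (F50), and high (F5) estimates are read off.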

  4. Methodologies for certification of transuranic waste packages

    International Nuclear Information System (INIS)

    Christensen, R.N.; Kok, K.D.

    1980-10-01

    The objective of this study was to postulate methodologies for certification that a waste package is acceptable for disposal in a licensed geologic repository. Within the context of this report, certification means the overall process which verifies that a waste package meets the criteria or specifications established for acceptance for disposal in a repository. The overall methodology for certification will include (1) certifying authorities, (2) tests and procedures, and (3) documentation and quality assurance programs. Each criterion will require a methodology that is specific to that criterion. In some cases, different waste forms will require different methodologies. The purpose of postulating certification methodologies is to provide additional information as to what changes, if any, are needed for the TRU waste in storage

  5. National Certification Methodology for the Nuclear Weapons Stockpile

    International Nuclear Information System (INIS)

    Goodwin, B T; Juzaitis, R J

    2006-01-01

    Lawrence Livermore and Los Alamos National Laboratories have developed a common framework and key elements of a national certification methodology called Quantification of Margins and Uncertainties (QMU). A spectrum of staff, from senior managers to weapons designers, has been engaged in this activity at the two laboratories for on the order of a year to codify this methodology in an overarching and integrated paper; the certification paper that has evolved follows. In the process of writing this paper, an important outcome has been the realization that a joint Livermore/Los Alamos workshop on QMU, focusing on clearly identifying and quantifying the differences in approach between the two labs and on developing an even stronger technical foundation for the methodology, will be valuable. Later in FY03, such a joint laboratory workshop will be held. One of the outcomes of this workshop will be a new version of this certification paper. A comprehensive approach to certification must include specification of problem scope, development of system baseline models, formulation of standards of performance assessment, and effective procedures for peer review and documentation. This document concentrates on the assessment and peer review aspects of the problem. In addressing these points, a central role is played by a 'watch list' for weapons derived from credible failure modes and performance gate analyses. The watch list must reflect our best assessment of factors that are critical to weapons performance. High-fidelity experiments and calculations, as well as full exploitation of archival test data, are essential to this process. Peer review, advisory groups and red teams play an important role in confirming the validity of the watch list. The framework for certification developed by the Laboratories has many basic features in common, but some significant differences in the detailed technical implementation of the overall methodology remain. Joint certification workshops held in June

  6. Air pollution monitoring - a methodological approach

    International Nuclear Information System (INIS)

    Trajkovska Trpevska, Magdalena

    2002-01-01

    Methodology for monitoring the emission of pollutants into the air is a complex concept that in general embraces the following phases: sampling, laboratory treatment, and interpretation of results. In the Company for Technological and Laboratory Investigation and Environmental Protection - Mining Institute Skopje, the control of pollutant emissions into the air is performed according to a methodology based, in general, on the recommendations of standard VDI 2.066 prescribed by the Ministry of Ecology in Germany, because adequate legislation does not exist in our country. In this article the basic elements of the methodology for controlling air pollutant emissions are presented. (Original)

  7. Comparative analysis of proliferation resistance assessment methodologies

    International Nuclear Information System (INIS)

    Takaki, Naoyuki; Kikuchi, Masahiro; Inoue, Naoko; Osabe, Takeshi

    2005-01-01

    A comparative analysis of the methodologies was performed based on the discussions at the international workshop on 'Assessment Methodology of Proliferation Resistance for Future Nuclear Energy Systems' held in Tokyo in March 2005. Through the workshop and succeeding considerations, it became clear that proliferation resistance assessment methodologies are affected by the broader nuclear options being pursued and also by the political situation of the state. Even the definition of proliferation resistance, despite the commonality of fundamental issues, derives from the perceived threat and the implementation circumstances inherent in the larger programs. A deeper recognition of the 'differences' among communities would help us advance the discussion in an essential and harmonized way. (author)

  8. The policy trail methodology

    DEFF Research Database (Denmark)

    Holford, John; Larson, Anne; Melo, Susana

    In recent years, the "policy trail" has been proposed as a methodology appropriate to the shifting and fluid governance of lifelong learning in the late modern world (Holford et al. 2013, Cort 2014). The contemporary environment is marked by multi-level governance (global... ...of ‘policy trail’, arguing that it can overcome ‘methodological nationalism’ and link structure and agency in research on the ‘European educational space’. The ‘trail’ metaphor, she suggests, captures the intentionality and the erratic character of policy. The trail connects sites and brings about change, but – although policy may be intended to be linear, with specific outcomes – policy often has to bend, and sometimes meets insurmountable obstacles. This symposium outlines and develops the methodology, but also reports on research undertaken within a major FP7 project (LLLIght’in’Europe, 2012-15) which made use...

  9. Comparative proteomic assessment of matrisome enrichment methodologies

    Science.gov (United States)

    Krasny, Lukas; Paul, Angela; Wai, Patty; Howard, Beatrice A.; Natrajan, Rachael C.; Huang, Paul H.

    2016-01-01

    The matrisome is a complex and heterogeneous collection of extracellular matrix (ECM) and ECM-associated proteins that play important roles in tissue development and homeostasis. While several strategies for matrisome enrichment have been developed, it is currently unknown how the performance of these different methodologies compares in the proteomic identification of matrisome components across multiple tissue types. In the present study, we perform a comparative proteomic assessment of two widely used decellularisation protocols and two extraction methods to characterise the matrisome in four murine organs (heart, mammary gland, lung and liver). We undertook a systematic evaluation of the performance of the individual methods on protein yield, matrisome enrichment capability and the ability to isolate core matrisome and matrisome-associated components. Our data find that sodium dodecyl sulphate (SDS) decellularisation leads to the highest matrisome enrichment efficiency, while the extraction protocol that comprises chemical and trypsin digestion of the ECM fraction consistently identifies the highest number of matrisomal proteins across all types of tissue examined. Matrisome enrichment had a clear benefit over non-enriched tissue for the comprehensive identification of matrisomal components in murine liver and heart. Strikingly, we find that all four matrisome enrichment methods led to significant losses in the soluble matrisome-associated proteins across all organs. Our findings highlight the multiple factors (including tissue type, matrisome class of interest and desired enrichment purity) that influence the choice of enrichment methodology, and we anticipate that these data will serve as a useful guide for the design of future proteomic studies of the matrisome. PMID:27589945

  10. Changing methodologies in TESOL

    CERN Document Server

    Spiro, Jane

    2013-01-01

    Covering core topics from vocabulary and grammar to teaching writing, speaking and listening, this textbook shows you how to link research to practice in TESOL methodology. It emphasises how current understandings have impacted on the language classroom worldwide and investigates the meaning of 'methods' and 'methodology' and the importance of these for the teacher, as well as the underlying assumptions and beliefs teachers bring to bear in their practice. By introducing you to language teaching approaches, you will explore the way these are influenced by developments in our understanding of l...

  11. Creativity in phenomenological methodology

    DEFF Research Database (Denmark)

    Dreyer, Pia; Martinsen, Bente; Norlyk, Annelise

    2014-01-01

    Nursing research is often concerned with lived experiences in human life, using phenomenological and hermeneutic approaches. These empirical studies may use different creative expressions and art-forms to describe and enhance an embodied and personalised understanding of lived experiences. Drawing on the methodologies of van Manen, Dahlberg, Lindseth & Norberg, the aim of this paper is to argue that the increased focus on creativity and arts in research methodology is valuable for gaining a deeper insight into lived experiences. We illustrate this point through examples from empirical nursing studies, and discuss how this... may support a respectful renewal of phenomenological research traditions in nursing research.

  12. Implementation impacts of PRL methodology

    International Nuclear Information System (INIS)

    Caudill, J.A.; Krupa, J.F.; Meadors, R.E.; Odum, J.V.; Rodrigues, G.C.

    1993-02-01

    This report responds to a DOE-SR request to evaluate the impacts from implementation of the proposed Plutonium Recovery Limit (PRL) methodology. The PRL Methodology is based on cost minimization for decisions to discard or recover plutonium contained in scrap, residues, and other plutonium bearing materials. Implementation of the PRL methodology may result in decisions to declare as waste certain plutonium bearing materials originally considered to be a recoverable plutonium product. Such decisions may have regulatory impacts, because any material declared to be waste would immediately be subject to provisions of the Resource Conservation and Recovery Act (RCRA). The decision to discard these materials will have impacts on waste storage, treatment, and disposal facilities. Current plans for the de-inventory of plutonium processing facilities have identified certain materials as candidates for discard based upon the economic considerations associated with extending the operating schedules for recovery of the contained plutonium versus potential waste disposal costs. This report evaluates the impacts of discarding those materials as proposed by the F Area De-Inventory Plan and compares the De-Inventory Plan assessments with conclusions from application of the PRL. The impact analysis was performed for those materials proposed as potential candidates for discard by the De-Inventory Plan. The De-Inventory Plan identified 433 items, containing approximately 1% of the current SRS Pu-239 inventory, as not appropriate for recovery as the site moves to complete the mission of F-Canyon and FB-Line. The materials were entered into storage awaiting recovery as product under the Department's previous Economic Discard Limit (EDL) methodology which valued plutonium at its incremental cost of production in reactors. 
An application of Departmental PRLs to the subject 433 items revealed that approximately 40% of them would continue to be potentially recoverable as product plutonium

  13. Business intelligence and performance management theory, systems and industrial applications

    CERN Document Server

    2013-01-01

    This book covers all the basic concepts of business intelligence and performance management including strategic support, business applications, methodologies and technologies from the field, and thoroughly explores the benefits, issues and challenges of each.

  14. Nuclear EMP: ingredients of an EMP protection engineering methodology

    International Nuclear Information System (INIS)

    Latorre, V.R.; Spogen, L.R. Jr.

    1977-02-01

    A fundamental methodology of electromagnetic pulse (EMP) protection engineering is described. Operations performed within the framework of this methodology are discussed. These operations, along with problem constraints and data, constitute the essential ingredients needed to implement the overall engineering methodology. Basic definitions and descriptions of these essential ingredients are provided. The issues discussed represent the first step in developing a methodology for protecting systems against EMP effects

  15. A design methodology for unattended monitoring systems

    International Nuclear Information System (INIS)

    SMITH, JAMES D.; DELAND, SHARON M.

    2000-01-01

    The authors presented a high-level methodology for the design of unattended monitoring systems, focusing on a system to detect diversion of nuclear materials from a storage facility. The methodology is composed of seven, interrelated analyses: Facility Analysis, Vulnerability Analysis, Threat Assessment, Scenario Assessment, Design Analysis, Conceptual Design, and Performance Assessment. The design of the monitoring system is iteratively improved until it meets a set of pre-established performance criteria. The methodology presented here is based on other, well-established system analysis methodologies and hence they believe it can be adapted to other verification or compliance applications. In order to make this approach more generic, however, there needs to be more work on techniques for establishing evaluation criteria and associated performance metrics. They found that defining general-purpose evaluation criteria for verifying compliance with international agreements was a significant undertaking in itself. They finally focused on diversion of nuclear material in order to simplify the problem so that they could work out an overall approach for the design methodology. However, general guidelines for the development of evaluation criteria are critical for a general-purpose methodology. A poor choice in evaluation criteria could result in a monitoring system design that solves the wrong problem

  16. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, and others. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, run over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer-assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  17. Probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Shinaishin, M.A.

    1988-06-01

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analyses. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the system designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using nation-specific data. (author)

  18. Probabilistic risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shinaishin, M A

    1988-06-15

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analyses. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the system designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using nation-specific data. (author)

  19. Computer Network Operations Methodology

    Science.gov (United States)

    2004-03-01

    ...means of their computer information systems. Disrupt: this type of attack focuses on disrupting, as "attackers might surreptitiously reprogram enemy... by reprogramming the computers that control distribution within the power grid." A disruption attack introduces disorder and inhibits the effective... between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that...

  20. SCI Hazard Report Methodology

    Science.gov (United States)

    Mitchell, Michael S.

    2010-01-01

    This slide presentation reviews the methodology in creating a Source Control Item (SCI) Hazard Report (HR). The SCI HR provides a system safety risk assessment for the following Ares I Upper Stage Production Contract (USPC) components (1) Pyro Separation Systems (2) Main Propulsion System (3) Reaction and Roll Control Systems (4) Thrust Vector Control System and (5) Ullage Settling Motor System components.

  1. A Functional HAZOP Methodology

    DEFF Research Database (Denmark)

    Liin, Netta; Lind, Morten; Jensen, Niels

    2010-01-01

    A HAZOP methodology is presented where a functional plant model assists in a goal oriented decomposition of the plant purpose into the means of achieving the purpose. This approach leads to nodes with simple functions from which the selection of process and deviation variables follow directly...

  2. Complicating Methodological Transparency

    Science.gov (United States)

    Bridges-Rhoads, Sarah; Van Cleave, Jessica; Hughes, Hilary E.

    2016-01-01

    A historical indicator of the quality, validity, and rigor of qualitative research has been the documentation and disclosure of the behind-the-scenes work of the researcher. In this paper, we use what we call "methodological data" as a tool to complicate the possibility and desirability of such transparency. Specifically, we draw on our…

  3. Methodological Advances in Dea

    NARCIS (Netherlands)

    L. Cherchye (Laurens); G.T. Post (Thierry)

    2001-01-01

    textabstractWe survey the methodological advances in DEA over the last 25 years and discuss the necessary conditions for a sound empirical application. We hope this survey will contribute to the further dissemination of DEA, the knowledge of its relative strengths and weaknesses, and the tools
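    In the simplest DEA setting the survey above covers, one input and one output under constant returns to scale (the CCR model), the linear program collapses to each unit's output/input ratio measured against the best-practice unit. A minimal sketch with invented figures:

```python
def dea_efficiency_single(inputs, outputs):
    """CCR (constant-returns-to-scale) DEA efficiencies for the special case
    of one input and one output, where the LP reduces to a ratio against
    the best-practice frontier."""
    ratios = [y / x for x, y in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Three hypothetical decision-making units:
# input = labor hours, output = units produced.
eff = dea_efficiency_single(inputs=[2, 4, 3], outputs=[4, 4, 6])
print(eff)  # [1.0, 0.5, 1.0] -- units 1 and 3 define the frontier
```

    With multiple inputs and outputs the same idea requires solving one linear program per unit; this scalar case is only meant to convey the frontier-relative logic.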

  4. MIRD methodology. Part 1

    International Nuclear Information System (INIS)

    Rojo, Ana M.

    2004-01-01

    This lecture develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this first part, the basic concepts and the main equations are presented. The ICRP Dosimetric System is also explained. (author)
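    The basic equation of the MIRD schema referred to above can be stated compactly: the mean absorbed dose to a target region is the sum, over all source regions, of the cumulated (time-integrated) activity in each source multiplied by the corresponding S value:

```latex
% Basic MIRD equation: mean absorbed dose to target region r_T
D(r_T) = \sum_{r_S} \tilde{A}(r_S)\, S(r_T \leftarrow r_S)
% where \tilde{A}(r_S) = \int_0^{\infty} A(r_S, t)\, dt is the cumulated activity
% in source region r_S, and S(r_T \leftarrow r_S) is the mean dose to r_T per
% unit cumulated activity in r_S.
```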

  5. Response Surface Methodology

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.

    2014-01-01

    Abstract: This chapter first summarizes Response Surface Methodology (RSM), which started with Box and Wilson’s article in 1951 on RSM for real, non-simulated systems. RSM is a stepwise heuristic that uses first-order polynomials to approximate the response surface locally. An estimated polynomial
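    The first-order local approximation described above can be illustrated with a two-factor 2^2 factorial design: because the coded design is orthogonal, the least-squares coefficients are simple contrasts, and the steepest-ascent direction is proportional to the coefficient vector. A minimal sketch with invented response values:

```python
def fit_first_order(design, y):
    """Fit y ~ b0 + b1*x1 + b2*x2 on an orthogonal 2^2 factorial design
    (coded levels +/-1), where least squares reduces to simple averages
    of signed responses."""
    n = len(y)
    b0 = sum(y) / n
    b1 = sum(x1 * yi for (x1, _), yi in zip(design, y)) / n
    b2 = sum(x2 * yi for (_, x2), yi in zip(design, y)) / n
    return b0, b1, b2

design = [(-1, -1), (1, -1), (-1, 1), (1, 1)]
y = [2.0, 6.0, 0.0, 4.0]  # hypothetical responses at the four corner points
b0, b1, b2 = fit_first_order(design, y)
# The local steepest-ascent direction is along (b1, b2).
print(b0, b1, b2)  # 3.0 2.0 -1.0
```

    In RSM this fit would be repeated at successive design centers, moving along (b1, b2) until the first-order model stops improving the response.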

  6. MIRD methodology. Part 2

    International Nuclear Information System (INIS)

    Gomez Parada, Ines

    2004-01-01

    This paper develops the MIRD (Medical Internal Radiation Dose) methodology for the evaluation of the internal dose due to the administration of radiopharmaceuticals. In this second part, different methods for the calculation of the accumulated activity are presented, together with the effective half life definition. Different forms of Retention Activity curves are also shown. (author)
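    For the simplest retention model mentioned above, a mono-exponential curve, the cumulated activity has a closed form, and the effective half-life combines the physical and biological half-lives in parallel. A small sketch (the numerical values are illustrative only):

```python
import math

def effective_half_life(t_phys, t_biol):
    """1/T_eff = 1/T_phys + 1/T_biol: physical decay and biological
    clearance act in parallel."""
    return t_phys * t_biol / (t_phys + t_biol)

def cumulated_activity_monoexp(a0, t_eff):
    """Cumulated activity for A(t) = A0 * exp(-ln2 * t / T_eff):
    the integral from 0 to infinity is A0 * T_eff / ln 2 (~1.443 * A0 * T_eff)."""
    return a0 * t_eff / math.log(2)

t_eff = effective_half_life(t_phys=6.0, t_biol=12.0)       # hours
a_cum = cumulated_activity_monoexp(a0=100.0, t_eff=t_eff)  # MBq -> MBq*h
print(t_eff)  # 4.0
```

    Multi-compartment retention curves are handled the same way, as sums of exponentials, with one cumulated-activity term per compartment.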

  7. Covariance Evaluation Methodology for Neutron Cross Sections

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M.; Arcilla, R.; Mattoon, C.M.; Mughabghab, S.F.; Oblozinsky, P.; Pigni, M.; Pritychenko, B.; Sonzogni, A.A.

    2008-09-01

    We present the NNDC-BNL methodology for estimating neutron cross section covariances in thermal, resolved resonance, unresolved resonance and fast neutron regions. The three key elements of the methodology are Atlas of Neutron Resonances, nuclear reaction code EMPIRE, and the Bayesian code implementing Kalman filter concept. The covariance data processing, visualization and distribution capabilities are integral components of the NNDC methodology. We illustrate its application on examples including relatively detailed evaluation of covariances for two individual nuclei and massive production of simple covariance estimates for 307 materials. Certain peculiarities regarding evaluation of covariances for resolved resonances and the consistency between resonance parameter uncertainties and thermal cross section uncertainties are also discussed.
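    The Kalman filter concept mentioned above can be illustrated in its simplest scalar form: a prior cross-section estimate with variance P is updated with one measurement of variance R, shrinking the posterior variance. This is a generic sketch of the filter update with invented numbers, not the methodology's actual Kalman code:

```python
def kalman_update(x_prior, p_prior, y_meas, r_meas):
    """Scalar Kalman (Bayesian least-squares) measurement update.

    x_prior: prior estimate (e.g. a model-calculated cross section)
    p_prior: prior variance
    y_meas:  measured value
    r_meas:  measurement variance
    """
    gain = p_prior / (p_prior + r_meas)           # Kalman gain
    x_post = x_prior + gain * (y_meas - x_prior)  # pulled toward the data
    p_post = (1.0 - gain) * p_prior               # variance always shrinks
    return x_post, p_post

# Prior estimate 10 (variance 4) updated with a measurement of 15 (variance 1):
x, p = kalman_update(10.0, 4.0, 15.0, 1.0)
print(x, p)  # 14.0 0.8
```

    The full evaluation works with vectors of parameters and covariance matrices, but the structure of the update, and the reduction of uncertainty with each measurement, is the same.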

  8. The Speaker Respoken: Material Rhetoric as Feminist Methodology.

    Science.gov (United States)

    Collins, Vicki Tolar

    1999-01-01

    Presents a methodology based on the concept of "material rhetoric" that can help scholars avoid problems as they reclaim women's historical texts. Defines material rhetoric and positions it theoretically in relation to other methodologies, including bibliographical studies, reception theory, and established feminist methodologies. Illustrates…

  9. Review of methodologies and policies for evaluation of energy efficiency in high energy-consuming industry

    International Nuclear Information System (INIS)

    Li, Ming-Jia; Tao, Wen-Quan

    2017-01-01

    Highlights: • The classification of industrial energy efficiency indices is summarized. • The factors influencing energy efficiency and their implementation in industry are discussed. • Four main methodologies for evaluating energy efficiency in industry are described. • The use of these methodologies in energy efficiency evaluations is illustrated. • Related policies and suggestions based on energy efficiency evaluations are provided. - Abstract: The energy efficiency of high energy-consuming industries plays a significant role in the social sustainability, economic performance and environmental protection of any nation. In order to evaluate energy efficiency and guide sustainable development, various methodologies have been proposed over the past decades for energy demand management and for measuring energy efficiency performance accurately. A systematic review of these methodologies is conducted in the present paper. First, the classification of industrial energy efficiency indices is summarized to track previous application studies. Single measurement indicators and composite index benchmarking are widely recognized as modeling tools for power industries and for policy-making worldwide; they are the pivotal figures conveying fundamental information about energy systems for improving performance in fields such as economy, environment and technology. Second, the six factors that influence energy efficiency in industry are discussed. Third, four major evaluation methodologies are explained in detail: stochastic frontier analysis, data envelopment analysis, exergy analysis and benchmarking comparison. The basic models and developments of these methodologies are introduced, their recent use in energy efficiency evaluations is illustrated, and some of their drawbacks are discussed. Other related methods or influential indicators

  10. The methodological quality of economic evaluation studies in obstetrics and gynecology: a systematic review

    NARCIS (Netherlands)

    Vijgen, Sylvia M. C.; Opmeer, Brent C.; Mol, Ben Willem J.

    2013-01-01

    We evaluated the methodological quality of economic evaluation studies in the field of obstetrics and gynecology published in the last decade. A MEDLINE search was performed to find economic evaluation studies in obstetrics and gynecology from the years 1997 through 2009. We included full economic

  11. BEEHIVE: Sustainable Methodology for Fashion Design

    OpenAIRE

    Morais, C.; Carvalho, C.; Broega, A. C.

    2014-01-01

    The proposed methodology aims to close the “product fashion cycle”, advocating a good waste management policy so that clothing that is thrown away can be reused or recycled and come back again as material to produce yarn, fabric or knit. Subsequently, these materials should be included in the production of sustainable apparel, whose design methodologies should be concerned with providing more durable garments that can be transformed according to the occasion and the user....

  12. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  13. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W. (US Nuclear Regulatory Commission, Washington, DC 20555)

    1985-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 regulation to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  14. Mais de mil e uma noites de experiência etnográfica: uma construção metodológica para pesquisadores-performers da religião More than a thousand nights of ethnographic experience: a methodological construction for researcher-performers of religion

    Directory of Open Access Journals (Sweden)

    Francirosy Campos Barbosa Ferreira

    2009-11-01

    This essay presents an ethnographic experience which took place in two Muslim communities, in São Paulo and São Bernardo do Campo, Brazil, as part of my doctoral research. The research establishes a fruitful dialogue with the anthropology of performance and of (ritual) experience proposed by Victor Turner, Richard Schechner and Ronald Grimes, and is also permeated by Favret-Saada's theory of "being affected". An attempt is thus made to develop a methodology that embraces both the anthropology of performance and the anthropology of religion. The objectives are threefold: (a) to discuss the place of the researcher-performer, as a woman, mother and anthropologist, and how these elements contribute to a performative ethnography; (b) to present the ethnographic course; and (c) to highlight the experience which took place in an Islamic camp.

  15. Performance tests of snow-related variables over the Tibetan Plateau and Himalayas using a new version of NASA GEOS-5 land surface model that includes the snow darkening effect

    Science.gov (United States)

    Yasunari, T. J.; Lau, W. K.; Koster, R. D.; Suarez, M.; Mahanama, S. P.; da Silva, A.; Colarco, P. R.

    2011-12-01

    The snow darkening effect, i.e. the reduction of snow albedo, is caused by absorption of solar radiation by absorbing aerosols (dust, black carbon, and organic carbon) deposited on the snow surface. This process is probably important over Himalayan and Tibetan glaciers due to the transport of the highly polluted Atmospheric Brown Cloud (ABC) from the Indo-Gangetic Plain (IGP). This effect has been incorporated into the NASA Goddard Earth Observing System model, version 5 (GEOS-5) atmospheric transport model. The Catchment land surface model (LSM) used in GEOS-5 considers 3 snow layers. Code was developed to track the mass concentration of aerosols in the three layers, taking into account such processes as the flushing of the compounds as liquid water percolates through the snowpack. In GEOS-5, aerosol emissions, transports, and depositions are well simulated in the Goddard Chemistry Aerosol Radiation and Transport (GOCART) module; we recently made the connection between GOCART and the GEOS-5 system fitted with the revised LSM. Preliminary simulations were performed with this new system in "replay" mode (i.e., with atmospheric dynamics guided by reanalysis) at 2x2.5 degree horizontal resolution, covering the period 1 November 2005 - 31 December 2009; we consider the final three years of simulation here. The three simulations used the following variants of the LSM: (1) the original Catchment LSM with a fixed fresh snowfall density of 150 kg m-3; (2) the LSM fitted with the new snow albedo code, used here without aerosol deposition but with changes in the density formulation and the melting water effect on snow specific surface area; (3) the LSM fitted with the new snow albedo code, the same as in (2) but with fixed aerosol deposition rates (computed from GOCART values averaged over the Tibetan Plateau domain [lon.: 60-120E; lat.: 20-50N] during March-May 2008) applied to all grid points at every time step. For (2) and (3), the same fresh snowfall density setting as in (1) was used.
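
    The layer-by-layer aerosol bookkeeping described in this record can be sketched as a toy mass balance: deposit aerosol on the top snow layer, then flush a fraction downward with percolating meltwater. The `scavenging` ratio, the uniform flushing rule, and the timestep structure below are simplifying assumptions for illustration only; they do not reproduce the GEOS-5 Catchment LSM formulation.

```python
# Toy sketch: tracking an absorbing-aerosol mass (kg) in a 3-layer snowpack,
# with a fraction of each layer's load flushed downward by percolating meltwater.
def step(layers_kg, deposition_kg, melt_fraction, scavenging=0.2):
    """Advance one timestep: deposit on the top layer, then flush with meltwater."""
    layers = list(layers_kg)
    layers[0] += deposition_kg          # dry/wet deposition lands on the surface layer
    flushed = 0.0
    for i in range(len(layers)):
        layers[i] += flushed            # aerosol carried in from the layer above
        removed = layers[i] * scavenging * melt_fraction
        layers[i] -= removed
        flushed = removed               # percolates into the next layer down
    return layers, flushed              # mass leaving the bottom layer exits the pack

layers = [0.0, 0.0, 0.0]
total_out = 0.0
for _ in range(3):                      # three timesteps of deposition plus melt
    layers, lost = step(layers, deposition_kg=1.0, melt_fraction=0.5)
    total_out += lost
```

    A useful property of any such scheme is mass conservation: everything deposited must either remain in one of the layers or have left the snowpack.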

  16. A case study of the crack sizing performance of the Ultrasonic Phased Array combined crack and wall loss inspection tool on the Centennial pipeline, including the defect evaluation, field feature verification and tool performance validation (performed by Marathon Oil, DNV and GE Oil and Gas)

    Energy Technology Data Exchange (ETDEWEB)

    Hrncir, T.; Turner, S. [Marathon Pipe Line LLC, Findlay, OH (United States); Polasik, S.J. [DNV Columbus, Inc., Dublin, OH 43017 (United States); Vieth, P. [BP E and P, Houston, TX (United States); Allen, D.; Lachtchouk, I.; Senf, P.; Foreman, G. [GE Oil and Gas PII Pipeline Solutions, Stutensee (Germany)], email: geoff.foreman@ge.com

    2010-07-01

    The Centennial Pipeline System, operated by Marathon Pipe Line LLC, is 754 miles long and carries liquid products from eastern Texas to southern Illinois. Most of it was constructed in 1951 for natural gas, but it was converted to liquid product service in 2001. GE Oil and Gas conducted an ultrasonic phased array in-line inspection (ILI) survey of this pipeline, the primary purpose of which was to detect and characterize stress corrosion cracking. A dig verification was performed in 2008 to increase the level of confidence in the detection and depth-sizing capabilities of this inspection method. This paper outlines the USCD technology and experience, describes how the ILI survey results were validated and how the ILI data analysis was improved, and discusses the impact on managing the integrity of the line section. Results indicate that the phased array technology approached 90% certainty of sizing predicted depths within a tolerance of 1 mm, at a 95% confidence level.
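
    A validation statistic of this kind (predicted depths within a 1 mm tolerance, stated with a confidence level) can be illustrated by comparing ILI calls against field measurements unit by unit. The depth values below are invented for illustration, and the Clopper-Pearson bound is one common way to attach a confidence level to the in-tolerance proportion; it is not necessarily the procedure used in this study.

```python
# Hypothetical ILI-vs-dig validation sketch: fraction of predicted crack depths
# within a +/-1 mm tolerance of field-measured depths, with a one-sided 95%
# Clopper-Pearson lower bound on that fraction. All depth values are invented.
from scipy.stats import beta

ili_mm   = [2.1, 3.4, 1.8, 2.9, 4.2, 3.1, 2.5, 3.8, 1.9, 2.7]  # tool calls
field_mm = [2.4, 3.1, 2.0, 3.6, 4.0, 3.0, 2.2, 4.1, 2.1, 2.6]  # dig measurements

within = sum(abs(a - b) <= 1.0 for a, b in zip(ili_mm, field_mm))
n = len(ili_mm)
point = within / n
# One-sided 95% lower Clopper-Pearson bound on the in-tolerance proportion.
lower = beta.ppf(0.05, within, n - within + 1) if within > 0 else 0.0
```

    With only ten digs, even a perfect in-tolerance record supports a fairly weak lower bound, which is why field verification programs accumulate excavations before claiming a certainty level for the tool.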

  17. Performance Tests of Snow-Related Variables Over the Tibetan Plateau and Himalayas Using a New Version of NASA GEOS-5 Land Surface Model that Includes the Snow Darkening Effect

    Science.gov (United States)

    Yasunari, Teppei J.; Lau, W. K.; Koster, Randal D.; Suarez, Max; Mahanama, Sarith; da Silva, Arlindo M.; Colarco, Peter R.

    2011-01-01

    The snow darkening effect, i.e. the reduction of snow albedo, is caused by absorption of solar radiation by absorbing aerosols (dust, black carbon, and organic carbon) deposited on the snow surface. This process is probably important over Himalayan and Tibetan glaciers due to the transport of the highly polluted Atmospheric Brown Cloud (ABC) from the Indo-Gangetic Plain (IGP). This effect has been incorporated into the NASA Goddard Earth Observing System model, version 5 (GEOS-5) atmospheric transport model. The Catchment land surface model (LSM) used in GEOS-5 considers 3 snow layers. Code was developed to track the mass concentration of aerosols in the three layers, taking into account such processes as the flushing of the compounds as liquid water percolates through the snowpack. In GEOS-5, aerosol emissions, transports, and depositions are well simulated in the Goddard Chemistry Aerosol Radiation and Transport (GOCART) module; we recently made the connection between GOCART and the GEOS-5 system fitted with the revised LSM. Preliminary simulations were performed with this new system in "replay" mode (i.e., with atmospheric dynamics guided by reanalysis) at 2x2.5 degree horizontal resolution, covering the period 1 November 2005 - 31 December 2009; we consider the final three years of simulation here. The three simulations used the following variants of the LSM: (1) the original Catchment LSM with a fixed fresh snowfall density of 150 kg m-3; (2) the LSM fitted with the new snow albedo code, used here without aerosol deposition but with changes in the density formulation and the melting water effect on snow specific surface area; (3) the LSM fitted with the new snow albedo code, the same as in (2) but with fixed aerosol deposition rates (computed from GOCART values averaged over the Tibetan Plateau domain [lon.: 60-120E; lat.: 20-50N] during March-May 2008) applied to all grid points at every time step. For (2) and (3), the same fresh snowfall density setting as in (1) was used.

  18. Hanford Site baseline risk assessment methodology

    International Nuclear Information System (INIS)

    1993-03-01

    This methodology has been developed to prepare human health and environmental evaluations of risk as part of the Comprehensive Environmental Response, Compensation, and Liability Act remedial investigations (RIs) and the Resource Conservation and Recovery Act facility investigations (FIs) performed at the Hanford Site pursuant to the Hanford Federal Facility Agreement and Consent Order referred to as the Tri-Party Agreement. Development of the methodology has been undertaken so that Hanford Site risk assessments are consistent with current regulations and guidance, while providing direction on flexible, ambiguous, or undefined aspects of the guidance. The methodology identifies Site-specific risk assessment considerations and integrates them with approaches for evaluating human and environmental risk that can be factored into the risk assessment program supporting the Hanford Site cleanup mission. Consequently, the methodology will enhance the preparation and review of individual risk assessments at the Hanford Site

  19. Exploring Participatory Methodologies in Organizational Discourse Analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts between approaches: single-level vs. multi-level, critical vs. participatory, discursive vs. material methods. They raise methodological issues of combining such approaches to embrace multimodality and enable new contributions. Conceptual efforts have been made in this regard, but further exploration of methodological combinations and their practical implications is called for. This paper argues 1) for combining methodologies by approaching this as scholarly subjectification processes, and 2) for performing combinations in both...

  20. Hanford Site Risk Assessment Methodology. Revision 3

    International Nuclear Information System (INIS)

    1995-05-01

    This methodology has been developed to prepare human health and ecological evaluations of risk as part of the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA) remedial investigations (RI) and the Resource Conservation and Recovery Act of 1976 (RCRA) facility investigations (FI) performed at the Hanford Site pursuant to the Hanford Federal Facility Agreement and Consent Order (Ecology et al. 1994), referred to as the Tri-Party Agreement. Development of the methodology has been undertaken so that Hanford Site risk assessments are consistent with current regulations and guidance, while providing direction on flexible, ambiguous, or undefined aspects of the guidance. The methodology identifies site-specific risk assessment considerations and integrates them with approaches for evaluating human and ecological risk that can be factored into the risk assessment program supporting the Hanford Site cleanup mission. Consequently, the methodology will enhance the preparation and review of individual risk assessments at the Hanford Site.